TensorFlow cross-entropy loss functions: comparing the cross_entropy_with_logits variants in the library

This article walks through the theory behind cross-entropy loss for binary and multi-class classification, including where sigmoid and softmax each apply, and how to avoid overflow and keep the computation numerically stable. The examples demonstrate an implementation of SigmoidBinaryCrossEntropyLoss and show how to combine softmax with mask handling in multi-class problems.

The cross-entropy loss functions in the library (their call shapes are sketched after the list):

  • softmax_cross_entropy_with_logits_v2
  • sparse_softmax_cross_entropy_with_logits
  • sigmoid_cross_entropy_with_logits[sigmoid_cross_entropy_with_logits_v2]
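
As a rough orientation, the sketch below (assuming TensorFlow 1.x, where these names live under `tf.nn`; the tensors are made up for illustration) contrasts the dense softmax variant, which takes one-hot or probability labels, with the sparse variant, which takes integer class indices.

```python
import tensorflow as tf

logits = tf.constant([[2.0, -1.0, 0.5],
                      [0.3,  1.7, -0.2]])

# Dense variant: labels are one-hot (or any per-row probability distribution).
dense_labels = tf.constant([[1.0, 0.0, 0.0],
                            [0.0, 1.0, 0.0]])
dense_loss = tf.nn.softmax_cross_entropy_with_logits_v2(labels=dense_labels,
                                                        logits=logits)

# Sparse variant: labels are integer class indices, one per example.
sparse_labels = tf.constant([0, 1])
sparse_loss = tf.nn.sparse_softmax_cross_entropy_with_logits(labels=sparse_labels,
                                                             logits=logits)

with tf.Session() as sess:
    # With one-hot labels the two variants agree element-wise.
    print(sess.run([dense_loss, sparse_loss]))
```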

First, the theory.
Binary classification:
  loss = y * -log(sigmoid(x)) + (1 - y) * -log(1 - sigmoid(x))
Just use sigmoid_cross_entropy_with_logits [or sigmoid_cross_entropy_with_logits_v2]: one label and one raw logit per example.
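
A minimal sketch of the binary case (TensorFlow 1.x assumed, numbers arbitrary): each example gets one raw logit and one 0/1 label.

```python
import tensorflow as tf

logits = tf.constant([2.3, -0.8, 0.1])   # one raw logit per example
labels = tf.constant([1.0,  0.0, 1.0])   # one 0/1 label per example

# Element-wise logistic loss; take reduce_mean for a scalar training loss.
loss = tf.nn.sigmoid_cross_entropy_with_logits(labels=labels, logits=logits)

with tf.Session() as sess:
    print(sess.run(loss))
```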
Multi-class classification:
  loss = -Sum(Yt * log(Yp)),  where Yp = softmax(logits)
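
The sketch below (again TensorFlow 1.x, made-up tensors) writes -Sum(Yt * log(Yp)) out by hand and checks it against the fused library call, which computes the same quantity in one numerically safer kernel.

```python
import tensorflow as tf

logits = tf.constant([[1.0, 2.0, 0.1]])
labels = tf.constant([[0.0, 1.0, 0.0]])   # Yt, one-hot

# Manual: Yp = softmax(logits), loss = -Sum(Yt * log(Yp)) per row.
y_pred = tf.nn.softmax(logits)
manual = -tf.reduce_sum(labels * tf.log(y_pred), axis=1)

# Fused library call.
fused = tf.nn.softmax_cross_entropy_with_logits_v2(labels=labels, logits=logits)

with tf.Session() as sess:
    print(sess.run([manual, fused]))   # the two values should match
```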
In fact, the classification situations these functions handle fall into two cases (contrasted in the sketch after the list below):

  • Measures the probability error in discrete classification tasks in which each
    class is independent and not mutually exclusive.  For instance, one could
    perform multilabel classification where a picture can contain both an elephant
    and a dog at the same time.  (This is the sigmoid_cross_entropy_with_logits case.)
  • Measures the probability error in discrete classification tasks in which the
    classes are mutually exclusive (each entry is in exactly one class).  For
    example, each CIFAR-10 image is labeled with one and only one label: an image
    can be a dog or a truck, but not both.  (This is the softmax_cross_entropy_with_logits case.)
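
To make the contrast concrete, here is a small sketch (TensorFlow 1.x, invented tags and labels): independent tags go through the sigmoid loss, one per class, while a mutually exclusive label goes through the softmax loss.

```python
import tensorflow as tf

logits = tf.constant([[1.2, -0.3, 2.0]])   # one image, 3 class scores

# Case 1: multilabel (elephant AND dog can both be present),
# so every class gets its own independent sigmoid loss.
multilabel = tf.constant([[1.0, 0.0, 1.0]])
per_class_loss = tf.nn.sigmoid_cross_entropy_with_logits(labels=multilabel,
                                                         logits=logits)

# Case 2: mutually exclusive classes (e.g. CIFAR-10): exactly one label,
# so the classes compete through a softmax.
one_hot = tf.constant([[0.0, 0.0, 1.0]])
per_example_loss = tf.nn.softmax_cross_entropy_with_logits_v2(labels=one_hot,
                                                              logits=logits)

with tf.Session() as sess:
    # Shapes differ: [1, 3] per-class losses vs. [1] per-example loss.
    print(sess.run([per_class_loss, per_example_loss]))
```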

  **NOTE:**  While the classes are mutually exclusive, their probabilities
  need not be.  All that is required is that each row of `labels` is
  a valid probability distribution.  If they are not, the computation of the
  gradient will be incorrect.
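
Since each row only needs to be a valid probability distribution, soft labels are also legal; a small sketch (TensorFlow 1.x, numbers made up):

```python
import tensorflow as tf

logits = tf.constant([[1.0, 2.0, 0.1]])

# Not one-hot, but each row sums to 1, so the gradient is still correct.
soft_labels = tf.constant([[0.1, 0.8, 0.1]])

loss = tf.nn.softmax_cross_entropy_with_logits_v2(labels=soft_labels,
                                                  logits=logits)

with tf.Session() as sess:
    print(sess.run(loss))
```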
  For brevity, let `x = logits`, `z = labels`.  The logistic loss is

        z * -log(sigmoid(x)) + (1 - z) * -log(1 - sigmoid(x))
      = z * -log(1 / (1 + exp(-x))) + (1 - z) * -log(exp(-x) / (1 + exp(-x)))
      = x - x * z + log(1 + exp(-x))

  For x < 0, exp(-x) would overflow, so to keep the computation stable the
  implementation uses the equivalent formulation

        max(x, 0) - x * z + log(1 + exp(-abs(x)))
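
To see why the stable form matters, the sketch below (TensorFlow 1.x, extreme logits chosen on purpose) compares the naive formula, the reformulated max(x, 0) - x*z + log(1 + exp(-abs(x))) version, and the built-in op.

```python
import tensorflow as tf

x = tf.constant([1000.0, -1000.0, 0.5])   # logits, two extreme values
z = tf.constant([   1.0,     0.0, 1.0])   # labels

# Naive form: log(sigmoid) / log(1 - sigmoid) hit log(0) at the extreme logits.
naive = z * -tf.log(tf.sigmoid(x)) + (1 - z) * -tf.log(1 - tf.sigmoid(x))

# Stable reformulation: max(x, 0) - x * z + log(1 + exp(-abs(x))).
stable = tf.maximum(x, 0.0) - x * z + tf.log(1 + tf.exp(-tf.abs(x)))

# Library op, which uses the stable form internally.
builtin = tf.nn.sigmoid_cross_entropy_with_logits(labels=z, logits=x)

with tf.Session() as sess:
    print(sess.run([naive, stable, builtin]))   # naive produces nan at the extremes
```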