1. Entropy
Entropy, also called uncertainty, is to some extent a measure of surprise.

$$H(p) = -\sum_x p(x)\log p(x)$$

If p(x) follows a 0-1 (one-hot) distribution, the entropy is $-1\log 1 = 0$; terms with $p(x) = 0$ contribute nothing, by the convention $0\log 0 = 0$.
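This can be checked numerically. A naive implementation runs into $0 \cdot \log 0$, which evaluates to nan in floating point, so the $0\log 0 = 0$ convention has to be applied explicitly. A minimal sketch (the torch.where masking is an addition for illustration, not from the original):

import torch

p = torch.tensor([0., 0., 0., 1.])  # a 0-1 (one-hot) distribution
-(p * torch.log2(p)).sum()
# tensor(nan): 0 * log2(0) = 0 * (-inf) = nan
logp = torch.where(p > 0, torch.log2(p), torch.zeros_like(p))  # 0 where p == 0
-(p * logp).sum()
# tensor(-0.), i.e. entropy 0, as expected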
The larger the entropy, the more uncertain the distribution and the higher the surprise.
Example:
import torch

# uniform distribution over 4 outcomes
a = torch.full([4], 1/4.)
a * torch.log2(a)
# tensor([-0.5000, -0.5000, -0.5000, -0.5000])
-(a * torch.log2(a)).sum()
# output: tensor(2.)
The entropy is 2 bits (for a uniform distribution over 4 outcomes, $H = \log_2 4 = 2$): the uncertainty is large, so the surprise is high.
import torch

# nearly one-hot distribution: almost all mass on the last outcome
# (the values sum to 1.002, so this is only approximately a valid distribution)
a = torch.tensor([0.001, 0.001, 0.001, 0.999])
-(a * torch.log2(a)).sum()
# output: tensor(0.0313)
This time the entropy is 0.0313, which is small: the uncertainty is low and the surprise is very low.
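The two computations above can be wrapped in a small helper. entropy_bits is a hypothetical name, not from the original; it reuses the torch.where masking from the earlier sketch so that exact one-hot inputs do not produce nan:

import torch

def entropy_bits(p: torch.Tensor) -> torch.Tensor:
    # H(p) = -sum_x p(x) * log2 p(x), with the convention 0 * log 0 = 0
    logp = torch.where(p > 0, torch.log2(p), torch.zeros_like(p))
    return -(p * logp).sum()

print(entropy_bits(torch.full([4], 1/4.)))                       # tensor(2.)
print(entropy_bits(torch.tensor([0.001, 0.001, 0.001, 0.999])))  # tensor(0.0313)
print(entropy_bits(torch.tensor([0., 0., 0., 1.])))              # tensor(-0.)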
2. Cross entropy