torch.nn.CrossEntropyLoss
nn.LogSoftmax() applies a log-softmax to the input.
nn.NLLLoss() compares the log-softmaxed input with the target.
nn.CrossEntropyLoss() combines nn.LogSoftmax() and nn.NLLLoss() in one step.
import torch
import torch.nn as nn

input = torch.tensor([[-0.1234, -0.2345, -0.3456]])
target = torch.tensor([0])
entropy = nn.CrossEntropyLoss()
output = entropy(input, target)
print(output)
tensor(0.9916)
m = nn.LogSoftmax(dim=1)
loss = nn.NLLLoss()
input = m(input)
print(input)
output = loss(input, target)
print('output:', output)
tensor([[-0.9916, -1.1027, -1.2138]])
output: tensor(0.9916)
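The two snippets above give the same result because, for a single sample with logits x and target class c, CrossEntropyLoss reduces to the formula -x[c] + log(sum_j exp(x[j])). A minimal sketch verifying this by hand (using torch.logsumexp for the normalizer):

```python
import torch
import torch.nn as nn

input = torch.tensor([[-0.1234, -0.2345, -0.3456]])
target = torch.tensor([0])

# Manual cross entropy for one sample: -x[c] + log(sum_j exp(x[j]))
manual = -input[0, target[0]] + torch.logsumexp(input[0], dim=0)

# Built-in version for comparison
builtin = nn.CrossEntropyLoss()(input, target)

print(manual.item(), builtin.item())  # both about 0.9916
```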