For multi-class classification, use tf.nn.sparse_softmax_cross_entropy_with_logits:
for batchTrain in nextBatch(trainX, trainlabels, config.batchSize):
    print(batchTrain[0])        # logits for the batch, shape 32 * type_num
    print(batchTrain[1])        # labels for the batch, shape 32 * 1
    # print(len(batchTrain[0]))
    print(type(batchTrain[1]))
    # this call fails with a rank-mismatch ValueError (explained below)
    loss = tf.nn.sparse_softmax_cross_entropy_with_logits(labels=batchTrain[1], logits=batchTrain[0])
    # the fix, as derived below:
    # loss = tf.nn.sparse_softmax_cross_entropy_with_logits(labels=tf.reshape(batchTrain[1], [-1]), logits=batchTrain[0])
    print(loss)
    break
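Running the loop as written fails on the loss line because batchTrain[1] still has rank 2. Below is a minimal self-contained reproduction; type_num is not shown in the snippet above, so a class count of 10 is assumed here purely for illustration, and the exact error wording varies across TensorFlow versions:

import numpy as np
import tensorflow as tf

type_num = 10                                   # assumed class count, for illustration only
logits = np.zeros((32, type_num), np.float32)   # stand-in for batchTrain[0], shape (32, type_num)
labels = np.full((32, 1), 8, np.int64)          # stand-in for batchTrain[1], shape (32, 1)

try:
    tf.nn.sparse_softmax_cross_entropy_with_logits(labels=labels, logits=logits)
except ValueError as e:
    print(e)  # rank mismatch: labels has rank 2, but it must be rank of logits minus 1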
where batchTrain[0] has shape 32 * type_num:
[[0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 ...
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]]
and batchTrain[1] has shape 32 * 1:
[[8]
 [8]
 [8]
 [8]
 [8]
 [8]
 [8]
 [8]
 [8]
 [8]
 [8]
 [8]
 [8]
 [8]
 [8]
 [8]
 [8]
 [8]
 [8]
 [8]
 [8]
 [8]
 [8]
 [8]
 [8]
 [8]
 [8]
 [8]
 [8]
 [8]
 [8]
 [8]]
When computing this loss, labels must have one fewer dimension than logits (rank(labels) = rank(logits) - 1), but both tensors above are rank 2. So the failing line
loss = tf.nn.sparse_softmax_cross_entropy_with_logits(labels=batchTrain[1], logits=batchTrain[0])
should be changed to
loss = tf.nn.sparse_softmax_cross_entropy_with_logits(labels=tf.reshape(batchTrain[1], [-1]), logits=batchTrain[0])
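With the reshape in place, labels becomes rank 1 with shape (32,), one class index per example, and the op returns one loss value per example. A self-contained sketch of the fixed call, under the same assumed type_num = 10 and TensorFlow 2.x eager execution (under TF 1.x graph mode the tensors would need a session run to print):

import numpy as np
import tensorflow as tf

type_num = 10                                   # assumed class count, as in the sketch above
logits = np.zeros((32, type_num), np.float32)   # stand-in for batchTrain[0]
labels = np.full((32, 1), 8, np.int64)          # stand-in for batchTrain[1]

loss = tf.nn.sparse_softmax_cross_entropy_with_logits(
    labels=tf.reshape(labels, [-1]),            # flatten (32, 1) -> (32,)
    logits=logits)
print(loss.shape)       # (32,): one cross-entropy value per example
print(float(loss[0]))   # log(10) ≈ 2.3026, since all-zero logits give a uniform softmax

Reducing with tf.reduce_mean(loss) then yields the scalar batch loss normally fed to the optimizer.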