1. ValueError: Only call `softmax_cross_entropy_with_logits` with named arguments (labels=..., logits=..., ...)
Incorrect code: cross_entropy2 = tf.reduce_sum(tf.nn.softmax_cross_entropy_with_logits_v2(logits, y_))
Fix: cross_entropy2 = tf.reduce_sum(tf.nn.softmax_cross_entropy_with_logits_v2(labels=y_, logits=logits))
Note: the op must be called with keyword arguments, and each keyword must name the right tensor. Simply adding `labels=` and `logits=` to the original positional order would silence the error but leave the arguments swapped: `labels` takes the ground-truth distribution (here `y_`) and `logits` takes the raw network output.
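A minimal NumPy sketch of what the op computes (assuming, per the usual TF convention, that `y_` holds one-hot ground-truth labels and `logits` the raw network output; the tensor values here are made up for illustration). It shows that swapping the two arguments changes the result, which is why the keywords must name the right tensors:

```python
import numpy as np

def softmax_cross_entropy(labels, logits):
    # Reference implementation of the op's math:
    # per-example loss = -sum(labels * log_softmax(logits))
    shifted = logits - logits.max(axis=-1, keepdims=True)  # numerical stability
    log_softmax = shifted - np.log(np.exp(shifted).sum(axis=-1, keepdims=True))
    return -(labels * log_softmax).sum(axis=-1)

y_ = np.array([[0.0, 1.0, 0.0]])      # one-hot ground truth
logits = np.array([[2.0, 1.0, 0.1]])  # raw network output

correct = softmax_cross_entropy(labels=y_, logits=logits)
swapped = softmax_cross_entropy(labels=logits, logits=y_)
print(correct, swapped)  # the two orderings give different losses
```

Because the arguments are not interchangeable, the keyword-only API is a guard against exactly this silent bug.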
2. WARNING: softmax_cross_entropy_with_logits (from tensorflow.python.ops.nn_ops) is deprecated and will be removed in a future version.
Instructions for updating:
Future major versions of TensorFlow will allow gradients to flow
into the labels input on backprop by default.
See tf.nn.softmax_cross_entropy_with_logits_v2.
Incorrect code: tf.nn.softmax_cross_entropy_with_logits
Fix: tf.nn.softmax_cross_entropy_with_logits_v2
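The rename is not purely cosmetic: as the warning says, `_v2` allows gradients to flow into the `labels` input on backprop. A short sketch of the migrated call (assuming TF 2.x eager mode, where `tf.nn.softmax_cross_entropy_with_logits` already has the `_v2` behavior, and using made-up constant tensors), wrapping the labels in `tf.stop_gradient` to reproduce the old op's treatment of labels as constants:

```python
import tensorflow as tf

# Example tensors: one-hot ground truth and raw network outputs.
labels = tf.constant([[0.0, 1.0, 0.0]])
logits = tf.constant([[2.0, 1.0, 0.1]])

# In the _v2 behavior, gradients may flow into `labels` on backprop;
# tf.stop_gradient restores the old behavior of a non-trainable label tensor.
loss = tf.nn.softmax_cross_entropy_with_logits(
    labels=tf.stop_gradient(labels), logits=logits)
cross_entropy2 = tf.reduce_sum(loss)
print(float(cross_entropy2))
```

If the labels are plain placeholders or constants (as in most classification setups), the gradient-into-labels change has no effect and no wrapper is needed.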

This post resolves two common errors around TensorFlow's softmax cross-entropy: calling the function with positional instead of named arguments (and with the arguments swapped), and using the deprecated softmax_cross_entropy_with_logits op. Passing the tensors by the correct keywords and migrating to the _v2 function lets the code run cleanly.