Two Ways to Compute Cross Entropy in TensorFlow

TensorFlow offers two ops for this loss: tf.nn.sparse_softmax_cross_entropy_with_logits and tf.nn.softmax_cross_entropy_with_logits. They compute the same value but expect the labels in different formats.

import tensorflow as tf

# Logits for one sample over three classes, and its one-hot label.
# The label must be float, since softmax_cross_entropy_with_logits
# requires labels of the same dtype as the logits.
y = tf.constant([[0.1, 0.8, 0.2]])
y_ = tf.constant([[0., 1., 0.]])

# Sparse version: takes the class index (tf.argmax picks out index 1).
cross_entropy1 = tf.nn.sparse_softmax_cross_entropy_with_logits(
    logits=y, labels=tf.argmax(y_, 1))
# Dense version: takes the one-hot label array directly.
cross_entropy2 = tf.nn.softmax_cross_entropy_with_logits(logits=y, labels=y_)

with tf.Session() as sess:
    tf.global_variables_initializer().run()  # no variables here, but harmless
    print(sess.run(cross_entropy1))
    print(sess.run(cross_entropy2))

Output:

[ 0.71559191]
[ 0.71559191]
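
To see where 0.71559191 comes from, here is a minimal NumPy sketch (an addition, not from the original post) that recomputes the value by hand: softmax the logits, then take the negative log of the probability assigned to the true class.

import numpy as np

logits = np.array([0.1, 0.8, 0.2])
probs = np.exp(logits) / np.sum(np.exp(logits))  # softmax
loss = -np.log(probs[1])                         # true class is index 1
print(loss)                                      # ~0.7155919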

As the output shows, the two ops agree; the only difference is the label format. softmax_cross_entropy_with_logits takes the original one-hot array as its labels argument, while sparse_softmax_cross_entropy_with_logits takes the index of the 1 in that array (the class index), which is why tf.argmax(y_, 1) is passed above.
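
The Session-based code above is TensorFlow 1.x style and will not run as-is on TensorFlow 2.x. A sketch of the same comparison in eager mode, assuming TF 2.x is installed, would look like this:

import tensorflow as tf  # assumes TensorFlow 2.x (eager execution)

y = tf.constant([[0.1, 0.8, 0.2]])
y_ = tf.constant([[0., 1., 0.]])

# Same two ops, executed eagerly; no Session needed.
print(tf.nn.sparse_softmax_cross_entropy_with_logits(
    labels=tf.argmax(y_, 1), logits=y).numpy())
print(tf.nn.softmax_cross_entropy_with_logits(labels=y_, logits=y).numpy())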