tf.nn.sparse_softmax_cross_entropy_with_logits(labels=tf.argmax(y, 1), logits=y_):
Here y is the actual class (a one-hot tensor) and y_ is the predicted logits. This op fuses the softmax on the network's output layer with the cross-entropy computation into a single step, for speed. Note that the sparse variant expects integer class indices rather than one-hot vectors, which is why the one-hot labels are passed through tf.argmax.
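As a rough sketch of what the fused op computes per example (the helper name here is illustrative, not a TF API): the result is equivalent to -log(softmax(logits)[label]), evaluated in a numerically stable way. A minimal NumPy version, checked against the first example below:

import numpy as np

def sparse_xent(logits, label):
    # illustrative helper, not a TF API: stable log-softmax followed by
    # the negative log-probability of the true class
    shifted = logits - np.max(logits)
    log_probs = shifted - np.log(np.sum(np.exp(shifted)))
    return -log_probs[label]

# logits and true class index taken from the first example below (y_2, argmax of y2)
print(sparse_xent(np.array([-2.6, -1.7, 3.2, 0.1]), 2))  # ~0.0540368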
For example:
import tensorflow as tf

y2 = tf.convert_to_tensor([[0, 0, 1, 0]], dtype=tf.int64)
y_2 = tf.convert_to_tensor([[-2.6, -1.7, 3.2, 0.1]], dtype=tf.float32)
c2 = tf.nn.sparse_softmax_cross_entropy_with_logits(labels=tf.argmax(y2, 1), logits=y_2)

y3 = tf.convert_to_tensor([[0, 0, 1, 0], [0, 0, 1, 0]], dtype=tf.int64)
y_3 = tf.convert_to_tensor([[-2.6, -1.7, -3.2, 0.1], [-2.6, -1.7, 3.2, 0.1]], dtype=tf.float32)
c3 = tf.nn.sparse_softmax_cross_entropy_with_logits(labels=tf.argmax(y3, 1), logits=y_3)

y4 = tf.convert_to_tensor([[0, 1, 0, 0]], dtype=tf.int64)
y_4 = tf.convert_to_tensor([[-2.6, -1.7, -3.2, 0.1]], dtype=tf.float32)
c4 = tf.nn.sparse_softmax_cross_entropy_with_logits(labels=tf.argmax(y4, 1), logits=y_4)

with tf.Session() as sess:
    print 'c2: ', sess.run(c2)
    print 'c3: ', sess.run(c3)
    print 'c4: ', sess.run(c4)

Output:

c2: [ 0.05403676]