tf.nn.softmax_cross_entropy_with_logits

The cross-entropy loss function

  • Why use the cross-entropy loss function? (see the recap below)
  • Code walkthrough
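
As a quick recap (my addition, using the standard definitions rather than anything from the original post): the code below first applies the softmax to turn the logits into a probability distribution, then computes the cross-entropy between that distribution and the one-hot labels:

\mathrm{softmax}(z)_i = \frac{e^{z_i}}{\sum_j e^{z_j}}, \qquad
H(y^{*}, y) = -\sum_i y^{*}_i \log y_i

where y^{*} is the one-hot label vector (y_ in the code) and y is the softmax output; tf.nn.softmax_cross_entropy_with_logits fuses both steps into a single op.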

Code

import tensorflow as tf

# neural network output (logits)
logits = tf.constant([[1.0, 2.0, 3.0], [1.0, 2.0, 3.0], [1.0, 2.0, 3.0]])

# step 1. softmax
y = tf.nn.softmax(logits)

# true labels (one-hot)
y_ = tf.constant([[0.0, 0.0, 1.0], [0.0, 0.0, 1.0], [0.0, 0.0, 1.0]])

# step 2. cross_entropy
cross_entropy = -tf.reduce_sum(y_ * tf.log(y))

# the same cross-entropy computed in one fused op;
# it returns per-example losses, so do not forget tf.reduce_sum
cross_entropy2 = tf.reduce_sum(tf.nn.softmax_cross_entropy_with_logits(logits = logits, labels = y_)) 

with tf.Session() as sess:
    softmax = sess.run(y)
    c_e = sess.run(cross_entropy)
    c_e2 = sess.run(cross_entropy2)
    print("step1: softmax result = ")
    print(softmax)
    print("step2: cross_entropy result = ")
    print(c_e)
    print("Function(softmax_cross_entropy_with_logits) result = ")
    print(c_e2)
[Figure: run results]
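
For readers on TensorFlow 2.x, where sessions are gone and ops run eagerly, a minimal sketch of the same comparison might look as follows (the TF 2.x usage is my assumption; the original post targets TF 1.x). The fused op is also more numerically stable than computing softmax and log in two separate steps.

import tensorflow as tf  # assuming TensorFlow 2.x with eager execution

# same example data as above
logits = tf.constant([[1.0, 2.0, 3.0], [1.0, 2.0, 3.0], [1.0, 2.0, 3.0]])
labels = tf.constant([[0.0, 0.0, 1.0], [0.0, 0.0, 1.0], [0.0, 0.0, 1.0]])

# step 1 + step 2 done manually
y = tf.nn.softmax(logits)
manual_ce = -tf.reduce_sum(labels * tf.math.log(y))

# fused op: returns one loss per row, so reduce_sum to match the manual total
fused_ce = tf.reduce_sum(
    tf.nn.softmax_cross_entropy_with_logits(labels=labels, logits=logits))

# the two values should agree, roughly 1.2228 for this example
# (3 * -log(softmax([1, 2, 3])[2]))
print(manual_ce.numpy(), fused_ce.numpy())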
