This op is typically used when building a cost function.
import tensorflow as tf

N_CLASSES = 5
labels = [1, 4, 0, 2, 3, 0, 1, 1]
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    a = sess.run(tf.one_hot(labels, N_CLASSES))  # one-hot encode the labels
    b = sess.run(tf.argmax(a, 1))                # argmax recovers the labels
    print('a:')
    print(a)
    print('b:')
    print(b)
# `logits` would be the network's final-layer output, shape [batch, N_CLASSES];
# note the keyword is `weights`, not `weight`
cost = tf.losses.softmax_cross_entropy(a, logits, weights=1.0)
Running everything except the last line (`logits` is not defined in this snippet) produces:
a:
[[ 0. 1. 0. 0. 0.]
[ 0. 0. 0. 0. 1.]
[ 1. 0. 0. 0. 0.]
[ 0. 0. 1. 0. 0.]
[ 0. 0. 0. 1. 0.]
[ 1. 0. 0. 0. 0.]
[ 0. 1. 0. 0. 0.]
[ 0. 1. 0. 0. 0.]]
b:
[1 4 0 2 3 0 1 1]
Here b demonstrates the round trip: argmax undoes one_hot and returns the original labels.
With that in mind, the last line of code is easy to understand: the final layer's output has dimension N (5 in this example), H is the cross-entropy, and M is indeed the batch size (M = 8 here).
Keras provides the same one-hot conversion:
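The quantity the last line computes can be sketched in plain NumPy, assuming `logits` is a [M, N] array of final-layer outputs and using the one-hot targets from above: H = -(1/M) Σᵢ yᵢ · log softmax(logitsᵢ). This is a minimal illustration, not TensorFlow's actual implementation.

```python
import numpy as np

def softmax_cross_entropy(onehot, logits):
    # Row-wise softmax, computed stably by subtracting each row's max logit
    shifted = logits - logits.max(axis=1, keepdims=True)
    probs = np.exp(shifted) / np.exp(shifted).sum(axis=1, keepdims=True)
    # Mean over the M batch examples of -sum(y * log p)
    return -(onehot * np.log(probs)).sum(axis=1).mean()

# Toy logits for the 8 labels above (shape [M=8, N=5]); values are arbitrary
rng = np.random.RandomState(0)
logits = rng.randn(8, 5)
onehot = np.eye(5)[[1, 4, 0, 2, 3, 0, 1, 1]]
print(softmax_cross_entropy(onehot, logits))
```

When the logits put almost all probability on the correct class, the loss approaches zero, which is a quick sanity check on the formula.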
from keras.utils import np_utils
N_CLASSES = 3
label = [0,0,0,1,1,1,2,2,2]
train_label = np_utils.to_categorical(label, N_CLASSES)
train_label
Out[21]:
array([[1., 0., 0.],
[1., 0., 0.],
[1., 0., 0.],
[0., 1., 0.],
[0., 1., 0.],
[0., 1., 0.],
[0., 0., 1.],
[0., 0., 1.],
[0., 0., 1.]], dtype=float32)
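Just as tf.argmax reversed tf.one_hot above, np.argmax reverses to_categorical. A quick check in pure NumPy (np.eye(N)[label] mirrors what np_utils.to_categorical produces, so this runs without Keras installed):

```python
import numpy as np

label = [0, 0, 0, 1, 1, 1, 2, 2, 2]
# Equivalent of np_utils.to_categorical(label, 3)
train_label = np.eye(3, dtype='float32')[label]
# argmax along axis 1 recovers the original integer labels
recovered = np.argmax(train_label, axis=1)
print(recovered.tolist())  # → [0, 0, 0, 1, 1, 1, 2, 2, 2]
```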