The difference between sparse_softmax_cross_entropy_with_logits() and softmax_cross_entropy_with_logits() in TensorFlow

InvalidArgumentError (see above for traceback): logits and labels must be same size

Sometimes, after copying someone else's code, you run into this error: InvalidArgumentError (see above for traceback): logits and labels must be same size
This usually means the wrong function was used when defining the cross-entropy loss. The three functions below look interchangeable, but they expect labels of different shapes:

with tf.name_scope("loss"):
    # Dense version: labels must be one-hot vectors (or probability
    # distributions) with the same shape as logits: [batch_size, num_classes].
    xentropy = tf.nn.softmax_cross_entropy_with_logits(labels=y, logits=logits)
    loss = tf.reduce_mean(xentropy, name="loss")

with tf.name_scope("loss"):
    # Sparse version: labels must be integer class indices of shape
    # [batch_size] (one rank lower than logits), NOT one-hot vectors.
    xentropy = tf.nn.sparse_softmax_cross_entropy_with_logits(labels=y, logits=logits)
    loss = tf.reduce_mean(xentropy, name="loss")

with tf.name_scope("loss"):
    # Sigmoid version: for binary / multi-label problems; labels must have
    # the same shape and float dtype as logits.
    xentropy = tf.nn.sigmoid_cross_entropy_with_logits(labels=y, logits=logits)
    loss = tf.reduce_mean(xentropy, name="loss")
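To see the difference concretely, here is a minimal sketch (assuming TensorFlow 2.x with eager execution; the toy logits and labels are made up for illustration). The dense and sparse variants compute the same per-example loss, but take the labels in different shapes:

import tensorflow as tf

# Hypothetical toy batch: 4 examples, 3 classes.
logits = tf.constant([[ 2.0, 0.5, -1.0],
                      [ 0.1, 1.2,  0.3],
                      [-0.5, 0.2,  2.2],
                      [ 1.5, 1.5,  0.0]])

# Dense version: one-hot labels, same shape as logits -> [4, 3].
onehot_labels = tf.one_hot([0, 1, 2, 0], depth=3)
dense_loss = tf.nn.softmax_cross_entropy_with_logits(
    labels=onehot_labels, logits=logits)

# Sparse version: plain class indices, shape [4].
sparse_labels = tf.constant([0, 1, 2, 0])
sparse_loss = tf.nn.sparse_softmax_cross_entropy_with_logits(
    labels=sparse_labels, logits=logits)

print(dense_loss.numpy())
print(sparse_loss.numpy())  # identical per-example losses

So if your y tensor holds integer class indices, use the sparse variant; if it holds one-hot vectors, use the dense variant. Feeding one label format to the other function is exactly what triggers the shape error above.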
