TensorFlow loss functions

tf.losses.mean_squared_error(label, predict)

The labels are not required to have the same dtype as the predictions.

>>> import tensorflow as tf
>>> y_int = tf.convert_to_tensor([1, 0, 1], tf.int32)
>>> y = tf.convert_to_tensor([1, 0, 1], tf.float32)
>>> y_pred = tf.convert_to_tensor([0.2, 0.3, 0.4], tf.float32)
>>> m_loss = tf.losses.mean_squared_error(y_int, y_pred)
>>> m2_loss = tf.losses.mean_squared_error(y, y_pred)
>>> m3_loss = tf.reduce_mean(tf.pow(tf.subtract(y, y_pred), 2))
>>> sess = tf.Session()
>>> sess.run([m_loss, m2_loss, m3_loss])
[0.36333334, 0.36333334, 0.36333334]

tf.losses.sigmoid_cross_entropy(multi_class_labels=label, logits=predict)

Binary cross-entropy: the logits are first passed through a sigmoid, and the cross-entropy is computed on the result.
The label dtype is not required to match the logits.
By default the per-element losses are averaged, i.e. the output is their reduce_mean.

>>> y_int = tf.convert_to_tensor([1, 0, 1], tf.int32)
>>> y = tf.convert_to_tensor([1, 0, 1], tf.float32)
>>> y_pred = tf.convert_to_tensor([0.2, 0.3, 0.4], tf.float32)
>>> ce_loss1 = tf.losses.sigmoid_cross_entropy(y_int, y_pred)
>>> ce_loss2 = tf.losses.sigmoid_cross_entropy(y, y_pred)
>>> ce_loss3 = -tf.reduce_mean(tf.multiply(y, tf.log(tf.sigmoid(y_pred))) + tf.multiply(1 - y, tf.log(1 - tf.sigmoid(y_pred))))
>>> ce_loss4 = tf.reduce_mean(y_pred - tf.multiply(y, y_pred) + tf.log(1 + tf.exp(-y_pred)))
>>> ce_loss5 = tf.reduce_mean(tf.nn.sigmoid_cross_entropy_with_logits(labels=y, logits=y_pred))
>>> sess.run([ce_loss1, ce_loss2, ce_loss3, ce_loss4, ce_loss5])
[0.6551698, 0.6551698, 0.6551698, 0.6551698, 0.6551698]

How ce_loss4 is obtained is explained in the detailed write-up on tf.nn.sigmoid_cross_entropy_with_logits.
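For reference, a short sketch of the standard algebra behind ce_loss4, with x the logit and y the label:

\begin{aligned}
\text{loss} &= -y\log\sigma(x) - (1-y)\log\bigl(1-\sigma(x)\bigr), \qquad \sigma(x) = \frac{1}{1+e^{-x}} \\
            &= y\log\bigl(1+e^{-x}\bigr) + (1-y)\bigl(x + \log(1+e^{-x})\bigr) \\
            &= x - xy + \log\bigl(1+e^{-x}\bigr)
\end{aligned}

With x = y_pred this is exactly the expression computed for ce_loss4.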

tf.nn.sigmoid_cross_entropy_with_logits(labels=labels, logits=logits)

See the example above.
The input should be the raw logits, i.e. the output before any sigmoid layer.
The labels must have the same dtype as the logits.
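A minimal sketch, reusing y_int, y_pred and sess from the example above: casting integer labels to the logits dtype satisfies the dtype requirement and reproduces the same value as ce_loss5.

>>> loss = tf.nn.sigmoid_cross_entropy_with_logits(labels=tf.cast(y_int, tf.float32), logits=y_pred)
>>> sess.run(tf.reduce_mean(loss))
0.6551698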

tf.nn.weighted_cross_entropy_with_logits(targets=labels, logits=logits, pos_weight=pos_weight)

A weighted version of the cross-entropy: pos_weight scales only the loss term of the positive samples (targets == 1).

>>> y_int = tf.convert_to_tensor([1, 0, 1], tf.int32)
>>> y = tf.convert_to_tensor([1, 0, 1], tf.float32)
>>> y_pred = tf.convert_to_tensor([0.2, 0.3, 0.4], tf.float32)
>>> weight = tf.convert_to_tensor([1.0, 0.0, 1.0])
>>> weight2 = tf.convert_to_tensor([1.0, 0.0, 1.0])
>>> sess.run(tf.reduce_mean(tf.nn.weighted_cross_entropy_with_logits(y, y_pred, weight)))
0.6551698
>>> sess.run(tf.reduce_mean(tf.nn.weighted_cross_entropy_with_logits(y, y_pred, weight2)))
0.6551698
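For reference, a sketch of the formula this op implements: pos_weight multiplies only the positive (targets == 1) term, so with the values above the manual computation should again give 0.6551698.

>>> manual = tf.reduce_mean(weight * y * -tf.log(tf.sigmoid(y_pred)) + (1 - y) * -tf.log(1 - tf.sigmoid(y_pred)))
>>> sess.run(manual)
0.6551698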

[TensorFlow loss function series] weighted_cross_entropy_with_logits
