Keras manual notes: loss functions (to be updated)

Loss functions: https://keras.io/losses/

mean_squared_error

keras.losses.mean_squared_error(y_true, y_pred)
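As a reference for the formula, here is a plain-numpy sketch (not the actual Keras implementation, which operates on backend tensors): the squared error is averaged over the last axis, giving one scalar loss per sample.

```python
import numpy as np

# Numpy sketch of mean_squared_error (assumption: mirrors the Keras
# formula; this is not the Keras source).
def mean_squared_error(y_true, y_pred):
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    # Average squared differences over the last axis:
    # one scalar loss entry per sample.
    return np.mean(np.square(y_true - y_pred), axis=-1)

# Two samples with three outputs each.
print(mean_squared_error([[1, 2, 3], [0, 0, 0]],
                         [[1, 2, 4], [1, 1, 1]]))  # ≈ [0.3333, 1.0]
```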

mean_absolute_error

keras.losses.mean_absolute_error(y_true, y_pred)

mean_absolute_percentage_error

keras.losses.mean_absolute_percentage_error(y_true, y_pred)

mean_squared_logarithmic_error

keras.losses.mean_squared_logarithmic_error(y_true, y_pred)
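The two relative-error losses above can be sketched in numpy as follows (assumed to mirror the Keras formulas; the real implementations clip with the backend epsilon, hard-coded here as 1e-7):

```python
import numpy as np

# Numpy sketches of the percentage and log-scale losses
# (assumptions, not the Keras source).
def mean_absolute_percentage_error(y_true, y_pred):
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    # Relative error, with the denominator clipped away from zero.
    diff = np.abs((y_true - y_pred) / np.clip(np.abs(y_true), 1e-7, None))
    return 100.0 * np.mean(diff, axis=-1)

def mean_squared_logarithmic_error(y_true, y_pred):
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    # Squared difference of log(1 + x), penalizing ratios rather
    # than absolute differences.
    first = np.log(np.clip(y_pred, 1e-7, None) + 1.0)
    second = np.log(np.clip(y_true, 1e-7, None) + 1.0)
    return np.mean(np.square(first - second), axis=-1)
```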

squared_hinge

keras.losses.squared_hinge(y_true, y_pred)

hinge

keras.losses.hinge(y_true, y_pred)
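Both hinge variants expect targets encoded as -1 or 1. A numpy sketch of the two formulas (assumed to match the Keras definitions):

```python
import numpy as np

# Numpy sketches of hinge and squared_hinge (assumptions, not the
# Keras source; targets must be -1 or 1).
def hinge(y_true, y_pred):
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    # Zero loss once the margin y_true * y_pred exceeds 1.
    return np.mean(np.maximum(1.0 - y_true * y_pred, 0.0), axis=-1)

def squared_hinge(y_true, y_pred):
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    # Same margin, but squared, penalizing violations more sharply.
    return np.mean(np.square(np.maximum(1.0 - y_true * y_pred, 0.0)), axis=-1)
```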

categorical_hinge

keras.losses.categorical_hinge(y_true, y_pred)

logcosh

keras.losses.logcosh(y_true, y_pred)

Logarithm of the hyperbolic cosine of the prediction error. This means that logcosh works mostly like the mean squared error, but will not be so strongly affected by the occasional wildly incorrect prediction. Returns a Tensor with one scalar loss entry per sample.
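A small numpy sketch illustrates that robustness: for small errors log(cosh(x)) ≈ x²/2, like half the squared error, while for a large outlier it grows only like |x| - log 2 instead of quadratically.

```python
import numpy as np

# Numpy sketch of logcosh (assumption: mirrors the Keras formula).
def logcosh(y_true, y_pred):
    x = np.asarray(y_pred, dtype=float) - np.asarray(y_true, dtype=float)
    return np.mean(np.log(np.cosh(x)), axis=-1)

# Small error: behaves like x**2 / 2, i.e. half the squared error.
print(logcosh([[0.0]], [[0.1]]))   # ≈ 0.005
# Large outlier: grows roughly like |x| - log(2), where MSE would give 100.
print(logcosh([[0.0]], [[10.0]]))  # ≈ 9.307
```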

categorical_crossentropy: your targets should be in categorical (one-hot) format, e.g. if you have 10 classes, the target for each sample should be a 10-dimensional vector that is all zeros except for a 1 at the index corresponding to the class of the sample.

For example: [0, 0, 0, 0, 0, 0, 0, 0, 0, 1]

keras.losses.categorical_crossentropy(y_true, y_pred)

To convert integer targets into categorical targets, you can use the Keras utility to_categorical:

from keras.utils import to_categorical

categorical_labels = to_categorical(int_labels, num_classes=None)
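to_categorical simply one-hot encodes the integer labels. A numpy sketch of the same idea (not the actual Keras implementation):

```python
import numpy as np

# Numpy sketch of one-hot encoding, the idea behind to_categorical
# (not the Keras source).
def to_one_hot(int_labels, num_classes=None):
    int_labels = np.asarray(int_labels, dtype=int)
    if num_classes is None:
        num_classes = int_labels.max() + 1
    # Row i of the identity matrix is the one-hot vector for class i.
    return np.eye(num_classes)[int_labels]

print(to_one_hot([0, 2, 1]))
# [[1. 0. 0.]
#  [0. 0. 1.]
#  [0. 1. 0.]]
```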

sparse_categorical_crossentropy

keras.losses.sparse_categorical_crossentropy(y_true, y_pred)
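Unlike categorical_crossentropy, this loss takes integer class indices as targets and picks out the predicted probability at that index. A numpy sketch (assuming y_pred holds per-class probabilities, e.g. softmax outputs):

```python
import numpy as np

# Numpy sketch of sparse_categorical_crossentropy (assumption, not
# the Keras source; y_pred must be probabilities).
def sparse_categorical_crossentropy(y_true, y_pred):
    y_true = np.asarray(y_true, dtype=int)
    y_pred = np.clip(np.asarray(y_pred, dtype=float), 1e-7, 1.0)
    # Negative log-probability of the true class for each sample.
    return -np.log(y_pred[np.arange(len(y_true)), y_true])

print(sparse_categorical_crossentropy([2], [[0.1, 0.1, 0.8]]))  # ≈ [0.223]
```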

binary_crossentropy

keras.losses.binary_crossentropy(y_true, y_pred)
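For binary targets (0 or 1), a numpy sketch of the formula (assumed to mirror the Keras definition; y_pred holds probabilities, e.g. sigmoid outputs):

```python
import numpy as np

# Numpy sketch of binary_crossentropy (assumption, not the Keras
# source; clipping keeps log() away from zero).
def binary_crossentropy(y_true, y_pred):
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.clip(np.asarray(y_pred, dtype=float), 1e-7, 1.0 - 1e-7)
    bce = -(y_true * np.log(y_pred) + (1.0 - y_true) * np.log(1.0 - y_pred))
    return np.mean(bce, axis=-1)
```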

kullback_leibler_divergence

keras.losses.kullback_leibler_divergence(y_true, y_pred)
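Both arguments should be probability distributions; the divergence is zero when they match. A numpy sketch (assumed to mirror the Keras formula, with the backend epsilon hard-coded as 1e-7):

```python
import numpy as np

# Numpy sketch of kullback_leibler_divergence (assumption, not the
# Keras source).
def kullback_leibler_divergence(y_true, y_pred):
    y_true = np.clip(np.asarray(y_true, dtype=float), 1e-7, 1.0)
    y_pred = np.clip(np.asarray(y_pred, dtype=float), 1e-7, 1.0)
    # sum over classes of p * log(p / q).
    return np.sum(y_true * np.log(y_true / y_pred), axis=-1)
```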

poisson

keras.losses.poisson(y_true, y_pred)
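Useful for count data; a numpy sketch of the formula (assumed to mirror the Keras definition mean(y_pred - y_true * log(y_pred)), with a small epsilon inside the log):

```python
import numpy as np

# Numpy sketch of the poisson loss (assumption, not the Keras source).
def poisson(y_true, y_pred):
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return np.mean(y_pred - y_true * np.log(y_pred + 1e-7), axis=-1)
```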

cosine_proximity

keras.losses.cosine_proximity(y_true, y_pred)
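This is the negative cosine similarity, so identical directions give -1; minimizing the loss therefore maximizes the similarity. A numpy sketch (assumed to mirror the Keras formula):

```python
import numpy as np

# Numpy sketch of cosine_proximity (assumption, not the Keras source).
def cosine_proximity(y_true, y_pred):
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    # L2-normalize, then take the negative dot product.
    y_true = y_true / np.linalg.norm(y_true, axis=-1, keepdims=True)
    y_pred = y_pred / np.linalg.norm(y_pred, axis=-1, keepdims=True)
    return -np.sum(y_true * y_pred, axis=-1)

print(cosine_proximity([[1.0, 0.0]], [[2.0, 0.0]]))  # [-1.]
```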
