Kullback–Leibler divergence

In probability theory and information theory, the Kullback–Leibler divergence[1][2][3] (also called information divergence, information gain, or relative entropy) is a non-symmetric measure of the difference between two probability distributions P and Q.
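
In its discrete form, the divergence of Q from P is D_KL(P || Q) = sum_i P(i) * log( P(i) / Q(i) ), which is zero exactly when P and Q agree. Below is a minimal sketch in Python that computes this sum and illustrates the non-symmetry; NumPy and the helper name kl_divergence are my own additions for illustration, not part of the original post.

import numpy as np

def kl_divergence(p, q):
    # Discrete KL divergence D_KL(P || Q) = sum_i P(i) * log(P(i) / Q(i)),
    # measured in nats (natural log). Terms with P(i) == 0 contribute 0 by
    # convention; Q(i) must be > 0 wherever P(i) > 0 for the result to be finite.
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

# The measure is not symmetric: D_KL(P || Q) and D_KL(Q || P) generally differ.
p = [0.5, 0.4, 0.1]
q = [0.3, 0.3, 0.4]
print(kl_divergence(p, q))  # divergence of Q from P
print(kl_divergence(q, p))  # divergence of P from Q, a different value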

See http://en.wikipedia.org/wiki/Kullback-Leibler_divergence for more detail.
