Difference between Cross Entropy and Joint Entropy

Although they are often written with the same notation, H(p, q), the two quantities have different meanings.


Cross Entropy:
p and q are two distributions over the same sample space, and x ranges over that space.


H(p, q) = - Sum_x( p(x) * log(q(x)) )
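
As a quick sanity check, here is a minimal NumPy sketch (the helper name `cross_entropy` and the example distributions are my own, not from the original post):

```python
import numpy as np

def cross_entropy(p, q):
    """H(p, q) = -Sum_x p(x) * log(q(x)), in nats.

    p and q are probability vectors over the same sample space.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0                      # convention: terms with p(x) = 0 contribute 0
    return -np.sum(p[mask] * np.log(q[mask]))

p = [0.5, 0.25, 0.25]
q = [0.25, 0.5, 0.25]
print(cross_entropy(p, p))  # ~1.0397, which is just the entropy H(p)
print(cross_entropy(p, q))  # ~1.2130, >= H(p); the gap is KL(p || q)
```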


Joint Entropy:
x ~ p, y ~ q, and the pair (x, y) ~ f, where f is the joint distribution whose marginals are p and q.

H(p, q) = - Sum_{x,y}( f(x, y) * log(f(x, y)) )

So cross entropy compares two distributions of the same variable, while joint entropy measures the uncertainty of a pair of variables under their joint distribution f.
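
The same idea in NumPy, again as a minimal sketch (the helper name `joint_entropy` and the example joint table are assumptions of mine):

```python
import numpy as np

def joint_entropy(f):
    """Entropy of the pair (x, y): -Sum_{x,y} f(x, y) * log(f(x, y)), in nats.

    f is a 2-D array holding the joint distribution of (x, y);
    its row sums give the marginal p and its column sums give q.
    """
    f = np.asarray(f, dtype=float)
    vals = f[f > 0]                   # convention: 0 * log(0) = 0
    return -np.sum(vals * np.log(vals))

# Two independent fair coin flips: f(x, y) = p(x) * q(y) = 0.25 everywhere.
f = np.array([[0.25, 0.25],
              [0.25, 0.25]])
print(joint_entropy(f))  # ~1.3863 = log(4); equals H(p) + H(q) since x, y are independent
```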


A picture can also help with the intuition:


[Figure: diagram contrasting cross entropy and joint entropy]
