
Gibbs' Inequality

The relative entropy or KL divergence between two probability distributions \(P(x)\) and \(Q(x)\) defined over the same alphabet \(\mathcal{A}_X\) is:

\[ D_{KL}(P \| Q) = \sum_{x} P(x) \log \frac{P(x)}{Q(x)} \]
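
As a quick sanity check of this definition, here is a minimal Python sketch (the `kl_divergence` helper and the example distributions are illustrative, not from the original note); it uses the natural log, so the result is in nats rather than bits:

```python
import math

def kl_divergence(p, q):
    """D_KL(P || Q) for two discrete distributions given as probability
    lists over the same alphabet. Terms with P(x) = 0 contribute nothing."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Made-up example distributions over a three-symbol alphabet.
P = [0.5, 0.3, 0.2]
Q = [0.25, 0.25, 0.5]

print(kl_divergence(P, Q))  # positive, and D_KL(P||Q) != D_KL(Q||P) in general
print(kl_divergence(P, P))  # 0.0, since the two distributions are identical
```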

Gibbs' Inequality states that:

\[ D_{KL}(P \| Q) \ge 0 \]

for any \(P\) and \(Q\).
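
A standard way to see this (a proof sketch added here, not spelled out in the original note) applies Jensen's inequality to the concave function \(\log\), i.e. \(\mathbb{E}[\log Z] \le \log \mathbb{E}[Z]\):

\[
-D_{KL}(P \| Q) = \sum_{x} P(x) \log \frac{Q(x)}{P(x)}
\le \log \sum_{x} P(x) \frac{Q(x)}{P(x)}
= \log \sum_{x} Q(x) = \log 1 = 0
\]

Equality holds exactly when \(Q(x)/P(x)\) is constant over the support of \(P\), i.e. when \(P = Q\).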
