
Gibbs' Inequality

The relative entropy, or KL divergence, between two probability distributions \(P(x)\) and \(Q(x)\) defined over the same alphabet \(\mathcal{A}_X\) is:

\begin{equation} D_{\textrm{KL}}(P||Q) = \sum_{x} P(x) \log \frac{P(x)}{Q(x)} \end{equation}
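As a quick numerical check, here is a minimal sketch of this sum (assuming NumPy, the natural logarithm, and two hypothetical distributions `p` and `q` over a three-symbol alphabet; a different log base only rescales the value):

```python
import numpy as np

def kl_divergence(p, q):
    """D_KL(P || Q) for discrete distributions given as probability vectors."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    # Terms with P(x) = 0 contribute 0 by the convention 0 log 0 = 0.
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

p = [0.5, 0.25, 0.25]
q = [1/3, 1/3, 1/3]
print(kl_divergence(p, q))  # positive, consistent with Gibbs' inequality
print(kl_divergence(p, p))  # 0.0, since P = Q
```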

Gibbs' inequality states that:

\begin{equation} D_{\textrm{KL}}(P||Q) \ge 0 \end{equation}

for any \(P\) and \(Q\), with equality if and only if \(P = Q\).
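
One standard way to see this is a sketch via Jensen's inequality, using the concavity of the logarithm and summing only over \(x\) with \(P(x) > 0\):

\begin{equation} -D_{\textrm{KL}}(P||Q) = \sum_{x} P(x) \log \frac{Q(x)}{P(x)} \le \log \sum_{x} P(x) \frac{Q(x)}{P(x)} = \log \sum_{x} Q(x) \le \log 1 = 0 \end{equation}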
