# Gibbs' Inequality

The relative entropy, or KL divergence, between two probability distributions $$P(x)$$ and $$Q(x)$$ defined over the same alphabet $$\mathcal{A}_X$$ is:

$$D_{\textrm{KL}}(P||Q) = \sum_{x} P(x) \log \frac{P(x)}{Q(x)}$$
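To make the definition concrete, here is a minimal Python sketch of this sum for discrete distributions. The function name, the base-2 logarithm (so the result is in bits), and the example distributions are choices of this illustration, not part of the definition above:

```python
import numpy as np

def kl_divergence(p, q):
    """Compute D_KL(P||Q) for discrete distributions given as arrays.

    Assumes p and q are valid probability vectors over the same
    alphabet and that q(x) > 0 wherever p(x) > 0.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0  # terms with P(x) = 0 contribute 0 by convention
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

p = [0.5, 0.25, 0.25]
q = [0.25, 0.5, 0.25]
print(kl_divergence(p, q))  # 0.25 bits
```

With these inputs the two nonzero terms are $$0.5 \log_2 2 = 0.5$$ and $$0.25 \log_2 \tfrac{1}{2} = -0.25$$, giving $$0.25$$ bits in total.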

Gibbs' inequality states that:

$$D_{\textrm{KL}}(P||Q) \ge 0$$

for any $$P$$ and $$Q$$, with equality if and only if $$P = Q$$.
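Why is this true? One standard argument, sketched here for completeness, applies Jensen's inequality to the concave logarithm:

$$-D_{\textrm{KL}}(P||Q) = \sum_{x} P(x) \log \frac{Q(x)}{P(x)} \le \log \sum_{x} P(x) \frac{Q(x)}{P(x)} = \log \sum_{x} Q(x) \le \log 1 = 0$$

where the sums run over the support of $$P$$. Equality in Jensen's step requires $$Q(x)/P(x)$$ to be constant across that support, which forces $$P = Q$$.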
