# Gibbs' Inequality

The relative entropy, or Kullback-Leibler (KL) divergence, between two probability distributions $$P(x)$$ and $$Q(x)$$ defined over the same alphabet $$\mathcal{A}_X$$ is:

$$D_{\textrm{KL}}(P||Q) = \sum_{x} P(x) \log \frac{P(x)}{Q(x)}$$

Gibbs' inequality states that:

$$D_{\textrm{KL}}(P||Q) \ge 0$$

for any $$P$$ and $$Q$$, with equality if and only if $$P = Q$$.
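A minimal numerical sketch of the definition and the inequality, measuring the divergence in bits (base-2 logarithm) and using the standard convention that terms with $$P(x) = 0$$ contribute zero; the function name and example distributions are illustrative, not from the text:

```python
import math

def kl_divergence(p, q):
    """D_KL(P||Q) in bits, for two distributions over the same alphabet.

    Assumes p and q are sequences of probabilities summing to 1, and that
    q[i] > 0 wherever p[i] > 0 (otherwise the divergence is infinite).
    Terms with p[i] == 0 contribute 0, by the convention 0 log 0 = 0.
    """
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Two distributions over a 3-symbol alphabet.
p = [0.5, 0.25, 0.25]
q = [1/3, 1/3, 1/3]

print(kl_divergence(p, q))  # positive, since P differs from Q
print(kl_divergence(p, p))  # 0.0, the equality case P = Q
```

Note that the divergence is not symmetric: $$D_{\textrm{KL}}(P||Q)$$ generally differs from $$D_{\textrm{KL}}(Q||P)$$, so it is not a distance in the metric sense.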