
Entropy

tags
Information Theory, Gibbs’ Inequality

Definitions

The Shannon information content of an outcome x, measured in bits, is defined to be:

\[ h(x) = \log_2 \frac{1}{P(x)} \]
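For example, observing heads from a fair coin (probability 1/2) yields

\[ h(\text{heads}) = \log_2 \frac{1}{1/2} = 1 \text{ bit}, \]

while an outcome with probability 1/1024 yields 10 bits: less probable outcomes carry more information.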

The entropy of an ensemble X is defined to be the average Shannon information content of an outcome:

\[ H(X) \equiv \sum_{x \in \mathcal{A}_X} P(x) \log \frac{1}{P(x)} \]

Entropy is 0 when the outcome is deterministic, and is maximized with value \(\log |\mathcal{A}_X|\) when the outcomes are uniformly distributed.
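A minimal Python sketch of this formula, checking the two extremes (the `entropy` helper name is just for illustration):

```python
import math

def entropy(probs, base=2):
    # Shannon entropy in bits (base 2); terms with P(x) = 0 contribute nothing.
    return sum(p * math.log(1.0 / p, base) for p in probs if p > 0)

print(entropy([1.0, 0.0, 0.0]))           # deterministic outcome -> 0.0
print(entropy([0.25, 0.25, 0.25, 0.25]))  # uniform over 4 outcomes -> 2.0 = log2(4)
```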

The joint entropy of two ensembles X,Y is:

\[ H(X,Y) \equiv \sum_{x,y \in \mathcal{A}_X \mathcal{A}_Y} P(x,y) \log \frac{1}{P(x,y)} \]

Entropy is additive if the ensembles are independent:

\[ H(X,Y) = H(X) + H(Y) \]
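A quick numerical check of additivity under independence, sketched in Python with the same kind of entropy helper as above (the distributions are arbitrary examples):

```python
import math

def entropy(probs, base=2):
    return sum(p * math.log(1.0 / p, base) for p in probs if p > 0)

# Independent ensembles: P(x, y) = P(x) P(y).
px = [0.5, 0.25, 0.25]
py = [0.7, 0.3]
pxy = [p * q for p in px for q in py]

print(entropy(pxy))               # H(X, Y)
print(entropy(px) + entropy(py))  # H(X) + H(Y), same value
```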

Entropy is decomposable: it can be computed in stages, by first considering whether the first outcome occurs and then the distribution over the remaining outcomes given that it does not:

\[ H(p_1, p_2, \ldots, p_I) = H(p_1, 1 - p_1) + (1 - p_1)\, H\!\left(\frac{p_2}{1 - p_1}, \ldots, \frac{p_I}{1 - p_1}\right) \]
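A numerical illustration of the grouping identity above, again as a Python sketch:

```python
import math

def entropy(probs, base=2):
    return sum(p * math.log(1.0 / p, base) for p in probs if p > 0)

p = [0.5, 0.25, 0.25]
p1 = p[0]
rest = [q / (1 - p1) for q in p[1:]]  # remaining outcomes, renormalized

print(entropy(p))                                        # 1.5 bits
print(entropy([p1, 1 - p1]) + (1 - p1) * entropy(rest))  # also 1.5 bits
```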