Entropy
Definitions
The Shannon information content of an outcome $x$: $h(x) = \log_2 \dfrac{1}{P(x)}$, measured in bits.
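As a minimal numeric sketch (the helper name `information_content` is illustrative, not from the source):

```python
import math

def information_content(p: float) -> float:
    """Shannon information content h(x) = log2(1/P(x)), in bits."""
    return math.log2(1.0 / p)

# An outcome of probability 1/8 carries 3 bits.
print(information_content(1 / 8))  # 3.0
```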
The entropy of an ensemble $X$ is the average Shannon information content of an outcome: $H(X) = \sum_x P(x) \log_2 \dfrac{1}{P(x)}$.
Entropy is 0 when the outcome is deterministic, and maximized, with value $H(X) = \log_2 |\mathcal{A}_X|$, when the distribution over the alphabet $\mathcal{A}_X$ is uniform.
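A short sketch of the entropy of a finite distribution, assuming it is given as a plain list of probabilities (the `entropy` helper is hypothetical); zero-probability terms are skipped under the usual convention $0 \cdot \log 1/0 = 0$:

```python
import math

def entropy(probs: list[float]) -> float:
    """H(X) = sum over x of P(x) log2(1/P(x)), in bits."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

print(entropy([1.0, 0.0, 0.0]))           # 0.0 -- deterministic outcome
print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 -- uniform over 4 outcomes: log2(4)
```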
The joint entropy of two ensembles $X, Y$: $H(X, Y) = \sum_{x, y} P(x, y) \log_2 \dfrac{1}{P(x, y)}$.
Entropy is additive if the ensembles are independent: $H(X, Y) = H(X) + H(Y)$ iff $P(x, y) = P(x)\,P(y)$.
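A sketch checking additivity, assuming the joint distribution of two independent ensembles is built as the product of the marginals (all names hypothetical):

```python
import math

def entropy(probs) -> float:
    # H = sum of p * log2(1/p), skipping zero-probability terms
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

px = [0.5, 0.5]    # marginal distribution of X
py = [0.25, 0.75]  # marginal distribution of Y

# Independence: P(x, y) = P(x) P(y), so the joint is the outer product.
pxy = [p * q for p in px for q in py]

print(entropy(pxy))               # H(X, Y) ~= 1.811
print(entropy(px) + entropy(py))  # H(X) + H(Y) -- same value
```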
Entropy is decomposable: $H(p_1, p_2, \ldots, p_I) = H(p_1, 1 - p_1) + (1 - p_1)\, H\!\left(\dfrac{p_2}{1 - p_1}, \ldots, \dfrac{p_I}{1 - p_1}\right)$.
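A numeric check of this decomposition under the same assumptions (re-defining the illustrative `entropy` helper so the snippet stands alone):

```python
import math

def entropy(probs) -> float:
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

p = [0.5, 0.25, 0.25]
p1 = p[0]
rest = [q / (1 - p1) for q in p[1:]]  # renormalize the remaining outcomes

lhs = entropy(p)
rhs = entropy([p1, 1 - p1]) + (1 - p1) * entropy(rest)
print(lhs, rhs)  # 1.5 1.5 -- both sides agree
```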