In information theory, Pinsker's inequality, named after its inventor Mark Semenovich Pinsker, is an inequality that bounds the total variation distance (or statistical distance) in terms of the Kullback-Leibler divergence. The inequality is tight up to constant factors.
Pinsker's inequality states that, if $P$ and $Q$ are two probability distributions on a measurable space $(X, \Sigma)$, then

$$\delta(P, Q) \le \sqrt{\tfrac{1}{2} D_{\mathrm{KL}}(P \parallel Q)},$$

where

$$\delta(P, Q) = \sup\{\, |P(A) - Q(A)| : A \in \Sigma \,\}$$

is the total variation distance (or statistical distance) between $P$ and $Q$, and $D_{\mathrm{KL}}(P \parallel Q)$ is the Kullback–Leibler divergence in nats.
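For distributions on a finite set both quantities can be computed directly, so the inequality is easy to check numerically. The following is a minimal sketch for two distributions on a three-point space; the helper names `total_variation` and `kl_divergence` are illustrative, not from the source.

```python
import numpy as np

def total_variation(p, q):
    # Total variation distance between discrete distributions:
    # delta(P, Q) = (1/2) * sum_i |p_i - q_i|, which equals sup_A |P(A) - Q(A)|.
    return 0.5 * np.abs(p - q).sum()

def kl_divergence(p, q):
    # Kullback-Leibler divergence D_KL(P || Q) in nats;
    # terms with p_i = 0 contribute 0 by convention.
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

# Two illustrative distributions on {0, 1, 2}.
p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])

delta = total_variation(p, q)
bound = np.sqrt(0.5 * kl_divergence(p, q))
print(f"delta(P, Q) = {delta:.4f} <= sqrt(D_KL/2) = {bound:.4f}")
assert delta <= bound  # Pinsker's inequality
```

Here $\delta(P, Q) = 0.1$ while $\sqrt{D_{\mathrm{KL}}(P \parallel Q)/2} \approx 0.112$, consistent with the bound.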
An inverse of the inequality cannot hold: for every $\varepsilon > 0$ there are distributions $P_\varepsilon, Q$ with $\delta(P_\varepsilon, Q) \le \varepsilon$ but $D_{\mathrm{KL}}(P_\varepsilon \parallel Q) = \infty$.
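One standard witness (an illustration, not spelled out in the text above) takes $Q$ to be a point mass and $P_\varepsilon$ a slight perturbation of it: the total variation distance is $\varepsilon$, but $P_\varepsilon$ puts mass $\varepsilon$ on an outcome to which $Q$ assigns probability zero, so the divergence is infinite.

```python
import numpy as np

eps = 1e-3
p = np.array([1 - eps, eps])  # P_eps: Bernoulli(eps)
q = np.array([1.0, 0.0])      # Q: point mass at 0

delta = 0.5 * np.abs(p - q).sum()  # equals eps

# D_KL(P_eps || Q): P_eps puts mass eps where Q has probability 0,
# so the divergence is infinite. The where() guards the p_i = 0 convention.
with np.errstate(divide="ignore"):
    kl = np.sum(np.where(p > 0, p * np.log(p / q), 0.0))

print(f"delta = {delta}, D_KL = {kl}")  # delta = 0.001, D_KL = inf
```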