Pinsker's inequality

In information theory, Pinsker's inequality, named after its inventor Mark Semenovich Pinsker, is an inequality that bounds the total variation distance (or statistical distance) in terms of the Kullback–Leibler divergence. The inequality is tight up to constant factors.[1]

Pinsker's inequality states that, if P and Q are two probability distributions, then

\delta(P,Q) \le \sqrt{\frac{1}{2} D_{\mathrm{KL}}(P\|Q)}

where

\delta(P,Q) = \sup \{ |P(A) - Q(A)| : A \text{ is an event to which probabilities are assigned} \}

is the total variation distance (or statistical distance) between P and Q and

D_{\mathrm{KL}}(P\|Q) = \sum_i P(i) \ln\left(\frac{P(i)}{Q(i)}\right)

is the Kullback–Leibler divergence in nats.
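
For discrete distributions on a finite alphabet, both sides of the inequality are straightforward to compute, using the identity \delta(P,Q) = \tfrac{1}{2}\sum_i |P(i)-Q(i)| for the total variation distance. The following Python sketch checks the bound on a pair of illustrative distributions (the values of P and Q are hypothetical, chosen only for the demonstration):

    import numpy as np

    # Illustrative distributions on a common three-letter alphabet
    # (hypothetical values, not taken from the article).
    P = np.array([0.5, 0.3, 0.2])
    Q = np.array([0.4, 0.4, 0.2])

    # Total variation distance: sup_A |P(A) - Q(A)| = (1/2) * sum_i |P(i) - Q(i)|
    tv = 0.5 * np.abs(P - Q).sum()

    # Kullback-Leibler divergence in nats: sum_i P(i) * ln(P(i) / Q(i))
    kl = np.sum(P * np.log(P / Q))

    print(tv)                        # 0.1
    print(np.sqrt(0.5 * kl))         # approximately 0.112
    assert tv <= np.sqrt(0.5 * kl)   # Pinsker's bound holds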

Pinsker first proved the inequality with a worse constant. The inequality in the above form was proved independently by Kullback, Csiszár, and Kemperman.[2]

A converse of the inequality cannot hold: for every \epsilon > 0, there are distributions P and Q with \delta(P,Q)\le\epsilon but D_{\mathrm{KL}}(P\|Q) = \infty.[3]
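
A simple example illustrating this: on a two-point space, take P = (1-\epsilon, \epsilon) and Q = (1, 0), so that \delta(P,Q) = \epsilon, while

D_{\mathrm{KL}}(P\|Q) = (1-\epsilon)\ln\left(\frac{1-\epsilon}{1}\right) + \epsilon\ln\left(\frac{\epsilon}{0}\right) = \infty

because Q assigns probability zero to an outcome to which P assigns positive probability.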

References

  1. ^ Csiszár, Imre; Körner, János (2011). Information Theory: Coding Theorems for Discrete Memoryless Systems. Cambridge University Press. p. 44. ISBN 9781139499989. 
  2. ^ Tsybakov, Alexandre (2009). Introduction to Nonparametric Estimation. Springer. p. 132. ISBN 9780387790527. 
  3. ^ The divergence becomes infinite whenever one of the two distributions assigns probability zero to an event while the other assigns it a nonzero probability (no matter how small); see e.g. Basu, Mitra; Ho, Tin Kam (2006). Data Complexity in Pattern Recognition. Springer. p. 161. ISBN 9781846281723.

Additional reading

  • Thomas M. Cover and Joy A. Thomas: Elements of Information Theory, 2nd edition, Wiley-Interscience, 2006
  • Nicolò Cesa-Bianchi and Gábor Lugosi: Prediction, Learning, and Games, Cambridge University Press, 2006