
Concentration inequality

From Wikipedia, the free encyclopedia


In mathematics, concentration inequalities provide bounds on the probability that a random variable deviates from some value (e.g. its expectation). The laws of large numbers of classical probability theory state that sums of independent random variables are, under very mild conditions, close to their expectation with high probability. Such sums are the most basic examples of random variables concentrated around their mean. Recent results show that such behavior is shared by other functions of independent random variables.

Markov's inequality

If X is any random variable and a > 0, then

$$\Pr(|X| \ge a) \le \frac{\mathbb{E}[|X|]}{a}.$$

To see this, note that $\mathbb{E}[|X|] \ge \mathbb{E}\big[|X|\,\mathbf{1}_{\{|X| \ge a\}}\big] \ge a \Pr(|X| \ge a)$.

We can extend Markov's inequality to a strictly increasing and non-negative function Φ. Since the event {|X| ≥ a} coincides with {Φ(|X|) ≥ Φ(a)}, we have

$$\Pr(|X| \ge a) = \Pr\big(\Phi(|X|) \ge \Phi(a)\big) \le \frac{\mathbb{E}[\Phi(|X|)]}{\Phi(a)}.$$
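
For example, if X is a non-negative random variable with expectation 1, taking Φ(x) = x gives

$$\Pr(X \ge 10) \le \frac{\mathbb{E}[X]}{10} = 0.1,$$

so X exceeds ten times its mean with probability at most 10%, whatever its distribution; nothing beyond non-negativity and a finite mean is needed.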

Chebyshev's inequality

Chebyshev's inequality is a special case of the generalized Markov inequality, obtained by taking Φ(x) = x² and applying the bound to the random variable X − E[X].

If X is any random variable with finite variance and a > 0, then

$$\Pr\big(|X - \mathbb{E}[X]| \ge a\big) \le \frac{\operatorname{Var}(X)}{a^2},$$

where Var(X) is the variance of X, defined as

$$\operatorname{Var}(X) = \mathbb{E}\big[(X - \mathbb{E}[X])^2\big].$$
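
For example, taking a = kσ with σ² = Var(X) and k > 0 gives the distribution-free bound

$$\Pr\big(|X - \mathbb{E}[X]| \ge k\sigma\big) \le \frac{1}{k^2},$$

so a deviation of at least three standard deviations from the mean has probability at most 1/9, regardless of the distribution of X.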

Asymptotic Behavior of the Binomial Distribution

Suppose a random variable X follows the binomial distribution with parameters n and p. The probability of getting exactly k successes in n trials is given by the probability mass function

$$f(k; n, p) = \Pr(X = k) = \binom{n}{k} p^k (1-p)^{n-k}.$$

Let X = X1 + ⋯ + Xn, where the Xi are i.i.d. Bernoulli random variables with parameter p, so that X follows the binomial distribution with parameters n and p. The central limit theorem suggests that, as n → ∞, X is approximately normally distributed with mean np and variance np(1 − p), and

$$\lim_{n \to \infty} \Pr\!\left(a \le \frac{X - np}{\sqrt{np(1-p)}} \le b\right) = \int_a^b \frac{1}{\sqrt{2\pi}}\, e^{-x^2/2} \, dx.$$
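
For instance, with n = 100 and p = 0.5 the mean is 50 and the standard deviation is 5, so the normal approximation gives

$$\Pr(X \ge 60) \approx 1 - \Phi\!\left(\frac{60 - 50}{5}\right) = 1 - \Phi(2) \approx 0.023,$$

where Φ denotes the standard normal cumulative distribution function.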

For p = λ/n, where λ > 0 is a constant, the limiting distribution of the binomial distribution is the Poisson distribution:

$$\lim_{n \to \infty} \Pr(X = k) = \frac{\lambda^k e^{-\lambda}}{k!}.$$
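
As an illustration, take n = 100 and p = 0.02, so that λ = np = 2. The exact probability of zero successes is $(0.98)^{100} \approx 0.133$, while the Poisson limit gives

$$\Pr(X = 0) \approx e^{-2} \approx 0.135.$$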

General Chernoff Inequality

The Chernoff bound gives exponentially decreasing bounds on the tail distributions of sums of independent random variables. Let X1, …, Xn denote independent random variables and write X = X1 + ⋯ + Xn.

If the Xi satisfy Xi ≥ E[Xi] − M for 1 ≤ i ≤ n, we have the lower tail inequality:

$$\Pr\big(X \le \mathbb{E}[X] - \lambda\big) \le \exp\!\left(-\frac{\lambda^2}{2\big(\operatorname{Var}(X) + M\lambda/3\big)}\right).$$

If the Xi satisfy Xi ≤ E[Xi] + M for 1 ≤ i ≤ n, we have the upper tail inequality:

$$\Pr\big(X \ge \mathbb{E}[X] + \lambda\big) \le \exp\!\left(-\frac{\lambda^2}{2\big(\operatorname{Var}(X) + M\lambda/3\big)}\right).$$

If the Xi are i.i.d. with |Xi − E[Xi]| ≤ 1 and σ² is the variance of Xi, a typical version of the Chernoff inequality is:

$$\Pr\big(|X - \mathbb{E}[X]| \ge k\sigma\sqrt{n}\big) \le 2 e^{-k^2/4} \quad \text{for } 0 \le k \le 2\sigma\sqrt{n}.$$
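
For example, if the Xi are independent Rademacher variables (taking the values +1 and −1 with probability 1/2 each), then E[X] = 0, σ² = 1, and |Xi − E[Xi]| = 1, so with n = 100 and k = 4 the bound gives

$$\Pr\big(|X| \ge 40\big) \le 2 e^{-4} \approx 0.037.$$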

Hoeffding's Inequality

Hoeffding's inequality can be stated as follows:

Let X1, …, Xn be independent random variables. Assume that the Xi are almost surely bounded; that is, assume for 1 ≤ i ≤ n that

$$\Pr(X_i \in [a_i, b_i]) = 1.$$

Then, for the empirical mean of these variables,

$$\overline{X} = \frac{X_1 + \cdots + X_n}{n},$$

we have the inequalities (Hoeffding 1963, Theorem 2 [1]):

$$\Pr\big(\overline{X} - \mathbb{E}[\overline{X}] \ge t\big) \le \exp\!\left(-\frac{2 n^2 t^2}{\sum_{i=1}^n (b_i - a_i)^2}\right),$$

$$\Pr\big(|\overline{X} - \mathbb{E}[\overline{X}]| \ge t\big) \le 2\exp\!\left(-\frac{2 n^2 t^2}{\sum_{i=1}^n (b_i - a_i)^2}\right).$$
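
For example, if each Xi takes values in [0, 1], then each bi − ai = 1, the sum in the denominator equals n, and the two-sided bound becomes

$$\Pr\big(|\overline{X} - \mathbb{E}[\overline{X}]| \ge t\big) \le 2 e^{-2 n t^2};$$

with n = 100 and t = 0.1 the right-hand side is 2e^{−2} ≈ 0.27.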


Bennett's Inequality

Let X1, …, Xn be independent random variables, and assume (for simplicity but without loss of generality) that they all have zero expected value. Further assume |Xi| ≤ a almost surely for all i, and let

$$\sigma^2 = \frac{1}{n} \sum_{i=1}^n \mathbb{E}\big[X_i^2\big].$$

Then for any t ≥ 0,

$$\Pr\!\left(\sum_{i=1}^n X_i > t\right) \le \exp\!\left(-\frac{n\sigma^2}{a^2}\, h\!\left(\frac{at}{n\sigma^2}\right)\right),$$

where h(u) = (1 + u)log(1 + u) – u.[2]
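
Since h(u) = u²/2 − u³/6 + ⋯ for small u, the exponent is approximately −t²/(2nσ²) when at is small compared with nσ²:

$$\exp\!\left(-\frac{n\sigma^2}{a^2}\, h\!\left(\frac{at}{n\sigma^2}\right)\right) \approx \exp\!\left(-\frac{n\sigma^2}{a^2} \cdot \frac{1}{2}\left(\frac{at}{n\sigma^2}\right)^2\right) = \exp\!\left(-\frac{t^2}{2 n\sigma^2}\right),$$

so in this regime the bound matches the Gaussian tail behavior suggested by the central limit theorem.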


References

  1. ^ Hoeffding, Wassily (1963). "Probability inequalities for sums of bounded random variables". Journal of the American Statistical Association 58 (301): 13–30.
  2. ^ Devroye, Luc; Lugosi, Gábor (2001). Combinatorial methods in density estimation. Springer. p. 11. ISBN 9780387951171.