In mathematics, concentration inequalities provide probability bounds on how a random variable deviates from some value (e.g., its expectation). The laws of large numbers of classical probability theory state that sums of independent random variables are, under very mild conditions, close to their expectation with high probability. Such sums are the most basic examples of random variables concentrated around their mean. Recent results show that such behavior is shared by other functions of independent random variables.
Markov's inequality

If $X$ is a nonnegative random variable and $a > 0$, then

$$\Pr(X \ge a) \le \frac{\mathbb{E}[X]}{a}.$$
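As a quick numerical illustration (not part of the original statement), the sketch below estimates $\Pr(X \ge a)$ for an exponential random variable by Monte Carlo and compares it with the Markov bound $\mathbb{E}[X]/a$; the distribution, sample size, and threshold are arbitrary choices.

```python
import random

random.seed(0)

# Monte Carlo check of Markov's inequality for a nonnegative
# random variable: here X ~ Exponential(1), so E[X] = 1.
n = 100_000
samples = [random.expovariate(1.0) for _ in range(n)]

a = 3.0
empirical_tail = sum(x >= a for x in samples) / n   # estimates P(X >= a)
markov_bound = 1.0 / a                              # E[X] / a

# The true tail is e^{-3}, well below the bound 1/3.
assert empirical_tail <= markov_bound
```

Markov's bound is loose here (the true tail is about 0.05 against a bound of 1/3), which is typical: it uses only the mean, not the shape of the distribution.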
We can extend Markov's inequality to a strictly increasing and non-negative function $\Phi$. We have

$$\Pr(X \ge a) = \Pr(\Phi(X) \ge \Phi(a)) \le \frac{\mathbb{E}[\Phi(X)]}{\Phi(a)}.$$
Chebyshev's inequality is the special case of this generalized Markov inequality obtained by applying it to the random variable $|X - \mathbb{E}[X]|$ with $\Phi(x) = x^2$.
Chebyshev's inequality

If $X$ is any random variable and $a > 0$, then

$$\Pr(|X - \mathbb{E}[X]| \ge a) \le \frac{\operatorname{Var}(X)}{a^2},$$

where $\operatorname{Var}(X)$ is the variance of $X$, defined as

$$\operatorname{Var}(X) = \mathbb{E}\big[(X - \mathbb{E}[X])^2\big].$$
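Chebyshev's inequality can likewise be checked by simulation. The sketch below (an illustration with arbitrary parameters) compares the empirical deviation probability for a uniform random variable against $\operatorname{Var}(X)/a^2$.

```python
import random

random.seed(1)

# Monte Carlo check of Chebyshev's inequality for X ~ Uniform(0, 1):
# mean = 1/2, Var(X) = 1/12.
n = 100_000
samples = [random.random() for _ in range(n)]

mean, var = 0.5, 1.0 / 12.0
a = 0.4
empirical = sum(abs(x - mean) >= a for x in samples) / n  # true value is 0.2
bound = var / a**2                                        # about 0.52

assert empirical <= bound
```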
Asymptotic behavior of binomial distribution
Let $X = \sum_{i=1}^n X_i$, where the $X_i$'s are i.i.d. Bernoulli random variables with parameter $p$. Then $X$ follows the binomial distribution with parameters $n$ and $p$. The central limit theorem suggests that when $n \to \infty$, $X$ is approximately normally distributed with mean $np$ and variance $np(1-p)$, and

$$\lim_{n \to \infty} \Pr\!\left(\frac{X - np}{\sqrt{np(1-p)}} \le z\right) = \Phi(z),$$

where $\Phi$ denotes the standard normal distribution function.
For $p = \lambda/n$, where $\lambda > 0$ is a constant, the limiting distribution of the binomial distribution is the Poisson distribution:

$$\lim_{n \to \infty} \Pr(X = k) = \frac{e^{-\lambda}\lambda^k}{k!}.$$
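The Poisson limit can be observed directly by comparing the two probability mass functions as $n$ grows with $np$ held fixed. The sketch below (illustrative; $\lambda = 4$ is an arbitrary choice) measures the largest pointwise gap over a range of $k$.

```python
from math import comb, exp, factorial

# Pointwise convergence of Binomial(n, p) with np = lam fixed
# to Poisson(lam); lam = 4 is an illustrative choice.
lam = 4.0

def binom_pmf(n, p, k):
    return comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(lam, k):
    return exp(-lam) * lam**k / factorial(k)

def max_gap(n):
    # Largest pointwise difference between the two pmfs over k = 0..20.
    p = lam / n
    return max(abs(binom_pmf(n, p, k) - poisson_pmf(lam, k)) for k in range(21))

# The gap shrinks (roughly like 1/n) as n grows.
assert max_gap(10_000) < max_gap(100) < max_gap(10)
```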
General Chernoff inequality
A Chernoff bound gives exponentially decreasing bounds on tail distributions of sums of independent random variables. Let $X_1, \ldots, X_n$ denote independent but not necessarily identically distributed random variables satisfying $X_i \ge \mathbb{E}[X_i] - a_i - M$ for $1 \le i \le n$, and let

$$X = \sum_{i=1}^n X_i.$$

We have the lower tail inequality:

$$\Pr(X \le \mathbb{E}[X] - \lambda) \le \exp\!\left(-\frac{\lambda^2}{2\big(\operatorname{Var}(X) + \sum_{i=1}^n a_i^2 + M\lambda/3\big)}\right).$$

If instead $X_i$ satisfies $X_i \le \mathbb{E}[X_i] + a_i + M$, we have the upper tail inequality:

$$\Pr(X \ge \mathbb{E}[X] + \lambda) \le \exp\!\left(-\frac{\lambda^2}{2\big(\operatorname{Var}(X) + \sum_{i=1}^n a_i^2 + M\lambda/3\big)}\right).$$

If the $X_i$ are i.i.d. with $\mathbb{E}[X_i] = 0$ and $|X_i| \le 1$, and $\sigma^2 = \operatorname{Var}(X)$ is the variance of $X$, a typical version of the Chernoff inequality is:

$$\Pr(|X| \ge k\sigma) \le 2e^{-k^2/4}, \qquad 0 \le k \le 2\sigma.$$
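The i.i.d. version above can be checked empirically. The sketch below (an illustration, assuming mean-zero variables bounded by 1 and writing $\sigma^2$ for the variance of the sum) uses random signs, for which the sum of $n$ terms has variance $n$; the sample sizes are arbitrary.

```python
import random
from math import exp, sqrt

random.seed(2)

# Monte Carlo check of Pr(|X| >= k*sigma) <= 2*exp(-k**2/4)
# for X a sum of n i.i.d. uniform +/-1 signs, so Var(X) = n.
n, trials = 400, 5_000
sigma = sqrt(n)

def sign_sum(n):
    return sum(random.choice((-1, 1)) for _ in range(n))

k = 2.0
hits = sum(abs(sign_sum(n)) >= k * sigma for _ in range(trials))
empirical = hits / trials          # roughly 2*(1 - Phi(2)), about 0.05
chernoff = 2 * exp(-k * k / 4)     # about 0.74

assert empirical <= chernoff
```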
Hoeffding's inequality

Hoeffding's inequality can be stated as follows: let $X_1, \ldots, X_n$ be independent random variables. Assume that the $X_i$ are almost surely bounded; that is, assume for $1 \le i \le n$ that

$$\Pr(X_i \in [a_i, b_i]) = 1.$$

Then, for the empirical mean of these variables,

$$\overline{X} = \frac{1}{n}(X_1 + \cdots + X_n),$$

we have the inequalities (Hoeffding 1963, Theorem 2), valid for any $t \ge 0$:

$$\Pr(\overline{X} - \mathbb{E}[\overline{X}] \ge t) \le \exp\!\left(-\frac{2n^2 t^2}{\sum_{i=1}^n (b_i - a_i)^2}\right),$$

$$\Pr(|\overline{X} - \mathbb{E}[\overline{X}]| \ge t) \le 2\exp\!\left(-\frac{2n^2 t^2}{\sum_{i=1}^n (b_i - a_i)^2}\right).$$
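A numerical sanity check of the two-sided Hoeffding bound (an illustration with arbitrary parameters, using uniform variables so that $a_i = 0$ and $b_i = 1$):

```python
import random
from math import exp

random.seed(3)

# Monte Carlo check of Hoeffding's inequality for X_i ~ Uniform(0, 1):
# Pr(|mean - 1/2| >= t) <= 2*exp(-2*n*t**2).
n, trials, t = 100, 20_000, 0.1

def sample_mean(n):
    return sum(random.random() for _ in range(n)) / n

hits = sum(abs(sample_mean(n) - 0.5) >= t for _ in range(trials))
empirical = hits / trials
hoeffding = 2 * exp(-2 * n * t * t)   # = 2*e^{-2}, about 0.27

assert empirical <= hoeffding
```

The true deviation probability here is well below the bound; Hoeffding's inequality trades sharpness for complete generality over the bounded range.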
Bernstein's inequality

Bernstein inequalities give bounds on the probability that the sum of random variables deviates from its mean. In the simplest case, let $X_1, \ldots, X_n$ be independent random variables taking the values $+1$ and $-1$, each with probability $1/2$; then for every positive $\varepsilon$,

$$\Pr\!\left(\left|\frac{1}{n}\sum_{i=1}^n X_i\right| > \varepsilon\right) \le 2\exp\!\left(-\frac{n\varepsilon^2}{2(1 + \varepsilon/3)}\right).$$
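The simplest-case Bernstein bound above can be checked by simulation; the sketch below is illustrative, with arbitrary $n$ and $\varepsilon$.

```python
import random
from math import exp

random.seed(4)

# Monte Carlo check of the Bernstein bound for X_i uniform on {-1, +1}:
# Pr(|(1/n) * sum X_i| > eps) <= 2*exp(-n*eps**2 / (2*(1 + eps/3))).
n, trials, eps = 200, 10_000, 0.2

def scaled_sum(n):
    return sum(random.choice((-1, 1)) for _ in range(n)) / n

hits = sum(abs(scaled_sum(n)) > eps for _ in range(trials))
empirical = hits / trials
bernstein = 2 * exp(-n * eps**2 / (2 * (1 + eps / 3)))   # about 0.047

assert empirical <= bernstein
```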
Efron–Stein inequality

The Efron–Stein inequality (or influence inequality, or MG bound on variance) bounds the variance of a general function. Suppose that $X_1, \ldots, X_n$ and $X_1', \ldots, X_n'$ are independent, with $X_i'$ and $X_i$ having the same distribution for all $i$. Let $X = (X_1, \ldots, X_n)$ and $X^{(i)} = (X_1, \ldots, X_{i-1}, X_i', X_{i+1}, \ldots, X_n)$. Then

$$\operatorname{Var}(f(X)) \le \frac{1}{2} \sum_{i=1}^n \mathbb{E}\!\left[\big(f(X) - f(X^{(i)})\big)^2\right].$$
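Both sides of the Efron–Stein inequality can be estimated by Monte Carlo for a concrete choice of $f$. The sketch below (an illustration; $f = \max$ and $n = 5$ uniform coordinates are arbitrary choices) resamples one coordinate at a time to form $X^{(i)}$.

```python
import random

random.seed(5)

# Monte Carlo check of the Efron-Stein inequality
# Var(f(X)) <= (1/2) * sum_i E[(f(X) - f(X^{(i)}))**2]
# for f = max and X_i ~ Uniform(0, 1), n = 5.
n, trials = 5, 20_000

f_values = []
es_samples = []
for _ in range(trials):
    x = [random.random() for _ in range(n)]
    fx = max(x)
    f_values.append(fx)
    # Replace each coordinate by an independent copy and
    # accumulate the squared change in f.
    es = 0.0
    for i in range(n):
        xi_prime = random.random()
        fxi = max(x[:i] + [xi_prime] + x[i + 1:])
        es += (fx - fxi) ** 2
    es_samples.append(es / 2)

mean_f = sum(f_values) / trials
variance = sum((v - mean_f) ** 2 for v in f_values) / trials
es_bound = sum(es_samples) / trials

assert variance <= es_bound
```

Here the exact variance of the maximum of five uniforms is $5/252 \approx 0.02$, and the Efron–Stein estimate comes out strictly larger, as the inequality guarantees.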
References

- Chung, Fan. "Old and New Concentration Inequalities" (PDF). Retrieved 2010.
- Hoeffding, Wassily (March 1963). "Probability inequalities for sums of bounded random variables". Journal of the American Statistical Association 58 (301): 13–30.
- Bennett, G. (1962). "Probability Inequalities for the Sum of Independent Random Variables". Journal of the American Statistical Association 57 (297): 33–45. doi:10.2307/2282438. JSTOR 2282438.
- Devroye, Luc; Lugosi, Gábor (2001). Combinatorial methods in density estimation. Springer. p. 11. ISBN 978-0-387-95117-1.
- Fan, X.; Grama, I.; Liu, Q. (2012). "Hoeffding's inequality for supermartingales". Stochastic Processes and their Applications 122: 3545–3559. doi:10.1016/j.spa.2012.06.009.