==General Chernoff Inequality==


[[Chernoff bound]] gives exponentially decreasing bounds on tail distributions of sums of independent random variables. Let <math>X_i</math> denote independent random variables satisfying <math>X_i \geq E(X_i)-a_i-M</math>, for <math>1 \leq i \leq n</math>. Writing <math>X = \sum_{i=1}^n X_i</math>, we have the lower tail inequality:


: <math>
\Pr[X \leq E(X)-\lambda]\leq e^{-\frac{\lambda^2}{2(Var(X)+\sum_{i=1}^n a_i^2+M\lambda/3)}}
</math>
If instead <math>X_i \leq E(X_i)+a_i+M</math> for <math>1 \leq i \leq n</math>, we have the upper tail inequality:

: <math>
\Pr[X \geq E(X)+\lambda]\leq e^{-\frac{\lambda^2}{2(Var(X)+\sum_{i=1}^n a_i^2+M\lambda/3)}}
</math>
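
For instance, taking <math>a_i = 0</math> for all <math>i</math> and <math>M = 1</math> (an illustrative specialization, so that each <math>X_i \leq E(X_i) + 1</math>), the upper tail bound becomes

: <math>
\Pr[X \geq E(X)+\lambda]\leq e^{-\frac{\lambda^2}{2(Var(X)+\lambda/3)}},
</math>

which has the same form as [[Bernstein inequalities (probability theory)|Bernstein's inequality]].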

If the <math>X_i</math> are ''i.i.d.'' with <math>E(X_i) = 0</math> and <math>|X_i| \leq 1</math>, and <math>\sigma^2</math> denotes the variance of <math>X</math>, a typical version of the Chernoff inequality is:
: <math>
\Pr[|X| \geq k\sigma]\leq 2e^{-k^2/4}
</math>
for <math>0 \leq k \leq 2\sigma</math>.
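
For example, choosing <math>k = 4</math> (a value picked purely for illustration) gives

: <math>
\Pr[|X| \geq 4\sigma]\leq 2e^{-4^2/4} = 2e^{-4} \approx 0.037,
</math>

so a deviation of four standard deviations occurs with probability below 4%.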

==Hoeffding's Inequality==

[[Hoeffding's inequality]] can be stated as follows:

Assume that <math>X_1, \dots, X_n \!</math> are independent and that the <math>X_i</math> are [[almost sure]]ly bounded; that is, assume for <math>1 \leq i \leq n</math> that
:<math>\Pr(X_i \in [a_i, b_i]) = 1. \!</math>

Then, for the empirical mean of these variables

:<math>\overline{X} = (X_1 + \cdots + X_n)/n \!</math>

we have the inequalities (Hoeffding 1963, Theorem 2 <ref>Wassily Hoeffding, Probability inequalities for sums of bounded random variables, ''Journal of the American Statistical Association'' '''58''' (301): 13&ndash;30, March 1963. ([http://links.jstor.org/sici?sici=0162-1459%28196303%2958%3A301%3C13%3APIFSOB%3E2.0.CO%3B2-D JSTOR])
</ref>):

:<math>\Pr(\overline{X} - \mathrm{E}[\overline{X}] \geq t) \leq \exp \left( - \frac{2t^2n^2}{\sum_{i=1}^n (b_i - a_i)^2} \right),\!</math>
:<math>\Pr(|\overline{X} - \mathrm{E}[\overline{X}]| \geq t) \leq 2\exp \left( - \frac{2t^2n^2}{\sum_{i=1}^n (b_i - a_i)^2} \right).\!</math>
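
For example, if each <math>X_i</math> takes values in <math>[0,1]</math> (an illustrative special case, so <math>b_i - a_i = 1</math>), the two-sided bound simplifies to

: <math>\Pr(|\overline{X} - \mathrm{E}[\overline{X}]| \geq t) \leq 2e^{-2nt^2},\!</math>

and with <math>n = 100</math> and <math>t = 0.1</math> the right-hand side is <math>2e^{-2} \approx 0.27</math>.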



In mathematics, concentration inequalities provide bounds on the probability that a random variable deviates from some value (e.g. its expectation). The laws of large numbers of classical probability theory state that sums of independent random variables are, under very mild conditions, close to their expectation with large probability. Such sums are the most basic examples of random variables concentrated around their mean. Recent results show that such behavior is shared by other functions of independent random variables.

==Markov's inequality==

If <math>X</math> is any random variable and <math>a > 0</math>, then

: <math>\Pr(|X| \geq a) \leq \frac{E(|X|)}{a}.</math>

A proof can be found at [[Markov's inequality]].

We can extend Markov's inequality to a strictly increasing and non-negative function <math>\Phi</math>. We have

: <math>\Pr(|X| \geq a) = \Pr(\Phi(|X|) \geq \Phi(a)) \leq \frac{E(\Phi(|X|))}{\Phi(a)}.</math>
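
For example, if <math>X</math> is non-negative with <math>E(X) = 1</math> (numbers chosen purely for illustration), then Markov's inequality gives <math>\Pr(X \geq 10) \leq \tfrac{1}{10}</math>.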

==Chebyshev's inequality==

[[Chebyshev's inequality]] is a special case of the generalized Markov's inequality above, obtained by applying it with <math>\Phi(x) = x^2</math> to the variable <math>|X - E(X)|</math>:

If <math>X</math> is any random variable and <math>a > 0</math>, then

: <math>\Pr(|X - E(X)| \geq a) \leq \frac{Var(X)}{a^2},</math>

where <math>Var(X)</math> is the variance of <math>X</math>, defined as

: <math>Var(X) = E[(X - E(X))^2].</math>
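
For example (values chosen only for illustration), if <math>Var(X) = 1</math>, then <math>\Pr(|X - E(X)| \geq 3) \leq \tfrac{1}{9} \approx 0.11</math>; for a standard normal random variable the same event has probability about <math>0.003</math>, which illustrates that Chebyshev's bound is crude but requires no assumptions beyond a finite variance.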

==Asymptotic Behavior of Binomial Distribution==

If a random variable <math>X</math> follows the [[binomial distribution]] with parameters <math>n</math> and <math>p</math>, the probability of getting exactly <math>k</math> successes in <math>n</math> trials is given by the probability mass function

: <math> f(k;n,p) = \Pr(X = k) = {n\choose k}p^k(1-p)^{n-k}. </math>

Let <math>X = \sum_{i=1}^n X_i</math>, where the <math>X_i</math> are ''i.i.d.'' Bernoulli random variables with parameter <math>p</math>, so that <math>X</math> follows the binomial distribution with parameters <math>n</math> and <math>p</math>. The [[central limit theorem]] suggests that when <math>n \to \infty</math>, <math>X</math> is approximately normally distributed with mean <math>np</math> and variance <math>np(1-p)</math>, and

: <math> \lim_{n\to\infty} \Pr\left( \frac{X - np}{\sqrt{np(1-p)}} \le z \right) = \Pr(Z \le z) </math>

for every fixed <math>z</math>, where <math>Z</math> is a standard normal random variable.

For <math>p = \lambda/n</math>, where <math>\lambda > 0</math> is a constant, the limit distribution of the binomial distribution is the [[Poisson distribution]] with parameter <math>\lambda</math>.
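
For example (with parameter values chosen purely for illustration), if <math>n = 100</math> and <math>p = 1/2</math>, then <math>X</math> is approximately normal with mean <math>50</math> and variance <math>25</math>, so the normal approximation gives

: <math> \Pr(X \geq 60) \approx \Pr(Z \geq 2) \approx 0.023. </math>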


==References==

{{reflist}}

[[Category:Mathematics]]