In probability theory, Bernstein inequalities give bounds on the probability that the sum of random variables deviates from its mean. In the simplest case, let X_1, \ldots, X_n be independent Bernoulli random variables taking values +1 and −1 with probability 1/2 (this distribution is also known as the Rademacher distribution), then for every positive \varepsilon,

\mathbb{P}\left( \left| \frac{1}{n} \sum_{i=1}^n X_i \right| > \varepsilon \right) \le 2 \exp\left( - \frac{n \varepsilon^2}{2 (1 + \varepsilon/3)} \right).
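A minimal Monte Carlo sketch of this bound (assuming NumPy is available; the sample size, deviation level, and number of trials are illustrative choices, not part of the original statement):

import numpy as np

rng = np.random.default_rng(0)
n, eps, trials = 200, 0.2, 100_000   # illustrative parameters

# Rademacher samples: +1 or -1, each with probability 1/2.
samples = rng.choice([-1, 1], size=(trials, n))
means = samples.mean(axis=1)

empirical = np.mean(np.abs(means) > eps)               # observed tail frequency
bound = 2 * np.exp(-n * eps**2 / (2 * (1 + eps / 3)))  # bound displayed above
print(f"empirical tail: {empirical:.4f}  Bernstein bound: {bound:.4f}")

The empirical frequency should not exceed the bound (up to Monte Carlo noise), illustrating that the inequality is valid but not tight in this regime.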
1. Let X_1, \ldots, X_n be independent zero-mean random variables. Suppose that |X_i| \le M almost surely, for all i. Then, for all positive t,

\mathbb{P}\left( \sum_{i=1}^n X_i \ge t \right) \le \exp\left( - \frac{t^2/2}{\sum_{i=1}^n \mathbb{E}\left[X_i^2\right] + M t/3} \right).
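A short sketch of how the right-hand side of inequality (1) can be evaluated numerically (the function name and example numbers are hypothetical, introduced here only for illustration):

import math

def bernstein_tail_bound(t, variance_sum, M):
    """Right-hand side of inequality (1): a bound on P(X_1 + ... + X_n >= t)
    when the X_i are independent, zero-mean, |X_i| <= M almost surely,
    and their variances sum to variance_sum."""
    return math.exp(-(t ** 2 / 2) / (variance_sum + M * t / 3))

# Illustrative use: 100 variables bounded by M = 1, each with variance 0.25.
print(bernstein_tail_bound(t=20.0, variance_sum=25.0, M=1.0))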
2. Let X_1, \ldots, X_n be independent zero-mean random variables. Suppose that for some positive real L and every integer k \ge 2,

\mathbb{E}\left[ |X_i|^k \right] \le \tfrac{1}{2}\, \mathbb{E}\left[ X_i^2 \right] L^{k-2} k!

Then

\mathbb{P}\left( \sum_{i=1}^n X_i \ge 2 t \sqrt{\sum_{i=1}^n \mathbb{E}\left[X_i^2\right]} \right) < \exp\left(-t^2\right), \qquad \text{for } 0 < t \le \frac{1}{2L} \sqrt{\sum_{i=1}^n \mathbb{E}\left[X_i^2\right]}.
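As a worked connection between (1) and (2) (a consistency check, not part of the original statements): if |X_i| \le M almost surely as in (1), then the moment condition of (2) holds with L = M, since for every integer k \ge 2

\mathbb{E}\left[ |X_i|^k \right] \le M^{k-2}\, \mathbb{E}\left[ X_i^2 \right] \le \tfrac{1}{2}\, \mathbb{E}\left[ X_i^2 \right] M^{k-2} k!,

using k! \ge 2.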
3. Let X_1, \ldots, X_n be independent zero-mean random variables. Suppose that

\mathbb{E}\left[ |X_i|^k \right] \le \frac{k!}{4!} \left( \frac{L}{5} \right)^{k-4}

for all integer k \ge 4. Denote

A_k = \sum_{i=1}^n \mathbb{E}\left[ X_i^k \right].

Then,

\mathbb{P}\left( \left| \sum_{i=1}^n X_i - \frac{A_3 t^2}{3} \right| \ge \sqrt{2 A_2}\, t \left( 1 + \frac{A_4 t^2}{6 A_2^2} \right) \right) < 2 \exp\left(-t^2\right), \qquad \text{for } 0 < t \le \frac{5 \sqrt{2 A_2}}{4 L}.
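An illustrative sketch that simply transcribes the quantities appearing in inequality (3); the function name and the moment values in the example are hypothetical:

import math

def bernstein_ineq3_terms(A2, A3, A4, L, t):
    """Centring term A3*t^2/3, deviation radius sqrt(2*A2)*t*(1 + A4*t^2/(6*A2^2)),
    and whether t lies in the admissible range 0 < t <= 5*sqrt(2*A2)/(4*L)."""
    centre = A3 * t**2 / 3
    radius = math.sqrt(2 * A2) * t * (1 + A4 * t**2 / (6 * A2**2))
    valid = 0 < t <= 5 * math.sqrt(2 * A2) / (4 * L)
    return centre, radius, valid

# Illustrative use with made-up moment sums.
print(bernstein_ineq3_terms(A2=10.0, A3=0.0, A4=30.0, L=1.0, t=1.5))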
4. Bernstein also proved generalizations of the inequalities above to weakly dependent random variables. For example, inequality (2) can be extended as follows. Let X_1, \ldots, X_n be possibly non-independent random variables. Suppose that for all integer i > 0,

\mathbb{E}\left[ X_i \mid X_1, \ldots, X_{i-1} \right] = 0,
\mathbb{E}\left[ X_i^2 \mid X_1, \ldots, X_{i-1} \right] \le R_i\, \mathbb{E}\left[ X_i^2 \right],
\mathbb{E}\left[ X_i^k \mid X_1, \ldots, X_{i-1} \right] \le \tfrac{1}{2}\, \mathbb{E}\left[ X_i^2 \mid X_1, \ldots, X_{i-1} \right] L^{k-2} k!

Then

\mathbb{P}\left( \sum_{i=1}^n X_i \ge 2 t \sqrt{\sum_{i=1}^n R_i\, \mathbb{E}\left[X_i^2\right]} \right) < \exp\left(-t^2\right), \qquad \text{for } 0 < t \le \frac{1}{2L} \sqrt{\sum_{i=1}^n R_i\, \mathbb{E}\left[X_i^2\right]}.
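As a consistency check (not part of Bernstein's statement): when the X_i are in fact independent and zero-mean, one may take R_i = 1, since then

\mathbb{E}\left[ X_i \mid X_1, \ldots, X_{i-1} \right] = \mathbb{E}\left[ X_i \right] = 0, \qquad \mathbb{E}\left[ X_i^2 \mid X_1, \ldots, X_{i-1} \right] = \mathbb{E}\left[ X_i^2 \right],

and the conclusion of (4) reduces to inequality (2).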
More general results for martingales can be found in Fan et al. (2015).[5]
Proofs
The proofs are based on an application of Markov's inequality to the random variable

\exp\left( \lambda \sum_{j=1}^n X_j \right),

for a suitable choice of the parameter \lambda > 0.
(according to: S.N.Bernstein, Collected Works, Nauka, 1964)
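A minimal sketch of this Chernoff-type argument in the Rademacher case from the introduction (the standard modern presentation, not Bernstein's original computation): for any \lambda > 0,

\mathbb{P}\left( \sum_{i=1}^n X_i \ge n \varepsilon \right) = \mathbb{P}\left( e^{\lambda \sum_i X_i} \ge e^{\lambda n \varepsilon} \right) \le e^{-\lambda n \varepsilon}\, \mathbb{E}\left[ e^{\lambda \sum_i X_i} \right] = e^{-\lambda n \varepsilon} (\cosh \lambda)^n,

by Markov's inequality and independence. Since \cosh \lambda \le e^{\lambda^2/2}, choosing \lambda = \varepsilon gives \mathbb{P}\left( \sum_i X_i \ge n \varepsilon \right) \le e^{-n \varepsilon^2/2}; by the symmetry of the Rademacher sum and e^{-n \varepsilon^2/2} \le e^{-n \varepsilon^2/(2(1+\varepsilon/3))}, this implies the two-sided bound stated in the introduction.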
^ S. N. Bernstein, "On a modification of Chebyshev’s inequality and of the error formula of Laplace", vol. 4, #5 (original publication: Ann. Sci. Inst. Sav. Ukraine, Sect. Math. 1, 1924)
^ Bernstein, S. N. (1937). "На определенных модификациях неравенства Чебишева" [On certain modifications of Chebyshev's inequality]. Doklady Akademii Nauk SSSR. 17 (6): 275–277.
^S.N.Bernstein, "Theory of Probability" (Russian), Moscow, 1927
^J.V.Uspensky, "Introduction to Mathematical Probability", McGraw-Hill Book Company, 1937