
Bernstein inequalities (probability theory)


In probability theory, Bernstein inequalities give bounds on the probability that the sum of random variables deviates from its mean. In the simplest case, let X_1, \ldots, X_n be independent Bernoulli random variables taking values +1 and −1 with probability 1/2 (this distribution is also known as the Rademacher distribution). Then, for every positive \varepsilon,

\mathbb{P}\left(\left|\frac{1}{n}\sum_{i=1}^{n} X_i\right| > \varepsilon\right) \le 2\exp\left(-\frac{n\varepsilon^2}{2\left(1+\frac{\varepsilon}{3}\right)}\right).
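This bound can be sanity-checked numerically. The sketch below is a minimal Monte Carlo illustration of the displayed inequality; the choices of n, \varepsilon, and the number of trials are arbitrary and not part of the statement.

```python
# Minimal Monte Carlo check of the Rademacher tail bound in the lead.
# n, eps, and trials are illustrative choices only.
import numpy as np

rng = np.random.default_rng(0)
n, eps, trials = 100, 0.3, 200_000

# X_i take the values +1 and -1 with probability 1/2 each (Rademacher)
X = rng.choice([-1, 1], size=(trials, n))
sample_means = X.mean(axis=1)

empirical = np.mean(np.abs(sample_means) > eps)
bound = 2 * np.exp(-n * eps**2 / (2 * (1 + eps / 3)))

print(f"empirical tail probability: {empirical:.5f}")  # roughly 0.003 for these choices
print(f"Bernstein bound:            {bound:.5f}")      # roughly 0.033 for these choices
```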

Bernstein inequalities were proven and published by Sergei Bernstein in the 1920s and 1930s.[1][2][3][4] Later, these inequalities were rediscovered several times in various forms. Thus, special cases of the Bernstein inequalities are also known as the Chernoff bound, Hoeffding's inequality and Azuma's inequality. The martingale case of the Bernstein inequality is known as Freedman's inequality,[5] and its refinement is known as Hoeffding's inequality.[6]

Some of the inequalities

1. Let X_1, \ldots, X_n be independent zero-mean random variables. Suppose that |X_i| \le M almost surely, for all i. Then, for all positive t,

\mathbb{P}\left(\sum_{i=1}^{n} X_i \ge t\right) \le \exp\left(-\frac{\tfrac{1}{2}t^2}{\sum_{i=1}^{n}\mathbb{E}\left[X_i^2\right] + \tfrac{1}{3}Mt}\right).

(A numerical sanity check of this bound is sketched at the end of this section.)

2. Let X_1, \ldots, X_n be independent zero-mean random variables. Suppose that for some positive real L and every integer k \ge 2,

\mathbb{E}\left[\left|X_i\right|^k\right] \le \tfrac{1}{2}\,\mathbb{E}\left[X_i^2\right] L^{k-2} k!

Then

\mathbb{P}\left(\sum_{i=1}^{n} X_i \ge 2t\sqrt{\sum_{i=1}^{n}\mathbb{E}\left[X_i^2\right]}\right) < \exp\left(-t^2\right), \qquad \text{for } 0 < t \le \frac{1}{2L}\sqrt{\sum_{i=1}^{n}\mathbb{E}\left[X_i^2\right]}.

3. Let X_1, \ldots, X_n be independent zero-mean random variables. Suppose that

\mathbb{E}\left[\left|X_i\right|^k\right] \le \frac{k!}{4!}\left(\frac{L}{5}\right)^{k-4}

for all integer k \ge 4. Denote

A_k = \sum_{i=1}^{n}\mathbb{E}\left[X_i^k\right].

Then,

\mathbb{P}\left(\left|\sum_{i=1}^{n} X_i - \frac{A_3 t^2}{3 A_2}\right| \ge \sqrt{2 A_2}\, t\left[1 + \frac{A_4 t^2}{6 A_2^2}\right]\right) < 2\exp\left(-t^2\right), \qquad \text{for } 0 < t \le \frac{5\sqrt{2 A_2}}{4L}.

4. Bernstein also proved generalizations of the inequalities above to weakly dependent random variables. For example, inequality (2) can be extended as follows. Let X_1, \ldots, X_n be possibly non-independent random variables. Suppose that for all integers i > 0,

\mathbb{E}\left[X_i \mid X_1, \ldots, X_{i-1}\right] = 0,
\mathbb{E}\left[X_i^2 \mid X_1, \ldots, X_{i-1}\right] \le R_i\,\mathbb{E}\left[X_i^2\right],
\mathbb{E}\left[X_i^k \mid X_1, \ldots, X_{i-1}\right] \le \tfrac{1}{2}\,\mathbb{E}\left[X_i^2 \mid X_1, \ldots, X_{i-1}\right] L^{k-2} k!

Then

\mathbb{P}\left(\sum_{i=1}^{n} X_i \ge 2t\sqrt{\sum_{i=1}^{n} R_i\,\mathbb{E}\left[X_i^2\right]}\right) < \exp\left(-t^2\right), \qquad \text{for } 0 < t \le \frac{1}{2L}\sqrt{\sum_{i=1}^{n} R_i\,\mathbb{E}\left[X_i^2\right]}.

More general results for martingales can be found in Fan et al. (2015).[7]
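As referenced in inequality (1) above, the bounded case can be checked numerically. The sketch below uses Uniform[−1, 1] variables (so M = 1 and \mathbb{E}[X_i^2] = 1/3); these distributional and parameter choices are illustrative assumptions only and are not part of the statement.

```python
# Monte Carlo check of inequality (1) for bounded, zero-mean variables.
# Uniform[-1, 1] variables, n, t, and trials are illustrative choices only.
import numpy as np

rng = np.random.default_rng(0)
n, t, trials = 200, 15.0, 200_000
M = 1.0                              # |X_i| <= M almost surely
var_sum = n * (1.0 / 3.0)            # sum_i E[X_i^2] for Uniform[-1, 1]

X = rng.uniform(-1.0, 1.0, size=(trials, n))
sums = X.sum(axis=1)

empirical = np.mean(sums >= t)
bound = np.exp(-(0.5 * t**2) / (var_sum + M * t / 3.0))

print(f"empirical P(sum >= t): {empirical:.5f}")  # stays below the bound
print(f"Bernstein bound:       {bound:.5f}")
```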

Proofs

The proofs are based on an application of Markov's inequality to the random variable

\exp\left(\lambda\sum_{i=1}^{n} X_i\right),

for a suitable choice of the parameter \lambda.
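For the Rademacher case stated in the lead, the outline below illustrates this exponential-moment (Chernoff–Markov) step; it is a sketch of the standard argument, not a reproduction of Bernstein's original computation.

```latex
% Sketch of the Markov/exponential-moment step for the lead (Rademacher) inequality.
\begin{align*}
\mathbb{P}\Bigl(\textstyle\sum_{i=1}^{n} X_i \ge n\varepsilon\Bigr)
  &= \mathbb{P}\Bigl(e^{\lambda\sum_i X_i} \ge e^{\lambda n\varepsilon}\Bigr)
   \le e^{-\lambda n\varepsilon}\,\mathbb{E}\Bigl[e^{\lambda\sum_i X_i}\Bigr]
   && \text{(Markov's inequality, $\lambda > 0$)} \\
  &= e^{-\lambda n\varepsilon}\prod_{i=1}^{n}\mathbb{E}\bigl[e^{\lambda X_i}\bigr]
   = e^{-\lambda n\varepsilon}(\cosh\lambda)^{n}
   && \text{(independence; Rademacher law)} \\
  &\le \exp\bigl(-\lambda n\varepsilon + n\lambda^{2}/2\bigr)
   && \text{(since $\cosh\lambda \le e^{\lambda^{2}/2}$)}
\end{align*}
% Choosing \lambda = \varepsilon gives \mathbb{P}(\sum_i X_i \ge n\varepsilon) \le e^{-n\varepsilon^2/2},
% which, combined with the symmetric lower tail and a union bound, implies the
% two-sided bound displayed in the lead, since e^{-n\varepsilon^2/2} \le e^{-n\varepsilon^2/(2(1+\varepsilon/3))}.
```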

Generalizations

The Bernstein inequality can be generalized to Gaussian random matrices. Let G = g^{H} A g + 2\,\operatorname{Re}\left(g^{H} a\right) be a scalar, where A is a complex Hermitian matrix and a is a complex vector of size N. The vector g \sim \mathcal{N}(0, I_N) is a Gaussian vector of size N. Then for any \sigma \ge 0, we have

\mathbb{P}\left(G < \operatorname{tr}(A) - \sqrt{2\sigma}\sqrt{\left\|\operatorname{vec}(A)\right\|^2 + 2\left\|a\right\|^2} - \sigma\, s^{-}(A)\right) < e^{-\sigma},

where \operatorname{vec} is the vectorization operation, s^{-}(A) = \max\left(\lambda_{\max}(-A), 0\right), and \lambda_{\max}(A) is the largest eigenvalue of A. The proof is detailed in Bechar (2009).[8] Another similar inequality is formulated as

\mathbb{P}\left(G > \operatorname{tr}(A) + \sqrt{2\sigma}\sqrt{\left\|\operatorname{vec}(A)\right\|^2 + 2\left\|a\right\|^2} + \sigma\, s^{+}(A)\right) < e^{-\sigma},

where s^{+}(A) = \max\left(\lambda_{\max}(A), 0\right).
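The displayed upper-tail bound can also be checked by simulation. The sketch below draws g as a standard circularly symmetric complex Gaussian vector with identity covariance and compares the empirical tail with e^{-\sigma}; the specific A, a, \sigma, and trial count are illustrative assumptions, not part of the statement.

```python
# Monte Carlo check of the upper-tail bound for G = g^H A g + 2 Re(g^H a).
# A, a, sigma, and the trial count are illustrative choices only.
import numpy as np

rng = np.random.default_rng(0)
N, sigma, trials = 4, 2.0, 200_000

B = rng.standard_normal((N, N)) + 1j * rng.standard_normal((N, N))
A = (B + B.conj().T) / 2                              # complex Hermitian matrix
a = rng.standard_normal(N) + 1j * rng.standard_normal(N)

# g with identity covariance: real and imaginary parts each N(0, 1/2)
g = (rng.standard_normal((trials, N)) + 1j * rng.standard_normal((trials, N))) / np.sqrt(2)

G = np.einsum('ti,ij,tj->t', g.conj(), A, g).real + 2 * (g.conj() @ a).real

s_plus = max(np.linalg.eigvalsh(A).max(), 0.0)        # max(lambda_max(A), 0)
threshold = (np.trace(A).real
             + np.sqrt(2 * sigma) * np.sqrt(np.linalg.norm(A, 'fro')**2
                                            + 2 * np.linalg.norm(a)**2)
             + sigma * s_plus)

print("empirical P(G > threshold):", np.mean(G > threshold))
print("bound exp(-sigma):         ", np.exp(-sigma))
```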

References

  1. ^ Bernstein, S. N., "On a modification of Chebyshev's inequality and of the error formula of Laplace", vol. 4, #5 (original publication: Ann. Sci. Inst. Sav. Ukraine, Sect. Math. 1, 1924) (according to: S. N. Bernstein, Collected Works, Nauka, 1964)
  2. ^ Bernstein, S. N. (1937). "Об определенных модификациях неравенства Чебышева" [On certain modifications of Chebyshev's inequality]. Doklady Akademii Nauk SSSR. 17 (6): 275–277.
  3. ^ Bernstein, S. N., "Theory of Probability" (in Russian), Moscow, 1927
  4. ^ Uspensky, J. V., "Introduction to Mathematical Probability", McGraw-Hill Book Company, 1937
  5. ^ Freedman, D.A. (1975). "On tail probabilities for martingales". Ann. Probab. 3: 100–118.
  6. ^ Fan, X.; Grama, I.; Liu, Q. (2012). "Hoeffding's inequality for supermartingales". Stochastic Process. Appl. 122: 3545–3559.
  7. ^ Fan, X.; Grama, I.; Liu, Q. (2015). "Exponential inequalities for martingales with applications". Electronic Journal of Probability. 20. Electron. J. Probab. 20: 1–22. arXiv:1311.6273. doi:10.1214/EJP.v20-3496. S2CID 119713171.
  8. ^ Ikhlef, Bechar (2009). "A Bernstein-type inequality for stochastic processes of quadratic forms of Gaussian variables". arXiv:0909.3595 [math.ST].


A modern translation of some of these results can also be found in Prokhorov, A.V.; Korneichuk, N.P.; Motornyi, V.P. (2001) [1994], "Bernstein inequality", Encyclopedia of Mathematics, EMS Press