In probability theory, Kolmogorov's two-series theorem is a result about the convergence of random series. It follows from Kolmogorov's inequality and is used in one proof of the strong law of large numbers.
Statement of the theorem
Let $(X_n)_{n=1}^{\infty}$ be independent random variables with expected values $\mathbf{E}[X_n] = a_n$ and variances $\mathbf{Var}(X_n) = \sigma_n^2$, such that $\sum_{n=1}^{\infty} a_n$ converges in $\mathbb{R}$ and $\sum_{n=1}^{\infty} \sigma_n^2$ converges in $\mathbb{R}$. Then $\sum_{n=1}^{\infty} X_n$ converges in $\mathbb{R}$ almost surely.
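A classical illustration is the random harmonic series: take $X_n = \varepsilon_n / n$, where the signs $\varepsilon_n$ are independent with $\mathbb{P}(\varepsilon_n = 1) = \mathbb{P}(\varepsilon_n = -1) = \tfrac{1}{2}$. Then
$$a_n = \mathbf{E}[X_n] = 0 \quad \text{and} \quad \sum_{n=1}^{\infty} \sigma_n^2 = \sum_{n=1}^{\infty} \frac{1}{n^2} = \frac{\pi^2}{6} < \infty,$$
so both hypotheses hold and the theorem gives that $\sum_{n=1}^{\infty} \varepsilon_n / n$ converges almost surely, even though the harmonic series $\sum_{n=1}^{\infty} 1/n$ diverges.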
Proof

Assume WLOG $a_n = 0$ (the general case reduces to this one, as shown below). Set $S_N = \sum_{n=1}^{N} X_n$, and we will see that $\limsup_{N \to \infty} S_N - \liminf_{N \to \infty} S_N = 0$ with probability 1.
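To justify the reduction, write $Y_n = X_n - a_n$; the $Y_n$ are independent with $\mathbf{E}[Y_n] = 0$ and $\mathbf{Var}(Y_n) = \sigma_n^2$, and
$$\sum_{n=1}^{N} X_n = \sum_{n=1}^{N} Y_n + \sum_{n=1}^{N} a_n,$$
where the second sum on the right converges (deterministically) as $N \to \infty$ by hypothesis. Hence $\sum_{n=1}^{\infty} X_n$ converges almost surely if and only if $\sum_{n=1}^{\infty} Y_n$ does.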
For every $m \in \mathbb{N}$,
$$\limsup_{N \to \infty} S_N - \liminf_{N \to \infty} S_N \leq 2 \sup_{k \in \mathbb{N}} \left| \sum_{i=1}^{k} X_{m+i} \right|.$$
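To see this, note that for all $N, N' > m$,
$$S_N - S_{N'} = \sum_{i=1}^{N-m} X_{m+i} - \sum_{i=1}^{N'-m} X_{m+i} \leq 2 \sup_{k \in \mathbb{N}} \left| \sum_{i=1}^{k} X_{m+i} \right|,$$
and taking the limit superior over $N$ and the limit inferior over $N'$ yields the claimed bound.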
Thus, for every $m \in \mathbb{N}$ and $\delta > 0$,
$$\mathbb{P}\left( \limsup_{N \to \infty} S_N - \liminf_{N \to \infty} S_N \geq \delta \right) \leq \mathbb{P}\left( \sup_{k \in \mathbb{N}} \left| \sum_{i=1}^{k} X_{m+i} \right| \geq \frac{\delta}{2} \right) \leq \frac{4}{\delta^2} \sum_{i=m+1}^{\infty} \sigma_i^2,$$
where the first inequality follows from the bound above, and the second inequality is due to Kolmogorov's inequality.
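Spelled out, Kolmogorov's inequality states that for independent random variables $Y_1, \ldots, Y_M$ with mean zero and finite variance,
$$\mathbb{P}\left( \max_{1 \leq k \leq M} \left| \sum_{i=1}^{k} Y_i \right| \geq \lambda \right) \leq \frac{1}{\lambda^2} \sum_{i=1}^{M} \mathbf{Var}(Y_i).$$
Applying this with $Y_i = X_{m+i}$ and $\lambda = \delta/2$ for each finite $M$, and then letting $M \to \infty$, yields the bound $\frac{4}{\delta^2} \sum_{i=m+1}^{\infty} \sigma_i^2$ for the unrestricted supremum.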
By the assumption that $\sum_{n=1}^{\infty} \sigma_n^2$ converges, the last term tends to 0 as $m \to \infty$, for every $\delta > 0$. Hence $\mathbb{P}\left( \limsup_{N \to \infty} S_N - \liminf_{N \to \infty} S_N \geq \delta \right) = 0$ for every $\delta > 0$, so $\limsup_{N \to \infty} S_N = \liminf_{N \to \infty} S_N$ almost surely, and $\sum_{n=1}^{\infty} X_n$ converges almost surely.
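The last step uses a countable union over $\delta = 1/j$:
$$\mathbb{P}\left( \limsup_{N \to \infty} S_N - \liminf_{N \to \infty} S_N > 0 \right) \leq \sum_{j=1}^{\infty} \mathbb{P}\left( \limsup_{N \to \infty} S_N - \liminf_{N \to \infty} S_N \geq \frac{1}{j} \right) = 0,$$
and the common value of the two limits is almost surely finite, since taking $m = 0$ in the Kolmogorov bound above shows that $\sup_{N} |S_N| < \infty$ with probability 1.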