Ratio test
In mathematics, the ratio test is a test (or "criterion") for the convergence of a series

$$\sum_{n=1}^{\infty} a_n,$$

where each term is a real or complex number and $a_n$ is nonzero when $n$ is large. The test was first published by Jean le Rond d'Alembert and is sometimes known as d'Alembert's ratio test or as the Cauchy ratio test.[1]
The test
The usual form of the test makes use of the limit

$$L = \lim_{n\to\infty} \left|\frac{a_{n+1}}{a_n}\right|. \qquad (1)$$
The ratio test states that:
- if L < 1 then the series converges absolutely;
- if L > 1 then the series is divergent;
- if L = 1 or the limit fails to exist, then the test is inconclusive, because there exist both convergent and divergent series that satisfy this case.
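The three cases can be illustrated numerically. Below is a minimal Python sketch (not from the original text) that samples the ratio $|a_{n+1}/a_n|$ at a single large index as a stand-in for the limit; the term functions, the sample index, and the tolerance are illustrative assumptions.

```python
import math

def ratio_test(a, n=1000, tol=1e-3):
    """Numerical sketch of the ratio test: sample |a(n+1)/a(n)| at one large n.

    This only illustrates the statement of the test; a finite sample of the
    ratio cannot certify the actual limit L.
    """
    L = abs(a(n + 1) / a(n))
    if L < 1 - tol:
        return "converges absolutely"
    if L > 1 + tol:
        return "diverges"
    return "inconclusive"

# Illustrative terms: 1/n! has L = 0, 2^n has L = 2, 1/n has L = 1.
print(ratio_test(lambda n: 1 / math.factorial(n), n=50))
print(ratio_test(lambda n: 2.0**n, n=50))
print(ratio_test(lambda n: 1.0 / n, n=10_000))
```

Note that a sampled ratio near 1 cannot distinguish "L = 1" from "L slightly below 1", which is exactly why the L = 1 case of the test is inconclusive.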
It is possible to make the ratio test applicable to certain cases where the limit L fails to exist, if limit superior and limit inferior are used. The test criteria can also be refined so that the test is sometimes conclusive even when L = 1. More specifically, let

$$R = \limsup_{n\to\infty} \left|\frac{a_{n+1}}{a_n}\right|, \qquad r = \liminf_{n\to\infty} \left|\frac{a_{n+1}}{a_n}\right|.$$
Then the ratio test states that:[2][3]
- if R < 1, the series converges absolutely;
- if r > 1, the series diverges;
- if $\left|\frac{a_{n+1}}{a_n}\right| \geq 1$ for all large $n$ (regardless of the value of $r$), the series also diverges; this is because $|a_n|$ is nonzero and increasing, and hence $a_n$ does not approach zero;
- the test is otherwise inconclusive.
If the limit L in (1) exists, we must have L = R = r. So the original ratio test is a weaker version of the refined one.
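To illustrate the refined test: a hypothetical series whose consecutive-term ratio alternates between 1/2 and 1/8 has no limit L, yet $R = 1/2 < 1$, so it converges absolutely. A minimal Python sketch (the series and the finite-sample estimate of liminf/limsup are illustrative assumptions):

```python
def refined_ratio_bounds(ratios):
    """Approximate r = liminf and R = limsup of |a_{n+1}/a_n| from a finite
    sample of ratios; a numerical sketch only, since the true liminf/limsup
    depend on the whole tail."""
    tail = ratios[len(ratios) // 2:]  # discard early terms
    return min(tail), max(tail)

# Hypothetical series: multiply by 1/2 and 1/8 alternately.
terms = [1.0]
for n in range(1, 40):
    terms.append(terms[-1] * (0.5 if n % 2 else 0.125))
ratios = [terms[i + 1] / terms[i] for i in range(len(terms) - 1)]

r, R = refined_ratio_bounds(ratios)  # (0.125, 0.5): R < 1, so it converges
```

Here the plain limit (1) does not exist, but the refined test still settles convergence.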
Examples
Convergent because L < 1
Consider the series

$$\sum_{n=1}^{\infty} \frac{n}{e^n}.$$

Applying the ratio test, one computes the limit

$$L = \lim_{n\to\infty} \left|\frac{(n+1)/e^{n+1}}{n/e^n}\right| = \lim_{n\to\infty} \frac{n+1}{en} = \frac{1}{e} < 1.$$

Since this limit is less than 1, the series converges.
Divergent because L > 1
Consider the series

$$\sum_{n=1}^{\infty} \frac{e^n}{n}.$$

Putting this into the ratio test:

$$L = \lim_{n\to\infty} \left|\frac{e^{n+1}/(n+1)}{e^n/n}\right| = \lim_{n\to\infty} \frac{en}{n+1} = e > 1.$$

Thus the series diverges.
Inconclusive because L = 1
Consider the three series

$$\sum_{n=1}^{\infty} 1, \qquad \sum_{n=1}^{\infty} \frac{1}{n^2}, \qquad \sum_{n=1}^{\infty} (-1)^{n-1}\frac{1}{n}.$$

The first series (1 + 1 + 1 + 1 + ⋯) diverges, the second one (the one central to the Basel problem) converges absolutely, and the third one (the alternating harmonic series) converges conditionally. However, the term-by-term magnitude ratios $\left|\frac{a_{n+1}}{a_n}\right|$ of the three series are respectively $1$, $\frac{n^2}{(n+1)^2}$ and $\frac{n}{n+1}$. So, in all three cases, the limit is equal to 1. This illustrates that when L = 1, the series may converge or diverge, and hence the original ratio test is inconclusive. In such cases, more refined tests are required to determine convergence or divergence.
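The three ratio sequences can be checked numerically. In this sketch (an illustration, not part of the original text), all three sampled ratios sit within $10^{-4}$ of 1 even though the series behave completely differently:

```python
def ratio_at(a, n):
    """|a(n+1)/a(n)| at one index n -- a finite-n glimpse of the limit."""
    return abs(a(n + 1) / a(n))

constant = lambda n: 1.0                    # sum 1: diverges
basel    = lambda n: 1.0 / n**2             # sum 1/n^2: converges absolutely
alt_harm = lambda n: (-1.0) ** (n - 1) / n  # alternating harmonic: conditional

for a in (constant, basel, alt_harm):
    # The ratio test sees L = 1 in all three cases.
    assert abs(ratio_at(a, 100_000) - 1.0) < 1e-4
```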
Proof
Below is a proof of the validity of the original ratio test.
Suppose that $L = \lim_{n\to\infty} \left|\frac{a_{n+1}}{a_n}\right| < 1$. We can then show that the series converges absolutely by showing that its terms will eventually become less than those of a certain convergent geometric series. To do this, let $r = \frac{L+1}{2}$. Then $r$ is strictly between $L$ and 1, and $|a_{n+1}| < r|a_n|$ for sufficiently large $n$ (say, $n$ greater than $N$). Hence $|a_{n+i}| < r^i |a_n|$ for each $n > N$ and $i > 0$, and so

$$\sum_{i=N+1}^{\infty} |a_i| = \sum_{i=1}^{\infty} \left|a_{N+i}\right| < \sum_{i=1}^{\infty} r^i |a_{N+1}| = |a_{N+1}| \frac{r}{1-r} < \infty.$$

That is, the series converges absolutely.
On the other hand, if L > 1, then $|a_{n+1}| > |a_n|$ for sufficiently large $n$, so that the limit of the summands is non-zero. Hence the series diverges.
Extensions for L = 1
As seen in the previous example, the ratio test may be inconclusive when the limit of the ratio is 1. Extensions of the ratio test, however, sometimes allow one to deal with this case. For instance, the aforementioned refined version of the test handles the case where $\left|\frac{a_{n+1}}{a_n}\right| \geq 1$ for all large $n$.
Below are some other extensions. In all the tests below we assume that $\sum a_n$ is a sum with positive $a_n$.
Raabe's test
This extension is due to Joseph Ludwig Raabe. It states that if

$$R = \lim_{n\to\infty} n\left(\frac{a_n}{a_{n+1}} - 1\right),$$

then the series will be convergent if R > 1 (this includes the case R = ∞) and divergent if R < 1.[4] If R = 1, the test is inconclusive. d'Alembert's ratio test and Raabe's test are the first and second theorems in a hierarchy of such theorems due to Augustus De Morgan.[citation needed]
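For example, for $\sum 1/n^2$ the plain ratio test gives L = 1 and is silent, but Raabe's statistic tends to R = 2 > 1, confirming convergence. A quick numerical sketch (the series and sample index are illustrative choices):

```python
def raabe_statistic(a, n):
    """n * (a(n)/a(n+1) - 1), whose limit R drives Raabe's test."""
    return n * (a(n) / a(n + 1) - 1)

# a_n = 1/n^2: a_n/a_{n+1} = ((n+1)/n)^2, so the statistic is 2 + 1/n -> 2.
R = raabe_statistic(lambda n: 1.0 / n**2, 1_000_000)
assert abs(R - 2.0) < 1e-4  # R = 2 > 1, so the series converges
```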
Proof of the validity of Raabe's test
Say $R = \lim_{n\to\infty} n\left(\frac{a_n}{a_{n+1}} - 1\right)$. In fact we need not assume the limit exists; if $\limsup_{n\to\infty} n\left(\frac{a_n}{a_{n+1}} - 1\right) < 1$ then $\sum a_n$ diverges, while if $\liminf_{n\to\infty} n\left(\frac{a_n}{a_{n+1}} - 1\right) > 1$ the sum converges.
The proof proceeds essentially by comparison with $\sum 1/n^c$. Suppose first that $\limsup_{n\to\infty} n\left(\frac{a_n}{a_{n+1}} - 1\right) < 1$. Of course if this limit superior is negative, then $a_{n+1} \geq a_n$ for large $n$, so the sum diverges; assume then that it lies in $[0, 1)$, and pick $c$ with $\limsup_{n\to\infty} n\left(\frac{a_n}{a_{n+1}} - 1\right) < c < 1$. There exists $N$ such that $n\left(\frac{a_n}{a_{n+1}} - 1\right) \leq c$ for all $n \geq N$, which is to say that $\frac{a_n}{a_{n+1}} \leq 1 + \frac{c}{n} \leq e^{c/n}$. Thus $a_{n+1} \geq a_n e^{-c/n} \geq \cdots \geq a_N e^{-c\left(\frac{1}{N} + \cdots + \frac{1}{n}\right)}$, which implies that $a_{n+1} \geq C n^{-c}$ for some constant $C > 0$ and all $n \geq N$; since $c < 1$, this shows that $\sum a_n$ diverges.
The proof of the other half is entirely analogous, with most of the inequalities simply reversed. We need a preliminary inequality to use in place of the simple $1 + t < e^t$ that was used above: Fix $c$ and $N$. Note that $\log\left(1 + \frac{c}{n}\right) = \frac{c}{n} + O\left(\frac{1}{n^2}\right)$. So

$$\log\left(\left(1 + \frac{c}{N}\right)\cdots\left(1 + \frac{c}{n}\right)\right) = c\left(\frac{1}{N} + \cdots + \frac{1}{n}\right) + O(1);$$

hence $\left(1 + \frac{c}{N}\right)\cdots\left(1 + \frac{c}{n}\right) \geq C e^{c\left(\frac{1}{N} + \cdots + \frac{1}{n}\right)}$ for some constant $C > 0$.
Suppose now that $\liminf_{n\to\infty} n\left(\frac{a_n}{a_{n+1}} - 1\right) > 1$. Arguing as in the first paragraph, using the inequality established in the previous paragraph, we see that there exists $c > 1$ such that $a_{n+1} \leq C n^{-c}$ for $n \geq N$; since $c > 1$, this shows that $\sum a_n$ converges.
Higher order tests
The next cases in de Morgan's hierarchy are Bertrand's and Gauss's tests. Each test involves slightly different higher-order asymptotics. Bertrand's test asserts that if

$$\frac{a_n}{a_{n+1}} = 1 + \frac{1}{n} + \frac{\rho_n}{n \ln n},$$

then the series converges if $\liminf \rho_n > 1$, and diverges if $\limsup \rho_n < 1$.[5]
Gauss's test asserts that if

$$\frac{a_n}{a_{n+1}} = 1 + \frac{h}{n} + \frac{C_n}{n^r},$$

where r > 1 and $C_n$ is bounded, then the series converges if h > 1 and diverges if h ≤ 1.[6]
These are both special cases of Kummer's test. Let $\zeta_n$ be an auxiliary sequence of positive constants, and let

$$\rho = \lim_{n\to\infty} \left(\zeta_n \frac{a_n}{a_{n+1}} - \zeta_{n+1}\right).$$

Then if ρ > 0, the series converges. If ρ < 0 and $\sum 1/\zeta_n$ diverges, then the series diverges. Otherwise the test is inconclusive.[7]
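As a sketch of how Kummer's test subsumes Raabe's: choosing $\zeta_n = n$ turns the Kummer statistic into $n \frac{a_n}{a_{n+1}} - (n+1) = n\left(\frac{a_n}{a_{n+1}} - 1\right) - 1$, which is Raabe's statistic shifted by 1. The series used below is an illustrative assumption:

```python
def kummer_statistic(a, zeta, n):
    """zeta(n) * a(n)/a(n+1) - zeta(n+1); a positive limit gives convergence."""
    return zeta(n) * a(n) / a(n + 1) - zeta(n + 1)

# For a_n = 1/n^2 with zeta_n = n, the statistic is (n+1)/n -> 1 > 0,
# so Kummer's test (like Raabe's) certifies convergence.
rho = kummer_statistic(lambda n: 1.0 / n**2, lambda n: n, 10_000)
assert abs(rho - 1.0) < 1e-3
```

Raabe's test corresponds to $\zeta_n = n$; $\zeta_n = 1$ recovers the plain ratio test, and $\zeta_n = n \ln n$ gives Bertrand's test.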
Proof of Kummer's test
If $\rho > 0$, then fix a positive number $\delta$ with $0 < \delta < \rho$. There exists a natural number $N$ such that for every $n > N$,

$$\delta \leq \zeta_n \frac{a_n}{a_{n+1}} - \zeta_{n+1}.$$

Since $a_{n+1} > 0$, for every $n > N$,

$$0 \leq \delta a_{n+1} \leq \zeta_n a_n - \zeta_{n+1} a_{n+1}.$$

In particular, $\zeta_{n+1} a_{n+1} \leq \zeta_n a_n$ for all $n > N$, which means that starting from the index $N$ the sequence $\zeta_n a_n$ is monotonically decreasing and positive, which in particular implies that it is bounded below by 0. Therefore the limit

$$\lim_{n\to\infty} \zeta_n a_n$$

exists. This implies that the positive telescoping series

$$\sum_{n=N+1}^{\infty} \left(\zeta_n a_n - \zeta_{n+1} a_{n+1}\right)$$

is convergent, and since for all $n > N$

$$\delta a_{n+1} \leq \zeta_n a_n - \zeta_{n+1} a_{n+1},$$

by the direct comparison test for positive series, the series $\sum a_n$ is convergent.
On the other hand, if $\rho < 0$, then there is an $N$ such that $\zeta_n a_n$ is increasing for $n > N$. In particular, there exists an $\epsilon > 0$ for which $\zeta_n a_n > \epsilon$ for all $n > N$, and so $\sum a_n = \sum \frac{\zeta_n a_n}{\zeta_n}$ diverges by comparison with $\sum \frac{\epsilon}{\zeta_n}$.
References
- d'Alembert, J. (1768), Opuscules, vol. V, pp. 171–183.
- Apostol, Tom M. (1974), Mathematical analysis (2nd ed.), Addison-Wesley, ISBN 978-0-201-00288-1: §8.14.
- Knopp, Konrad (1956), Infinite Sequences and Series, New York: Dover Publications, Inc., ISBN 0-486-60153-6: §3.3, 5.4.
- Rudin, Walter (1976), Principles of Mathematical Analysis (3rd ed.), New York: McGraw-Hill, Inc., ISBN 0-07-054235-X: §3.34.
- "Bertrand criterion", Encyclopedia of Mathematics, EMS Press, 2001 [1994]
- "Gauss criterion", Encyclopedia of Mathematics, EMS Press, 2001 [1994]
- "Kummer criterion", Encyclopedia of Mathematics, EMS Press, 2001 [1994]
- Watson, G. N.; Whittaker, E. T. (1963), A Course in Modern Analysis (4th ed.), Cambridge University Press, ISBN 0-521-58807-3: §2.36, 2.37.