Cauchy condensation test

Not to be confused with Cauchy's convergence test.

In mathematics, the Cauchy condensation test, named after Augustin-Louis Cauchy, is a standard convergence test for infinite series. For a non-increasing sequence $f(n)$ of non-negative real numbers, the series $\sum_{n=1}^{\infty} f(n)$ converges if and only if the "condensed" series $\sum_{n=0}^{\infty} 2^{n} f(2^{n})$ converges. Moreover, if they converge, the sum of the condensed series is no more than twice as large as the sum of the original.

Estimate

The Cauchy condensation test follows from the stronger estimate

  $\sum_{n=1}^{\infty} f(n) \;\leq\; \sum_{n=0}^{\infty} 2^{n} f(2^{n}) \;\leq\; 2\sum_{n=1}^{\infty} f(n),$

which should be understood as an inequality of extended real numbers. The essential thrust of a proof follows, along the lines of Oresme's proof of the divergence of the harmonic series.
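As a quick numerical illustration, the following Python sketch compares truncated versions of the three sums for the sample choice $f(n) = 1/n^{2}$; the particular sequence and the truncation points are arbitrary choices made here for illustration, not part of the test itself.

```python
def f(n):
    # Arbitrary sample sequence: non-negative and non-increasing.
    return 1.0 / n**2

# Truncated versions of the three quantities in the estimate above.
original = sum(f(n) for n in range(1, 1_000_001))      # sum f(n)       ~ pi^2/6 ~ 1.645
condensed = sum(2**k * f(2**k) for k in range(60))     # sum 2^k f(2^k) = sum 2^-k ~ 2
print(original, condensed, 2 * original)

# The estimate predicts: sum f(n) <= sum 2^k f(2^k) <= 2 * sum f(n).
assert original <= condensed <= 2 * original
```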

To see the first inequality, the terms of the original series are rebracketed into runs whose lengths are powers of two, and each run is then bounded above by replacing every term with the largest term in that run: the first one, since the terms are non-increasing.

  $\sum_{n=1}^{\infty} f(n) = f(1) + \big(f(2)+f(3)\big) + \big(f(4)+f(5)+f(6)+f(7)\big) + \cdots \leq f(1) + \big(f(2)+f(2)\big) + \big(f(4)+f(4)+f(4)+f(4)\big) + \cdots = \sum_{n=0}^{\infty} 2^{n} f(2^{n}).$

To see the second inequality, the two series are again rebracketed into runs of power-of-two length, but "offset" as shown below, so that the run of $\sum_{n=0}^{\infty} 2^{n} f(2^{n})$ which begins with $f(2^{n})$ lines up with the end of the run of $2\sum_{n=1}^{\infty} f(n)$ which ends with $f(2^{n})$; this way the former always stays "ahead" of the latter.

  $\sum_{n=0}^{\infty} 2^{n} f(2^{n}) = f(1) + \big(f(2)+f(2)\big) + \big(f(4)+f(4)+f(4)+f(4)\big) + \cdots \leq \big(f(1)+f(1)\big) + \big(f(2)+f(2)\big) + \big(f(3)+f(3)+f(4)+f(4)\big) + \cdots = 2\sum_{n=1}^{\infty} f(n).$

Visualization of the above argument: partial sums of the series $\sum f(n)$, $\sum 2^{n} f(2^{n})$, and $2\sum f(n)$ are pictured.
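The figure itself is not reproduced here; a minimal matplotlib sketch along the lines of that caption, again with the arbitrary sample sequence $f(n) = 1/n^{2}$, could look like this:

```python
import numpy as np
import matplotlib.pyplot as plt

def f(n):
    # Arbitrary sample sequence: non-negative and non-increasing.
    return 1.0 / n**2

N = 64
ns = np.arange(1, N + 1)
orig = np.cumsum([f(n) for n in ns])                   # partial sums of sum f(n)
cond = np.cumsum([2**k * f(2**k) for k in range(N)])   # partial sums of sum 2^k f(2^k)

plt.step(ns, orig, label=r"$\sum f(n)$")
plt.step(ns, cond, label=r"$\sum 2^k f(2^k)$")
plt.step(ns, 2 * orig, label=r"$2\sum f(n)$")
plt.xlabel("number of terms")
plt.ylabel("partial sum")
plt.legend()
plt.show()
```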

Integral comparison

The "condensation" transformation recalls the integral variable substitution yielding .

Pursuing this idea, the integral test for convergence gives us, in the case of monotone $f$, that $\sum_{n=1}^{\infty} f(n)$ converges if and only if $\int_{1}^{\infty} f(x)\,\mathrm{d}x$ converges. The substitution $x \rightarrow 2^{x}$ yields the integral $\ln 2 \int_{0}^{\infty} 2^{x} f(2^{x})\,\mathrm{d}x$, and another application of the integral test brings us to the condensed series $\sum_{n=0}^{\infty} 2^{n} f(2^{n})$.
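Spelled out, the substitution step is the routine change of variables $x = 2^{t}$, $\mathrm{d}x = 2^{t} \ln 2\,\mathrm{d}t$, which gives

  $\int_{1}^{\infty} f(x)\,\mathrm{d}x = \int_{0}^{\infty} f(2^{t})\, 2^{t} \ln 2\,\mathrm{d}t = \ln 2 \int_{0}^{\infty} 2^{t} f(2^{t})\,\mathrm{d}t.$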

Examples

The test can be useful for series where $n$ appears in a denominator in $f$. For the most basic example of this sort, the harmonic series $\sum_{n=1}^{\infty} \frac{1}{n}$ is transformed into the series $\sum_{n=0}^{\infty} 2^{n} \cdot \frac{1}{2^{n}} = \sum_{n=0}^{\infty} 1$, which clearly diverges.
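For instance, the same computation settles the general $p$-series: with $f(n) = n^{-a}$, the condensed series is the geometric series

  $\sum_{n=0}^{\infty} 2^{n} \cdot \big(2^{n}\big)^{-a} = \sum_{n=0}^{\infty} \big(2^{1-a}\big)^{n},$

which converges exactly when $2^{1-a} < 1$, that is, when $a > 1$.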

As a more complex example, take

  $f(n) = \frac{1}{n^{a} (\log n)^{b} (\log\log n)^{c}}.$

Here the series definitely converges for $a > 1$, and diverges for $a < 1$. When $a = 1$, the condensation transformation gives the series

  $\sum_{n} \frac{1}{(n \log 2)^{b} \big(\log(n \log 2)\big)^{c}}.$

The logarithms "shift to the left". So when $a = 1$, we have convergence for $b > 1$ and divergence for $b < 1$; when $b = 1$, the value of $c$ enters.
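To see the shift explicitly, note that for the $f$ above the condensed term is

  $2^{n} f(2^{n}) = \frac{2^{(1-a)n}}{(n \log 2)^{b} \big(\log(n \log 2)\big)^{c}},$

so when $a = 1$ the exponential factor disappears and $b$ takes over the role that $a$ played; condensing once more when $b = 1$ shows convergence exactly for $c > 1$.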

This result readily generalizes: the condensation test, applied repeatedly, can be used to show that for $k = 1, 2, 3, \ldots$, the generalized Bertrand series

  $\sum_{n \geq N} \frac{1}{n \cdot \ln n \cdot \ln\ln n \cdots \ln^{(k-1)}(n) \cdot \big(\ln^{(k)}(n)\big)^{\alpha}}$

converges for $\alpha > 1$ and diverges for $\alpha \leq 1$.[1] Here $\ln^{(m)}$ denotes the $m$-th compositional iterate of the natural logarithm, so that

  $\ln^{(m)}(n) = \begin{cases} \ln(n), & m = 1, \\ \ln\big(\ln^{(m-1)}(n)\big), & m \geq 2. \end{cases}$

The lower limit of the sum, $N$, was chosen so that all terms of the series are positive. Notably, these series provide examples of infinite sums that converge or diverge arbitrarily slowly. For instance, in the case of $k = 2$ and $\alpha = 1$, the partial sum exceeds 10 only after $10^{10^{100}}$ (a googolplex) terms; yet the series diverges nevertheless.
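For example, a single application of the test already handles the $k = 1$, $\alpha = 1$ case: condensing $\sum_{n \geq 2} \frac{1}{n \ln n}$ gives, up to finitely many terms,

  $\sum_{n \geq 1} \frac{2^{n}}{2^{n} \ln(2^{n})} = \sum_{n \geq 1} \frac{1}{n \ln 2},$

a multiple of the harmonic series, so the original series diverges; each additional iterated logarithm costs one more application of the test.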

Schlömilch's generalization

Let[2] $u(n)$ be a strictly increasing sequence of positive integers such that the ratio of successive differences is bounded: there is a positive real number $N$ for which

  $\frac{\Delta u(n+1)}{\Delta u(n)} = \frac{u(n+2) - u(n+1)}{u(n+1) - u(n)} < N \quad \text{for all } n.$

Then, provided that $f(n)$ is non-negative and non-increasing, the convergence of the series $\sum_{n=1}^{\infty} f(n)$ is equivalent to the convergence of

  $\sum_{n=0}^{\infty} \Delta u(n)\, f(u(n)) = \sum_{n=0}^{\infty} \big(u(n+1) - u(n)\big)\, f(u(n)).$

Taking $u(n) = 2^{n}$, so that $\Delta u(n) = 2^{n+1} - 2^{n} = 2^{n}$, the Cauchy condensation test emerges as a special case.
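For instance, with $u(n) = n^{2}$ (for $n \geq 1$) the differences $\Delta u(n) = 2n + 1$ have bounded successive ratios, and the criterion says that $\sum_{n} f(n)$ converges if and only if

  $\sum_{n=1}^{\infty} (2n + 1)\, f(n^{2})$

converges, which by comparison is the same as the convergence of $\sum_{n} n\, f(n^{2})$.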

References

  1. ^ Rudin, Walter (1976). Principles of Mathematical Analysis (PDF). New York: McGraw-Hill. pp. 62–63. ISBN 0-07-054235-X. 
  2. ^ http://people.brandeis.edu/~joyner/everytopic/LiflyandCauchyTalk.pdf, p. 7/28
  • Bonar, Khoury (2006). Real Infinite Series. Mathematical Association of America. ISBN 0-88385-745-6.
