Abel's test

From Wikipedia, the free encyclopedia

In mathematics, Abel's test (also known as Abel's criterion) is a method of testing for the convergence of an infinite series. The test is named after mathematician Niels Henrik Abel. There are two slightly different versions of Abel's test – one is used with series of real numbers, and the other is used with power series in complex analysis. Abel's uniform convergence test is a criterion for the uniform convergence of a series of functions dependent on parameters.

Abel's test in real analysis

Suppose the following statements are true:

  1. Σan is a convergent series,
  2. {bn} is a monotone sequence, and
  3. {bn} is bounded.

Then Σanbn is also convergent.
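As a numeric sanity check (the particular sequences below are chosen here for illustration, not taken from the article), take an = (−1)^(n+1)/n, whose series converges, and bn = n/(n+1), which is monotone and bounded; Abel's test then guarantees that Σanbn converges:

```python
from math import log

def partial_sum(N):
    # a_n = (-1)**(n + 1) / n: a convergent alternating series (its sum is ln 2)
    # b_n = n / (n + 1): monotone increasing and bounded by 1
    return sum((-1) ** (n + 1) / n * n / (n + 1) for n in range(1, N + 1))

# For this particular choice, a_n * b_n = (-1)**(n + 1) / (n + 1), so the
# product series happens to have the closed form 1 - ln 2, making the
# predicted convergence easy to check numerically.
print(abs(partial_sum(100_000) - (1 - log(2))))  # small
```

The alternating-series remainder bounds the error of the partial sum by the first omitted term, about 1/(N + 2).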

It is important to understand that this test is mainly pertinent and useful for series that are not absolutely convergent. For an absolutely convergent series the theorem, although true, is almost self-evident.

Abel's test in complex analysis

A closely related convergence test, also known as Abel's test, can often be used to establish the convergence of a power series on the boundary of its circle of convergence. Specifically, Abel's test states that if {an} is a sequence of positive real numbers that is monotonically decreasing for n > m (that is, for all sufficiently large n) with

  an → 0 as n → ∞,

then the power series

  ƒ(z) = Σ an z^n

converges everywhere on the closed unit circle, except at z = 1. Abel's test cannot be applied when z = 1, so convergence at that single point must be investigated separately. Notice that Abel's test implies in particular that the radius of convergence is at least 1. It can also be applied to a power series with radius of convergence R ≠ 1 by the simple change of variable ζ = z/R.[1] Notice that Abel's test is a generalization of the Leibniz criterion, obtained by taking z = −1.
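As a numeric illustration (the sequence an = 1/n and the boundary point z = i are chosen here for concreteness), Abel's test guarantees convergence of Σ z^n/n everywhere on the unit circle except at z = 1:

```python
import cmath

def partial_sum(z, N):
    # a_n = 1/n is positive and decreases monotonically to zero, so by
    # Abel's test the series sum of a_n * z**n converges for |z| = 1, z != 1.
    return sum(z ** n / n for n in range(1, N + 1))

z = 1j  # a point on the unit circle other than 1
s = partial_sum(z, 200_000)
# Inside the open disc the series equals -log(1 - z); comparing against
# that value at the boundary point gives a plausibility check of the
# convergence that Abel's test predicts.
print(abs(s - (-cmath.log(1 - z))))  # small
```

Convergence on the boundary is slow (the tail decays roughly like 1/N), which is why a fairly large N is used.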

Proof of Abel's test: Suppose that z is a point on the unit circle, z ≠ 1. For each n ≥ m, we define

  ƒn(z) := am z^m + am+1 z^(m+1) + … + an z^n.

By multiplying this function by (1 − z), we obtain

  (1 − z)ƒn(z) = am z^m − an z^(n+1) + Σk=m+1…n (ak − ak−1) z^k.

The first summand is constant, and the second converges uniformly to zero (since, by assumption, the sequence {an} converges to zero). It only remains to show that the series

  Σk=m+1…∞ (ak − ak−1) z^k

converges. We show this by showing that it even converges absolutely:

  Σk=m+1…∞ |ak − ak−1| |z^k| ≤ Σk=m+1…∞ (ak−1 − ak) = am − lim an = am,

where the last sum is a convergent telescoping sum. The absolute value could be dropped because the sequence {an} is decreasing by assumption.

Hence, the sequence {(1 − z)ƒn(z)} converges (even uniformly) on the closed unit disc. Since z ≠ 1, we may divide by (1 − z) and obtain the result.
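The telescoping bound used in the proof can be checked numerically; the choices ak = 1/k and m = 5 below are my own illustration:

```python
# The proof bounds the sum of |a_k - a_{k-1}| by the telescoping sum of
# (a_{k-1} - a_k), which equals a_m - lim a_n.  With a_k = 1/k (positive,
# decreasing to 0) and m = 5, the partial telescoping sum up to K should
# therefore approach a_m = 1/5 as K grows.
m, K = 5, 1_000_000
a = lambda k: 1.0 / k
total = sum(abs(a(k) - a(k - 1)) for k in range(m + 1, K + 1))
print(total)  # approaches 0.2 as K grows
```

Because the sequence is decreasing, every absolute value equals ak−1 − ak, and the partial sum collapses exactly to a(m) − a(K).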

Abel's uniform convergence test

Abel's uniform convergence test is a criterion for the uniform convergence of a series of functions, or an improper integral of functions, dependent on parameters. It is related to Abel's test for the convergence of an ordinary series of real numbers, and the proof relies on the same technique of summation by parts.

The test is as follows. Let {gn} be a uniformly bounded sequence of real-valued continuous functions on a set E such that gn+1(x) ≤ gn(x) for all x ∈ E and positive integers n, and let {ƒn} be a sequence of real-valued functions such that the series Σƒn(x) converges uniformly on E. Then Σƒn(x)gn(x) converges uniformly on E.
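As a concrete instance (the functions below are chosen here for illustration), take ƒn(x) = (−1)^(n+1)/n, whose series converges uniformly since the terms do not depend on x, and gn(x) = x^n on E = [0, 1], which is uniformly bounded by 1 and decreasing in n for each x. The test then gives uniform convergence of Σƒn(x)gn(x), whose sum is log(1 + x):

```python
from math import log

def partial_sum(x, N):
    # f_n(x) = (-1)**(n + 1) / n: the series of f_n converges uniformly
    #   on [0, 1] (trivially, since the terms do not depend on x)
    # g_n(x) = x**n: uniformly bounded by 1 and decreasing in n for each x
    return sum((-1) ** (n + 1) / n * x ** n for n in range(1, N + 1))

# Abel's uniform convergence test gives uniform convergence of the
# combined series on [0, 1]; its sum is log(1 + x).  The error of the
# N-term partial sum is bounded uniformly in x, roughly by 1/(N + 1).
N = 10_000
errors = [abs(partial_sum(x, N) - log(1 + x)) for x in (0.0, 0.3, 0.7, 1.0)]
print(max(errors))  # small, uniformly over the sample points
```

Note that pointwise convergence at each sampled x would follow from the real-analysis test alone; the uniform bound on the error over all of [0, 1] is what the uniform version adds.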

Notes

  1. (Moretti, 1964, p. 91)
