The Chebyshev polynomials are two sequences of polynomials, denoted Tn(x) and Un(x). They are defined as follows. By the double angle formula, cos(2θ) = 2cos²(θ) − 1 is a polynomial in cos(θ), so define T2(x) = 2x² − 1. The other Tn(x) are defined similarly, using cos(nθ) = Tn(cos(θ)). Similarly, define the other sequence by sin(nθ) = Un−1(cos(θ))sin(θ), where we have used de Moivre's formula to note that sin(nθ) is sin(θ) times a polynomial in cos(θ). For instance, sin(3θ) = (4cos²(θ) − 1)sin(θ) gives U2(x) = 4x² − 1. The Tn(x) and Un(x) are called Chebyshev polynomials of the first and second kind, respectively.
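These trigonometric definitions can be spot-checked numerically; a minimal sketch in plain Python (the helper names T2 and U2 are illustrative, not from any library):

```python
import math

# T2 and U2 as read off from the double angle and de Moivre arguments above.
def T2(x):
    return 2 * x**2 - 1

def U2(x):
    return 4 * x**2 - 1

theta = 0.7  # an arbitrary test angle
# cos(2θ) = T2(cos(θ))
assert math.isclose(math.cos(2 * theta), T2(math.cos(theta)))
# sin(3θ) = U2(cos(θ))·sin(θ)
assert math.isclose(math.sin(3 * theta), U2(math.cos(theta)) * math.sin(theta))
```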
The Tn(x) are orthogonal with respect to the inner product ⟨f, g⟩ = ∫ f(x)g(x)/√(1 − x²) dx over [−1, 1], and the Un(x) are orthogonal with respect to the analogous inner product with weight √(1 − x²). This follows from the fact that the Chebyshev polynomials solve the Chebyshev differential equations (1 − x²)y″ − xy′ + n²y = 0 and (1 − x²)y″ − 3xy′ + n(n + 2)y = 0, respectively.
The Chebyshev polynomials Tn are polynomials with the largest possible leading coefficient whose absolute value on the interval [−1,1] is bounded by 1. They are also the extremal polynomials for many other properties. Chebyshev polynomials are important in approximation theory because the roots of Tn(x), which are also called Chebyshev nodes, are used as nodes in polynomial interpolation. The resulting interpolation polynomial mitigates Runge's phenomenon and provides an approximation that is close to the polynomial of best approximation to a continuous function under the maximum norm. This approximation leads directly to the method of Clenshaw–Curtis quadrature.
These polynomials were named after Pafnuty Chebyshev. The letter T is used because of the alternative transliterations of the name Chebyshev as Tchebycheff, Tchebyshev (French) or Tschebyschow (German).
The Chebyshev polynomials of the first kind can be defined as the unique polynomials satisfying Tn(cos(θ)) = cos(nθ) or, in other words, as the unique polynomials satisfying Tn(x) = cos(n arccos(x)) for n = 0, 1, 2, 3, ..., which is a variant (equivalent transpose) of Schröder's equation, viz. Tn(x) is functionally conjugate to nx, codified in the nesting property below. Further compare to the spread polynomials, in the section below.
That cos nx is an nth-degree polynomial in cos x can be seen by observing that cos nx is the real part of one side of de Moivre's formula. The real part of the other side is a polynomial in cos x and sin x, in which all powers of sin x are even and thus replaceable through the identity cos²x + sin²x = 1.
By the same reasoning, sin nx is the imaginary part of the polynomial, in which all powers of sin x are odd and thus, if one factor of sin x is factored out, the remaining powers can be replaced via the same identity to create an (n − 1)th-degree polynomial in cos x.
The identity is quite useful in conjunction with the recursive generating formula, inasmuch as it enables one to calculate the cosine of any integral multiple of an angle solely in terms of the cosine of the base angle.
Evaluating the first two Chebyshev polynomials, T0(x) = 1 and T1(x) = x, one can straightforwardly determine from the recurrence Tn+1(x) = 2x Tn(x) − Tn−1(x) that T2(x) = 2x² − 1, T3(x) = 4x³ − 3x, and so forth.
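The recurrence lends itself to generating the coefficient lists of the Tn directly; a short sketch (the function name is an illustrative choice):

```python
def chebyshev_T_coeffs(n):
    """Return coefficient lists (lowest degree first) for T_0 .. T_n,
    built from the recurrence T_{k+1}(x) = 2x*T_k(x) - T_{k-1}(x)."""
    polys = [[1], [0, 1]]  # T_0 = 1, T_1 = x
    for k in range(1, n):
        prev, cur = polys[k - 1], polys[k]
        nxt = [0] + [2 * c for c in cur]      # multiply T_k by 2x
        for i, c in enumerate(prev):          # subtract T_{k-1}
            nxt[i] -= c
        polys.append(nxt)
    return polys[: n + 1]

# T_3(x) = 4x^3 - 3x
assert chebyshev_T_coeffs(3)[3] == [0, -3, 0, 4]
```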
Two immediate corollaries are the composition identity (or nesting property specifying a semigroup) Tn(Tm(x)) = Tnm(x), and the expression of complex exponentiation in terms of Chebyshev polynomials: given z = a + bi with r = |z|, zⁿ = rⁿ(Tn(a/r) + i(b/r)Un−1(a/r)).
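Both corollaries, the nesting property Tn(Tm(x)) = Tnm(x) and the complex-power expression zⁿ = rⁿ(Tn(a/r) + i(b/r)Un−1(a/r)) with r = |z|, can be confirmed numerically; a sketch with illustrative helper names:

```python
import math

def T(n, x):
    # Evaluate T_n(x) via the recurrence T_{k+1} = 2x*T_k - T_{k-1}.
    a, b = 1.0, x
    for _ in range(n):
        a, b = b, 2 * x * b - a
    return a

def U(n, x):
    # Evaluate U_n(x) via the same recurrence, with U_0 = 1, U_1 = 2x.
    a, b = 1.0, 2 * x
    for _ in range(n):
        a, b = b, 2 * x * b - a
    return a

x = 0.3
# Nesting property: T_m(T_n(x)) = T_{mn}(x)
assert math.isclose(T(3, T(4, x)), T(12, x), abs_tol=1e-9)

# Complex power: z^n = r^n*T_n(a/r) + i*b*r^(n-1)*U_{n-1}(a/r)
a, b, n = 1.2, 0.5, 5
r = math.hypot(a, b)
zn = complex(a, b) ** n
assert math.isclose(zn.real, r**n * T(n, a / r), abs_tol=1e-9)
assert math.isclose(zn.imag, b * r**(n - 1) * U(n - 1, a / r), abs_tol=1e-9)
```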
When working with Chebyshev polynomials, products of two of them occur quite often. These products can be reduced to combinations of Chebyshev polynomials of lower or higher degree, which makes statements about the product easier to draw. In the following, assume that the index m is greater than or equal to the index n and that n is not negative. For Chebyshev polynomials of the first kind the product expands to 2 Tm(x) Tn(x) = Tm+n(x) + Tm−n(x).
For n = 1 this results in the already known recurrence formula, just arranged differently, and with n = 2 it forms the recurrence relation for all even or all odd Chebyshev polynomials (depending on the parity of the lowest m), Tm+2(x) = (4x² − 2)Tm(x) − Tm−2(x), which makes it possible to design functions with prescribed symmetry properties. Three more useful formulas for evaluating Chebyshev polynomials can be concluded from this product expansion: T2n(x) = 2Tn(x)² − 1, T2n+1(x) = 2Tn+1(x)Tn(x) − x, and T2n−1(x) = 2Tn−1(x)Tn(x) − x.
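The product expansion 2 Tm(x)Tn(x) = Tm+n(x) + Tm−n(x), and the doubling formulas it implies, can be spot-checked numerically; a small sketch:

```python
import math

def T(n, x):
    # Evaluate T_n(x) via the recurrence T_{k+1} = 2x*T_k - T_{k-1}.
    a, b = 1.0, x
    for _ in range(n):
        a, b = b, 2 * x * b - a
    return a

x = 0.37
for m, n in [(5, 2), (7, 3), (4, 4)]:
    # 2*T_m(x)*T_n(x) = T_{m+n}(x) + T_{m-n}(x)
    assert math.isclose(2 * T(m, x) * T(n, x), T(m + n, x) + T(m - n, x),
                        abs_tol=1e-9)

# Doubling formulas that follow from the expansion:
n = 4
assert math.isclose(T(2 * n, x), 2 * T(n, x) ** 2 - 1, abs_tol=1e-9)
assert math.isclose(T(2 * n + 1, x), 2 * T(n + 1, x) * T(n, x) - x, abs_tol=1e-9)
```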
For Chebyshev polynomials of the second kind the products may be written as Um(x) Un(x) = Um−n(x) + Um−n+2(x) + ... + Um+n(x), for m ≥ n.
By this, as above, with n = 2 the recurrence formula for Chebyshev polynomials of the second kind takes, for both types of symmetry, the form Um+2(x) = (4x² − 2)Um(x) − Um−2(x), depending on whether m starts with 2 or 3.
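The second-kind product rule, Um·Un = Um−n + Um−n+2 + ... + Um+n for m ≥ n, can likewise be verified numerically; a sketch:

```python
import math

def U(n, x):
    # Evaluate U_n(x) via the recurrence U_{k+1} = 2x*U_k - U_{k-1}.
    a, b = 1.0, 2 * x
    for _ in range(n):
        a, b = b, 2 * x * b - a
    return a

x, m, n = 0.42, 6, 3
# U_m(x)*U_n(x) = sum of U_{m-n+2k}(x) for k = 0..n
lhs = U(m, x) * U(n, x)
rhs = sum(U(m - n + 2 * k, x) for k in range(n + 1))
assert math.isclose(lhs, rhs, abs_tol=1e-9)
```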
Relations between Chebyshev polynomials of the first and second kinds
The Chebyshev polynomials of the first and second kinds correspond to a complementary pair of Lucas sequences Ṽn(P, Q) and Ũn(P, Q) with parameters P = 2x and Q = 1: Ṽn(2x, 1) = 2Tn(x) and Ũn(2x, 1) = Un−1(x).
It follows that they also satisfy a pair of mutual recurrence equations: Tn+1(x) = x Tn(x) − (1 − x²) Un−1(x) and Un+1(x) = x Un(x) + Tn+1(x).
The Chebyshev polynomials of the first and second kinds are also connected by the following relations: Tn(x) = ½(Un(x) − Un−2(x)) and Tn(x) = Un(x) − x Un−1(x).
The recurrence relationship of the derivative of Chebyshev polynomials can be derived from these relations: 2 Tn(x) = (1/(n + 1)) d/dx Tn+1(x) − (1/(n − 1)) d/dx Tn−1(x) for n = 2, 3, ...; equivalently, d/dx Tn(x) = n Un−1(x).
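Several of these first/second-kind relations, e.g. Tn+1 = xTn − (1 − x²)Un−1, Tn = ½(Un − Un−2), and T′n = nUn−1, can be checked numerically; a sketch using simple recurrence evaluators:

```python
import math

def T(n, x):
    a, b = 1.0, x
    for _ in range(n):
        a, b = b, 2 * x * b - a
    return a

def U(n, x):
    a, b = 1.0, 2 * x
    for _ in range(n):
        a, b = b, 2 * x * b - a
    return a

x, n = 0.31, 5
# Mutual recurrences between the two kinds
assert math.isclose(T(n + 1, x), x * T(n, x) - (1 - x**2) * U(n - 1, x),
                    abs_tol=1e-9)
assert math.isclose(U(n + 1, x), x * U(n, x) + T(n + 1, x), abs_tol=1e-9)
# T_n = (U_n - U_{n-2}) / 2
assert math.isclose(T(n, x), (U(n, x) - U(n - 2, x)) / 2, abs_tol=1e-9)
# dT_n/dx = n * U_{n-1}, checked via a central difference
h = 1e-6
dT = (T(n, x + h) - T(n, x - h)) / (2 * h)
assert math.isclose(dT, n * U(n - 1, x), rel_tol=1e-6)
```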
A Chebyshev polynomial of either kind with degree n has n different simple roots, called Chebyshev roots, in the interval [−1,1]. The roots of the Chebyshev polynomial of the first kind are sometimes called Chebyshev nodes because they are used as nodes in polynomial interpolation. Using the trigonometric definition and the fact that cos((2k + 1)π/2) = 0 for any integer k,
one can show that the roots of Tn are xk = cos((2k + 1)π/(2n)), k = 0, ..., n − 1.
Similarly, the roots of Un are xk = cos(kπ/(n + 1)), k = 1, ..., n.
The extrema of Tn on the interval −1 ≤ x ≤ 1 are located at xk = cos(kπ/n), k = 0, ..., n.
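A quick numerical sketch, assuming the standard formulas xk = cos((2k + 1)π/(2n)) for the roots of Tn and xk = cos(kπ/n) for its extrema on [−1, 1]:

```python
import math

def T(n, x):
    # Evaluate T_n(x) via the recurrence T_{k+1} = 2x*T_k - T_{k-1}.
    a, b = 1.0, x
    for _ in range(n):
        a, b = b, 2 * x * b - a
    return a

n = 6
# Roots (Chebyshev nodes): x_k = cos(pi*(2k + 1)/(2n))
roots = [math.cos(math.pi * (k + 0.5) / n) for k in range(n)]
assert all(abs(T(n, x)) < 1e-9 for x in roots)

# Extrema on [-1, 1]: x_k = cos(k*pi/n), where |T_n| = 1
extrema = [math.cos(k * math.pi / n) for k in range(n + 1)]
assert all(math.isclose(abs(T(n, x)), 1.0, abs_tol=1e-9) for x in extrema)
```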
One unique property of the Chebyshev polynomials of the first kind is that on the interval −1 ≤ x ≤ 1 all of the extrema have values that are either −1 or 1. Thus these polynomials have only two finite critical values, the defining property of Shabat polynomials. Both the first and second kinds of Chebyshev polynomial have extrema at the endpoints, given by: Tn(1) = 1, Tn(−1) = (−1)ⁿ, Un(1) = n + 1, and Un(−1) = (−1)ⁿ(n + 1).
which, if evaluated as shown above (dTn/dx = n sin(nθ)/sin(θ), with x = cos(θ)), poses a problem because the expression is indeterminate at x = ±1. Since the function is a polynomial, all of the derivatives must exist for all real numbers, so taking the limit of the expression above should yield the desired value:
where only x = 1 is considered for now. Factoring the denominator:
Since the limit as a whole must exist, the limit of the numerator and denominator must independently exist, and
The denominator still limits to zero, which implies that the numerator must limit to zero as well, i.e. Un−1(1) = nTn(1) = n, which will be useful later on. Since the numerator and denominator are both limiting to zero, L'Hôpital's rule applies:
The proof for x = −1 is similar, with the fact that Tn(−1) = (−1)n being important.
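The resulting endpoint values T′n(1) = n² and T′n(−1) = (−1)ⁿ⁺¹n² can be confirmed numerically via central differences; a sketch:

```python
import math

def T(n, x):
    # Evaluate T_n(x) via the recurrence T_{k+1} = 2x*T_k - T_{k-1}.
    a, b = 1.0, x
    for _ in range(n):
        a, b = b, 2 * x * b - a
    return a

# T_n'(1) = n^2 and T_n'(-1) = (-1)^(n+1) * n^2
h = 1e-6
for n in range(1, 8):
    d_plus = (T(n, 1 + h) - T(n, 1 - h)) / (2 * h)
    d_minus = (T(n, -1 + h) - T(n, -1 - h)) / (2 * h)
    assert math.isclose(d_plus, n**2, rel_tol=1e-4)
    assert math.isclose(d_minus, (-1) ** (n + 1) * n**2, rel_tol=1e-4)
```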
Indeed, the following, more general formula holds for the p-th derivative: Tn⁽ᵖ⁾(±1) = (±1)ⁿ⁺ᵖ ∏k=0..p−1 (n² − k²)/(2k + 1).
This latter result is of great use in the numerical solution of eigenvalue problems.
where the prime at the summation symbols means that the term contributed by k = 0 is to be halved, if it appears.
Concerning integration, the first derivative of the Tn implies that ∫ Un(x) dx = Tn+1(x)/(n + 1) + C,
and the recurrence relation for the first kind polynomials involving derivatives establishes that for n ≥ 2, ∫ Tn(x) dx = ½(Tn+1(x)/(n + 1) − Tn−1(x)/(n − 1)) + C.
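These antiderivative formulas, ∫Un dx = Tn+1/(n + 1) + C and ∫Tn dx = ½(Tn+1/(n + 1) − Tn−1/(n − 1)) + C for n ≥ 2, can be verified by differentiating the right-hand sides; a sketch:

```python
import math

def T(n, x):
    a, b = 1.0, x
    for _ in range(n):
        a, b = b, 2 * x * b - a
    return a

def U(n, x):
    a, b = 1.0, 2 * x
    for _ in range(n):
        a, b = b, 2 * x * b - a
    return a

h, x, n = 1e-6, 0.25, 4
# d/dx [T_{n+1}(x)/(n+1)] should equal U_n(x)
dF1 = (T(n + 1, x + h) - T(n + 1, x - h)) / (2 * h) / (n + 1)
assert math.isclose(dF1, U(n, x), rel_tol=1e-6)

# d/dx [ (T_{n+1}/(n+1) - T_{n-1}/(n-1)) / 2 ] should equal T_n(x), n >= 2
F = lambda t: (T(n + 1, t) / (n + 1) - T(n - 1, t) / (n - 1)) / 2
dF2 = (F(x + h) - F(x - h)) / (2 * h)
assert math.isclose(dF2, T(n, x), rel_tol=1e-6, abs_tol=1e-9)
```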
Remark: By the Equioscillation theorem, among all the polynomials of degree ≤ n, the polynomial f minimizes ||f||∞ on [−1,1] if and only if there are n + 2 points −1 ≤ x0 < x1 < ... < xn + 1 ≤ 1 such that |f(xi)| = ||f||∞.
Of course, the null polynomial on the interval [−1,1] can be approximated by itself and minimizes the ∞-norm.
Above, however, |f| reaches its maximum only n + 1 times because we are searching for the best polynomial of degree n ≥ 1 (therefore the theorem invoked previously cannot be used).
For every nonnegative integer n, Tn(x) and Un(x) are both polynomials of degree n. They are even or odd functions of x as n is even or odd, so when written as polynomials of x, they have only even or odd degree terms respectively. In fact, T2n(x) = Tn(2x² − 1), an even polynomial in x.
The leading coefficient of Tn is 2ⁿ⁻¹ if n ≥ 1, but 1 if n = 0.
The non-smooth function (top) y = −x3H(−x), where H is the Heaviside step function, and (bottom) the 5th partial sum of its Chebyshev expansion. The 7th sum is indistinguishable from the original function at the resolution of the graph.
In the appropriate Sobolev space, the set of Chebyshev polynomials forms an orthogonal basis, so that a function in the same space can, on −1 ≤ x ≤ 1, be expressed via the expansion f(x) = Σn an Tn(x).
Furthermore, as mentioned previously, the Chebyshev polynomials form an orthogonal basis which (among other things) implies that the coefficients an can be determined easily through the application of an inner product. This sum is called a Chebyshev series or a Chebyshev expansion.
Since a Chebyshev series is related to a Fourier cosine series through a change of variables, all of the theorems, identities, etc. that apply to Fourier series have a Chebyshev counterpart. These attributes include:
The Chebyshev polynomials form a complete orthogonal system.
The Chebyshev series converges to f(x) if the function is piecewise smooth and continuous. The smoothness requirement can be relaxed in most cases — as long as there are a finite number of discontinuities in f(x) and its derivatives.
At a discontinuity, the series will converge to the average of the right and left limits.
The abundance of the theorems and identities inherited from Fourier series makes the Chebyshev polynomials important tools in numerical analysis; for example, they are the most popular general purpose basis functions used in the spectral method, often in favor of trigonometric series due to generally faster convergence for continuous functions (Gibbs' phenomenon is still a problem).
Consider the Chebyshev expansion of log(1 + x). One can express log(1 + x) = Σn an Tn(x). One can find the coefficients an either through the application of an inner product or by the discrete orthogonality condition. For the inner product, a0 = (1/π)∫₋₁¹ log(1 + x)/√(1 − x²) dx and an = (2/π)∫₋₁¹ log(1 + x) Tn(x)/√(1 − x²) dx for n ≥ 1.
Alternatively, when the inner product of the function being approximated cannot be evaluated, the discrete orthogonality condition gives an often useful result for approximate coefficients,
where δij is the Kronecker delta and the xk are the N Gauss–Chebyshev zeros of TN(x): xk = cos(π(k + ½)/N), k = 0, ..., N − 1.
For any N, these approximate coefficients provide an exact approximation to the function at xk with a controlled error between those points. The exact coefficients are obtained with N = ∞, thus representing the function exactly at all points in [−1,1]. The rate of convergence depends on the function and its smoothness.
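For the log(1 + x) example, the approximate coefficients from the discrete orthogonality condition can be computed directly; a sketch in plain Python (function names are illustrative), which also checks the interpolation property at the nodes:

```python
import math

def T(n, x):
    # Evaluate T_n(x) via the recurrence T_{k+1} = 2x*T_k - T_{k-1}.
    a, b = 1.0, x
    for _ in range(n):
        a, b = b, 2 * x * b - a
    return a

def cheb_coeffs(f, N):
    """Approximate Chebyshev coefficients from the N Gauss-Chebyshev
    zeros x_k = cos(pi*(k + 1/2)/N) of T_N (discrete orthogonality)."""
    xs = [math.cos(math.pi * (k + 0.5) / N) for k in range(N)]
    coeffs = [
        (1 if j == 0 else 2) / N * sum(f(x) * T(j, x) for x in xs)
        for j in range(N)
    ]
    return coeffs, xs

f = lambda x: math.log(1 + x)
coeffs, xs = cheb_coeffs(f, 16)
# By construction, the truncated series reproduces f at the nodes.
for x in xs:
    approx = sum(a * T(j, x) for j, a in enumerate(coeffs))
    assert math.isclose(approx, f(x), abs_tol=1e-9)
```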
As an interpolant, the N coefficients of the (N − 1)th partial sum are usually obtained on the Chebyshev–Gauss–Lobatto points (or Lobatto grid), which results in minimum error and avoids Runge's phenomenon associated with a uniform grid. This collection of points corresponds to the extrema of the highest order polynomial in the sum, plus the endpoints, and is given by: xk = cos(kπ/(N − 1)), k = 0, 1, ..., N − 1.
Shifted Chebyshev polynomials of the first kind are defined as T*n(x) = Tn(2x − 1). When the argument 2x − 1 of the Chebyshev polynomial lies in [−1, 1], the argument x of the shifted Chebyshev polynomial lies in [0, 1]. Similarly, one can define shifted polynomials for generic intervals [a, b].
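A minimal sketch of the shift T*n(x) = Tn(2x − 1) and of the analogous generic-interval shift (the helper names are illustrative):

```python
import math

def T(n, x):
    # Evaluate T_n(x) via the recurrence T_{k+1} = 2x*T_k - T_{k-1}.
    a, b = 1.0, x
    for _ in range(n):
        a, b = b, 2 * x * b - a
    return a

def T_shifted(n, x):
    # T*_n(x) = T_n(2x - 1), mapping [0, 1] onto [-1, 1].
    return T(n, 2 * x - 1)

# T*_2(x) = T_2(2x - 1) = 8x^2 - 8x + 1
x = 0.3
assert math.isclose(T_shifted(2, x), 8 * x**2 - 8 * x + 1, abs_tol=1e-12)

def T_on_interval(n, x, a, b):
    # Generic shift to [a, b]: map x linearly onto [-1, 1].
    return T(n, (2 * x - a - b) / (b - a))
```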
Rivlin, Theodore J. (1974). The Chebyshev Polynomials. Pure and Applied Mathematics. New York–London–Sydney: Wiley-Interscience [John Wiley & Sons]. Chapter 2, "Extremal Properties", pp. 56–123.
Chebyshev polynomials were first presented in: Chebyshev, P. L. (1854). "Théorie des mécanismes connus sous le nom de parallélogrammes". Mémoires des Savants étrangers présentés à l'Académie de Saint-Pétersbourg. 7: 539–586.
Dette, Holger (1995). "A Note on Some Peculiar Nonlinear Extremal Phenomena of the Chebyshev Polynomials". Proceedings of the Edinburgh Mathematical Society. 38 (2): 343–355. arXiv:math/9406222. doi:10.1017/S001309150001912X.