Asymptotic analysis

In mathematical analysis, asymptotic analysis, also known as asymptotics, is a method of describing limiting behavior.

As an illustration, suppose that we are interested in the properties of a function f(n) as n becomes very large. If f(n) = n² + 3n, then as n becomes very large, the term 3n becomes insignificant compared to n². The function f(n) is said to be "asymptotically equivalent to n², as n → ∞". This is often written symbolically as f(n) ~ n², which is read as "f(n) is asymptotic to n²".
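This ratio convergence can be checked directly; a minimal Python sketch (the function name is illustrative):

```python
# Numerically illustrate that f(n) = n^2 + 3n is asymptotic to n^2:
# the ratio f(n) / n^2 tends to 1 as n grows.
def f(n):
    return n**2 + 3*n

for n in (10, 1_000, 100_000):
    print(n, f(n) / n**2)
# The ratio equals 1 + 3/n, so it approaches 1 as n grows.
```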

An example of an important asymptotic result is the prime number theorem. The theorem states that if π(x) is the number of prime numbers that are less than or equal to x, then

${\displaystyle \pi (x)\sim {\frac {x}{\log x}}.}$
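The theorem can be illustrated numerically; in the sketch below, the sieve helper `prime_count` is our own name, not part of any library:

```python
import math

def prime_count(x):
    """pi(x) via a simple Sieve of Eratosthenes."""
    sieve = bytearray([1]) * (x + 1)
    sieve[0:2] = b"\x00\x00"
    for p in range(2, math.isqrt(x) + 1):
        if sieve[p]:
            sieve[p*p::p] = bytearray(len(sieve[p*p::p]))
    return sum(sieve)

for x in (10**4, 10**5, 10**6):
    print(x, prime_count(x) / (x / math.log(x)))
# The ratio pi(x) / (x / log x) drifts slowly toward 1
# (about 1.08 at x = 10^6); the convergence is notoriously slow.
```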

Definition

Formally, given functions f(x) and g(x), we define a binary relation

${\displaystyle f(x)\sim g(x)\quad ({\text{as }}x\to \infty )}$

if and only if (de Bruijn 1981, §1.4)

${\displaystyle \lim _{x\to \infty }{\frac {f(x)}{g(x)}}=1.}$

The symbol ~ is the tilde. The relation is an equivalence relation on the set of functions of x; the functions f and g are said to be asymptotically equivalent. The domain of f and g can be any set for which the limit is defined: e.g. real numbers, complex numbers, positive integers.

The same notation is also used for other ways of passing to a limit: e.g. x → 0, x ↓ 0, |x| → 0. The way of passing to the limit is often not stated explicitly, if it is clear from the context.

Although the above definition is common in the literature, it is problematic if g(x) is zero infinitely often as x goes to the limiting value. For that reason, some authors use an alternative definition. The alternative definition, in little-o notation, is that f ~ g if and only if

${\displaystyle f(x)-g(x)=o(g(x)).}$

This definition is equivalent to the prior definition if g(x) is not zero in some neighbourhood of the limiting value.[1][2]

Properties

If ${\displaystyle f\sim g}$, then

• ${\displaystyle f^{r}\sim g^{r}}$ for every real r, and
• ${\displaystyle \log(f)\sim \log(g)}$, provided that ${\displaystyle g}$ does not tend to 1.

If ${\displaystyle f\sim g}$ and ${\displaystyle a\sim b}$, then

• ${\displaystyle f\times a\sim g\times b,}$ and
• ${\displaystyle f/a\sim g/b.}$

This allows asymptotically equivalent functions to be freely exchanged in many algebraic expressions.
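A quick numerical sanity check of these rules, with functions chosen purely for the sketch:

```python
import math

# Illustrative pairs: f ~ g and a ~ b (names chosen for this sketch only).
f = lambda n: n**2 + 3*n          # f ~ g with g(n) = n^2
g = lambda n: n**2
a = lambda n: 2*n + math.sqrt(n)  # a ~ b with b(n) = 2n
b = lambda n: 2*n

n = 10**8
print((f(n) * a(n)) / (g(n) * b(n)))  # product rule: ratio near 1
print((f(n) / a(n)) / (g(n) / b(n)))  # quotient rule: ratio near 1
print(f(n)**3 / g(n)**3)              # power rule (r = 3): ratio near 1
```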

Examples of asymptotic formulas

${\displaystyle n!\sim {\sqrt {2\pi n}}\left({\frac {n}{e}}\right)^{n}}$
—this is Stirling's approximation
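Stirling's approximation can be compared with exact factorials; `stirling` below is our helper name:

```python
import math

def stirling(n):
    # Stirling's approximation: sqrt(2*pi*n) * (n/e)^n
    return math.sqrt(2 * math.pi * n) * (n / math.e) ** n

for n in (5, 20, 50):
    print(n, math.factorial(n) / stirling(n))
# The ratio n! / stirling(n) tends to 1 from above, roughly like 1 + 1/(12n).
```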
For a positive integer n, the partition function, p(n), gives the number of ways of writing the integer n as a sum of positive integers, where the order of addends is not considered.
${\displaystyle p(n)\sim {\frac {1}{4n{\sqrt {3}}}}e^{\pi {\sqrt {2n/3}}}}$
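This Hardy-Ramanujan asymptotic formula can be compared against exact values of p(n), here computed by a standard dynamic program (function names are ours):

```python
import math

def partitions(n):
    """p(n) by dynamic programming over allowed parts 1..n."""
    p = [1] + [0] * n
    for part in range(1, n + 1):
        for total in range(part, n + 1):
            p[total] += p[total - part]
    return p[n]

def hardy_ramanujan(n):
    # The asymptotic formula above: e^(pi*sqrt(2n/3)) / (4*n*sqrt(3))
    return math.exp(math.pi * math.sqrt(2 * n / 3)) / (4 * n * math.sqrt(3))

for n in (50, 100, 200):
    print(n, hardy_ramanujan(n) / partitions(n))
# The ratio approaches 1 slowly (about 1.05 at n = 100).
```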
The Airy function, Ai(x), is a solution of the differential equation y'' − xy = 0; it has many applications in physics.
${\displaystyle \operatorname {Ai} (x)\sim {\frac {e^{-{\frac {2}{3}}x^{3/2}}}{2{\sqrt {\pi }}x^{1/4}}}}$
For large argument, the Hankel functions of the first and second kind satisfy
${\displaystyle {\begin{aligned}H_{\alpha }^{(1)}(z)&\sim {\sqrt {\frac {2}{\pi z}}}e^{i\left(z-{\frac {\alpha \pi }{2}}-{\frac {\pi }{4}}\right)}\\H_{\alpha }^{(2)}(z)&\sim {\sqrt {\frac {2}{\pi z}}}e^{-i\left(z-{\frac {\alpha \pi }{2}}-{\frac {\pi }{4}}\right)}\end{aligned}}}$

Asymptotic expansion

An asymptotic expansion of a function f(x) is in practice an expression of that function in terms of a series, the partial sums of which do not necessarily converge, but such that taking any initial partial sum provides an asymptotic formula for f. The idea is that successive terms provide an increasingly accurate description of the order of growth of f.

In symbols, it means we have

${\displaystyle f\sim g_{1}}$

but also

${\displaystyle f-g_{1}\sim g_{2}}$

and

${\displaystyle f-g_{1}-\cdots -g_{k-1}\sim g_{k}}$

for each fixed k.

In view of the definition of the ${\displaystyle \sim }$ symbol, the last equation means

${\displaystyle f-(g_{1}+\cdots +g_{k})=o(g_{k})}$

in the little o notation, i.e.,

${\displaystyle f-(g_{1}+\cdots +g_{k})}$ is much smaller than ${\displaystyle g_{k}}$.

The relation

${\displaystyle f-g_{1}-\cdots -g_{k-1}\sim g_{k}}$ takes its full meaning if ${\displaystyle \forall k,g_{k+1}=o(g_{k})}$,

which means the ${\displaystyle g_{k}}$ form an asymptotic scale.

In that case, some authors may abusively write

${\displaystyle f\sim g_{1}+\cdots +g_{k}}$

to denote the statement

${\displaystyle f-(g_{1}+\cdots +g_{k})=o(g_{k})\,.}$

One should however be careful that this is not a standard use of the ${\displaystyle \sim }$ symbol, and that it does not correspond to the definition given in § Definition.

In the present situation, the relation ${\displaystyle g_{k}=o(g_{k-1})}$ actually follows from combining steps k and (k − 1): subtracting ${\displaystyle f-g_{1}-\cdots -g_{k-2}=g_{k-1}+o(g_{k-1})}$ from ${\displaystyle f-g_{1}-\cdots -g_{k-2}-g_{k-1}=g_{k}+o(g_{k})}$, one gets

${\displaystyle g_{k}+o(g_{k})=o(g_{k-1})\,,}$

i.e., ${\displaystyle g_{k}=o(g_{k-1})}$.

In case the asymptotic expansion does not converge, for any particular value of the argument there will be a particular partial sum which provides the best approximation and adding additional terms will decrease the accuracy. This optimal partial sum will usually have more terms as the argument approaches the limit value.
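This best-truncation behaviour can be observed with the expansion xeˣE₁(x) ~ Σ (−1)ⁿ n!/xⁿ of the exponential integral, listed among the examples below. In the sketch, E₁ is evaluated from its convergent series (a standard identity for moderate x); the helper names are ours:

```python
import math

def E1(x):
    # Exponential integral E_1(x) via its convergent series
    # E_1(x) = -gamma - ln(x) + sum_{k>=1} (-1)^(k+1) x^k / (k * k!),
    # adequate for moderate x.
    gamma = 0.5772156649015329  # Euler-Mascheroni constant
    total = -gamma - math.log(x)
    term = 1.0
    for k in range(1, 80):
        term *= x / k          # term = x^k / k!
        total += (-1) ** (k + 1) * term / k
    return total

x = 10.0
exact = x * math.exp(x) * E1(x)   # the quantity the expansion approximates

# Partial sums of the divergent series sum_{n>=0} (-1)^n n! / x^n:
# the error shrinks until n is about x, then grows without bound.
s, term = 0.0, 1.0
errors = []
for n in range(0, 26):
    s += term
    errors.append(abs(s - exact))
    term *= -(n + 1) / x          # next term: (-1)^(n+1) (n+1)! / x^(n+1)
print(min(range(26), key=errors.__getitem__))  # best truncation near n = x = 10
```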

Examples of asymptotic expansions

${\displaystyle {\frac {e^{x}}{x^{x}{\sqrt {2\pi x}}}}\Gamma (x+1)\sim 1+{\frac {1}{12x}}+{\frac {1}{288x^{2}}}-{\frac {139}{51840x^{3}}}-\cdots \ (x\rightarrow \infty )}$
${\displaystyle xe^{x}E_{1}(x)\sim \sum _{n=0}^{\infty }{\frac {(-1)^{n}n!}{x^{n}}}\ (x\rightarrow \infty )}$
${\displaystyle {\sqrt {\pi }}xe^{x^{2}}{\rm {erfc}}(x)\sim 1+\sum _{n=1}^{\infty }(-1)^{n}{\frac {(2n-1)!!}{(2x^{2})^{n}}}\ (x\rightarrow \infty )}$
where (2n − 1)!! is the double factorial.
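The error-function expansion, in the standard form 1 + Σ (−1)ⁿ (2n−1)!!/(2x²)ⁿ, can be checked against Python's built-in math.erfc; `erfc_asymptotic` is our helper for the partial sums:

```python
import math

def erfc_asymptotic(x, terms):
    # Partial sum of 1 + sum_{n>=1} (-1)^n (2n-1)!! / (2x^2)^n
    s, t = 1.0, 1.0
    for n in range(1, terms + 1):
        t *= -(2 * n - 1) / (2 * x * x)   # fold in the next (2n-1)/(2x^2) factor
        s += t
    return s

x = 3.0
exact = math.sqrt(math.pi) * x * math.exp(x * x) * math.erfc(x)
print(exact, erfc_asymptotic(x, 5))
# At x = 3, five terms already agree with the exact value to about 1e-3.
```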

Worked example

Asymptotic expansions often occur when an ordinary series is used in a formal expression that forces the taking of values outside of its domain of convergence. For example, we might start with the ordinary series

${\displaystyle {\frac {1}{1-w}}=\sum _{n=0}^{\infty }w^{n}}$

The expression on the left is valid for every complex ${\displaystyle w\neq 1}$, while the right-hand side converges only for ${\displaystyle |w|<1}$. Multiplying by ${\displaystyle e^{-w/t}}$ and integrating both sides from ${\displaystyle w=0}$ to ${\displaystyle \infty }$ yields

${\displaystyle \int _{0}^{\infty }{\frac {e^{-w/t}}{1-w}}\,dw=\sum _{n=0}^{\infty }t^{n+1}\int _{0}^{\infty }e^{-u}u^{n}\,du}$

The integral on the left hand side can be expressed in terms of the exponential integral. The integral on the right hand side, after the substitution ${\displaystyle u=w/t}$, may be recognized as the gamma function. Evaluating both, one obtains the asymptotic expansion

${\displaystyle e^{-1/t}\operatorname {Ei} \left({\frac {1}{t}}\right)\sim \sum _{n=0}^{\infty }n!\;t^{n+1}}$

Here, the right hand side is clearly not convergent for any non-zero value of t. However, by keeping t small, and truncating the series on the right to a finite number of terms, one may obtain a fairly good approximation to the value of ${\displaystyle \operatorname {Ei} (1/t)}$. Substituting ${\displaystyle x=-1/t}$ and noting that ${\displaystyle \operatorname {Ei} (x)=-E_{1}(-x)}$ results in the asymptotic expansion given earlier in this article.

Asymptotic distribution

In mathematical statistics, an asymptotic distribution is a hypothetical distribution that is, in a sense, the "limiting" distribution of a sequence of distributions. Such a sequence can be described by an ordered set of random variables

${\displaystyle Z_{i}}$

for ${\displaystyle i=1}$ to ${\displaystyle n}$, for some positive integer ${\displaystyle n}$. An asymptotic distribution allows ${\displaystyle i}$ to range without bound; that is, ${\displaystyle n}$ is taken to infinity.

A special case of an asymptotic distribution is when the late entries go to zero—that is, the Zi go to 0 as i goes to infinity. Some instances of "asymptotic distribution" refer only to this special case.

This is based on the notion of an asymptotic function, which cleanly approaches a constant value (the asymptote) as the independent variable goes to infinity; "clean" here means that for any desired closeness ε there is some value of the independent variable beyond which the function never differs from the constant by more than ε.

An asymptote is a straight line that a curve approaches but never meets or crosses. Informally, one may speak of the curve meeting the asymptote "at infinity" although this is not a precise definition. In the equation

${\displaystyle y={\frac {1}{x}},}$

${\displaystyle y}$ becomes arbitrarily small in magnitude as ${\displaystyle x}$ increases, so the curve approaches its horizontal asymptote, the line ${\displaystyle y=0}$.

Applications

Asymptotic analysis is used in several mathematical sciences. In statistics, asymptotic theory provides limiting approximations of the probability distribution of sample statistics, such as the likelihood ratio statistic and the expected value of the deviance. Asymptotic theory does not provide a method of evaluating the finite-sample distributions of sample statistics, however. Non-asymptotic bounds are provided by methods of approximation theory.

Examples of applications are the following.

Asymptotic analysis is a key tool for exploring the ordinary and partial differential equations which arise in the mathematical modelling of real-world phenomena.[3] An illustrative example is the derivation of the boundary layer equations from the full Navier–Stokes equations governing fluid flow. In many cases, the asymptotic expansion is in powers of a small parameter, ε: in the boundary layer case, this is the nondimensional ratio of the boundary layer thickness to a typical length scale of the problem. Indeed, applications of asymptotic analysis in mathematical modelling often[3] center around a nondimensional parameter which has been shown, or assumed, to be small through a consideration of the scales of the problem at hand.
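The flavour of such expansions in a small parameter can be shown with a toy algebraic problem (not the boundary-layer derivation itself): the root of εx² + x − 1 = 0 that stays finite as ε → 0. Expanding in powers of ε gives x(ε) ~ 1 − ε + 2ε² − 5ε³ + ⋯; a sketch comparing the exact root to the three-term expansion:

```python
import math

# Toy perturbation problem: the bounded root of eps*x^2 + x - 1 = 0.
# Its expansion in powers of the small parameter eps is
#   x(eps) ~ 1 - eps + 2*eps^2 - 5*eps^3 + ...
def exact_root(eps):
    return (-1 + math.sqrt(1 + 4 * eps)) / (2 * eps)

def three_term(eps):
    return 1 - eps + 2 * eps**2

for eps in (0.1, 0.01, 0.001):
    print(eps, exact_root(eps) - three_term(eps))
# The residual error shrinks like eps^3 as the parameter decreases.
```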

Asymptotic expansions typically arise in the approximation of certain integrals (Laplace's method, saddle-point method, method of steepest descent) or in the approximation of probability distributions (Edgeworth series). The Feynman graphs in quantum field theory are another example of asymptotic expansions which often do not converge.