# Fisher–Tippett–Gnedenko theorem

In statistics, the Fisher–Tippett–Gnedenko theorem (also known as the Fisher–Tippett theorem or the extreme value theorem) is a general result in extreme value theory regarding the asymptotic distribution of extreme order statistics. The maximum of a sample of iid random variables, after proper renormalization, can converge in distribution only to one of three possible distributions: the Gumbel distribution, the Fréchet distribution, or the Weibull distribution. Credit for the extreme value theorem and its convergence details is given to Fréchet (1927), Fisher and Tippett (1928), Mises (1936), and Gnedenko (1943).

The role of the extremal types theorem for maxima is similar to that of the central limit theorem for averages, except that the central limit theorem applies to the average of a sample from any distribution with finite variance, while the Fisher–Tippett–Gnedenko theorem only states that if the distribution of a normalized maximum converges, then the limit has to be one of a particular class of distributions. It does not state that the distribution of the normalized maximum does converge.

## Statement

Let $X_{1},X_{2},\ldots ,X_{n}$ be a sequence of independent and identically distributed random variables with cumulative distribution function $F$. Suppose that there exist two sequences of real numbers $a_{n}>0$ and $b_{n}\in \mathbb {R}$ such that the following limit exists and is a non-degenerate distribution function:

$\lim _{n\to \infty }P\left({\frac {\max\{X_{1},\dots ,X_{n}\}-b_{n}}{a_{n}}}\leq x\right)=G(x)$ ,

or equivalently:

$\lim _{n\to \infty }F^{n}\left(a_{n}x+b_{n}\right)=G(x)$ .

In such circumstances, the limit distribution $G$ belongs to either the Gumbel, the Fréchet or the Weibull family.

In other words, if the limit above converges, then up to a linear change of coordinates $G(x)$ will assume the form:

$G_{\gamma }(x)=\exp \left(-(1+\gamma \,x)^{-1/\gamma }\right),\;\;1+\gamma \,x>0$

or else

$G_{0}(x)=\exp \left(-\exp(-x)\right)$

for some parameter $\gamma$. This is the cumulative distribution function of the generalized extreme value (GEV) distribution with extreme value index $\gamma$. The GEV distribution unites the Gumbel, Fréchet, and Weibull families into a single distribution. Note that the second formula (the Gumbel distribution) is the limit of the first as $\gamma$ goes to zero.
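As a numerical illustration (a minimal sketch; `gev_cdf` is a hypothetical helper, not a library function), the following evaluates $G_{\gamma }$ and checks that a tiny $\gamma$ reproduces the Gumbel case:

```python
import math

def gev_cdf(x, gamma):
    """CDF of the generalized extreme value distribution G_gamma
    (location 0, scale 1)."""
    if gamma == 0.0:
        # Gumbel case: the gamma -> 0 limit of the general formula
        return math.exp(-math.exp(-x))
    t = 1.0 + gamma * x
    if t <= 0.0:
        # Outside the support: G = 0 below it (gamma > 0), 1 above it (gamma < 0)
        return 0.0 if gamma > 0 else 1.0
    return math.exp(-t ** (-1.0 / gamma))

# The Gumbel CDF is the pointwise limit of G_gamma as gamma -> 0:
print(gev_cdf(1.3, 0.0))    # Gumbel value at x = 1.3
print(gev_cdf(1.3, 1e-6))   # nearly identical for tiny gamma
```

For $\gamma >0$ this is the Fréchet family and for $\gamma <0$ the (reversed) Weibull family, matching the three cases of the theorem.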

## Conditions of convergence

The Fisher–Tippett–Gnedenko theorem is a statement about the convergence of the limiting distribution $G(x)$ above. The study of conditions for convergence of $G$ to particular cases of the generalized extreme value distribution began with Mises (1936) and was further developed by Gnedenko (1943).

Let $F$ be the distribution function of $X$, and $X_{1},\dots ,X_{n}$ an i.i.d. sample thereof. Also let $x^{*}$ be the upper end-point of the distribution, i.e. $x^{*}=\sup\{x\mid F(x)<1\}$ (which may be infinite). The limiting distribution of the normalized sample maximum, given by $G$ above, will then be:

• A Fréchet distribution ($\gamma >0$ ) if and only if $x^{*}=\infty$ and $\lim _{t\rightarrow \infty }{\frac {1-F(ut)}{1-F(t)}}=u^{-1/|\gamma |}$ for all $u>0$ .
This corresponds to what is called a heavy tail. In this case, possible sequences that will satisfy the theorem conditions are $b_{n}=0$ and $a_{n}=F^{-1}\left(1-{\frac {1}{n}}\right)$ .
• A Gumbel distribution ($\gamma =0$ ), with $x^{*}$ finite or infinite, if and only if $\lim _{t\rightarrow x^{*}}{\frac {1-F(t+uf(t))}{1-F(t)}}=e^{-u}$ for all $u>0$ with $f(t):={\frac {\int _{t}^{x^{*}}(1-F(s))\,ds}{1-F(t)}}$ .
Possible sequences here are $b_{n}=F^{-1}\left(1-{\frac {1}{n}}\right)$ and $a_{n}=f\left(F^{-1}\left(1-{\frac {1}{n}}\right)\right)$ .
• A Weibull distribution ($\gamma <0$ ) if and only if $x^{*}$ is finite and $\lim _{t\rightarrow 0^{+}}{\frac {1-F(x^{*}-ut)}{1-F(x^{*}-t)}}=u^{1/|\gamma |}$ for all $u>0$ .
Possible sequences here are $b_{n}=x^{*}$ and $a_{n}=x^{*}-F^{-1}\left(1-{\frac {1}{n}}\right)$ .
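As a sketch of the Fréchet case, the snippet below (using an assumed Pareto example $F(x)=1-x^{-\alpha }$ for $x\geq 1$, so $\gamma =1/\alpha >0$) checks numerically that with $b_{n}=0$ and $a_{n}=F^{-1}(1-1/n)$ the normalized maximum CDF $F^{n}(a_{n}x)$ approaches the Fréchet limit $\exp(-x^{-\alpha })$:

```python
import math

alpha = 2.0  # assumed tail index for this illustration; gamma = 1/alpha

def F(x):
    """Pareto CDF: F(x) = 1 - x**(-alpha) for x >= 1."""
    return 1.0 - x ** (-alpha) if x >= 1.0 else 0.0

def quantile(p):
    """Inverse of F."""
    return (1.0 - p) ** (-1.0 / alpha)

n = 10_000
a_n = quantile(1.0 - 1.0 / n)   # the sequence from the bullet above; b_n = 0
for x in (0.5, 1.0, 2.0):
    lhs = F(a_n * x) ** n            # F^n(a_n x + b_n)
    rhs = math.exp(-x ** (-alpha))   # Frechet limit
    print(x, round(lhs, 5), round(rhs, 5))
```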

## Examples

### Fréchet distribution

For the Cauchy distribution with scale parameter $\pi$, the density is

$f(x)=(\pi ^{2}+x^{2})^{-1}$

and the cumulative distribution function is

$F(x)=1/2+{\frac {1}{\pi }}\arctan(x/\pi ).$

The tail $1-F(x)$ is asymptotic to $1/x$, or

$\ln F(x)\sim -1/x,$

and we have

$\ln F(x)^{n}=n\ln F(x)\sim -n/x.$

Thus we have

$F(x)^{n}\approx \exp(-n/x),$

and letting $u=x/n-1$ (so that $x=n(1+u)$) gives

$\lim _{n\to \infty }F(nu+n)^{n}=\exp(-(1+u)^{-1})=G_{1}(u)$

for any $u$ with $1+u>0$. The typical size of the maximum therefore grows linearly with $n$.
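The limit above can be checked numerically; a minimal sketch using the scale-$\pi$ Cauchy CDF from this example:

```python
import math

def F(x):
    """CDF of the Cauchy distribution with scale pi, as in the text."""
    return 0.5 + math.atan(x / math.pi) / math.pi

def G1(u):
    """Frechet limit G_1(u) = exp(-(1+u)^{-1}) for 1 + u > 0."""
    return math.exp(-1.0 / (1.0 + u))

n = 1_000_000
for u in (-0.5, 0.0, 1.0, 3.0):
    # F(nu + n)^n is already very close to G_1(u) for this n
    print(u, F(n * u + n) ** n, G1(u))
```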

### Gumbel distribution

Let us take the standard normal distribution, with cumulative distribution function

$F(x)={\frac {1}{2}}{\text{erfc}}(-x/{\sqrt {2}}).$

We have

$\ln F(x)\sim -{\frac {\exp(-x^{2}/2)}{{\sqrt {2\pi }}x}}$

and

$\ln F(x)^{n}=n\ln F(x)\sim -{\frac {n\exp(-x^{2}/2)}{{\sqrt {2\pi }}x}}.$

Thus we have

$F(x)^{n}\approx \exp \left(-{\frac {n\exp(-x^{2}/2)}{{\sqrt {2\pi }}x}}\right).$

If we define $c_{n}$ as the value that satisfies

${\frac {n\exp(-c_{n}^{2}/2)}{{\sqrt {2\pi }}c_{n}}}=1,$

then around $x=c_{n}$

${\frac {n\exp(-x^{2}/2)}{{\sqrt {2\pi }}x}}\approx \exp(c_{n}(c_{n}-x)).$

As $n$ increases, this becomes a good approximation over a wider and wider range of $c_{n}(c_{n}-x)$, so letting $u=c_{n}(c_{n}-x)$ we find that

$\lim _{n\to \infty }F(u/c_{n}+c_{n})^{n}=\exp(-\exp(-u))=G_{0}(u).$

Equivalently,

$\lim _{n\to \infty }P{\Bigl (}{\frac {\max\{X_{1},\ldots ,X_{n}\}-c_{n}}{1/c_{n}}}\leq u{\Bigr )}=\exp(-\exp(-u))=G_{0}(u).$

We can see that $\ln c_{n}\sim (\ln \ln n)/2$ and then

$c_{n}\sim {\sqrt {2\ln n}},$

so the maximum is expected to climb ever more slowly toward infinity.
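A numerical sketch of this example: $c_{n}$ is found by bisection on its defining equation, and $F(u/c_{n}+c_{n})^{n}$ is compared with the Gumbel limit. Convergence for the normal distribution is notoriously slow (errors of order $1/\ln n$), so even $n=10^{8}$ agrees only to a few percent:

```python
import math

def F(x):
    """Standard normal CDF via the complementary error function."""
    return 0.5 * math.erfc(-x / math.sqrt(2.0))

def solve_cn(n):
    """Solve n*exp(-c^2/2)/(sqrt(2*pi)*c) = 1 for c by bisection;
    the left-hand side is decreasing in c on [1, 100]."""
    lo, hi = 1.0, 100.0
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if n * math.exp(-mid * mid / 2.0) / (math.sqrt(2.0 * math.pi) * mid) > 1.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

n = 10 ** 8
c_n = solve_cn(n)
print(c_n, math.sqrt(2.0 * math.log(n)))   # c_n is near sqrt(2 ln n)
for u in (-1.0, 0.0, 2.0):
    # F(u/c_n + c_n)^n approximates the Gumbel CDF exp(-exp(-u))
    print(u, F(u / c_n + c_n) ** n, math.exp(-math.exp(-u)))
```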

### Weibull distribution

We may take the simplest example, a uniform distribution between 0 and 1, with cumulative distribution function

$F(x)=x$ from 0 to 1.

As $x$ approaches 1 we have

$\ln F(x)^{n}=n\ln F(x)\sim -n(1-x).$

Then

$F(x)^{n}\approx \exp(nx-n).$

Letting $u=1+n(x-1)$ we have

$\lim _{n\to \infty }F(u/n+1-1/n)^{n}=\exp \left(-(1-u)\right)=G_{-1}(u)$

for $u<1$. The expected maximum approaches 1 inversely proportionally to $n$.
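A quick numerical sketch of this limit for the uniform distribution:

```python
import math

def F(x):
    """Uniform(0, 1) CDF."""
    return min(max(x, 0.0), 1.0)

n = 100_000
for u in (-2.0, 0.0, 0.9):
    lhs = F(u / n + 1.0 - 1.0 / n) ** n   # normalized maximum CDF
    rhs = math.exp(-(1.0 - u))            # Weibull limit G_{-1}(u), u < 1
    print(u, lhs, rhs)
```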