# Fisher–Tippett–Gnedenko theorem

In statistics, the Fisher–Tippett–Gnedenko theorem (also the Fisher–Tippett theorem or the extreme value theorem) is a general result in extreme value theory regarding the asymptotic distribution of extreme order statistics. The maximum of a sample of i.i.d. random variables, after proper renormalization, can converge in distribution only to one of three possible distributions: the Gumbel distribution, the Fréchet distribution, or the Weibull distribution. Credit for the extreme value theorem and its convergence details is given to Fréchet (1927),[1] Ronald Fisher and Leonard Henry Caleb Tippett (1928),[2] Mises (1936)[3][4] and Gnedenko (1943).[5]

The role of the extremal types theorem for maxima is similar to that of the central limit theorem for averages, except that the central limit theorem applies to the average of a sample from any distribution with finite variance, while the Fisher–Tippett–Gnedenko theorem only states that if the distribution of a normalized maximum converges, then the limit has to belong to a particular class of distributions. It does not state that the distribution of the normalized maximum does converge.

## Statement

Let ${\displaystyle X_{1},X_{2},\ldots ,X_{n}}$ be a sequence of independent and identically-distributed random variables with cumulative distribution function ${\displaystyle F}$. Suppose that there exist two sequences of real numbers ${\displaystyle a_{n}>0}$ and ${\displaystyle b_{n}\in \mathbb {R} }$ such that the following limit converges to a non-degenerate distribution function:

${\displaystyle \lim _{n\to \infty }P\left({\frac {\max\{X_{1},\dots ,X_{n}\}-b_{n}}{a_{n}}}\leq x\right)=G(x)}$,

or equivalently:

${\displaystyle \lim _{n\to \infty }F^{n}\left(a_{n}x+b_{n}\right)=G(x)}$.

In such circumstances, the limit distribution ${\displaystyle G}$ belongs to either the Gumbel, the Fréchet or the Weibull family.[6]

In other words, if the limit above converges, then ${\displaystyle G(x)}$ takes the form:[7]

${\displaystyle G_{\gamma }(x)=\exp \left(-(1+\gamma \,x)^{-1/\gamma }\right),\;\;1+\gamma \,x>0}$

or else

${\displaystyle G_{0}(x)=\exp \left(-\exp(-x)\right)}$

for some parameter ${\displaystyle \gamma .}$ This is the cumulative distribution function of the generalized extreme value distribution (GEV) with extreme value index ${\displaystyle \gamma }$. The GEV distribution unites the Gumbel, Fréchet and Weibull families into a single parametric form. Note that the second formula (the Gumbel distribution) is the limit of the first as ${\displaystyle \gamma }$ goes to zero.
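The unified formula can be evaluated directly; a minimal Python sketch (the function name `gev_cdf` is ours, not a standard API), including a numeric check that the Gumbel form is recovered as ${\displaystyle \gamma \to 0}$:

```python
import math

def gev_cdf(x, gamma):
    """CDF of the generalized extreme value (GEV) distribution with
    extreme value index gamma (location 0, scale 1)."""
    if gamma == 0.0:
        return math.exp(-math.exp(-x))          # Gumbel limit G_0
    t = 1.0 + gamma * x
    if t <= 0.0:
        # Outside the support: the CDF is 0 (gamma > 0) or 1 (gamma < 0).
        return 0.0 if gamma > 0 else 1.0
    return math.exp(-t ** (-1.0 / gamma))

# The Gumbel form is the gamma -> 0 limit of the general formula:
for x in (-1.0, 0.0, 2.0):
    assert abs(gev_cdf(x, 1e-8) - gev_cdf(x, 0.0)) < 1e-6
```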

## Conditions of convergence

The Fisher–Tippett–Gnedenko theorem is a statement about the possible limiting distributions ${\displaystyle G(x)}$ above. The study of conditions for convergence of ${\displaystyle G}$ to particular cases of the generalized extreme value distribution began with von Mises (1936)[3][5][4] and was further developed by Gnedenko (1943).[5]

Let ${\displaystyle F}$ be the distribution function of ${\displaystyle X}$, and ${\displaystyle X_{1},\dots ,X_{n}}$ an i.i.d. sample thereof. Also let ${\displaystyle x^{*}}$ be the upper end-point of ${\displaystyle F}$ (the population maximum), i.e. ${\displaystyle x^{*}=\sup\{x\mid F(x)<1\}}$. The limiting distribution of the normalized sample maximum, given by ${\displaystyle G}$ above, will then be:[7]

• A Fréchet distribution (${\displaystyle \gamma >0}$) if and only if ${\displaystyle x^{*}=\infty }$ and ${\displaystyle \lim _{t\rightarrow \infty }{\frac {1-F(ut)}{1-F(t)}}=u^{-1/|\gamma |}}$ for all ${\displaystyle u>0}$.
This corresponds to what we may call a heavy tail. In this case, possible sequences that will satisfy the theorem conditions are ${\displaystyle b_{n}=0}$ and ${\displaystyle a_{n}=F^{-1}\left(1-{\frac {1}{n}}\right)}$.
• A Gumbel distribution (${\displaystyle \gamma =0}$), with ${\displaystyle x^{*}}$ finite or infinite, if and only if ${\displaystyle \lim _{t\rightarrow x^{*}}{\frac {1-F(t+u\,f(t))}{1-F(t)}}=e^{-u}}$ for all ${\displaystyle u>0}$, with ${\displaystyle f(t):={\frac {\int _{t}^{x^{*}}(1-F(s))\,ds}{1-F(t)}}}$.
Possible sequences here are ${\displaystyle b_{n}=F^{-1}\left(1-{\frac {1}{n}}\right)}$ and ${\displaystyle a_{n}=f\left(F^{-1}\left(1-{\frac {1}{n}}\right)\right)}$.
• A Weibull distribution (${\displaystyle \gamma <0}$) if and only if ${\displaystyle x^{*}}$ is finite and ${\displaystyle \lim _{t\rightarrow 0^{+}}{\frac {1-F(x^{*}-ut)}{1-F(x^{*}-t)}}=u^{1/|\gamma |}}$ for all ${\displaystyle u>0}$.
Possible sequences here are ${\displaystyle b_{n}=x^{*}}$ and ${\displaystyle a_{n}=x^{*}-F^{-1}\left(1-{\frac {1}{n}}\right)}$.
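These norming sequences can be checked directly. For the standard exponential distribution, ${\displaystyle F(x)=1-e^{-x}}$ (a Gumbel case), the auxiliary function is ${\displaystyle f(t)=1}$, so the recipe above gives ${\displaystyle b_{n}=F^{-1}(1-1/n)=\ln n}$ and ${\displaystyle a_{n}=1}$. A small numeric sketch in Python (our own check, not part of the theorem):

```python
import math

def F(x):
    """CDF of the standard exponential distribution."""
    return 1.0 - math.exp(-x) if x > 0 else 0.0

# Gumbel case: f(t) = 1 for the exponential, so a_n = 1 and b_n = ln n.
n = 10_000
for x in (-1.0, 0.0, 1.5):
    approx = F(1.0 * x + math.log(n)) ** n      # F^n(a_n x + b_n)
    exact = math.exp(-math.exp(-x))             # Gumbel limit G_0(x)
    assert abs(approx - exact) < 1e-3
```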

## Examples

### Fréchet distribution

If we take the Cauchy distribution with density

${\displaystyle f(x)=(\pi ^{2}+x^{2})^{-1}}$

the cumulative distribution function is:

${\displaystyle F(x)=1/2+{\frac {1}{\pi }}\arctan(x/\pi )}$

${\displaystyle 1-F(x)}$ is asymptotic to ${\displaystyle 1/x,}$ or

${\displaystyle \ln F(x)\sim -1/x}$

and we have

${\displaystyle \ln F(x)^{n}=n\ln F(x)\sim -n/x.}$

Thus we have

${\displaystyle F(x)^{n}\approx \exp(-n/x)}$

and letting ${\displaystyle u=x/n-1}$, so that ${\displaystyle x=n(1+u)}$, we obtain

${\displaystyle \lim _{n\to \infty }F(nu+n)^{n}=\exp(-(1+u)^{-1})=G_{1}(u)}$

for any ${\displaystyle u}$ with ${\displaystyle 1+u>0}$. The typical size of the maximum therefore grows linearly with n.
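The limit above can be verified numerically; a quick Python sketch (our own check):

```python
import math

def F(x):
    """CDF of the Cauchy density f(x) = 1/(pi^2 + x^2)."""
    return 0.5 + math.atan(x / math.pi) / math.pi

n = 1_000_000
for u in (0.0, 1.0, 4.0):
    approx = F(n * u + n) ** n              # F(nu + n)^n
    exact = math.exp(-1.0 / (1.0 + u))      # Frechet limit G_1(u)
    assert abs(approx - exact) < 1e-4
```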

### Gumbel distribution

Let us take the normal distribution with cumulative distribution function

${\displaystyle F(x)={\frac {1}{2}}{\text{erfc}}(-x/{\sqrt {2}}).}$

We have

${\displaystyle \ln F(x)\sim -{\frac {\exp(-x^{2}/2)}{{\sqrt {2\pi }}x}}}$

and

${\displaystyle \ln F(x)^{n}=n\ln F(x)\sim -{\frac {n\exp(-x^{2}/2)}{{\sqrt {2\pi }}x}}.}$

Thus we have

${\displaystyle F(x)^{n}\approx \exp \left(-{\frac {n\exp(-x^{2}/2)}{{\sqrt {2\pi }}x}}\right).}$

If we define ${\displaystyle b_{n}}$ as the value that satisfies

${\displaystyle {\frac {n\exp(-b_{n}^{2}/2)}{{\sqrt {2\pi }}b_{n}}}=1}$

then around ${\displaystyle x=b_{n}}$

${\displaystyle {\frac {n\exp(-x^{2}/2)}{{\sqrt {2\pi }}x}}\approx \exp(b_{n}(b_{n}-x)).}$

As n increases, this becomes a good approximation over a wider and wider range of ${\displaystyle b_{n}(x-b_{n})}$, so letting ${\displaystyle u=b_{n}(x-b_{n})}$ we find that

${\displaystyle \lim _{n\to \infty }F(u/b_{n}+b_{n})^{n}=\exp(-\exp(-u))=G_{0}(u).}$

We can see that ${\displaystyle \ln b_{n}\sim (\ln \ln n)/2}$ and then

${\displaystyle b_{n}\sim {\sqrt {2\ln n}}}$

so the maximum is expected to climb ever more slowly toward infinity.
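This derivation can also be checked numerically. The fixed-point solve for ${\displaystyle b_{n}}$ and the tolerance below are our own choices; note that convergence to the Gumbel limit is notoriously slow for the normal distribution, so even at ${\displaystyle n=10^{6}}$ the approximation is off by a few percent:

```python
import math

def Phi(x):
    """Standard normal CDF."""
    return 0.5 * math.erfc(-x / math.sqrt(2.0))

def b(n):
    """Solve n*exp(-b^2/2)/(sqrt(2*pi)*b) = 1 by fixed-point iteration."""
    bn = math.sqrt(2.0 * math.log(n))       # first-order guess
    for _ in range(50):
        bn = math.sqrt(2.0 * math.log(n / (math.sqrt(2.0 * math.pi) * bn)))
    return bn

n = 10**6
bn = b(n)
for u in (-1.0, 0.0, 2.0):
    approx = Phi(u / bn + bn) ** n          # F(u/b_n + b_n)^n
    exact = math.exp(-math.exp(-u))         # Gumbel limit G_0(u)
    assert abs(approx - exact) < 0.05       # slow, O(1/ln n) convergence
```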

### Weibull distribution

We may take the simplest example, a uniform distribution between 0 and 1, with cumulative distribution function

${\displaystyle F(x)=x}$ from 0 to 1.

Approaching 1 we have

${\displaystyle \ln F(x)^{n}=n\ln F(x)\sim -n(1-x).}$

Then

${\displaystyle F(x)^{n}\approx \exp(nx-n).}$

Letting ${\displaystyle u=1+n(x-1)}$ we have

${\displaystyle \lim _{n\to \infty }F\left({\frac {u-1}{n}}+1\right)^{n}=\exp \left(-(1-u)\right)=G_{-1}(u).}$

The expected maximum therefore approaches 1 at a rate proportional to ${\displaystyle 1/n}$.
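A numeric sanity check of this limit in Python (a sketch, using the substitution ${\displaystyle x=1+(u-1)/n}$):

```python
import math

n = 1_000_000
for u in (-2.0, 0.0, 1.0):
    x = 1.0 + (u - 1.0) / n          # so that u = 1 + n(x - 1)
    approx = x ** n                  # F(x)^n for the uniform CDF F(x) = x
    exact = math.exp(-(1.0 - u))     # Weibull-type limit G_{-1}(u)
    assert abs(approx - exact) < 1e-3
```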

## Notes

1. Fréchet, M. (1927), "Sur la loi de probabilité de l'écart maximum", Annales de la Société Polonaise de Mathématique, 6 (1): 93–116
2. Fisher, R.A.; Tippett, L.H.C. (1928), "Limiting forms of the frequency distribution of the largest and smallest member of a sample", Proc. Camb. Phil. Soc., 24 (2): 180–190, Bibcode:1928PCPS...24..180F, doi:10.1017/s0305004100015681
3. Mises, R. von (1936). "La distribution de la plus grande de n valeurs". Rev. Math. Union Interbalcanique 1: 141–160.
4. Falk, Michael; Marohn, Frank (1993). "Von Mises conditions revisited". The Annals of Probability: 1310–1328.
5. Gnedenko, B.V. (1943), "Sur la distribution limite du terme maximum d'une serie aleatoire", Annals of Mathematics, 44 (3): 423–453, doi:10.2307/1968974, JSTOR 1968974
6. Mood, A.M. (1950). "5. Order Statistics". Introduction to the theory of statistics. New York, NY, US: McGraw-Hill. pp. 251–270.
7. Haan, Laurens; Ferreira, Ana (2007). Extreme value theory: an introduction. Springer.