# Generalized entropy index

The generalized entropy index has been proposed as a measure of income inequality in a population.[1] It is derived from information theory as a measure of redundancy in data. In information theory a measure of redundancy can be interpreted as non-randomness or data compression; this interpretation carries over to the index. An additional interpretation of the index is as a measure of biodiversity, since entropy has also been proposed as a measure of diversity.[2]

## Formula

The formula for the generalized entropy index for real values of $\alpha$ is:

$GE(\alpha) =\begin{cases} \frac{1}{N \alpha (\alpha-1)}\sum_{i=1}^N\left[\left(\frac{y_i}{\overline{y}}\right)^\alpha - 1\right],& \alpha \ne 0, 1,\\ \frac{1}{N}\sum_{i=1}^N\frac{y_{i}}{\overline{y}}\ln\frac{y_{i}}{\overline{y}},& \alpha=1,\\ -\frac{1}{N}\sum_{i=1}^N\ln\frac{y_{i}}{\overline{y}},& \alpha=0. \end{cases}$

where $N$ is the number of cases (e.g., households or families), $y_i$ is the income of case $i$, $\overline{y}$ is the mean income, and $\alpha$ is the weight given to distances between incomes at different parts of the income distribution.
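The piecewise formula above can be transcribed directly into code. The following is a minimal Python sketch (the function name is illustrative, not from the source); it assumes all incomes are positive, as the logarithmic cases require.

```python
import math

def generalized_entropy(incomes, alpha):
    """Generalized entropy index GE(alpha) for a sequence of positive incomes.

    Implements the three cases of the formula: the general case for
    alpha not in {0, 1}, and the two logarithmic limits at alpha = 1
    (Theil index) and alpha = 0 (mean log deviation).
    """
    n = len(incomes)
    mean = sum(incomes) / n
    ratios = [y / mean for y in incomes]  # y_i / y-bar
    if alpha == 0:
        # GE(0): minus the average log of income relative to the mean
        return -sum(math.log(r) for r in ratios) / n
    if alpha == 1:
        # GE(1): average of (y_i / y-bar) * ln(y_i / y-bar)
        return sum(r * math.log(r) for r in ratios) / n
    # General case: 1 / (N * alpha * (alpha - 1)) * sum((y_i / y-bar)^alpha - 1)
    return sum(r ** alpha - 1 for r in ratios) / (n * alpha * (alpha - 1))
```

For a perfectly equal distribution every ratio $y_i/\overline{y}$ equals 1, so each case of the formula evaluates to 0; any inequality yields a positive value.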

A feature of the generalized entropy index is that it can be transformed into a subclass of the Atkinson index by setting $\epsilon=1-\alpha$ and taking the social welfare function to be the natural logarithm; the transformation is $A=1-e^{-GE}$. Moreover, it is the unique class of inequality measures that has the properties of the Atkinson index and is also additively decomposable. Many popular indices, including the Gini index, do not satisfy additive decomposability.[1]
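The transformation $A=1-e^{-GE}$ can be verified numerically for the case $\alpha=0$ (so $\epsilon=1-\alpha=1$): the Atkinson index with $\epsilon=1$ equals one minus the ratio of the geometric mean to the arithmetic mean, which is exactly $1-e^{-GE(0)}$. A sketch under these assumptions (function names are illustrative):

```python
import math

def mean_log_deviation(incomes):
    # GE(0): minus the average log of income relative to the mean
    n = len(incomes)
    mean = sum(incomes) / n
    return -sum(math.log(y / mean) for y in incomes) / n

def atkinson_eps1(incomes):
    # Atkinson index with epsilon = 1:
    # one minus geometric mean over arithmetic mean
    n = len(incomes)
    mean = sum(incomes) / n
    geo = math.exp(sum(math.log(y) for y in incomes) / n)
    return 1 - geo / mean

incomes = [2.0, 4.0, 8.0]
ge0 = mean_log_deviation(incomes)
a1 = atkinson_eps1(incomes)
# The transformation A = 1 - exp(-GE(0)) reproduces the Atkinson index:
gap = abs(a1 - (1 - math.exp(-ge0)))
```

For this example the geometric mean is 4 and the arithmetic mean is 14/3, so both routes give $A = 1/7$.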

Note that the generalized entropy index has several inequality statistics as special cases. For example, GE(0) is the mean log deviation, GE(1) is the Theil index, and GE(2) is half the squared coefficient of variation.
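These special cases can be checked numerically. The sketch below (sample incomes are arbitrary) computes the Theil index from its usual definition, evaluates the general formula at $\alpha=2$, and compares the latter to half the squared population coefficient of variation; the two agree algebraically for any data.

```python
import math

incomes = [1.0, 2.0, 3.0, 10.0]
n = len(incomes)
mean = sum(incomes) / n

# Theil index, i.e. GE(1): average of (y / y-bar) * ln(y / y-bar)
theil = sum((y / mean) * math.log(y / mean) for y in incomes) / n

# GE(2) from the general formula with alpha = 2:
# 1 / (N * 2 * 1) * sum((y / y-bar)^2 - 1)
ge2 = sum((y / mean) ** 2 - 1 for y in incomes) / (n * 2)

# Half the squared (population) coefficient of variation
var = sum((y - mean) ** 2 for y in incomes) / n
half_cv2 = 0.5 * var / mean ** 2
```

The identity GE(2) = CV²/2 follows by expanding $(y_i/\overline{y})^2 - 1$ and recognizing the population variance over the squared mean.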