# Empirical likelihood

Empirical likelihood (EL) is a nonparametric estimation method in statistics. Compared with parametric methods such as maximum likelihood, empirical likelihood requires few assumptions about the error distribution: the data need only be independent and identically distributed (iid), and EL performs well even when the distribution is asymmetric or censored. EL methods are also useful because they can easily incorporate constraints and prior information. Art Owen pioneered work in this area with his 1988 paper.

## Estimation procedure

EL estimates are computed by maximizing the empirical likelihood function subject to constraints based on the estimating function and the trivial requirement that the probability weights sum to 1.[1] This procedure can be represented as:

${\displaystyle \max _{\pi _{i},\theta }\sum _{i=1}^{n}\ln \pi _{i}}$

Subject to the constraints

${\displaystyle \sum _{i=1}^{n}\pi _{i}=1,\qquad \sum _{i=1}^{n}\pi _{i}h(y_{i};\theta )=0}$[2]
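As an illustrative sketch (not from the article), the inner optimization can be solved numerically for the simplest estimating function, the mean, where ${\displaystyle h(y;\theta )=y-\theta }$. For a candidate ${\displaystyle \theta }$, the optimality conditions reduce to weights of the form ${\displaystyle \pi _{i}=1/\{n(1+\lambda (y_{i}-\theta ))\}}$, where the multiplier (written `lam` below, playing the role of ${\displaystyle \tau }$ in the Lagrangian) solves a one-dimensional root-finding problem. The data and function name here are hypothetical:

```python
def el_weights(y, theta, iters=50):
    """Empirical likelihood weights pi_i for a candidate mean theta.

    Solves sum_i h_i / (1 + lam * h_i) = 0 for the multiplier lam by
    Newton's method, with h_i = y_i - theta, then returns
    pi_i = 1 / (n * (1 + lam * h_i)).
    """
    n = len(y)
    h = [yi - theta for yi in y]
    lam = 0.0
    for _ in range(iters):
        # f(lam)  = sum h_i / (1 + lam*h_i)
        # f'(lam) = -sum h_i^2 / (1 + lam*h_i)^2  (strictly negative)
        f = sum(hi / (1 + lam * hi) for hi in h)
        fp = -sum(hi ** 2 / (1 + lam * hi) ** 2 for hi in h)
        lam -= f / fp
    return [1 / (n * (1 + lam * hi)) for hi in h]

y = [1.2, 0.7, 2.3, 1.9, 0.4]
pi = el_weights(y, theta=1.0)
print(sum(pi))                              # sums to 1 (up to numerical precision)
print(sum(p * yi for p, yi in zip(pi, y)))  # weighted mean equals theta
```

At the root, both constraints hold automatically: summing ${\displaystyle \pi _{i}=1/\{n(1+\lambda h_{i})\}}$ recovers 1, and the weighted mean of the data equals the candidate ${\displaystyle \theta }$.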

The value of the parameter ${\displaystyle \theta }$ can be found by solving the first-order conditions of the Lagrangian:

${\displaystyle {\mathcal {L}}=\sum _{i=1}^{n}\ln \pi _{i}+\mu (1-\sum _{i=1}^{n}\pi _{i})-n\tau '\sum _{i=1}^{n}\pi _{i}h(y_{i};\theta )}$[3]
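Setting the derivative of the Lagrangian with respect to each ${\displaystyle \pi _{i}}$ to zero yields the weights in closed form; the following standard derivation sketches the steps:

```latex
% First-order condition in \pi_i:
\frac{\partial \mathcal{L}}{\partial \pi_i}
  = \frac{1}{\pi_i} - \mu - n\tau' h(y_i;\theta) = 0 .
% Multiplying by \pi_i and summing over i, the two constraints give
n - \mu \sum_{i=1}^{n} \pi_i - n\tau' \sum_{i=1}^{n} \pi_i h(y_i;\theta)
  = n - \mu = 0 ,
% so \mu = n and the optimal weights are
\pi_i = \frac{1}{n\bigl(1 + \tau' h(y_i;\theta)\bigr)} .
```

Substituting these weights back into the objective leaves a profile log-likelihood in ${\displaystyle \theta }$ and ${\displaystyle \tau }$ alone, which can then be maximized over ${\displaystyle \theta }$.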

There is a clear analogy between this maximization problem and the one solved for maximum entropy.