# Generalized estimating equation

In statistics, a generalized estimating equation (GEE) is used to estimate the parameters of a generalized linear model when there is possible unknown correlation between outcomes.

Parameter estimates from the GEE are consistent even when the covariance structure is misspecified, under mild regularity conditions. The focus of the GEE is on estimating the average response over the population ("population-averaged" effects) rather than the regression parameters that would enable prediction of the effect of changing one or more covariates on a given individual. GEEs are usually used in conjunction with Huber–White standard error estimates, also known as "robust standard error" or "sandwich variance" estimates. In the case of a linear model with a working independence variance structure, these are known as "heteroscedasticity consistent standard error" estimators. Indeed, the GEE unified several independent formulations of these standard error estimators in a general framework.

GEEs belong to a class of regression techniques that are referred to as semiparametric because they rely on specification of only the first two moments. They are a popular alternative to the likelihood–based generalized linear mixed model which is more sensitive to variance structure specification. They are commonly used in large epidemiological studies, especially multi-site cohort studies, because they can handle many types of unmeasured dependence between outcomes.

## Formulation

Given a mean model $\mu _{ij}$ for subject $i$ and time $j$ that depends upon regression parameters $\beta _{k}$, and a working variance structure $V_{i}$, the estimating equation is formed via:

$U(\beta )=\sum _{i=1}^{N}{\frac {\partial \mu _{i}}{\partial \beta }}V_{i}^{-1}\{Y_{i}-\mu _{i}(\beta )\}$

The parameters $\beta _{k}$ are estimated by solving $U(\beta )=0$ and are typically obtained via the Newton–Raphson algorithm. The variance structure is chosen to improve the efficiency of the parameter estimates. The Hessian of the solution to the GEEs in the parameter space can be used to calculate robust standard error estimates. The term "variance structure" refers to the algebraic form of the covariance matrix between outcomes, $Y$, in the sample. Examples of variance structure specifications include independence, exchangeable, autoregressive, stationary m-dependent, and unstructured.

The most popular form of inference on GEE regression parameters is the Wald test using naive or robust standard errors, though the score test is also valid and preferable when it is difficult to obtain estimates of information under the alternative hypothesis. The likelihood ratio test is not valid in this setting because the estimating equations are not necessarily likelihood equations. Model selection can be performed with the GEE equivalent of the Akaike Information Criterion (AIC), the Quasilikelihood under the Independence model Criterion (QIC).
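The mechanics above can be sketched in a deliberately simplified setting: an identity-link (linear) mean model with a working-independence variance structure, $V_i = I$. The estimating equation is then linear in $\beta$, so Newton–Raphson converges in a single step, and the Huber–White "sandwich" covariance can be formed from the empirical cluster residuals. The function name and this reduced setting are illustrative assumptions, not a general GEE implementation:

```python
import numpy as np

def fit_gee_independence(X_list, y_list):
    """Solve U(beta) = sum_i X_i' (y_i - X_i beta) = 0 for a linear mean
    model under a working-independence variance structure (V_i = I).

    X_list / y_list hold one design matrix and response vector per
    cluster (e.g. per subject). This is an illustrative sketch, not a
    library routine.
    """
    p = X_list[0].shape[1]
    bread = np.zeros((p, p))  # "bread": minus the Jacobian of U, sum_i X_i' X_i
    rhs = np.zeros(p)
    for X, y in zip(X_list, y_list):
        bread += X.T @ X
        rhs += X.T @ y
    # The estimating equation is linear here, so one solve suffices;
    # with a nonidentity link this step would be iterated (Newton-Raphson).
    beta = np.linalg.solve(bread, rhs)

    # Huber-White sandwich covariance B^{-1} M B^{-1}, where the "meat"
    # M = sum_i X_i' r_i r_i' X_i uses empirical within-cluster residuals,
    # making the standard errors robust to misspecified correlation.
    meat = np.zeros((p, p))
    for X, y in zip(X_list, y_list):
        r = y - X @ beta
        meat += X.T @ np.outer(r, r) @ X
    bread_inv = np.linalg.inv(bread)
    cov = bread_inv @ meat @ bread_inv
    return beta, np.sqrt(np.diag(cov))
```

Under working independence with an identity link, the point estimate coincides with OLS on the pooled data; only the standard errors change, which is exactly the "heteroscedasticity consistent" case noted above.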

### Relationship with Generalized Method of Moments

The generalized estimating equation is a special case of the generalized method of moments (GMM). This relationship follows immediately from the requirement that the score function satisfy the equation:

$\mathbb {E} [U(\beta )]={1 \over {N}}\sum _{i=1}^{N}{\frac {\partial \mu _{i}}{\partial \beta }}V_{i}^{-1}\{Y_{i}-\mu _{i}(\beta )\}=0$

## Computation

Software for solving generalized estimating equations is available in MATLAB, SAS (proc genmod), SPSS (the gee procedure), Stata (the xtgee command), R (packages gee, geepack and multgee), and Python (package statsmodels).

Comparisons among software packages for the analysis of binary and ordinal correlated data via GEE are available.