# Partial leverage

In regression analysis, partial leverage is a measure of the contribution of each individual independent variable to the leverage of each observation. That is, if ${\displaystyle h_{i}}$ is the ith element of the diagonal of the hat matrix, the partial leverage measures how ${\displaystyle h_{i}}$ changes as a variable is added to the regression model.

The partial leverage is computed as:

${\displaystyle \left(\mathrm {PL} _{j}\right)_{i}={\frac {\left(X_{j\bullet [j]}\right)_{i}^{2}}{\sum _{k=1}^{n}\left(X_{j\bullet [j]}\right)_{k}^{2}}}}$

where

j = index of the independent variable
i = index of the observation
${\displaystyle X_{j\bullet [j]}}$ = residuals from regressing ${\displaystyle X_{j}}$ against the remaining independent variables
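The formula above can be sketched in a few lines of NumPy. This is an illustrative implementation, not a reference one: the function name `partial_leverage` and the convention that the design matrix already contains an intercept column are assumptions for the example.

```python
import numpy as np

def partial_leverage(X, j):
    """Partial leverage of each observation for column j of the design
    matrix X (n observations by p variables, intercept included as a
    column). Implements (PL_j)_i = r_i^2 / sum_k r_k^2, where r holds
    the residuals from regressing X[:, j] on the remaining columns.
    """
    X = np.asarray(X, dtype=float)
    others = np.delete(X, j, axis=1)
    # Least-squares fit of X_j on the other independent variables
    coef, *_ = np.linalg.lstsq(others, X[:, j], rcond=None)
    r = X[:, j] - others @ coef  # residuals X_{j.[j]}
    return r**2 / np.sum(r**2)
```

By construction the partial leverages for a given variable are non-negative and sum to 1 over the observations, so a single large value flags an observation that dominates that variable's partial regression.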

Note that the partial leverage is the leverage of the ith point in the partial regression plot for the jth variable. Data points with large partial leverage for an independent variable can exert undue influence on the selection of that variable in automatic regression model building procedures.

In statistics, high-leverage points are those that are outliers with respect to the independent variables. In other words, high-leverage points have no neighbouring points in ${\displaystyle \mathbb {R} ^{p}}$ space, where p is the number of independent variables in a regression model. This makes the fitted model likely to pass close to a high-leverage observation. Hence high-leverage points have the potential to cause large changes in the parameter estimates when they are deleted, i.e., to be influential points. Although an influential point will typically have high leverage, a high-leverage point is not necessarily an influential point. The leverage is typically defined as the diagonal of the hat matrix, which is

${\displaystyle H=X(X'X)^{-1}X'.}$
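The leverages can be computed without forming the full hat matrix. The sketch below is one possible implementation, assuming a full-column-rank design matrix: with a reduced QR decomposition ${\displaystyle X=QR}$, the hat matrix is ${\displaystyle H=QQ'}$, so each diagonal element is the squared norm of the corresponding row of ${\displaystyle Q}$.

```python
import numpy as np

def leverages(X):
    """Diagonal of the hat matrix H = X (X'X)^{-1} X'.

    Uses a reduced QR decomposition for numerical stability:
    H = Q Q', so h_i is the sum of squared entries in row i of Q.
    Assumes X has full column rank.
    """
    Q, _ = np.linalg.qr(np.asarray(X, dtype=float))
    return np.sum(Q**2, axis=1)
```

For a full-rank design matrix, each leverage lies in [0, 1] and the leverages sum to the number of columns of ${\displaystyle X}$ (the trace of ${\displaystyle H}$), which is a quick sanity check on the computation.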