# Fraction of variance unexplained

In statistics, the fraction of variance unexplained (FVU) in the context of a regression task is the fraction of variance of the regressand (dependent variable) Y which cannot be explained, i.e., which is not correctly predicted, by the explanatory variables X.

## Formal definition

Suppose we are given a regression function ${\displaystyle f}$ yielding for each ${\displaystyle y_{i}}$ an estimate ${\displaystyle {\widehat {y}}_{i}=f(x_{i})}$ where ${\displaystyle x_{i}}$ is the vector of the ith observations on all the explanatory variables. We define the fraction of variance unexplained (FVU) as:

${\displaystyle {\begin{aligned}{\text{FVU}}&={{\text{VAR}}_{\text{err}} \over {\text{VAR}}_{\text{tot}}}={{\text{SS}}_{\text{err}}/N \over {\text{SS}}_{\text{tot}}/N}={{\text{SS}}_{\text{err}} \over {\text{SS}}_{\text{tot}}}\left(=1-{{\text{SS}}_{\text{reg}} \over {\text{SS}}_{\text{tot}}},{\text{ only true in some cases, such as linear regression}}\right)\\[6pt]&=1-R^{2},\end{aligned}}}$

where ${\displaystyle R^{2}}$ is the coefficient of determination, and VARerr and VARtot are the variance of the residuals and the sample variance of the dependent variable, respectively. SSerr (the sum of squared prediction errors, equivalently the residual sum of squares), SStot (the total sum of squares), and SSreg (the sum of squares of the regression, equivalently the explained sum of squares) are given by

${\displaystyle {\begin{aligned}{\text{SS}}_{\text{err}}&=\sum _{i=1}^{N}\;(y_{i}-{\widehat {y}}_{i})^{2}\\{\text{SS}}_{\text{tot}}&=\sum _{i=1}^{N}\;(y_{i}-{\bar {y}})^{2}\\{\text{SS}}_{\text{reg}}&=\sum _{i=1}^{N}\;({\widehat {y}}_{i}-{\bar {y}})^{2}{\text{ and}}\\{\bar {y}}&={\frac {1}{N}}\sum _{i=1}^{N}\;y_{i}.\end{aligned}}}$
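These sums of squares can be computed directly. The following sketch uses a small made-up sample of observations and predictions (the values are illustrative, not from any real dataset):

```python
import numpy as np

# Hypothetical observed values and model predictions (illustrative only)
y = np.array([3.0, 5.0, 7.0, 9.0, 11.0])
y_hat = np.array([2.8, 5.1, 7.3, 8.7, 11.2])

ss_err = np.sum((y - y_hat) ** 2)         # residual sum of squares
ss_tot = np.sum((y - y.mean()) ** 2)      # total sum of squares
ss_reg = np.sum((y_hat - y.mean()) ** 2)  # explained sum of squares

fvu = ss_err / ss_tot      # fraction of variance unexplained
r_squared = 1.0 - fvu      # coefficient of determination
```

Note that the identity FVU = 1 − SSreg/SStot is not checked here, since it holds only in special cases such as linear regression with an intercept.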

Alternatively, the fraction of variance unexplained can be defined as follows:

${\displaystyle {\text{FVU}}={\frac {\operatorname {MSE} (f)}{\operatorname {var} [Y]}},}$

where ${\displaystyle \operatorname {MSE} (f)}$ is the mean squared error of the regression function ${\displaystyle f}$.
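The two definitions agree because the factor of ${\displaystyle N}$ cancels between numerator and denominator. A minimal sketch with made-up values, using the population variance (`ddof=0`) so that both quantities divide by ${\displaystyle N}$:

```python
import numpy as np

# Same illustrative sample as above (hypothetical values)
y = np.array([3.0, 5.0, 7.0, 9.0, 11.0])
y_hat = np.array([2.8, 5.1, 7.3, 8.7, 11.2])

mse = np.mean((y - y_hat) ** 2)  # MSE of the regression function, SS_err / N
var_y = np.var(y)                # population variance (ddof=0), SS_tot / N
fvu = mse / var_y                # identical to SS_err / SS_tot
```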

## Explanation

It is useful to consider the second definition to understand FVU. When trying to predict Y, the most naïve regression function that we can think of is the constant function predicting the mean of Y, i.e., ${\displaystyle f(x_{i})={\bar {y}}}$. It follows that the MSE of this function equals the variance of Y; that is, SSerr = SStot, and SSreg = 0. In this case, no variation in Y can be accounted for, and the FVU then has its maximum value of 1.

More generally, the FVU will be 1 if the explanatory variables X tell us nothing about Y in the sense that the predicted values of Y do not covary with Y. But as prediction gets better and the MSE can be reduced, the FVU goes down. In the case of perfect prediction where ${\displaystyle {\hat {y}}_{i}=y_{i}}$ for all i, the MSE is 0, SSerr = 0, SSreg = SStot, and the FVU is 0.
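The two extreme cases above can be verified numerically. This sketch (with a hypothetical sample) compares the naïve constant-mean predictor against a perfect predictor:

```python
import numpy as np

y = np.array([3.0, 5.0, 7.0, 9.0, 11.0])  # hypothetical observations

def fvu(y, y_hat):
    """Fraction of variance unexplained: SS_err / SS_tot."""
    return np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)

naive = np.full_like(y, y.mean())  # constant predictor f(x_i) = mean of y
perfect = y.copy()                 # perfect predictor, y_hat_i = y_i for all i

fvu(y, naive)    # maximum value: 1
fvu(y, perfect)  # perfect prediction: 0
```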