In statistics, an additive model (AM) is a nonparametric regression method. It was suggested by Jerome H. Friedman and Werner Stuetzle (1981)[1] and is an essential part of the ACE algorithm. The AM uses a one-dimensional smoother to build a restricted class of nonparametric regression models. Because of this, it is less affected by the curse of dimensionality than, for example, a p-dimensional smoother. Furthermore, the AM is more flexible than a standard linear model, while remaining more interpretable than a general regression surface, at the cost of some approximation error. Problems with the AM include model selection, overfitting, and multicollinearity.

## Description

Given a data set ${\displaystyle \{y_{i},\,x_{i1},\ldots ,x_{ip}\}_{i=1}^{n}}$ of n statistical units, where ${\displaystyle \{x_{i1},\ldots ,x_{ip}\}_{i=1}^{n}}$ represent predictors and ${\displaystyle y_{i}}$ is the outcome, the additive model takes the form

${\displaystyle E[y_{i}|x_{i1},\ldots ,x_{ip}]=\beta _{0}+\sum _{j=1}^{p}f_{j}(x_{ij})}$

or

${\displaystyle Y=\beta _{0}+\sum _{j=1}^{p}f_{j}(X_{j})+\varepsilon }$

where ${\displaystyle E[\varepsilon ]=0}$, ${\displaystyle \operatorname {Var} (\varepsilon )=\sigma ^{2}}$ and ${\displaystyle E[f_{j}(X_{j})]=0}$. The functions ${\displaystyle f_{j}(x_{ij})}$ are unknown smooth functions that are estimated from the data. Fitting the AM (i.e. the functions ${\displaystyle f_{j}}$) can be done using the backfitting algorithm proposed by Andreas Buja, Trevor Hastie and Robert Tibshirani (1989).[2]
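The backfitting idea can be sketched as follows: each ${\displaystyle f_{j}}$ is repeatedly re-estimated by smoothing the partial residuals (the outcome minus the intercept and all other components) against ${\displaystyle x_{j}}$, and re-centered to satisfy ${\displaystyle E[f_{j}(X_{j})]=0}$. This is only an illustrative sketch, not a reference implementation: the polynomial smoother and all names here are stand-ins for the one-dimensional smoothers (e.g. splines or local regression) used in practice.

```python
import numpy as np

def backfit_additive(X, y, smoother, n_iter=20):
    """Fit E[y|x] = beta0 + sum_j f_j(x_j) by backfitting.

    smoother(x, r) must return fitted values of a 1-D smooth of r on x.
    Returns the intercept and an (n, p) array of fitted f_j values.
    """
    n, p = X.shape
    beta0 = y.mean()                  # intercept; f_j are kept centered
    f = np.zeros((n, p))              # current estimates of f_j at the data
    for _ in range(n_iter):
        for j in range(p):
            # Partial residuals: subtract intercept and all other components.
            r = y - beta0 - f.sum(axis=1) + f[:, j]
            fj = smoother(X[:, j], r)
            f[:, j] = fj - fj.mean()  # enforce the E[f_j] = 0 constraint
    return beta0, f

def poly_smoother(x, r, deg=5):
    """Stand-in 1-D smoother: a low-degree polynomial fit."""
    return np.polyval(np.polyfit(x, r, deg), x)

# Toy data from an additive truth: y = 1 + sin(pi*x1) + x2^2 + noise.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 2))
y = 1.0 + np.sin(np.pi * X[:, 0]) + X[:, 1] ** 2 + rng.normal(0, 0.1, 200)

beta0, f = backfit_additive(X, y, poly_smoother)
resid = y - beta0 - f.sum(axis=1)
```

Because the components are re-centered on every pass, the intercept stays fixed at the sample mean of the outcome, and each cycle through the predictors typically reduces the residual sum of squares until the fits stabilize.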