# LogitBoost

In machine learning and computational learning theory, LogitBoost is a boosting algorithm formulated by Jerome Friedman, Trevor Hastie, and Robert Tibshirani. The original paper[1] casts the AdaBoost algorithm into a statistical framework. Specifically, if one considers AdaBoost as a generalized additive model and then applies the cost functional of logistic regression, one can derive the LogitBoost algorithm.
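Concretely, AdaBoost can be viewed as stagewise minimization of the exponential loss

${\displaystyle \sum _{i}e^{-y_{i}f(x_{i})},}$

where ${\displaystyle y_{i}\in \{-1,+1\}}$ are the class labels; replacing this exponential cost with the logistic (negative binomial log-likelihood) cost yields the LogitBoost criterion given below.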

## Minimizing the LogitBoost cost function

LogitBoost can be seen as a convex optimization problem. Specifically, given that we seek an additive model of the form

${\displaystyle f=\sum _{t}\alpha _{t}h_{t}}$

where each ${\displaystyle h_{t}}$ is a weak learner and ${\displaystyle \alpha _{t}}$ its coefficient, the LogitBoost algorithm minimizes the logistic loss:

${\displaystyle \sum _{i}\log \left(1+e^{-y_{i}f(x_{i})}\right)}$
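As a concrete illustration, the following is a minimal sketch of the two-class LogitBoost update as Newton steps on this loss, following the description in the original paper[1]. It recodes the labels as ${\displaystyle y_{i}^{*}\in \{0,1\}}$, as the paper does; the choice of scikit-learn's DecisionTreeRegressor (a depth-1 stump) as the weighted least-squares weak learner, and the function names, are illustrative assumptions rather than part of the algorithm.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def fit_logitboost(X, y, n_rounds=50):
    """Two-class LogitBoost; y is an array of 0/1 labels. Returns fitted weak learners."""
    F = np.zeros(X.shape[0])                     # current additive model f(x_i)
    learners = []
    for _ in range(n_rounds):
        p = 1.0 / (1.0 + np.exp(-2.0 * F))       # p(x) = P(y* = 1 | x), since f = (1/2) log(p / (1 - p))
        w = np.clip(p * (1.0 - p), 1e-10, None)  # Newton weights, kept away from zero
        z = np.clip((y - p) / w, -4.0, 4.0)      # working responses, capped for stability as the paper suggests
        stump = DecisionTreeRegressor(max_depth=1)
        stump.fit(X, z, sample_weight=w)         # weighted least-squares fit of z on x
        F += 0.5 * stump.predict(X)              # half-step update from the Newton derivation
        learners.append(stump)
    return learners

def predict_logitboost(learners, X):
    """Classify by the sign of the fitted additive model."""
    F = sum(0.5 * h.predict(X) for h in learners)
    return (F > 0).astype(int)

# Illustrative usage:
# learners = fit_logitboost(X_train, y_train, n_rounds=100)
# y_pred = predict_logitboost(learners, X_test)
```

The factor of one half in the update reflects the parameterization ${\displaystyle f(x)={\tfrac {1}{2}}\log {\frac {p(x)}{1-p(x)}}}$ used in the paper's derivation.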

## References

1. Friedman, Jerome; Hastie, Trevor; Tibshirani, Robert (2000). "Additive logistic regression: a statistical view of boosting". Annals of Statistics 28 (2): 337–407. http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.51.9525