From Wikipedia, the free encyclopedia
Developer(s): The XGBoost Contributors
Initial release: March 27, 2014
Stable release: 1.2.1[1] / October 13, 2020
Written in: C++
Operating system: Linux, macOS, Windows
Type: Machine learning
License: Apache License 2.0

XGBoost[2] is an open-source software library that provides a gradient boosting framework for C++, Java, Python,[3] R,[4] Julia,[5] Perl,[6] and Scala. It works on Linux, Windows,[7] and macOS.[8] From the project description, it aims to provide a "Scalable, Portable and Distributed Gradient Boosting (GBM, GBRT, GBDT) Library". It runs on a single machine as well as on the distributed processing frameworks Apache Hadoop, Apache Spark, and Apache Flink. It gained much popularity and attention in the mid-2010s as the algorithm of choice for many winning teams of machine learning competitions.[9]
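The gradient boosting technique that XGBoost implements can be illustrated with a toy sketch in plain Python: each round fits a small regression tree (here, a one-split stump) to the current residuals and adds its scaled predictions to the ensemble. This is only an illustration of the general GBM idea, not XGBoost's actual implementation, which adds regularized objectives, second-order gradients, and many systems-level optimizations.

```python
import numpy as np

def fit_stump(X, residual):
    """Fit a one-split regression stump to the residuals by least squares."""
    best = None  # (error, feature, threshold, left_value, right_value)
    for j in range(X.shape[1]):
        # Exclude the largest value so the right branch is never empty.
        for t in np.unique(X[:, j])[:-1]:
            left = X[:, j] <= t
            lv, rv = residual[left].mean(), residual[~left].mean()
            err = (((residual[left] - lv) ** 2).sum()
                   + ((residual[~left] - rv) ** 2).sum())
            if best is None or err < best[0]:
                best = (err, j, t, lv, rv)
    return best[1:]

def fit_gbm(X, y, n_rounds=100, learning_rate=0.1):
    """Gradient boosting for squared error: each round fits the residuals."""
    pred = np.full(len(y), y.mean())
    stumps = []
    for _ in range(n_rounds):
        j, t, lv, rv = fit_stump(X, y - pred)
        pred = pred + learning_rate * np.where(X[:, j] <= t, lv, rv)
        stumps.append((j, t, lv, rv))
    return (y.mean(), learning_rate, stumps)

def predict(model, X):
    """Sum the base prediction and the scaled contributions of all stumps."""
    base, lr, stumps = model
    pred = np.full(len(X), base)
    for j, t, lv, rv in stumps:
        pred = pred + lr * np.where(X[:, j] <= t, lv, rv)
    return pred
```

With a small learning rate, each round removes only a fraction of the remaining residual, which is the shrinkage that makes boosted ensembles resistant to overfitting.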


History

XGBoost started as a research project by Tianqi Chen[10] as part of the Distributed (Deep) Machine Learning Community (DMLC) group. It began as a terminal application that could be configured using a libsvm configuration file. It became well known in machine learning competition circles after its use in the winning solution of the Higgs Machine Learning Challenge. Soon after, Python and R packages were built, and XGBoost now has package implementations for Java, Scala, Julia, Perl, and other languages. This brought the library to more developers and contributed to its popularity in the Kaggle community, where it has been used in a large number of competitions.[9]

It was soon integrated with a number of other packages, making it easier to use in their respective communities. It has been integrated with scikit-learn for Python users and with the caret package for R users. It can also be integrated into dataflow frameworks such as Apache Spark, Apache Hadoop, and Apache Flink using the abstracted Rabit[11] and XGBoost4J.[12] XGBoost is also available on OpenCL for FPGAs.[13] An efficient, scalable implementation of XGBoost has been published by Tianqi Chen and Carlos Guestrin.[14]


Features

Salient features of XGBoost that distinguish it from other gradient boosting algorithms include:[15][16][17]

  • Clever penalization of trees
  • Proportional shrinking of leaf nodes
  • Newton boosting
  • An extra randomization parameter
  • Implementation on single and distributed systems, and out-of-core computation
  • Automatic feature selection

Awards

  • John Chambers Award (2016)[18]
  • High Energy Physics meets Machine Learning award (HEP meets ML) (2016)[19]

References


  1. ^ "Release 1.2.1 · dmlc/xgboost". GitHub. Retrieved 2020-08-08.
  2. ^ "GitHub project webpage".
  3. ^ "Python Package Index PYPI: xgboost". Retrieved 2016-08-01.
  4. ^ "CRAN package xgboost". Retrieved 2016-08-01.
  5. ^ "Julia package listing xgboost". Retrieved 2016-08-01.
  6. ^ "CPAN module AI::XGBoost". Retrieved 2020-02-09.
  7. ^ "Installing XGBoost for Anaconda in Windows". Retrieved 2016-08-01.
  8. ^ "Installing XGBoost on Mac OSX". Retrieved 2016-08-01.
  9. ^ a b "XGBoost - ML winning solutions (incomplete list)". Retrieved 2016-08-01.
  10. ^ "Story and Lessons behind the evolution of XGBoost". Retrieved 2016-08-01.
  11. ^ "Rabit - Reliable Allreduce and Broadcast Interface". Retrieved 2016-08-01.
  12. ^ "XGBoost4J". Retrieved 2016-08-01.
  13. ^ "XGBoost on FPGAs". Retrieved 2019-08-01.
  14. ^ Chen, Tianqi; Guestrin, Carlos (2016). "XGBoost: A Scalable Tree Boosting System". In Krishnapuram, Balaji; Shah, Mohak; Smola, Alexander J.; Aggarwal, Charu C.; Shen, Dou; Rastogi, Rajeev (eds.). Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, San Francisco, CA, USA, August 13-17, 2016. ACM. pp. 785–794. arXiv:1603.02754. doi:10.1145/2939672.2939785.
  15. ^ Gandhi, Rohith (2019-05-24). "Gradient Boosting and XGBoost". Medium. Retrieved 2020-01-04.
  16. ^ "Boosting algorithm: XGBoost". Towards Data Science. 2017-05-14. Retrieved 2020-01-04.
  17. ^ "Tree Boosting With XGBoost – Why Does XGBoost Win "Every" Machine Learning Competition?". Synced. 2017-10-22. Retrieved 2020-01-04.
  18. ^ "John Chambers Award Previous Winners". Retrieved 2016-08-01.
  19. ^ "HEP meets ML Award". Retrieved 2016-08-01.