# xgboost
**Repository Path**: mirrors/xgboost
## Basic Information
- **Project Name**: xgboost
- **Description**: XGBoost is short for "eXtreme Gradient Boosting"
- **Primary Language**: C
- **License**: Apache-2.0
- **Default Branch**: master
- **Homepage**: None
- **GVP Project**: No
## Statistics
- **Stars**: 16
- **Forks**: 2
- **Created**: 2017-06-05
- **Last Updated**: 2025-08-09
## Categories & Tags
**Categories**: mathlibs
**Tags**: None
## README
eXtreme Gradient Boosting
===========
[Build Status](https://buildkite.com/xgboost/xgboost-ci)
[GitHub Actions](https://github.com/dmlc/xgboost/actions)
[Documentation Status](https://xgboost.readthedocs.org)
[License](./LICENSE)
[CRAN Package](https://cran.r-project.org/web/packages/xgboost)
[PyPI Package](https://pypi.python.org/pypi/xgboost/)
[Conda Package](https://anaconda.org/conda-forge/py-xgboost)
[Optuna Integration](https://optuna.org)
[Twitter](https://twitter.com/XGBoostProject)
[OpenSSF Scorecard](https://api.securityscorecards.dev/projects/github.com/dmlc/xgboost)
[Comet Colab Example](https://colab.research.google.com/github/comet-ml/comet-examples/blob/master/integrations/model-training/xgboost/notebooks/how_to_use_comet_with_xgboost_tutorial.ipynb)
[Community](https://xgboost.ai/community) |
[Documentation](https://xgboost.readthedocs.org) |
[Resources](demo/README.md) |
[Contributors](CONTRIBUTORS.md) |
[Release Notes](https://xgboost.readthedocs.io/en/latest/changes/index.html)
XGBoost is an optimized distributed gradient boosting library designed to be highly ***efficient***, ***flexible*** and ***portable***.
It implements machine learning algorithms under the [Gradient Boosting](https://en.wikipedia.org/wiki/Gradient_boosting) framework.
XGBoost provides parallel tree boosting (also known as GBDT or GBM) that solves many data science problems in a fast and accurate way.
The same code runs on major distributed environments (Kubernetes, Hadoop, SGE, Dask, Spark, PySpark) and can solve problems beyond billions of examples.
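To make the gradient-boosting framework mentioned above concrete, here is a toy sketch in plain Python: each new weak learner (a one-split "stump", a hypothetical helper for this illustration) is fit to the residuals of the current ensemble, i.e. the negative gradient of squared-error loss. This illustrates only the basic boosting idea, not XGBoost's actual algorithm, which adds regularization, second-order gradient information, and histogram-based split finding.

```python
# Toy gradient boosting with one-feature regression stumps.
# Sketch of the general framework only; NOT XGBoost's implementation.

def fit_stump(xs, residuals):
    """Find the single threshold split on xs that best fits residuals
    (minimum sum of squared errors over the two resulting leaves)."""
    best = None
    for t in sorted(set(xs)):
        left = [r for x, r in zip(xs, residuals) if x <= t]
        right = [r for x, r in zip(xs, residuals) if x > t]
        if not left or not right:
            continue
        lmean = sum(left) / len(left)
        rmean = sum(right) / len(right)
        sse = (sum((r - lmean) ** 2 for r in left)
               + sum((r - rmean) ** 2 for r in right))
        if best is None or sse < best[0]:
            best = (sse, t, lmean, rmean)
    _, t, lmean, rmean = best
    return lambda x: lmean if x <= t else rmean

def boost(xs, ys, n_rounds=50, lr=0.3):
    """Additive model: start from the mean prediction, then repeatedly
    fit a stump to the current residuals and add it with shrinkage lr."""
    base = sum(ys) / len(ys)
    stumps = []

    def predict(x):
        return base + sum(lr * s(x) for s in stumps)

    for _ in range(n_rounds):
        # For squared-error loss, the negative gradient is the residual.
        residuals = [y - predict(x) for x, y in zip(xs, ys)]
        stumps.append(fit_stump(xs, residuals))
    return predict

# Fit a noiseless step function; the ensemble recovers it closely.
xs = [0, 1, 2, 3, 4, 5, 6, 7]
ys = [1, 1, 1, 1, 5, 5, 5, 5]
model = boost(xs, ys)
```

Each round shrinks the residual by a constant factor (here 1 - lr per round on this separable data), which is why even weak stumps converge to an accurate ensemble.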
License
-------
© Contributors, 2021. Licensed under the [Apache-2.0](https://github.com/dmlc/xgboost/blob/master/LICENSE) license.
Contribute to XGBoost
---------------------
XGBoost has been developed and is used by a group of active community members. Your help is very valuable in making the package better for everyone.
Check out the [Community Page](https://xgboost.ai/community).
Reference
---------
- Tianqi Chen and Carlos Guestrin. [XGBoost: A Scalable Tree Boosting System](https://arxiv.org/abs/1603.02754). In Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 2016.
- XGBoost originated from a research project at the University of Washington.
Sponsors
--------
Become a sponsor and get a logo here. See details at [Sponsoring the XGBoost Project](https://xgboost.ai/sponsors). The funds are used to defray the cost of continuous integration and testing infrastructure (https://xgboost-ci.net).
## Open Source Collective sponsors
[Backers](#backers) [Sponsors](#sponsors)
### Sponsors
[[Become a sponsor](https://opencollective.com/xgboost#sponsor)]
### Backers
[[Become a backer](https://opencollective.com/xgboost#backer)]
