 [Documentation](https://xgboost.readthedocs.org) |
 [Resources](demo/README.md) |
-[Installation](https://xgboost.readthedocs.org/en/latest/build.html) |
-[Release Notes](NEWS.md) |
-[RoadMap](https://github.com/dmlc/xgboost/issues/873)
+[Contributors](CONTRIBUTORS.md) |
+[Community](https://xgboost.ai/community) |
+[Release Notes](NEWS.md)

 XGBoost is an optimized distributed gradient boosting library designed to be highly ***efficient***, ***flexible*** and ***portable***.
 It implements machine learning algorithms under the [Gradient Boosting](https://en.wikipedia.org/wiki/Gradient_boosting) framework.
 XGBoost provides a parallel tree boosting (also known as GBDT, GBM) that solves many data science problems in a fast and accurate way.
 The same code runs on major distributed environments (Hadoop, SGE, MPI) and can solve problems beyond billions of examples.

-What's New
-----------
-* [XGBoost GPU support with fast histogram algorithm](https://github.com/dmlc/xgboost/tree/master/plugin/updater_gpu)
-* [XGBoost4J: Portable Distributed XGBoost in Spark, Flink and Dataflow](http://dmlc.ml/2016/03/14/xgboost4j-portable-distributed-xgboost-in-spark-flink-and-dataflow.html), see [JVM-Package](https://github.com/dmlc/xgboost/tree/master/jvm-packages)
-* [Story and Lessons Behind the Evolution of XGBoost](http://homes.cs.washington.edu/~tqchen/2016/03/10/story-and-lessons-behind-the-evolution-of-xgboost.html)
-* [Tutorial: Distributed XGBoost on AWS with YARN](https://xgboost.readthedocs.io/en/latest/tutorials/aws_yarn.html)
-* [XGBoost brick](NEWS.md) Release
-
-Ask a Question
---------------
-* For reporting bugs, please use the [xgboost/issues](https://github.com/dmlc/xgboost/issues) page.
-* For generic questions or to share your experience using XGBoost, please use the [XGBoost User Group](https://groups.google.com/forum/#!forum/xgboost-user/).
-
-Help to Make XGBoost Better
----------------------------
-XGBoost has been developed and used by a group of active community members. Your help is very valuable to make the package better for everyone.
-- Check out the [call for contributions](https://github.com/dmlc/xgboost/issues?q=is%3Aissue+label%3Acall-for-contribution+is%3Aopen) and the [Roadmap](https://github.com/dmlc/xgboost/issues/873) to see what can be improved, or open an issue if you want something.
-- Contribute to the [documents and examples](https://github.com/dmlc/xgboost/blob/master/doc/) to share your experience with other users.
-- Add your stories and experience to [Awesome XGBoost](demo/README.md).
-- Please add your name to [CONTRIBUTORS.md](CONTRIBUTORS.md) after your patch has been merged.
-  - Please also update [NEWS.md](NEWS.md) on changes and improvements in API and docs.
-
 License
 -------
 © Contributors, 2016. Licensed under an [Apache-2](https://github.com/dmlc/xgboost/blob/master/LICENSE) license.

+Contribute to XGBoost
+---------------------
+XGBoost has been developed and used by a group of active community members. Your help is very valuable to make the package better for everyone.
+Check out the [Community Page](https://xgboost.ai/community).
+
 Reference
 ---------
-- Tianqi Chen and Carlos Guestrin. [XGBoost: A Scalable Tree Boosting System](http://arxiv.org/abs/1603.02754). In 22nd SIGKDD Conference on Knowledge Discovery and Data Mining, 2016.
-- XGBoost originates from a research project at the University of Washington; see also the [Project Page at UW](http://dmlc.cs.washington.edu/xgboost.html).
+- Tianqi Chen and Carlos Guestrin. [XGBoost: A Scalable Tree Boosting System](http://arxiv.org/abs/1603.02754). In 22nd SIGKDD Conference on Knowledge Discovery and Data Mining, 2016.
+- XGBoost originates from a research project at the University of Washington.
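The README summarized in this diff describes XGBoost as tree boosting under the Gradient Boosting framework: an additive model built round by round, each round fitting a weak learner to the residuals of the ensemble so far. As a rough illustration of that idea only (a pure-Python toy with one-split "stumps", not XGBoost's actual implementation, which adds regularization, second-order gradients, and optimized parallel tree construction), the framework can be sketched as:

```python
# Toy gradient boosting for squared-error regression: each round fits a
# one-split decision "stump" to the residuals of the current ensemble,
# then adds it to the model with a small learning rate (shrinkage).

def fit_stump(xs, residuals):
    """Pick the split threshold minimizing squared error on the residuals."""
    best = None
    for t in xs:
        left = [r for x, r in zip(xs, residuals) if x <= t]
        right = [r for x, r in zip(xs, residuals) if x > t]
        lm = sum(left) / len(left) if left else 0.0
        rm = sum(right) / len(right) if right else 0.0
        err = sum((r - lm) ** 2 for r in left) + sum((r - rm) ** 2 for r in right)
        if best is None or err < best[0]:
            best = (err, t, lm, rm)
    _, t, lm, rm = best
    return lambda x, t=t, lm=lm, rm=rm: lm if x <= t else rm

def predict(base, stumps, lr, x):
    """Additive prediction: base score plus the shrunken stump outputs."""
    return base + lr * sum(s(x) for s in stumps)

def boost(xs, ys, rounds=50, lr=0.1):
    """Start from the mean, then repeatedly fit stumps to the residuals."""
    base = sum(ys) / len(ys)
    stumps = []
    for _ in range(rounds):
        preds = [predict(base, stumps, lr, x) for x in xs]
        residuals = [y - p for y, p in zip(ys, preds)]
        stumps.append(fit_stump(xs, residuals))
    return base, stumps

if __name__ == "__main__":
    xs = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
    ys = [0.0, 0.0, 0.0, 1.0, 1.0, 1.0]  # a step function
    base, stumps = boost(xs, ys)
    print(round(predict(base, stumps, 0.1, 0.5), 2))  # close to 0.0
    print(round(predict(base, stumps, 0.1, 4.5), 2))  # close to 1.0
```

Shrinkage (`lr`) is what makes the ensemble robust: each stump corrects only a fraction of the remaining error, so many small steps converge smoothly instead of one learner overfitting the residuals.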