LightGBM is a gradient boosting framework that uses tree-based learning algorithms. It is designed to be distributed and efficient, with the following advantages:
- Faster training speed and higher efficiency
- Lower memory usage
- Better accuracy
- Support for parallel and distributed learning
- Capability of handling large-scale data
For further details, please refer to Features.
Experiments on public datasets show that LightGBM can outperform existing boosting tools in both learning efficiency and accuracy, with significantly lower memory consumption. What's more, distributed learning experiments show that LightGBM can achieve a linear speed-up by using multiple machines for training in specific settings.
For a quick start, please follow the Installation Guide and Quick Start.
This project has adopted the Microsoft Open Source Code of Conduct. For more information, see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.