gradient-boosting-machine
There are 95 repositories under the gradient-boosting-machine topic.
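As a quick orientation to the topic, gradient boosting fits an additive model by repeatedly training a weak learner on the current residuals (the negative gradient of squared-error loss) and adding a damped copy of it to the ensemble. A minimal sketch with NumPy decision stumps — an illustrative toy, not any listed repository's implementation — looks like this:

```python
import numpy as np

def fit_stump(x, r):
    """Fit a depth-1 regression tree (stump) to residuals r over 1-D inputs x."""
    best = (np.inf, 0.0, 0.0, 0.0)
    for t in np.unique(x)[:-1]:          # exclude max so both sides are nonempty
        left = x <= t
        lv, rv = r[left].mean(), r[~left].mean()
        err = ((r[left] - lv) ** 2).sum() + ((r[~left] - rv) ** 2).sum()
        if err < best[0]:
            best = (err, t, lv, rv)
    _, t, lv, rv = best
    return lambda z: np.where(z <= t, lv, rv)

def gbm_fit(x, y, n_rounds=50, lr=0.1):
    """Boost stumps against the residuals of squared-error loss."""
    pred = np.full_like(y, y.mean(), dtype=float)
    stumps = []
    for _ in range(n_rounds):
        residual = y - pred              # negative gradient of 0.5*(y - pred)^2
        stump = fit_stump(x, residual)
        pred = pred + lr * stump(x)      # shrinkage keeps each step small
        stumps.append(stump)
    return y.mean(), stumps, lr

def gbm_predict(model, x):
    base, stumps, lr = model
    return base + lr * sum(s(x) for s in stumps)

# Toy step function: predictions should approach 1.0 on the left, 3.0 on the right
x = np.arange(10, dtype=float)
y = np.where(x < 5, 1.0, 3.0)
model = gbm_fit(x, y)
print(gbm_predict(model, x))
```

With learning rate 0.1, the residual shrinks by a factor of 0.9 per round, so 50 rounds recover the step function to within about half a percent.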
benedekrozemberczki/awesome-decision-tree-papers
A collection of research papers on decision, classification, and regression trees, with implementations.
szilard/benchm-ml
A minimal benchmark for scalability, speed and accuracy of commonly used open source implementations (R packages, Python scikit-learn, H2O, xgboost, Spark MLlib etc.) of the top machine learning algorithms for binary classification (random forests, gradient boosted trees, deep neural networks etc.).
benedekrozemberczki/awesome-gradient-boosting-papers
A curated list of gradient boosting research papers with implementations.
jphall663/interpretable_machine_learning_with_python
Examples of techniques for training interpretable ML models, explaining ML models, and debugging ML models for accuracy, discrimination, and security.
serengil/chefboost
A lightweight decision tree framework for Python supporting standard algorithms (ID3, C4.5, CART, CHAID, regression trees) and advanced techniques (gradient boosting, random forest, AdaBoost), with categorical feature support.
ledell/useR-machine-learning-tutorial
useR! 2016 Tutorial: Machine Learning Algorithmic Deep Dive http://user2016.org/tutorials/10.html
mdabros/SharpLearning
Machine learning for C# .NET
wepe/tgboost
Tiny Gradient Boosting Tree
szilard/GBM-perf
Performance of various open source GBM implementations
sh1ng/arboretum
Gradient boosting powered by GPU (NVIDIA CUDA)
serengil/decision-trees-for-ml
Building Decision Trees From Scratch In Python
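The base learner underneath all of these boosters is the decision tree itself. A from-scratch CART-style classifier — an illustrative sketch using Gini impurity, not code from the repository above — can be written in a few dozen lines of NumPy:

```python
import numpy as np

def gini(y):
    """Gini impurity of a label array."""
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - (p ** 2).sum()

def best_split(X, y):
    """Exhaustively search (feature, threshold) pairs for the lowest weighted Gini."""
    best = (np.inf, None, None)
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j])[:-1]:   # exclude max so both sides are nonempty
            left = X[:, j] <= t
            score = (left.sum() * gini(y[left])
                     + (~left).sum() * gini(y[~left])) / len(y)
            if score < best[0]:
                best = (score, j, t)
    return best[1], best[2]

def build_tree(X, y, depth=0, max_depth=3):
    """Recursively grow a tree; leaves hold the majority class."""
    if depth == max_depth or len(np.unique(y)) == 1:
        vals, counts = np.unique(y, return_counts=True)
        return vals[counts.argmax()]
    j, t = best_split(X, y)
    if j is None:                            # no usable split left
        vals, counts = np.unique(y, return_counts=True)
        return vals[counts.argmax()]
    left = X[:, j] <= t
    return (j, t,
            build_tree(X[left], y[left], depth + 1, max_depth),
            build_tree(X[~left], y[~left], depth + 1, max_depth))

def predict_one(tree, x):
    """Walk internal (feature, threshold, left, right) nodes down to a leaf."""
    while isinstance(tree, tuple):
        j, t, l, r = tree
        tree = l if x[j] <= t else r
    return tree

# XOR-style toy data: needs depth >= 2, so a single stump cannot separate it
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0])
tree = build_tree(X, y)
print([predict_one(tree, row) for row in X])
```

The nested-tuple node representation keeps the sketch short; real frameworks use explicit node classes or flat arrays for speed.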
WLOGSolutions/telco-customer-churn-in-r-and-h2o
Showcase for using H2O and R for churn prediction (inspired by ZhouFang928 examples)
RubixML/Housing
An example project that predicts house prices for a Kaggle competition using a Gradient Boosted Machine.
yubin-park/bonsai-dt
Programmable Decision Tree Framework
benedekrozemberczki/BoostedFactorization
An implementation of "Multi-Level Network Embedding with Boosted Low-Rank Matrix Approximation" (ASONAM 2019).
WindQAQ/ML2017
NTUEE Machine Learning, 2017 Spring
haghish/mlim
mlim: single and multiple imputation with automated machine learning
rmitsuboshi/miniboosts
A collection of boosting algorithms written in Rust 🦀
RudreshVeerkhare/CustomXGBoost
A modified XGBoost implementation from scratch with NumPy, using Adam and RMSProp optimizers.
szilard/GBM-tune
Tuning GBMs (hyperparameter tuning) and impact on out-of-sample predictions
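A typical tuning loop of the kind benchmarked above can be sketched with scikit-learn's GridSearchCV — a generic illustration on synthetic data, not the setup used in GBM-tune itself:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import GridSearchCV, train_test_split

# Synthetic binary-classification data stands in for a real dataset
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# A small grid over the hyperparameters that usually matter most for GBMs
grid = {
    "n_estimators": [50, 100],
    "learning_rate": [0.05, 0.1],
    "max_depth": [2, 3],
}
search = GridSearchCV(GradientBoostingClassifier(random_state=0), grid, cv=3)
search.fit(X_tr, y_tr)

print(search.best_params_)
# Evaluate on data the search never saw, to estimate out-of-sample impact
print("held-out accuracy:", search.best_estimator_.score(X_te, y_te))
```

Scoring the winning model on a held-out split (rather than reusing the cross-validation score) is what reveals the out-of-sample impact of tuning.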
StatMixedML/Py-BoostLSS
An extension of Py-Boost to probabilistic modelling
joymnyaga/CreditAnalytics-Loan-Prediction
A predictive model that uses several machine learning algorithms to predict the eligibility of loan applicants based on multiple factors
navdeep-G/interpretable-ml
Techniques & resources for training interpretable ML models, explaining ML models, and debugging ML models.
serengil/h2o-ai-101
This repository covers H2O.ai-based implementations
gtesei/fast-furious
Code (R, MATLAB/Octave), models, and meta-models I needed in my machine learning lab but couldn't find off the shelf
bgreenwell/MLDay18
Material from "Random Forests and Gradient Boosting Machines in R" presented at Machine Learning Day '18
xxl4tomxu98/autoencoder-feature-extraction
Using autoencoder feature extraction to improve the prediction accuracy of gradient boosting classification models
tugrulhkarabulut/Tree-Based-Methods
Implementation of Decision Tree and Ensemble Learning algorithms in Python with numpy
drmerlot/startml
R package for automatic hyperparameter tuning and ensembles with deep learning, gradient boosting machines, and random forests. Powered by H2O.
leffff/stackboost
Open source gradient boosting library
RachanaJayaram/MalwarePrediction
Using machine learning models to predict the probability of a Windows system being infected by various families of malware, based on properties of that system.
koalaverse/machine-learning-in-R
A bookdown version of the UseR 2016 machine learning tutorial given by Erin LeDell
fahd09/kaggle_competitions_old
My contributions in Kaggle, mostly in a notebook format. Just for fun.
Mephistopheles-0/Performance-Prediction-with-Wearable-Tech-Data
Analyzes data from accelerometers placed on the belt, forearm, arm, and dumbbell of six participants who were tasked with executing barbell lifts, both correctly and incorrectly, in five distinct manners.
perpetual-ml/perpetual
A self-generalizing, hyperparameter-free gradient boosting machine