Ensemble methods work best when the predictors are as independent from one another as possible. One way to get diverse classifiers is to train them using very different algorithms. This increases the chance that they will make different types of errors, improving the ensemble's accuracy.
- RandomForestClassifier
- VotingClassifier
- Bagging and Pasting
- Extra-Trees
- AdaBoost
- Using XGBoost
- Gradient Boosting
- Gradient Boosting with Early Stopping
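The idea of combining diverse algorithms can be sketched with scikit-learn's `VotingClassifier`. This is a minimal example (the dataset and the three base estimators are illustrative choices, not prescribed by the text): three very different models are trained on the same data, and their predictions are combined by majority vote.

```python
from sklearn.datasets import make_moons
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Toy dataset for illustration.
X, y = make_moons(n_samples=500, noise=0.30, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Three very different algorithms tend to make different types of errors,
# which is exactly what a voting ensemble exploits.
voting_clf = VotingClassifier(
    estimators=[
        ("lr", LogisticRegression(random_state=42)),
        ("rf", RandomForestClassifier(random_state=42)),
        ("svc", SVC(random_state=42)),
    ],
    voting="hard",  # majority vote over predicted class labels
)
voting_clf.fit(X_train, y_train)
print(voting_clf.score(X_test, y_test))
```

Setting `voting="soft"` instead averages the predicted class probabilities (each estimator must then support `predict_proba`), which often performs better than hard voting.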