
Boosting Machine Learning Models in Python [Video]

This is the code repository for Boosting Machine Learning Models in Python [Video], published by Packt. It contains all the supporting project files necessary to work through the video course from start to finish.

About the Video Course

Machine Learning ensembles are models built from several individual models that are trained separately and then combined to make an overall prediction. These powerful techniques are often used in applied Machine Learning to achieve the best overall performance.

In this unique course, after installing the necessary tools you will jump straight into the bagging method, which gets the best results from algorithms that are highly sensitive to the specifics of their training data, such as algorithms based on decision trees. Next, you will discover another powerful and popular class of ensemble methods called boosting. Here you'll achieve maximal algorithm performance by training a sequence of models, where each model improves on the results of the previous one. You will then explore a much simpler technique called voting, where the predictions of multiple models are combined using simple statistics such as the mean. You will also work hands-on with stacking and XGBoost to improve performance further.
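
As a quick illustration of the bagging and boosting ideas described above, the following sketch (not taken from the course notebooks) compares a single decision tree with a bagged ensemble and an AdaBoost ensemble on a synthetic scikit-learn dataset; the dataset, estimators, and hyperparameters are illustrative assumptions.

```python
# Illustrative sketch: compare a single decision tree against a bagged
# ensemble and an AdaBoost ensemble on a synthetic classification task.
# The dataset and hyperparameters are assumptions chosen for brevity.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42)

models = {
    "single decision tree": DecisionTreeClassifier(random_state=42),
    "bagging (100 trees)": BaggingClassifier(
        DecisionTreeClassifier(random_state=42),
        n_estimators=100, random_state=42),
    "AdaBoost (100 estimators)": AdaBoostClassifier(
        n_estimators=100, random_state=42),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    print(f"{name}: test accuracy = {model.score(X_test, y_test):.3f}")
```

On a run like this the two ensembles typically match or beat the single tree, which is exactly the effect bagging and boosting are designed to produce.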

By the end of this course, you will know how to use a variety of ensemble algorithms in the real world to boost your Machine Learning models.

What You Will Learn

  • Discover and use the main concepts behind ensemble techniques and learn why they are important in applied Machine Learning
  • Learn how to use bagging to combine predictions from multiple algorithms and predict more accurately than from any individual algorithm
  • Use boosting to create a strong classifier from a series of weak classifiers and improve the final performance
  • Explore how even a very simple ensemble technique such as voting can help you maximize performance
  • Learn the powerful and less well-known stacking technique, where another Machine Learning algorithm combines the predictions of different models, so that each individual model can focus on distinctive features of your dataset
  • Evaluate which ensemble technique is good for a particular problem
  • Train, test, and evaluate your own XGBoost models (a brief workflow sketch follows this list)
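
The XGBoost workflow mentioned in the last point looks roughly like the sketch below. This is an illustrative example rather than the course's own code: the dataset, hyperparameters, and accuracy metric are assumptions chosen for brevity.

```python
# Illustrative train / test / evaluate workflow with XGBoost.
# Dataset and hyperparameters are assumptions, not the course's own choices.
from sklearn.datasets import load_breast_cancer
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

# Gradient-boosted trees; these hyperparameters are reasonable starting points.
model = XGBClassifier(n_estimators=200, max_depth=3, learning_rate=0.1)
model.fit(X_train, y_train)

y_pred = model.predict(X_test)
print("XGBoost test accuracy:", accuracy_score(y_test, y_pred))
```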

Instructions and Navigation

Assumed Knowledge

To fully benefit from the coverage included in this course, you will need:

● Working Python 3 knowledge

● Ability to run simple commands in a shell (Terminal)

● Some basic ML experience

This course has the following software requirements:

● Conda package manager with Python 3.7 (https://conda.io/en/master/miniconda.html)

● Conda Python packages: jupyter, scikit-learn (sklearn), matplotlib, pandas, mlxtend, xgboost (a quick import check is sketched below)
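
Once the environment is set up, a short check like the one below (an illustrative snippet, not part of the course materials) confirms that the required packages import correctly; jupyter itself is verified simply by launching the notebooks.

```python
# Illustrative sanity check: confirm the required packages are importable
# and print whatever versions are installed in the current environment.
import matplotlib
import mlxtend
import pandas
import sklearn
import xgboost

for pkg in (sklearn, matplotlib, pandas, mlxtend, xgboost):
    print(pkg.__name__, pkg.__version__)
```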

This course has been tested on the following system configuration:

● OS: macOS High Sierra

● Processor: 1.3 GHz Intel Core i5

● Memory: 4 GB

● Storage: 121 GB

Related Products