🗓️ October 2020
⏰ 16:00 - 18:00 & 18:30 - 20:30
Students will learn many of the most common machine learning methods, including:
- A proper modeling process
- Feature engineering
- Linear and logistic regression
- Regularized models
- K-nearest neighbors
- Random forests
- Gradient boosting machines
- Stacking / super learners
- And more!
This module will teach students how to build and tune these various models with R and Python packages that have been tested and approved for their ability to scale well (e.g., glmnet, ranger, xgboost, h2o, scikit-learn). However, the motivation in almost every case is to describe the techniques in a way that helps develop intuition for their strengths and weaknesses.
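To give a flavor of this build-and-tune workflow, here is a minimal sketch using scikit-learn, one of the packages named above. The dataset, grid values, and scoring metric are illustrative assumptions, not course material:

```python
from sklearn.datasets import fetch_california_housing
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import GridSearchCV, train_test_split

# Illustrative data; any regression dataset would do.
X, y = fetch_california_housing(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Tune a small, illustrative grid with 5-fold cross-validation.
grid = GridSearchCV(
    GradientBoostingRegressor(random_state=42),
    param_grid={"n_estimators": [100, 300], "learning_rate": [0.05, 0.1]},
    cv=5,
    scoring="neg_root_mean_squared_error",
)
grid.fit(X_train, y_train)

print("best params:", grid.best_params_)
print("test score:", grid.score(X_test, y_test))
```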
This module will step through the process of building, visualizing, testing, and comparing supervised models. The goal is to expose you to a variety of machine learning algorithms along the way. By the end of this module you should:
- Understand how to apply an end-to-end modeling process that allows you to find an optimal model (see the sketch after this list).
- Be able to properly pre-process your feature and target variables.
- Interpret, apply and compare today's most popular and effective machine learning algorithms.
- Methodically and efficiently tune these algorithms.
- Visualize and compare how features impact these models.
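To make these objectives concrete, here is a minimal end-to-end sketch, assuming scikit-learn, a log-transformed target, and standardized features; the dataset, candidate models, and transforms are illustrative choices only:

```python
import numpy as np
from sklearn.compose import TransformedTargetRegressor
from sklearn.datasets import fetch_california_housing
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = fetch_california_housing(return_X_y=True)

candidates = {
    # Regularized linear model with standardized features.
    "ridge": make_pipeline(StandardScaler(), Ridge(alpha=1.0)),
    # Distance-based model; feature scaling matters even more here.
    "knn": make_pipeline(StandardScaler(), KNeighborsRegressor(n_neighbors=10)),
}

for name, model in candidates.items():
    # Pre-process the (right-skewed) target with a log transform.
    wrapped = TransformedTargetRegressor(
        regressor=model, func=np.log1p, inverse_func=np.expm1
    )
    # Compare candidates on the same cross-validation folds.
    scores = cross_val_score(
        wrapped, X, y, cv=5, scoring="neg_root_mean_squared_error"
    )
    print(f"{name}: RMSE = {-scores.mean():.3f} (+/- {scores.std():.3f})")
```

Because the scaler sits inside the pipeline, it is re-fit on each training fold, so no information leaks from the held-out folds.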
This module makes a few assumptions about your programming skills and exposure to basic statistical concepts. Below are those assumptions and the relevant courses you should have already attended to make sure you are properly prepared. The material provides examples in both R and Python, so as long as you meet the assumptions below in one language, you will be good to go.
| Assumptions | Resource |
|---|---|
| Comfortable with R & Python programming | link |
| Proficient with basic data wrangling tasks | link |
| Knowledgeable of foundational statistics | link |
Prior to session 1, please run the following scripts to ensure you have the necessary packages installed; a hypothetical version of the Python check is sketched after the table.
| Language | Requirements |
|---|---|
| Python | link |
| R | link |
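If the linked scripts are unavailable, a hypothetical stand-in for the Python check might look like the following; the package list comes from the module description above, and R users would verify the equivalent CRAN packages:

```python
import importlib

# Hypothetical stand-in for the linked Python requirements script: verify
# that the packages named in this module import cleanly and report versions.
# Install first with, e.g.: pip install scikit-learn xgboost h2o
for name in ("sklearn", "xgboost", "h2o"):
    module = importlib.import_module(name)
    print(name, module.__version__)
```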
| Session | Description | Reading(s) | Slides | Source code |
|---|---|---|---|---|
| 1 | Introduction to machine learning | Notebook | HTML | [R] [Python] |
| 2 | The modeling process | Notebook | HTML | [R] [Python] |
| 3 | Feature and target engineering | Notebook | HTML | [R] [Python] |
| 4 | Portfolio builder #1 | Notebook | | |
| 5 | Linear regression | Notebook | HTML | [R] [Python] |
| 6 | Logistic regression | Notebook | HTML | [R] [Python] |
| 7 | Regularized regression | Notebook | HTML | [R] [Python] |
| 8 | Portfolio builder #2 | Notebook | | |
| 9 | Multivariate adaptive regression splines | Notebook | HTML | [R] [Python] |
| 10 | K-nearest neighbors | Notebook | HTML | [R] [Python] |
| 11 | Decision trees | Notebook | HTML | [R] [Python] |
| 12 | Bagging | Notebook | HTML | [R] [Python] |
| 13 | Random forests | Notebook | HTML | [R] [Python] |
| 14 | Portfolio builder #3 | Notebook | | |
| 15 | Gradient boosting | Notebook | HTML | [R] [Python] |
| 16 | Stacked models | Notebook | HTML | [R] [Python] |
| 17 | Portfolio builder #4 | Notebook | | |