In this lab, you'll practice many concepts you have learned so far, from adding interactions and polynomials to your model to AIC and BIC!
You will be able to:
- Build a linear regression model with interactions and polynomial features
- Use AIC and BIC to select the best value for the regularization parameter
Import all the necessary packages.
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
import warnings
warnings.filterwarnings('ignore')
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score
from sklearn.model_selection import KFold
from sklearn.preprocessing import scale
from sklearn.datasets import load_boston
- Import the Boston housing dataset
- Split the data into target (`y`) and predictors (`X`) -- ensure these both are DataFrames
- Scale all the predictors using `scale`. Convert these scaled features into a DataFrame
- Build a baseline model using the scaled variables as predictors. Use 5-fold cross-validation (set `random_state` to 1) and use the $R^2$ score to evaluate the model
# Your code here
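One way to approach this step is sketched below, using the imports above. The variable names X_scaled, crossvalidation, and baseline are only suggestions, and note that load_boston has been removed from recent scikit-learn versions, so an older version may be needed.

boston = load_boston()
y = pd.DataFrame(boston.target, columns=['target'])
X = pd.DataFrame(boston.data, columns=boston.feature_names)

# Scale all predictors and keep them in a DataFrame
X_scaled = pd.DataFrame(scale(X), columns=X.columns)

# Baseline: linear regression evaluated with 5-fold cross-validated R^2
regression = LinearRegression()
crossvalidation = KFold(n_splits=5, shuffle=True, random_state=1)
baseline = np.mean(cross_val_score(regression, X_scaled, y, scoring='r2', cv=crossvalidation))
print(baseline)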
Look at all the possible combinations of variables for interactions by adding interactions one by one to the baseline model. Next, evaluate each of these models using 5-fold cross-validation and store the resulting $R^2$ so you can compare it with the baseline.
Print the 7 most important interactions.
# Your code here
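A minimal sketch of one possible loop, assuming the X_scaled, y, regression, crossvalidation, and baseline objects from the step above:

from itertools import combinations

# Try every pairwise interaction on top of the scaled predictors
interactions = []
for var1, var2 in combinations(X_scaled.columns, 2):
    data = X_scaled.copy()
    data['interaction'] = data[var1] * data[var2]
    score = np.mean(cross_val_score(regression, data, y, scoring='r2', cv=crossvalidation))
    if score > baseline:
        interactions.append((var1, var2, round(score, 3)))

# The 7 interactions with the highest cross-validated R^2
print(sorted(interactions, key=lambda inter: inter[2], reverse=True)[:7])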
Write code to include the 7 most important interactions in your data set by adding 7 columns. Name the columns "var1_var2" with var1 and var2 the two variables in the interaction.
# Your code here
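For example, assuming the interactions list from the previous sketch (df_inter is just a suggested name):

# Add the 7 strongest interactions as new "var1_var2" columns
df_inter = X_scaled.copy()
top_7 = sorted(interactions, key=lambda inter: inter[2], reverse=True)[:7]
for var1, var2, _ in top_7:
    df_inter[var1 + '_' + var2] = X_scaled[var1] * X_scaled[var2]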
Try polynomials of degrees 2, 3, and 4 for each variable, in a similar way you did for interactions (by adding them to your baseline model and seeing how the cross-validated $R^2$ changes). Store each result as a tuple of the form (var_name, degree, R2), e.g. ('DIS', 3, 0.732).
# Your code here
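One possible sketch, again assuming the baseline objects defined earlier. PolynomialFeatures is used here as a convenience; it is an implementation choice rather than a requirement.

from sklearn.preprocessing import PolynomialFeatures

# For each variable, replace it by polynomial terms of degree 2, 3, and 4
# and record (var_name, degree, R2) for the cross-validated model
polynomials = []
for col in X_scaled.columns:
    for degree in [2, 3, 4]:
        poly = PolynomialFeatures(degree, include_bias=False)
        expanded = poly.fit_transform(X_scaled[[col]])
        expanded_df = pd.DataFrame(expanded, index=X_scaled.index,
                                   columns=[col + '_' + str(i + 1) for i in range(expanded.shape[1])])
        data = pd.concat([X_scaled.drop(col, axis=1), expanded_df], axis=1)
        score = np.mean(cross_val_score(regression, data, y, scoring='r2', cv=crossvalidation))
        polynomials.append((col, degree, round(score, 3)))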
For each variable, print out the maximum $R^2$ possible when including polynomials.
# Your code here
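A short sketch, assuming the polynomials list of (var_name, degree, R2) tuples from the previous step:

# Best cross-validated R^2 per variable across the tested degrees
poly_df = pd.DataFrame(polynomials, columns=['variable', 'degree', 'R2'])
print(poly_df.groupby('variable')['R2'].max().sort_values(ascending=False))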
Which two variables seem to benefit most from adding polynomial terms?
Add polynomials for the two features that benefit the most, i.e. those with the largest $R^2$ improvement over the baseline model. For each of the two features, use the polynomial degree that generates the best result. Make sure to start from the dataset `df_inter` so the final dataset has both interactions and polynomials in the model.
# Your code here
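A sketch of the idea, reusing PolynomialFeatures from the earlier sketch. The variables ('RM', 'LSTAT') and degrees used below are placeholders only -- substitute the two variables and degrees that your own results identify as best. df_poly is a suggested name for the combined dataset.

# Placeholder variables/degrees -- replace with the best performers from your results
best_polys = [('RM', 4), ('LSTAT', 4)]

df_poly = df_inter.copy()
for col, degree in best_polys:
    poly = PolynomialFeatures(degree, include_bias=False)
    expanded = poly.fit_transform(X_scaled[[col]])
    expanded_df = pd.DataFrame(expanded, index=df_poly.index,
                               columns=[col + '_' + str(i + 1) for i in range(expanded.shape[1])])
    df_poly = pd.concat([df_poly.drop(col, axis=1), expanded_df], axis=1)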
Check out your final data set and make sure that your interaction terms as well as your polynomial terms are included.
# Your code here
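For instance, assuming the combined dataset is called df_poly as above:

# Inspect the columns: interaction columns ("var1_var2") and polynomial columns should both appear
print(df_poly.columns)
df_poly.head()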
Check out the R-squared of the full model.
# Your code here
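A quick check, using the same 5-fold cross-validation setup as the baseline:

# Cross-validated R^2 of the model with interactions and polynomials included
full_model = np.mean(cross_val_score(regression, df_poly, y, scoring='r2', cv=crossvalidation))
print(full_model)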
You learned that when using Lasso regularization, your coefficients shrink to 0 when using a higher regularization parameter. Now the question is which value we should choose for the regularization parameter.
This is where the AIC and BIC come in handy! We'll use both criteria in what follows and perform cross-validation to select an optimal value of the regularization parameter.
Read the page here: https://scikit-learn.org/stable/auto_examples/linear_model/plot_lasso_model_selection.html and create a plot similar to the first one shown on that page.
import numpy as np
import matplotlib.pyplot as plt
from sklearn.linear_model import LassoCV, LassoLarsCV, LassoLarsIC
# Your code here
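A sketch in the spirit of that example, fitting LassoLarsIC with both criteria on the interaction/polynomial dataset (df_poly is the name assumed from the earlier sketches):

# Fit lasso models where alpha is chosen by AIC and by BIC
model_aic = LassoLarsIC(criterion='aic')
model_aic.fit(df_poly, y.values.ravel())
model_bic = LassoLarsIC(criterion='bic')
model_bic.fit(df_poly, y.values.ravel())

# Plot each information criterion against alpha and mark the selected alpha
plt.figure()
plt.plot(model_aic.alphas_, model_aic.criterion_, '--', color='blue', label='AIC')
plt.axvline(model_aic.alpha_, color='blue', linestyle=':', label='alpha chosen by AIC')
plt.plot(model_bic.alphas_, model_bic.criterion_, '--', color='red', label='BIC')
plt.axvline(model_bic.alpha_, color='red', linestyle=':', label='alpha chosen by BIC')
plt.xscale('log')
plt.xlabel('alpha')
plt.ylabel('information criterion')
plt.legend()
plt.show()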
Finally, use the best value for the regularization parameter according to AIC and BIC, and compare R-squared and MSE using train-test split. Compare with the baseline model.
from sklearn.metrics import mean_squared_error, mean_squared_log_error
from sklearn.model_selection import train_test_split
from sklearn.linear_model import Lasso
# Split X_scaled and y into training and test sets
# Set random_state to 1
X_train, X_test, y_train, y_test = None
# Code for baseline model
linreg_all = None
# Print R2 and MSE
# Split df_inter and y into training and test sets
# Set random_state to 1
X_train, X_test, y_train, y_test = None
# Code for lasso with alpha from AIC
lasso = None
# Print R2 and MSE
# Code for lasso with alpha from BIC
lasso = None
# Print R2 and MSE
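One way to fill in this skeleton is sketched below. It reuses model_aic.alpha_ and model_bic.alpha_ from the LassoLarsIC fits above, and it uses df_poly (the dataset containing both the interactions and the polynomials) for the lasso models; using df_inter instead is equally valid if you only want the interaction terms.

# Baseline: plain linear regression on the scaled predictors
X_train, X_test, y_train, y_test = train_test_split(X_scaled, y, random_state=1)
linreg_all = LinearRegression().fit(X_train, y_train)
print('Baseline R2: ', linreg_all.score(X_test, y_test))
print('Baseline MSE:', mean_squared_error(y_test, linreg_all.predict(X_test)))

# Lasso models on the data with interactions and polynomials
X_train, X_test, y_train, y_test = train_test_split(df_poly, y, random_state=1)

lasso = Lasso(alpha=model_aic.alpha_)  # alpha selected by AIC
lasso.fit(X_train, y_train)
print('AIC lasso R2: ', lasso.score(X_test, y_test))
print('AIC lasso MSE:', mean_squared_error(y_test, lasso.predict(X_test)))

lasso = Lasso(alpha=model_bic.alpha_)  # alpha selected by BIC
lasso.fit(X_train, y_train)
print('BIC lasso R2: ', lasso.score(X_test, y_test))
print('BIC lasso MSE:', mean_squared_error(y_test, lasso.predict(X_test)))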
From this section, you know that when using lasso, more parameters shrink to zero as your regularization parameter goes up. In scikit-learn there is a function lasso_path() which computes the coefficient paths, so you can visualize how the coefficients shrink as the regularization parameter changes.
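A minimal sketch of how lasso_path could be used here, assuming the scaled predictors X_scaled and target y from earlier:

from sklearn.linear_model import lasso_path

# Compute the coefficient path over a grid of alpha values and plot one line per feature
alphas, coefs, _ = lasso_path(X_scaled, y.values.ravel())
plt.figure()
for coef, name in zip(coefs, X_scaled.columns):
    plt.plot(alphas, coef, label=name)
plt.xscale('log')
plt.xlabel('alpha')
plt.ylabel('coefficient value')
plt.title('Lasso paths')
plt.legend(loc='best', fontsize='small')
plt.show()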
This notebook shows how you can use AIC and BIC purely for feature selection. Try this code out on our Boston housing data!
https://xavierbourretsicotte.github.io/subset_selection.html
Congratulations! You now know how to create better linear models and how to use AIC and BIC both for feature selection and for optimizing your regularization parameter when performing ridge and lasso regression.