If GitHub is unable to render a Jupyter notebook, copy the link of the notebook and paste it into nbviewer: https://nbviewer.jupyter.org/
There are two very different solution approaches to the Linear Regression problem.
- The "closed-form" solution approach, known as the Ordinary Least Squares (OLS) method.
- The iterative optimization approach, known as Gradient Descent (GD).
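The contrast between the two approaches can be sketched on a tiny synthetic dataset (the data and hyperparameters below are illustrative, not taken from the notebooks): the closed-form normal equation solves for the weights in one step, while gradient descent iterates toward the same solution.

```python
import numpy as np

# Hypothetical synthetic data: y = 4 + 3x + noise
rng = np.random.default_rng(42)
X = 2 * rng.random((100, 1))
y = 4 + 3 * X[:, 0] + rng.normal(0, 0.1, 100)

# Design matrix with a bias column of ones
Xb = np.c_[np.ones((100, 1)), X]

# Closed-form OLS: theta = (X^T X)^{-1} X^T y (pinv for numerical stability)
theta_ols = np.linalg.pinv(Xb.T @ Xb) @ Xb.T @ y

# Gradient Descent: repeatedly step against the gradient of the MSE loss
theta_gd = np.zeros(2)
lr = 0.1  # illustrative learning rate
for _ in range(2000):
    grad = (2 / len(y)) * Xb.T @ (Xb @ theta_gd - y)
    theta_gd -= lr * grad

print(theta_ols)  # both land near [4, 3]
print(theta_gd)
```

With enough iterations and a suitable learning rate, the GD estimate converges to the OLS solution, which is why the two approaches are studied side by side.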
We will perform an extensive investigation of these two approaches using Scikit-Learn in a series of four notebooks. For this exploration we will use the Boston Housing dataset that has 506 samples and 13 features.
There are four notebooks on sklearn Linear Regression.
- Linear Regression-1-OLS -- OLS method & Regularized OLS Method (Ridge Regression)
- Linear Regression-2-OLS Polynomial Regression-Frequentist Approach (MLE) -- Polynomial regression using the OLS method
- Linear Regression-3-OLS Polynomial Regression-Bayesian Approach (MAP) -- Polynomial regression using the regularized OLS method
- Linear Regression-4-Gradient Descent -- Iterative optimization approach (Gradient Descent & Stochastic Gradient Descent)
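The polynomial-regression notebooks (2 and 3) pair the plain OLS fit with its regularized counterpart. A minimal Scikit-Learn sketch of that pairing, on hypothetical quadratic data rather than the Boston dataset, might look like:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression, Ridge

# Hypothetical quadratic data: y = 0.5x^2 + x + 2 + noise
rng = np.random.default_rng(0)
X = 6 * rng.random((200, 1)) - 3
y = 0.5 * X[:, 0] ** 2 + X[:, 0] + 2 + rng.normal(0, 0.2, 200)

# Frequentist (MLE) flavor: plain OLS on polynomial features
mle_model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
mle_model.fit(X, y)

# Bayesian (MAP) flavor: regularized OLS (Ridge) on the same features
map_model = make_pipeline(PolynomialFeatures(degree=2), Ridge(alpha=1.0))
map_model.fit(X, y)

print(mle_model.score(X, y), map_model.score(X, y))  # R^2 on training data
```

The `degree` and `alpha` values here are placeholders; the notebooks explore how those choices trade off under- and over-fitting.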