
Gradient_Descent for the 'Least Squares Method'

As part of optimization techniques for ML, I implemented gradient descent to minimize a least-squares objective. The goal is to predict a person's height from their weight (and gender) via linear regression. The analysis is supplemented with data visualization using a correlation matrix and pairplots (via seaborn).
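For reference, the least-squares objective and its gradient, written here with a $\frac{1}{2n}$ scaling (the notebook's exact scaling may differ), are:

$$
f(w) \;=\; \frac{1}{2n}\,\lVert Xw - y \rVert_2^2,
\qquad
\nabla f(w) \;=\; \frac{1}{n}\, X^{\top}(Xw - y)
$$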

Data-set I: Height prediction from weight (and gender).

[Figure: sample of the height-weight dataset]

Model: Linear Regression.
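A minimal sketch of gradient descent on this objective (the function name, synthetic scaling, and variable names are illustrative, not necessarily the notebook's):

```python
import numpy as np

def gradient_descent(X, y, gamma, n_iters=1000):
    """Minimize f(w) = ||Xw - y||^2 / (2n) by gradient descent."""
    n, d = X.shape
    w = np.zeros(d)
    losses = []
    for _ in range(n_iters):
        residual = X @ w - y
        grad = X.T @ residual / n        # gradient of the least-squares cost
        w = w - gamma * grad             # step along the negative gradient
        losses.append(residual @ residual / (2 * n))
    return w, losses

# Illustrative use: weight (and a 0/1 gender indicator) as features, height as target.
# X should include a column of ones for the intercept, e.g.:
# X = np.column_stack([np.ones(len(weight)), weight, gender])
```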

Data Visualization: Pairplot and correlation plot

[Figure: seaborn pairplot of the dataset]

[Figure: correlation matrix heatmap]
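The two plots above can be reproduced with something like the following (the file and column names are assumptions, not taken from the notebook):

```python
import pandas as pd
import seaborn as sns
import matplotlib.pyplot as plt

df = pd.read_csv("weight-height.csv")   # assumed file name

# Pairwise scatter plots of all numeric columns, colored by gender
sns.pairplot(df, hue="Gender")
plt.show()

# Correlation matrix of the numeric columns as an annotated heatmap
sns.heatmap(df.corr(numeric_only=True), annot=True, cmap="coolwarm")
plt.show()
```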

Convergence Analysis: Two different learning rates ($\gamma$) give drastically different convergence behavior.

With $\gamma = 0.1$ (a trial learning rate):

[Figure: convergence with $\gamma = 0.1$]

With $\gamma = \frac{1}{L}$, where $L$ is the smoothness parameter of the least-squares objective (see the sketch after the figure for one way to compute $L$):

[Figure: convergence with $\gamma = 1/L$]
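For the $\frac{1}{2n}$-scaled cost above, $L$ is the largest eigenvalue of the Hessian $X^{\top}X / n$. A small sketch, assuming that scaling (adjust accordingly if the notebook scales the cost differently):

```python
import numpy as np

def smoothness_constant(X):
    """Smoothness constant of f(w) = ||Xw - y||^2 / (2n):
    the largest eigenvalue of the Hessian X^T X / n."""
    n = X.shape[0]
    return np.linalg.eigvalsh(X.T @ X / n).max()

# gamma = 1/L guarantees a monotone decrease of the cost for an L-smooth objective:
# L_smooth = smoothness_constant(X)
# w_opt, losses = gradient_descent(X, y, gamma=1.0 / L_smooth)
```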