Gradient_Descent_for_Linear_Regression

This is my implementation of the gradient descent algorithm for linear regression. It implements standard (batch) gradient descent, mini-batch, and stochastic variants, as well as the AdaGrad, RMSProp, and Adam improvements.
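As a rough illustration of the core idea, here is a minimal sketch of the batch variant: fitting linear regression weights by repeatedly stepping against the gradient of the mean squared error. Function names, the learning rate, and the toy data are illustrative assumptions, not code from the notebook.

```python
import numpy as np

def batch_gradient_descent(X, y, lr=0.1, n_iters=1000):
    """Fit linear regression weights (with intercept) via batch gradient descent."""
    Xb = np.c_[np.ones(len(X)), X]            # prepend a bias column
    w = np.zeros(Xb.shape[1])
    m = len(y)
    for _ in range(n_iters):
        grad = (2 / m) * Xb.T @ (Xb @ w - y)  # gradient of the mean squared error
        w -= lr * grad                        # step downhill
    return w

# Toy example: recover y = 3 + 2x from noiseless data
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(100, 1))
y = 3 + 2 * X[:, 0]
w = batch_gradient_descent(X, y)
```

The mini-batch and stochastic variants differ only in computing the gradient over a random subset (or a single sample) per step instead of the full dataset; AdaGrad, RMSProp, and Adam additionally rescale the step per parameter using running statistics of past gradients.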

