gradient-descent-optimizers-linear-regression

Gradient descent optimizers for linear regression.

The implemented algorithms are (a reference sketch of the vanilla update follows the list):

  • Vanilla gradient descent
  • Momentum and mini-batch gradient descent
  • Adagrad
  • RMSProp
  • Adam
  • Adamax
  • Nesterov Accelerated Gradient
  • Nadam
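
As a reference point, here is a minimal sketch of the vanilla update for a one-dimensional least-squares fit y ≈ w·x + b. The function name, learning rate, and iteration count are illustrative choices, not the notebook's actual API.

```python
import numpy as np

def vanilla_gradient_descent(X, y, lr=0.1, n_iters=1000):
    """Fit y ≈ w * X + b by minimizing the mean squared error."""
    w, b = 0.0, 0.0
    n = len(X)
    errors = []
    for _ in range(n_iters):
        residual = w * X + b - y
        # Gradients of MSE = (1/n) * sum((w*x + b - y)^2)
        grad_w = (2.0 / n) * (residual @ X)
        grad_b = (2.0 / n) * residual.sum()
        w -= lr * grad_w
        b -= lr * grad_b
        errors.append((residual ** 2).mean())
    return w, b, errors
```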

The notebook compares the fitted regression functions and the evolution of the training error across optimizers.
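
A comparison along those lines can be sketched as follows, reusing `vanilla_gradient_descent` from above; the `optimizers` mapping and the synthetic data are assumptions for illustration, not the notebook's code.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, 100)
y = 3.0 * X + 1.0 + rng.normal(scale=0.1, size=100)

# Hypothetical registry: each entry returns (w, b, errors),
# matching the vanilla_gradient_descent sketch above.
optimizers = {"vanilla": vanilla_gradient_descent}

for name, run in optimizers.items():
    w, b, errors = run(X, y)
    plt.plot(errors, label=name)

plt.xlabel("iteration")
plt.ylabel("mean squared error")
plt.legend()
plt.show()
```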

TODO:
  • Fix Adadelta optimizer
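
For reference, the Adadelta rule (Zeiler, 2012) replaces the global learning rate with a ratio of running averages of squared updates and squared gradients. Below is a minimal sketch for the same one-dimensional fit, with illustrative parameter names; it is not the notebook's broken implementation.

```python
import numpy as np

def adadelta(X, y, rho=0.95, eps=1e-6, n_iters=1000):
    """Adadelta for y ≈ w * X + b: no learning rate; step sizes come
    from running averages of squared gradients and squared updates."""
    params = np.zeros(2)   # [w, b]
    eg2 = np.zeros(2)      # running average of squared gradients
    ed2 = np.zeros(2)      # running average of squared updates
    n = len(X)
    for _ in range(n_iters):
        residual = params[0] * X + params[1] - y
        grad = np.array([(2.0 / n) * (residual @ X),
                         (2.0 / n) * residual.sum()])
        eg2 = rho * eg2 + (1 - rho) * grad ** 2
        delta = -np.sqrt(ed2 + eps) / np.sqrt(eg2 + eps) * grad
        ed2 = rho * ed2 + (1 - rho) * delta ** 2
        params += delta
    return params
```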