Gradient descent optimizers for linear regression.
Here are the implemented algorithms (a sketch of their update rules follows the list):
- Vanilla gradient descent
- Momentum
- Mini-batch gradient descent
- Adagrad
- RMSProp
- Adam
- Adamax
- Nesterov Accelerated Gradient
- Nadam
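
As a rough illustration of how these update rules differ, below is a minimal NumPy sketch of a single step of several of them, applied to a linear model `y ≈ X @ theta` fitted with mean squared error. All function names and default hyper-parameters (`mse_gradient`, `lr`, `b1`, ...) are illustrative assumptions, not the repository's actual API.

```python
import numpy as np

def mse_gradient(theta, X, y):
    """Gradient of the mean squared error with respect to theta."""
    return 2.0 / len(y) * X.T @ (X @ theta - y)

def vanilla_step(theta, grad, lr=0.05):
    # Plain gradient descent: a fixed step against the gradient.
    return theta - lr * grad

def momentum_step(theta, grad, state, lr=0.05, mu=0.9):
    # Accumulate a velocity that smooths successive gradients.
    state["v"] = mu * state.get("v", 0.0) - lr * grad
    return theta + state["v"]

def nesterov_step(theta, grad_fn, state, lr=0.05, mu=0.9):
    # Nesterov Accelerated Gradient: evaluate the gradient at the
    # look-ahead point theta + mu * v before updating the velocity.
    v = state.get("v", 0.0)
    state["v"] = mu * v - lr * grad_fn(theta + mu * v)
    return theta + state["v"]

def adagrad_step(theta, grad, state, lr=0.1, eps=1e-8):
    # Per-parameter step sizes from the accumulated squared gradients.
    state["G"] = state.get("G", 0.0) + grad ** 2
    return theta - lr * grad / (np.sqrt(state["G"]) + eps)

def rmsprop_step(theta, grad, state, lr=0.01, rho=0.9, eps=1e-8):
    # Like Adagrad, but with an exponentially decaying average.
    state["E"] = rho * state.get("E", 0.0) + (1 - rho) * grad ** 2
    return theta - lr * grad / (np.sqrt(state["E"]) + eps)

def adam_step(theta, grad, state, lr=0.01, b1=0.9, b2=0.999, eps=1e-8):
    # Bias-corrected first- and second-moment estimates of the gradient.
    t = state["t"] = state.get("t", 0) + 1
    state["m"] = b1 * state.get("m", 0.0) + (1 - b1) * grad
    state["v"] = b2 * state.get("v", 0.0) + (1 - b2) * grad ** 2
    m_hat = state["m"] / (1 - b1 ** t)
    v_hat = state["v"] / (1 - b2 ** t)
    return theta - lr * m_hat / (np.sqrt(v_hat) + eps)
```

Adamax and Nadam follow the same pattern: Adamax replaces Adam's second-moment estimate with an infinity-norm accumulator `u = max(b2 * u, |g|)`, and Nadam adds a Nesterov-style look-ahead to Adam's momentum term.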
It compares the fitted regression functions and the evolution of the error across optimizers.
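
Such a comparison could look like the hedged sketch below, which reuses the step functions from the previous sketch and plots the error evolution with matplotlib; the synthetic data, the optimizer selection, and the iteration count are all assumptions, not the script's actual setup.

```python
import matplotlib.pyplot as plt
import numpy as np

# Illustrative synthetic data: bias column + one feature.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(100), np.linspace(0, 1, 100)])
y = X @ np.array([1.0, 3.0]) + 0.1 * rng.standard_normal(100)

# Uniform (theta, grad, state) interface for the step functions above.
steps = {
    "vanilla": lambda th, g, s: vanilla_step(th, g),
    "momentum": momentum_step,
    "rmsprop": rmsprop_step,
    "adam": adam_step,
}

for name, step in steps.items():
    theta, state, errors = np.zeros(X.shape[1]), {}, []
    for _ in range(200):
        theta = step(theta, mse_gradient(theta, X, y), state)
        errors.append(np.mean((X @ theta - y) ** 2))
    plt.plot(errors, label=name)

plt.xlabel("iteration")
plt.ylabel("MSE")
plt.legend()
plt.show()
```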
TODO: fix the Adadelta optimizer.
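
For reference, here is a minimal sketch of the textbook Adadelta update (Zeiler, 2012) that a fixed implementation could be checked against; the names (`state`, `rho`, `eps`) are illustrative.

```python
import numpy as np

def adadelta_step(theta, grad, state, rho=0.95, eps=1e-6):
    # Decaying average of squared gradients.
    state["Eg"] = rho * state.get("Eg", 0.0) + (1 - rho) * grad ** 2
    # Scale the step by the RMS of past updates over the RMS of gradients;
    # Adadelta needs no explicit learning rate.
    delta = -np.sqrt(state.get("Ed", 0.0) + eps) / np.sqrt(state["Eg"] + eps) * grad
    # Decaying average of squared parameter updates (the second accumulator).
    state["Ed"] = rho * state.get("Ed", 0.0) + (1 - rho) * delta ** 2
    return theta + delta
```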