non-linear-optimizers

From-scratch implementations of non-linear optimizers commonly used to minimize the loss function of neural networks.

  • Gradient descent
  • Momentum
  • Nesterov momentum
  • AdaGrad
  • RMSProp
  • Adam
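As an illustration of the update rules listed above, here is a minimal sketch of two of them, plain gradient descent and Adam, applied to a simple quadratic. Function names, hyperparameter defaults, and the driver loop are my own choices for the example, not taken from this repository.

```python
import numpy as np

def gd_step(x, grad, lr=0.1):
    """Plain gradient descent: x <- x - lr * grad."""
    return x - lr * grad

def adam_step(x, grad, m, v, t, lr=0.01, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: exponential moving averages of the gradient and
    its square, with bias correction (t starts at 1)."""
    m = beta1 * m + (1 - beta1) * grad       # first moment estimate
    v = beta2 * v + (1 - beta2) * grad ** 2  # second moment estimate
    m_hat = m / (1 - beta1 ** t)             # bias-corrected moments
    v_hat = v / (1 - beta2 ** t)
    return x - lr * m_hat / (np.sqrt(v_hat) + eps), m, v

# Minimize the paraboloid f(x, y) = x^2 + y^2, whose gradient is 2 * [x, y].
x = np.array([3.0, -2.0])
m = v = np.zeros_like(x)
for t in range(1, 3001):
    x, m, v = adam_step(x, 2 * x, m, v, t)
# x is now close to the minimum at the origin
```

The other optimizers in the list differ only in how the step is formed from the gradient history (a velocity term for momentum and Nesterov, accumulated squared gradients for AdaGrad and RMSProp).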

The optimizers are tested and visualized on the following benchmark functions:

  • Paraboloid
  • Matyas function
  • Easom function
  • Bukin function
  • Three-hump camel function
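For reference, the benchmark functions can be written down directly. The formulas below are the standard textbook definitions; I am assuming the Bukin function here is Bukin N.6, the variant usually used in optimization benchmarks.

```python
import numpy as np

def paraboloid(x, y):
    return x ** 2 + y ** 2  # global minimum 0 at (0, 0)

def matyas(x, y):
    return 0.26 * (x ** 2 + y ** 2) - 0.48 * x * y  # global minimum 0 at (0, 0)

def easom(x, y):
    # Nearly flat everywhere, with a sharp dip to -1 at (pi, pi)
    return -np.cos(x) * np.cos(y) * np.exp(-((x - np.pi) ** 2 + (y - np.pi) ** 2))

def bukin_n6(x, y):
    # Narrow curved valley; global minimum 0 at (-10, 1)
    return 100 * np.sqrt(np.abs(y - 0.01 * x ** 2)) + 0.01 * np.abs(x + 10)

def three_hump_camel(x, y):
    # Three local minima; global minimum 0 at (0, 0)
    return 2 * x ** 2 - 1.05 * x ** 4 + x ** 6 / 6 + x * y + y ** 2
```

These make good optimizer tests because they range from trivially convex (paraboloid, Matyas) to deceptive (Easom's flat plateau) and ill-conditioned (Bukin's narrow valley).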

Presentation

Website with a list of the test functions