
Optimization-Algorithms

These notebooks give visualizations of different numerical optimization algorithms applied to linear regression problems.
They offer my interpretation of the differences between the optimization techniques, the problems each can run into, and the effects of different hyperparameters on the learning process.

Algorithms covered (a quick sketch of two of them follows the list):

  • Batch Gradient Descent
  • Stochastic Gradient Descent (SGD)
  • Mini-batch Gradient Descent
  • Momentum-based Gradient Descent
  • Nesterov Accelerated Gradient (NAG)
  • Adaptive Gradient (AdaGrad)
  • Root Mean Squared Propagation (RMSProp)
  • Adaptive Moment Estimation (Adam)
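
These methods differ mainly in how they turn the gradient into a parameter update. As a quick illustration (a minimal sketch, not code from the notebooks; the toy data, names, and hyperparameter values are all assumptions made for this example), here are batch gradient descent and Adam fitting a small linear regression problem with NumPy:

```python
import numpy as np

# Toy 1-D regression data: y ≈ 3x + 2 plus a little noise (values are illustrative)
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(100, 1))
y = 3.0 * X[:, 0] + 2.0 + rng.normal(scale=0.1, size=100)

# Design matrix with a bias column, so theta = [slope, intercept]
A = np.hstack([X, np.ones((100, 1))])

def mse_grad(theta):
    """Gradient of the mean squared error 0.5 * mean((A @ theta - y) ** 2)."""
    return A.T @ (A @ theta - y) / len(y)

# Batch gradient descent: one full-dataset gradient step per iteration
theta = np.zeros(2)
for _ in range(200):
    theta -= 0.5 * mse_grad(theta)  # 0.5 is the learning rate
print("batch GD:", theta)  # should approach [3, 2]

# Adam: running estimates of the gradient's first and second moments
theta, m, v = np.zeros(2), np.zeros(2), np.zeros(2)
beta1, beta2, eps, lr = 0.9, 0.999, 1e-8, 0.1
for t in range(1, 501):
    g = mse_grad(theta)
    m = beta1 * m + (1 - beta1) * g        # first moment (mean of gradients)
    v = beta2 * v + (1 - beta2) * g ** 2   # second moment (uncentered variance)
    m_hat = m / (1 - beta1 ** t)           # bias correction for the zero init
    v_hat = v / (1 - beta2 ** t)
    theta -= lr * m_hat / (np.sqrt(v_hat) + eps)
print("Adam    :", theta)
```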

Libraries used (see the plotting example after this list):

  • NumPy
  • Matplotlib
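
As an example of the kind of hyperparameter study the notebooks visualize, the sketch below (again illustrative, reusing `A`, `y`, and `mse_grad` from the snippet above) plots the training loss of batch gradient descent for a few learning rates with Matplotlib:

```python
import matplotlib.pyplot as plt
import numpy as np

def gd_losses(lr, steps=100):
    """Batch gradient descent, recording the MSE at every iteration."""
    theta, losses = np.zeros(2), []
    for _ in range(steps):
        losses.append(np.mean((A @ theta - y) ** 2))
        theta -= lr * mse_grad(theta)
    return losses

# Sweep a few learning rates to compare convergence speed
for lr in (0.05, 0.2, 0.8):
    plt.plot(gd_losses(lr), label=f"lr = {lr}")
plt.xlabel("iteration")
plt.ylabel("MSE")
plt.yscale("log")
plt.legend()
plt.show()
```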