
Gradient-optimizers-from-scratch

Here I develop gradient-descent optimizers from scratch in Python, in Jupyter notebooks, for learning purposes, and test them on linear data.

Lab 1 covers full-batch gradient descent.

Lab 2 covers: 1. stochastic gradient descent 2. mini-batch gradient descent 3. multivariate gradient descent
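
The sketch below illustrates the idea behind labs 1 and 2, not the notebook code itself: fitting a line to synthetic linear data, where the batch size alone switches between full-batch, stochastic, and mini-batch gradient descent. Because the gradients are computed with matrix operations, the same update also works with several features (the multivariate case). Data shapes, learning rate, and epoch count are assumptions for illustration.

```python
# A minimal sketch (assumed setup, not the notebook's code): full-batch,
# stochastic, and mini-batch gradient descent fitting y = w*x + b on linear data.
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 1))                  # single feature, linear data
y = 3.0 * X[:, 0] + 2.0 + rng.normal(0, 0.1, 200)      # true w = 3, b = 2, small noise

def gradients(w, b, Xb, yb):
    """Mean-squared-error gradients for the prediction Xb @ w + b."""
    err = Xb @ w + b - yb
    return Xb.T @ err / len(yb), err.mean()

def fit(batch_size=None, lr=0.1, epochs=100):
    """batch_size=None -> full batch, 1 -> stochastic, k -> mini-batch."""
    w, b = np.zeros(X.shape[1]), 0.0
    n = len(y)
    bs = n if batch_size is None else batch_size
    for _ in range(epochs):
        idx = rng.permutation(n)                       # reshuffle every epoch
        for start in range(0, n, bs):
            sl = idx[start:start + bs]
            dw, db = gradients(w, b, X[sl], y[sl])
            w -= lr * dw
            b -= lr * db
    return w, b

print(fit())               # full-batch gradient descent
print(fit(batch_size=1))   # stochastic gradient descent
print(fit(batch_size=32))  # mini-batch gradient descent
```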

Lab 3 covers: 1. momentum gradient descent 2. NAG (Nesterov accelerated gradient)
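
The following is a minimal sketch of the lab 3 update rules, written as generic parameter updates rather than the notebook's actual code; the `grad` function, learning rate, and momentum coefficient are placeholder assumptions. Momentum accumulates a velocity from past gradients, while NAG evaluates the gradient at a look-ahead point before applying the same velocity update.

```python
# A minimal sketch of classical momentum and Nesterov accelerated gradient (NAG).
import numpy as np

def momentum_step(w, v, grad, lr=0.01, beta=0.9):
    """Classical momentum: accumulate a velocity, then move along it."""
    v = beta * v + lr * grad(w)
    return w - v, v

def nag_step(w, v, grad, lr=0.01, beta=0.9):
    """NAG: evaluate the gradient at the look-ahead point w - beta * v."""
    v = beta * v + lr * grad(w - beta * v)
    return w - v, v

# Toy usage on f(w) = ||w||^2, whose gradient is 2w.
grad = lambda w: 2 * w
w, v = np.array([5.0, -3.0]), np.zeros(2)
for _ in range(100):
    w, v = nag_step(w, v, grad)
print(w)  # close to the minimum at the origin
```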

Lab 4 covers: 1. the Adagrad optimizer 2. the RMSprop optimizer 3. the Adam optimizer
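
Below is a minimal sketch of the lab 4 update rules as element-wise parameter updates; the hyperparameter values are common defaults and not necessarily the ones used in the notebook. Adagrad divides by the root of the accumulated squared gradients, RMSprop replaces that sum with an exponential moving average, and Adam combines an RMSprop-style scaling with momentum on the gradient plus bias correction.

```python
# A minimal sketch (assumed defaults, not the notebook's code) of Adagrad,
# RMSprop, and Adam as element-wise parameter updates.
import numpy as np

def adagrad_step(w, s, g, lr=0.1, eps=1e-8):
    """Adagrad: scale by the root of the accumulated squared gradients."""
    s = s + g ** 2
    return w - lr * g / (np.sqrt(s) + eps), s

def rmsprop_step(w, s, g, lr=0.01, beta=0.9, eps=1e-8):
    """RMSprop: exponential moving average of squared gradients instead of a sum."""
    s = beta * s + (1 - beta) * g ** 2
    return w - lr * g / (np.sqrt(s) + eps), s

def adam_step(w, m, v, g, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """Adam: momentum on the gradient plus RMSprop-style scaling,
    with bias correction for the early steps (t starts at 1)."""
    m = beta1 * m + (1 - beta1) * g
    v = beta2 * v + (1 - beta2) * g ** 2
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    return w - lr * m_hat / (np.sqrt(v_hat) + eps), m, v

# Toy usage on f(w) = ||w||^2 with gradient 2w.
grad = lambda w: 2 * w
w, m, v = np.array([5.0, -3.0]), np.zeros(2), np.zeros(2)
for t in range(1, 1001):
    w, m, v = adam_step(w, m, v, grad(w), t, lr=0.1)
print(w)  # moves toward the minimum at the origin
```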