Optimization-and-Regularization-from-scratch

Implementation of optimization and regularization algorithms in deep neural networks from scratch

In this repository, I implemented and investigated different optimization algorithms, including Gradient Descent, Adagrad, RMSProp, and Adam, along with L1 and L2 regularization methods, to classify samples from the CIFAR dataset.
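As a rough sketch of what L1 and L2 regularization add to the training loss (not the repo's actual code; the function names and the list-of-arrays weight layout are illustrative assumptions):

```python
import numpy as np

def l1_penalty(weights, lam):
    # L1 adds lam * sum(|w|) to the loss; its subgradient is lam * sign(w),
    # which pushes weights toward exactly zero (sparsity).
    loss = lam * sum(np.abs(w).sum() for w in weights)
    grads = [lam * np.sign(w) for w in weights]
    return loss, grads

def l2_penalty(weights, lam):
    # L2 adds 0.5 * lam * sum(w^2); its gradient is lam * w,
    # which shrinks weights proportionally (weight decay).
    loss = 0.5 * lam * sum((w ** 2).sum() for w in weights)
    grads = [lam * w for w in weights]
    return loss, grads
```

The penalty loss is added to the data loss, and the penalty gradients are added to the corresponding weight gradients before the optimizer step.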

Gradient Descent
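A minimal NumPy sketch of the vanilla gradient descent update (illustrative only; the `sgd_update` name and dict-of-arrays parameter layout are assumptions, not the repo's interface):

```python
import numpy as np

def sgd_update(params, grads, lr=0.01):
    # Vanilla gradient descent: theta <- theta - lr * grad,
    # applied to every parameter array in place.
    for key in params:
        params[key] -= lr * grads[key]
    return params
```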

Adagrad
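A minimal sketch of the Adagrad update under the same assumed layout; `cache` (the per-parameter sum of squared gradients) starts as zero arrays:

```python
import numpy as np

def adagrad_update(params, grads, cache, lr=0.01, eps=1e-8):
    # Adagrad accumulates squared gradients and divides the step size
    # by their root, so frequently-updated parameters get smaller steps.
    for key in params:
        cache[key] += grads[key] ** 2
        params[key] -= lr * grads[key] / (np.sqrt(cache[key]) + eps)
    return params, cache
```

Because the cache only grows, the effective learning rate decays monotonically, which motivates RMSProp below.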

RMSProp
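A minimal sketch of the RMSProp update (same illustrative layout): it replaces Adagrad's growing sum with an exponentially decaying average, so the effective learning rate does not vanish:

```python
import numpy as np

def rmsprop_update(params, grads, cache, lr=0.001, decay=0.9, eps=1e-8):
    # RMSProp keeps a moving average of squared gradients and scales
    # each parameter's step by the inverse root of that average.
    for key in params:
        cache[key] = decay * cache[key] + (1 - decay) * grads[key] ** 2
        params[key] -= lr * grads[key] / (np.sqrt(cache[key]) + eps)
    return params, cache
```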

Adam
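A minimal sketch of the Adam update (again, names and layout are assumptions): it combines RMSProp's second-moment scaling with a momentum-like first moment, plus bias correction for the early steps:

```python
import numpy as np

def adam_update(params, grads, m, v, t,
                lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    # m: moving average of gradients (first moment)
    # v: moving average of squared gradients (second moment)
    # t: step counter, used for bias correction
    t += 1
    for key in params:
        m[key] = beta1 * m[key] + (1 - beta1) * grads[key]
        v[key] = beta2 * v[key] + (1 - beta2) * grads[key] ** 2
        # Correct the zero-initialization bias of the moment estimates.
        m_hat = m[key] / (1 - beta1 ** t)
        v_hat = v[key] / (1 - beta2 ** t)
        params[key] -= lr * m_hat / (np.sqrt(v_hat) + eps)
    return params, m, v, t
```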