The_evolution_of_gradient_descent

This is the code for "The Evolution of Gradient Descent" by Siraj Raval on Youtube

Coding Challenge - Due Date, Thursday June 8th at 12 PM PST

This week's coding challenge is to write the Adam optimization algorithm from scratch. In the process you'll learn about the other gradient descent variants and why Adam works so well. Bonus points if you add a visual element by plotting the optimizer's progress in a Jupyter notebook. Good luck!
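
To get you started, here is a minimal NumPy sketch of one way Adam can be implemented from scratch. The function name and the toy quadratic objective are illustrative choices, not part of this repo:

```python
import numpy as np

def adam_step(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    # Update the biased moment estimates (exponential moving averages).
    m = beta1 * m + (1 - beta1) * grad        # first moment: mean of gradients
    v = beta2 * v + (1 - beta2) * grad ** 2   # second moment: mean of squared gradients
    # Correct the bias toward zero that the averages have at early steps.
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    # Scale each parameter's step by its own gradient history.
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v

# Toy usage: minimize f(w) = (w - 3)^2, whose gradient is 2 * (w - 3).
w, m, v = 0.0, 0.0, 0.0
for t in range(1, 5001):
    grad = 2 * (w - 3)
    w, m, v = adam_step(w, grad, m, v, t)
print(w)  # converges toward 3.0
```

The per-parameter scaling by the second moment is what lets Adam take large steps where gradients are consistently small and cautious steps where they are large or noisy.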

Overview

This is the code for this video on Youtube by Siraj Raval. In the video, we go over the different optimizer options that TensorFlow gives us. Under the hood, they are all variants of gradient descent.
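
Here is a minimal sketch (assuming the TensorFlow 1.x API that was current when the video was made) of how interchangeable those optimizers are; only the update rule changes, not the rest of the graph. The tiny linear-regression setup is an illustrative example, not code from this repo:

```python
import numpy as np
import tensorflow as tf

# A tiny linear model: predict y from x with one weight and one bias.
x = tf.placeholder(tf.float32, shape=[None, 1])
y = tf.placeholder(tf.float32, shape=[None, 1])
w = tf.Variable(tf.zeros([1, 1]))
b = tf.Variable(tf.zeros([1]))
loss = tf.reduce_mean(tf.square(tf.matmul(x, w) + b - y))

# All of these compute gradients of the loss; they differ only in how
# gradients are turned into parameter updates.
train_step = tf.train.GradientDescentOptimizer(learning_rate=0.01).minimize(loss)
# train_step = tf.train.MomentumOptimizer(0.01, momentum=0.9).minimize(loss)
# train_step = tf.train.AdadeltaOptimizer().minimize(loss)
# train_step = tf.train.AdamOptimizer(learning_rate=0.001).minimize(loss)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    xs = np.random.rand(100, 1)
    ys = 2 * xs + 1
    for _ in range(200):
        sess.run(train_step, feed_dict={x: xs, y: ys})
```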

Dependencies

  • matplotlib (pyplot ships as part of it)
  • numpy

Install any missing dependencies with pip (for example, pip install matplotlib numpy).

Usage

Run jupyter notebook to see the notebook that compares gradient descent to stochastic gradient descent in the browser. I've also got two separate Python files, one for Adadelta and one for the Nesterov method. Run those straight from the terminal with the python command.
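
For reference, here is a minimal NumPy sketch of the Nesterov update in its usual "look-ahead" formulation; the script in this repo may implement it slightly differently, and the function name and toy objective here are illustrative:

```python
import numpy as np

def nesterov_step(w, velocity, grad_fn, lr=0.01, momentum=0.9):
    # Peek ahead along the current velocity before computing the gradient,
    # which lets the method correct course earlier than plain momentum.
    lookahead = w + momentum * velocity
    velocity = momentum * velocity - lr * grad_fn(lookahead)
    return w + velocity, velocity

# Toy usage on f(w) = (w - 3)^2, whose gradient is 2 * (w - 3).
w, velocity = 0.0, 0.0
for _ in range(100):
    w, velocity = nesterov_step(w, velocity, lambda u: 2 * (u - 3))
print(w)  # approaches 3.0
```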

Credits

The credits for this code go to GRYE and dtnewman. I've merely created a wrapper to get people started.