
Primary language: Jupyter Notebook · License: Apache-2.0

Optimization Techniques

This repository contains implementations of fundamental optimization algorithms in Python. All of them are based on the gradient descent algorithm. You can try these algorithms on my Heroku app.
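All of the methods here build on the plain gradient descent update, which repeatedly steps opposite the gradient. A minimal sketch (the quadratic objective and learning rate are illustrative choices, not taken from the repository's code):

```python
def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Minimize a function by repeatedly stepping against its gradient."""
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)  # move opposite the gradient direction
    return x

# Example: minimize f(x) = (x - 3)^2, whose gradient is 2 * (x - 3)
x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
print(x_min)  # approaches the minimizer x = 3
```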

Algorithms

  1. Gradient Descent Algorithm
  2. Steepest Descent Algorithm
  3. Gradient Descent with Momentum Algorithm
  4. RMSprop Algorithm
  5. Adam Optimization Algorithm
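The variants above differ mainly in how the raw gradient is turned into an update step. A sketch of the momentum, RMSprop, and Adam update rules on a 1-D quadratic (hyperparameter values are common defaults and the objective is illustrative; the repository's implementations may differ in detail):

```python
import math

def momentum_step(grad, x, v, lr=0.1, beta=0.9):
    """Momentum: smooth the gradient with an exponential moving average."""
    v = beta * v + (1 - beta) * grad(x)
    return x - lr * v, v

def rmsprop_step(grad, x, s, lr=0.01, beta=0.9, eps=1e-8):
    """RMSprop: scale the step by a running average of squared gradients."""
    g = grad(x)
    s = beta * s + (1 - beta) * g * g
    return x - lr * g / (math.sqrt(s) + eps), s

def adam_step(grad, x, m, v, t, lr=0.01, b1=0.9, b2=0.999, eps=1e-8):
    """Adam: momentum plus RMSprop-style scaling, with bias correction."""
    g = grad(x)
    m = b1 * m + (1 - b1) * g          # first moment (mean of gradients)
    v = b2 * v + (1 - b2) * g * g      # second moment (mean of squares)
    m_hat = m / (1 - b1 ** t)          # correct the bias toward zero
    v_hat = v / (1 - b2 ** t)          # at early steps t
    return x - lr * m_hat / (math.sqrt(v_hat) + eps), m, v

# Example: minimize f(x) = (x - 3)^2 with momentum
grad = lambda x: 2 * (x - 3)
x, vel = 0.0, 0.0
for _ in range(500):
    x, vel = momentum_step(grad, x, vel)
print(x)  # approaches the minimizer x = 3
```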

Website

I built a simple website with Python Flask to make using these algorithms easier. Below are screenshots from the site.

Screenshot from Site

Screenshot from Game

Resources

Useful links for learning optimization techniques are listed below.

Articles

  1. Optimizers Explained - Adam, Momentum and Stochastic Gradient Descent
  2. An overview of gradient descent optimization algorithms
  3. Intro to optimization in deep learning: Momentum, RMSProp and Adam
  4. Gradient Descent — Intro and Implementation in python

Videos

  1. Applied Optimization - Steepest Descent
  2. Gradient Descent With Momentum (C2W2L06)
  3. RMSProp (C2W2L07)
  4. Adam Optimization Algorithm (C2W2L08)