Numerical Optimization Notes

This repo hosts my implementations of different optimization algorithms. Broadly, numerical optimization algorithms can be divided into four categories:

1.) Line Search methods : adapted to convex cost functions

  • steepest descent,
  • coordinate descent,
  • conjugate gradient,
  • quasi-Newton,
  • Newton, etc.

2.) Trust Region methods : adapted to smooth cost functions

  • Cauchy point,
  • dogleg,
  • 2D-subspace minimization,
  • nearly exact,
  • Newton method,
  • CG-Newton, etc.
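
As a concrete instance of the line-search family above, here is a minimal sketch of steepest descent with Armijo backtracking (the quadratic test function, the parameters `c` and `rho`, and the tolerances are illustrative choices, not values from these notes):

```python
import numpy as np

def steepest_descent(f, grad, x0, c=1e-4, rho=0.5, tol=1e-6, max_iter=1000):
    """Steepest descent with Armijo backtracking line search."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:     # stationary point reached
            break
        p = -g                          # steepest-descent direction
        alpha = 1.0
        # backtrack until the Armijo sufficient-decrease condition holds
        while f(x + alpha * p) > f(x) + c * alpha * (g @ p):
            alpha *= rho
        x = x + alpha * p
    return x

# Illustrative convex quadratic: f(x) = x^T A x, minimized at the origin
A = np.array([[3.0, 1.0], [1.0, 2.0]])
f = lambda x: x @ A @ x
grad = lambda x: 2.0 * A @ x
x_star = steepest_descent(f, grad, np.array([1.0, 1.0]))
```

Each iteration moves along the negative gradient and halves the step until the sufficient-decrease condition holds; on a well-conditioned convex quadratic like this, convergence is fast.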
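
The Cauchy point is the simplest trust-region step: it minimizes the quadratic model along the steepest-descent direction, clipped to the trust-region radius. A minimal sketch (the gradient, Hessian approximation `B`, and radius in the example are made up for illustration):

```python
import numpy as np

def cauchy_point(g, B, delta):
    """Minimizer of the model m(p) = g@p + 0.5*p@B@p along -g,
    restricted to the trust region ||p|| <= delta."""
    g_norm = np.linalg.norm(g)
    gBg = g @ B @ g
    if gBg <= 0:
        tau = 1.0       # model decreases all the way to the boundary
    else:
        tau = min(1.0, g_norm**3 / (delta * gBg))
    return -tau * (delta / g_norm) * g

# Illustrative data: identity Hessian, radius smaller than the full step
g = np.array([1.0, 0.0])
B = np.eye(2)
p = cauchy_point(g, B, delta=0.5)   # step gets clipped to the boundary
```

The more sophisticated steps in the list above (dogleg, 2D-subspace, nearly exact) improve on this point, but the Cauchy point alone already guarantees sufficient model decrease.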

3.) Evolutionary methods : adapted to multi-modal cost functions

  • genetic algorithms,
  • evolution strategies,
  • particle swarm,
  • ant colony,
  • simulated annealing, etc.
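
Of these, simulated annealing is the easiest to sketch: propose a random perturbation and accept worse points with probability exp(-increase/T), cooling T over time so the search settles into a good basin. A minimal version (the Rastrigin function is a standard multi-modal benchmark; the proposal scale, cooling rate, and iteration budget are illustrative):

```python
import numpy as np

def simulated_annealing(f, x0, sigma=0.5, T0=1.0, cooling=0.99,
                        n_iter=5000, seed=0):
    """Metropolis-style annealing: accept uphill moves with
    probability exp(-increase / T); T decays geometrically."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    best, fbest = x.copy(), fx
    T = T0
    for _ in range(n_iter):
        cand = x + sigma * rng.standard_normal(x.shape)
        fc = f(cand)
        # always accept improvements; accept uphill moves stochastically
        if fc < fx or rng.random() < np.exp(-(fc - fx) / T):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = x.copy(), fx
        T *= cooling
    return best, fbest

# Rastrigin: a standard multi-modal benchmark with many local minima
rastrigin = lambda x: np.sum(x**2 - 10.0 * np.cos(2.0 * np.pi * x) + 10.0)
best, fbest = simulated_annealing(rastrigin, np.array([3.0, -2.5]))
```

Because acceptance is stochastic, none of these methods guarantee the global minimum, but they can escape the local basins that would trap a line-search method.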

4.) Pattern search methods : adapted to noisy cost functions

  • Nelder-Mead simplex,
  • Torczon’s multidirectional search, etc.
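
Nelder-Mead keeps a simplex of n+1 points and updates it with reflection, expansion, contraction, and shrink steps using only function values (no gradients), which is why it tolerates noisy objectives. A bare-bones sketch with the standard coefficients (the quadratic test function and simplex step size are illustrative):

```python
import numpy as np

def nelder_mead(f, x0, step=0.5, tol=1e-8, max_iter=500):
    """Bare-bones Nelder-Mead simplex: reflection, expansion,
    (inside) contraction, and shrink, using function values only."""
    n = len(x0)
    simplex = [np.asarray(x0, dtype=float)]
    for i in range(n):                  # initial simplex: x0 + step*e_i
        v = simplex[0].copy()
        v[i] += step
        simplex.append(v)
    for _ in range(max_iter):
        simplex.sort(key=f)             # best vertex first, worst last
        if f(simplex[-1]) - f(simplex[0]) < tol:
            break
        centroid = np.mean(simplex[:-1], axis=0)
        worst = simplex[-1]
        refl = centroid + (centroid - worst)            # reflection
        if f(refl) < f(simplex[0]):
            expd = centroid + 2.0 * (centroid - worst)  # expansion
            simplex[-1] = expd if f(expd) < f(refl) else refl
        elif f(refl) < f(simplex[-2]):
            simplex[-1] = refl
        else:
            contr = centroid - 0.5 * (centroid - worst)  # contraction
            if f(contr) < f(worst):
                simplex[-1] = contr
            else:                                        # shrink toward best
                simplex = [simplex[0]] + [simplex[0] + 0.5 * (v - simplex[0])
                                          for v in simplex[1:]]
    return min(simplex, key=f)

x_min = nelder_mead(lambda v: (v[0] - 1.0)**2 + (v[1] + 2.0)**2,
                    np.array([0.0, 0.0]))
```

Production variants (e.g. SciPy's `Nelder-Mead`) add adaptive coefficients and more careful bookkeeping, but this captures the four basic moves.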