term5-optimization

Term 5: Optimization methods lab works.


Unconstrained optimization

Lab 1: 0th order methods

No gradient is needed; these are essentially just sophisticated search algorithms (a minimal sketch follows the list below).

  • Scalar optimization
    • dumb section search
    • Fibonacci section search
    • golden section search
  • Hooke-Jeeves
  • Nelder-Mead
  • Evolutionary method(s)
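
As a taste of how these derivative-free methods work, here is a minimal sketch of golden section search for a scalar function; the function f, the bracket [a, b], and the tolerance are illustrative assumptions, not the lab's actual interface.

```python
import math

def golden_section(f, a, b, tol=1e-6):
    """Minimize a unimodal scalar function f on [a, b] using no derivatives."""
    inv_phi = (math.sqrt(5) - 1) / 2  # 1/phi ~ 0.618
    # Two interior probe points that split the interval in the golden ratio
    x1 = b - inv_phi * (b - a)
    x2 = a + inv_phi * (b - a)
    f1, f2 = f(x1), f(x2)
    while b - a > tol:
        if f1 < f2:          # minimum lies in [a, x2]; reuse x1 as the new x2
            b, x2, f2 = x2, x1, f1
            x1 = b - inv_phi * (b - a)
            f1 = f(x1)
        else:                # minimum lies in [x1, b]; reuse x2 as the new x1
            a, x1, f1 = x1, x2, f2
            x2 = a + inv_phi * (b - a)
            f2 = f(x2)
    return (a + b) / 2

# Example: the minimum of (x - 2)^2 on [0, 5] is at x = 2
print(golden_section(lambda x: (x - 2) ** 2, 0, 5))
```

Only one new function evaluation is needed per iteration, which is the point of the golden-ratio split compared to a naive halving of the interval.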

Lab 2: 1st and 2nd order methods

A gradient and a Hessian matrix are needed (a minimal sketch follows the list below).

  • Gradient descent
    • dumb
    • with dynamic "learning rate"
    • the "steepest descent" algorithm
    • the conjugate gradients method
  • Newtonian descent
    • dumb
    • with dynamic "learning rate"
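
To make the 1st vs. 2nd order distinction concrete, here is a minimal sketch contrasting a "dumb" fixed-rate gradient step with a Newton step on a quadratic test problem; the matrix A, vector b, step size, and iteration counts are illustrative assumptions, not the lab's code.

```python
import numpy as np

# Quadratic test problem: f(x) = 1/2 x^T A x - b^T x, minimizer at A^{-1} b
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])

grad = lambda x: A @ x - b   # 1st order information
hess = lambda x: A           # 2nd order information (constant for a quadratic)

def gradient_descent(x, lr=0.1, steps=200):
    # "dumb" variant: fixed learning rate, step against the gradient
    for _ in range(steps):
        x = x - lr * grad(x)
    return x

def newton_descent(x, steps=5):
    # each step uses curvature: solve H d = grad instead of scaling the gradient
    for _ in range(steps):
        x = x - np.linalg.solve(hess(x), grad(x))
    return x

x0 = np.zeros(2)
print(gradient_descent(x0))   # approaches the minimizer gradually
print(newton_descent(x0))     # exact after one step on a quadratic
print(np.linalg.solve(A, b))  # analytic minimizer for comparison
```

On a quadratic the Newton step lands on the minimizer in a single iteration, while fixed-rate gradient descent needs many; that is the trade-off between cheap 1st order steps and expensive but better-directed 2nd order steps.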

Constrained optimization

Lab 3: Classical Lagrange

// to be announced

Lab 4: Advanced Lagrange (with loss functions)

// to be announced