Awesome Second-Order Methods

A curated list of resources for second-order stochastic optimization methods in machine learning.

Table of Contents

  • Books and Lecture Notes

  • Papers

      • Overview

      • Analysis of the Hessian

      • Diagonal Scaling

      • Hessian-free Optimization

      • Quasi-Newton

      • Gauss-Newton

      • Fisher Information

      • Other

  • Implementation in JAX

Implementation in JAX

  • Optax - mostly first-order accelerated methods

  • Somax - second-order stochastic solvers

  • JAXopt - deterministic second-order methods (e.g., Gauss-Newton, Levenberg-Marquardt) and stochastic first-order methods (PolyakSGD, ArmijoSGD); a usage sketch follows this list

  • KFAC-JAX - implementation of K-FAC (Kronecker-Factored Approximate Curvature) from the DeepMind team

  • AdaHessianJax - implementation of the AdaHessian optimizer by Nestor Demeure
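
As a quick illustration of the deterministic second-order solvers mentioned above, here is a minimal sketch of fitting a nonlinear least-squares model with JAXopt's Gauss-Newton solver. The exponential-decay data and the residuals function are invented for this example; GaussNewton, its residual_fun argument, and run follow JAXopt's documented interface, but check the library documentation for current signatures.

```python
import jax
import jax.numpy as jnp
import jaxopt

# Hypothetical 1-D exponential-decay data (for illustration only).
key = jax.random.PRNGKey(0)
x = jnp.linspace(0.0, 4.0, 50)
y = 2.5 * jnp.exp(-1.3 * x) + 0.05 * jax.random.normal(key, (50,))

def residuals(params, x, y):
    """Residual vector r(params); Gauss-Newton minimizes 0.5 * ||r||^2."""
    a, b = params
    return y - a * jnp.exp(-b * x)

# Deterministic Gauss-Newton solve of the nonlinear least-squares fit.
solver = jaxopt.GaussNewton(residual_fun=residuals, maxiter=20)
result = solver.run(jnp.array([1.0, 1.0]), x=x, y=y)
print(result.params)  # roughly (2.5, 1.3), up to the injected noise
```

Swapping jaxopt.GaussNewton for jaxopt.LevenbergMarquardt keeps the same residual-based interface but adds damping, which is typically more robust far from the solution.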