


Numpy-Implementation-of-Neural-Net

Numpy implementation of neural networks with SGDM, Adam, and BFGS solvers, suitable for surface fitting.

  • Capable of handling multivariate function approximation tasks ( $\mathbb{R}^{N} \rightarrow \mathbb{R}$ ).
  • This repository implements a two-stage optimization method, popular in the scientific machine learning community, that outperforms purely SGD-based methods such as Adam and SGDM on various scientific computing tasks.
  • A more generalized and sophisticated version ( $\mathbb{R}^{N} \rightarrow \mathbb{R}^{M}$ ) can be found on my MATLAB File Exchange.
  • "CompareWithTorch" compares a purely SGD-based method (Adam) against the two-stage optimization strategy.
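The two-stage strategy above can be sketched as follows. This is a minimal illustration, not the repository's code: it assumes a one-hidden-layer tanh network fitting a surface $\mathbb{R}^{2} \rightarrow \mathbb{R}$, runs Adam in pure NumPy for the first stage, and hands the same flattened parameter vector to `scipy.optimize.minimize` (method `"BFGS"`) for the second stage.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 2))      # training inputs
y = np.sin(np.pi * X[:, 0]) * X[:, 1]      # example surface to fit

H = 16                                     # hidden width (arbitrary choice)
sizes = [(2, H), (H,), (H, 1), (1,)]       # shapes of W1, b1, W2, b2

def unpack(theta):
    """Split the flat parameter vector into W1, b1, W2, b2."""
    parts, i = [], 0
    for s in sizes:
        n = int(np.prod(s))
        parts.append(theta[i:i + n].reshape(s))
        i += n
    return parts

def loss_grad(theta):
    """Mean-squared-error loss and its gradient via manual backprop."""
    W1, b1, W2, b2 = unpack(theta)
    A = np.tanh(X @ W1 + b1)               # hidden activations
    pred = (A @ W2 + b2).ravel()
    r = pred - y
    loss = 0.5 * np.mean(r ** 2)
    dpred = r[:, None] / len(y)            # d(loss)/d(pred), shape (N, 1)
    gW2 = A.T @ dpred
    gb2 = dpred.sum(axis=0)
    dZ = (dpred @ W2.T) * (1.0 - A ** 2)   # tanh' = 1 - tanh^2
    gW1 = X.T @ dZ
    gb1 = dZ.sum(axis=0)
    return loss, np.concatenate([g.ravel() for g in (gW1, gb1, gW2, gb2)])

theta = rng.normal(0.0, 0.5, size=sum(int(np.prod(s)) for s in sizes))

# Stage 1: Adam warm-up (standard update with bias correction).
m = np.zeros_like(theta)
v = np.zeros_like(theta)
lr, beta1, beta2, eps = 1e-2, 0.9, 0.999, 1e-8
for t in range(1, 501):
    _, g = loss_grad(theta)
    m = beta1 * m + (1 - beta1) * g
    v = beta2 * v + (1 - beta2) * g ** 2
    theta -= lr * (m / (1 - beta1 ** t)) / (np.sqrt(v / (1 - beta2 ** t)) + eps)

# Stage 2: BFGS refinement starting from the Adam iterate.
res = minimize(loss_grad, theta, jac=True, method="BFGS",
               options={"maxiter": 500})
print(f"loss after Adam + BFGS: {res.fun:.2e}")
```

The point of the warm-up is that quasi-Newton steps are most effective near a basin of attraction; starting BFGS from a random initialization often stalls, while starting it from a partially converged Adam iterate lets it exploit curvature for fast final convergence.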

Reference

  1. Numerical Optimization, Nocedal & Wright.
  2. Practical Quasi-Newton Methods for Training Deep Neural Networks, Goldfarb et al.
  3. Kronecker-factored Quasi-Newton Methods for Deep Learning, Yi Ren et al.