
Gradient descent with One-Step Escaping (GOSE)

This repository contains PyTorch code that implements the one-step escape method (a negative curvature descent step) from the paper "Saving Gradient and Negative Curvature Computations: Finding Local Minima More Efficiently". We combine the one-step escape method with Adam ("Adam: A Method for Stochastic Optimization") to train deep networks.
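To illustrate what a negative curvature descent step looks like, below is a minimal PyTorch sketch. It is not the code in this repository: the function names (`hessian_vector_product`, `negative_curvature_step`) and the constants (the curvature bound `beta`, the threshold `eps`) are hypothetical, and the direction of most negative curvature is approximated here by power iteration on the shifted operator `beta*I - H` using Hessian-vector products, which is one standard way to compute such a direction without forming the Hessian.

```python
import torch

def hessian_vector_product(loss, params, vec):
    # H @ vec via double backprop (Pearlmutter's trick); create_graph=True
    # keeps the graph alive so the product can be evaluated repeatedly.
    grads = torch.autograd.grad(loss, params, create_graph=True)
    flat = torch.cat([g.reshape(-1) for g in grads])
    hv = torch.autograd.grad(flat @ vec, params, retain_graph=True)
    return torch.cat([h.reshape(-1) for h in hv]).detach()

def negative_curvature_step(loss, params, lr=1e-2, power_iters=20,
                            beta=1.0, eps=1e-3):
    # beta is an assumed bound on the Hessian's eigenvalues, so the top
    # eigenvector of beta*I - H is the most negative curvature direction
    # of H; eps is the escape threshold. Both values are illustrative.
    params = list(params)
    n = sum(p.numel() for p in params)
    v = torch.randn(n, device=params[0].device)
    v = v / v.norm()
    for _ in range(power_iters):
        v = beta * v - hessian_vector_product(loss, params, v)
        v = v / v.norm()
    curvature = v @ hessian_vector_product(loss, params, v)
    if curvature >= -eps:
        return False  # no sufficiently negative curvature: do not step
    # +v and -v are equally valid eigenvector estimates, so pick the sign
    # that also descends along the gradient.
    grad = torch.cat([g.reshape(-1) for g in
                      torch.autograd.grad(loss, params, retain_graph=True)])
    if grad @ v > 0:
        v = -v
    with torch.no_grad():
        offset = 0
        for p in params:
            k = p.numel()
            p.add_(lr * v[offset:offset + k].view_as(p))  # escape step
            offset += k
    return True
```

One plausible way to combine such a step with Adam, in the spirit described above, is to run Adam updates while the gradient norm is large and invoke the negative curvature step only near first-order stationary points, so that a saddle point is escaped in a single step.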

Reference