sgd

A comprehensive gradient descent package


Stochastic Gradient Descent (SGD) is a class of first-order numerical optimization algorithms that has proven very useful for numerous machine learning tasks. Package sgd implements many popular SGD variants, such as AdaGrad, Adam, RMSProp, and Equilibrated SGD. It is designed with machine learning in mind, but is easily extensible to other optimization tasks as well.
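To make the idea concrete, here is a minimal sketch of a plain SGD update written by hand in Go, fitting a one-dimensional linear model to noisy data. The adaptive variants listed above (AdaGrad, Adam, RMSProp, Equilibrated SGD) replace the fixed learning rate below with a per-parameter, history-dependent step size. All identifiers in this sketch are illustrative and are not part of this package's API.

```go
package main

import (
	"fmt"
	"math/rand"
)

func main() {
	r := rand.New(rand.NewSource(42))

	// Synthetic data from y = 3x + 1 with a little Gaussian noise.
	n := 200
	xs := make([]float64, n)
	ys := make([]float64, n)
	for i := range xs {
		xs[i] = r.Float64() * 10
		ys[i] = 3*xs[i] + 1 + r.NormFloat64()*0.1
	}

	// Model parameters and a fixed learning rate.
	w, b := 0.0, 0.0
	lr := 0.01

	// Each step uses the gradient of the squared error at a single
	// randomly chosen example -- the "stochastic" part of SGD.
	for step := 0; step < 10000; step++ {
		i := r.Intn(n)
		pred := w*xs[i] + b
		err := pred - ys[i]
		w -= lr * err * xs[i] // d/dw of (err^2)/2 is err * x
		b -= lr * err         // d/db of (err^2)/2 is err
	}

	fmt.Printf("learned w=%.3f, b=%.3f (target w=3, b=1)\n", w, b)
}
```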