Variance Reduced Optimizers in PyTorch

This repo contains PyTorch implementations of the SVRG, SARAH (SpiderBoost), SCSG, and Geom-SARAH algorithms. It was used to produce the experiments for the paper Adaptivity of Stochastic Gradient Methods for Nonconvex Optimization (Horváth et al., 2020).
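
These methods all build a stochastic gradient estimator whose variance shrinks as the iterates converge. As a rough illustration of the shared idea, below is a minimal sketch of the SVRG estimator on a toy least-squares problem; the function names and hyperparameters (`grad_i`, `full_grad`, `lr`, etc.) are made up for the example and do not reflect this repo's API:

```python
import torch

# Minimal sketch of the SVRG gradient estimator on a toy least-squares
# problem. All names here are illustrative and are NOT this repo's API.

torch.manual_seed(0)
n, d = 64, 10
A = torch.randn(n, d)
b = torch.randn(n)

def grad_i(w, i):
    # Gradient of the per-example loss f_i(w) = 0.5 * (a_i^T w - b_i)^2
    w = w.detach().requires_grad_(True)
    loss = 0.5 * (A[i] @ w - b[i]) ** 2
    loss.backward()
    return w.grad

def full_grad(w):
    # mu = (1/n) * sum_i grad f_i(w), the full gradient at a given point
    return torch.stack([grad_i(w, i) for i in range(n)]).mean(dim=0)

w = torch.zeros(d)
lr, epochs = 0.05, 10
for _ in range(epochs):
    w_snap = w.clone()       # snapshot point \tilde{w}
    mu = full_grad(w_snap)   # full gradient at the snapshot
    for _ in range(n):       # one inner pass of stochastic steps
        i = torch.randint(n, (1,)).item()
        # SVRG estimator: unbiased, with variance that shrinks as
        # the iterate approaches the snapshot
        g = grad_i(w, i) - grad_i(w_snap, i) + mu
        w = w - lr * g

print("grad norm after training:", float(full_grad(w).norm()))
```

SARAH replaces the snapshot-anchored correction with a recursive one, v_t = grad f_i(w_t) - grad f_i(w_{t-1}) + v_{t-1}, but the overall outer/inner loop structure is the same.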

To replicate our experiments, first recreate the conda environment from `environment.yml` (e.g., `conda env create -f environment.yml`). Run scripts are available in the `runs/` directory.

If you find this useful, please consider citing:

@article{horvath2020adaptivity,
  title={Adaptivity of stochastic gradient methods for nonconvex optimization},
  author={Horv{\'a}th, Samuel and Lei, Lihua and Richt{\'a}rik, Peter and Jordan, Michael I},
  journal={arXiv preprint arXiv:2002.05359},
  year={2020}
}

References: