This code comes jointly with the following reference (also available on arXiv):
[1] Adrien Taylor and Francis Bach. "Stochastic first-order methods: non-asymptotic and computer-aided analyses via potential functions." Proceedings of the Conference on Learning Theory (COLT), 2019.
Date: February 4, 2019
Note: This code requires YALMIP along with a suitable SDP solver (e.g., SeDuMi, SDPT3, or MOSEK).
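Before running the scripts below, it may help to verify the installation with a small test problem. The following is a minimal, hypothetical sanity check (not one of the repository scripts); it lets YALMIP pick any installed SDP solver:

% Hypothetical sanity check (not a repository script): solve a tiny SDP to
% confirm that YALMIP and at least one SDP solver are on the MATLAB path.
X = sdpvar(2,2);                       % symmetric 2x2 matrix variable
constraints = [X >= 0, trace(X) == 1]; % PSD constraint and a normalization
objective = X(1,1) - X(2,2);           % simple linear objective
% A specific solver can be forced via sdpsettings, e.g. sdpsettings('solver','sedumi').
diagnostics = optimize(constraints, objective);
if diagnostics.problem == 0
    disp(value(X));                    % solution returned by the solver
else
    disp(yalmiperror(diagnostics.problem));  % human-readable status message
end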
A_GradientDescent
Code for reproducing the result of Section 3.2.1, Theorem 3 and Figure 1 (gradient method).
B_ProximalGradientDescent
Code for reproducing the result of Appendix C.2, Theorem 9, with Figure 1-style results (proximal gradient method).
C_StepsizeSelection_FirstAcceleratedMethod
Code for reproducing the first result of Appendix C.3, Theorem 10 and Figure 3 (first accelerated gradient method).
D_StepsizeSelection_SecondAcceleratedMethod
Code for reproducing the second result of Appendix C.3, Theorem 11 and Figure 4 (second accelerated gradient method).
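All of the scripts above follow the same computer-aided pattern from [1]: the decrease of a potential function along one iteration is verified by solving a small semidefinite program over functions satisfying the smooth convex interpolation conditions. As a rough illustration, here is a minimal, hypothetical YALMIP sketch (not one of the repository scripts) for plain gradient descent with step size 1/L and the classical potential phi_k = k*(f(x_k) - f_*) + (L/2)*||x_k - x_*||^2, which is not necessarily the exact potential of Theorem 3; the SDP maximizes phi_{k+1} - phi_k, and its optimal value should be nonpositive:

% Hypothetical sketch (not a repository script): verify numerically that
% phi_{k+1} <= phi_k over one gradient step x1 = x0 - (1/L)*g0 on an L-smooth
% convex function, by maximizing phi_{k+1} - phi_k over all functions that are
% consistent with the smooth convex interpolation inequalities.
L = 1; k = 3;                                 % smoothness constant and iteration index (arbitrary)
a0 = k; a1 = k + 1;                           % weights of the classical potential
G  = sdpvar(3,3);                             % Gram matrix of the vectors [x0-xs, g0, g1]
f  = sdpvar(2,1);                             % f(x0) and f(x1), with f(xs) = 0 at the optimum xs
xs = [0;0;0]; x0 = [1;0;0]; x1 = [1;-1/L;0];  % coordinates of xs, x0, x1 in the Gram basis
gs = [0;0;0]; g0 = [0;1;0]; g1 = [0;0;1];     % gradients at xs, x0, x1 (gs = 0)
pts = {xs, x0, x1}; grads = {gs, g0, g1}; fvals = [0; f(1); f(2)];
cons = [G >= 0];
for i = 1:3
    for j = 1:3
        if i ~= j                             % interpolation inequality for L-smooth convex functions
            cons = [cons, fvals(i) >= fvals(j) + grads{j}'*G*(pts{i}-pts{j}) ...
                          + 1/(2*L)*(grads{i}-grads{j})'*G*(grads{i}-grads{j})];
        end
    end
end
obj = a1*f(2) + L/2*(x1'*G*x1) - (a0*f(1) + L/2*(x0'*G*x0));  % phi_{k+1} - phi_k
optimize(cons, -obj);                         % maximize the potential increase
disp(value(obj));                             % should be <= 0 (up to solver accuracy)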
A_StochasticGradientDescent
Code for reproducing the first result of Section 3.2.2, Theorem 5 and Figure 2 (stochastic gradient descent).
B_StochasticGradientDescentWithAveraging
Code for reproducing the second result of Section 3.2.2, Theorem 6 (stochastic gradient descent with averaging).
C_StochasticGradientDescentWithPrimalAveraging
Code for reproducing the third result of Section 3.2.2, Theorem 7 (stochastic gradient descent with primal averaging).
D_StochasticGradientDescent_EvaluationAtAveragedPoint
Code for reproducing the result of Appendix D.4, Theorem 12 (stochastic gradient descent with evaluation at the averaged iterate).
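The four algorithms above are standard stochastic gradient variants: x_{k+1} = x_k - gamma_k*g_k with an unbiased gradient estimate g_k, possibly combined with some form of iterate averaging. As a purely illustrative toy example (not a repository script, and with a placeholder step size rather than the ones analyzed in Theorems 5-7 and 12), stochastic gradient descent with iterate averaging on a least-squares problem could look as follows:

% Hypothetical toy run (not a repository script): SGD with iterate averaging on
% f(x) = 1/(2n)*||A*x - b||^2, using one uniformly sampled row of A per
% iteration as an unbiased gradient estimate.
rng(0);
n = 200; d = 10;
A = randn(n, d); b = A*randn(d, 1) + 0.1*randn(n, 1);
x = zeros(d, 1); xbar = zeros(d, 1);
N = 5000;
for t = 1:N
    i     = randi(n);                      % sample one data point uniformly at random
    g     = A(i,:)'*(A(i,:)*x - b(i));     % unbiased estimate of the gradient of f at x
    gamma = 1/sqrt(t);                     % placeholder step size (not the one from [1])
    x     = x - gamma*g;                   % stochastic gradient step
    xbar  = ((t-1)*xbar + x)/t;            % running average of the iterates
end
fprintf('f(xbar) = %.4e\n', norm(A*xbar - b)^2/(2*n));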
Stochastic first-order methods: unbiased oracles arising from sampling in expectations of smooth convex functions
A_ParameterSelection
Code for reproducing the result of Section 4 and Appendix E, Theorem 8 and Figure 5 (obtaining stochastic gradient descent with primal averaging, using the parameter selection technique).
A_ParameterSelection
Code for recovering the result of Appendix G, Theorem 15 (stochastic gradient descent with primal averaging from the parameter selection technique).
A_StochasticGradientDescentWithPrimalAveraging
Code for verifying the result of Appendix F, Theorem 15 (stochastic gradient descent with primal averaging, parameter selection technique).