CS6230-OML
Coursework pertaining to CS6230: Optimization Methods in Machine Learning, offered in Fall 2017
Assignment - 1
- Convex Sets and Functions
- Properties of (Quasi)Convex/Concave Functions
Programming Assignment - 1
- Plotting special functions
- Experimenting with Gradient Descent under different step-size selection schemes
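As a small illustration of the kind of step-size experiments above, here is a hedged NumPy sketch (the quadratic objective and both schedules are made up for illustration) comparing a constant step 1/L, where L is the largest eigenvalue of A, against a diminishing step 1/(k+1):

```python
import numpy as np

# Minimize f(x) = 0.5 * x^T A x - b^T x with gradient descent under two
# illustrative step-size schemes; A, b, and the schedules are assumptions
# for this sketch, not taken from the assignment.
A = np.array([[3.0, 0.5], [0.5, 1.0]])
b = np.array([1.0, -2.0])
x_star = np.linalg.solve(A, b)  # closed-form minimizer for reference

def grad(x):
    return A @ x - b

def gd(step_fn, iters=500):
    x = np.zeros(2)
    for k in range(iters):
        x = x - step_fn(k) * grad(x)
    return x

L = np.linalg.eigvalsh(A).max()            # Lipschitz constant of the gradient
x_fixed = gd(lambda k: 1.0 / L)            # constant step size 1/L
x_dimin = gd(lambda k: 1.0 / (k + 1.0))    # diminishing step size 1/(k+1)

print(np.linalg.norm(x_fixed - x_star))
print(np.linalg.norm(x_dimin - x_star))
```

The constant 1/L step converges linearly on this strongly convex quadratic, while the diminishing schedule still converges but much more slowly.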
Assignment - 2
- Subgradient calculus and properties
- Proofs based on proximal gradient descent
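The proximal gradient method the proofs revolve around can be sketched on the lasso, where the proximal operator of the l1 norm is soft-thresholding. This is a minimal illustrative instance (the problem data below are synthetic assumptions, not from the assignment):

```python
import numpy as np

# Proximal gradient descent (ISTA) for the lasso problem
#   min_x 0.5 * ||A x - y||^2 + lam * ||x||_1.
# The prox of lam * ||.||_1 is coordinatewise soft-thresholding.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
x_true = np.array([1.0, 0.0, -2.0, 0.0, 0.5])
y = A @ x_true
lam = 0.1

def soft_threshold(v, t):
    # prox of t * ||.||_1: shrink each entry toward zero by t
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

L = np.linalg.eigvalsh(A.T @ A).max()  # Lipschitz constant of the smooth part
x = np.zeros(5)
for _ in range(300):
    g = A.T @ (A @ x - y)              # gradient of the smooth part
    x = soft_threshold(x - g / L, lam / L)

print(x)
```

Each iteration takes a gradient step on the smooth least-squares term and then applies the prox of the nonsmooth l1 term, which is exactly the update the convergence proofs analyze.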
Programming Assignment - 2
- Partly theoretical: Karush-Kuhn-Tucker (KKT) conditions, strong convexity, and the dual formulation
- Cost-sensitive SVM implemented in CVXPY
Paper Presentation
- LaTeX source for the slides used to present the paper Newton-Type Methods for Non-Convex Optimization Under Inexact Hessian Information by Peng Xu, Farbod Roosta-Khorasani and Michael W. Mahoney
Project
- Empirical study of the benefits of adaptive gradient methods
- Largely inspired by The Marginal Value of Adaptive Gradient Methods in Machine Learning by Ashia C. Wilson, Rebecca Roelofs, Mitchell Stern, Nathan Srebro and Benjamin Recht
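In the spirit of that comparison, here is a minimal NumPy sketch contrasting plain gradient descent with Adam on a toy least-squares problem; the problem data, learning rates, and iteration counts are assumptions for illustration, while the Adam moment updates follow the standard defaults (beta1=0.9, beta2=0.999, eps=1e-8):

```python
import numpy as np

# Toy least-squares problem: min_x 0.5 * mean((A x - y)^2), solved with
# plain gradient descent and with Adam.  All data are synthetic.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 10))
x_star = rng.standard_normal(10)
y = A @ x_star

def loss(x):
    return 0.5 * np.mean((A @ x - y) ** 2)

def grad(x):
    return A.T @ (A @ x - y) / len(y)

def sgd(iters=5000, lr=0.05):
    x = np.zeros(10)
    for _ in range(iters):
        x = x - lr * grad(x)
    return x

def adam(iters=5000, lr=0.01, b1=0.9, b2=0.999, eps=1e-8):
    x, m, v = np.zeros(10), np.zeros(10), np.zeros(10)
    for t in range(1, iters + 1):
        g = grad(x)
        m = b1 * m + (1 - b1) * g          # first-moment (mean) estimate
        v = b2 * v + (1 - b2) * g ** 2     # second-moment estimate
        m_hat = m / (1 - b1 ** t)          # bias correction
        v_hat = v / (1 - b2 ** t)
        x = x - lr * m_hat / (np.sqrt(v_hat) + eps)
    return x

print(loss(sgd()), loss(adam()))
```

On this well-conditioned deterministic problem plain gradient descent reaches the minimizer to machine precision, while Adam with a constant step typically settles in a small neighborhood of it, a contrast consistent with the paper's theme that adaptivity is not uniformly beneficial.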