Course at DS3 summer school 2021
- Alexandre Gramfort, Inria
- Quentin Bertrand, Inria
Modern machine learning relies heavily on optimization tools, typically to minimize a loss function over a training set. The objective of this course is to give an overview of the most commonly used gradient-based algorithms: (proximal) gradient descent, (proximal) coordinate descent, L-BFGS and stochastic gradient descent. As the course is meant to be practical, you will see how all these algorithms can be implemented in Python on a logistic regression problem for binary classification. Slides cover the theory, and Jupyter notebooks are provided for the programming sessions. Every notebook ends with exercises for further practice.
Requirements: Python (>=3.6) with numpy, scipy and matplotlib
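To give a flavour of what the notebooks cover, here is a minimal sketch of plain gradient descent on the logistic regression loss using numpy. The function names, the toy data and the step-size choice are illustrative assumptions, not taken from the course material.

```python
import numpy as np


def sigmoid(z):
    # Logistic sigmoid function
    return 1.0 / (1.0 + np.exp(-z))


def logistic_loss(w, X, y):
    # Mean logistic loss for labels y in {-1, +1}
    return np.mean(np.log1p(np.exp(-y * (X @ w))))


def logistic_grad(w, X, y):
    # Gradient of the mean logistic loss
    return -X.T @ (y * sigmoid(-y * (X @ w))) / X.shape[0]


def gradient_descent(X, y, step_size, n_iter=100):
    # Plain (full-batch) gradient descent with a fixed step size
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        w -= step_size * logistic_grad(w, X, y)
    return w


# Toy usage on synthetic data (illustrative only)
rng = np.random.RandomState(0)
X = rng.randn(200, 5)
y = np.sign(X @ rng.randn(5) + 0.1 * rng.randn(200))

# A classical step size is 1 / L, where L = ||X||_2^2 / (4 n)
# is the Lipschitz constant of the gradient of the mean logistic loss.
L = np.linalg.norm(X, ord=2) ** 2 / (4 * X.shape[0])
w_hat = gradient_descent(X, y, step_size=1.0 / L)
print(logistic_loss(w_hat, X, y))
```

The same loss and gradient functions can then be reused to compare the other algorithms covered in the course (coordinate descent, L-BFGS, SGD) on an identical problem.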