Official coursebook information
Lectures:
Fri 13:15-15:00 in CO2
Exercises:
Fri 15:15-17:00 in BC01
This course gives an overview of modern mathematical optimization methods for applications in machine learning and data science. In particular, the scalability of algorithms to large datasets is discussed, both in theory and in implementation.
- Instructors:
- Martin Jaggi martin.jaggi@epfl.ch
- Nicolas Flammarion nicolas.flammarion@epfl.ch
- Assistants:
- Aditya Varre aditya.varre@epfl.ch
- Amirkeivan Mohtashami amirkeivan.mohtashami@epfl.ch
- Matteo Pagliardini matteo.pagliardini@epfl.ch
- Scott Pesme scott.pesme@epfl.ch
Contents:
Convexity, Gradient Methods, Proximal Algorithms, Subgradient Methods, Stochastic and Online Variants of the mentioned methods, Coordinate Descent, Frank-Wolfe, Accelerated Methods, Primal-Dual Context and Certificates, Lagrange and Fenchel Duality, Second-Order Methods including Quasi-Newton Methods, Derivative-Free Optimization.
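As a taste of the first topics, here is a minimal, illustrative sketch of plain gradient descent (not part of the official course materials; the problem and all names are chosen for illustration only):

```python
import numpy as np

def gradient_descent(grad, x0, lr=0.1, n_steps=100):
    """Plain gradient descent: x_{t+1} = x_t - lr * grad(x_t)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_steps):
        x = x - lr * grad(x)
    return x

# Toy example: minimize f(x) = ||Ax - b||^2 / 2, with gradient A^T (Ax - b).
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 3))
b = rng.standard_normal(20)
grad = lambda x: A.T @ (A @ x - b)

x_star = gradient_descent(grad, np.zeros(3), lr=0.01, n_steps=2000)
# For this smooth, strongly convex objective, the iterates converge to the
# least-squares solution, as covered in the convergence lectures.
```

The step size here is chosen small enough (relative to the smoothness constant of the quadratic) for convergence; the lectures make this choice precise.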
Advanced Contents:
Parallel and Distributed Optimization Algorithms
Computational Trade-Offs (Time vs Data vs Accuracy), Lower Bounds
Non-Convex Optimization: Convergence to Critical Points, Alternating minimization, Neural network training
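To illustrate the stochastic side of the syllabus, the following is an informal sketch of stochastic gradient descent on a toy least-squares problem (again not course material; all data and names are invented for illustration):

```python
import numpy as np

def sgd(grad_i, n, x0, lr=0.01, n_epochs=100, seed=0):
    """Stochastic gradient descent: each step uses the gradient of a single
    randomly chosen component f_i instead of the full sum over all n samples."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(n_epochs):
        for i in rng.permutation(n):  # one random pass over the data per epoch
            x = x - lr * grad_i(x, i)
    return x

# Toy problem: linear regression, f_i(x) = (a_i^T x - b_i)^2 / 2.
rng = np.random.default_rng(1)
A = rng.standard_normal((100, 3))
x_true = rng.standard_normal(3)
b = A @ x_true + 0.1 * rng.standard_normal(100)
grad_i = lambda x, i: (A[i] @ x - b[i]) * A[i]

x_sgd = sgd(grad_i, len(A), np.zeros(3))
# With a constant step size, SGD converges to a small neighbourhood of the
# minimizer; the lectures quantify this neighbourhood and the convergence rate.
```

Each step costs O(d) instead of O(nd), which is exactly the scalability-to-large-datasets trade-off the course analyzes.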
Nr | Date | Topic | Materials | Exercises |
---|---|---|---|---|
1 | 25.2. | Introduction, Convexity | notes, slides | lab01 |
2 | 4.3. | Gradient Descent | notes, slides | lab02 |
3 | 11.3. | Projected Gradient Descent | notes, slides | lab03 |
4 | 18.3. | Proximal and Subgradient Descent | notes, slides | lab04 |
5 | 25.3. | Stochastic Gradient Descent, Non-Convex Optimization | notes, slides | lab05 |
6 | 1.4. | Non-Convex Optimization, Accelerated Gradient Descent | notes, slides | lab06 |
7 | 8.4. | Newton's Method & Quasi-Newton | notes, slides | lab07 |
- | 15.4. | Easter vacation | - | - |
- | 22.4. | Easter vacation | - | - |
8 | 29.4. | Coordinate Descent | notes, slides | lab08 |
9 | 6.5. | Frank-Wolfe | notes, slides | lab09 |
10 | 13.5. | Accelerated Gradient, Gradient-free, adaptive methods | notes, slides | lab10 |
11 | 20.5. | Opt for ML in Practice | notes, slides | Q&A |
12 | 27.5. | Mini-Project week | - | - |
13 | 3.6. | Opt for ML in Practice | notes, slides | Q&A Projects |
The weekly exercise sessions mix theoretical and practical Python exercises on each week's topic (starting in week 2). Solutions to the theory exercises are available here; solutions to the practicals are in the lab folder.
A mini-project will focus on practical implementation: we encourage students to investigate the real-world performance of one of the studied optimization algorithms or its variants, providing solid empirical evidence for some aspects of its behaviour on a real machine-learning task. The project is mandatory and done in groups of 3 students. It counts for 25% of the final grade. Project reports (3-page PDF) are due June 17th. Here is a detailed project description.
The final written exam takes place in the exam session on Thursday 07.07.2022, from 09h15 to 12h15 (in CE1, CE1106, CE3). Format: closed book. Theoretical questions similar to the exercises. You are allowed to bring one cheat sheet (A4 paper, both sides may be used). For practice: exam 2020, solutions 2020, exam 2019, solutions 2019, exam 2018, solutions 2018.
Literature:
- Convex Optimization: Algorithms and Complexity, by SĂ©bastien Bubeck (free online)
- Convex Optimization, Stephen Boyd and Lieven Vandenberghe (free online)
- Introductory Lectures on Convex Optimization, Yurii Nesterov (free online)