OptML_course

EPFL Course - Optimization for Machine Learning - CS-439

Official coursebook information

Lectures: Fri 13:15-15:00 in CE2

Exercises: Fri 15:15-17:00 in CE2

This course provides an overview of modern mathematical optimization methods for applications in machine learning and data science. In particular, the scalability of algorithms to large datasets is discussed both in theory and in implementation.

Team

Contents:

Convexity, Gradient Methods, Proximal algorithms, Subgradient Methods, Stochastic and Online Variants of mentioned methods, Coordinate Descent, Frank-Wolfe, Accelerated Methods, Primal-Dual context and certificates, Lagrange and Fenchel Duality, Second-Order Methods including Quasi-Newton Methods, Derivative-Free Optimization.
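
To make the first of these topics concrete, below is a minimal, illustrative sketch of plain gradient descent on a smooth convex objective. It is not taken from the course labs; the least-squares objective and the constant 1/L step size are just one standard choice.

```python
import numpy as np

def gradient_descent(A, b, num_iters=100):
    """Gradient descent with constant step size 1/L on f(x) = 0.5*||Ax - b||^2."""
    L = np.linalg.norm(A, 2) ** 2          # smoothness constant: largest eigenvalue of A^T A
    x = np.zeros(A.shape[1])
    for _ in range(num_iters):
        grad = A.T @ (A @ x - b)           # gradient of the least-squares objective
        x = x - grad / L                   # step size 1/L for a convex, L-smooth objective
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 10))
b = rng.standard_normal(50)
x_hat = gradient_descent(A, b)
print(np.linalg.norm(A.T @ (A @ x_hat - b)))   # gradient norm; should be small
```

Many of the later topics can be read as modifications of this loop: projected gradient descent adds a projection after each step, proximal methods apply a proximal operator, stochastic variants replace the full gradient by a sampled one, and accelerated methods add momentum.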

Advanced Contents:

Parallel and Distributed Optimization Algorithms, Synchronous and Asynchronous Communication.

Computational Trade-Offs (Time vs Data vs Accuracy), Lower Bounds.

Non-Convex Optimization: Convergence to Critical Points, Alternating Minimization.
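
As one illustration of the non-convex material, here is a minimal sketch of alternating minimization for low-rank matrix factorization (an illustrative example, not a course lab): the joint problem in U and V is non-convex, but with one factor fixed the other is found by a convex least-squares subproblem.

```python
import numpy as np

def alternating_minimization(M, rank, num_iters=50, seed=0):
    """Minimize ||M - U V^T||_F^2 by alternating exact least-squares updates."""
    m, n = M.shape
    rng = np.random.default_rng(seed)
    U = rng.standard_normal((m, rank))
    V = rng.standard_normal((n, rank))
    for _ in range(num_iters):
        # With V fixed, U solves a convex least-squares problem.
        U = np.linalg.lstsq(V, M.T, rcond=None)[0].T
        # With U fixed, V solves a convex least-squares problem.
        V = np.linalg.lstsq(U, M, rcond=None)[0].T
    return U, V

rng = np.random.default_rng(1)
M = rng.standard_normal((40, 8)) @ rng.standard_normal((8, 30))   # exactly rank-8 matrix
U, V = alternating_minimization(M, rank=8)
print(np.linalg.norm(M - U @ V.T))   # should be close to zero
```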

Program:

Nr Date Topic Materials Exercises
#1 23.2. Introduction, Convexity notes, slides lab01
#2 2.3. Gradient Descent notes, slides lab02
#3 9.3. Projected Gradient Descent notes, slides lab03
#4 16.3. Projected, Proximal Gradient Descent notes, slides lab04
#5 23.3. Subgradient, Stochastic Gradient Descent notes, slides lab05
. 30.3. Easter vacation -
. 6.4. Easter vacation -
#6 13.4. Newton's Method notes, slides lab06
#7 20.4. Quasi-Newton methods notes, slides lab07
#8 27.4. Frank-Wolfe notes, slides lab08
#9 4.5. Coordinate Descent notes, slides lab09
#10 11.5. Mini-Project week
#11 18.5. Accelerated Methods notes, slides lab10
#12 25.5. Duality, Gradient-free methods, Applications notes, slides
#13 1.6. Opt for ML in Practice slides

Exercises:

The weekly exercises are a mix of theoretical and practical Python exercises on the corresponding week's topic (starting in week 2). Solutions to the theory exercises are available here, and solutions to the practicals are in the lab folder.

Mini-project:

The optional mini-project focuses on practical implementation: we encourage students to investigate the real-world performance of one of the studied optimization algorithms or one of its variants, providing solid empirical evidence for some aspects of its behaviour on a real machine-learning task. The project is done in groups of 2-3 students. If students decide to do the project and their project grade exceeds their exam grade, the project counts for 20% of the final grade. Project reports (2-page PDF) are due May 24th. Here is a detailed project description.
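
As a purely hypothetical example of the kind of experiment a mini-project could run (not an official template or requirement), the sketch below compares full gradient descent with stochastic gradient descent on the same least-squares objective and logs the objective value after each pass over the data.

```python
import numpy as np

def objective(A, b, x):
    return 0.5 * np.linalg.norm(A @ x - b) ** 2

def run_gd(A, b, num_iters=50):
    """Full gradient descent with constant step size 1/L."""
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    history = []
    for _ in range(num_iters):
        x = x - A.T @ (A @ x - b) / L
        history.append(objective(A, b, x))
    return history

def run_sgd(A, b, num_epochs=50, lr=0.01, seed=0):
    """Stochastic gradient descent: one randomly ordered sample per update."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    history = []
    for _ in range(num_epochs):
        for i in rng.permutation(n):
            x = x - lr * (A[i] @ x - b[i]) * A[i]   # gradient of 0.5*(a_i^T x - b_i)^2
        history.append(objective(A, b, x))
    return history

rng = np.random.default_rng(3)
A = rng.standard_normal((200, 20))
b = rng.standard_normal(200)
print("GD  final objective:", run_gd(A, b)[-1])
print("SGD final objective:", run_sgd(A, b)[-1])
```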

Assessment:

Final written exam during the exam session. Date: Friday 06.07.2018, 16h15 to 19h15 (in CE1515). Format: closed book. Theoretical questions similar to the exercises. You are allowed to bring one cheat sheet (A4 paper, both sides may be used), either handwritten or printed in a font size of at least 11 points.

Links to related courses and materials

Recommended Books