Role | Name | Email |
---|---|---|
Instructor | Joan Bruna | bruna@cims.nyu.edu |
TA | Sanyam Kapoor | sanyam@nyu.edu |
This graduate-level course presents fundamental tools of probabilistic graphical models, with an emphasis on designing and manipulating generative models, and on performing inferential tasks when they are applied to various types of data.
We will study latent graphical models (Latent Dirichlet Allocation, Gaussian processes), state-space models (Kalman filter, HMMs), Gibbs models, and deep generative models (variational autoencoders, GANs), covering both the methods (inference, sampling algorithms, learning, exponential families) and modeling applications to text, images, and physics data.
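To give a flavor of the inferential tasks above, here is a minimal sketch (my own illustration, not course material) of exact posterior inference by enumeration in a two-node Bayesian network; the network, variable names, and CPT numbers are made up for illustration.

```python
# Hypothetical two-node Bayesian network, Cloudy -> Rain, with made-up CPTs.
p_cloudy = {True: 0.5, False: 0.5}          # P(Cloudy)
p_rain_given_cloudy = {True: 0.8, False: 0.1}  # P(Rain=True | Cloudy)

def joint(cloudy, rain):
    """P(Cloudy=cloudy, Rain=rain) from the factorization of the DAG."""
    p_rain = p_rain_given_cloudy[cloudy]
    return p_cloudy[cloudy] * (p_rain if rain else 1.0 - p_rain)

# Posterior P(Cloudy | Rain=True) by enumerating the latent variable
# and renormalizing.
evidence = True
unnormalized = {c: joint(c, evidence) for c in (True, False)}
z = sum(unnormalized.values())
posterior = {c: p / z for c, p in unnormalized.items()}
print(posterior)  # {True: ~0.889, False: ~0.111}
```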
Lectures: Tuesdays, 4:55-6:35pm, in 60 FA 110.
Lab sessions: Mondays, 4:55-6:35pm, in 60 FA 110.
Office hours (JB): Tuesdays, 3:00pm-4:45pm. Location: 60 5th Ave, 6th floor, room 612.
Office hours (SK): TBA.
Grading: problem sets (40%) + midterm exam (25%) + final project (30%) + participation (5%).
We will use Piazza to answer questions and post announcements about the course. Students' use of Piazza, particularly the quality of their answers to other students' questions, will contribute toward their participation grade.
Most of the lecture videos will be posted to NYU Classes. Note, however, that class attendance is required.
This semester, the lab sessions will feature the inverse curricula pioneered by C. Resnick in my previous class (see here and here for more details). The two topics to which we will apply depth-first learning are Normalizing Flows and an Introduction to Model-Based RL.
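For the first lab topic, the following is a minimal sketch (illustrative only, not lab code; the parameter values are made up) of the change-of-variables formula that underlies normalizing flows, using a single invertible affine map.

```python
import math

# An affine flow y = a*x + b transforms a base density p_x into
# p_y(y) = p_x((y - b) / a) / |a|, i.e. log p_y = log p_x - log|a|.

def base_logpdf(x):
    """Standard normal log-density: the base distribution of the flow."""
    return -0.5 * (x * x + math.log(2.0 * math.pi))

def affine_flow_logpdf(y, a=2.0, b=1.0):
    """Log-density of y under the flow, with the log-det Jacobian correction."""
    x = (y - b) / a                            # invert the flow
    return base_logpdf(x) - math.log(abs(a))   # subtract log|dy/dx|

print(affine_flow_logpdf(1.0))  # equals log N(1.0; mean=1, std=2)
```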
Week | Lecture Date | Topic | Reference | Deliverables |
---|---|---|---|---|
1 | 9/4 | Lec1: Introduction and Logistics. Inference Examples. Bayesian Networks. Slides | Murphy, Chapter 1 (optional; review for most); Notes on Bayesian networks (Sec. 2.1); Algorithm for d-separation (optional) | PS1, due 9/11 |
2 | 9/11 | Guest Lecture: Rajesh Ranganath (NYU) | | |
3 | 9/18 | Lec2: Undirected Graphical Models. Markov Random Fields. Ising Model. Applications to Statistical Physics. Slides | Notes on MRFs (Sec. 2.2-2.4); Notes on exponential families; Notes on the Hammersley-Clifford Theorem | PS2, due 9/25 |
4 | 9/25 | Lec3: The Hammersley-Clifford Theorem. Belief Propagation. Slides | Barber, Sec. 27.1-27.3.1; Murphy, Sec. 24.1-24.2.4; Introduction to Probabilistic Topic Models; explore topic models of politics over time, state-of-the-union addresses, and Wikipedia | PS3, due 10/9 (IPython notebook); Project Proposal, due 10/23 |
5 | 10/2 | Lec4: BP (cont'd). Gibbs Sampling. PCA. Slides | Elements of Statistical Learning, Ch. 14; Finding Structure in Randomness (...), Halko, Martinsson, Tropp | |
6 | 10/9 | No lecture (Legislative Monday) | | |
7 | 10/16 | Midterm Exam | | |
8 | 10/23 | Lec5: PCA (cont'd). ICA. The EM Algorithm. Slides | Graphical Models, Exponential Families and Variational Inference, Chapter 3; Variational Inference with Stochastic Search | PS4, due 11/5 |
9 | 10/30 | Lec6: EM (cont'd). MCMC. Slides | Variational Inference: A Review for Statisticians, by Blei, Kucukelbir, McAuliffe; Auto-Encoding Variational Bayes (Kingma, Welling) | |
10 | 11/6 | Lec7: MCMC (cont'd). Variational Inference. Slides | Graphical Models, Exponential Families and Variational Inference, Chapter 3; Variational Inference with Stochastic Search | PS5 |
11 | 11/13 | Lec8: VI (cont'd). Variational Autoencoders. Slides | References on slides | |
12 | 11/20 | Lec9: VAE, Structured Output Prediction. Slides | | PS6, due 12/8 |
13 | 11/27 | Lec10: Structured Output Prediction (cont'd). EP. Unrolling Inference with Neural Networks. Slides | References in slides; Expectation Propagation notes | |
14 | 12/4 | Lec11: Deep Generative Models (1/2): Implicit Modeling. Slides | Geometrical Insights for Implicit Modeling, Bottou et al., and references in slides | Project writeup, due 12/19 |
15 | 12/11 | Lec12: Deep Generative Models (2/2): Auto-regressive Models. Open Problems. Slides | | |
16 | 12/18 | Final Day: Poster Presentations of Final Projects. Location: Center for Data Science, 60 5th Ave, 7th floor open space | | |
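To connect two of the topics in the schedule above (the Ising model from Lec2 and Gibbs sampling from Lec4), here is a compact, illustrative sketch of a single-site Gibbs sampler on a 2D Ising lattice; it is not course code, and the lattice size, inverse temperature, and sweep count are arbitrary.

```python
import math
import random

def gibbs_ising(n=16, beta=0.4, sweeps=100, seed=0):
    """Single-site Gibbs sampling on an n-by-n Ising lattice with
    toroidal (wrap-around) boundary conditions."""
    rng = random.Random(seed)
    spins = [[rng.choice((-1, 1)) for _ in range(n)] for _ in range(n)]
    for _ in range(sweeps):
        for i in range(n):
            for j in range(n):
                # Sum of the four neighboring spins.
                s = (spins[(i - 1) % n][j] + spins[(i + 1) % n][j]
                     + spins[i][(j - 1) % n] + spins[i][(j + 1) % n])
                # Conditional P(spin = +1 | neighbors) is logistic in beta*s:
                # exp(beta*s) / (exp(beta*s) + exp(-beta*s)).
                p_up = 1.0 / (1.0 + math.exp(-2.0 * beta * s))
                spins[i][j] = 1 if rng.random() < p_up else -1
    return spins

spins = gibbs_ising()
n = len(spins)
magnetization = sum(map(sum, spins)) / n**2
print(f"mean magnetization: {magnetization:.3f}")
```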
There is no required book. Assigned readings will come from freely-available online material.
- Kevin Murphy, Machine Learning: a Probabilistic Perspective, MIT Press, 2012. You can read this online for free from NYU Libraries. We recommend the latest (4th) printing, as earlier printings had many typos. You can tell which printing you have as follows: check the inside cover, below the "Library of Congress" information. If it says "10 9 8 ... 4" you've got the (correct) fourth printing.
- Daphne Koller and Nir Friedman, Probabilistic Graphical Models: Principles and Techniques, MIT Press, 2009.
- Mike Jordan's notes on Probabilistic Graphical Models
- MIT lecture notes on algorithms for inference.
- Probabilistic Programming and Bayesian Methods for Hackers by Cam Davidson-Pilon
- Trevor Hastie, Rob Tibshirani, and Jerry Friedman, Elements of Statistical Learning, Second Edition, Springer, 2009. (Can be downloaded as a PDF file.)
- David Barber, Bayesian Reasoning and Machine Learning, Cambridge University Press, 2012. (Can be downloaded as a PDF file.)
- Review notes from Stanford's machine learning class
- Sam Roweis's probability review
- Convex Optimization by Stephen Boyd and Lieven Vandenberghe.
- Carlos Fernandez's notes on Statistics and Probability for Data Science (DS-GA 1002)
- Mike Jordan and Martin Wainwright, Graphical Models, Exponential Families, and Variational Inference
We expect you to try solving each problem set on your own. However, if you get stuck on a problem, we encourage you to collaborate with other students in the class, subject to the following rules:
- You may discuss a problem with any student in this class and work together on solving it. This can involve brainstorming and talking through possible solutions, but it should not involve one student telling another a complete solution.
- Once you solve the homework, you must write up your solutions on your own, without looking at other people's write-ups or giving your write-up to others.
- In your solution to each problem, you must write down the names of everyone with whom you discussed it. This will not affect your grade.
- Do not consult solution manuals or other people's solutions from similar courses.
During the semester you are allowed at most two extensions on homework assignments. Each extension is for at most 48 hours and carries a penalty of 25% off that assignment's grade.