Inference and Representation Fall 2019

Course staff

Soledad Villar (Instructor): soledad.villar@nyu.edu
Zhengdao Chen (TA): zc1216@nyu.edu
Efe Onaran (Grader): eonaran@nyu.edu

Syllabus

This is a graduate-level course that presents fundamental tools of statistical inference, probabilistic graphical models, and generative models for machine learning.

Covered topics include latent graphical models (latent Dirichlet allocation, Gaussian processes), state-space models (the Kalman filter, hidden Markov models), Gibbs models, and deep generative models (variational autoencoders, GANs).

Lecture (required)

Tuesdays 4:55pm-6:35pm, in 60 5th Ave, FA 110.

Recitation (required)

Mondays 4:55pm-5:45pm, in 60 5th Ave, FA 110.

Office hours

SV: Tuesdays, 3:00pm-4:45pm. Location: 60 5th Ave, 6th floor, room 617.

Grading

Homework 40%, midterm exam 25%, final project 30%, participation 5%.

Materials

There is no required book. Assigned readings will come from freely available online material.

  • Core Materials
  • Background on Probability and Optimization
  • Further Reading
  • Additional Readings for the Recitation

Important dates

  • October 15. No class and no office hours (legislative day: classes meet on a Monday schedule).
  • October 29. Midterm exam.
  • December 3, 4, and 6. Final project presentations (see schedule below).
  • December 12. Final project due.
Schedule

  • Sept 3rd. Introduction to inference and graphical models. Priors, likelihood functions, posteriors. References: MacKay chapters 2 and 21; section 2.1 of Jordan and Wainwright; example seen in class. Homework: HW 1 due 9/16.
  • Sept 9th (recitation). Basics of probability; data fitting and maximum likelihood inference. References: PRML chapter 1.
  • Sept 10th. Bayesian networks, naive Bayes, hidden Markov models. References: Murphy chapters 10, 17, and 3.
  • Sept 16th (recitation). Markov chains and PageRank. References: Murphy sections 17.2 and 17.4; lecture notes on Markov chains.
  • Sept 17th. Bayesian networks (cont.): the Bayes ball algorithm; undirected graphical models. References: Murphy chapter 10 and sections 19.1-19.4. Homework: HW 2 due 10/2.
  • Sept 23rd (recitation). Hidden Markov models and the Viterbi algorithm. References: book chapter on HMMs.
  • Sept 24th. EM for mixtures of Gaussians and for training HMMs. References: Bishop chapter 9 and https://web.stanford.edu/~jurafsky/slp3/A.pdf
  • Sept 30th (recitation). The Baum-Welch (EM) algorithm for HMMs. References: book chapter on HMMs.
  • Oct 1st. Belief propagation and the stochastic block model. References: Murphy chapter 20 and https://arxiv.org/pdf/1702.00467.pdf; see also Zhengdao's paper on community detection using GNNs.
  • Oct 7th (recitation). Belief propagation. References: lecture notes on BP and slides on BP.
  • Oct 8th. Introduction to error-correcting codes; introduction to sampling methods. References: MacKay chapters 1 and 47; Bishop chapter 11. Homework: HW 3 due October 22 (noon).
  • Oct 21st (recitation). Markov chain Monte Carlo (MCMC). References: lecture notes on MCMC.
  • Nov 4th (recitation). MCMC cont'd; the general EM algorithm. References: notes on EM.
  • Nov 5th. MCMC techniques for detecting gerrymandering; variational autoencoders. References: Gerrymandering and papers 1, 2; VAE tutorial and code example. Homework: project proposal due 11/13.
  • Nov 11th (recitation). EM cont'd; basics of neural networks.
  • Nov 12th. Variational inference. References: https://arxiv.org/abs/1601.00670
  • Nov 18th. GANs. References: https://papers.nips.cc/paper/5423-generative-adversarial-nets.pdf
  • Nov 19th. Wasserstein GAN and GLO. References: https://arxiv.org/pdf/1701.07875.pdf and https://arxiv.org/pdf/1707.05776.pdf
  • Nov 25th (recitation). Community detection and GNNs. References: community detection, GCN (Kipf & Welling), MPNN (Gilmer et al.).
  • Nov 26th. Gaussian processes. References: chapter 2 of http://www.gaussianprocess.org/gpml/chapters/
  • Dec 2nd (recitation). Representation learning with autoencoders and predictive coding. References: autoencoders, Contrastive Predictive Coding, Noise Contrastive Estimation.