University of Toronto's CSC412: Probabilistic Machine Learning course. In Winter 2020, it was the same course as STA414: Statistical Methods for Machine Learning II.
I took this course in Winter 2020 with David Duvenaud and Jesse Bettencourt. There were 4 programming assignments and a midterm (the final was canceled due to COVID-19). The course introduces machine learning from a probabilistic point of view. Topics include Graphical Models, Message Passing, Variational Inference, Amortized Inference, etc.
The course structure was not very mature when I took it -- there were no lecture slides, and one of the assignments assumed familiarity with multi-layer perceptrons, which were not a prerequisite for this course. As a stats student at the time, I really struggled with coding and debugging, but if you have some experience with numpy it will be much easier. I would recommend taking CSC311 before this course.
https://probmlcourse.github.io/csc412 (may be deactivated in the future)
PA0: Practice some prerequisites like manual and automatic differentiation, and get Julia, LaTeX, and Atom up and running.
PA1: Basics of decision theory and gradient-based model fitting.
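The idea behind PA1's gradient-based model fitting can be sketched in a few lines. The assignment itself was in Julia with a different model; this is a hypothetical pure-Python toy (the linear model and data are made up for illustration), minimizing mean squared error by gradient descent:

```python
# Toy sketch of gradient-based model fitting (illustrative only;
# the actual assignment used Julia and a different model).
# Fit w in y = w * x by gradient descent on mean squared error.

xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]  # generated with true w = 2

w = 0.0    # initial parameter
lr = 0.01  # learning rate

for _ in range(1000):
    # gradient of (1/n) * sum_i (w*x_i - y_i)^2 with respect to w
    grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
    w -= lr * grad

print(round(w, 3))  # converges to 2.0
```

In the assignment the gradients came from automatic differentiation (practiced in PA0) rather than being derived by hand as above.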
PA2: Bayesian inference in large models with continuous latent variables and stochastic variational inference. Implemented a variant of the TrueSkill model.
PA3: Implemented a Variational Autoencoder (VAE) from scratch on binarized MNIST digits for classification, generation, and prediction (of the bottom half given the top half).
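The common core of PA2's stochastic variational inference and PA3's VAE is a Monte Carlo estimate of the ELBO via the reparameterization trick: writing z = mu + sigma * eps with eps ~ N(0, 1) lets gradients flow through the sampling step. A minimal sketch under assumed toy distributions (not the assignments' Julia code), with a 1-D Gaussian latent:

```python
import math
import random

# Toy single-sample ELBO estimate for a 1-D latent variable.
# Assumed model: p(z) = N(0, 1), p(x|z) = N(z, 1);
# approximate posterior q(z|x) = N(mu, sigma^2).

def log_normal_pdf(x, mean, std):
    """Log density of N(mean, std^2) at x."""
    return -0.5 * math.log(2 * math.pi) - math.log(std) - 0.5 * ((x - mean) / std) ** 2

def elbo_estimate(x, mu, sigma):
    eps = random.gauss(0.0, 1.0)
    z = mu + sigma * eps                     # reparameterized sample from q(z|x)
    log_prior = log_normal_pdf(z, 0.0, 1.0)  # log p(z)
    log_lik = log_normal_pdf(x, z, 1.0)      # log p(x|z)
    log_q = log_normal_pdf(z, mu, sigma)     # log q(z|x)
    return log_prior + log_lik - log_q       # single-sample ELBO

random.seed(0)
print(elbo_estimate(x=1.0, mu=0.5, sigma=0.8))
```

Averaged over many samples, this estimate lower-bounds log p(x); in the assignments, mu and sigma were produced by a learned model (the encoder network, in the VAE case) and updated by gradient ascent on this estimate.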