CS229: Machine Learning Solutions
This repository compiles the problem sets, together with my solutions, for Stanford's graduate Machine Learning class (CS229), taught by Prof. Andrew Ng.
The problem sets are the ones assigned in the Fall 2017 offering of the class.
For each problem set, the solutions are provided as Jupyter (IPython) notebooks.
Problem Set 1: Supervised Learning
The first problem set deals with simple supervised learning models:
- Link to the problem set: Problem Set 1
The solutions to each exercise can be found in the following notebooks:
- Exercise 1: Logistic regression and Newton's method
- Exercise 2: Poisson regression, exponential family
- Exercise 3: Gaussian discriminant analysis
- Exercise 4: Linear invariance of optimization algorithms
- Exercise 5: Regression for denoising quasar spectra
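To give a flavor of what Exercise 1 covers, here is a minimal Newton's-method fit of logistic regression. This is a generic sketch on made-up toy data, not the notebook's solution; all variable names are illustrative:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def logreg_newton(X, y, n_iters=10):
    """Fit logistic regression by Newton's method, labels y in {0, 1}."""
    theta = np.zeros(X.shape[1])
    for _ in range(n_iters):
        h = sigmoid(X @ theta)                   # current predictions
        grad = X.T @ (h - y)                     # gradient of the log-loss
        H = (X * (h * (1 - h))[:, None]).T @ X   # Hessian: X^T diag(h(1-h)) X
        theta -= np.linalg.solve(H, grad)        # Newton update
    return theta

# toy 1-D example: intercept column plus one feature, overlapping classes
X = np.array([[1.0, -2.0], [1.0, -1.0], [1.0, -0.5],
              [1.0,  0.5], [1.0,  1.0], [1.0,  2.0]])
y = np.array([0.0, 0.0, 1.0, 0.0, 1.0, 1.0])
theta = logreg_newton(X, y)
```

Because Newton's method uses curvature information, a handful of iterations typically suffices where gradient descent would need many.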
Problem Set 2: Supervised Learning II
The second problem set continues exploring supervised learning, this time tackling more sophisticated models:
- Link to the problem set: Problem Set 2
The solutions to each exercise can be found in the following notebooks:
- Exercise 1: Logistic regression: training stability
- Exercise 2: Model Calibration
- Exercise 3: Bayesian Logistic Regression and weight decay
- Exercise 4: Constructing Kernels
- Exercise 5: Kernelizing the Perceptron
- Exercise 6: Spam classification
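For Exercise 4, one useful numerical sanity check is that the Gram matrix of any valid (Mercer) kernel is positive semidefinite, and that sums and positive scalings of valid kernels stay valid. A small sketch of that check, with all function names and data being my own illustrative choices:

```python
import numpy as np

# two standard base kernels, both valid Mercer kernels
def rbf(x, z, gamma=1.0):
    return np.exp(-gamma * np.sum((x - z) ** 2))

def linear(x, z):
    return float(x @ z)

def gram(kernel, X):
    """Gram matrix K[i, j] = kernel(x_i, x_j)."""
    m = len(X)
    return np.array([[kernel(X[i], X[j]) for j in range(m)] for i in range(m)])

rng = np.random.default_rng(0)
X = rng.normal(size=(8, 3))

# a sum and positive scaling of valid kernels is again a valid kernel,
# so its Gram matrix must be symmetric positive semidefinite
K = gram(lambda x, z: rbf(x, z) + 2.0 * linear(x, z), X)
min_eig = np.linalg.eigvalsh(K).min()
```

Checking the smallest eigenvalue of the Gram matrix on random points is a quick (though not conclusive) way to catch an invalid kernel construction.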
Problem Set 3: Deep Learning & Unsupervised Learning
The third problem set explores deep learning and unsupervised learning:
- Link to the problem set: Problem Set 3
The solutions to each exercise can be found in the following notebooks:
- Exercise 1: A Simple Neural Network
- Exercise 2: Expectation-Maximization for MAP Estimation
- Exercise 3: Expectation-Maximization Application: Paper Reviews
- Exercise 4: KL divergence and Maximum Likelihood
- Exercise 5: K-means for Compression
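Exercise 5 uses k-means to compress an image by replacing every pixel with the centroid of its cluster. A generic sketch of the idea on synthetic "pixels" (not the notebook's code; the data and names are illustrative):

```python
import numpy as np

def kmeans(X, k, n_iters=20, seed=0):
    """Plain k-means: returns final centroids and per-point assignments."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iters):
        # assignment step: each point goes to its nearest centroid
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # update step: each centroid moves to the mean of its points
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return centroids, labels

# two well-separated blobs of RGB-like "pixels"
rng = np.random.default_rng(1)
pixels = np.vstack([rng.normal(0.2, 0.05, size=(50, 3)),
                    rng.normal(0.8, 0.05, size=(50, 3))])
centroids, labels = kmeans(pixels, k=2)
compressed = centroids[labels]   # each pixel replaced by its centroid
```

For an actual image, storing `k` centroids plus one small integer label per pixel is far cheaper than storing a full color per pixel, which is the compression angle of the exercise.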
Problem Set 4: Expectation Maximization, Deep Learning & Reinforcement Learning
The fourth and final problem set explores deep learning, reinforcement learning, and unsupervised learning:
- Link to the problem set: Problem Set 4
The solutions to each exercise can be found in the following notebooks:
- Exercise 1: Neural Networks: MNIST image classification
- Exercise 2: Expectation-Maximization Convergence
- Exercise 3: Principal Component Analysis
- Exercise 4: Independent Component Analysis
- Exercise 5: Markov Decision Processes
- Exercise 6: Reinforcement Learning: the Inverted Pendulum
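The central tool behind Exercise 5 is value iteration on the Bellman optimality equation. A minimal sketch on a made-up two-state MDP (everything here is illustrative, not the problem set's MDP):

```python
import numpy as np

def value_iteration(P, R, gamma=0.9, tol=1e-8):
    """P[a, s, s'] = transition probability, R[s] = reward in state s."""
    V = np.zeros(R.shape[0])
    while True:
        # Bellman backup: V(s) = R(s) + gamma * max_a sum_s' P(s'|s,a) V(s')
        V_new = (R[None, :] + gamma * (P @ V)).max(axis=0)
        if np.max(np.abs(V_new - V)) < tol:
            return V_new
        V = V_new

# two states, two actions ("stay" and "switch"); only state 1 pays reward
P = np.array([[[1.0, 0.0], [0.0, 1.0]],    # action 0: stay put
              [[0.0, 1.0], [1.0, 0.0]]])   # action 1: switch states
R = np.array([0.0, 1.0])
V = value_iteration(P, R)
# optimal values: V(1) = 1/(1 - 0.9) = 10, V(0) = 0.9 * V(1) = 9
```

Value iteration converges geometrically at rate `gamma`, which is why the loop can terminate on a small sup-norm change between sweeps.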