
My code for the Stanford ML course, organised with notes.

Primary language: MATLAB

Stanford ML

CH1 (Linear Regression):
1. Linear Regression algorithm
2. Batch Gradient Descent algorithm (see the sketch below)
3. Linear Algebra Basics
4. Syntax for MATLAB
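
A minimal sketch of batch gradient descent for univariate linear regression (items 1-2). The variable names follow the usual course conventions, but the data values, learning rate and iteration count are made up for illustration.

```matlab
% Batch gradient descent for linear regression -- toy data for illustration.
X = [1 1; 1 2; 1 3; 1 4];      % design matrix with a leading column of ones
y = [2; 4; 6; 8];              % targets
theta = zeros(2, 1);           % parameters, initialised to zero
alpha = 0.05;                  % learning rate (made-up value)
m = length(y);

for iter = 1:1000
    grad  = (1/m) * X' * (X * theta - y);   % gradient of the squared-error cost
    theta = theta - alpha * grad;           % simultaneous update of all parameters
end

J = (1/(2*m)) * sum((X * theta - y).^2);    % final cost J(theta)
fprintf('theta = [%.3f; %.3f], J = %.5f\n', theta(1), theta(2), J);
```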

CH2 (Multivariate Regression):
1. Multiple Features
2. Feature Scaling
3. Mean Normalisation
4. J(θ) vs. number of iterations: convergence behaviour for different learning rates (see the sketch below)
5. Polynomial Regression
6. Normal Equation vs Gradient Descent
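
A short sketch combining items 2-4: mean normalisation / feature scaling followed by gradient descent, recording J(θ) per iteration so convergence can be compared across learning rates. The house-price-style numbers and the learning rates tried are made up; the elementwise normalisation relies on implicit expansion (MATLAB R2016b+ or Octave).

```matlab
% Feature scaling + gradient descent, tracking J(theta) per iteration.
X = [2104 3; 1600 3; 2400 3; 1416 2; 3000 4];   % two made-up features
y = [400; 330; 369; 232; 540];
m = length(y);

mu = mean(X);  sigma = std(X);
X_norm = [ones(m,1), (X - mu) ./ sigma];        % mean normalisation + scaling, plus bias column

for alpha = [0.01 0.1 0.3]                      % learning rates to compare
    theta = zeros(3, 1);
    J_history = zeros(50, 1);
    for iter = 1:50
        theta = theta - (alpha/m) * X_norm' * (X_norm * theta - y);
        J_history(iter) = (1/(2*m)) * sum((X_norm * theta - y).^2);
    end
    plot(1:50, J_history); hold on;             % J(theta) vs. number of iterations
    fprintf('alpha = %.2f: J after 50 iterations = %.2f\n', alpha, J_history(end));
end
xlabel('iteration'); ylabel('J(\theta)');
legend('\alpha = 0.01', '\alpha = 0.1', '\alpha = 0.3');
```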

CH3 (Logistic Regression and Regularisation):
1. Threshold Classification
2. Logistic Regression Model/Sigmoid function
3. Decision Boundary (Linear and Non-Linear)
4. Logistic Regression cost function and curves
5. Gradient Descent for Logistic Regression
6. fminunc for cost minimisation using the gradient (see the sketch below)
7. One-vs-all multi-class classification
8. Regularisation/Overfitting/Underfitting problem
9. Regularised Logistic Regression
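
A sketch of the costFunction-plus-fminunc pattern from items 4-6 (unregularised, to keep it short). The function and variable names mirror the course conventions, but the toy dataset is made up; fminunc needs the Optimization Toolbox in MATLAB (it is built into Octave).

```matlab
function logistic_demo()
    % Unregularised logistic regression on a tiny made-up dataset,
    % minimised with fminunc using the analytic gradient.
    X = [ones(6,1), (1:6)'];          % bias column + one feature
    y = [0; 0; 1; 0; 1; 1];           % overlapping labels, so the optimum is finite

    options = optimset('GradObj', 'on', 'MaxIter', 400);
    [theta, J] = fminunc(@(t) costFunction(t, X, y), zeros(2,1), options);
    fprintf('theta = [%.3f; %.3f], J = %.4f\n', theta(1), theta(2), J);
end

function g = sigmoid(z)
    g = 1 ./ (1 + exp(-z));
end

function [J, grad] = costFunction(theta, X, y)
    m = length(y);
    h = sigmoid(X * theta);                                % hypothesis
    J = (1/m) * (-y' * log(h) - (1 - y)' * log(1 - h));    % logistic cost
    grad = (1/m) * X' * (h - y);                           % its gradient
end
```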

CH4 (Intro to Neural Networks):
1. Neuron Model (see the sketch below)
2. Neural Network matrix
3. One-vs-all 
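
A sketch of the neuron model applied layer by layer, i.e. forward propagation through one hidden layer with the weights stored as Θ matrices (items 1-2). The input vector and the randomly initialised Theta1/Theta2 are placeholders, not trained values.

```matlab
% Forward propagation through a 3-4-1 network with sigmoid activations.
sigmoid = @(z) 1 ./ (1 + exp(-z));

x = [0.5; -1.2; 2.0];            % one input example (3 features) -- made up
Theta1 = randn(4, 4) * 0.1;      % 4 hidden units x (3 inputs + bias)
Theta2 = randn(1, 5) * 0.1;      % 1 output unit  x (4 hidden units + bias)

a1 = [1; x];                     % input activations with bias unit
z2 = Theta1 * a1;
a2 = [1; sigmoid(z2)];           % hidden activations with bias unit
z3 = Theta2 * a2;
h  = sigmoid(z3);                % network output h(x)

fprintf('h(x) = %.4f\n', h);
```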

CH5 (Neural Networks):
1. Cost Function for Neural Network
2. Gradient Computations
3. Backpropagation Algorithm
4. Forward Propagation
5. Numerical Estimation of Gradients (gradient checking; see the sketch below)
6. Random Initialisation: Symmetry Breaking
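
A sketch of item 5, numerical gradient checking: the analytic gradient of a simple squared-error cost is compared against a centred finite-difference estimate, which is the same check one would run against backpropagation gradients. The data and parameter values are made up.

```matlab
% Gradient checking: analytic vs. centred finite-difference gradients.
X = [1 1; 1 2; 1 3];  y = [1; 2; 2];  m = length(y);
theta = [0.5; -0.3];
cost = @(t) (1/(2*m)) * sum((X * t - y).^2);

analytic_grad = (1/m) * X' * (X * theta - y);

epsilon = 1e-4;
numeric_grad = zeros(size(theta));
for i = 1:numel(theta)
    e = zeros(size(theta));  e(i) = epsilon;
    numeric_grad(i) = (cost(theta + e) - cost(theta - e)) / (2 * epsilon);
end

% A relative difference around 1e-9 or smaller suggests the gradient is correct.
rel_diff = norm(analytic_grad - numeric_grad) / norm(analytic_grad + numeric_grad);
fprintf('relative difference = %.3e\n', rel_diff);
```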

CH6 (ML System Design):
1. Model Selection
2. Training/Cross Validation/Test Sets
3. Error vs Degree of Polynomial
4. Diagnosing a Bias vs. Variance Problem
5. Choosing Regularisation Parameter
6. Precision and Recall (see the sketch below)
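
A sketch of item 6: precision, recall and F1 computed from hypothetical predicted and actual labels, with 1 marking the rare/positive class.

```matlab
% Precision, recall and F1 score on made-up labels.
y_true = [1 0 0 1 1 0 0 0 1 0];
y_pred = [1 0 1 1 0 0 0 0 1 0];

tp = sum((y_pred == 1) & (y_true == 1));   % true positives
fp = sum((y_pred == 1) & (y_true == 0));   % false positives
fn = sum((y_pred == 0) & (y_true == 1));   % false negatives

precision = tp / (tp + fp);
recall    = tp / (tp + fn);
F1        = 2 * precision * recall / (precision + recall);
fprintf('precision = %.2f, recall = %.2f, F1 = %.2f\n', precision, recall, F1);
```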

CH7 (Support Vector Machines):
1. SVM Hypothesis
2. SVM Decision Boundary and its math
3. Non-linear Decision Boundary
4. Kernels and Similarity (see the sketch below)
5. Choosing Kernel Landmarks
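
A sketch of item 4, the Gaussian (RBF) kernel as a similarity measure between an example x and a landmark l. The points and the bandwidth sigma are made-up values.

```matlab
% Gaussian kernel similarity between an example and a landmark.
gaussian_kernel = @(x, l, sigma) exp(-sum((x - l).^2) / (2 * sigma^2));

x = [1; 2];  l = [0; 4];  sigma = 2;   % made-up example, landmark and bandwidth
fprintf('similarity = %.4f\n', gaussian_kernel(x, l, sigma));
% Similarity approaches 1 as x nears the landmark and decays towards 0 with
% distance; a smaller sigma makes the decay sharper.
```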

CH8 (Clustering and Dimension Reduction):
1. Unsupervised Learning - K-means algorithm
2. Principal Component Analysis and its math (see the sketch below)
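
A sketch of item 2: PCA on mean-normalised data via the SVD of the covariance matrix, followed by projection onto the top k components and reconstruction. The data matrix and k are made up; the subtraction X - mu relies on implicit expansion (MATLAB R2016b+ or Octave).

```matlab
% PCA via SVD of the covariance matrix, then projection and reconstruction.
X = [2.5 2.4; 0.5 0.7; 2.2 2.9; 1.9 2.2; 3.1 3.0; 2.3 2.7];   % made-up data
m = size(X, 1);

mu = mean(X);
X_norm = X - mu;                     % mean normalisation

Sigma = (1/m) * (X_norm' * X_norm);  % covariance matrix
[U, S, ~] = svd(Sigma);              % columns of U are the principal components

k = 1;
Z = X_norm * U(:, 1:k);              % project onto the top k components
X_rec = Z * U(:, 1:k)' + mu;         % approximate reconstruction from Z

s = diag(S);
fprintf('variance retained with k = %d: %.3f\n', k, sum(s(1:k)) / sum(s));
```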

CH9 (Anomaly Detection and Recommender Systems):
1. Density Estimation Intuition
2. Gaussian/Normal Distribution Math
3. Anomaly Detection Algorithm (see the sketch below)
4. Real-number evaluation of the algorithm
5. Non-gaussian features
6. Optimisation objective for recommender system
7. Collaborative Filtering Algorithm
8. Vectorised Example
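
A sketch of items 1-3: density estimation with independent per-feature Gaussians and an anomaly flagged when p(x) < ε. The training set, test point and threshold ε are made up for illustration.

```matlab
% Gaussian density estimation and anomaly flagging on made-up data.
X = [14 3; 15 4; 13 3.5; 14.5 3.2; 15.5 3.8; 14.2 3.1];   % "normal" examples
x_test = [20 1];                                          % candidate anomaly

mu = mean(X);
sigma2 = var(X, 1);                 % maximum-likelihood variance (normalise by m)

% p(x) as a product of per-feature univariate Gaussian densities
p = prod((1 ./ sqrt(2*pi*sigma2)) .* exp(-(x_test - mu).^2 ./ (2*sigma2)));

epsilon = 1e-3;                     % made-up threshold (in practice chosen on a CV set)
if p < epsilon
    fprintf('p(x) = %.2e < %.0e: flag as anomaly\n', p, epsilon);
else
    fprintf('p(x) = %.2e: looks normal\n', p);
end
```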