This tutorial covers some Machine Learning basics with video presentations and code implementations.
- Open the file on GitHub
- Click the "Open in Colab" button
- Launch Google Colab and start working!
Gradient Descent & Linear Regression
2.1 (8 mins) Watch [Cost Function - Andrew Ng](https://www.bilibili.com/video/BV1AD4y1Q7RH?p=5)
2.2 (11 mins) Watch [Gradient Descent - Andrew Ng](https://www.bilibili.com/video/BV1AD4y1Q7RH?p=5)
2.3 (12 mins) Watch Gradient Descent Intuition - Andrew Ng
2.4 (10 mins) Watch Gradient Descent for Linear Regression - Andrew Ng
2.5 (19 mins) Watch But what is a Neural Network? | Deep learning, Part 1
2.6 (Optional, 10 mins, math-heavy derivation) [Why the gradient is the direction of steepest ascent - Khan Academy](https://www.bilibili.com/video/BV1iE411K7qv)
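The cost-function and gradient-descent videos above boil down to a short loop; here is a minimal sketch (the toy data, learning rate, and iteration count are invented for illustration):

```python
import numpy as np

# Toy data drawn from a known line y = 2x + 1, plus a little noise
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=50)
y = 2.0 * X + 1.0 + rng.normal(0, 0.05, size=50)

w, b = 0.0, 0.0   # parameters to learn (slope and intercept)
alpha = 0.5       # learning rate
m = len(X)

for _ in range(2000):
    pred = w * X + b
    # Partial derivatives of the MSE cost J = 1/(2m) * sum((pred - y)^2)
    dw = (1.0 / m) * np.sum((pred - y) * X)
    db = (1.0 / m) * np.sum(pred - y)
    # Step downhill along each partial derivative
    w -= alpha * dw
    b -= alpha * db
```

After enough iterations, `w` and `b` should land close to the true slope and intercept used to generate the data.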
Gradient Descent, Forward & Backpropagation
3.1 (21 mins with 1.25x) Watch video Gradient Descent - 3Blue1Brown
3.2 (14 mins with 1.25x) Watch video Feedforward propagation - 3Blue1Brown
3.3 (10 mins with 1.25x) Watch video Backpropagation - 3Blue1Brown
3.4 (Optional) Watch video for better understanding Backpropagation 1 - Andrew Ng
3.5 (Optional) Watch video for better understanding Backpropagation 2 - Andrew Ng
3.6 (Optional) Read Feedforward propagation & Backpropagation - Mathematical Foundations of Deep Learning
3.7 (10 mins) Implement linear regression to see how weights are updated with PyTorch, with visualization. More details about PyTorch will be introduced in Module 3.
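As a taste of what the linear-regression notebook in 3.7 does, here is a hedged sketch (synthetic data, not the notebook's actual code): autograd performs the backward pass, and the weight update from the gradient-descent videos is written out by hand:

```python
import torch

# Synthetic data from the line y = 3x - 1
X = torch.linspace(0, 1, 20).unsqueeze(1)
y = 3 * X - 1

w = torch.zeros(1, requires_grad=True)
b = torch.zeros(1, requires_grad=True)
lr = 0.5

for _ in range(1000):
    pred = X * w + b                 # forward pass
    loss = ((pred - y) ** 2).mean()  # MSE cost
    loss.backward()                  # backward pass: fills w.grad and b.grad
    with torch.no_grad():            # manual gradient-descent update
        w -= lr * w.grad
        b -= lr * b.grad
        w.grad.zero_()               # clear gradients before the next step
        b.grad.zero_()
```

In practice the notebook may use `torch.optim.SGD` instead of the manual update, but the two are equivalent for plain gradient descent.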
Code Implementation: Forward/Backward Propagation
This notebook illustrates forward/backward propagation using the CBOW (continuous bag-of-words) model, a word-embedding model that learns to predict the center word from its surrounding context words.
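A tiny from-scratch sketch of one CBOW training step may help before reading the notebook; the vocabulary size, word indices, and learning rate below are invented, and the notebook's actual code will differ:

```python
import numpy as np

rng = np.random.default_rng(0)
V, D = 6, 4                           # vocabulary size, embedding dimension
W_in = rng.normal(0, 0.1, (V, D))     # input (context) embeddings
W_out = rng.normal(0, 0.1, (D, V))    # output projection

context = [1, 3]                      # indices of the context words
center = 2                            # index of the center word to predict
lr = 0.1

for _ in range(200):
    # Forward: average the context embeddings, project to vocab, softmax
    h = W_in[context].mean(axis=0)            # (D,)
    scores = h @ W_out                        # (V,)
    probs = np.exp(scores - scores.max())
    probs /= probs.sum()

    # Backward: cross-entropy gradient w.r.t. scores is probs - onehot(center)
    dscores = probs.copy()
    dscores[center] -= 1.0
    dW_out = np.outer(h, dscores)             # gradient for the projection
    dh = W_out @ dscores                      # gradient flowing back into h
    W_out -= lr * dW_out
    W_in[context] -= lr * dh / len(context)   # split gradient over context rows

# Final forward pass after training
h = W_in[context].mean(axis=0)
probs = np.exp(h @ W_out - (h @ W_out).max())
probs /= probs.sum()
```

After repeated updates on this single example, the model should assign the highest probability to the center word.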
Logistic regression
Code Implementation: Logistic regression
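Before opening the notebook, a minimal from-scratch sketch of logistic regression trained by gradient descent may be useful (the data and threshold here are invented, and the notebook's code will differ):

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy binary classification: label is 1 when x > 0.5
X = rng.uniform(0, 1, 100)
y = (X > 0.5).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

w, b, lr = 0.0, 0.0, 1.0

for _ in range(2000):
    p = sigmoid(w * X + b)      # predicted probability of class 1
    # Gradients of the average cross-entropy loss
    dw = np.mean((p - y) * X)
    db = np.mean(p - y)
    w -= lr * dw
    b -= lr * db

# Training accuracy with a 0.5 decision threshold
acc = float(np.mean((sigmoid(w * X + b) > 0.5) == y))
```

The same cross-entropy gradient form `(p - y)` reappears in the softmax/CBOW example above, which is why logistic regression is a good warm-up for neural networks.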