- Hyperparameter Tuning, Regularization and Optimization
- Recall that different weight-initialization schemes (e.g. zeros, random, He) lead to different training results
- Set up a machine learning application: splitting data into train/dev/test sets
- Diagnose bias and variance issues in DL models (the bias-variance trade-off)
- Apply regularization methods such as L2 regularization and dropout
- Deal with training issues such as vanishing gradients in DL models
- Recall various optimization methods such as (stochastic) gradient descent (SGD), Momentum, RMSProp, and Adam
- Use random mini-batches to accelerate convergence and improve optimization
- Use gradient checking to verify the correctness of a backpropagation implementation
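The initialization objective above can be sketched as follows. This is a minimal He-initialization sketch, not the notebooks' actual code; the function name and the example layer sizes are illustrative.

```python
import numpy as np

def initialize_he(layer_dims, seed=0):
    """He initialization: weights drawn from N(0, 1) and scaled by
    sqrt(2 / fan_in), biases set to zero.

    `layer_dims` is a hypothetical list of layer sizes, e.g. [4, 3, 1].
    """
    rng = np.random.default_rng(seed)
    params = {}
    for l in range(1, len(layer_dims)):
        fan_in = layer_dims[l - 1]
        params[f"W{l}"] = rng.standard_normal((layer_dims[l], fan_in)) * np.sqrt(2.0 / fan_in)
        params[f"b{l}"] = np.zeros((layer_dims[l], 1))
    return params
```

The `sqrt(2 / fan_in)` factor keeps activation variance roughly constant across ReLU layers, which is why zeros or large random initializations train noticeably worse.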
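The two regularization methods named above can be sketched as below; function names and signatures are illustrative assumptions, not the notebooks' API.

```python
import numpy as np

def l2_regularized_cost(cross_entropy_cost, params, lambd, m):
    """Add the L2 penalty (lambd / (2*m)) * sum_l ||W_l||_F^2 to the base cost."""
    l2 = sum(np.sum(np.square(W)) for name, W in params.items() if name.startswith("W"))
    return cross_entropy_cost + (lambd / (2 * m)) * l2

def dropout_forward(A, keep_prob, rng):
    """Inverted dropout: zero each unit with probability 1 - keep_prob,
    then divide by keep_prob so expected activations are unchanged."""
    mask = (rng.random(A.shape) < keep_prob).astype(A.dtype)
    return (A * mask) / keep_prob, mask
```

L2 regularization shrinks weights toward zero at every update; inverted dropout needs no rescaling at test time because the `/ keep_prob` correction is applied during training.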
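Of the optimizers listed above, Adam combines Momentum's first-moment average with RMSProp's second-moment scaling. A single-parameter update step might look like this sketch (names and default hyperparameters are the commonly published ones, not taken from the notebooks):

```python
import numpy as np

def adam_update(theta, grad, v, s, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam step with bias-corrected first (v) and second (s) moment estimates.

    t is the 1-indexed step count used for bias correction.
    """
    v = beta1 * v + (1 - beta1) * grad            # Momentum-style moving average
    s = beta2 * s + (1 - beta2) * grad ** 2       # RMSProp-style squared average
    v_hat = v / (1 - beta1 ** t)                  # correct startup bias toward zero
    s_hat = s / (1 - beta2 ** t)
    theta = theta - lr * v_hat / (np.sqrt(s_hat) + eps)
    return theta, v, s
```

Setting `beta1 = 0` recovers RMSProp, and setting `beta2` so that `s_hat` is ignored recovers plain Momentum, which is why Adam is often described as the combination of the two.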
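The mini-batch objective can be sketched as a shuffle-then-partition helper; this assumes the column-per-example layout (`X` of shape `(n_x, m)`) common in these notebooks, and the function name is illustrative.

```python
import numpy as np

def random_mini_batches(X, Y, batch_size=64, seed=0):
    """Shuffle the m example columns of X (n_x, m) and Y (1, m) together,
    then split them into batches of at most `batch_size` columns."""
    rng = np.random.default_rng(seed)
    m = X.shape[1]
    perm = rng.permutation(m)
    X_shuf, Y_shuf = X[:, perm], Y[:, perm]
    return [(X_shuf[:, k:k + batch_size], Y_shuf[:, k:k + batch_size])
            for k in range(0, m, batch_size)]
```

Shuffling before each epoch decorrelates successive gradient estimates, and the final batch is simply smaller when `batch_size` does not divide `m`.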
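Gradient checking compares an analytic gradient against a centered finite-difference estimate; a minimal sketch over a flat parameter vector (helper names are hypothetical):

```python
import numpy as np

def grad_check(f, grad_f, theta, eps=1e-7):
    """Return the relative error between grad_f(theta) and the numerical
    gradient (f(theta+eps*e_i) - f(theta-eps*e_i)) / (2*eps)."""
    numeric = np.zeros_like(theta)
    for i in range(theta.size):
        theta_plus = theta.copy()
        theta_plus[i] += eps
        theta_minus = theta.copy()
        theta_minus[i] -= eps
        numeric[i] = (f(theta_plus) - f(theta_minus)) / (2 * eps)
    analytic = grad_f(theta)
    return np.linalg.norm(analytic - numeric) / (np.linalg.norm(analytic) + np.linalg.norm(numeric))
```

A relative error around 1e-7 or smaller usually indicates a correct backpropagation implementation; errors near 1e-3 or larger suggest a bug in one of the gradient terms.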
- Initialization.ipynb
- Regularization.ipynb
- Gradient Checking v1.ipynb
- Optimization methods.ipynb
- Tensorflow Tutorial.ipynb