Deep Learning Specialization on Coursera
Master Deep Learning, and Break into AI
Instructor: Andrew Ng
This repo contains all my work for this specialization. All the code, quiz questions, screenshots, and images are taken from the Deep Learning Specialization on Coursera, unless specified otherwise.
Goals
- Learn the foundations of Deep Learning
- Understand how to build neural networks
- Learn how to lead successful machine learning projects
- Learn about Convolutional Networks, RNNs, LSTMs, Adam, Dropout, BatchNorm, Xavier/He initialization, and more.
- Work on case studies from healthcare, autonomous driving, sign language reading, music generation, and natural language processing.
- Practice all these ideas in Python and in TensorFlow.
Courses
Course 1: Neural Networks and Deep Learning
Course 2: Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization
Week 1 - Practical aspects of Deep Learning
Learning Objectives
- Recall that different types of initializations lead to different results
- Recognize the importance of initialization in complex neural networks
- Recognize the difference between train/dev/test sets
- Diagnose bias and variance issues in your model
- Learn when and how to use regularization methods such as dropout or L2 regularization (see the sketch after this list)
- Understand experimental issues in deep learning, such as vanishing or exploding gradients, and learn how to deal with them
- Use gradient checking to verify the correctness of your backpropagation implementation
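As a rough illustration of two of the Week 1 ideas (He initialization and inverted dropout), here is a minimal NumPy sketch. The function names, layer sizes, and the `keep_prob` value are illustrative assumptions, not the course's reference code.

```python
import numpy as np

def initialize_parameters_he(layer_dims, seed=0):
    """He initialization: weights drawn from N(0, 2 / fan_in), biases set to zero."""
    rng = np.random.default_rng(seed)
    parameters = {}
    for l in range(1, len(layer_dims)):
        parameters["W" + str(l)] = (
            rng.standard_normal((layer_dims[l], layer_dims[l - 1]))
            * np.sqrt(2.0 / layer_dims[l - 1])
        )
        parameters["b" + str(l)] = np.zeros((layer_dims[l], 1))
    return parameters

def dropout_forward(A, keep_prob=0.8, rng=None):
    """Inverted dropout: zero units with probability 1 - keep_prob, then
    rescale the survivors so the expected activation is unchanged."""
    rng = rng or np.random.default_rng()
    mask = (rng.random(A.shape) < keep_prob).astype(A.dtype)
    return (A * mask) / keep_prob, mask

# Illustrative 3-layer network: 4 input features, a batch of 5 examples
params = initialize_parameters_he([4, 8, 8, 1])
A1 = np.maximum(0, params["W1"] @ np.ones((4, 5)) + params["b1"])  # ReLU activations
A1_drop, mask1 = dropout_forward(A1, keep_prob=0.8)
```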
Week 2 - Optimization algorithms
Learning Objectives
- Remember different optimization methods such as (Stochastic) Gradient Descent, Momentum, RMSProp, and Adam
- Use random minibatches to accelerate convergence and improve optimization
- Know the benefits of learning rate decay and apply it to your optimization (see the sketch after this list)
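The sketch below, again in plain NumPy, shows one possible way to wire the Adam update together with inverse-time learning rate decay. The dict-of-arrays layout and the hyperparameter defaults are assumptions for illustration, not the assignments' reference implementation.

```python
import numpy as np

def adam_step(params, grads, state, t, learning_rate=0.001,
              beta1=0.9, beta2=0.999, epsilon=1e-8):
    """One Adam update: bias-corrected moving averages of the gradient (v)
    and of the squared gradient (s) drive the parameter update."""
    for key in params:
        v, s = state[key]
        v = beta1 * v + (1 - beta1) * grads[key]
        s = beta2 * s + (1 - beta2) * grads[key] ** 2
        v_hat = v / (1 - beta1 ** t)  # bias correction for the first moment
        s_hat = s / (1 - beta2 ** t)  # bias correction for the second moment
        params[key] = params[key] - learning_rate * v_hat / (np.sqrt(s_hat) + epsilon)
        state[key] = (v, s)
    return params, state

def decayed_learning_rate(lr0, epoch, decay_rate=1.0):
    """Inverse-time decay: lr = lr0 / (1 + decay_rate * epoch)."""
    return lr0 / (1 + decay_rate * epoch)

# Illustrative loop over a single weight matrix
params = {"W1": np.zeros((2, 2))}
grads = {"W1": np.ones((2, 2))}
state = {"W1": (np.zeros((2, 2)), np.zeros((2, 2)))}
for t in range(1, 4):
    lr = decayed_learning_rate(0.001, epoch=t - 1)
    params, state = adam_step(params, grads, state, t, learning_rate=lr)
```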
Try notebooks in the cloud
To try the example notebooks interactively in your web browser, click on the Binder link:
Contributing
Contributions are welcome! For bug reports or requests, please submit an issue.