The assignment allows us to:

- Understand the different initialization methods and their impact on model performance.
- Implement zero initialization and see that it fails to "break symmetry".
- Recognize that random initialization "breaks symmetry" and yields more efficient models.
- Understand that combining random initialization with scaling gives even better training performance, as sketched below.
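As a rough sketch of these three schemes (not the assignment's exact code), the helper below initializes a fully connected network; the layer sizes, the large factor of 10 for plain random initialization, and the `sqrt(2 / n_prev)` scaling (He initialization) are illustrative assumptions:

```python
import numpy as np

def initialize_parameters(layer_dims, method="he"):
    """Initialize weights for a fully connected network.

    layer_dims: list of layer sizes, e.g. [n_x, n_h, n_y].
    method: "zeros" fails to break symmetry, "random" breaks it,
            "he" additionally scales by sqrt(2 / n_prev).
    """
    np.random.seed(3)
    parameters = {}
    for l in range(1, len(layer_dims)):
        n_prev, n_curr = layer_dims[l - 1], layer_dims[l]
        if method == "zeros":
            W = np.zeros((n_curr, n_prev))            # every unit computes the same thing
        elif method == "random":
            W = np.random.randn(n_curr, n_prev) * 10  # large values slow convergence
        else:                                         # "he"
            W = np.random.randn(n_curr, n_prev) * np.sqrt(2.0 / n_prev)
        parameters["W" + str(l)] = W
        parameters["b" + str(l)] = np.zeros((n_curr, 1))  # zero biases are fine
    return parameters

params = initialize_parameters([2, 4, 1], method="he")
print(params["W1"].shape, params["b1"].shape)  # (4, 2) (4, 1)
```

Note that only the weights need randomness: zero biases do not cause symmetry problems, because the random weights already make each unit compute a different function.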
The assignment allows us to:

- Understand the different regularization methods that could help our model.
- Implement dropout and see it work on data, as sketched below.
- Recognize that a model without regularization gives better accuracy on the training set but not necessarily on the test set.
- Understand that we can use both dropout and L2 regularization on our model.
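Here is a minimal sketch of inverted dropout applied to one layer's activations; the `keep_prob` value and the shape of the activation matrix are illustrative assumptions:

```python
import numpy as np

def dropout_forward(A, keep_prob=0.8):
    """Inverted dropout: randomly zero units, then rescale so the
    expected value of the activations is unchanged."""
    D = np.random.rand(*A.shape) < keep_prob  # boolean mask of kept units
    A = A * D / keep_prob                     # drop and rescale
    return A, D                               # the mask is reused in backprop

np.random.seed(1)
A1 = np.random.randn(3, 5)                    # illustrative hidden-layer activations
A1_dropped, D1 = dropout_forward(A1, keep_prob=0.8)
print(A1_dropped)
```

During backpropagation the same mask is applied to the gradients (`dA = dA * D / keep_prob`), and at test time dropout is switched off entirely.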
The assignment allows us to:

- Implement gradient checking from scratch.
- Understand how to use the difference formula to check our backpropagation implementation, as sketched below.
- Recognize that our backpropagation algorithm should give results close to those obtained with the difference formula.
- Learn how to identify which parameter's gradient was computed incorrectly.
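As a minimal sketch of the centered difference formula on a single scalar parameter, the cost J(θ) = θ·x below is an illustrative assumption, not the assignment's model:

```python
def forward(x, theta):
    return theta * x   # illustrative cost J(theta) = theta * x

def backward(x, theta):
    return x           # analytic gradient dJ/dtheta from "backprop"

def gradient_check(x, theta, epsilon=1e-7):
    # Centered difference approximation of the gradient
    grad_approx = (forward(x, theta + epsilon)
                   - forward(x, theta - epsilon)) / (2 * epsilon)
    grad = backward(x, theta)
    # Relative difference between analytic and numerical gradients
    return abs(grad - grad_approx) / max(abs(grad) + abs(grad_approx), 1e-12)

print(gradient_check(x=2.0, theta=4.0))  # tiny (around 1e-10): backprop looks correct
```

In a full network the same check is run on each parameter's gradient separately, which is how a single incorrectly computed gradient is pinpointed; a relative difference well below about 1e-7 is usually taken as a pass.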
The assignment allows us to:

- Understand the intuition behind Adam and RMSProp, as sketched below.
- Recognize the importance of mini-batch gradient descent.
- Learn the effects of momentum on the overall performance of our model.
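A minimal sketch of one Adam step on a single parameter shows how it combines the two ideas: a momentum-style moving average of gradients and an RMSProp-style moving average of squared gradients. The hyperparameter values are the common defaults, and the cost J(w) = w² is an illustrative assumption:

```python
import numpy as np

def adam_step(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: momentum (first moment) plus RMSProp-style
    scaling by the second moment, with bias correction."""
    m = beta1 * m + (1 - beta1) * grad            # momentum: moving average of gradients
    v = beta2 * v + (1 - beta2) * grad ** 2       # RMSProp: moving average of squared gradients
    m_hat = m / (1 - beta1 ** t)                  # bias correction for early steps
    v_hat = v / (1 - beta2 ** t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)   # scaled parameter update
    return w, m, v

w, m, v = 1.0, 0.0, 0.0
for t in range(1, 4):            # a few illustrative steps
    grad = 2 * w                 # gradient of J(w) = w**2
    w, m, v = adam_step(w, grad, m, v, t)
print(w)
```

Mini-batch gradient descent changes only where `grad` comes from: it is computed on a small batch of examples rather than the full training set, so updates happen more frequently.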
In this notebook we will learn the basics of TensorFlow. We will implement useful functions and draw parallels with what we did using NumPy in previous notebooks. We will understand what tensors and operations are, as well as how to execute them in a computation graph.
After completing this assignment we will also be able to implement our own deep learning models using TensorFlow.
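As a small illustration of the tensor/graph idea, the snippet below builds a tiny computation graph and runs it in a session. It assumes TensorFlow with the v1 compatibility API available (in TensorFlow 2.x the same operations execute eagerly unless eager execution is disabled, as here):

```python
import tensorflow as tf

# Graph-style execution: build the computation first, run it later.
tf.compat.v1.disable_eager_execution()

# Tensors behave like NumPy arrays, but are nodes in a graph until evaluated
a = tf.constant(2.0)
b = tf.constant(3.0)
c = a * b + 1.0                  # an operation: nothing is computed yet

with tf.compat.v1.Session() as sess:
    print(sess.run(c))           # executes the graph and prints 7.0
```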