Improving-Deep-Neural-Networks-Hyperparameter-tuning-Regularization-and-Optimization

Course: Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization, the second course of the Deep Learning Specialization. This repository contains all of the solved programming assignments. https://www.coursera.org/learn/neural-networks-deep-learning

Primary language: Jupyter Notebook · License: GNU General Public License v3.0 (GPL-3.0)

Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization | Certificate

This course will teach you the "magic" of getting deep learning to work well. Rather than the deep learning process being a black box, you will understand what drives performance, and be able to more systematically get good results. You will also learn TensorFlow.

After 3 weeks, you will:

  • Understand industry best-practices for building deep learning applications.
  • Be able to effectively use the common neural network "tricks", including initialization, L2 and dropout regularization, batch normalization, and gradient checking.
  • Be able to implement and apply a variety of optimization algorithms, such as mini-batch gradient descent, Momentum, RMSprop, and Adam, and check for their convergence (see the sketch after this list).
  • Understand new best-practices for the deep learning era of how to set up train/dev/test sets and analyze bias/variance.
  • Be able to implement a neural network in TensorFlow.
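
As a rough illustration of how these pieces fit together, here is a minimal sketch (not taken from the course notebooks) combining He initialization, L2 and dropout regularization, batch normalization, and mini-batch gradient descent with Adam in TensorFlow/Keras. The layer sizes, rates, and synthetic data are illustrative assumptions only.

```python
import numpy as np
import tensorflow as tf

# Synthetic binary-classification data (placeholder for a real dataset).
X = np.random.randn(1000, 20).astype("float32")
y = (X[:, 0] + X[:, 1] > 0).astype("float32")

model = tf.keras.Sequential([
    tf.keras.Input(shape=(20,)),
    tf.keras.layers.Dense(
        64,
        activation="relu",
        kernel_initializer="he_normal",                     # weight initialization
        kernel_regularizer=tf.keras.regularizers.l2(0.01),  # L2 regularization
    ),
    tf.keras.layers.BatchNormalization(),                   # batch normalization
    tf.keras.layers.Dropout(0.5),                           # dropout regularization
    tf.keras.layers.Dense(1, activation="sigmoid"),
])

# Adam combines Momentum and RMSprop; batch_size=64 gives mini-batch
# gradient descent rather than a single full-batch update per epoch.
model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=0.001),
    loss="binary_crossentropy",
    metrics=["accuracy"],
)
model.fit(X, y, epochs=5, batch_size=64, validation_split=0.2, verbose=0)
```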

This is the second course of the Deep Learning Specialization.

Schedule

Week 1

Practical aspects of Deep Learning

Week 2

Optimization algorithms

Week 3

Hyperparameter tuning, Batch Normalization and Programming Frameworks