Introduction

This lesson summarizes the topics we'll be covering in section 46 and why they'll be important to you as a data scientist.

Objectives

You will be able to:

  • Understand and explain what is covered in this section
  • Understand and explain why the section will help you to become a data scientist

Deep NLP - Sequence Models

In this section, we'll continue our deep NLP learning path with the notion of sequence models.

Sequence Model Use Cases

We'll kick off this section by introducing sequence models and what makes them different from traditional multi-layer perceptrons (classical densely connected neural networks). You'll learn about use cases for sequence models, such as applications with text, and you'll see that deep learning sequence models are also referred to as "Recurrent Neural Networks".
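The defining difference from a dense layer is that a recurrent layer reuses one set of weights at every time step and carries a hidden state forward, so the order of the inputs matters. A minimal sketch in NumPy (the sizes and random weights here are hypothetical, purely for illustration):

```python
import numpy as np

# Hypothetical sizes, chosen only for illustration.
rng = np.random.default_rng(0)
seq_len, input_dim, hidden_dim = 5, 3, 4

# Shared weights, reused at every time step -- the defining trait of an RNN.
W_x = rng.normal(size=(hidden_dim, input_dim)) * 0.1
W_h = rng.normal(size=(hidden_dim, hidden_dim)) * 0.1
b = np.zeros(hidden_dim)

x = rng.normal(size=(seq_len, input_dim))  # one sequence of 5 time steps
h = np.zeros(hidden_dim)                   # initial hidden state

for t in range(seq_len):
    # The hidden state carries information from earlier time steps forward.
    h = np.tanh(W_x @ x[t] + W_h @ h + b)

print(h.shape)  # the final hidden state summarizes the whole sequence
```

A dense layer applied to the same data would need a separate weight for every position and would produce the same output under any reordering of the time steps; the recurrence above does not.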

Understanding Recurrent Neural Networks

Next, you'll dive deeper into understanding Recurrent Neural Networks (RNNs). You'll learn about basic RNN structures and about an adapted form of backpropagation referred to as "backpropagation through time".
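Backpropagation through time simply unrolls the recurrence and applies the chain rule backwards across the time steps. As a sketch, here is a scalar RNN (hypothetical numbers, chosen only to make the gradient concrete) whose hand-rolled backward pass is checked against a numerical gradient:

```python
import numpy as np

# A scalar RNN: h_t = tanh(w * x_t + u * h_{t-1}).
x = np.array([0.5, -0.3, 0.8])
w, u = 0.7, 0.4

def forward(u):
    h, hs = 0.0, [0.0]          # hs[t] is the state entering step t
    for x_t in x:
        h = np.tanh(w * x_t + u * h)
        hs.append(h)
    return h, hs

# Backpropagation through time: walk the unrolled steps in reverse,
# accumulating the gradient of the final state with respect to u.
loss, hs = forward(u)
grad, dh = 0.0, 1.0             # d(loss)/d(h_T) = 1
for t in reversed(range(len(x))):
    pre = w * x[t] + u * hs[t]          # pre-activation at step t
    dpre = dh * (1 - np.tanh(pre) ** 2)
    grad += dpre * hs[t]                # u's local contribution at step t
    dh = dpre * u                       # gradient flowing back to h_{t-1}

# Sanity check against a central finite difference.
eps = 1e-6
numeric = (forward(u + eps)[0] - forward(u - eps)[0]) / (2 * eps)
print(grad, numeric)
```

The repeated multiplication by `u` (and the tanh derivative) on the `dh` line is also where the vanishing-gradient problem comes from, which motivates the LSTM and GRU cells covered next.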

LSTMs and GRUs

Additionally, you'll learn about two advanced and special types of neurons in RNNs that typically outperform basic RNNs: "Long Short-Term Memory" (LSTM) cells and "Gated Recurrent Units" (GRUs)! You'll explore the problems they solve, and compare and contrast the two neuron types to get a feel for what exactly they do and how they do it!
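Both cell types use learned gates to control how much of the old hidden state is kept versus overwritten, which is what lets gradients survive over long sequences. As a taste of what's ahead, here is a single GRU step in NumPy (weight sizes and values are hypothetical, for illustration only):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gru_step(x_t, h_prev, W, U, b):
    """One GRU step; W, U, b stack the update (z), reset (r),
    and candidate (n) parameters along their first axis."""
    W_z, W_r, W_n = W
    U_z, U_r, U_n = U
    b_z, b_r, b_n = b
    z = sigmoid(W_z @ x_t + U_z @ h_prev + b_z)        # update gate: keep vs. replace
    r = sigmoid(W_r @ x_t + U_r @ h_prev + b_r)        # reset gate: how much history to use
    n = np.tanh(W_n @ x_t + U_n @ (r * h_prev) + b_n)  # candidate state
    return (1 - z) * n + z * h_prev                    # gated blend of old and new

# Hypothetical sizes for a quick smoke test.
rng = np.random.default_rng(1)
input_dim, hidden_dim = 3, 4
W = rng.normal(size=(3, hidden_dim, input_dim)) * 0.1
U = rng.normal(size=(3, hidden_dim, hidden_dim)) * 0.1
b = np.zeros((3, hidden_dim))

h = np.zeros(hidden_dim)
for x_t in rng.normal(size=(5, input_dim)):
    h = gru_step(x_t, h, W, U, b)
print(h.shape)
```

An LSTM works on the same gating idea but keeps a separate cell state and uses three gates (input, forget, output) instead of two, a contrast you'll examine in detail in this section.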

Bidirectional Sequence Models

In the last part of this section, you'll learn about so-called "bi-directional RNNs", and why they excel at NLP tasks.
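The idea is simple: run one RNN over the sequence left to right, a second one right to left, and combine their states, so every position sees both its past and its future context — which is exactly what many NLP tasks need. A minimal sketch with plain tanh RNNs (all sizes and weights hypothetical):

```python
import numpy as np

def run_rnn(x, W_x, W_h, b):
    """Plain tanh RNN over a sequence; returns the hidden state at every step."""
    h = np.zeros(W_h.shape[0])
    states = []
    for x_t in x:
        h = np.tanh(W_x @ x_t + W_h @ h + b)
        states.append(h)
    return np.array(states)

# Hypothetical sizes; the two directions get independent weights.
rng = np.random.default_rng(2)
seq_len, input_dim, hidden_dim = 5, 3, 4
x = rng.normal(size=(seq_len, input_dim))

shapes = [(hidden_dim, input_dim), (hidden_dim, hidden_dim), (hidden_dim,)]
params_fwd = [rng.normal(size=s) * 0.1 for s in shapes]
params_bwd = [rng.normal(size=s) * 0.1 for s in shapes]

h_fwd = run_rnn(x, *params_fwd)              # reads the sequence left to right
h_bwd = run_rnn(x[::-1], *params_bwd)[::-1]  # reads right to left, re-aligned

# Each position now carries context from both directions.
h_bi = np.concatenate([h_fwd, h_bwd], axis=1)
print(h_bi.shape)  # (seq_len, 2 * hidden_dim)
```

In practice the per-direction cells are usually LSTMs or GRUs rather than plain tanh RNNs, but the concatenation of the two directions works the same way.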

Summary

In this section, you'll dive deeper into NLP in combination with deep Recurrent Neural Networks!