This short lesson summarizes key takeaways from section 44.
You will be able to:
- Understand and explain what was covered in this section
- Understand and explain why this section will help you become a data scientist
The key takeaways from this section include:
- A Sequence Model is a general term for a class of Deep Neural Networks that take a sequence of data (such as a time series) as input
- In the deep learning context, sequence models are also referred to as "Recurrent Neural Networks" (RNNs)
- Sequence Models are often used for tasks such as text classification and sequence generation
- A basic Recurrent Neural Network is a neural network that passes its output from a given example back into itself as input for the next example (which feels a little bit like the time series models we've seen)
- RNN architectures use a special type of backpropagation referred to as "backpropagation through time"
- Two special types of RNNs are referred to as "Long Short-Term Memory" (LSTM) and "Gated Recurrent Unit" (GRU) models
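To make the "output passed back in as input" idea concrete, here is a minimal sketch of a single RNN step in NumPy. The weight names (`W_xh`, `W_hh`, `b_h`) and dimensions are illustrative assumptions, not part of the lesson; the key point is that the hidden state `h` computed at one time step is fed back in at the next.

```python
import numpy as np

def rnn_step(x_t, h_prev, W_xh, W_hh, b_h):
    """One basic RNN step: combine the current input with the previous hidden state."""
    return np.tanh(x_t @ W_xh + h_prev @ W_hh + b_h)

rng = np.random.default_rng(0)
input_dim, hidden_dim = 3, 4

# Randomly initialized (untrained) weights, just to show the mechanics
W_xh = rng.normal(size=(input_dim, hidden_dim))   # input -> hidden
W_hh = rng.normal(size=(hidden_dim, hidden_dim))  # hidden -> hidden (the recurrence)
b_h = np.zeros(hidden_dim)

h = np.zeros(hidden_dim)                  # initial hidden state
sequence = rng.normal(size=(5, input_dim))  # a toy sequence of 5 time steps

for x_t in sequence:
    # The same weights are reused at every step; only h carries information forward
    h = rnn_step(x_t, h, W_xh, W_hh, b_h)

print(h.shape)  # final hidden state, one value per hidden unit
```

In practice you would not write this loop by hand; libraries such as Keras provide `SimpleRNN`, `LSTM`, and `GRU` layers that implement these recurrences (and their trained weights) for you. "Backpropagation through time" amounts to unrolling this loop and backpropagating through every step.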