rnn-notebooks

RNN (SimpleRNN, LSTM, GRU) notebooks for TensorFlow 2.0 & Keras (workshop materials)

class.vision

Slides

RNN.pdf

Video

Some parts are freely available on our Aparat channel, or you can purchase the full package of 32 videos in Persian from class.vision.

Notebooks

Intro to RNN

01_simple-RNN.ipynb
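
As a taste of what this notebook covers, here is a minimal sketch (not taken from the notebook itself; the shapes and the toy sum task are illustrative assumptions) of a SimpleRNN regressor:

```python
import numpy as np
from tensorflow.keras import layers, models

# A tiny SimpleRNN regressor: 10 timesteps, 1 feature per step.
model = models.Sequential([
    layers.SimpleRNN(32, input_shape=(10, 1)),  # 32 hidden units
    layers.Dense(1),                            # single regression output
])
model.compile(optimizer="adam", loss="mse")

# Toy task (illustrative): predict the sum of each input sequence.
x = np.random.rand(256, 10, 1).astype("float32")
y = x.sum(axis=1)
model.fit(x, y, epochs=2, batch_size=32, verbose=0)
```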

How can we run inference with different sequence lengths?

02_1_simple-RNN-diffrent-sequence-length.ipynb

02_2_simple-RNN-diffrent-sequence-length.ipynb
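
The core trick, as a minimal sketch (layer sizes are illustrative, not the notebooks' exact setup): an RNN's weights do not depend on sequence length, so declaring the time axis as None lets one trained model run on inputs of any length.

```python
import numpy as np
from tensorflow.keras import layers, models

# The time axis is declared as None, so any sequence length is accepted.
model = models.Sequential([
    layers.SimpleRNN(16, input_shape=(None, 1)),
    layers.Dense(1),
])

# The same (untrained) model runs on a 5-step and a 50-step sequence.
print(model.predict(np.random.rand(1, 5, 1)).shape)   # -> (1, 1)
print(model.predict(np.random.rand(1, 50, 1)).shape)  # -> (1, 1)
```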

Cryptocurrency prediction

  • When do we use return_sequences=True?
  • Stacked RNNs (deep RNNs)
  • Using an LSTM layer

03_1_Cryptocurrency-predicting.ipynb

03_2_Cryptocurrency-predicting.ipynb
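
A minimal sketch of the stacking pattern these bullets refer to (the input shape and output head are illustrative assumptions): every recurrent layer except the last sets return_sequences=True, so the next layer receives one output vector per timestep.

```python
from tensorflow.keras import layers, models

model = models.Sequential([
    # e.g. 60 past timesteps with 4 features each (illustrative)
    layers.LSTM(64, return_sequences=True, input_shape=(60, 4)),
    layers.LSTM(64, return_sequences=True),  # stacked (deep) RNN
    layers.LSTM(32),                         # last layer: final state only
    layers.Dense(1, activation="sigmoid"),   # e.g. price up/down
])
model.summary()
```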

CNN + LSTM for ball movement classification

  • What is the TimeDistributed layer in Keras?
  • Introduction to video classification
  • CNN + LSTM

04_simple-CNN-LSTM.ipynb
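
A minimal sketch of the TimeDistributed idea (the frame size and class count are invented): the same small CNN is applied to every frame of a clip, yielding one feature vector per timestep for the LSTM to read.

```python
from tensorflow.keras import layers, models

frames, h, w, c = 10, 64, 64, 3  # illustrative clip shape

model = models.Sequential([
    # TimeDistributed applies the wrapped layer to each of the 10 frames.
    layers.TimeDistributed(layers.Conv2D(16, 3, activation="relu"),
                           input_shape=(frames, h, w, c)),
    layers.TimeDistributed(layers.MaxPooling2D()),
    layers.TimeDistributed(layers.Flatten()),
    layers.LSTM(32),                          # reads the frame sequence
    layers.Dense(2, activation="softmax"),    # e.g. two movement classes
])
model.summary()
```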

Action Recognition with pre-trained CNN and LSTM

  • How to use a pre-trained CNN as a feature extractor for an RNN
  • Using a GRU layer

05-1-video-action-recognition-train-extract-features-with-cnn

05-2_video-action-recognition-train-rnn.ipynb
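
A minimal sketch of the two-stage pipeline these two notebooks split across files (MobileNetV2 and all sizes are assumptions, not necessarily the notebooks' choices): a frozen ImageNet CNN turns each frame into a feature vector, then a GRU classifies the sequence of vectors.

```python
import numpy as np
from tensorflow.keras import layers, models, applications

# Stage 1: frozen pre-trained CNN as a per-frame feature extractor
# (input preprocessing is skipped here for brevity).
cnn = applications.MobileNetV2(include_top=False, pooling="avg",
                               input_shape=(224, 224, 3))
cnn.trainable = False

clip = np.random.rand(8, 224, 224, 3).astype("float32")  # one fake clip
features = cnn.predict(clip)               # -> (8, 1280), one vector/frame

# Stage 2: a GRU over the extracted feature sequence.
rnn = models.Sequential([
    layers.GRU(64, input_shape=(None, features.shape[-1])),
    layers.Dense(5, activation="softmax"),  # e.g. 5 action classes
])
rnn.predict(features[np.newaxis, ...])      # (1, 8, 1280) -> (1, 5)
```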

Word Embedding and Analogy

  • Using GloVe
  • Cosine Similarity
  • Analogy

06_analogy-using-embeddings.ipynb
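
A minimal sketch of the analogy computation (the `embeddings` dict is assumed to be a word-to-vector mapping loaded from a GloVe text file): solve "a is to b as c is to ?" by taking the nearest cosine neighbour of b − a + c.

```python
import numpy as np

def cosine_similarity(u, v):
    return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

def analogy(a, b, c, embeddings):
    """Solve 'a is to b as c is to ?' over a word -> vector dict."""
    target = embeddings[b] - embeddings[a] + embeddings[c]
    best, best_sim = None, -1.0
    for word, vec in embeddings.items():
        if word in (a, b, c):          # exclude the query words
            continue
        sim = cosine_similarity(target, vec)
        if sim > best_sim:
            best, best_sim = word, sim
    return best

# analogy("man", "king", "woman", embeddings)  # -> likely "queen"
```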

Text Classification

  • What is a bag of embeddings?
  • Using the Embedding layer in Keras
  • Initializing the Embedding layer with pre-trained embeddings
  • Using RNNs for NLP tasks

07_text-classification-Emojify.ipynb
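
A minimal sketch of seeding a Keras Embedding layer with pre-trained vectors (all sizes are invented, and the random matrix stands in for a real GloVe matrix where row i holds the vector of word index i):

```python
import numpy as np
from tensorflow.keras import layers, models

vocab_size, dim, maxlen = 10000, 50, 20             # illustrative sizes
embedding_matrix = np.random.rand(vocab_size, dim)  # stand-in for GloVe

model = models.Sequential([
    layers.Embedding(vocab_size, dim,
                     weights=[embedding_matrix],    # pre-trained vectors
                     input_length=maxlen,
                     trainable=False),              # keep vectors frozen
    layers.LSTM(64),
    layers.Dense(5, activation="softmax"),          # e.g. 5 emoji classes
])
model.summary()
```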

Language Model and Text Generation (on Persian poetry, the Shahnameh)

  • What is a TF Dataset (tf.data.Dataset)?
  • Stateful vs. stateless RNNs
  • When do we need batch_input_shape?

08_shahnameh-text-generation-language-model.ipynb
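
A minimal sketch of the stateful case (sizes invented): with stateful=True, Keras carries the hidden state from one batch to the next instead of resetting it, so the batch size becomes part of the model and must be fixed via batch_input_shape rather than input_shape.

```python
from tensorflow.keras import layers, models

batch_size, seq_len, vocab_size = 64, 100, 50  # illustrative sizes

model = models.Sequential([
    layers.Embedding(vocab_size, 32,
                     batch_input_shape=(batch_size, seq_len)),
    layers.GRU(128, return_sequences=True, stateful=True),
    layers.Dense(vocab_size),   # logits over the next character
])
model.reset_states()            # clear the carried state between texts
```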

Seq2Seq networks (Encoder-Decoder)

Understanding mathematical strings with seq2seq

  • Using RepeatVector to connect the encoder to the decoder
  • Using the encoder hidden state as input to the decoder

09_add-numbers-with-seq2seq.ipynb
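
A minimal sketch of the RepeatVector bridge (the lengths and vocabulary size are invented): the encoder LSTM compresses the one-hot input string into a single vector, RepeatVector copies that vector once per output step, and the decoder LSTM unrolls the answer character by character.

```python
from tensorflow.keras import layers, models

in_len, out_len, n_chars = 7, 4, 12  # e.g. "123+45 " -> "168 "

model = models.Sequential([
    layers.LSTM(128, input_shape=(in_len, n_chars)),  # encoder
    layers.RepeatVector(out_len),                     # bridge to decoder
    layers.LSTM(128, return_sequences=True),          # decoder
    layers.TimeDistributed(
        layers.Dense(n_chars, activation="softmax")), # char per step
])
model.compile(optimizer="adam", loss="categorical_crossentropy")
```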

NMT (Neural Machine Translation) with Attention in Keras

10_Neural-machine-translation-with-attention-for-date-convert.ipynb

NMT with Attention and teacher forcing in TF 2.0

  • Teacher forcing
  • Masked loss to ignore zero padding
  • Using model subclassing

11_nmt-with-attention.ipynb
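
A minimal sketch of the masked-loss bullet, in the style of the TF 2.0 NMT tutorial (names are illustrative): padded positions, encoded as id 0, are zeroed out so they contribute nothing to the loss.

```python
import tensorflow as tf

loss_object = tf.keras.losses.SparseCategoricalCrossentropy(
    from_logits=True, reduction="none")

def masked_loss(real, pred):
    """real: (batch, steps) int ids with 0 = padding; pred: logits."""
    mask = tf.cast(tf.not_equal(real, 0), tf.float32)
    loss = loss_object(real, pred) * mask        # zero out padded steps
    return tf.reduce_sum(loss) / tf.reduce_sum(mask)
```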

Image Captioning with Attention

12_image-captioning-with-attention.ipynb