character-level-rnn

character-level model (layered LSTM) using Keras

This script implements a multi-layer LSTM for training character-level language models. The model learns to predict the next character in a sequence.
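As a rough illustration of what "predicting the next character" means in practice, a character-level model typically maps each distinct character to an index and feeds the model one-hot vectors. The names below (`char_to_idx`, `one_hot`) are illustrative assumptions, not taken from the original script:

```python
import numpy as np

text = "hello world"
chars = sorted(set(text))                        # vocabulary of distinct characters
char_to_idx = {c: i for i, c in enumerate(chars)}

def one_hot(ch, vocab_size):
    """Encode a single character as a one-hot vector."""
    v = np.zeros(vocab_size)
    v[char_to_idx[ch]] = 1.0
    return v

# The LSTM consumes a sequence of such vectors and is trained to output
# a probability distribution over the character that follows.
x = one_hot("h", len(chars))
```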

The language model was initially described in this blog post, where it was implemented in Torch. I have rewritten the model in Keras.

Training

  • Initially the model trained a 2-layer LSTM with 512 hidden units per layer, a batch size of 100, and a sequence length of 20 characters.
  • After 60 iterations, the model had learned basic English phrases and punctuation, but not Shakespearean prose. I have since changed the model to a 3-layer LSTM with 512 hidden units per layer, a batch size of 100, and a sequence length of 60 characters.
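The training setup above can be sketched as a simple preprocessing step: slide a fixed-length window over the corpus and pair each 60-character input with the character that follows it. This is a minimal sketch under that assumption; the function name and signature are illustrative, not the script's own:

```python
def make_sequences(text, seq_length=60, step=1):
    """Return (input, target) pairs: each input is seq_length characters long,
    and the target is the single character that immediately follows it."""
    inputs, targets = [], []
    for i in range(0, len(text) - seq_length, step):
        inputs.append(text[i:i + seq_length])
        targets.append(text[i + seq_length])
    return inputs, targets

corpus = "x" * 200                       # stand-in for the training text
xs, ys = make_sequences(corpus, seq_length=60)
# with a batch size of 100, each gradient step would consume 100 such pairs
```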