zhuwr0423/Word-level-language-modeling-RNN
This example trains a multi-layer RNN (Elman, GRU, or LSTM) on a language modeling task. By default, the training script uses the Penn Treebank (PTB) dataset, which is provided with the repository. The trained model can then be used by the generate script to produce new text. The code has been tested under Linux with Anaconda 3, and the results are reproducible.
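Of the three cell types mentioned above, the Elman RNN is the simplest: it updates its hidden state as h_t = tanh(W_ih x_t + W_hh h_{t-1} + b). As a minimal illustration (not the repository's implementation), here is one Elman step in pure Python with hand-picked toy weights; the function name and all values are made up for this sketch:

```python
import math

def elman_step(x, h_prev, w_ih, w_hh, b):
    """One Elman RNN step: h_t = tanh(W_ih @ x_t + W_hh @ h_{t-1} + b)."""
    hidden = len(h_prev)
    h_new = []
    for i in range(hidden):
        s = b[i]
        s += sum(w_ih[i][j] * x[j] for j in range(len(x)))       # input contribution
        s += sum(w_hh[i][j] * h_prev[j] for j in range(hidden))  # recurrent contribution
        h_new.append(math.tanh(s))                               # squashing nonlinearity
    return h_new

# Toy example: 2-dim input, 3-dim hidden state, zero initial state.
x = [1.0, -0.5]
h0 = [0.0, 0.0, 0.0]
w_ih = [[0.1, 0.2], [0.0, -0.1], [0.3, 0.1]]
w_hh = [[0.05, 0.05, 0.05]] * 3
b = [0.0, 0.0, 0.0]
h1 = elman_step(x, h0, w_ih, w_hh, b)
print(h1)  # new hidden state, one tanh value per hidden unit
```

GRU and LSTM cells add gating on top of this same recurrence, which is what makes them better at carrying information across long word sequences.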