This is a TensorFlow implementation of Generating Sequences With Recurrent Neural Networks by Alex Graves.
It has two functions:
- Handwriting Prediction: randomly generate a line of handwriting (set `mode=predict`).
- Handwriting Synthesis: given a string, generate the corresponding handwriting (set `mode=synthesis`).
This project is adapted from hardmaru's great work. `util.py` is taken directly from there; before running `train.py` and `sample.py`, follow the instructions there and download the necessary files.
I hope to keep this model simple and show the main algorithm as clearly as possible, without getting tangled in a bundle of deep-learning optimization tricks. If you wish, you can easily add them yourself.
This is the result with the default settings:
- RNN state size = 256
- RNN length = 300
- number of layers = 2
- number of Gaussian mixture components = 20

and 20+ epochs of training. Not so fancy, but it can be recognized as something like handwriting, huh?
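For reference, here is a minimal sketch, not the exact code of this repo, of how a network output is typically split into the parameters of the 20-component Gaussian mixture (Graves, Section 4.1): an end-of-stroke probability plus, for each component, a weight, two means, two standard deviations and a correlation.

```python
import tensorflow as tf

def mixture_params(raw_output, num_mixtures=20):
    """Hedged sketch: split a [batch, 6 * num_mixtures + 1] output vector
    into mixture-density parameters (the ordering here is an assumption)."""
    eos_hat, pi, mu1, mu2, sigma1, sigma2, rho = tf.split(
        raw_output, [1] + [num_mixtures] * 6, axis=1)
    eos = tf.sigmoid(eos_hat)     # end-of-stroke probability
    pi = tf.nn.softmax(pi)        # mixture weights sum to 1
    sigma1 = tf.exp(sigma1)       # standard deviations must be positive
    sigma2 = tf.exp(sigma2)
    rho = tf.tanh(rho)            # correlation constrained to (-1, 1)
    return eos, pi, mu1, mu2, sigma1, sigma2, rho
```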
This is the result with the string "a quick brown fox jumps over the lazy dog".
In addition, the scribe project by greydanus also helped me a lot, especially its use of `tf.batch_matmul()`.
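For anyone unfamiliar with it, one natural place for a batched matrix product in the synthesis model is the soft attention window: the window vector is a weighted sum of the (one-hot) character vectors, which is exactly a batch of small matrix multiplications. A rough sketch with made-up shapes follows; note that `tf.batch_matmul()` only exists in older TensorFlow, while current releases batch plain `tf.matmul()` automatically.

```python
import tensorflow as tf

batch, char_len, alphabet = 32, 40, 60
phi = tf.random.uniform([batch, 1, char_len])           # attention weights over the characters
chars = tf.random.uniform([batch, char_len, alphabet])  # stand-in for the one-hot character sequence
# Older TF: window = tf.batch_matmul(phi, chars)
window = tf.matmul(phi, chars)                          # soft window, shape [batch, 1, alphabet]
```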