Some references we find useful:
http://selenium-python.readthedocs.io
https://pypi.python.org/pypi/subhd.py/0.1.4
http://blog.topspeedsnail.com/archives/8226
https://pypi.python.org/pypi/subliminal
http://graphemica.com
http://stackoverflow.com/questions/27435855/how-does-one-print-a-unicode-character-code-in-python
https://zhidao.baidu.com/question/480799001.html
https://github.com/overtrue/pinyin
https://pypi.python.org/pypi/hanziconv
http://stats.stackexchange.com/questions/236987/the-simplest-seq2seq-model-for-word-mirroring
https://github.com/ishalyminov/tensorflow/blob/master/tensorflow/examples/udacity/6_3_lstm_seq2seq.ipynb
https://arxiv.org/pdf/1609.09552.pdf
https://arxiv.org/pdf/1402.1128v1.pdf
https://github.com/KnHuq/Dynamic-Tensorflow-Tutorial/blob/master/LSTM/LSTM.py
https://gist.github.com/danijar/d11c77c5565482e965d1919291044470
https://github.com/tensorflow/tensorflow/blob/r0.12/tensorflow/python/ops/seq2seq.py
http://www.slideshare.net/emorynlp/rnn-lstm-and-seq2seq-models
http://www.slideshare.net/KeonKim/attention-mechanisms-with-tensorflow
http://stackoverflow.com/questions/40044937/does-the-tensorflow-embedding-attention-seq2seq-method-implement-a-bidirectional
https://www.tensorflow.org/api_docs/python/rnn_cell/rnn_cells_for_use_with_tensorflow_s_core_rnn_methods
http://www.wildml.com/2016/01/attention-and-memory-in-deep-learning-and-nlp/