Neural Machine Translation

A machine translation framework built on a seq2seq model with an attention mechanism.

Machine translation, image captioning, text summarization, music generation, and chatbots can all be trained with this framework.

Environment:

TensorFlow >= 1.2

References:

  1. Sutskever et al., 2014, "Sequence to Sequence Learning with Neural Networks"
  2. Cho et al., 2014, "Learning Phrase Representations using RNN Encoder-Decoder for Statistical Machine Translation"
  3. Neubig, 2017, "Neural Machine Translation and Sequence-to-sequence Models: A Tutorial"

Framework Points:

  • Seq2Seq in TensorFlow (a minimal sketch follows this list)
    • Encoder
    • Decoder
    • Optimizer
    • Weighted cross-entropy loss
  • Attention model in TensorFlow
  • Stack-based model
  • Data pipeline in TensorFlow (see the tf.data sketch below)
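
A minimal sketch of how the encoder, attention decoder, weighted cross-entropy loss, and optimizer could be wired together with the TensorFlow 1.x tf.contrib.seq2seq API. All sizes and names here (src_vocab_size, hidden_size, the placeholders, proj_layer, ...) are illustrative assumptions, not this repository's actual code:

```python
import tensorflow as tf

src_vocab_size, tgt_vocab_size = 30000, 30000   # assumed vocabulary sizes
embed_size, hidden_size = 256, 256              # assumed model dimensions

# Batches of padded id sequences plus their true lengths.
src_ids = tf.placeholder(tf.int32, [None, None], name="src_ids")
src_len = tf.placeholder(tf.int32, [None], name="src_len")
tgt_in  = tf.placeholder(tf.int32, [None, None], name="tgt_in")   # <s> w1 w2 ...
tgt_out = tf.placeholder(tf.int32, [None, None], name="tgt_out")  # w1 w2 ... </s>
tgt_len = tf.placeholder(tf.int32, [None], name="tgt_len")
batch_size = tf.shape(src_ids)[0]

# Encoder: embedding lookup followed by an LSTM over the source sentence.
src_embed = tf.get_variable("src_embed", [src_vocab_size, embed_size])
enc_cell = tf.contrib.rnn.LSTMCell(hidden_size)
enc_outputs, enc_state = tf.nn.dynamic_rnn(
    enc_cell, tf.nn.embedding_lookup(src_embed, src_ids),
    sequence_length=src_len, dtype=tf.float32)

# Decoder: LSTM wrapped with Luong attention over the encoder outputs,
# initialized from the encoder's final state, with a shared output projection.
tgt_embed = tf.get_variable("tgt_embed", [tgt_vocab_size, embed_size])
attention = tf.contrib.seq2seq.LuongAttention(
    hidden_size, enc_outputs, memory_sequence_length=src_len)
dec_cell = tf.contrib.seq2seq.AttentionWrapper(
    tf.contrib.rnn.LSTMCell(hidden_size), attention,
    attention_layer_size=hidden_size)
dec_init = dec_cell.zero_state(batch_size, tf.float32).clone(cell_state=enc_state)
proj_layer = tf.layers.Dense(tgt_vocab_size, use_bias=False)

# Teacher-forced training decoder.
helper = tf.contrib.seq2seq.TrainingHelper(
    tf.nn.embedding_lookup(tgt_embed, tgt_in), tgt_len)
decoder = tf.contrib.seq2seq.BasicDecoder(
    dec_cell, helper, initial_state=dec_init, output_layer=proj_layer)
dec_outputs, _, _ = tf.contrib.seq2seq.dynamic_decode(decoder)
logits = dec_outputs.rnn_output                 # [batch, time, tgt_vocab_size]

# Weighted cross-entropy: the sequence mask zeroes out padding positions.
weights = tf.sequence_mask(tgt_len, tf.shape(tgt_out)[1], dtype=tf.float32)
loss = tf.contrib.seq2seq.sequence_loss(logits, tgt_out, weights)
train_op = tf.train.AdamOptimizer(1e-3).minimize(loss)
```

The data pipeline point could likewise be sketched with the Dataset API (exposed as tf.data from TF 1.4, tf.contrib.data in 1.2/1.3); the file names below are placeholders, and token-to-id lookup is omitted:

```python
# Parallel text files, one sentence per line.
src_data = tf.data.TextLineDataset("train.src")
tgt_data = tf.data.TextLineDataset("train.tgt")

dataset = tf.data.Dataset.zip((src_data, tgt_data))
dataset = dataset.map(lambda s, t: (tf.string_split([s]).values,
                                    tf.string_split([t]).values))
dataset = dataset.shuffle(10000)
dataset = dataset.padded_batch(64, padded_shapes=([None], [None]))
iterator = dataset.make_initializable_iterator()
src_tokens, tgt_tokens = iterator.get_next()
```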

TO DO:

  • A convenient GUI
  • A CNN-based encoder

Example:

'知识就是力量' ==> 'Knowledge is power'
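
At inference time the decoder is typically rerun with greedy (or beam) search instead of teacher forcing. A hedged sketch, reusing the illustrative dec_cell, dec_init, tgt_embed, proj_layer, and batch_size names from the training sketch above and assuming particular <s>/</s> token ids:

```python
sos_id, eos_id = 1, 2                            # assumed special-token ids

# Greedy decoding: feed back the argmax token at each step until </s>.
infer_helper = tf.contrib.seq2seq.GreedyEmbeddingHelper(
    tgt_embed, tf.fill([batch_size], sos_id), eos_id)
infer_decoder = tf.contrib.seq2seq.BasicDecoder(
    dec_cell, infer_helper, initial_state=dec_init, output_layer=proj_layer)
infer_outputs, _, _ = tf.contrib.seq2seq.dynamic_decode(
    infer_decoder, maximum_iterations=100)
predicted_ids = infer_outputs.sample_id          # map ids back to target words
```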