NMT_chainer

Source code for an RNN machine-translation course project.


Attention-based NMT model built with Chainer

This is an LSTM-based NMT model for Japanese-English translation, implemented with Chainer 1.24. The main idea is based on the attention model proposed in the paper Neural Machine Translation by Jointly Learning to Align and Translate.

It adopts the 'global attention with dot product' scoring introduced in the paper Effective Approaches to Attention-based Neural Machine Translation (a sketch follows).
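
For reference, here is a minimal sketch of that attention variant. It is written in plain NumPy rather than Chainer to stay version-agnostic; the function name and tensor shapes are illustrative, not taken from this repository.

```python
import numpy as np

def global_dot_attention(dec_state, enc_states):
    """Luong-style global attention with dot-product scoring.

    dec_state:  (hidden,)          current decoder hidden state
    enc_states: (src_len, hidden)  all encoder hidden states
    """
    scores = enc_states @ dec_state            # one dot-product score per source position
    weights = np.exp(scores - scores.max())    # numerically stable softmax
    weights /= weights.sum()
    context = weights @ enc_states             # context vector: weighted sum of encoder states
    return context, weights

# toy usage: 5 source positions, hidden size 4
enc = np.random.randn(5, 4)
dec = np.random.randn(4)
ctx, attn = global_dot_attention(dec, enc)
print(attn.sum())  # -> 1.0
```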
It also adopts the dropout scheme introduced in the paper Recurrent Neural Network Regularization (see the sketch below).
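
The key point of that scheme is that dropout is applied only to the non-recurrent (layer-to-layer) connections, never to the recurrent hidden-to-hidden connections. Below is a minimal sketch against the Chainer 1.x API; the class name and layer sizes are hypothetical, not this repository's actual model.

```python
import chainer
import chainer.functions as F
import chainer.links as L

class TwoLayerLSTM(chainer.Chain):
    """Stacked LSTM with dropout on non-recurrent connections only."""

    def __init__(self, n_units, ratio=0.5):
        super().__init__(
            l1=L.LSTM(n_units, n_units),
            l2=L.LSTM(n_units, n_units),
        )
        self.ratio = ratio

    def __call__(self, x, train=True):
        # Dropout sits between layers; the LSTM links keep their
        # recurrent connections untouched, as in Zaremba et al.
        h1 = self.l1(F.dropout(x, ratio=self.ratio, train=train))
        h2 = self.l2(F.dropout(h1, ratio=self.ratio, train=train))
        return h2
```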

For more information about Chainer, please refer to the Chainer documentation.

Requirements: miniconda, Python 3+, seaborn, pandas, matplotlib, tqdm, chainer==1.24, ipython.
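
A quick way to confirm your environment matches these requirements is a small import check like the one below (a hypothetical helper script, not part of this repository):

```python
# check_env.py -- verify the packages listed above are importable
import chainer
assert chainer.__version__.startswith("1.24"), chainer.__version__
import seaborn, pandas, matplotlib, tqdm, IPython  # noqa: F401
print("Environment OK: chainer", chainer.__version__)
```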