THUMT: An Open Source Toolkit for Neural Machine Translation
Contents
- Introduction
- Online Demo
- Implementations
- Notable Features
- License
- Citation
- Development Team
- Contact
- Derivative Repositories
Introduction
Machine translation is a natural language processing task that aims to automatically translate between natural languages using computers. Recent years have witnessed the rapid development of end-to-end neural machine translation, which has become the mainstream method in practical MT systems.
THUMT is an open-source toolkit for neural machine translation developed by the Natural Language Processing Group at Tsinghua University. The website of THUMT is: http://thumt.thunlp.org/.
Online Demo
The online demo of THUMT is available at http://translate.thumt.cn/. The languages involved include Ancient Chinese, Arabic, Chinese, English, French, German, Indonesian, Japanese, Portuguese, Russian, and Spanish.
Implementations
THUMT currently has three main implementations:
- THUMT-TensorFlow: a new implementation developed with TensorFlow. It implements the sequence-to-sequence model (Seq2Seq) (Sutskever et al., 2014), the standard attention-based model (RNNsearch) (Bahdanau et al., 2014), and the Transformer model (Transformer) (Vaswani et al., 2017).
- THUMT-PyTorch: a new implementation developed with PyTorch. It implements the Transformer model (Transformer) (Vaswani et al., 2017).
- THUMT-Theano: the original project developed with Theano, which is no longer updated because MILA has ended development of Theano. It implements the standard attention-based model (RNNsearch) (Bahdanau et al., 2014), minimum risk training (MRT) (Shen et al., 2016) for optimizing model parameters with respect to evaluation metrics, semi-supervised training (SST) (Cheng et al., 2016) for exploiting monolingual corpora to learn bidirectional translation models, and layer-wise relevance propagation (LRP) (Ding et al., 2017) for visualizing and analyzing RNNsearch.
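For intuition, the MRT criterion above can be sketched as an expected risk over a renormalized distribution of sampled candidate translations. This is a minimal NumPy sketch, not THUMT's actual implementation; the `alpha` sharpness hyperparameter and a cost such as 1 - sentence-level BLEU follow Shen et al. (2016):

```python
import numpy as np

def mrt_risk(log_probs, costs, alpha=5e-3):
    """Expected risk for minimum risk training (Shen et al., 2016), sketched.

    log_probs: model log-probabilities of sampled candidate translations
    costs: per-candidate cost, e.g. 1 - sentence-level BLEU vs. the reference
    alpha: sharpness hyperparameter of the renormalized distribution Q
    """
    scaled = alpha * np.asarray(log_probs, dtype=float)
    scaled -= scaled.max()                      # stabilize the softmax
    q = np.exp(scaled) / np.exp(scaled).sum()   # renormalized distribution Q
    return float((q * np.asarray(costs, dtype=float)).sum())
```

Minimizing this quantity pushes probability mass toward candidates with low cost, i.e. high metric scores.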
The following table summarizes the features of the three implementations:
Implementation | Model | Criterion | Optimizer | LRP | Additional Features |
---|---|---|---|---|---|
Theano | RNNsearch | MLE, MRT, SST | SGD, AdaDelta, Adam | RNNsearch | N.A. |
TensorFlow | Seq2Seq, RNNsearch, Transformer | MLE | Adam | RNNsearch, Transformer | Distributed Training, Mixed Precision Training, Gradient Aggregation, Model Ensemble |
PyTorch | Transformer | MLE | SGD, Adadelta, Adam | N.A. | Distributed Training, Mixed Precision Training, Gradient Aggregation, Model Ensemble |
We recommend using THUMT-TensorFlow or THUMT-PyTorch, which deliver better translation performance than THUMT-Theano. We will keep adding new features to THUMT-TensorFlow and THUMT-PyTorch.
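As a rough illustration of the model-ensemble feature listed in the table, a common approach is to average the per-model next-token probabilities at each decoding step. This is a hypothetical sketch with names of our own choosing, not THUMT's API:

```python
import numpy as np

def ensemble_next_token_logprobs(per_model_logprobs):
    """Combine next-token distributions from several models, sketched.

    per_model_logprobs: array-like of shape (n_models, vocab_size) holding
    each model's log-probabilities for the next token. Averaging in
    probability space gives a uniform mixture of the models; the result is
    returned as log-probabilities for the beam search to score.
    """
    probs = np.exp(np.asarray(per_model_logprobs, dtype=float))
    return np.log(probs.mean(axis=0))
```

Checkpoint averaging, by contrast, averages the parameters of several checkpoints of one model into a single model before decoding.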
It is also possible to exploit layer-wise relevance propagation (LRP) with THUMT to visualize the relevance between source and target words.
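To illustrate how LRP redistributes an output's relevance back to its inputs, here is a minimal sketch of one propagation step through a linear layer using the epsilon stabilization rule. This is a simplification for intuition only; Ding et al. (2017) apply LRP to full attention-based networks:

```python
import numpy as np

def lrp_linear(x, w, relevance_out, eps=1e-6):
    """One LRP step through a linear layer y = x @ w (epsilon rule), sketched.

    Each output unit's relevance is redistributed to the inputs in
    proportion to each input's contribution x_i * w_ij to that output.
    """
    x = np.asarray(x, dtype=float)
    w = np.asarray(w, dtype=float)
    y = x @ w
    z = y + eps * np.sign(y)                   # stabilized denominators
    s = np.asarray(relevance_out, dtype=float) / z
    return x * (w @ s)                         # relevance assigned to inputs
```

Applying such steps layer by layer from the output back to the embeddings yields the source-target relevance scores that are visualized.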
Notable Features
- Transformer (Vaswani et al., 2017)
- Multi-GPU training & decoding
- Distributed training
- Float16 training
- Model ensemble & Averaging
- Relative position embedding (Shaw et al., 2018)
- Visualization with layer-wise relevance propagation (LRP) (Ding et al., 2017)
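To illustrate the relative position embedding feature, the clipped relative-position indices of Shaw et al. (2018) can be computed as follows. This is a hypothetical sketch, not THUMT's code; each index selects one of 2 * max_distance + 1 learned embedding vectors:

```python
import numpy as np

def relative_position_indices(length, max_distance=4):
    """Clipped relative position indices (Shaw et al., 2018), sketched.

    Returns a (length, length) matrix whose entry (i, j) indexes the
    embedding for the offset j - i, clipped to [-max_distance, max_distance]
    and shifted to be non-negative.
    """
    pos = np.arange(length)
    rel = pos[None, :] - pos[:, None]                # offset j - i
    rel = np.clip(rel, -max_distance, max_distance)  # clip long-range offsets
    return rel + max_distance                        # shift into [0, 2*max_distance]
```

These indices are typically used to look up embeddings that are added to the attention keys (and optionally values) instead of absolute position encodings.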
License
The source code is dual licensed. Open source licensing is under the BSD-3-Clause, which allows free use for research purposes. For commercial licensing, please email thumt17@gmail.com.
Citation
Please cite the following paper:
Jiacheng Zhang, Yanzhuo Ding, Shiqi Shen, Yong Cheng, Maosong Sun, Huanbo Luan, Yang Liu. 2017. THUMT: An Open Source Toolkit for Neural Machine Translation. arXiv:1706.06415.
Development Team
Project leaders: Maosong Sun, Yang Liu, Huanbo Luan
Project members: Jiacheng Zhang, Yanzhuo Ding, Shiqi Shen, Yong Cheng, Zhixing Tan
Contact
If you have questions, suggestions, or bug reports, please email thumt17@gmail.com.
Derivative Repositories
- UCE4BT (Improving Back-Translation with Uncertainty-based Confidence Estimation)
- L2Copy4APE (Learning to Copy for Automatic Post-Editing)
- Voting4SC (Modeling Voting for System Combination in Machine Translation)
- Document-Transformer (Improving the Transformer Translation Model with Document-Level Context)
- PR4NMT (Prior Knowledge Integration for Neural Machine Translation using Posterior Regularization)