The current VNMT implementation only supports a single-layer NMT model; settings for deeper layers have no effect.
Source code for Variational Neural Machine Translation (VNMT).
If you use this code, please cite our paper:

@InProceedings{zhang-EtAl:2016:EMNLP20162,
  author    = {Zhang, Biao and Xiong, Deyi and Su, Jinsong and Duan, Hong and Zhang, Min},
  title     = {Variational Neural Machine Translation},
  booktitle = {Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing},
  month     = {November},
  year      = {2016},
  address   = {Austin, Texas},
  publisher = {Association for Computational Linguistics},
  pages     = {521--530},
  url       = {https://aclweb.org/anthology/D16-1050}
}
Our code is based on GroundHog; please install GroundHog before using our code.
To train a good VNMT model, follow two steps:
1. Pretrain a base model using GroundHog.
2. Go to the work directory and put the pretrained model there, i.e. use the pretrained model to initialize the parameters of VNMT. Then simply run run.sh (of course, before that you need to re-configure the chinese.py file for your own dataset :)).
That's it!
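The two steps above can be sketched as shell commands. This is only an illustration: the work/ directory and run.sh come from this repository, while the model filename search_model_chinese.npz is an assumption (GroundHog saves parameters as .npz files, but your pretrained file may be named differently), and a temporary directory stands in for work/ so the sketch runs anywhere.

```shell
WORK=$(mktemp -d)             # stand-in for the repository's work/ directory

# Step 1: pretrain a base NMT model with GroundHog (training itself not shown);
# here an empty file stands in for the resulting parameter file.
PRETRAINED=$(mktemp)
cp "$PRETRAINED" "$WORK/search_model_chinese.npz"

# Step 2: with the pretrained model placed in the work directory, VNMT
# initializes its parameters from it. After editing chinese.py for your own
# dataset, training would start with:
#   cd work && bash run.sh    # real command, commented out in this sketch

ls "$WORK"                    # → search_model_chinese.npz
```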
Note that our development and test sets are the NIST datasets, which follow the SGM format. See work/data/dev for an example.
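For reference, a NIST-style SGM file wraps each sentence in a seg tag. The fragment below is illustrative only (it is not taken verbatim from work/data/dev; the setid, docid, and genre values are made up):

```sgml
<srcset setid="nist_example" srclang="zh">
<doc docid="doc01" genre="nw">
<seg id="1"> first source sentence </seg>
<seg id="2"> second source sentence </seg>
</doc>
</srcset>
```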
For any comments or questions, please email Biao Zhang.