
AMI

Implementation of the paper Adversarial Mutual Information for Text Generation. The code is based on OpenNMT.

Requirements

All dependencies can be installed by running:

pip install -r requirements.opt.txt

Data

We support dialogue generation and neural machine translation datasets. The datasets used in the paper are PersonaChat and the WMT translation dataset.

Preprocessing

See build_dataset.sh for preprocessing commands.
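
Because the code is built on OpenNMT, preprocessing typically converts parallel source/target text files into binarized data shards. The sketch below follows the usual OpenNMT-py convention; the actual script name, flags, and file paths are defined in build_dataset.sh, so the values here are illustrative placeholders only.

# Illustrative only: consult build_dataset.sh for the exact script and paths used in this repo.
python preprocess.py \
    -train_src data/src-train.txt \
    -train_tgt data/tgt-train.txt \
    -valid_src data/src-val.txt \
    -valid_tgt data/tgt-val.txt \
    -save_data data/demo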

Run

See train.sh and test.sh for training and testing commands.

To train on GPUs, set, for example, CUDA_VISIBLE_DEVICES=1,3 together with -world_size 2 -gpu_ranks 0 1 to use GPUs 1 and 3 on this node only, as sketched below.
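
Putting those flags together, a multi-GPU training command in the usual OpenNMT-py style might look like the following; the actual script name, data path, and model options are given in train.sh, so treat these values as placeholders.

# Illustrative only: see train.sh for the exact options used in the paper.
CUDA_VISIBLE_DEVICES=1,3 python train.py \
    -data data/demo \
    -save_model models/ami \
    -world_size 2 \
    -gpu_ranks 0 1

Testing works analogously; test.sh contains the exact decoding command and options.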

For any questions or suggestions, feel free to contact panby@zju.edu.cn.

Reference

"Adversarial Mutual Information for Text Generation" Boyuan Pan*, Yazheng Yang*, Kaizhao Liang, Bhavya Kailkhura, Zhongming Jin, Xian-sheng Hua, Deng Cai, Bo Li. ICML (2020)

@article{Pan2020AdversarialMI,
  title={Adversarial Mutual Information for Text Generation},
  author={Boyuan Pan and Yazheng Yang and Kaizhao Liang and Bhavya Kailkhura and Zhongming Jin and Xian-Sheng Hua and Deng Cai and Bo Li},
  journal={ArXiv},
  year={2020},
  volume={abs/2007.00067}
}