PyTorch implementation of the models described in the paper Emergent Translation in Multi-Agent Communication.
We present code for training and decoding both word- and sentence-level models and baselines, as well as preprocessed datasets.
- Python 2.7
- PyTorch 0.2
- Numpy
- CUDA (we recommend using the latest version; version 8.0 was used in all our experiments.)
- For preprocessing, we used scripts from Moses and Subword-NMT.
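A minimal sanity check that the dependencies above are importable (the version numbers are simply the ones listed above; this snippet is a convenience, not part of the repository):

```python
# Quick environment check for the dependencies listed above.
import numpy
import torch

print("PyTorch:", torch.__version__)                 # experiments used 0.2
print("NumPy:", numpy.__version__)
print("CUDA available:", torch.cuda.is_available())  # experiments used CUDA 8.0
```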
The original corpora can be downloaded from their respective sources (Bergsma500, Multi30k, MS COCO). For the preprocessed corpora, see the table below.
| Dataset | Preprocessed data |
|---|---|
| Bergsma500 | Data |
| Multi30k | Data |
| MS COCO | Data |
Before running the scripts:

- Download the datasets and place them in `/data/word` (Bergsma500) and `/data/sentence` (Multi30k and MS COCO).
- Set the correct paths in `scr_path()` in `/src/word/util.py`, and in `scr_path()`, `multi30k_reorg_path()` and `coco_path()` in `/src/sentence/util.py` (see the sketch below).
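For reference, a rough sketch of what the edited path helpers could return (the function names come from the repository; the directories are placeholders for wherever you placed the data):

```python
# Hypothetical contents of the path helpers in /src/sentence/util.py.
# Function names are from the repository; the return values are placeholders.
def scr_path():
    return "/data/sentence"            # root of the preprocessed sentence data

def multi30k_reorg_path():
    return "/data/sentence/multi30k"   # reorganised Multi30k files

def coco_path():
    return "/data/sentence/coco"       # preprocessed MS COCO files
```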
$ python word/bergsma_bli.py
$ python word/train_word_joint.py --l1 <L1> --l2 <L2>
where `<L1>` and `<L2>` are any of {en, de, es, fr, it, nl}.
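For example, to train the word-level model on the English–German pair (any other pair from the set above works the same way):

$ python word/train_word_joint.py --l1 en --l2 de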
$ python sentence/baseline_nn.py --dataset <DATASET> --task <TASK> --src <SRC> --trg <TRG>
$ python sentence/nmt.py --dataset <DATASET> --task <TASK> --src <SRC> --trg <TRG> --nn_baseline
$ python sentence/train_naka_encdec.py --dataset <DATASET> --task <TASK> --src <SRC> --trg <TRG> --train_enc_how <ENC_HOW> --train_dec_how <DEC_HOW>
where `<ENC_HOW>` is either `two` or `three`, and `<DEC_HOW>` is `img`, `des`, or `both`.
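For example, one possible configuration of this baseline on Multi30k (the language codes assume the English–German pair; the other flag values are taken from the options above):

$ python sentence/train_naka_encdec.py --dataset multi30k --task 1 --src en --trg de --train_enc_how two --train_dec_how both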
$ python sentence/train_seq_joint.py --dataset <DATASET> --task <TASK>
$ python sentence/nmt.py --dataset <DATASET> --task <TASK> --src <SRC> --trg <TRG>
where `<DATASET>` is `multi30k` or `coco`, and `<TASK>` is either 1 or 2 (only applicable for Multi30k).
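For example, on Multi30k Task 1 (flag values chosen from the options above; the language codes for the NMT script assume the English–German pair):

$ python sentence/train_seq_joint.py --dataset multi30k --task 1

$ python sentence/nmt.py --dataset multi30k --task 1 --src en --trg de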
- Moses is licensed under the LGPL, and Subword-NMT is licensed under the MIT License.
- MS COCO and Multi30k are released under Creative Commons licenses.
If you find the resources in this repository useful, please consider citing:
@inproceedings{Lee:18,
author = {Jason Lee and Kyunghyun Cho and Jason Weston and Douwe Kiela},
title = {Emergent Translation in Multi-Agent Communication},
year = {2018},
booktitle = {Proceedings of the International Conference on Learning Representations},
}