Repository for the EMNLP 2020 paper *Multimodal Routing: Improving Local and Global Interpretability of Multimodal Language Analysis*.

Paper: [arXiv](https://arxiv.org/abs/2004.14198)
Prerequisites:
- Python 3.6
- PyTorch (>=1.2.0) (results may vary across versions)
- CUDA 10.0 or above
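A quick way to sanity-check the environment against these requirements is to query PyTorch directly. This is an optional sketch, not part of the original repo:

```python
import torch

# Optional environment check for the prerequisites listed above.
print("PyTorch version:", torch.__version__)         # expect >= 1.2.0
print("CUDA available:", torch.cuda.is_available())  # expect True on a GPU machine
print("CUDA version:", torch.version.cuda)           # expect 10.0 or above
```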
Data files (containing the processed MOSI, MOSEI, and IEMOCAP datasets) can be downloaded from here.
To retrieve the meta information and the raw data, please refer to the SDK for these datasets.
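Processed data files for this kind of setup are typically pickled Python objects. The following is a minimal sketch for inspecting a downloaded file; the filename `data/mosei_senti_data.pkl` and the dict-of-splits layout are assumptions, so adjust them to whatever the downloaded files actually contain:

```python
import pickle

# Hypothetical path; substitute the actual processed data file you downloaded.
DATA_PATH = "data/mosei_senti_data.pkl"

with open(DATA_PATH, "rb") as f:
    data = pickle.load(f)

# Assuming the pickle holds a dict keyed by split (e.g. train/valid/test);
# inspect the keys to confirm the structure before training.
print(type(data))
if isinstance(data, dict):
    print(list(data.keys()))
```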
To run the program and obtain the computational results, including accuracy and F1 scores, execute:

```bash
bash run.sh
```
Some portions of the code were adapted from the multimodal_transformer repo.