multimodal_routing

Repository for the EMNLP 2020 paper Multimodal Routing: Improving Local and Global Interpretability of Multimodal Language Analysis.
[arXiv]: https://arxiv.org/abs/2004.14198

Prerequisites

  • Python 3.6
  • PyTorch (>=1.2.0) (performance may vary across versions)
  • CUDA 10.0 or above
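A quick way to sanity-check the version requirements above is to compare dotted version strings numerically rather than lexicographically (so "1.10" correctly beats "1.2"). This is a minimal illustrative helper, not part of the repository:

```python
def meets_min_version(installed: str, required: str) -> bool:
    """Return True if a dotted version string (e.g. '1.2.0')
    is at least the required version, comparing numerically."""
    to_tuple = lambda v: tuple(int(part) for part in v.split("."))
    return to_tuple(installed) >= to_tuple(required)

# Numeric comparison matters: lexicographically "1.10.0" < "1.2.0",
# but as a version it satisfies the PyTorch >=1.2.0 requirement.
print(meets_min_version("1.10.0", "1.2.0"))  # True
print(meets_min_version("1.1.0", "1.2.0"))   # False
```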

Datasets

Data files (containing processed MOSI, MOSEI and IEMOCAP datasets) can be downloaded from here.

To retrieve the meta information and the raw data, please refer to the SDK for these datasets.

Commands

To run the program, which reports evaluation results including accuracy and F1 scores:

bash run.sh
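For reference, the accuracy and F1 metrics that the script reports can be defined as below. This is a minimal self-contained sketch of the standard binary definitions, not the repository's own evaluation code:

```python
def accuracy(y_true, y_pred):
    """Fraction of predictions that match the labels."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def f1_score(y_true, y_pred, positive=1):
    """Harmonic mean of precision and recall for the positive class."""
    tp = sum(t == positive and p == positive for t, p in zip(y_true, y_pred))
    fp = sum(t != positive and p == positive for t, p in zip(y_true, y_pred))
    fn = sum(t == positive and p != positive for t, p in zip(y_true, y_pred))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    if precision + recall == 0.0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Toy example: 3 of 5 predictions correct; tp=2, fp=1, fn=1.
y_true = [1, 0, 1, 1, 0]
y_pred = [1, 0, 0, 1, 1]
print(accuracy(y_true, y_pred))            # 0.6
print(round(f1_score(y_true, y_pred), 3))  # 0.667
```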

Acknowledgement

Portions of the code were adapted from the multimodal_transformer repo.