Self-MM

Code for the paper "Learning Modality-Specific Representations with Self-Supervised Multi-Task Learning for Multimodal Sentiment Analysis"



PyTorch implementation of Learning Modality-Specific Representations with Self-Supervised Multi-Task Learning for Multimodal Sentiment Analysis (AAAI 2021)

Model

(model architecture figure)

Usage

  1. Download the datasets and preprocess them.

  • MOSI and MOSEI: download from CMU-MultimodalSDK.

  • SIMS: download from Baidu Yun Disk [code: ozo2] or Google Drive.

Then, preprocess the data and save it as a pickle file with the following structure.

{
    "train": {
        "raw_text": [],
        "audio": [],
        "vision": [],
        "id": [], # [video_id$_$clip_id, ..., ...]
        "text": [],
        "text_bert": [],
        "audio_lengths": [],
        "vision_lengths": [],
        "annotations": [],
        "classification_labels": [], # Negative(< 0), Neutral(0), Positive(> 0)
        "regression_labels": []
    },
    "valid": {***}, # same as the "train" 
    "test": {***}, # same as the "train"
}
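A minimal sketch of how such a pickle file could be loaded and how the classification labels relate to the regression labels, following the convention above. The helper names and the 0/1/2 class encoding are assumptions for illustration, not part of the repository:

```python
import pickle

def load_split(path, split="train"):
    """Load one split ("train"/"valid"/"test") from the preprocessed pickle file."""
    with open(path, "rb") as f:
        data = pickle.load(f)
    return data[split]

def to_classification(regression_labels):
    """Derive 3-way classes from regression labels, per the convention above:
    Negative (< 0), Neutral (0), Positive (> 0).
    The 0/1/2 integer encoding here is an assumption."""
    return [0 if r < 0 else (2 if r > 0 else 1) for r in regression_labels]
```

For example, `to_classification([-0.8, 0.0, 1.6])` maps the three regression labels to the Negative, Neutral, and Positive classes respectively.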
  2. Download Bert-Base, Chinese from Google-Bert. Then, convert the TensorFlow checkpoint into a PyTorch model using transformers-cli.
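The conversion can be done with the `transformers-cli convert` command. A sketch, assuming the unzipped `chinese_L-12_H-768_A-12` archive from the Google BERT release; the output path is an assumption:

```shell
# Convert the TensorFlow BERT checkpoint into a PyTorch weights file.
transformers-cli convert --model_type bert \
  --tf_checkpoint chinese_L-12_H-768_A-12/bert_model.ckpt \
  --config chinese_L-12_H-768_A-12/bert_config.json \
  --pytorch_dump_output chinese_L-12_H-768_A-12/pytorch_model.bin
```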

  3. Clone this repo and install the requirements.

git clone https://github.com/thuiar/Self-MM
cd Self-MM
conda create --name self_mm python=3.7
source activate self_mm
pip install -r requirements.txt
  4. Modify config/config_tune.py and config/config_regression.py to update the dataset paths.

  5. Run the code.

python run.py --modelName self_mm --datasetName mosi
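The exact interface of run.py is not shown here, but the flags above suggest a parser along these lines. This is a hypothetical reconstruction for illustration, not the repository's actual code; the dataset choices reflect the three datasets mentioned in this README:

```python
import argparse

def build_parser():
    # Hypothetical sketch of the CLI flags used in the command above.
    parser = argparse.ArgumentParser(description="Train/evaluate Self-MM.")
    parser.add_argument("--modelName", default="self_mm",
                        help="model to run (e.g. self_mm)")
    parser.add_argument("--datasetName", default="mosi",
                        choices=["mosi", "mosei", "sims"],
                        help="dataset to use")
    return parser

args = build_parser().parse_args(["--modelName", "self_mm", "--datasetName", "mosi"])
```

Swapping `--datasetName mosi` for `mosei` or `sims` would select the other two datasets.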

Results

Detailed results are reported in the MMSA repository under results/result-stat.md.

Paper


Please cite our paper if you find our work useful for your research:

@inproceedings{yu2021le,
  title={Learning Modality-Specific Representations with Self-Supervised Multi-Task Learning for Multimodal Sentiment Analysis},
  author={Yu, Wenmeng and Xu, Hua and Yuan, Ziqi and Wu, Jiele},
  booktitle={Proceedings of the AAAI Conference on Artificial Intelligence},
  year={2021}
}