EMMA_CoTEX_ABAW4

The codebase for the ABAW4 challenge of the ECCV 2022 workshop.


Affective Behaviour Analysis Using Pretrained Model with Facial Priori for ABAW4

[Paper], [slides] (code: ABAW), [video] (code: ABAW)

This repository is the codebase for the ABAW4 challenge. It includes EMMA for the multi-task learning (MTL) challenge and Masked CoTEX for the learning-from-synthetic-data (LSD) challenge. Our ICT-VIPL team reached 2nd place and 4th place in the MTL and LSD challenges, respectively.

Citing this paper

If you find this repo useful, please cite the following BibTeX entry. Thank you!

@inproceedings{li2023affective,
  title={Affective Behaviour Analysis Using Pretrained Model with Facial Prior},
  author={Li, Yifan and Sun, Haomiao and Liu, Zhaori and Han, Hu and Shan, Shiguang},
  booktitle={European Conference on Computer Vision Workshop},
  pages={19--30},
  year={2023},
  organization={Springer}
}

Pretrained models

The pretrained models used by EMMA and Masked CoTEX are available at the following URLs:

MAE ViT pretrained on CelebA [link] (code: ABAW)
DAN pretrained on AffectNet [link] (code: ABAW)

We also provide the pretrained EMMA model:

EMMA [link] (code: ABAW)
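After downloading a checkpoint, it can be worth verifying its integrity before training. The helper below is a convenience sketch, not part of this repo; the expected digest is a placeholder you would fill in from a trusted source.

```python
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Return the hex SHA-256 digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_checkpoint(path, expected_hex):
    """Compare a downloaded file's digest against a known-good value."""
    return sha256_of(path) == expected_hex
```

For example, `verify_checkpoint("emma.pth", "<digest from a trusted source>")` returns True only when the download is intact.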

Requirements

This codebase is based on Python 3.7. To install all the necessary Python packages, run pip install -r requirements.txt
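Since the codebase targets Python 3.7, a quick interpreter check before installing can save a confusing failure later. This snippet is a convenience sketch, not part of the repo:

```python
import sys

def python_version_ok(min_version=(3, 7)):
    """Return True if the running interpreter meets the minimum version."""
    return sys.version_info[:2] >= min_version

if __name__ == "__main__":
    assert python_version_ok(), "This codebase expects Python 3.7 or newer"
```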

Data

Please download the ABAW4 data, including the MTL and LSD sets, before running the code.
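A quick sanity check of the downloaded data can catch path problems before training starts. The "MTL" and "LSD" folder names below are assumptions about how you arrange the two challenge sets; adjust them to match your own layout and the paths configured in the shs/*.sh scripts.

```python
from pathlib import Path

def missing_abaw4_dirs(root, required=("MTL", "LSD")):
    """Return the required subdirectories that are absent under `root`."""
    root = Path(root)
    return [name for name in required if not (root / name).is_dir()]
```

For example, `missing_abaw4_dirs("/data/ABAW4")` returns an empty list when both sets are in place, and lists the missing folders otherwise.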

Training

EMMA

  • First, change the pretrained model and dataset directories in the script shs/train_EMMA.sh

  • Second, run the following command:

sh shs/train_EMMA.sh

Masked CoTEX

  • First, change the pretrained model and dataset directories in the script shs/train_masked_CoTEX.sh

  • Second, run the following command:

sh shs/train_masked_CoTEX.sh

Reference

This code builds on the masked autoencoder (MAE) and DAN. Thank you!