# Integrating Multi-Label Contrastive Learning with Dual Adversarial Graph Neural Networks for Cross-Modal Retrieval

This repository contains the authors' PyTorch implementation for the AAAI-21 paper "Dual Adversarial Graph Neural Networks for Multi-label Cross-modal Retrieval" and the TPAMI-22 paper "Integrating Multi-Label Contrastive Learning with Dual Adversarial Graph Neural Networks for Cross-Modal Retrieval".
## Requirements

- Python (>=3.8)
- PyTorch (>=1.7.1)
- Scipy (>=1.5.2)
## Datasets

You can download the features of the datasets from:

- MIRFlickr: OneDrive, BaiduPan (password: b04z)
- NUS-WIDE (top-21 concepts): BaiduPan (password: tjvo)
- MS-COCO: BaiduPan (password: 5uvp)
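Pre-extracted features like these are commonly distributed as MATLAB `.mat` files, which can be read with SciPy (one of the listed requirements). Below is a minimal, self-contained sketch of loading such a file; the key names (`I_tr`, `T_tr`, `L_tr`) are hypothetical, so inspect the actual keys of the downloaded files with `loadmat(path).keys()` first:

```python
import numpy as np
from scipy.io import loadmat, savemat

# Create a small stand-in file so the snippet runs on its own.
# The key names below are hypothetical; the released files may differ.
savemat("features_demo.mat", {
    "I_tr": np.random.rand(4, 512),  # image features
    "T_tr": np.random.rand(4, 300),  # text features
    "L_tr": np.eye(4),               # multi-hot label matrix
})

data = loadmat("features_demo.mat")
image_feats, text_feats, labels = data["I_tr"], data["T_tr"], data["L_tr"]
print(image_feats.shape, text_feats.shape, labels.shape)
```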
## Code Structure

Here we provide the implementation of our proposed models, along with the datasets. The repository is organised as follows:

- `data/` contains the necessary dataset files for NUS-WIDE, MIRFlickr, and MS-COCO;
- `models.py` contains the implementation of the `P-GNN-CON` and `I-GNN-CON` models;
- `main.py` puts all of the above together and can be used to execute a full training run on MIRFlickr, NUS-WIDE, or MS-COCO.
## Usage

- Place the datasets in `data/`.
- Set the experiment parameters in `main.py`.
- Train a model: `python main.py`
- Set the parameter `EVAL = True` in `main.py` for evaluation, then run: `python main.py`
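Evaluation in multi-label cross-modal retrieval is typically reported as mean average precision (mAP), where a retrieved item counts as relevant if it shares at least one label with the query. The following NumPy sketch illustrates that metric; it is an assumption-laden illustration, not the repository's evaluation code, and all names are hypothetical:

```python
import numpy as np

def map_score(query_labels, retrieval_labels, similarities):
    """Mean average precision for multi-label retrieval.

    A query-retrieval pair counts as relevant if the two items share
    at least one label (the usual convention for MIRFlickr, NUS-WIDE,
    and MS-COCO benchmarks).
    """
    aps = []
    for i in range(query_labels.shape[0]):
        order = np.argsort(-similarities[i])                  # rank by similarity
        relevant = (query_labels[i] @ retrieval_labels[order].T) > 0
        if relevant.sum() == 0:
            continue                                          # no relevant items
        cum_rel = np.cumsum(relevant)
        precision_at_k = cum_rel / np.arange(1, len(relevant) + 1)
        aps.append((precision_at_k * relevant).sum() / relevant.sum())
    return float(np.mean(aps))
```

For example, with one query whose two relevant items are ranked first and third, the average precision is (1/1 + 2/3) / 2 = 5/6.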
## Citation

If you find our work or the code useful, please consider citing our papers:
@article{Qian_Xue_Zhang_Fang_Xu_2021,
  title={Dual Adversarial Graph Neural Networks for Multi-label Cross-modal Retrieval},
  volume={35},
  number={3},
  journal={Proceedings of the AAAI Conference on Artificial Intelligence},
  author={Qian, Shengsheng and Xue, Dizhan and Zhang, Huaiwen and Fang, Quan and Xu, Changsheng},
  year={2021},
  pages={2440-2448}
}

@article{9815553,
  title={Integrating Multi-Label Contrastive Learning With Dual Adversarial Graph Neural Networks for Cross-Modal Retrieval},
  journal={IEEE Transactions on Pattern Analysis and Machine Intelligence},
  author={Qian, Shengsheng and Xue, Dizhan and Fang, Quan and Xu, Changsheng},
  year={2022},
  pages={1-18},
  doi={10.1109/TPAMI.2022.3188547}
}