This paper introduces MEAformer, a multi-modal entity alignment transformer approach for meta modality hybrid, which dynamically predicts the mutual correlation coefficients among modalities for more fine-grained entity-level modality fusion and alignment.
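The idea of dynamically predicting mutual correlation coefficients among modalities and using them for entity-level fusion can be sketched roughly as follows. This is a minimal toy illustration, not the actual MEAformer architecture; the module name, scoring scheme, and dimensions are assumptions:

```python
import torch
import torch.nn as nn

class ModalityHybridFusion(nn.Module):
    """Toy sketch: predict per-entity modality weights, then fuse embeddings."""
    def __init__(self, dim: int, num_modalities: int = 3):
        super().__init__()
        # one scalar score per modality, computed from that modality's embedding
        self.scorers = nn.ModuleList(nn.Linear(dim, 1) for _ in range(num_modalities))

    def forward(self, modal_embs):  # list of [batch, dim] tensors
        scores = torch.cat([s(e) for s, e in zip(self.scorers, modal_embs)], dim=-1)
        weights = torch.softmax(scores, dim=-1)           # [batch, num_modalities]
        stacked = torch.stack(modal_embs, dim=1)          # [batch, M, dim]
        fused = (weights.unsqueeze(-1) * stacked).sum(1)  # entity-level weighted fusion
        return fused, weights

fusion = ModalityHybridFusion(dim=64)
embs = [torch.randn(5, 64) for _ in range(3)]  # e.g. graph / visual / attribute
fused, w = fusion(embs)
print(fused.shape, w.shape)  # torch.Size([5, 64]) torch.Size([5, 3])
```

Because the weights are computed per entity rather than globally, each entity can lean on whichever modality is most informative for it.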
- `2023-07`: We release the repo for our paper *Rethinking Uncertainly Missing and Ambiguous Visual Modality in Multi-Modal Entity Alignment*.
- `2023-05`: We preprint our paper *Revisit and Outstrip Entity Alignment: A Perspective of Generative Models*.
- `2023-04`: We release the complete code and data for MEAformer!
pip install -r requirement.txt
- Python (>= 3.7)
- PyTorch (>= 1.6.0)
- numpy (>= 1.19.2)
- Transformers (>= 4.21.3)
- easydict (>= 1.10)
- unidecode (>= 1.3.6)
- tensorboard (>= 2.11.0)
- Quick start: using the script file (`run.sh`)
>> cd MEAformer
>> bash run.sh
- Optional: using the bash command
>> cd MEAformer
# -----------------------
# ---- non-iterative ----
# -----------------------
# ---- w/o surface ----
# FBDB15K
>> bash run_meaformer.sh 1 FBDB15K norm 0.8 0
>> bash run_meaformer.sh 1 FBDB15K norm 0.5 0
>> bash run_meaformer.sh 1 FBDB15K norm 0.2 0
# FBYG15K
>> bash run_meaformer.sh 1 FBYG15K norm 0.8 0
>> bash run_meaformer.sh 1 FBYG15K norm 0.5 0
>> bash run_meaformer.sh 1 FBYG15K norm 0.2 0
# DBP15K
>> bash run_meaformer.sh 1 DBP15K zh_en 0.3 0
>> bash run_meaformer.sh 1 DBP15K ja_en 0.3 0
>> bash run_meaformer.sh 1 DBP15K fr_en 0.3 0
# ---- w/ surface ----
# DBP15K
>> bash run_meaformer.sh 1 DBP15K zh_en 0.3 1
>> bash run_meaformer.sh 1 DBP15K ja_en 0.3 1
>> bash run_meaformer.sh 1 DBP15K fr_en 0.3 1
# -----------------------
# ------ iterative ------
# -----------------------
# ---- w/o surface ----
# FBDB15K
>> bash run_meaformer_il.sh 1 FBDB15K norm 0.8 0
>> bash run_meaformer_il.sh 1 FBDB15K norm 0.5 0
>> bash run_meaformer_il.sh 1 FBDB15K norm 0.2 0
# FBYG15K
>> bash run_meaformer_il.sh 1 FBYG15K norm 0.8 0
>> bash run_meaformer_il.sh 1 FBYG15K norm 0.5 0
>> bash run_meaformer_il.sh 1 FBYG15K norm 0.2 0
# DBP15K
>> bash run_meaformer_il.sh 1 DBP15K zh_en 0.3 0
>> bash run_meaformer_il.sh 1 DBP15K ja_en 0.3 0
>> bash run_meaformer_il.sh 1 DBP15K fr_en 0.3 0
# ---- w/ surface ----
# DBP15K
>> bash run_meaformer_il.sh 1 DBP15K zh_en 0.3 1
>> bash run_meaformer_il.sh 1 DBP15K ja_en 0.3 1
>> bash run_meaformer_il.sh 1 DBP15K fr_en 0.3 1
**Tips**: you can open `run_meaformer.sh` or `run_meaformer_il.sh` to modify the parameters or the training target.
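The `_il` scripts enable iterative training. A common iterative strategy in entity alignment — and our reading of what these scripts toggle; the exact criterion in the code may differ — is to periodically promote mutually-nearest unaligned pairs to pseudo training seeds:

```python
import numpy as np

def mutual_nearest_pairs(src_emb: np.ndarray, tgt_emb: np.ndarray):
    """Return (i, j) pairs that are each other's nearest neighbor by cosine similarity."""
    src = src_emb / np.linalg.norm(src_emb, axis=1, keepdims=True)
    tgt = tgt_emb / np.linalg.norm(tgt_emb, axis=1, keepdims=True)
    sim = src @ tgt.T
    s2t = sim.argmax(axis=1)  # best target for each source entity
    t2s = sim.argmax(axis=0)  # best source for each target entity
    # keep only pairs where the preference is mutual
    return [(i, int(j)) for i, j in enumerate(s2t) if t2s[j] == i]

# toy check: identical embeddings align one-to-one
emb = np.eye(4)
print(mutual_nearest_pairs(emb, emb))  # [(0, 0), (1, 1), (2, 2), (3, 3)]
```

The mutual-nearest constraint keeps the pseudo seeds high-precision, which matters because wrong seeds compound over iterations.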
Results (w/o surface & non-iterative), as reported in UMAEA. We modified part of MSNEA so that it uses only the attribute types themselves rather than the content of attribute values (see the issues for details):
Method | DBP15K_ZH | DBP15K_JA | DBP15K_FR
---|---|---|---
MSNEA | .609 | .541 | .557
EVA | .683 | .669 | .686
MCLEA | .726 | .719 | .719
MEAformer | .772 | .764 | .771
UMAEA | .800 | .801 | .818
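For reference, Hits@k scores like those above are typically computed from a cross-KG similarity matrix over the test pairs. This is a generic sketch, not the repo's evaluation code:

```python
import numpy as np

def hits_at_k(sim: np.ndarray, k: int = 1) -> float:
    """sim[i, j]: similarity between test source i and candidate target j;
    the gold match for source i is target i (aligned test ordering assumed)."""
    gold = sim[np.arange(len(sim)), np.arange(len(sim))]
    # rank of the gold target = number of candidates scoring strictly higher
    ranks = (sim > gold[:, None]).sum(axis=1)
    return float((ranks < k).mean())

sim = np.array([[0.9, 0.1, 0.2],
                [0.3, 0.8, 0.1],
                [0.5, 0.6, 0.4]])  # row 2's gold score (0.4) is outranked twice
print(hits_at_k(sim, k=1))  # 0.6666666666666666 — 2 of 3 gold targets ranked first
```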
**Note**: Download the data from Google Drive (1.26 GB) and unzip it so that the files match the following hierarchy:
ROOT
├── data
│   └── mmkg
└── code
    └── MEAformer
Code path:
MEAformer
├── config.py
├── main.py
├── requirement.txt
├── run_meaformer.sh
├── run_meaformer_il.sh
├── run.sh
├── model
│   ├── __init__.py
│   ├── layers.py
│   ├── MEAformer_loss.py
│   ├── MEAformer.py
│   ├── MEAformer_tools.py
│   └── Tool_model.py
├── src
│   ├── __init__.py
│   ├── distributed_utils.py
│   ├── data.py
│   └── utils.py
└── torchlight
    ├── __init__.py
    ├── logger.py
    ├── metric.py
    └── utils.py
Data path:
mmkg
├── DBP15K
│   ├── fr_en
│   │   ├── ent_ids_1
│   │   ├── ent_ids_2
│   │   ├── ill_ent_ids
│   │   ├── training_attrs_1
│   │   ├── training_attrs_2
│   │   ├── triples_1
│   │   └── triples_2
│   ├── ja_en
│   │   ├── ent_ids_1
│   │   ├── ent_ids_2
│   │   ├── ill_ent_ids
│   │   ├── training_attrs_1
│   │   ├── training_attrs_2
│   │   ├── triples_1
│   │   └── triples_2
│   ├── translated_ent_name
│   │   ├── dbp_fr_en.json
│   │   ├── dbp_ja_en.json
│   │   └── dbp_zh_en.json
│   └── zh_en
│       ├── ent_ids_1
│       ├── ent_ids_2
│       ├── ill_ent_ids
│       ├── training_attrs_1
│       ├── training_attrs_2
│       ├── triples_1
│       └── triples_2
├── FBDB15K
│   └── norm
│       ├── ent_ids_1
│       ├── ent_ids_2
│       ├── ill_ent_ids
│       ├── training_attrs_1
│       ├── training_attrs_2
│       ├── triples_1
│       └── triples_2
├── FBYG15K
│   └── norm
│       ├── ent_ids_1
│       ├── ent_ids_2
│       ├── ill_ent_ids
│       ├── training_attrs_1
│       ├── training_attrs_2
│       ├── triples_1
│       └── triples_2
├── embedding
│   └── glove.6B.300d.txt
├── pkls
│   ├── dbpedia_wikidata_15k_dense_GA_id_img_feature_dict.pkl
│   ├── dbpedia_wikidata_15k_norm_GA_id_img_feature_dict.pkl
│   ├── FBDB15K_id_img_feature_dict.pkl
│   ├── FBYG15K_id_img_feature_dict.pkl
│   ├── fr_en_GA_id_img_feature_dict.pkl
│   ├── ja_en_GA_id_img_feature_dict.pkl
│   └── zh_en_GA_id_img_feature_dict.pkl
├── MEAformer
└── dump
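The `.pkl` files under `pkls` map entity ids to precomputed visual features. Loading and indexing them presumably looks like the sketch below — here with a stand-in dict, since the exact key type and feature dimension of the real files are assumptions:

```python
import os
import pickle
import tempfile

import numpy as np

# stand-in for a real *_id_img_feature_dict.pkl: {entity_id: visual feature vector}
dummy = {0: np.ones(4, dtype=np.float32), 1: np.zeros(4, dtype=np.float32)}
path = os.path.join(tempfile.mkdtemp(), "id_img_feature_dict.pkl")
with open(path, "wb") as f:
    pickle.dump(dummy, f)

with open(path, "rb") as f:
    img_features = pickle.load(f)

# entities with missing images simply have no entry; fall back to a zero vector
dim = len(next(iter(img_features.values())))
feat = img_features.get(42, np.zeros(dim, dtype=np.float32))
print(sorted(img_features), feat.shape)  # [0, 1] (4,)
```

The zero-vector fallback matters because not every entity in these datasets has an associated image.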
Please consider citing this paper if you use the code or data from our work. Thanks a lot :)
@inproceedings{chen2023meaformer,
author = {Zhuo Chen and
Jiaoyan Chen and
Wen Zhang and
Lingbing Guo and
Yin Fang and
Yufeng Huang and
Yichi Zhang and
Yuxia Geng and
Jeff Z. Pan and
Wenting Song and
Huajun Chen},
title = {MEAformer: Multi-modal Entity Alignment Transformer for Meta Modality Hybrid},
booktitle = {{ACM} Multimedia},
publisher = {{ACM}},
year = {2023}
}
We appreciate MCLEA, MSNEA, EVA, MMEA and many other related works for their open-source contributions.