Training-Free-Graph-Matching

Source code of "Training Free Graph Neural Networks for Graph Matching"


README

This is the code for the paper "Training Free Graph Neural Networks for Graph Matching" (arXiv preprint).

Description

Training Free Graph Matching (TFGM) is a framework that boosts the performance of GNNs on graph matching without any training. This GitHub repository contains our example implementations of TFGM with the popular GraphSAGE, SplineCNN, and DGMC models. The idea is easy to implement, and you can also try TFGM with other GNNs.
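To illustrate the training-free idea, here is a minimal sketch in plain NumPy (not the repository's actual PyTorch Geometric implementation): node features are propagated through one randomly initialised, untrained layer, and nodes of the two graphs are then matched greedily by embedding similarity. The function name `tfgm_match` and all details are hypothetical simplifications; the paper's TFGM uses full GNNs such as GraphSAGE or SplineCNN and more careful assignment.

```python
import numpy as np

def tfgm_match(A1, X1, A2, X2, dim=16, seed=0):
    """Illustrative training-free matching sketch (NOT the repo's API).

    Propagates node features through a single randomly initialised
    (untrained) linear layer with ReLU, then matches each node of graph 1
    to its most similar node in graph 2 by cosine similarity.
    """
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X1.shape[1], dim))  # untrained weights

    def embed(A, X):
        A_hat = A + np.eye(A.shape[0])           # add self-loops
        H = np.maximum(A_hat @ X @ W, 0.0)       # one random GNN-style layer
        return H / (np.linalg.norm(H, axis=1, keepdims=True) + 1e-12)

    S = embed(A1, X1) @ embed(A2, X2).T          # cosine similarity matrix
    return S.argmax(axis=1)                      # greedy node assignment
```

Because the layer is never trained, matching quality comes entirely from the graph structure and input features that the propagation mixes together; the repository's scripts below realise the same principle with real GNN architectures.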

Dependencies

  1. PyTorch
  2. PyTorch Geometric 1.7.0

Datasets

Download the DBP15k, DWY100k, and PPI datasets from this OneDrive link. Unzip the file into the data folder.

PascalVOC is downloaded automatically when running the code.

Reproduce Results in Paper

PascalVOC

>> python pascal.py --use_splinecnn --use_knn --use_dgmc --gpu_id 0

DBP15k

>> python dbp15k.py --dataset zh_en --use_char_embedding --use_dgmc --symmetric_align --use_supervision --weight_free --gpu_id 0 ## Chinese-English KG pair
>> python dbp15k.py --dataset ja_en --use_char_embedding --use_dgmc --symmetric_align --use_supervision --weight_free --gpu_id 0 ## Japanese-English KG pair
>> python dbp15k.py --dataset fr_en --use_char_embedding --use_dgmc --symmetric_align --use_supervision --weight_free --gpu_id 0 ## French-English KG pair

PPI

>> python ppi.py --dataset extra_edge --use_dgmc --num_steps 100  --weight_free --rnd_dim 128 --gpu_id 0  ## Low-Conf Edge dataset
>> python ppi.py --dataset rewirement --use_dgmc --num_steps 100  --weight_free --rnd_dim 128 --gpu_id 0  ## Random Rewirement dataset

If you have any questions about running the code, please feel free to open a GitHub issue.

Reference

If you use our code, please cite our paper:

@article{liu2022training,
  title={Training Free Graph Neural Networks for Graph Matching},
  author={Liu, Zhiyuan and Cao, Yixin and Feng, Fuli and Wang, Xiang and Tang, Jie and Kawaguchi, Kenji and Chua, Tat-Seng},
  journal={arXiv preprint arXiv:2201.05349},
  year={2022}
}