Code for the NeurIPS 2023 paper "Latent Graph Inference with Limited Supervision".
The Cora, Citeseer, and Pubmed datasets can be downloaded from here. Please place the downloaded files in the folder data_tf. The ogbn-arxiv dataset will be loaded automatically.
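For reference, the following is a minimal sketch of how ogbn-arxiv can be fetched through the ogb package; it is illustrative only, and the exact loading logic used in this repository may differ.

# Illustrative sketch: loading ogbn-arxiv via the ogb package (downloads on first use).
from ogb.nodeproppred import DglNodePropPredDataset

dataset = DglNodePropPredDataset(name='ogbn-arxiv')
graph, labels = dataset[0]            # DGL graph and node labels
split_idx = dataset.get_idx_split()   # 'train' / 'valid' / 'test' node indices
print(graph.num_nodes(), labels.shape, split_idx['train'].shape)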
conda create -n LGI python=3.7.2
conda activate LGI
pip install torch==1.5.1 torchvision==0.6.1
pip install scipy==1.2.1
pip install scikit-learn==0.21.3
pip install dgl-cu102==0.5.2
pip install ogb==1.2.3
wget https://data.pyg.org/whl/torch-1.5.0%2Bcu102/torch_scatter-2.0.5-cp37-cp37m-linux_x86_64.whl
wget https://data.pyg.org/whl/torch-1.5.0%2Bcu102/torch_sparse-0.6.5-cp37-cp37m-linux_x86_64.whl
wget https://data.pyg.org/whl/torch-1.5.0%2Bcu102/torch_cluster-1.5.4-cp37-cp37m-linux_x86_64.whl
wget https://data.pyg.org/whl/torch-1.5.0%2Bcu102/torch_spline_conv-1.2.0-cp37-cp37m-linux_x86_64.whl
pip install torch_scatter-2.0.5-cp37-cp37m-linux_x86_64.whl
pip install torch_sparse-0.6.5-cp37-cp37m-linux_x86_64.whl
pip install torch_cluster-1.5.4-cp37-cp37m-linux_x86_64.whl
pip install torch_spline_conv-1.2.0-cp37-cp37m-linux_x86_64.whl
pip install torch-geometric==1.6.1
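After installation, a quick sanity check along these lines can confirm that the packages installed above import correctly and that the cu102 builds see a GPU:

# Sanity check for the LGI environment created above.
import torch, dgl, torch_geometric, ogb
print(torch.__version__)            # expected: 1.5.1
print(torch.cuda.is_available())    # True if the cu102 build detects a GPU
print(dgl.__version__)              # expected: 0.5.2
print(torch_geometric.__version__)  # expected: 1.6.1
print(ogb.__version__)              # expected: 1.2.3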
We provide GCN+KNN, GCN+KNN_U, and GCN+KNN_R as examples due to their simplicity and effectiveness. To test their performance on the Pubmed dataset, run the following command:
bash experiments.sh
The experimental results will be saved in the corresponding *.txt file.
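As a rough illustration of the idea behind the GCN+KNN baselines (not the exact code in this repository), the latent graph can be initialized as a kNN graph over node features, for example with scikit-learn, and then fed to a GCN; the value of k and the cosine metric below are assumptions, not the paper's settings.

# Rough sketch of kNN-graph initialization (illustrative only; k and the metric are assumptions).
import numpy as np
from sklearn.neighbors import kneighbors_graph

def knn_adjacency(features: np.ndarray, k: int = 10) -> np.ndarray:
    """Build a symmetric kNN adjacency matrix from node features."""
    adj = kneighbors_graph(features, n_neighbors=k, metric='cosine', mode='connectivity')
    adj = adj.toarray()
    return np.maximum(adj, adj.T)  # symmetrize so edges are undirected

# Example: random features for 100 nodes with 32 dimensions
A = knn_adjacency(np.random.rand(100, 32), k=10)
print(A.shape, int(A.sum()))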
If you find our work helpful, please consider citing the following papers:
@inproceedings{Jianglin2023LGI,
title={Latent Graph Inference with Limited Supervision},
author={Lu, Jianglin and Xu, Yi and Wang, Huan and Bai, Yue and Fu, Yun},
booktitle={Advances in Neural Information Processing Systems},
year={2023}
}
@inproceedings{fatemi2021slaps,
title={SLAPS: Self-Supervision Improves Structure Learning for Graph Neural Networks},
author={Fatemi, Bahare and Asri, Layla El and Kazemi, Seyed Mehran},
booktitle={Advances in Neural Information Processing Systems},
year={2021}
}
Our code is mainly based on SLAPS. For other comparison methods, please refer to their publicly available code repositories. We sincerely thank the authors for their contributions.