Here are the complete code and datasets that I used for "Revisiting Over-smoothing in Deep GCNs". The code is organized into folders according to the experimental sections of the paper; figure and section numbers below refer to the two-column version included in this repo, whose figure numbering differs slightly from the arXiv version. Please contact me at chaoqiy2@illinois.edu if you have any questions.
data/
- The data folder for Cora, Citeseer, and Pubmed.

Karate_Demo.ipynb
- Data, model, and visualization code for Figure 1.

Karate_Demo2.ipynb
- Data, model, and visualization code for Figure 2.
mean-subtraction/ (code to reproduce Experiment Section 5.2, Mean-subtraction for GCNs; a minimal sketch of the mean-subtraction operation itself is given after this section)
- Instructions
  - Before training, run mkdir cora, mkdir citeseer, and mkdir pubmed to create the result folders for Cora, Citeseer, and Pubmed.
  - Run python train2.py or python train.py to obtain the experimental results over 20 rounds.
  - Move the three result folders to Result-and-Vis/.
- Result-and-Vis/
  - The results reported in the paper (Cora, Citeseer, Pubmed) are already included; you can also reproduce them by running train2.py or train.py.
  - Run mean-subtraction-vis.ipynb to generate the figures.
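For readers unfamiliar with the operation, here is a minimal sketch of mean-subtraction, assuming it simply means centering the node representations by subtracting the per-feature mean across all nodes. The function name and the usage comment are illustrative and do not come from train.py or train2.py.

```python
import torch

def mean_subtraction(h):
    """Center node representations: subtract the per-feature mean over all nodes.

    h: (num_nodes, num_features) tensor of node representations.
    """
    return h - h.mean(dim=0, keepdim=True)

# Illustrative use inside a GCN forward pass (adj_norm = normalized adjacency,
# weight = layer parameter matrix):
#   h = torch.relu(adj_norm @ (mean_subtraction(h) @ weight))
```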
neighbor-aggregation-weight/ (code to reproduce Experiment Section 5.3, Weight of Neighborhood Aggregation in GCNs; see the sketch after this section for what the aggregation weight refers to)
- Instructions
  - Before training, run mkdir cora, mkdir citeseer, and mkdir pubmed to create the result folders for Cora, Citeseer, and Pubmed.
  - Run python train.py --dataset [name of the dataset] to obtain the experimental results over 20 rounds.
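To make the quantity studied here concrete, the following is a minimal sketch of one common way to expose a neighborhood-aggregation weight: a scalar on the self-loop in the symmetrically normalized adjacency, so that larger values make a node rely more on its own features and less on its neighbors'. The parameterization, the function, and the name self_weight are illustrative assumptions, not the exact form used in train.py.

```python
import numpy as np

def normalized_adjacency(adj, self_weight=1.0):
    """Symmetrically normalize (A + self_weight * I).

    adj: (num_nodes, num_nodes) dense adjacency matrix.
    self_weight: how strongly each node weighs its own features relative to its
                 neighbors' during aggregation (1.0 recovers the usual GCN self-loop).
    """
    a_hat = adj + self_weight * np.eye(adj.shape[0])
    deg_inv_sqrt = np.diag(1.0 / np.sqrt(a_hat.sum(axis=1)))
    return deg_inv_sqrt @ a_hat @ deg_inv_sqrt

# One propagation step: each node mixes its own features with its neighbors'.
#   h_next = normalized_adjacency(adj, self_weight=2.0) @ h
```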
performace-depth-oversmooth/ (code to reproduce Experiment Section 5.1, Overfitting in Deep GCNs, Part I)
- ATTENTION: the code here covers only the GCN part. To generate the SGC results, remove the residual connections, non-linear activations, and layer-wise parameter matrices between GCN layers (see the sketch after this section).
- Instructions
  - Before training, run mkdir cora, mkdir citeseer, and mkdir pubmed to create the result folders for Cora, Citeseer, and Pubmed.
  - Run python train.py to obtain the experimental results over 20 rounds.
  - Move the three result folders to Result-and-Vis/.
- Result-and-Vis/
  - The results reported in the paper (Cora, Citeseer, Pubmed) are already included; you can also reproduce them by running train.py.
  - Run three-set-running-vis.ipynb to generate the figures.
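As a rough illustration of the ATTENTION note above, the sketch below contrasts a plain deep GCN forward pass with the SGC-style variant obtained by dropping the non-linear activations and per-layer weight matrices (and any residual connections), leaving only repeated propagation followed by a single linear map. The class names and layout are illustrative and do not correspond to the actual code in train.py.

```python
import torch
import torch.nn as nn

class DeepGCN(nn.Module):
    """K-layer GCN: every layer propagates, applies its own weight matrix, and a ReLU."""
    def __init__(self, in_dim, hid_dim, out_dim, k):
        super().__init__()
        dims = [in_dim] + [hid_dim] * (k - 1) + [out_dim]
        self.layers = nn.ModuleList(
            nn.Linear(dims[i], dims[i + 1], bias=False) for i in range(k)
        )

    def forward(self, adj_norm, x):
        h = x
        for layer in self.layers[:-1]:
            h = torch.relu(layer(adj_norm @ h))   # propagate, transform, activate
        return self.layers[-1](adj_norm @ h)      # no activation on the last layer

class SGC(nn.Module):
    """SGC-style variant: K propagation steps, then one shared linear map."""
    def __init__(self, in_dim, out_dim, k):
        super().__init__()
        self.k = k
        self.linear = nn.Linear(in_dim, out_dim, bias=False)

    def forward(self, adj_norm, x):
        h = x
        for _ in range(self.k):
            h = adj_norm @ h                      # propagation only, no non-linearity
        return self.linear(h)
```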
performace-depth2-loss-function/ (code to reproduce Experiment Section 5.1, Overfitting in Deep GCNs, Part II)
- Instructions
  - Before training, run mkdir cora, mkdir citeseer, and mkdir pubmed to create the result folders for Cora, Citeseer, and Pubmed.
  - Run python train.py to obtain the experimental results over 20 rounds.
  - Move the three result folders to Result-and-Vis/.
- Result-and-Vis/
  - The results reported in the paper (Cora, Citeseer, Pubmed) are already included; you can also reproduce them by running train.py.
  - Run overfitting-vis.ipynb to generate the figures.
- The same mkdir / train / move workflow applies to every experiment folder above; an optional automation sketch follows this section.
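Since each experiment folder uses the same three-step workflow (create the result folders, run the training script, move the results to Result-and-Vis/), here is an optional convenience sketch that automates it. The folder and script names are taken from the instructions above; the wrapper itself is just one possible way to do it and is not part of the repository.

```python
import shutil
import subprocess
from pathlib import Path

DATASETS = ["cora", "citeseer", "pubmed"]

def reproduce(script="train.py", dest="Result-and-Vis"):
    """Create result folders, run the training script, then hand results to the vis notebook."""
    for name in DATASETS:
        Path(name).mkdir(exist_ok=True)             # same as `mkdir cora`, etc.
    subprocess.run(["python", script], check=True)  # produces results for 20 rounds
    for name in DATASETS:
        shutil.move(name, str(Path(dest) / name))   # move result folders to Result-and-Vis/

if __name__ == "__main__":
    reproduce()
```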
If you find our paper and code useful, please use the following citation. Contact chaoqiy2@illinois.edu with any questions.
@article{yang2020revisiting,
title={Revisiting over-smoothing in deep GCNs},
author={Yang, Chaoqi and Wang, Ruijie and Yao, Shuochao and Liu, Shengzhong and Abdelzaher, Tarek},
journal={arXiv preprint arXiv:2003.13663},
year={2020}
}