cGCN_fMRI

Graph Convolution Network for fMRI Analysis Based on Connectivity Neighborhood

Here is the paper. Please feel free to email us with any questions.

Overview

We propose a connectivity-based graph convolution network (cGCN) architecture for fMRI analysis. fMRI data are represented as a k-nearest-neighbors graph built from group-level functional connectivity, and spatial features are extracted from connectomic neighborhoods through graph convolutions. We demonstrate cGCN in two scenarios with improved classification accuracy: individual identification on the HCP dataset and classification of patients versus healthy controls on the ABIDE dataset. cGCN operates on graph-represented data and can be extended to fMRI data in other representations, providing a promising deep learning architecture for fMRI analysis.
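
For illustration, the k-NN graph can be derived from a group-averaged functional connectivity matrix roughly as in the minimal numpy sketch below. The function and array names are illustrative assumptions; the actual preprocessing in this repository may differ.

    import numpy as np

    def knn_graph_from_group_fc(fc_subjects, k=5):
        """Build a k-nearest-neighbors graph from subject-wise FC matrices.

        fc_subjects: array of shape (n_subjects, n_rois, n_rois) holding
                     functional connectivity (e.g., Pearson correlation) matrices.
        Returns an index array of shape (n_rois, k) giving, for each ROI,
        the k ROIs with the strongest group-level connectivity.
        """
        group_fc = fc_subjects.mean(axis=0)        # group-averaged connectivity
        np.fill_diagonal(group_fc, -np.inf)        # exclude self-connections
        # for each ROI, pick the k most strongly connected ROIs
        neighbors = np.argsort(-group_fc, axis=1)[:, :k]
        return neighbors

    # toy usage: 10 subjects, 90 ROIs
    fc = np.random.rand(10, 90, 90)
    fc = (fc + fc.transpose(0, 2, 1)) / 2          # symmetrize
    print(knn_graph_from_group_fc(fc, k=5).shape)  # (90, 5)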

The architecture of cGCN is shown below:

Architecture of cGCN
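
For reference, the dgcnn-style graph convolution at the core of cGCN aggregates features over each node's k-NN neighborhood roughly as in the numpy sketch below. This is an illustration of the operation under assumed shapes and names, not the Keras implementation used in this repository.

    import numpy as np

    def edgeconv_layer(x, neighbors, weight, bias):
        """One EdgeConv-style aggregation step (numpy illustration).

        x:         node features, shape (n_nodes, c_in)
        neighbors: k-NN indices, shape (n_nodes, k), e.g., from the group FC graph
        weight:    shared linear weights, shape (2 * c_in, c_out)
        bias:      shape (c_out,)
        Returns node features of shape (n_nodes, c_out).
        """
        n, k = neighbors.shape
        center = np.repeat(x[:, None, :], k, axis=1)              # (n, k, c_in)
        neigh = x[neighbors]                                       # (n, k, c_in)
        # edge features: central node concatenated with (neighbor - center)
        edge = np.concatenate([center, neigh - center], axis=-1)   # (n, k, 2*c_in)
        h = np.maximum(edge @ weight + bias, 0.0)                  # shared MLP + ReLU
        return h.max(axis=1)                                       # max-pool over neighbors

    # toy usage with a 90-ROI graph and k = 5 neighbors
    x = np.random.rand(90, 16)
    nbrs = np.random.randint(0, 90, size=(90, 5))
    w = np.random.rand(32, 8); b = np.zeros(8)
    print(edgeconv_layer(x, nbrs, w, b).shape)  # (90, 8)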

How to run/test

HCP dataset

You can easily try it out on Colab, or run the notebook locally.

Alternatively, set up the environment (see Dependencies) and download the data:

Then you can run run_HCP.py.

ABIDE dataset

Please set up the environment (see Dependencies) and download the data:

Then you can run run_ABIDE_leave_one_site_out.py and run_ABIDE_10_fold.py.
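
The preprocessed ABIDE data (as used by Heinsfeld et al.) can be downloaded with nilearn's fetcher. The sketch below shows one way to do this; the pipeline and derivative choices are assumptions for illustration, so check the run scripts for the exact inputs they expect.

    from nilearn import datasets

    # Fetch ABIDE Preprocessed data (CPAC pipeline, CC200 ROI time series).
    # These options are illustrative assumptions, not necessarily the exact
    # configuration used by the scripts in this repository.
    abide = datasets.fetch_abide_pcp(
        data_dir='./data/ABIDE',
        pipeline='cpac',
        derivatives=['rois_cc200'],
        quality_checked=True,
    )
    print(len(abide.rois_cc200), 'subjects downloaded')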

Results

HCP dataset

cGCN was tested and compared with ConvRNN [Wang, Lebo, et al. "Application of convolutional recurrent neural network for individual recognition based on resting-state fMRI data." Frontiers in Neuroscience 13 (2019): 434]. The relation between accuracy and the number of input frames (chance level is 1%) is shown below:

HCP

ABIDE dataset

cGCN was tested and compared with a DNN [Heinsfeld, Anibal Sólon, et al. "Identification of autism spectrum disorder using deep learning and the ABIDE dataset." NeuroImage: Clinical 17 (2018): 16-23]. Both leave-one-site-out and 10-fold cross-validation were evaluated and compared, as shown below:

ABIDE 1

For leave-one-site-out cross-validation, the relation between accuracy and the number of input frames is shown below:

ABIDE 2

Dependencies

  • keras=2.1.5
  • tensorflow=1.4.1
  • h5py=2.8.0
  • nilearn=0.5.0
  • numpy=1.15.4

Acknowledgement

Some code is borrowed from dgcnn.