Graph Attention Networks

This repository implements the "Graph Attention Networks" paper by Petar Veličković et al. (ICLR 2018):
https://arxiv.org/pdf/1710.10903.pdf
Graph Attention Networks introduce an attention-based architecture for node classification on graph-structured data, with applications in areas such as social network analysis and bioinformatics.
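For orientation, below is a minimal sketch of a single GAT attention head in PyTorch, following the formulation in the paper. The names (GATHead, W, a, adj) are illustrative and are not taken from this repository's code:

import torch
import torch.nn as nn
import torch.nn.functional as F

class GATHead(nn.Module):
    """One attention head of a GAT layer (Veličković et al., 2018)."""
    def __init__(self, in_features, out_features, leaky_slope=0.2):
        super().__init__()
        self.W = nn.Linear(in_features, out_features, bias=False)  # shared linear map
        self.a = nn.Parameter(torch.empty(2 * out_features))       # attention vector
        nn.init.xavier_uniform_(self.W.weight)
        nn.init.normal_(self.a, std=0.1)
        self.leaky_relu = nn.LeakyReLU(leaky_slope)

    def forward(self, h, adj):
        # h: (N, in_features) node features
        # adj: (N, N) 0/1 adjacency matrix, assumed to include self-loops
        Wh = self.W(h)                                      # (N, F')
        # e_ij = LeakyReLU(a^T [Wh_i || Wh_j]), computed for all pairs at once
        e = self.leaky_relu(
            Wh @ self.a[: Wh.size(1)].unsqueeze(1)          # (N, 1): a_1^T Wh_i
            + (Wh @ self.a[Wh.size(1):].unsqueeze(1)).T     # (1, N): a_2^T Wh_j
        )
        e = e.masked_fill(adj == 0, float("-inf"))          # attend only to neighbors
        alpha = F.softmax(e, dim=1)                         # normalize over each node's neighborhood
        return alpha @ Wh                                   # attention-weighted aggregation

A full GAT layer runs several such heads in parallel and concatenates (or averages, in the output layer) their results.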

Installation and Configurations

To get started with this project, please ensure you have all the necessary libraries installed. Refer to the requirements.txt file for a complete list. You can install these packages using the command:

pip install -r requirements.txt

Customize your training and model parameters in config.py. This file includes settings for learning rate, number of epochs, and other model-specific configurations that you can tweak according to your needs.
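As a rough illustration, a configuration along the following lines would match the hyperparameters reported in the GAT paper; the actual names and defaults in config.py may differ:

# Illustrative only: names and values in the real config.py may differ.
LEARNING_RATE = 0.005   # Adam learning rate used in the paper
NUM_EPOCHS = 1000       # matches the run length mentioned under Results
HIDDEN_UNITS = 8        # features per attention head in the first layer
NUM_HEADS = 8           # attention heads in the first layer
DROPOUT = 0.6           # applied to inputs and attention coefficients
WEIGHT_DECAY = 5e-4     # L2 regularization
DATASET = "cora"        # one of: cora, citeseer, pubmed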

All datasets are citation networks from the paper https://arxiv.org/pdf/1603.08861.pdf:

  • Cora dataset: 2708 nodes (papers), each with 1433 features, classified into 7 classes such as "Reinforcement_Learning" and "Neural_Networks".
  • CiteSeer dataset: 3327 nodes (papers), each with 3703 features, classified into 6 classes such as "Artificial_Intelligence" and "Machine_Learning".
  • PubMed dataset: 19717 nodes (papers), each with 500 features, classified into 3 classes related to Diabetes Mellitus.
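As one way to inspect these datasets, the sketch below uses PyTorch Geometric's Planetoid wrapper, which ships the same citation networks; this repository's own data loading may be implemented differently:

from torch_geometric.datasets import Planetoid

dataset = Planetoid(root="data", name="Cora")   # also "CiteSeer" or "PubMed"
data = dataset[0]                               # the single graph in the dataset
print(data.num_nodes, dataset.num_features, dataset.num_classes)  # 2708 1433 7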

Getting Started

To train the model, navigate to the 'training' directory and run 'main.py':

cd training
python main.py

Results

Training takes less than a minute to complete 1000 epochs. After training, the final accuracy of the model on each dataset is reported. For instance:
  • Cora: last train accuracy 0.879, last validation accuracy 0.442.
  • CiteSeer: last train accuracy 0.950, last validation accuracy 0.400.
  • PubMed: last train accuracy 1.000, last validation accuracy 0.606.
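For reference, node-classification accuracy on these datasets is typically computed over a boolean split mask, roughly as in the sketch below; the repository's reporting code may differ:

import torch

def masked_accuracy(logits, labels, mask):
    # logits: (N, C) class scores, labels: (N,) ground truth, mask: (N,) bool split mask
    preds = logits[mask].argmax(dim=1)
    return (preds == labels[mask]).float().mean().item()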