Documentation | Paper | Colab Notebooks | External Resources | OGB Examples
PyTorch Geometric (PyG) is a geometric deep learning extension library for PyTorch.
It consists of various methods for deep learning on graphs and other irregular structures, also known as geometric deep learning, from a variety of published papers. In addition, it provides an easy-to-use mini-batch loader for many small graphs and for single giant graphs, multi-GPU support, a large number of common benchmark datasets (based on simple interfaces to create your own), and helpful transforms, both for learning on arbitrary graphs as well as on 3D meshes or point clouds. Click here to join our Slack community!
OGB is hosting a large-scale graph machine learning challenge (OGB-LSC) at KDD Cup 2021 from March 15th to June 8th in order to discover innovative solutions for large-scale node classification, link prediction and graph regression. We are looking forward to your participation!
PyTorch Geometric makes implementing Graph Neural Networks a breeze (see here for the accompanying tutorial). For example, this is all it takes to implement the edge convolutional layer:
```python
import torch
from torch.nn import Sequential as Seq, Linear as Lin, ReLU
from torch_geometric.nn import MessagePassing

class EdgeConv(MessagePassing):
    def __init__(self, F_in, F_out):
        super(EdgeConv, self).__init__(aggr='max')  # "Max" aggregation.
        self.mlp = Seq(Lin(2 * F_in, F_out), ReLU(), Lin(F_out, F_out))

    def forward(self, x, edge_index):
        # x has shape [N, F_in]
        # edge_index has shape [2, E]
        return self.propagate(edge_index, x=x)  # shape [N, F_out]

    def message(self, x_i, x_j):
        # x_i has shape [E, F_in]
        # x_j has shape [E, F_in]
        edge_features = torch.cat([x_i, x_j - x_i], dim=1)  # shape [E, 2 * F_in]
        return self.mlp(edge_features)  # shape [E, F_out]
```
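The layer then behaves like any other PyTorch module. Below is a minimal usage sketch; the node features and `edge_index` are made-up toy values:

```python
# Hypothetical toy input: 4 nodes with 3 features each, 4 directed edges.
x = torch.randn(4, 3)                      # node feature matrix [N, F_in]
edge_index = torch.tensor([[0, 1, 2, 3],   # source nodes
                           [1, 0, 3, 2]])  # target nodes, shape [2, E]

conv = EdgeConv(F_in=3, F_out=8)
out = conv(x, edge_index)                  # shape [4, 8]
```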
In detail, the following methods are currently implemented (a short sketch showing how to compose such operators follows the list):
- SplineConv from Fey et al.: SplineCNN: Fast Geometric Deep Learning with Continuous B-Spline Kernels (CVPR 2018) [Example1, Example2]
- GCNConv from Kipf and Welling: Semi-Supervised Classification with Graph Convolutional Networks (ICLR 2017) [Example]
- GCN2Conv from Chen et al.: Simple and Deep Graph Convolutional Networks (ICML 2020) [Example1, Example2]
- ChebConv from Defferrard et al.: Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering (NIPS 2016) [Example]
- NNConv from Gilmer et al.: Neural Message Passing for Quantum Chemistry (ICML 2017) [Example1, Example2]
- CGConv from Xie and Grossman: Crystal Graph Convolutional Neural Networks for an Accurate and Interpretable Prediction of Material Properties (Physical Review Letters 120, 2018)
- ECConv from Simonovsky and Komodakis: Dynamic Edge-Conditioned Filters in Convolutional Neural Networks on Graphs (CVPR 2017)
- EGConv from Tailor et al.: Adaptive Filters and Aggregator Fusion for Efficient Graph Convolutions (GNNSys 2021) [Example]
- GATConv from Veličković et al.: Graph Attention Networks (ICLR 2018) [Example]
- GATv2Conv from Brody et al.: How Attentive are Graph Attention Networks? (CoRR 2021)
- TransformerConv from Shi et al.: Masked Label Prediction: Unified Message Passing Model for Semi-Supervised Classification (CoRR 2020)
- SAGEConv from Hamilton et al.: Inductive Representation Learning on Large Graphs (NIPS 2017) [Example1, Example2, Example3]
- GraphConv from, e.g., Morris et al.: Weisfeiler and Leman Go Neural: Higher-order Graph Neural Networks (AAAI 2019)
- GatedGraphConv from Li et al.: Gated Graph Sequence Neural Networks (ICLR 2016)
- ResGatedGraphConv from Bresson and Laurent: Residual Gated Graph ConvNets (CoRR 2017)
- GINConv from Xu et al.: How Powerful are Graph Neural Networks? (ICLR 2019) [Example]
- GINEConv from Hu et al.: Strategies for Pre-training Graph Neural Networks (ICLR 2020)
- ARMAConv from Bianchi et al.: Graph Neural Networks with Convolutional ARMA Filters (CoRR 2019) [Example]
- SGConv from Wu et al.: Simplifying Graph Convolutional Networks (CoRR 2019) [Example]
- APPNP from Klicpera et al.: Predict then Propagate: Graph Neural Networks meet Personalized PageRank (ICLR 2019) [Example]
- MFConv from Duvenaud et al.: Convolutional Networks on Graphs for Learning Molecular Fingerprints (NIPS 2015)
- AGNNConv from Thekumparampil et al.: Attention-based Graph Neural Network for Semi-Supervised Learning (CoRR 2017) [Example]
- TAGConv from Du et al.: Topology Adaptive Graph Convolutional Networks (CoRR 2017) [Example]
- PNAConv from Corso et al.: Principal Neighbourhood Aggregation for Graph Nets (CoRR 2020) [Example]
- FAConv from Bo et al.: Beyond Low-Frequency Information in Graph Convolutional Networks (AAAI 2021)
- RGCNConv from Schlichtkrull et al.: Modeling Relational Data with Graph Convolutional Networks (ESWC 2018) [Example]
- FiLMConv from Brockschmidt: GNN-FiLM: Graph Neural Networks with Feature-wise Linear Modulation (ICML 2020) [Example]
- SignedConv from Derr et al.: Signed Graph Convolutional Network (ICDM 2018) [Example]
- DNAConv from Fey: Just Jump: Dynamic Neighborhood Aggregation in Graph Neural Networks (ICLR-W 2019) [Example]
- PANConv from Ma et al.: Path Integral Based Convolution and Pooling for Graph Neural Networks (NeurIPS 2020)
- PointConv (including Iterative Farthest Point Sampling, dynamic graph generation based on nearest neighbor or maximum distance, and k-NN interpolation for upsampling) from Qi et al.: PointNet: Deep Learning on Point Sets for 3D Classification and Segmentation (CVPR 2017) and PointNet++: Deep Hierarchical Feature Learning on Point Sets in a Metric Space (NIPS 2017) [Example1, Example2]
- EdgeConv from Wang et al.: Dynamic Graph CNN for Learning on Point Clouds (CoRR 2018) [Example1, Example2]
- XConv from Li et al.: PointCNN: Convolution On X-Transformed Points (NeurIPS 2018) [Example]
- PPFConv from Deng et al.: PPFNet: Global Context Aware Local Features for Robust 3D Point Matching (CVPR 2018)
- GMMConv from Monti et al.: Geometric Deep Learning on Graphs and Manifolds using Mixture Model CNNs (CVPR 2017)
- FeaStConv from Verma et al.: FeaStNet: Feature-Steered Graph Convolutions for 3D Shape Analysis (CVPR 2018)
- HypergraphConv from Bai et al.: Hypergraph Convolution and Hypergraph Attention (CoRR 2019)
- GravNetConv from Qasim et al.: Learning Representations of Irregular Particle-detector Geometry with Distance-weighted Graph Networks (European Physics Journal C, 2019)
- SuperGAT from Kim and Oh: How To Find Your Friendly Neighborhood: Graph Attention Design With Self-Supervision (ICLR 2021) [Example]
- A MetaLayer for building any kind of graph network similar to the TensorFlow Graph Nets library from Battaglia et al.: Relational Inductive Biases, Deep Learning, and Graph Networks (CoRR 2018)
- GlobalAttention from Li et al.: Gated Graph Sequence Neural Networks (ICLR 2016) [Example]
- Set2Set from Vinyals et al.: Order Matters: Sequence to Sequence for Sets (ICLR 2016) [Example]
- Sort Pool from Zhang et al.: An End-to-End Deep Learning Architecture for Graph Classification (AAAI 2018) [Example]
- Dense Differentiable Pooling from Ying et al.: Hierarchical Graph Representation Learning with Differentiable Pooling (NeurIPS 2018) [Example]
- Dense MinCUT Pooling from Bianchi et al.: MinCUT Pooling in Graph Neural Networks (CoRR 2019) [Example]
- Graclus Pooling from Dhillon et al.: Weighted Graph Cuts without Eigenvectors: A Multilevel Approach (PAMI 2007) [Example]
- Voxel Grid Pooling from, e.g., Simonovsky and Komodakis: Dynamic Edge-Conditioned Filters in Convolutional Neural Networks on Graphs (CVPR 2017) [Example]
- Top-K Pooling from Gao and Ji: Graph U-Nets (ICML 2019), Cangea et al.: Towards Sparse Hierarchical Graph Classifiers (NeurIPS-W 2018) and Knyazev et al.: Understanding Attention and Generalization in Graph Neural Networks (ICLR-W 2019) [Example]
- SAG Pooling from Lee et al.: Self-Attention Graph Pooling (ICML 2019) and Knyazev et al.: Understanding Attention and Generalization in Graph Neural Networks (ICLR-W 2019) [Example]
- Edge Pooling from Diehl et al.: Towards Graph Pooling by Edge Contraction (ICML-W 2019) and Diehl: Edge Contraction Pooling for Graph Neural Networks (CoRR 2019) [Example]
- ASAPooling from Ranjan et al.: ASAP: Adaptive Structure Aware Pooling for Learning Hierarchical Graph Representations (AAAI 2020) [Example]
- PANPooling from Ma et al.: Path Integral Based Convolution and Pooling for Graph Neural Networks (NeurIPS 2020)
- MemPooling from Khasahmadi et al.: Memory-Based Graph Networks (ICLR 2020)
- Local Degree Profile from Cai and Wang: A Simple yet Effective Baseline for Non-attribute Graph Classification (CoRR 2018)
- Jumping Knowledge from Xu et al.: Representation Learning on Graphs with Jumping Knowledge Networks (ICML 2018) [Example]
- Node2Vec from Grover and Leskovec: node2vec: Scalable Feature Learning for Networks (KDD 2016) [Example]
- MetaPath2Vec from Dong et al.: metapath2vec: Scalable Representation Learning for Heterogeneous Networks (KDD 2017) [Example]
- Deep Graph Infomax from Veličković et al.: Deep Graph Infomax (ICLR 2019) [Example1, Example2]
- All variants of Graph Autoencoders and Variational Autoencoders from:
  - Variational Graph Auto-Encoders from Kipf and Welling (NIPS-W 2016) [Example]
  - Adversarially Regularized Graph Autoencoder for Graph Embedding from Pan et al. (IJCAI 2018) [Example]
  - Simple and Effective Graph Autoencoders with One-Hop Linear Models from Salha et al. (ECML 2020) [Example]
- SEAL from Zhang and Chen: Link Prediction Based on Graph Neural Networks (NeurIPS 2018)
- RENet from Jin et al.: Recurrent Event Network for Reasoning over Temporal Knowledge Graphs (ICLR-W 2019) [Example]
- GraphUNet from Gao and Ji: Graph U-Nets (ICML 2019) [Example]
- SchNet from Schütt et al.: SchNet: A Continuous-filter Convolutional Neural Network for Modeling Quantum Interactions (NIPS 2017) [Example]
- DimeNet from Klicpera et al.: Directional Message Passing for Molecular Graphs (ICLR 2020) [Example]
- AttentiveFP from Xiong et al.: Pushing the Boundaries of Molecular Representation for Drug Discovery with the Graph Attention Mechanism (J. Med. Chem. 2020) [Example]
- DeepGCN and the GENConv from Li et al.: DeepGCNs: Can GCNs Go as Deep as CNNs? (ICCV 2019) and DeeperGCN: All You Need to Train Deeper GCNs (CoRR 2020) [Example]
- NeighborSampler from Hamilton et al.: Inductive Representation Learning on Large Graphs (NIPS 2017) [Example1, Example2, Example3]
- ClusterGCN from Chiang et al.: Cluster-GCN: An Efficient Algorithm for Training Deep and Large Graph Convolutional Networks (KDD 2019) [Example1, Example2]
- GraphSAINT from Zeng et al.: GraphSAINT: Graph Sampling Based Inductive Learning Method (ICLR 2020) [Example]
- ShaDow from Zeng et al.: Deep Graph Neural Networks with Shallow Subgraph Samplers (CoRR 2020)
- GDC from Klicpera et al.: Diffusion Improves Graph Learning (NeurIPS 2019) [Example]
- SIGN from Rossi et al.: SIGN: Scalable Inception Graph Neural Networks (CoRR 2020) [Example]
- GNNExplainer from Ying et al.: GNNExplainer: Generating Explanations for Graph Neural Networks (NeurIPS 2019) [Example]
- DropEdge from Rong et al.: DropEdge: Towards Deep Graph Convolutional Networks on Node Classification (ICLR 2020)
- GraphNorm from Cai et al.: GraphNorm: A Principled Approach to Accelerating Graph Neural Network Training (CoRR 2020)
- GraphSizeNorm from Dwivedi et al.: Benchmarking Graph Neural Networks (CoRR 2020)
- PairNorm from Zhao and Akoglu: PairNorm: Tackling Oversmoothing in GNNs (ICLR 2020)
- DiffGroupNorm from Zhou et al.: Towards Deeper Graph Neural Networks with Differentiable Group Normalization (NeurIPS 2020)
- Tree Decomposition from Jin et al.: Junction Tree Variational Autoencoder for Molecular Graph Generation (ICML 2018)
- TGN from Rossi et al.: Temporal Graph Networks for Deep Learning on Dynamic Graphs (GRL+ 2020) [Example]
- Weisfeiler Lehman Algorithm from Weisfeiler and Lehman: A Reduction of a Graph to a Canonical Form and an Algebra Arising During this Reduction (Nauchno-Technicheskaya Informatsia 1968) [Example]
- Label Propagation from Zhu and Ghahramani: Learning from Labeled and Unlabeled Data with Label Propagation (CMU-CALD 2002) [Example]
- CorrectAndSmooth from Huang et al.: Combining Label Propagation And Simple Models Out-performs Graph Neural Networks (CoRR 2020) [Example]
- Gini and BRO regularization from Henderson et al.: Improving Molecular Graph Neural Network Explainability with Orthonormalization and Induced Sparsity (ICML 2021)
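Most of these operators share a similar `forward(x, edge_index, ...)` call convention (some additionally expect edge features), so they compose freely. As a minimal sketch, with the hidden size and dropout rate as arbitrary placeholder choices, a two-layer network built from the `GCNConv` operator listed above might look like:

```python
import torch
import torch.nn.functional as F
from torch_geometric.nn import GCNConv

class GCN(torch.nn.Module):
    def __init__(self, in_channels, hidden_channels, num_classes):
        super(GCN, self).__init__()
        self.conv1 = GCNConv(in_channels, hidden_channels)
        self.conv2 = GCNConv(hidden_channels, num_classes)

    def forward(self, x, edge_index):
        # Each GCNConv aggregates normalized neighborhood features.
        x = F.relu(self.conv1(x, edge_index))
        x = F.dropout(x, p=0.5, training=self.training)
        x = self.conv2(x, edge_index)
        return F.log_softmax(x, dim=1)  # per-node class log-probabilities
```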
Head over to our documentation to find out more about installation, data handling, creation of datasets and a full list of implemented methods, transforms, and datasets.
For a quick start, check out our examples in the `examples/` directory.
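As a taste of the data handling covered there, a graph in PyG is described by a `torch_geometric.data.Data` object holding node features and connectivity; the tensors below are toy values for illustration:

```python
import torch
from torch_geometric.data import Data

# A toy undirected graph on 3 nodes (both edge directions listed explicitly).
edge_index = torch.tensor([[0, 1, 1, 2],
                           [1, 0, 2, 1]], dtype=torch.long)
x = torch.tensor([[-1.0], [0.0], [1.0]])  # one scalar feature per node

data = Data(x=x, edge_index=edge_index)
print(data)  # Data(x=[3, 1], edge_index=[2, 4])
```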
If you notice anything unexpected, please open an issue and let us know. If you have any questions or are missing a specific feature, feel free to discuss them with us. We are motivated to constantly make PyTorch Geometric even better.
We provide pip wheels for all major OS/PyTorch/CUDA combinations, see here.
To install the binaries for PyTorch 1.8.0 and 1.8.1, simply run
```
pip install torch-scatter -f https://pytorch-geometric.com/whl/torch-1.8.0+${CUDA}.html
pip install torch-sparse -f https://pytorch-geometric.com/whl/torch-1.8.0+${CUDA}.html
pip install torch-cluster -f https://pytorch-geometric.com/whl/torch-1.8.0+${CUDA}.html
pip install torch-spline-conv -f https://pytorch-geometric.com/whl/torch-1.8.0+${CUDA}.html
pip install torch-geometric
```
where `${CUDA}` should be replaced by either `cpu`, `cu101`, `cu102`, or `cu111` depending on your PyTorch installation. Binaries are provided for Python version <= 3.8.
|         | `cpu` | `cu101` | `cu102` | `cu111` |
|---------|-------|---------|---------|---------|
| Linux   | ✅     | ✅       | ✅       | ✅       |
| Windows | ✅     | ✅       | ✅       | ✅       |
| macOS   | ✅     |         |         |         |
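For example, with a CPU-only build of PyTorch 1.8.*, the first command above becomes the following (the remaining packages follow the same pattern):

```
pip install torch-scatter -f https://pytorch-geometric.com/whl/torch-1.8.0+cpu.html
```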
To install the binaries for PyTorch 1.7.0 and 1.7.1, simply run
```
pip install torch-scatter -f https://pytorch-geometric.com/whl/torch-1.7.0+${CUDA}.html
pip install torch-sparse -f https://pytorch-geometric.com/whl/torch-1.7.0+${CUDA}.html
pip install torch-cluster -f https://pytorch-geometric.com/whl/torch-1.7.0+${CUDA}.html
pip install torch-spline-conv -f https://pytorch-geometric.com/whl/torch-1.7.0+${CUDA}.html
pip install torch-geometric
```
where `${CUDA}` should be replaced by either `cpu`, `cu92`, `cu101`, `cu102`, or `cu110` depending on your PyTorch installation. Binaries are provided for Python version <= 3.8.
|         | `cpu` | `cu92` | `cu101` | `cu102` | `cu110` |
|---------|-------|--------|---------|---------|---------|
| Linux   | ✅     | ✅      | ✅       | ✅       | ✅       |
| Windows | ✅     | ❌      | ✅       | ✅       | ✅       |
| macOS   | ✅     |        |         |         |         |
Note: Binaries of older versions are also provided for PyTorch 1.4.0, PyTorch 1.5.0 and PyTorch 1.6.0 (following the same procedure).
In case you want to experiment with the latest PyG features which are not fully released yet, you can install PyG from master via
```
pip install git+https://github.com/rusty1s/pytorch_geometric.git
```
To run one of the bundled examples afterwards:

```
cd examples
python gcn.py
```
Please cite our paper (and the respective papers of the methods used) if you use this code in your own work:
```
@inproceedings{Fey/Lenssen/2019,
  title={Fast Graph Representation Learning with {PyTorch Geometric}},
  author={Fey, Matthias and Lenssen, Jan E.},
  booktitle={ICLR Workshop on Representation Learning on Graphs and Manifolds},
  year={2019},
}
```
Feel free to email us if you wish your work to be listed in the external resources.
To run the full test suite:

```
python setup.py test
```