Transformer-based GNN implementation. Given a graph, it returns an n-dimensional vector for each node, for use in downstream tasks.
A single attention mechanism aggregates node vectors over an r-degree neighborhood, using the adjacency matrix as a mask.
- Drastically reduces the number of attention-based aggregations with respect to r.
- Faster to compute and to parallelize.
- Achieves r-layer message passing by stacking multiple layers in the transformer.
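The masked-attention idea above can be sketched as follows. This is a minimal NumPy illustration, not the repository's implementation: the function name, weight shapes, and the single-head formulation are assumptions made for clarity.

```python
import numpy as np

def masked_attention(X, A, Wq, Wk, Wv):
    """Single-head attention where the adjacency matrix A masks the scores,
    so each node only aggregates over its neighbors (plus itself)."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv          # project features
    scores = (Q @ K.T) / np.sqrt(K.shape[1])  # scaled dot-product scores
    mask = A + np.eye(A.shape[0])             # let each node attend to itself
    scores = np.where(mask > 0, scores, -np.inf)  # adjacency as attention mask
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights = weights / weights.sum(axis=1, keepdims=True)  # row-wise softmax
    return weights @ V

rng = np.random.default_rng(0)
n, d = 5, 8
X = rng.normal(size=(n, d))                   # toy node features
A = np.zeros((n, n))
for i, j in [(0, 1), (1, 2), (2, 3), (3, 4)]:  # a simple path graph
    A[i, j] = A[j, i] = 1.0

H = X
for _ in range(2):  # stacking 2 layers yields 2-hop message passing
    Wq, Wk, Wv = (rng.normal(size=(d, d)) / np.sqrt(d) for _ in range(3))
    H = masked_attention(H, A, Wq, Wk, Wv)
```

Because the mask zeroes out non-neighbors at every layer, information reaches a node's r-hop neighborhood only after r stacked layers, which is the trade-off the list above describes.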
- Install all required packages with `pip install .`
- All initial settings and inputs can be configured at the start of `train.py`: place your traversable graph there and adjust the inputs in the config object.
- After running `train.py`, the vector embeddings are saved in `output`, along with the config file.
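As a rough illustration, the config object at the top of `train.py` might look like the sketch below. Every field name here is an assumption for the sake of the example, not the repository's actual keys.

```python
# Hypothetical config sketch -- field names are illustrative assumptions,
# not the actual keys used in this repository's train.py.
config = {
    "embedding_dim": 128,    # n: size of the output vector per node
    "num_layers": 3,         # r: transformer depth = neighborhood radius
    "learning_rate": 1e-3,
    "epochs": 100,
    "output_dir": "output",  # embeddings and the config file are saved here
}
```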