The description of "Element-guided Temporal Graph Representation Learning for Temporal Sets Prediction" at WWW 2022 is available here.
The original data can be downloaded from here.
You can download the data and then put the data files in the `./original_data` folder.
- Run `./preprocess_data/preprocess_data_{dataset_name}.py` to preprocess the original data, where `dataset_name` could be DC, TaoBao, JingDong, or TMS. We also provide the preprocessed datasets here, which should be put in the `./dataset` folder.
- Run `./train/train_ETGNN.py` to train the model on different datasets using the configuration in `./utils/config.json`.
- Run `./evaluate/evaluate_ETGNN.py` to evaluate the model. Please make sure the `config` in `evaluate_ETGNN.py` is identical to the one used in the model training process.
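As a quick sanity check before evaluation, you can compare the training-time and evaluation-time configurations programmatically. This is a minimal sketch: the helper names and the hyperparameter keys below are illustrative assumptions, not the actual schema of `./utils/config.json`.

```python
def configs_match(train_config: dict, eval_config: dict) -> bool:
    # True when every hyperparameter used for evaluation equals the one used in training
    return train_config == eval_config

def config_diff(train_config: dict, eval_config: dict) -> set:
    # Keys whose values differ, or that are missing on one side
    keys = set(train_config) | set(eval_config)
    return {k for k in keys if train_config.get(k) != eval_config.get(k)}

# Illustrative in-memory configs standing in for the contents of
# ./utils/config.json at training time and at evaluation time
# (key names here are assumptions):
train_config = {"learning_rate": 0.001, "embedding_dimension": 64, "number_of_hops": 3}
eval_config = {"learning_rate": 0.001, "embedding_dimension": 64, "number_of_hops": 3}

assert configs_match(train_config, eval_config)
```

If the two configs ever drift apart, `config_diff` reports exactly which keys to fix before re-running the evaluation.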
Hyperparameters can be found in the `./utils/config.json` file, and you can adjust them when training the model on different datasets.
| Hyperparameters | DC | TaoBao | JingDong | TMS |
|---|---|---|---|---|
| learning rate | 0.001 | 0.001 | 0.001 | 0.001 |
| embedding dimension | 64 | 32 | 64 | 64 |
| embedding dropout | 0.2 | 0.0 | 0.2 | 0.3 |
| temporal attention dropout | 0.5 | 0.5 | 0.5 | 0.5 |
| number of hops | 3 | 3 | 3 | 2 |
| temporal information importance | 0.3 | 0.05 | 0.01 | 1.0 |
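For reference, the table above can be mirrored as a per-dataset dictionary, e.g. when scripting sweeps over datasets. The key names are assumptions for illustration and may differ from the real layout of `./utils/config.json`.

```python
# Hypothetical per-dataset hyperparameters mirroring the table above;
# key names are illustrative, not the actual config.json schema.
HYPERPARAMS = {
    "DC":       {"learning_rate": 0.001, "embedding_dim": 64, "embedding_dropout": 0.2,
                 "temporal_attention_dropout": 0.5, "num_hops": 3, "temporal_importance": 0.3},
    "TaoBao":   {"learning_rate": 0.001, "embedding_dim": 32, "embedding_dropout": 0.0,
                 "temporal_attention_dropout": 0.5, "num_hops": 3, "temporal_importance": 0.05},
    "JingDong": {"learning_rate": 0.001, "embedding_dim": 64, "embedding_dropout": 0.2,
                 "temporal_attention_dropout": 0.5, "num_hops": 3, "temporal_importance": 0.01},
    "TMS":      {"learning_rate": 0.001, "embedding_dim": 64, "embedding_dropout": 0.3,
                 "temporal_attention_dropout": 0.5, "num_hops": 2, "temporal_importance": 1.0},
}
```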
Please consider citing our paper when using the code or datasets.
@inproceedings{DBLP:conf/www/YuWS0L22,
author = {Le Yu and
Guanghui Wu and
Leilei Sun and
Bowen Du and
Weifeng Lv},
title = {Element-guided Temporal Graph Representation Learning for Temporal
Sets Prediction},
booktitle = {{WWW} '22: The {ACM} Web Conference 2022, Virtual Event, Lyon, France,
April 25 - 29, 2022},
pages = {1902--1913},
publisher = {{ACM}},
year = {2022}
}