DNNTSP is a general neural network architecture for making predictions on temporal sets.
Please refer to our KDD 2020 paper “Predicting Temporal Sets with Deep Neural Networks” for more details.
The principal files in this project are described as follows:
- ./model/
  - weighted_graph_conv.py: code for the Element Relationship Learning component (i.e. weighted GCN on dynamic graphs)
  - masked_self_attention.py and aggregate_nodes_temporal_feature.py: code for the Attention-based Temporal Dependency Learning component (i.e. masked self-attention and weighted aggregation of temporal information)
  - global_gated_update.py: code for the Gated Information Fusing component (i.e. gated updating mechanism); an illustrative sketch of these components is given after the file list
- ./train/
  - train_model.py and train_main.py: code for training models
- ./test/
  - testing_model.py: code for evaluating models
- ./utils/: useful files required by the project (e.g. data loader, metrics calculation, loss function, configurations)
- ./data/: processed datasets are in this folder. The original datasets can be downloaded as follows:
- ./save_model_folder/ and ./runs/: folders for saving trained models and tensorboardX outputs, respectively
- ./results/: folder for saving the evaluation metrics of models.
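The following is a minimal PyTorch sketch of the three components listed under ./model/. It is an illustrative re-implementation under simplifying assumptions (dense weighted adjacency matrices, single-head attention, hypothetical class and argument names), not the repository's actual code.

```python
import torch
import torch.nn as nn


class WeightedGraphConv(nn.Module):
    """Element Relationship Learning: graph convolution with weighted edges."""

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, features, weighted_adj):
        # features: (num_nodes, in_dim); weighted_adj: (num_nodes, num_nodes)
        # Propagate features along weighted edges, then apply a linear transform.
        return torch.relu(self.linear(weighted_adj @ features))


class MaskedSelfAttention(nn.Module):
    """Temporal Dependency Learning: self-attention with a causal (past-only) mask."""

    def __init__(self, dim):
        super().__init__()
        self.query = nn.Linear(dim, dim)
        self.key = nn.Linear(dim, dim)
        self.value = nn.Linear(dim, dim)

    def forward(self, x):
        # x: (num_timestamps, dim); each timestamp attends only to itself and earlier ones.
        num_t, dim = x.shape
        scores = self.query(x) @ self.key(x).t() / dim ** 0.5
        future = torch.triu(torch.ones(num_t, num_t, dtype=torch.bool, device=x.device), diagonal=1)
        scores = scores.masked_fill(future, float('-inf'))
        return torch.softmax(scores, dim=-1) @ self.value(x)


class GlobalGatedUpdate(nn.Module):
    """Gated Information Fusing: blend static item embeddings with dynamic states."""

    def __init__(self, dim):
        super().__init__()
        self.gate = nn.Linear(2 * dim, dim)

    def forward(self, static_emb, dynamic_state):
        # The gate decides, per element and dimension, how much new information to take in.
        g = torch.sigmoid(self.gate(torch.cat([static_emb, dynamic_state], dim=-1)))
        return (1 - g) * static_emb + g * dynamic_state
```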
Please refer to our paper for more details of the parameter settings. Hyperparameters can be found in ./utils/config.json and can be adjusted when running the model.
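For example, the configuration can be loaded and adjusted programmatically as sketched below; the key name used here is a hypothetical placeholder, so check ./utils/config.json for the actual field names.

```python
import json

# Load the hyperparameter configuration shipped with the project.
with open('./utils/config.json') as f:
    config = json.load(f)

# Adjust a value before launching training. "learning_rate" is a hypothetical
# key used for illustration; the actual keys are defined in config.json.
config['learning_rate'] = 1e-3
print(config)
```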
- Training: after setting the parameters, run train_main.py to train models.
- Testing: set the path of the saved model to be evaluated (i.e. the model_path variable in ./test/testing_model.py) and then run testing_model.py to evaluate models.
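Evaluation reports set-prediction metrics; as a hedged illustration (not the code in ./utils), a Recall@K-style metric could be computed as follows, where the function name and interface are chosen for this sketch.

```python
import torch


def recall_at_k(scores, true_items, k=10):
    """scores: (num_items,) predicted scores; true_items: set of ground-truth item ids."""
    top_k = torch.topk(scores, k).indices.tolist()
    hits = len(set(top_k) & set(true_items))
    return hits / max(len(true_items), 1)


# Example: 100 candidate items; the next set contains items {3, 17, 42}.
scores = torch.randn(100)
print(recall_at_k(scores, {3, 17, 42}, k=10))
```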
The principal environment dependencies are as follows:
Please consider citing the following paper when using our code.
@inproceedings{DNNTSP,
title={Predicting Temporal Sets with Deep Neural Networks},
author={Le Yu and Leilei Sun and Bowen Du and Chuanren Liu and Hui Xiong and Weifeng Lv},
booktitle={Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery \& Data Mining},
year={2020}
}