Awesome-Dynamic-Network-Embedding

This repository collects papers related to mining dynamic/temporal patterns in networks.

Contributed by Zhining Liu, Dawei Zhou.

Papers

  • Predicting Citywide Crowd Flows in Irregular Regions Using Multi-View Graph Convolutional Networks. Junkai Sun, Junbo Zhang, Qiaofei Li, Xiuwen Yi, Yu Zheng. arXiv 2019. [Paper]

    • Summary: data at different scales are fed into the GCN, while global information (e.g., weather and day of the week) is fed into a fully-connected network; a multi-view fusion module based on a gating mechanism then combines the two kinds of information to produce the final predictions (a gated-fusion sketch follows).
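    A rough sketch of such a gated fusion module (a generic illustration, not the paper's released architecture; all names and sizes are illustrative):

    ```python
    # Hypothetical gated multi-view fusion: a learned gate decides, per dimension,
    # how much of the graph view vs. the global-context view to keep.
    import torch
    import torch.nn as nn

    class GatedFusion(nn.Module):
        def __init__(self, dim):
            super().__init__()
            self.gate = nn.Linear(2 * dim, dim)

        def forward(self, h_graph, h_global):
            z = torch.sigmoid(self.gate(torch.cat([h_graph, h_global], dim=-1)))
            return z * h_graph + (1 - z) * h_global

    fusion = GatedFusion(dim=64)
    out = fusion(torch.randn(32, 64), torch.randn(32, 64))  # (batch, dim)
    ```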
  • Revisiting Spatial-Temporal Similarity: A Deep Learning Framework for Traffic Prediction. Huaxiu Yao, Xianfeng Tang, Hua Wei, Guanjie Zheng, Zhenhui Li. AAAI 2019. [Paper][Code]

    • Summary: two intriguing assumptions are proposed: (1) the spatial dependencies between locations evolve over time; (2) the temporal dependency is not strictly periodic because of dynamic temporal shifting; a gating mechanism and an attention mechanism are introduced to address these two problems.
  • Attention Based Spatial-Temporal Graph Convolutional Networks for Traffic Flow Forecasting. Shengnan Guo, Youfang Lin, Ning Feng, Chao Song, Huaiyu Wan. AAAI 2019. [Paper][Code]

    • Summary: graph convolutions along the spatial dimension and standard convolutions along the temporal dimension, with spatial-temporal attention and different periodic components as inputs.
  • Gated Residual Recurrent Graph Neural Networks for Traffic Prediction. Cen Chen, Kenli Li, Sin Teo, Xiaofeng Zou, Jie Wang, Zeng Zeng. AAAI 2019. [Paper]

    • Summary: diffusion convolution applied to inputs at different scales.
  • EvolveGCN: Evolving Graph Convolutional Networks for Dynamic Graphs. Aldo Pareja, Giacomo Domeniconi, Jie Chen, Tengfei Ma, Toyotaro Suzumura, Hiroki Kanezashi, Tim Kaler, Charles E. Leiserson. arXiv 2019. [Paper]

    • Summary: integrate the GCN layer and a GRU into one module, where $W_t^{(l)} = \text{GRU}(H_t^{(l)}, W_{t-1}^{(l)})$ and $H_t^{(l+1)} = \text{GCONV}(A_t, H_t^{(l)}, W_t^{(l)})$, i.e., the node embeddings $H_t^{(l)}$ serve as the GRU input and the weight matrix $W_t^{(l)}$ as the (new) GRU hidden state (see the sketch below).
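    A minimal sketch of this idea, assuming mean-pooled node embeddings as the GRU input and a single layer (the paper offers more refined variants):

    ```python
    # Hypothetical sketch: the GCN weight matrix W_t is the hidden state of a GRU
    # that is driven by (summarized) node embeddings at each timestamp.
    import torch
    import torch.nn as nn

    class EvolvingGCNLayer(nn.Module):
        def __init__(self, dim):
            super().__init__()
            self.gru = nn.GRUCell(dim, dim)  # hidden state plays the role of W_t

        def forward(self, A_t, H_t, W_prev):
            x = H_t.mean(dim=0, keepdim=True).expand(W_prev.shape[1], -1)  # (dim, dim)
            W_t = self.gru(x, W_prev.t()).t()                              # evolve the weights
            H_next = torch.relu(A_t @ H_t @ W_t)                           # GCONV(A_t, H_t, W_t)
            return H_next, W_t

    layer = EvolvingGCNLayer(dim=16)
    A_t, H_t, W = torch.eye(10), torch.randn(10, 16), torch.randn(16, 16)
    H_next, W = layer(A_t, H_t, W)
    ```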
  • 3D Graph Convolutional Networks with Temporal Graphs: A Spatial Information Free Framework For Traffic Forecasting. Bing Yu, Mengzhang Li, Jiyong Zhang, Zhanxing Zhu. arXiv 2019. [Paper]

    • Summary: construct the graph based on the similarity between each pair of roads, and a fully 3D graph convolution operator is proposed.
  • dyngraph2vec: Capturing Network Dynamics using Dynamic Graph Representation Learning. Palash Goyal, Sujit Rokka Chhetri, Arquimedes Canedo. arXiv 2018. [Paper][Code]

    • Summary: within an encoder-decoder framework, use each node's neighborhood vectors collected over multiple timestamps as input, with a reconstruction loss as the objective function (an autoencoder sketch follows).
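    A minimal sketch of the autoencoder setup, assuming an MLP encoder/decoder over flattened neighborhood vectors (the paper also proposes recurrent variants; sizes are illustrative):

    ```python
    # Hypothetical dyngraph2vec-style setup: the input is a node's neighborhood
    # vectors over past timestamps; training minimizes a reconstruction loss.
    import torch
    import torch.nn as nn

    n_nodes, lookback, emb_dim = 100, 3, 32
    encoder = nn.Sequential(nn.Linear(lookback * n_nodes, 128), nn.ReLU(),
                            nn.Linear(128, emb_dim))
    decoder = nn.Sequential(nn.Linear(emb_dim, 128), nn.ReLU(),
                            nn.Linear(128, n_nodes))

    x = torch.rand(16, lookback * n_nodes)    # past neighborhood vectors (batch of nodes)
    target = torch.rand(16, n_nodes)          # neighborhood vector to reconstruct
    emb = encoder(x)                          # dynamic node embedding
    loss = ((decoder(emb) - target) ** 2).mean()
    ```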
  • Continuous-Time Dynamic Network Embeddings. Giang Hoang Nguyen, John Boaz Lee, Ryan A. Rossi, Nesreen K. Ahmed, Eunyee Koh, Sungchul Kim. WWW 2018. [Paper]

    • Summary: random walks are generated under the constraints of time order and time closeness (how to select the starting point is also discussed); see the temporal-walk sketch below.
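    A minimal sketch of a time-respecting random walk, assuming uniform selection among valid next edges (the paper also studies biased start and next-edge distributions):

    ```python
    # Hypothetical temporal random walk: each step must traverse an edge whose
    # timestamp is not earlier than that of the previous step.
    import random

    def temporal_random_walk(edges_by_node, start_node, start_time, walk_len):
        """edges_by_node: {u: [(v, t), ...]} adjacency lists with edge timestamps."""
        walk, node, t = [start_node], start_node, start_time
        for _ in range(walk_len - 1):
            candidates = [(v, et) for v, et in edges_by_node.get(node, []) if et >= t]
            if not candidates:
                break
            node, t = random.choice(candidates)
            walk.append(node)
        return walk

    edges = {0: [(1, 1), (2, 3)], 1: [(2, 2)], 2: [(0, 4)]}
    print(temporal_random_walk(edges, start_node=0, start_time=0, walk_len=5))
    ```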
  • NetWalk: A Flexible Deep Embedding Approach for Anomaly Detection in Dynamic Networks. Wenchao Yu, Wei Cheng, Charu Aggarwal, Kai Zhang, Haifeng Chen, Wei Wang. KDD 2018. [Paper][Code]

    • Summary: maintain a list of random walks and update it with newly arrived network objects, then update the node/edge embeddings in an incremental, online way.
  • Diffusion Convolutional Recurrent Neural Network: Data-Driven Traffic Forecasting. Yaguang Li, Rose Yu, Cyrus Shahabi, Yan Liu. ICLR 2018. [Paper][Code]

    • Summary: apply bidirectional diffusion convolution on the directed traffic graph to model spatial dependencies and use gated recurrent units (GRUs) to extract temporal dynamics (a diffusion-convolution sketch follows).
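    A rough sketch of bidirectional diffusion convolution truncated at K steps (the scalar filter weights are a simplification of the paper's parameterization):

    ```python
    # Hypothetical bidirectional diffusion convolution on a directed graph:
    # diffuse features along out-going and reverse (in-coming) transition matrices.
    import numpy as np

    def diffusion_conv(A, X, theta_fwd, theta_bwd):
        """A: directed adjacency (n x n); X: node features (n x d)."""
        P_fwd = A / np.maximum(A.sum(axis=1, keepdims=True), 1e-8)      # out-degree transitions
        P_bwd = A.T / np.maximum(A.T.sum(axis=1, keepdims=True), 1e-8)  # reverse transitions
        out, Sf, Sb = np.zeros_like(X), X.copy(), X.copy()
        for k in range(len(theta_fwd)):
            out += theta_fwd[k] * Sf + theta_bwd[k] * Sb
            Sf, Sb = P_fwd @ Sf, P_bwd @ Sb   # one more diffusion step in each direction
        return out

    A, X = np.random.rand(5, 5), np.random.rand(5, 3)
    H = diffusion_conv(A, X, theta_fwd=[0.5, 0.3], theta_bwd=[0.5, 0.3])
    ```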
  • Dynamic Network Embedding by Modeling Triadic Closure Process. Lekui Zhou, Yang Yang, Xiang Ren, Fei Wu, Yueting Zhuang. AAAI 2018. [Paper][Code]

    • Summary: model how an open triad evolves into a closed triad via the triadic closure process.
  • Embedding Temporal Network via Neighborhood Formation. Yuan Zuo, Guannan Liu, Hao Lin, Jia Guo, Xiaoqian Hu, Junjie Wu. KDD 2018. [Paper][Code]

    • Summary: use a Hawkes process to model the first-order neighborhood formation sequence (an intensity sketch follows).
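    As a rough illustration (scalar parameters; not the paper's exact parameterization), the intensity of a node forming a new neighbor at time t is a base rate plus exponentially decaying excitation from earlier formation events:

    ```python
    # Hypothetical Hawkes intensity for neighborhood formation.
    import math

    def hawkes_intensity(t, history, mu, alpha, delta):
        """history: timestamps of earlier neighborhood-formation events (< t)."""
        excitation = sum(alpha * math.exp(-delta * (t - t_h)) for t_h in history)
        return mu + excitation

    print(hawkes_intensity(t=5.0, history=[1.0, 3.5], mu=0.2, alpha=0.8, delta=1.0))
    ```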
  • Spatio-Temporal Graph Convolutional Networks: A Deep Learning Framework for Traffic Forecasting. Bing Yu, Haoteng Yin, Zhanxing Zhu. IJCAI 2018. [Paper][Code]

    • Summary: graph convolution in the spatial domain and 1-D convolution along the time axis (a spatio-temporal block sketch follows).
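    A minimal sketch of such a spatio-temporal block, assuming a plain 1-D temporal convolution and a one-hop graph convolution with a pre-normalized adjacency (STGCN itself uses gated temporal convolutions and Chebyshev graph convolutions):

    ```python
    # Hypothetical spatio-temporal block: convolve along time, then aggregate
    # over graph neighbors, then apply a node-wise linear transform.
    import torch
    import torch.nn as nn

    class STBlock(nn.Module):
        def __init__(self, in_ch, out_ch, t_kernel=3):
            super().__init__()
            self.temporal = nn.Conv2d(in_ch, out_ch, kernel_size=(t_kernel, 1))
            self.spatial = nn.Linear(out_ch, out_ch)

        def forward(self, x, A_hat):
            # x: (batch, channels, time, nodes); A_hat: normalized adjacency (nodes, nodes)
            h = torch.relu(self.temporal(x))                 # 1-D convolution along time
            h = torch.einsum('nm,bctm->bctn', A_hat, h)      # graph convolution over nodes
            return torch.relu(self.spatial(h.transpose(1, -1)).transpose(1, -1))

    block = STBlock(in_ch=2, out_ch=16)
    out = block(torch.randn(8, 2, 12, 20), torch.eye(20))    # -> (8, 16, 10, 20)
    ```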
  • Structured Sequence Modeling with Graph Convolutional Recurrent Networks. Youngjoo Seo, Michaël Defferrard, Pierre Vandergheynst, Xavier Bresson. arXiv 2016. [Paper]

    • Summary: two methods to combine an RNN with graph convolutions: (1) graph signals are first processed by a GCN, and then an RNN extracts the temporal dynamics; (2) the dense layer (W·x) in the RNN is replaced with a graph convolution (W_g*x); model (2) is sketched below.
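    A minimal sketch of the idea behind model (2), shown here with a GRU cell and a one-hop graph convolution (Â·X·W) in place of each dense transform; the paper itself uses an LSTM with Chebyshev graph convolutions:

    ```python
    # Hypothetical graph-convolutional GRU cell: every dense transform first
    # aggregates node features over the graph via A_hat.
    import torch
    import torch.nn as nn

    class GraphGRUCell(nn.Module):
        def __init__(self, in_dim, hid_dim):
            super().__init__()
            self.lin_z = nn.Linear(in_dim + hid_dim, hid_dim)
            self.lin_r = nn.Linear(in_dim + hid_dim, hid_dim)
            self.lin_h = nn.Linear(in_dim + hid_dim, hid_dim)

        def forward(self, A_hat, x, h):
            xz = A_hat @ torch.cat([x, h], dim=-1)           # graph-aggregated input
            z = torch.sigmoid(self.lin_z(xz))                # update gate
            r = torch.sigmoid(self.lin_r(xz))                # reset gate
            xh = A_hat @ torch.cat([x, r * h], dim=-1)
            h_new = torch.tanh(self.lin_h(xh))
            return (1 - z) * h + z * h_new

    cell, h = GraphGRUCell(in_dim=4, hid_dim=8), torch.zeros(20, 8)
    for x_t in torch.randn(5, 20, 4):                        # 5 time steps, 20 nodes
        h = cell(torch.eye(20), x_t, h)
    ```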
  • Scalable Link Prediction in Dynamic Networks via Non-Negative Matrix Factorization. Linhong Zhu, Dong Guo, Junming Yin, Greg Ver Steeg, Aram Galstyan. TKDE 2016. [Paper][Code]

    • Summary: decompose the adjacency matrix at each timestamp to generate node embeddings, and add a temporal-smoothness term over the node embeddings of consecutive timestamps (an objective sketch follows).
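    A rough sketch of the objective under a simplified symmetric factorization A_t ≈ U_t·U_tᵀ (the paper's actual formulation and constraints differ in detail):

    ```python
    # Hypothetical objective: per-snapshot reconstruction error plus a
    # temporal-smoothness penalty between consecutive embedding matrices.
    import numpy as np

    def objective(A_list, U_list, lam=0.1):
        loss = 0.0
        for t, (A, U) in enumerate(zip(A_list, U_list)):
            loss += np.linalg.norm(A - U @ U.T, 'fro') ** 2                      # reconstruction
            if t > 0:
                loss += lam * np.linalg.norm(U - U_list[t - 1], 'fro') ** 2      # smoothness
        return loss

    A_list = [np.random.rand(10, 10) for _ in range(3)]
    U_list = [np.abs(np.random.rand(10, 4)) for _ in range(3)]
    print(objective(A_list, U_list))
    ```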