
Tensor Learning (张量学习)


Made by Xinyu Chen • 🌐 https://twitter.com/chenxy346

Python codes for tensor factorization, tensor completion, and tensor regression techniques with the following real-world applications:

  • geotensor | Image inpainting
  • transdim | Spatiotemporal traffic data imputation and prediction
  • Recommender systems
  • mats | Multivariate time series imputation and forecasting

In a hurry? Check out the contents below.

Our Research

▴ Back to top

We conduct extensive experiments on several real-world data sets:

  • The large-scale PeMS traffic speed data set registers traffic speed time series from 11,160 sensors over 4/8/12 weeks (PeMS-4W/PeMS-8W/PeMS-12W) with 288 time points per day (i.e., a 5-minute frequency) in California, USA. You can download this data set and place it in the datasets folder.

    • Data path example: ../datasets/California-data-set/pems-4w.csv.
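Once the CSV is in place, the sensor-by-time matrix can be folded into a third-order (sensor × day × time-of-day) tensor. The sketch below uses a small synthetic stand-in with the stated 288 points per day; the commented-out `read_csv` line shows the assumed layout (rows = sensors, columns = time points) for the real file, which is an assumption rather than a documented schema.

```python
import numpy as np

# Assumed layout of the real file (rows = sensors, columns = time points):
# import pandas as pd
# dense_mat = pd.read_csv('../datasets/California-data-set/pems-4w.csv',
#                         header=None).values

# Synthetic stand-in: 50 sensors, 4 weeks (28 days) at 288 points per day.
num_sensors, num_days, points_per_day = 50, 28, 288
dense_mat = np.random.rand(num_sensors, num_days * points_per_day)

# Fold the (sensor x time) matrix into a (sensor x day x time-of-day) tensor.
tensor = dense_mat.reshape(num_sensors, num_days, points_per_day)
print(tensor.shape)  # (50, 28, 288)
```

This day-wise folding is what exposes the daily periodicity of traffic data as a low-rank pattern along the tensor's modes.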

mats

mats is a project in the tensor learning repository; it aims to develop machine learning models for multivariate time series forecasting.

Low-Rank Autoregressive Tensor Completion for Multivariate Time Series Forecasting.
[arXiv]

Figure 1: Illustration of our proposed Low-Rank Tensor Completion (LATC) imputer/predictor with a prediction window τ (green nodes: observed values; white nodes: missing values; red nodes/panel: prediction; blue panel: training data to construct the tensor).

In this work, we develop a Low-Rank Autoregressive Tensor Completion (LATC) model for multivariate time series forecasting in the presence of missing values. To overcome the challenge of missing time series values, our LATC model simultaneously takes into account:

  • an autoregressive process on the matrix structure to capture local temporal states,
  • and a low-rank assumption on the tensor structure to capture global low-rank patterns.
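LATC itself couples a truncated nuclear norm objective with an autoregressive penalty (see the paper for the full algorithm). As a simplified illustration of the low-rank completion half only, a singular value thresholding (SVT) loop on a tensor unfolding might look like this; the tensor size, threshold, and iteration count are illustrative choices, not the paper's settings.

```python
import numpy as np

def svt(mat, tau):
    """Singular value thresholding: the proximal operator of the nuclear norm."""
    u, s, vt = np.linalg.svd(mat, full_matrices=False)
    s = np.maximum(s - tau, 0)  # shrink singular values toward zero
    return (u * s) @ vt

# Toy third-order tensor with entries missing at random.
np.random.seed(0)
tensor = np.random.rand(5, 4, 30)
mask = np.random.rand(*tensor.shape) > 0.3  # True = observed
x = tensor * mask                           # missing entries start at zero

# Alternate a low-rank step on the mode-1 unfolding with restoring
# the observed entries (the completion constraint).
for _ in range(50):
    mat = x.reshape(5, -1)                  # mode-1 unfolding
    x = svt(mat, tau=0.1).reshape(x.shape)
    x[mask] = tensor[mask]                  # keep observations fixed
```

The full LATC model adds the autoregressive term on top of this low-rank step so that local temporal dynamics, not just global structure, inform the imputed and predicted values.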

Python codes for reproducing the experiments are provided in the ../mats folder. Since these codes were written as Jupyter Notebooks, you can also view them on nbviewer.

If you find these codes useful, please star (★) this repository.

📖 Tutorial

▴ Back to top

We summarize some preliminaries for a better understanding of tensor learning. They are given in the form of tutorials as follows.

  • Foundations of Python Numpy Programming

  • Foundations of Tensor Computations

    • Kronecker product
    • Singular Value Decomposition (SVD)
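Both operations listed above are one-liners in NumPy. This short sketch shows the Kronecker product of two 2×2 matrices (each entry of the first matrix scales a full copy of the second, giving a 4×4 result) and an SVD round trip; the matrices are arbitrary examples.

```python
import numpy as np

A = np.array([[1., 2.], [3., 4.]])
B = np.array([[0., 1.], [1., 0.]])

# Kronecker product: entry a_ij of A scales a full copy of B,
# so the result is (2*2) x (2*2) = 4 x 4.
K = np.kron(A, B)

# Singular Value Decomposition: A = U diag(s) V^T.
U, s, Vt = np.linalg.svd(A)
A_rebuilt = U @ np.diag(s) @ Vt
print(np.allclose(A, A_rebuilt))  # True
```

The Kronecker product and SVD are the workhorses behind the tensor unfoldings and low-rank approximations used throughout this repository.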

Helpful Learning Material

▴ Back to top

  • Ruye Wang (2010). Introduction to Orthogonal Transforms with Applications in Data Processing and Analysis. Cambridge University Press. [PDF]

Quick Run

▴ Back to top

  • If you want to run the code, please
    • download (or clone) this repository,
    • open the .ipynb file using Jupyter notebook,
    • and run the code.

Citing

▴ Back to top

This repository accompanies the following paper; please cite it if it helps your research.

  • Xinyu Chen, Lijun Sun (2020). Low-rank autoregressive tensor completion for multivariate time series forecasting. arXiv: 2006.10436. [preprint] [data & Python code]

Acknowledgements

▴ Back to top

This research is supported by the Institute for Data Valorization (IVADO).

License

▴ Back to top

This work is released under the MIT license.