Time-Series-Papers

This is a repository for collecting papers and code in the time series domain.

MIT License

Table of Contents

  ├─ Linear/  
  ├─ RNN and CNN/           
  ├─ Transformer/
  ├─ GNN/
  ├─ LLM Framework/
  ├─ Diffusion Model/
  ├─ Benchmark and Dataset/                      
  └─ Repositories/         

Linear

  • N-BEATS: Neural basis expansion analysis for interpretable time series forecasting, Oreshkin et al., ICLR 2020. [paper][n-beats][N-BEATS]
  • DLinear: Are Transformers Effective for Time Series Forecasting?, Zeng et al., AAAI 2023. [paper][code]
  • TSMixer: Lightweight MLP-Mixer Model for Multivariate Time Series Forecasting, Ekambaram et al., KDD 2023. [paper][model][example]
  • Tiny Time Mixers (TTMs): Fast Pretrained Models for Enhanced Zero/Few-Shot Forecasting of Multivariate Time Series, Ekambaram et al., arXiv 2024. [paper][code]
  • FCDNet: Frequency-Guided Complementary Dependency Modeling for Multivariate Time-Series Forecasting, Chen et al., arXiv 2023. [paper][code]
  • SOFTS: Efficient Multivariate Time Series Forecasting with Series-Core Fusion, Han et al., arXiv 2024. [paper][code]
  • TimeMixer: Decomposable Multiscale Mixing for Time Series Forecasting, Wang et al., ICLR 2024. [paper][code]
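
A common thread in the linear family above (most explicitly in DLinear) is to decompose the series into a trend and a seasonal component with a moving average, then forecast each component with its own linear map. A minimal NumPy sketch of that decomposition, with illustrative names of our own (not taken from any paper's code):

```python
import numpy as np

def decompose(x, kernel=5):
    """DLinear-style series decomposition: trend via a centered
    moving average with edge padding, seasonal as the residual."""
    pad = kernel // 2
    xp = np.pad(x, (pad, kernel - 1 - pad), mode="edge")
    trend = np.convolve(xp, np.ones(kernel) / kernel, mode="valid")
    return trend, x - trend

t = np.arange(48, dtype=float)
x = 0.1 * t + np.sin(2 * np.pi * t / 12)   # linear trend + seasonality
trend, seasonal = decompose(x)
assert np.allclose(trend + seasonal, x)    # decomposition is exact
```

In the actual models, each component is then mapped from the lookback window to the forecast horizon by a single learned linear layer; the decomposition above is the part the papers share.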

RNN and CNN

  • TimesNet: Temporal 2D-Variation Modeling for General Time Series Analysis, Wu et al., ICLR 2023. [paper][code][slides]
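
TimesNet's key step is to detect dominant periods from the FFT amplitude spectrum and fold the 1D series into a 2D (cycles x period) layout for 2D convolution. A simplified sketch of that period detection, not the paper's code:

```python
import numpy as np

def dominant_period(x):
    """Pick the period with the largest FFT amplitude
    (TimesNet-style period detection, single-period version)."""
    amps = np.abs(np.fft.rfft(x - x.mean()))
    amps[0] = 0.0                      # ignore the DC component
    freq = int(np.argmax(amps))        # dominant frequency index
    return len(x) // freq              # corresponding period length

t = np.arange(96)
x = np.sin(2 * np.pi * t / 24)         # daily cycle in hourly data
p = dominant_period(x)                 # p == 24
# fold the 1D series into a 2D (cycles x period) array
folded = x[: (len(x) // p) * p].reshape(-1, p)
```

TimesNet keeps the top-k periods rather than one, and learns the 2D convolutions end to end; this sketch only shows the folding trick.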

Transformer

  • Transformers in Time Series: A Survey, Wen et al., IJCAI 2023. [paper][code]
  • Deep Time Series Models: A Comprehensive Survey and Benchmark, Wang et al., arXiv 2024. [paper][code]
  • Informer: Beyond Efficient Transformer for Long Sequence Time-Series Forecasting, Zhou et al., AAAI 2021 Best Paper. [paper][code]
  • Autoformer: Decomposition Transformers with Auto-Correlation for Long-Term Series Forecasting, Wu et al., NeurIPS 2021. [paper][code][slides]
  • Anomaly Transformer: Time Series Anomaly Detection with Association Discrepancy, Xu et al., ICLR 2022. [paper][code][slides]
  • Non-stationary Transformers: Exploring the Stationarity in Time Series Forecasting, Liu et al., NeurIPS 2022. [paper][code]
  • iTransformer: Inverted Transformers Are Effective for Time Series Forecasting, Liu et al., ICLR 2024 Spotlight. [paper][code]
  • Pyraformer: Low-Complexity Pyramidal Attention for Long-Range Time Series Modeling and Forecasting, Liu et al., ICLR 2022. [paper][code]
  • FEDformer: Frequency Enhanced Decomposed Transformer for Long-term Series Forecasting, Zhou et al., ICML 2022. [paper][code][DAMO-DI-ML]
  • PatchTST: A Time Series is Worth 64 Words: Long-term Forecasting with Transformers, Nie et al., ICLR 2023. [paper][code]
  • Crossformer: Transformer Utilizing Cross-Dimension Dependency for Multivariate Time Series Forecasting, Zhang and Yan, ICLR 2023. [paper][code]
  • TimeXer: Empowering Transformers for Time Series Forecasting with Exogenous Variables, Wang et al., arXiv 2024. [paper]
  • UniTST: Effectively Modeling Inter-Series and Intra-Series Dependencies for Multivariate Time Series Forecasting, Liu et al., arXiv 2024. [paper]
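
Several of the Transformer entries above (PatchTST most directly, echoed in its "64 Words" title) tokenize a series by splitting it into overlapping patches that become the transformer's input tokens. A sketch of that patching step, with illustrative parameter values:

```python
import numpy as np

def patchify(x, patch_len=16, stride=8):
    """Split a 1D series into overlapping patches
    (PatchTST-style tokenization; a sketch, not the paper's code)."""
    n = (len(x) - patch_len) // stride + 1
    return np.stack([x[i * stride : i * stride + patch_len]
                     for i in range(n)])

x = np.arange(512, dtype=float)
patches = patchify(x)          # shape (63, 16): 63 tokens of 16 values
```

PatchTST additionally pads the end of the window before patching (yielding one extra patch) and processes each variate channel independently; this sketch keeps only the core reshaping.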

GNN

  • A Survey on Graph Neural Networks for Time Series: Forecasting, Classification, Imputation, and Anomaly Detection, Jin et al., arXiv 2023. [paper][code]
  • GPT-ST: Generative Pre-Training of Spatio-Temporal Graph Neural Networks, Li et al., NeurIPS 2023. [paper][code]
  • MSGNet: Learning Multi-Scale Inter-Series Correlations for Multivariate Time Series Forecasting, Cai et al., AAAI 2024. [paper][code]

LLM Framework

  • Large Models for Time Series and Spatio-Temporal Data: A Survey and Outlook, Jin et al., arXiv 2023. [paper][code]

  • Large Language Models for Time Series: A Survey, Zhang et al., arXiv 2024. [paper][code]

  • Large Language Models for Forecasting and Anomaly Detection: A Systematic Literature Review, Su et al., arXiv 2024. [paper]

  • SimMTM: A Simple Pre-Training Framework for Masked Time-Series Modeling, Dong et al., NeurIPS 2023 Spotlight. [paper][code]

  • One Fits All: Power General Time Series Analysis by Pretrained LM, Zhou et al., NeurIPS 2023 Spotlight. [paper][code][AI-for-Time-Series-Papers-Tutorials-Surveys][CALF]

  • Large Language Models Are Zero-Shot Time Series Forecasters, Gruver et al., NeurIPS 2023. [paper][code]

  • Lag-Llama: Towards Foundation Models for Time Series Forecasting, Rasul et al., arXiv 2023. [paper][code]

  • TimesFM: A decoder-only foundation model for time-series forecasting, Das et al., ICML 2024. [paper][code]

  • TimeGPT-1, Garza et al., arXiv 2023. [paper][nixtla]

  • Time-LLM: Time Series Forecasting by Reprogramming Large Language Models, Jin et al., ICLR 2024. [paper][code]

  • AutoTimes: Autoregressive Time Series Forecasters via Large Language Models, Liu et al., arXiv 2024. [paper][code]

  • Timer: Generative Pre-trained Transformers Are Large Time Series Models, Liu et al., ICML 2024. [paper][code][Unified Time Series Dataset][website]

  • TimeSiam: A Pre-Training Framework for Siamese Time-Series Modeling, Dong et al., ICML 2024. [paper][code]

  • MOMENT: A Family of Open Time-series Foundation Models, Goswami et al., ICML 2024. [paper][code]

  • Unified Training of Universal Time Series Forecasting Transformers, Woo et al., ICML 2024. [paper][code]

  • Multi-Patch Prediction: Adapting LLMs for Time Series Representation Learning, Bian et al., arXiv 2024. [paper]

  • UniTS: Building a Unified Time Series Model, Gao et al., arXiv 2024. [paper][code]

  • Chronos: Learning the Language of Time Series, Ansari et al., arXiv 2024. [paper][code]

  • Large language models can be zero-shot anomaly detectors for time series, Alnegheimish et al., arXiv 2024. [paper]

  • Foundation Models for Time Series Analysis: A Tutorial and Survey, Liang et al., arXiv 2024. [paper][granite-tsfm]

  • Are Language Models Actually Useful for Time Series Forecasting?, Tan et al., arXiv 2024. [paper][code]

  • LETS-C: Leveraging Language Embedding for Time Series Classification, Kaur et al., arXiv 2024. [paper]
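
A recurring ingredient in the zero-shot LLM entries above (e.g. "Large Language Models Are Zero-Shot Time Series Forecasters") is serializing numeric values into a digit-friendly text string so a language model can simply continue the sequence. A hedged sketch of such a round trip; the papers' actual encodings include model-specific tokenization tricks this omits:

```python
def serialize(values, decimals=1, sep=", "):
    """Encode a series as prompt text (LLMTime-style sketch)."""
    return sep.join(f"{v:.{decimals}f}" for v in values)

def deserialize(text, sep=", "):
    """Parse the model's text continuation back into numbers."""
    return [float(tok) for tok in text.split(sep) if tok]

prompt = serialize([0.5, 1.3, 2.0])        # "0.5, 1.3, 2.0"
assert deserialize(prompt) == [0.5, 1.3, 2.0]
```

The forecast is then read off by sampling a continuation of `prompt` from the LLM and running `deserialize` on it.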


Diffusion Model

  • Diffusion-TS: Interpretable Diffusion for General Time Series Generation, Yuan and Qiao, ICLR 2024. [paper][code]
  • A Survey on Diffusion Models for Time Series and Spatio-Temporal Data, Yang et al., arXiv 2024. [paper][code]
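
The diffusion entries above build on the standard DDPM forward process: progressively corrupting a clean series with Gaussian noise, which the model then learns to reverse. A generic NumPy sketch of the closed-form forward step (the schedule and names here are illustrative, not from either paper):

```python
import numpy as np

def forward_diffuse(x0, t, betas, rng):
    """DDPM forward process q(x_t | x_0) in closed form, using the
    cumulative product of (1 - beta) up to step t."""
    alpha_bar = np.cumprod(1.0 - betas)[t]
    noise = rng.standard_normal(x0.shape)
    return np.sqrt(alpha_bar) * x0 + np.sqrt(1.0 - alpha_bar) * noise

rng = np.random.default_rng(0)
betas = np.linspace(1e-4, 0.02, 100)      # linear noise schedule
x0 = np.sin(np.linspace(0.0, 6.28, 64))   # a clean series
x_noisy = forward_diffuse(x0, t=99, betas=betas, rng=rng)
```

Time-series diffusion models differ mainly in the denoiser architecture and conditioning (e.g. Diffusion-TS uses an interpretable trend/seasonal decomposition inside the denoiser), not in this forward process.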

Benchmark and Dataset


Repositories