Time-Series-Papers

This is a repository for collecting papers and code in the time series domain.


Table of Contents

  ├─ Linear/  
  ├─ RNN and CNN/           
  ├─ Transformer/
  ├─ GNN/
  ├─ Framework/                
  └─ Repositories/         

Linear

  • DLinear: Are Transformers Effective for Time Series Forecasting?, Zeng et al., AAAI 2023. [paper][code] (see the decomposition sketch after this list)
  • TSMixer: Lightweight MLP-Mixer Model for Multivariate Time Series Forecasting, Ekambaram et al., KDD 2023. [paper][model][example]
  • Tiny Time Mixers (TTMs): Fast Pretrained Models for Enhanced Zero/Few-Shot Forecasting of Multivariate Time Series, Ekambaram et al., arxiv 2024. [paper]
  • FCDNet: Frequency-Guided Complementary Dependency Modeling for Multivariate Time-Series Forecasting, Chen et al., arxiv 2023. [paper][code]
  • TimeMixer: Decomposable Multiscale Mixing for Time Series Forecasting, Wang et al., ICLR 2024. [paper][code]
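
A minimal PyTorch sketch of the DLinear idea referenced above (an illustration, not the official code): split the input window into a moving-average trend and a seasonal residual, then forecast each part with its own linear layer. The window, horizon, and kernel sizes are arbitrary assumptions.

  import torch
  import torch.nn as nn

  class DLinearSketch(nn.Module):
      def __init__(self, seq_len=96, pred_len=24, kernel_size=25):
          super().__init__()
          # Moving average over the time axis; padding keeps the trend
          # the same length as the input window.
          self.avg = nn.AvgPool1d(kernel_size, stride=1,
                                  padding=kernel_size // 2,
                                  count_include_pad=False)
          self.linear_trend = nn.Linear(seq_len, pred_len)
          self.linear_seasonal = nn.Linear(seq_len, pred_len)

      def forward(self, x):                  # x: (batch, seq_len, channels)
          x = x.transpose(1, 2)              # -> (batch, channels, seq_len)
          trend = self.avg(x)
          seasonal = x - trend
          out = self.linear_trend(trend) + self.linear_seasonal(seasonal)
          return out.transpose(1, 2)         # -> (batch, pred_len, channels)

  y = DLinearSketch()(torch.randn(8, 96, 7))   # forecast shape: (8, 24, 7)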

RNN and CNN

  • TimesNet: Temporal 2D-Variation Modeling for General Time Series Analysis, Wu et al., ICLR 2023. [paper][code][slides] (see the period-folding sketch below)
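
A rough sketch of the TimesNet idea (illustrative only, not the official code): pick a dominant period with the FFT, fold the 1D series into a 2D (rows x period) map, and apply a 2D convolution so both intra-period and inter-period variation can be modeled.

  import torch
  import torch.nn as nn

  def dominant_period(x):                    # x: (batch, seq_len, channels)
      # Average spectral amplitude per frequency, ignoring the DC component.
      amp = torch.fft.rfft(x, dim=1).abs().mean(dim=(0, 2))
      amp[0] = 0
      freq = max(int(amp.argmax()), 1)
      return max(x.shape[1] // freq, 1)

  x = torch.randn(8, 96, 7)
  p = dominant_period(x)
  pad = (-x.shape[1]) % p                    # pad so the length divides the period
  x2d = nn.functional.pad(x, (0, 0, 0, pad))
  x2d = x2d.reshape(x.shape[0], -1, p, x.shape[2]).permute(0, 3, 1, 2)
  feat = nn.Conv2d(7, 7, kernel_size=3, padding=1)(x2d)   # 2D variation modeling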

Transformer

  • Informer: Beyond Efficient Transformer for Long Sequence Time-Series Forecasting, Zhou et al., AAAI 2021 Best Paper. [paper][code]
  • Autoformer: Decomposition Transformers with Auto-Correlation for Long-Term Series Forecasting, Wu et al., NeurIPS 2021. [paper][code][slides]
  • Anomaly Transformer: Time Series Anomaly Detection with Association Discrepancy, Xu et al., ICLR 2022. [paper][code][slides]
  • Non-stationary Transformers: Exploring the Stationarity in Time Series Forecasting, Liu et al., NeurIPS 2022. [paper][code]
  • iTransformer: Inverted Transformers Are Effective for Time Series Forecasting, Liu et al., ICLR 2024 Spotlight. [paper][code]
  • FEDformer: Frequency Enhanced Decomposed Transformer for Long-term Series Forecasting, Zhou et al., ICML 2022. [paper][code][DAMO-DI-ML]
  • PatchTST: A Time Series is Worth 64 Words: Long-term Forecasting with Transformers, Nie et al., ICLR 2023. [paper][code] (see the patching sketch after this list)
  • Crossformer: Transformer Utilizing Cross-Dimension Dependency for Multivariate Time Series Forecasting, Zhang and Yan, ICLR 2023. [paper][code]
  • TimeXer: Empowering Transformers for Time Series Forecasting with Exogenous Variables, Wang et al., arxiv 2024. [paper]
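
A hedged sketch of the patching step behind PatchTST (not the official code; positional encoding and the forecasting head are omitted): each channel is split into overlapping patches that become transformer tokens, so attention operates over patches rather than single time steps.

  import torch
  import torch.nn as nn

  seq_len, patch_len, stride, d_model = 96, 16, 8, 64
  x = torch.randn(8, seq_len, 7)                   # (batch, time, channels)
  x = x.permute(0, 2, 1)                           # channel-independent: (B, C, T)
  patches = x.unfold(-1, patch_len, stride)        # (B, C, num_patches, patch_len)
  tokens = nn.Linear(patch_len, d_model)(patches)  # embed each patch as a token
  B, C, N, D = tokens.shape
  encoder = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
  out = encoder(tokens.reshape(B * C, N, D))       # attention over patches per channel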

GNN

  • A Survey on Graph Neural Networks for Time Series: Forecasting, Classification, Imputation, and Anomaly Detection, Jin et al., arxiv 2023. [paper][code]
  • MSGNet: Learning Multi-Scale Inter-Series Correlations for Multivariate Time Series Forecasting, Cai et al., AAAI 2024. [paper][code] (a generic channel-mixing sketch follows this list)
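
A generic sketch of the GNN-style channel mixing these papers build on (an illustrative pattern, not MSGNet itself): learn an embedding per channel, derive a soft adjacency matrix from embedding similarity, and propagate features across channels in one graph-convolution step.

  import torch
  import torch.nn as nn

  class GraphMixSketch(nn.Module):
      def __init__(self, num_channels=7, emb_dim=16):
          super().__init__()
          self.node_emb = nn.Parameter(torch.randn(num_channels, emb_dim))

      def forward(self, x):                        # x: (batch, seq_len, channels)
          # Soft adjacency from embedding similarity; softmax normalizes rows.
          adj = torch.softmax(self.node_emb @ self.node_emb.T, dim=-1)
          # h_d = sum_c adj[d, c] * x[..., c]: each channel aggregates its neighbors.
          return torch.einsum('btc,dc->btd', x, adj)

  y = GraphMixSketch()(torch.randn(8, 96, 7))      # same shape, channels mixed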

Framework

  • SimMTM: A Simple Pre-Training Framework for Masked Time-Series Modeling, Dong et al., NeurIPS 2023 Spotlight. [paper][code]
  • Timer: Transformers for Time Series Analysis at Scale, Liu et al., arxiv 2024. [paper][code]
  • AutoTimes: Autoregressive Time Series Forecasters via Large Language Models, Liu et al., arxiv 2024. [paper][code]
  • TSPP: A Unified Benchmarking Tool for Time-series Forecasting, Bączek et al., arxiv 2023. [paper][code]
  • One Fits All: Power General Time Series Analysis by Pretrained LM, Zhou et al., NeurIPS 2023. [paper][code][AI-for-Time-Series-Papers-Tutorials-Surveys]
  • Large Language Models Are Zero-Shot Time Series Forecasters, Gruver et al., NeurIPS 2023. [paper][code] (see the serialization sketch after this list)
  • GPT-ST: Generative Pre-Training of Spatio-Temporal Graph Neural Networks, Li et al., NeurIPS 2023. [paper][code]
  • Lag-Llama: Towards Foundation Models for Time Series Forecasting, Rasul et al., arxiv 2023. [paper][code][pytorch-transformer-ts]
  • TimesFM: A decoder-only foundation model for time-series forecasting, Das et al., arxiv 2023. [paper]
  • TimeGPT-1, Garza et al., arxiv 2023. [paper][nixtla]
  • Time-LLM: Time Series Forecasting by Reprogramming Large Language Models, Jin et al., ICLR 2024. [paper][code]
  • Large Models for Time Series and Spatio-Temporal Data: A Survey and Outlook, Jin et al., arxiv 2023. [paper][code]
  • MOMENT: A Family of Open Time-series Foundation Models, Goswami et al., arxiv 2024. [paper][code]
  • Large Language Models for Time Series: A Survey, Zhang et al., arxiv 2024. [paper][code]
  • Large Language Models for Forecasting and Anomaly Detection: A Systematic Literature Review, Su et al., arxiv 2024. [paper]
  • Unified Training of Universal Time Series Forecasting Transformers, Woo et al., arxiv 2024. [paper]
  • Multi-Patch Prediction: Adapting LLMs for Time Series Representation Learning, Bian et al., arxiv 2024. [paper]
  • UniTS: Building a Unified Time Series Model, Gao et al., arxiv 2024. [paper][code]
  • Chronos: Learning the Language of Time Series, Ansari et al., arxiv 2024. [paper][code]
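
A minimal sketch of the serialization trick behind "Large Language Models Are Zero-Shot Time Series Forecasters" above (illustrative, not the paper's exact scheme): rescale the series, render values as fixed-precision strings joined by commas, prompt an LLM to continue the string, and parse the completion back into numbers.

  def serialize(values, digits=2):
      # Rescale so values are O(1), then render with fixed precision.
      scale = max(abs(v) for v in values) or 1.0
      text = ", ".join(f"{v / scale:.{digits}f}" for v in values)
      return scale, text

  def deserialize(text, scale):
      return [float(tok) * scale for tok in text.split(",") if tok.strip()]

  scale, prompt = serialize([12.0, 13.5, 15.1, 14.2])
  # prompt == "0.79, 0.89, 1.00, 0.94"; feed it to an LLM and decode the
  # continuation with deserialize(completion, scale).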

Repositories