
Lag-Llama: Towards Foundation Models for Probabilistic Time Series Forecasting

[Figure: Lag-Llama architecture]

Lag-Llama is the first open-source foundation model for time series forecasting!

[Tweet Thread] [Model Weights] [Colab Demo 1: Zero-Shot Forecasting] [Colab Demo 2: Preliminary Finetuning] [GitHub] [Paper]


This repository houses the Lag-Llama architecture.


  • Coming Next: a detailed fine-tuning tutorial, with examples on real-world datasets and best practices for using Lag-Llama! (expected by end of March) 🚀

Updates:

  • 7-Mar-2024: We have released a preliminary Colab Demo 2 for finetuning while we prepare a detailed tutorial. Note that this demo is preliminary and should not be used for benchmarking; a detailed demo with benchmarking instructions will follow along with the tutorial.
  • 17-Feb-2024: We have released an updated Colab Demo 1 for zero-shot forecasting that shows how to load time series in different formats.
  • 7-Feb-2024: We released Lag-Llama, with open-source model checkpoints and a Colab Demo for zero-shot forecasting.

Current Features:

💫 Zero-shot forecasting on a dataset of any frequency and for any prediction length, using Colab Demo 1 (a minimal code sketch follows the feature list).

💫 (Preliminary) Finetuning on a dataset using Colab Demo 2.
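
For orientation, here is a minimal sketch of how the zero-shot workflow in Colab Demo 1 is structured, assuming the GluonTS-based API used there. The import path, the checkpoint layout (`hyper_parameters` → `model_kwargs`), the constructor arguments, and all numeric settings below are assumptions taken from the demo at the time of writing and may change; defer to the Colab demos for the up-to-date, runnable version.

```python
import torch
import pandas as pd

from gluonts.dataset.pandas import PandasDataset
from gluonts.evaluation import make_evaluation_predictions

# Assumed import path, taken from the Colab demos; check the repo for changes.
from lag_llama.gluon.estimator import LagLlamaEstimator

device = "cuda" if torch.cuda.is_available() else "cpu"
prediction_length = 24  # forecast horizon (hypothetical choice)
context_length = 32     # history length fed to the model (hypothetical choice)

# Load the released checkpoint (see the Model Weights link above).
ckpt = torch.load("lag-llama.ckpt", map_location=device)
model_kwargs = ckpt["hyper_parameters"]["model_kwargs"]  # assumed checkpoint layout

estimator = LagLlamaEstimator(
    ckpt_path="lag-llama.ckpt",
    prediction_length=prediction_length,
    context_length=context_length,
    # Architecture hyperparameters read from the checkpoint (argument names
    # assumed from the demo; the tutorial will document the definitive settings):
    input_size=model_kwargs["input_size"],
    n_layer=model_kwargs["n_layer"],
    n_embd_per_head=model_kwargs["n_embd_per_head"],
    n_head=model_kwargs["n_head"],
    scaling=model_kwargs["scaling"],
    time_feat=model_kwargs["time_feat"],
)

# Standard GluonTS estimator-to-predictor wiring.
transformation = estimator.create_transformation()
module = estimator.create_lightning_module()
predictor = estimator.create_predictor(transformation, module)

# Any GluonTS-compatible dataset works; here, a single toy hourly series.
df = pd.DataFrame(
    {"target": [float(i % 24) for i in range(200)]},
    index=pd.date_range("2024-01-01", periods=200, freq="h"),
)
dataset = PandasDataset(df, target="target")

forecast_it, ts_it = make_evaluation_predictions(dataset=dataset, predictor=predictor)
forecast = next(iter(forecast_it))
print(forecast.mean)  # mean of the sampled probabilistic forecast
```

Finetuning (Colab Demo 2) follows the usual GluonTS pattern: an estimator configured with training epochs can be trained on your own dataset via `estimator.train(...)`; the preliminary demo linked above shows the current recipe.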


Coming Soon:

⭐ A tutorial for finetuning Lag-Llama.

⭐ A tutorial for pretraining Lag-Llama on your own large-scale data.

⭐ Scripts to reproduce all results in the paper.

⭐ An online Gradio demo where you can upload time series, get zero-shot predictions, and perform finetuning.


Stay Tuned!🦙


Citing this work

Please use the following BibTeX entry to cite Lag-Llama.

@misc{rasul2024lagllama,
      title={Lag-Llama: Towards Foundation Models for Probabilistic Time Series Forecasting}, 
      author={Kashif Rasul and Arjun Ashok and Andrew Robert Williams and Hena Ghonia and Rishika Bhagwatkar and Arian Khorasani and Mohammad Javad Darvishi Bayazi and George Adamopoulos and Roland Riachi and Nadhir Hassen and Marin Biloš and Sahil Garg and Anderson Schneider and Nicolas Chapados and Alexandre Drouin and Valentina Zantedeschi and Yuriy Nevmyvaka and Irina Rish},
      year={2024},
      eprint={2310.08278},
      archivePrefix={arXiv},
      primaryClass={cs.LG}
}