- PromptCast: A New Prompt-based Learning Paradigm for Time Series Forecasting, in arXiv 2022. [Paper]
- One Fits All: Power General Time Series Analysis by Pretrained LM, in arXiv 2023. [Paper]
- Temporal Data Meets LLM -- Explainable Financial Time Series Forecasting, in arXiv 2023. [Paper]
- TEST: Text Prototype Aligned Embedding to Activate LLM's Ability for Time Series. [Paper]
- LLM4TS: Two-Stage Fine-Tuning for Time-Series Forecasting with Pre-Trained LLMs. [Paper]
- Large Language Models are Few-Shot Health Learners, in arXiv 2023. [Paper]
- Frozen Language Model Helps ECG Zero-Shot Learning, in arXiv 2023. [Paper]
- A Survey on Time-Series Pre-Trained Models, in arXiv 2023. [Paper]
- Transfer Learning for Time Series Forecasting. [GitHub]
- TST: A Transformer-based Framework for Multivariate Time Series Representation Learning. [Paper]
- Ti-MAE: Self-Supervised Masked Time Series Autoencoders. [Paper]
- SimMTM: A Simple Pre-Training Framework for Masked Time-Series Modeling. [Paper]
- CoST: Contrastive Learning of Disentangled Seasonal-Trend Representations for Time Series Forecasting. [Paper]
- TS2Vec: Towards Universal Representation of Time Series. [Paper]
- Recommendation as Language Processing (RLP): A Unified Pretrain, Personalized Prompt & Predict Paradigm (P5), in arXiv 2022. [Paper]
- LLM4Rec. [GitHub]
- AnyPredict: Foundation Model for Tabular Prediction, in arXiv 2023. [Paper]
- XTab: Cross-table Pretraining for Tabular Transformers, in ICML 2023. [Paper]
- Awesome-LLMOps. [GitHub]