Pinned Repositories
Time-LLM
[ICLR 2024] Official implementation of "🦙 Time-LLM: Time Series Forecasting by Reprogramming Large Language Models"
Autoformer
Code release for "Autoformer: Decomposition Transformers with Auto-Correlation for Long-Term Series Forecasting" (NeurIPS 2021), https://arxiv.org/abs/2106.13008
AutoTimes
Official implementation for "AutoTimes: Autoregressive Time Series Forecasters via Large Language Models"
CoST
gill
🐟 Code and models for the NeurIPS 2023 paper "Generating Images with Multimodal Language Models".
TimeMixer
[ICLR 2024] Official implementation of "TimeMixer: Decomposable Multiscale Mixing for Time Series Forecasting"
neuralforecast
Scalable and user-friendly neural 🧠 forecasting algorithms.
iTransformer
Official implementation for "iTransformer: Inverted Transformers Are Effective for Time Series Forecasting" (ICLR 2024 Spotlight), https://openreview.net/forum?id=JePfAI8fah
Time-Series-Library
A Library for Advanced Deep Time Series Models.
Time-MoE
Time-MoE: Billion-Scale Time Series Foundation Models with Mixture of Experts
kwuking's Repositories
kwuking/TimeMixer
[ICLR 2024] Official implementation of "TimeMixer: Decomposable Multiscale Mixing for Time Series Forecasting"
kwuking/AutoTimes
Official implementation for "AutoTimes: Autoregressive Time Series Forecasters via Large Language Models"
kwuking/Autoformer
Code release for "Autoformer: Decomposition Transformers with Auto-Correlation for Long-Term Series Forecasting" (NeurIPS 2021), https://arxiv.org/abs/2106.13008
kwuking/CoST
kwuking/gill
🐟 Code and models for the NeurIPS 2023 paper "Generating Images with Multimodal Language Models".
kwuking/google-research
Google Research
kwuking/iTransformer
Official implementation for "iTransformer: Inverted Transformers Are Effective for Time Series Forecasting" (ICLR 2024 Spotlight), https://openreview.net/forum?id=JePfAI8fah
kwuking/Koopa
Code release for "Koopa: Learning Non-stationary Time Series Dynamics with Koopman Predictors" (NeurIPS 2023), https://arxiv.org/abs/2305.18803
kwuking/ModaVerse
[CVPR2024] ModaVerse: Efficiently Transforming Modalities with LLMs
kwuking/NExT-GPT
Code and models for NExT-GPT: Any-to-Any Multimodal Large Language Model
kwuking/Time-Series-Library
A Library for Advanced Deep Time Series Models.
kwuking/PatchTST
An official implementation of PatchTST: "A Time Series is Worth 64 Words: Long-term Forecasting with Transformers" (ICLR 2023), https://arxiv.org/abs/2211.14730
kwuking/Time-MoE
Time-MoE: Billion-Scale Time Series Foundation Models with Mixture of Experts
kwuking/ts2vec
A universal time series representation learning framework