TPF2017's Stars
deepseek-ai/DeepSeek-V2
DeepSeek-V2: A Strong, Economical, and Efficient Mixture-of-Experts Language Model
Hank0626/PDF
An official implementation of "Periodicity Decoupling Framework for Long-term Series Forecasting" (ICLR 2024)
SalesforceAIResearch/uni2ts
Unified Training of Universal Time Series Forecasting Transformers
salesforce/Merlion
Merlion: A Machine Learning Framework for Time Series Intelligence
ibm-granite/granite-tsfm
Foundation Models for Time Series
KindXiaoming/pykan
Kolmogorov-Arnold Networks
amazon-science/chronos-forecasting
Chronos: Pretrained Models for Probabilistic Time Series Forecasting
state-spaces/mamba
Mamba SSM architecture
KimMeen/Time-LLM
[ICLR 2024] Official implementation of "🦙 Time-LLM: Time Series Forecasting by Reprogramming Large Language Models"
DAMO-DI-ML/NeurIPS2023-One-Fits-All
The official code for "One Fits All: Power General Time Series Analysis by Pretrained LM (NeurIPS 2023 Spotlight)"
dwalton76/rubiks-cube-NxNxN-solver
A generic Rubik's Cube solver
kyo-takano/efficientcube
State-of-the-art method for solving the Rubik's Cube
forestagostinelli/DeepCubeA
Code for DeepCubeA, a deep reinforcement learning algorithm that can learn to solve the Rubik's Cube.
vnpy/vnpy
A Python-based open-source framework for building quantitative trading platforms.
Jack-Cherish/quantitative
Quantitative trading in Python 3.
qingsongedu/Awesome-TimeSeries-SpatioTemporal-LM-LLM
A professional list on Large (Language) Models and Foundation Models (LLM, LM, FM) for Time Series, Spatiotemporal, and Event Data.
thuml/Time-Series-Library
A Library for Advanced Deep Time Series Models.
liaoyuhua/LLM4TS
Large Language & Foundation Models for Time Series.
DeepRLChinese/DeepRL-Chinese
wangshusen/DRL
Deep Reinforcement Learning
datamllab/rlcard
Reinforcement Learning / AI Bots in Card (Poker) Games - Blackjack, Leduc, Texas, DouDizhu, Mahjong, UNO.
microsoft/DeepSpeed
DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
yuqinie98/PatchTST
An official implementation of PatchTST: "A Time Series is Worth 64 Words: Long-term Forecasting with Transformers" (ICLR 2023) https://arxiv.org/abs/2211.14730
google-research/google-research
Google Research
THUDM/ChatGLM-6B
ChatGLM-6B: An Open Bilingual Dialogue Language Model
unit8co/darts
A python library for user-friendly forecasting and anomaly detection on time series.
Significant-Gravitas/AutoGPT
AutoGPT is the vision of accessible AI for everyone, to use and to build on. Our mission is to provide the tools, so that you can focus on what matters.
meta-llama/llama
Inference code for Llama models
nomic-ai/gpt4all
GPT4All: Run Local LLMs on Any Device. Open-source and available for commercial use.
getcursor/cursor
The AI Code Editor