HaoBytes
PhD Student @ University of Manchester · Research Intern @ Microsoft Research
University of Manchester · Manchester
HaoBytes's Stars
Mihaiii/backtrack_sampler
An easy-to-understand framework for LLM samplers that rewind and revise generated tokens
laiguokun/multivariate-time-series-data
epideep/ILI-Data
sail-sg/volo
VOLO: Vision Outlooker for Visual Recognition
eumemic/ai-legion
An LLM-powered autonomous agent platform
microsoft/autogen
A programming framework for agentic AI 🤖
Significant-Gravitas/AutoGPT
AutoGPT is the vision of accessible AI for everyone, to use and to build on. Our mission is to provide the tools, so that you can focus on what matters.
HITsz-TMG/Multi-agent-peer-review
Official implementation of our paper "Towards Reasoning in Large Language Models via Multi-Agent Peer Review Collaboration".
OpenBMB/ChatDev
Create Customized Software using Natural Language Idea (through LLM-powered Multi-Agent Collaboration)
AdityaLab/lstprompt
ngruver/llmtime
yxbian23/aLLM4TS
[ICML2024] Official repo for paper "Multi-Patch Prediction: Adapting LLMs for Time Series Representation Learning"
QiushiSun/Corex
[COLM'24] Corex: Pushing the Boundaries of Complex Reasoning through Multi-Model Collaboration
HaoUNSW/PISA
DC-research/TEMPO
The official code for "TEMPO: Prompt-based Generative Pre-trained Transformer for Time Series Forecasting (ICLR 2024)". TEMPO (v1.0) is one of the first open-source time series foundation models for forecasting.
Exafunction/codeium.vim
Free, ultrafast Copilot alternative for Vim and Neovim
imsyy/home
Personal homepage: my personal homepage, homepage source code, and homepage template
amazon-science/RAGChecker
RAGChecker: A Fine-grained Framework For Diagnosing RAG
AutoSurveys/AutoSurvey
AGI-Edgerunners/LLM-Agents-Papers
A repo lists papers related to LLM based agent
SakanaAI/AI-Scientist
The AI Scientist: Towards Fully Automated Open-Ended Scientific Discovery 🧑🔬
zhihanyue/ts2vec
A universal time series representation learning framework
time-series-foundation-models/lag-llama
Lag-Llama: Towards Foundation Models for Probabilistic Time Series Forecasting
Nixtla/nixtla
TimeGPT-1: a production-ready pretrained time series foundation model for forecasting and anomaly detection. A generative pretrained transformer for time series trained on over 100B data points, it accurately forecasts across domains such as retail, electricity, finance, and IoT with just a few lines of code 🚀.
amazon-science/chronos-forecasting
Chronos: Pretrained (Language) Models for Probabilistic Time Series Forecasting
DAMO-DI-ML/NeurIPS2023-One-Fits-All
The official code for "One Fits All: Power General Time Series Analysis by Pretrained LM (NeurIPS 2023 Spotlight)"
google-research/timesfm
TimesFM (Time Series Foundation Model) is a pretrained foundation model for time-series forecasting developed by Google Research.
karthik-codex/Autogen_GraphRAG_Ollama
Microsoft's GraphRAG + AutoGen + Ollama + Chainlit = Fully Local & Free Multi-Agent RAG Superbot
wdndev/llm_interview_note
Notes on knowledge and interview questions for large language model (LLM) algorithm/application engineers
chenfei-wu/TaskMatrix