tzwo's Stars
521xueweihan/HelloGitHub
:octocat: Share interesting, entry-level open source projects on GitHub.
binary-husky/gpt_academic
A practical interactive interface for GPT/GLM and other large language models (LLMs), specially optimized for reading, polishing, and writing papers. Modular design with support for custom shortcut buttons & function plugins; project analysis & self-translation for Python, C++, and other codebases; PDF/LaTeX paper translation & summarization; parallel queries to multiple LLMs; and local models such as chatglm3. Integrates 通义千问, deepseekcoder, 讯飞星火, 文心一言, llama2, rwkv, claude2, moss, etc.
labmlai/annotated_deep_learning_paper_implementations
🧑🏫 60+ implementations/tutorials of deep learning papers with side-by-side notes 📝; including transformers (original, xl, switch, feedback, vit, ...), optimizers (adam, adabelief, sophia, ...), GANs (cyclegan, stylegan2, ...), 🎮 reinforcement learning (ppo, dqn), capsnet, distillation, ... 🧠
ruanyf/weekly
Technology Enthusiasts Weekly, published every Friday.
vllm-project/vllm
A high-throughput and memory-efficient inference and serving engine for LLMs
chenzomi12/AISystem
AISystem covers AI systems, including AI chips, AI compilers, AI inference and training frameworks, and other full-stack low-level AI technologies.
RUCAIBox/LLMSurvey
The official GitHub page for the survey paper "A Survey of Large Language Models".
liguodongiot/llm-action
This project shares the technical principles behind large language models along with hands-on experience (LLM engineering and deploying LLM applications).
DA-southampton/NLP_ability
A summary of the knowledge an NLP engineer needs to accumulate, including interview questions, fundamentals, and engineering skills, to strengthen core competitiveness.
WooooDyy/LLM-Agent-Paper-List
The paper list accompanying the 86-page survey "The Rise and Potential of Large Language Model Based Agents: A Survey" by Zhiheng Xi et al.
adapter-hub/adapters
A Unified Library for Parameter-Efficient and Modular Transfer Learning
datawhalechina/learn-nlp-with-transformers
A repo illustrating the usage of Transformers, written in Chinese.
hanmq/MachineLearning_Zhouzhihua_ProblemSets
Personal answers to the exercises in Zhou Zhihua's book "Machine Learning" (周志华《机器学习》); each algorithm is also implemented with numpy and pandas.
zhangyikaii/NJUCS-Course-Material
Course materials, assignments, code, and lab reports for the Nanjing University CS department. NJU-CS course-sharing project :rice:
AGI-Edgerunners/LLM-Adapters
Code for our EMNLP 2023 Paper: "LLM-Adapters: An Adapter Family for Parameter-Efficient Fine-Tuning of Large Language Models"
hrcheng1066/awesome-pruning
cambridgeltl/autopeft
AutoPEFT: Automatic Configuration Search for Parameter-Efficient Fine-Tuning (Zhou et al.; TACL)
azhe198827/channel_prune
jiangshen95/AutoPEFT