taodongjie's Stars
LlamaFamily/Llama-Chinese
Llama Chinese community. An online Llama3 demo and fine-tuned models are available, with the latest Llama3 learning resources collected in real time; all code has been updated for Llama3. Building the best Chinese Llama large model, fully open source and commercially usable.
huggingface/peft
🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning.
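Several repos in this list build on PEFT-style fine-tuning. A minimal pure-Python sketch of the low-rank adaptation (LoRA) idea that PEFT implements; the shapes, values, and helper names below are illustrative, not the peft library's API:

```python
# Sketch of LoRA: the frozen weight W is adapted by a low-rank update,
# W_eff = W + (alpha / r) * B @ A, where only the small matrices A and B
# are trained. All names and sizes here are illustrative.

def matmul(A, B):
    """Naive matrix multiply for small illustrative matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def lora_weight(W, A, B, alpha, r):
    """Effective weight W + (alpha / r) * B @ A.

    W: (d_out x d_in) frozen base weight
    B: (d_out x r), A: (r x d_in) small trainable matrices
    """
    scale = alpha / r
    delta = matmul(B, A)  # low-rank update, same shape as W
    return [[W[i][j] + scale * delta[i][j]
             for j in range(len(W[0]))] for i in range(len(W))]

# Tiny example: d_out = d_in = 2, rank r = 1
W = [[1.0, 0.0], [0.0, 1.0]]
B = [[1.0], [2.0]]   # d_out x r
A = [[0.5, 0.5]]     # r x d_in
W_eff = lora_weight(W, A, B, alpha=1, r=1)
print(W_eff)  # [[1.5, 0.5], [1.0, 2.0]]
```

Because only A and B (rank r) are updated, the number of trainable parameters drops from d_out * d_in to r * (d_out + d_in).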
meta-llama/llama
Inference code for Llama models
Kent0n-Li/ChatDoctor
scutcyr/SoulChat
SoulChat: a Chinese large language model for mental-health dialogue
Toyhom/Chinese-medical-dialogue-data
Chinese medical dialogue dataset
zhangsheng93/cMedQA2
An updated version of the dataset for Chinese community medical question answering.
lemuria-wchen/imcs21
Code and dataset for our Bioinformatics 2022 paper: "A Benchmark for Automatic Medical Consultation System: Frameworks, Tasks and Datasets"
king-yyf/CMeKG_tools
SCIR-HI/Huatuo-Llama-Med-Chinese
Repo for BenTsao (original name: HuaTuo, 华驼): instruction-tuning large language models with Chinese medical knowledge.
yizhongw/self-instruct
Aligning pretrained language models with instruction data generated by themselves.
FreedomIntelligence/HuatuoGPT
HuatuoGPT, Towards Taming Language Models To Be a Doctor. (An Open Medical GPT)
scutcyr/BianQue
BianQue (扁鹊): a Chinese medical dialogue model
PKU-YuanGroup/ChatLaw
ChatLaw: a powerful LLM tailored for the Chinese legal domain
THUDM/ChatGLM2-6B
ChatGLM2-6B: an open bilingual chat LLM
Dao-AILab/flash-attention
Fast and memory-efficient exact attention
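The "exact attention" that flash-attention computes in tiled, memory-efficient form is standard scaled dot-product attention, softmax(QK^T / sqrt(d)) V. A tiny pure-Python reference of that operation (shapes and names are illustrative; FlashAttention produces the same result without materializing the full score matrix):

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(Q, K, V):
    """softmax(Q K^T / sqrt(d)) V for lists of row vectors."""
    d = len(Q[0])
    out = []
    for q in Q:
        # one row of scores: q . k / sqrt(d) for every key
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in K]
        w = softmax(scores)
        # weighted sum of the value rows
        out.append([sum(wi * v[j] for wi, v in zip(w, V))
                    for j in range(len(V[0]))])
    return out

# Identical keys -> uniform weights -> output is the mean of the values.
print(attention([[1.0, 0.0]],
                [[1.0, 0.0], [1.0, 0.0]],
                [[2.0, 0.0], [4.0, 0.0]]))  # [[3.0, 0.0]]
```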
rmlzy/my-ebook
My collection of e-books, mainly on software development
AI4Finance-Foundation/FinGPT
FinGPT: open-source financial large language models. Trained models are released on Hugging Face.
mymusise/ChatGLM-Tuning
A LoRA-based fine-tuning recipe for ChatGLM-6B
Blealtan/RWKV-LM-LoRA
RWKV is an RNN with transformer-level LLM performance. It can be trained directly like a GPT (parallelizable), combining the best of RNNs and transformers: great performance, fast inference, low VRAM use, fast training, "infinite" ctx_len, and free sentence embeddings.
BlinkDL/ChatRWKV
ChatRWKV is like ChatGPT but powered by the RWKV (100% RNN) language model, and is open source.
BlinkDL/RWKV-LM
RWKV is an RNN with transformer-level LLM performance. It can be trained directly like a GPT (parallelizable), combining the best of RNNs and transformers: great performance, fast inference, low VRAM use, fast training, "infinite" ctx_len, and free sentence embeddings.
THUDM/P-tuning-v2
An optimized deep prompt tuning strategy comparable to fine-tuning across scales and tasks
HarderThenHarder/transformers_tasks
⭐️ NLP Algorithms with transformers lib. Supporting Text-Classification, Text-Generation, Information-Extraction, Text-Matching, RLHF, SFT etc.
hpcaitech/ColossalAI
Making large AI models cheaper, faster and more accessible
mlc-ai/mlc-llm
Universal LLM Deployment Engine with ML Compilation
huggingface/transformers
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
horovod/horovod
Distributed training framework for TensorFlow, Keras, PyTorch, and Apache MXNet.
facebookresearch/faiss
A library for efficient similarity search and clustering of dense vectors.
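The core operation faiss accelerates is nearest-neighbor search over dense vectors. A brute-force pure-Python sketch of the same lookup (faiss replaces this linear scan with optimized indexes such as IVF, HNSW, and PQ; the names here are illustrative, not the faiss API):

```python
def l2_sq(a, b):
    """Squared Euclidean distance between two vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def search(index_vectors, query, k):
    """Return (id, distance) pairs for the k nearest stored vectors."""
    scored = sorted(((i, l2_sq(v, query)) for i, v in enumerate(index_vectors)),
                    key=lambda pair: pair[1])
    return scored[:k]

vectors = [[0.0, 0.0], [1.0, 1.0], [5.0, 5.0]]
hits = search(vectors, [0.9, 0.9], k=2)
print([i for i, _ in hits])  # [1, 0]
```

This scan is O(n) per query; faiss's approximate indexes trade a little recall for sub-linear query time over millions of vectors.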
yym6472/ConSERT
Code for our ACL 2021 paper - ConSERT: A Contrastive Framework for Self-Supervised Sentence Representation Transfer