curtainwang's Stars
xtekky/gpt4free
The official gpt4free repository | a collection of various powerful language models
labmlai/annotated_deep_learning_paper_implementations
🧑‍🏫 60+ implementations/tutorials of deep learning papers with side-by-side notes 📝; including transformers (original, xl, switch, feedback, vit, ...), optimizers (adam, adabelief, sophia, ...), gans (cyclegan, stylegan2, ...), 🎮 reinforcement learning (ppo, dqn), capsnet, distillation, ... 🧠
chatchat-space/Langchain-Chatchat
Langchain-Chatchat (formerly langchain-ChatGLM): a local-knowledge-based RAG and Agent application built with Langchain and LLMs such as ChatGLM, Qwen, and Llama
Dao-AILab/flash-attention
Fast and memory-efficient exact attention
BlinkDL/RWKV-LM
RWKV is an RNN with transformer-level LLM performance. It can be trained directly like a GPT (parallelizable), combining the best of RNNs and transformers: great performance, fast inference, low VRAM usage, fast training, "infinite" ctx_len, and free sentence embeddings.
pwxcoo/chinese-xinhua
:orange_book: Chinese Xinhua dictionary database, including xiehouyu (two-part allegorical sayings), idioms, words, and Chinese characters.
NielsRogge/Transformers-Tutorials
This repository contains demos I made with the Transformers library by HuggingFace.
facebookresearch/xformers
Hackable and optimized Transformers building blocks, supporting a composable construction.
OFA-Sys/Chinese-CLIP
Chinese version of CLIP which achieves Chinese cross-modal retrieval and representation generation.
IDEA-CCNL/Fengshenbang-LM
Fengshenbang-LM (封神榜大模型) is an open-source large-model ecosystem led by the Cognitive Computing and Natural Language Research Center of the IDEA Research Institute, serving as infrastructure for Chinese AIGC and cognitive intelligence.
mymusise/ChatGLM-Tuning
A fine-tuning solution based on ChatGLM-6B + LoRA
datawhalechina/hugging-llm
HuggingLLM, Hugging Future.
IST-DASLab/gptq
Code for the ICLR 2023 paper "GPTQ: Accurate Post-training Quantization of Generative Pretrained Transformers".
Werneror/Poetry
A very comprehensive collection of classical Chinese poetry, containing over 850,000 poems from the pre-Qin period to the modern era.
pytries/marisa-trie
Static memory-efficient Trie-like structures for Python based on marisa-trie C++ library.
lucidrains/mixture-of-experts
A PyTorch implementation of Sparsely-Gated Mixture of Experts, for massively increasing the parameter count of language models
zphang/minimal-llama
renmada/t5-pegasus-pytorch
yangjianxin1/LLMPruner
LC1332/Luotuo-Text-Embedding
Luotuo Embedding (骆驼嵌入) is a text embedding model developed by 李鲁鲁, 冷子昂, 陈启源, 蒟蒻, et al.
intersun/PKD-for-BERT-Model-Compression
PyTorch implementation of Patient Knowledge Distillation for BERT Model Compression
sgugger/Adam-experiments
Experiments with Adam/AdamW/amsgrad
bojone/oppo-text-match
A baseline for the XiaoBu Assistant dialogue short-text semantic matching task
foamliu/Sentiment-Analysis
Fine-grained sentiment analysis of user reviews
yangjianxin1/OFA-Chinese
A Chinese OFA model with a transformers-style architecture
BlinkDL/SmallInitEmb
LayerNorm(SmallInit(Embedding)) in a Transformer to improve convergence
ChihoLeung/RoBERTa_Emotion_Classification
Chinese emotion recognition on Weibo based on the RoBERTa-wwm-ext model
shuxinyin/T5-NLP
Explore various Chinese NLP tasks, such as text classification and text summarization, using T5/mT5/T5-PEGASUS.
jlealtru/website_tutorials
This repository contains tutorials published on [my personal website](https://jlealtru.github.io/)
PistonY/pytorchdeploy
PyTorch C++ interface