sloppiest's Stars
BlinkDL/ChatRWKV
ChatRWKV is like ChatGPT but powered by the RWKV (100% RNN) language model, and it is open source.
OpenMOSS/MOSS
An open-source tool-augmented conversational language model from Fudan University
LAION-AI/Open-Assistant
OpenAssistant is a chat-based assistant that understands tasks, can interact with third-party systems, and can retrieve information dynamically in order to do so.
binary-husky/gpt_academic
Provides a practical interaction interface for LLMs such as GPT/GLM, with special optimizations for paper reading, polishing, and writing. Modular design with support for custom shortcut buttons & function plugins, code analysis & self-translation for Python, C++, and other projects, PDF/LaTeX paper translation & summarization, parallel queries to multiple LLMs, and local models such as chatglm3. Integrations include Tongyi Qianwen, deepseekcoder, iFlytek Spark, ERNIE Bot (Wenxin Yiyan), llama2, rwkv, claude2, moss, etc.
FreedomIntelligence/Medical_NLP
Medical NLP competitions, datasets, large models, and papers
lonePatient/awesome-pretrained-chinese-nlp-models
Awesome Pretrained Chinese NLP Models: a curated collection of high-quality Chinese pre-trained models, large models, multimodal models, and large language models
PaddlePaddle/PaddleNLP
👑 Easy-to-use and powerful NLP and LLM library with 🤗 Awesome model zoo, supporting a wide range of NLP tasks from research to industrial applications, including 🗂 Text Classification, 🔍 Neural Search, ❓ Question Answering, ℹ️ Information Extraction, 📄 Document Intelligence, 💌 Sentiment Analysis, etc.
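A minimal sketch of out-of-the-box use, assuming PaddleNLP's Taskflow API; the task name and example sentence are illustrative only:

# Minimal PaddleNLP sketch (assumes paddlenlp and paddlepaddle are installed).
from paddlenlp import Taskflow

# Taskflow wraps common NLP tasks behind a single call; "ner" loads a Chinese NER model.
ner = Taskflow("ner")
print(ner("华为技术有限公司位于深圳"))  # prints recognized spans with their tags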
DA-southampton/NLP_ability
A summary of the knowledge an NLP engineer needs to accumulate, including interview questions, fundamentals of all kinds, and engineering skills, to strengthen core competitiveness
PaddlePaddle/awesome-DeepLearning
Introductory, advanced, and featured deep learning courses, academic case studies, industrial practice cases, a deep learning knowledge encyclopedia, and an interview question bank: the courses, cases, and knowledge of deep learning and AI
GanjinZero/GTS
Code for "Unsupervised multi-granular Chinese word segmentation and term discovery via graph partition" [JBI]
PaddlePaddle/Research
Novel deep learning research works built with PaddlePaddle
MatNLP/SMedBERT
SMedBERT: A Knowledge-Enhanced Pre-trained Language Model with Structured Semantics for Medical Text Mining
WENGSYX/Chinese-Word2vec-Medicine
Chinese Word2vec Medicine: Chinese medical word embeddings
trueto/medbert
This project open-sources the models from the master's thesis "Exploration and Research on the Application of the BERT Model in Chinese Clinical Natural Language Processing"
lxy444/bertcner
Chinese clinical named entity recognition using a pre-trained BERT model
GanjinZero/awesome_Chinese_medical_NLP
A curated collection of open Chinese medical NLP resources: terminologies / corpora / word embeddings / pre-trained models / knowledge graphs / named entity recognition / QA / information extraction / models / papers / etc.
brightmart/albert_zh
A Lite BERT for Self-Supervised Learning of Language Representations; large-scale Chinese pre-trained ALBERT models
dmis-lab/biobert
Bioinformatics 2020: BioBERT: a pre-trained biomedical language representation model for biomedical text mining
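BioBERT checkpoints can be loaded through the Hugging Face transformers library; a minimal sketch follows, where the checkpoint id dmis-lab/biobert-v1.1 is an assumption about what is published on the Hub:

# Minimal sketch: encode a biomedical sentence with a BioBERT checkpoint.
# The checkpoint id "dmis-lab/biobert-v1.1" is an assumption, not confirmed by this list.
from transformers import AutoTokenizer, AutoModel

tok = AutoTokenizer.from_pretrained("dmis-lab/biobert-v1.1")
model = AutoModel.from_pretrained("dmis-lab/biobert-v1.1")

inputs = tok("Aspirin inhibits platelet aggregation.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (1, sequence_length, hidden_size)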
google-research/albert
ALBERT: A Lite BERT for Self-supervised Learning of Language Representations
albertlauncher/albert
A fast and flexible keyboard launcher
GanjinZero/ChineseEHRBert
A Chinese EHR BERT pre-trained model.
huggingface/transformers
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
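A minimal quickstart with the transformers pipeline API; the default sentiment checkpoint is whatever the installed library version downloads, and the input sentence is illustrative:

# Minimal transformers quickstart: a ready-made sentiment-analysis pipeline.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")  # downloads a default English sentiment model
print(classifier("Open-source NLP libraries make research reproducible."))
# -> [{'label': 'POSITIVE', 'score': 0.99...}]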
allenai/dont-stop-pretraining
Code associated with the "Don't Stop Pretraining" ACL 2020 paper