gdh756462786's Stars
microsoft/JARVIS
JARVIS, a system to connect LLMs with the ML community. Paper: https://arxiv.org/pdf/2303.17580.pdf
microsoft/unilm
Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities
LlamaFamily/Llama-Chinese
Llama Chinese community. Llama3 online demo and fine-tuned models are available, the latest Llama3 learning materials are collected in real time, and all code has been updated to support Llama3. Aims to build the best Chinese Llama LLM; fully open source and available for commercial use.
openai/DALL-E
PyTorch package for the discrete VAE used for DALL·E.
bojone/bert4keras
Keras implementation of Transformers for humans
HIT-SCIR/ltp
Language Technology Platform
elyase/awesome-gpt3
km1994/nlp_paper_study
This repository mainly collects reading notes on top-conference papers relevant to NLP algorithm engineers
hiyouga/ChatGLM-Efficient-Tuning
Fine-tuning ChatGLM-6B with PEFT | Efficient PEFT-based ChatGLM fine-tuning
MaartenGr/KeyBERT
Minimal keyword extraction with BERT
yuanzhoulvpi2017/zero_nlp
Chinese NLP solutions (large language models, data, models, training, inference)
baidu/AnyQ
FAQ-based Question Answering System
google-research/electra
ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators
HarderThenHarder/transformers_tasks
⭐️ NLP algorithms built on the transformers library, supporting text classification, text generation, information extraction, text matching, RLHF, SFT, etc.
Separius/awesome-fast-attention
A list of efficient attention modules
alibaba/EasyTransfer
EasyTransfer is designed to make the development of transfer learning in NLP applications easier.
XiaoMi/MiNLP
XiaoMi Natural Language Processing Toolkits
gregversteeg/corex_topic
Hierarchical unsupervised and semi-supervised topic models for sparse count data with CorEx
smoothnlp/SmoothNLP
An NLP toolset with a focus on explainable NLP techniques and explainable inference
autoliuweijie/FastBERT
The source code of FastBERT (ACL 2020)
letiantian/Pinyin2Hanzi
Pinyin-to-Chinese-character conversion and a pinyin input method engine (e.g. pin yin -> 拼音)
thunlp/OpenMatch
An Open-Source Package for Information Retrieval.
liqima/faiss_note
Faiss wiki in Chinese.
jd-aig/nlp_baai
NLP models and code for the BAAI-JD joint project.
thu-coai/cotk
Conversational Toolkit. An Open-Source Toolkit for Fast Development and Fair Evaluation of Text Generation
duongkstn/albert-vi-as-service
A fork of bert-as-service for deploying albert_vi
hgliyuhao/Prompt4Classification
hongshengxin/BloomCat
A complete BLOOM-based LLM training pipeline, covering pretraining, SFT, LoRA, QLoRA, and PPO
jingyonglin/jingyonglin.github.io
gdh756462786/classifier_multi_label
Multi-label text classification with BERT and ALBERT (multi-label, classifier, text classification, multi-label-classification)