Pinned Repositories
baichuan-speedup
A pure C++ cross-platform LLM acceleration library with Python bindings; supports the Baichuan, GLM, LLaMA, and MOSS base models; runs ChatGLM-6B-class models smoothly on mobile, reaching 10,000+ tokens/s on a single GPU
bert-nmt
Incorporating the BERT model into an NMT model
bert-thai
BERT pre-training in Thai language
chatbot-base-on-Knowledge-Graph
A dialogue system for the medical vertical domain: deep learning for question parsing, with a knowledge graph for storing and querying knowledge
kaggleSolution
Machine learning algorithm practice
LeetCode
LeetCode problem practice
mrc-for-flat-nested-ner
Code for ACL 2020 paper `A Unified MRC Framework for Named Entity Recognition`
NERBertProject
A Chinese named entity recognition example based on the pretrained language model BERT
Task_nlp_resources
Task_simbert_project
A project demonstrating SimBERT usage
wanglaiqi's Repositories
wanglaiqi/mrc-for-flat-nested-ner
Code for ACL 2020 paper `A Unified MRC Framework for Named Entity Recognition`
wanglaiqi/chatbot-base-on-Knowledge-Graph
A dialogue system for the medical vertical domain: deep learning for question parsing, with a knowledge graph for storing and querying knowledge
wanglaiqi/Task_nlp_resources
wanglaiqi/kaggleSolution
Machine learning algorithm practice
wanglaiqi/LeetCode
LeetCode problem practice
wanglaiqi/NERBertProject
A Chinese named entity recognition example based on the pretrained language model BERT
wanglaiqi/Task_simbert_project
A project demonstrating SimBERT usage
wanglaiqi/baichuan-speedup
A pure C++ cross-platform LLM acceleration library with Python bindings; supports the Baichuan, GLM, LLaMA, and MOSS base models; runs ChatGLM-6B-class models smoothly on mobile, reaching 10,000+ tokens/s on a single GPU
wanglaiqi/bert-nmt
Incorporating the BERT model into an NMT model
wanglaiqi/bert-thai
BERT pre-training in Thai language
wanglaiqi/bert4keras
A lightweight reimplementation of BERT for Keras
wanglaiqi/Bi-SimCut
Code for NAACL 2022 main conference paper "Bi-SimCut: A Simple Strategy for Boosting Neural Machine Translation"
wanglaiqi/chineseocr_lite
Ultra-lightweight Chinese OCR with support for vertical text recognition and NCNN inference; PSENet (8.5M) + CRNN (6.3M) + AngleNet (1.5M), for a total model size of only 17M
wanglaiqi/fairseq
Facebook AI Research Sequence-to-Sequence Toolkit written in Python.
wanglaiqi/japanese-pretrained-models
Code for producing Japanese pretrained models provided by rinna Co., Ltd.
wanglaiqi/KnowledgeDistillation
Knowledge distillation for text classification in PyTorch: Chinese text classification with BERT and XLNet teacher models and a biLSTM student model
wanglaiqi/LLaMA-Efficient-Tuning
Fine-tuning LLaMA with PEFT (PT+SFT+RLHF with QLoRA)
wanglaiqi/Llama2-Chinese
Llama Chinese community: a Chinese Llama large language model, fully open source and commercially usable
wanglaiqi/MTBook
Machine Translation: Foundations and Models (《机器翻译:基础与模型》) by Tong Xiao and Jingbo Zhu
wanglaiqi/nlpcda
A one-click Chinese data augmentation package: NLP data augmentation, BERT-based augmentation, and EDA; install with pip install nlpcda
wanglaiqi/PaddleNLP
An NLP library with Awesome pre-trained Transformer models and easy-to-use interface, supporting wide-range of NLP tasks from research to industrial applications.
wanglaiqi/PKD-for-BERT-Model-Compression
pytorch implementation for Patient Knowledge Distillation for BERT Model Compression
wanglaiqi/Pretrained-Language-Model
Pretrained language model and its related optimization techniques developed by Huawei Noah's Ark Lab.
wanglaiqi/roberta_zh
RoBERTa pretrained models for Chinese
wanglaiqi/textRewriting
Chinese text rewriting
wanglaiqi/transformers
🤗 Transformers: State-of-the-art Natural Language Processing for Pytorch, TensorFlow, and JAX.
wanglaiqi/wanglaiqi.github.io