roberta
There are 363 repositories under the roberta topic.
microsoft/LoRA
Code for loralib, an implementation of "LoRA: Low-Rank Adaptation of Large Language Models"
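The idea behind LoRA is compact enough to sketch: the pre-trained weight W is frozen and only a low-rank update BA is trained, with B zero-initialized so training starts from the unmodified model. A minimal NumPy sketch of that math (sizes, scaling, and initialization chosen here for illustration; the actual loralib wraps PyTorch layers):

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_out, r, alpha = 8, 8, 2, 4   # illustrative sizes; rank r << d

W = rng.standard_normal((d_out, d_in))     # frozen pre-trained weight
A = rng.standard_normal((r, d_in)) * 0.01  # trainable low-rank factor
B = np.zeros((d_out, r))                   # zero-init: update starts at 0

def lora_forward(x):
    # Effective weight is W + (alpha / r) * B @ A; only A and B are trained.
    return x @ (W + (alpha / r) * (B @ A)).T

x = rng.standard_normal((1, d_in))
# Before any training, B = 0, so the output equals the frozen layer's output.
assert np.allclose(lora_forward(x), x @ W.T)
```

At inference time the update can be merged into W once, so the adapted layer costs no extra latency.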
ymcui/Chinese-BERT-wwm
Pre-Training with Whole Word Masking for Chinese BERT (the Chinese BERT-wwm model series)
jessevig/bertviz
BertViz: Visualize Attention in NLP Models (BERT, GPT2, BART, etc.)
lonePatient/awesome-pretrained-chinese-nlp-models
Awesome Pretrained Chinese NLP Models: a curated collection of high-quality Chinese pre-trained models, large models, multimodal models, and large language models
brightmart/albert_zh
A Lite BERT for Self-supervised Learning of Language Representations; ALBERT models pre-trained on large-scale Chinese corpora
CLUEbenchmark/CLUE
Chinese Language Understanding Evaluation Benchmark (CLUE): datasets, baselines, pre-trained models, corpus, and leaderboard
dbiir/UER-py
Open Source Pre-training Model Framework in PyTorch & Pre-trained Model Zoo
brightmart/roberta_zh
Chinese pre-trained RoBERTa models: RoBERTa for Chinese
guillaume-be/rust-bert
Rust native ready-to-use NLP pipelines and transformer-based models (BERT, DistilBERT, GPT2,...)
fhamborg/news-please
news-please - an integrated web crawler and information extractor for news that just works
microsoft/DeBERTa
The implementation of DeBERTa
deepset-ai/FARM
:house_with_garden: Fast & easy transfer learning for NLP. Harvesting language models for the industry. Focus on Question Answering.
Tencent/TurboTransformers
A fast and user-friendly runtime for Transformer inference (BERT, ALBERT, GPT2, decoders, etc.) on CPU and GPU.
CLUEbenchmark/CLUENER2020
CLUENER2020: a Chinese dataset for fine-grained named entity recognition
920232796/bert_seq2seq
A PyTorch implementation of BERT for seq2seq tasks using the UniLM scheme; now also handles automatic summarization, text classification, sentiment analysis, NER, part-of-speech tagging, and more, with support for T5 models and article continuation with GPT2.
Tencent/TencentPretrain
Tencent Pre-training framework in PyTorch & Pre-trained Model Zoo
CLUEbenchmark/CLUECorpus2020
CLUECorpus2020: a large-scale (100 GB) pre-training corpus for Chinese
grammarly/gector
Official implementation of the papers "GECToR – Grammatical Error Correction: Tag, Not Rewrite" (BEA-20) and "Text Simplification by Tagging" (BEA-21)
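The "Tag, Not Rewrite" approach reduces grammatical error correction to predicting a per-token edit tag, which a decoder then applies deterministically. A toy sketch under assumed tag names ($KEEP, $DELETE, $REPLACE_w, $APPEND_w; GECToR's real tag vocabulary is larger and includes grammar transformations):

```python
def apply_tags(tokens, tags):
    """Apply one edit tag per source token and return the corrected tokens."""
    out = []
    for tok, tag in zip(tokens, tags):
        if tag == "$KEEP":                     # keep the token unchanged
            out.append(tok)
        elif tag == "$DELETE":                 # drop the token
            continue
        elif tag.startswith("$REPLACE_"):      # substitute a new word
            out.append(tag[len("$REPLACE_"):])
        elif tag.startswith("$APPEND_"):       # keep token, insert word after it
            out.append(tok)
            out.append(tag[len("$APPEND_"):])
    return out

tokens = ["She", "go", "to", "school"]
tags   = ["$KEEP", "$REPLACE_went", "$KEEP", "$KEEP"]
print(" ".join(apply_tags(tokens, tags)))  # She went to school
```

Because the model only classifies tags instead of generating text, inference is a single (optionally iterated) tagging pass rather than autoregressive decoding.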
explosion/curated-transformers
🤖 A PyTorch library of curated Transformer models and their composable components
CLUEbenchmark/CLUEPretrainedModels
A collection of high-quality Chinese pre-trained models: state-of-the-art large models, the fastest small models, and dedicated similarity models
asyml/texar-pytorch
Integrating the Best of TF into PyTorch, for Machine Learning, Natural Language Processing, and Text Generation. This is part of the CASL project: http://casl-project.ai/
clue-ai/PromptCLUE
PromptCLUE: a zero-shot learning model with support for a full range of Chinese tasks
VinAIResearch/PhoBERT
PhoBERT: Pre-trained language models for Vietnamese (EMNLP-2020 Findings)
rinnakk/japanese-pretrained-models
Code for producing Japanese pretrained models provided by rinna Co., Ltd.
VinAIResearch/BERTweet
BERTweet: A pre-trained language model for English Tweets (EMNLP-2020)
KLUE-benchmark/KLUE
📖 Korean NLU Benchmark
EricFillion/happy-transformer
Happy Transformer makes it easy to fine-tune and perform inference with NLP Transformer models.
hemingkx/CLUENER2020
A PyTorch implementation of BiLSTM/BERT/RoBERTa (+CRF) models for named entity recognition.
HHousen/TransformerSum
Models to perform neural summarization (extractive and abstractive) using machine learning transformers and a tool to convert abstractive summarization datasets to the extractive task.
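Converting an abstractive summarization dataset to the extractive task amounts to building an "oracle": selecting the document sentences that best cover the reference summary, then using those selections as extractive labels. A simplified unigram-overlap stand-in for the ROUGE-based greedy selection such tools typically use (function name and scoring are illustrative):

```python
def to_extractive(doc_sents, abstract, k=2):
    """Greedily pick up to k sentence indices whose words best cover the
    abstract. Real converters score candidates with ROUGE instead of raw
    unigram overlap, but the greedy loop is the same shape."""
    target = set(abstract.lower().split())
    chosen, covered = [], set()
    for _ in range(k):
        best, best_gain = None, 0
        for i, sent in enumerate(doc_sents):
            if i in chosen:
                continue
            gain = len((set(sent.lower().split()) & target) - covered)
            if gain > best_gain:            # sentence adds new summary words
                best, best_gain = i, gain
        if best is None:                    # no sentence improves coverage
            break
        chosen.append(best)
        covered |= set(doc_sents[best].lower().split()) & target
    return sorted(chosen)

doc = ["the cat sat on the mat",
       "dogs bark loudly",
       "the cat likes fish"]
print(to_extractive(doc, "the cat sat on the mat"))  # [0]
```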
cliang1453/BOND
BOND: BERT-Assisted Open-Domain Named Entity Recognition with Distant Supervision
dpressel/mint
MinT: Minimal Transformer Library and Tutorials
brightmart/xlnet_zh
Pre-trained Chinese XLNet models (XLNet_Large for Chinese)
iflytek/MiniRBT
MiniRBT (a series of small Chinese pre-trained models)
sudharsan13296/Getting-Started-with-Google-BERT
Build and train state-of-the-art natural language processing models using BERT
liuyukid/transformers-ner
Named entity recognition with Transformers in PyTorch
amansrivastava17/embedding-as-service
One-Stop Solution to encode sentence to fixed length vectors from various embedding techniques
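Whatever the underlying embedding technique, the fixed-length part usually comes from pooling a variable-length sequence of token vectors into one sentence vector. A minimal masked mean-pooling sketch in NumPy (one of several pooling strategies such a service might offer; names and shapes here are illustrative):

```python
import numpy as np

def mean_pool(token_embeddings, mask):
    """Masked mean pooling: average only the real (non-padding) token vectors,
    yielding a fixed-length vector regardless of sequence length."""
    mask = np.asarray(mask, dtype=float)[:, None]      # (seq_len, 1)
    return (token_embeddings * mask).sum(axis=0) / mask.sum()

emb = np.array([[1.0, 2.0],
                [3.0, 4.0],
                [0.0, 0.0]])        # last row is padding
vec = mean_pool(emb, [1, 1, 0])     # mask marks real tokens
print(vec)  # [2. 3.] -- always embedding-dim sized, here 2
```

The mask matters: averaging over padding positions would shrink the vector toward zero for short sentences.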