bart
There are 212 repositories under the bart topic.
bytedance/lightseq
LightSeq: A High Performance Library for Sequence Processing and Generation
dbiir/UER-py
Open Source Pre-training Model Framework in PyTorch & Pre-trained Model Zoo
guillaume-be/rust-bert
Rust native ready-to-use NLP pipelines and transformer-based models (BERT, DistilBERT, GPT2,...)
nlpodyssey/spago
Self-contained Machine Learning and Natural Language Processing library in Go
Tencent/TencentPretrain
Tencent Pre-training framework in PyTorch & Pre-trained Model Zoo
shibing624/textgen
TextGen: implementations of text generation models, including LLaMA, ChatGLM, BLOOM, GPT2, Seq2Seq, BART, T5, SongNet, UDA, and more; training and inference work out of the box.
asahi417/lm-question-generation
Multilingual/multi-domain question generation datasets, models, and a Python library for question generation.
nlpodyssey/cybertron
Cybertron: the home planet of the Transformers in Go
dpressel/mint
MinT: Minimal Transformer Library and Tutorials
sudharsan13296/Getting-Started-with-Google-BERT
Build and train state-of-the-art natural language processing models using BERT
MikeWangWZHL/EEG-To-Text
Code for the AAAI 2022 paper "Open Vocabulary Electroencephalography-To-Text Decoding and Zero-shot Sentiment Classification"
varunkumar-dev/TransformersDataAugmentation
Code associated with the "Data Augmentation using Pre-trained Transformer Models" paper
shijx12/KQAPro_Baselines
PyTorch implementation of baseline models for KQA Pro, a large-scale dataset for complex question answering over a knowledge base.
asahi417/lmppl
Calculates the perplexity of a text with pre-trained language models. Supports masked LMs (e.g. DeBERTa), causal LMs (e.g. GPT-3), and encoder-decoder LMs (e.g. Flan-T5).
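Tools like this compute perplexity from the per-token log-probabilities a language model assigns to the text: the exponentiated mean negative log-likelihood. A minimal sketch in pure Python (the log-probabilities below are made-up numbers standing in for real model outputs, not produced by lmppl):

```python
import math

def perplexity(token_logprobs):
    """Perplexity = exp of the mean negative log-likelihood per token."""
    nll = -sum(token_logprobs) / len(token_logprobs)
    return math.exp(nll)

# Hypothetical natural-log token probabilities for a short sentence.
logprobs = [-2.1, -0.4, -1.3, -0.9]
print(perplexity(logprobs))  # lower = the model finds the text more likely
```

A uniform model over a vocabulary of size V yields perplexity V, which is why the metric is often read as an "effective branching factor."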
VinAIResearch/BARTpho
BARTpho: Pre-trained Sequence-to-Sequence Models for Vietnamese (INTERSPEECH 2022)
aj-naik/Text-Summarization
Abstractive and extractive text summarization using Transformers.
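The distinction the entry above draws: abstractive summarizers (like BART) generate new sentences, while extractive ones select existing sentences. As a sketch of the classic frequency-based extractive approach (not this repository's implementation), score each sentence by how frequent its words are in the whole document and keep the top scorers:

```python
import re
from collections import Counter

def extractive_summary(text, n_sentences=1):
    """Return the n highest-scoring sentences, in original order.

    A sentence's score is the average document-wide frequency of its words.
    """
    sentences = re.split(r'(?<=[.!?])\s+', text.strip())
    freq = Counter(re.findall(r'\w+', text.lower()))

    def score(sentence):
        tokens = re.findall(r'\w+', sentence.lower())
        return sum(freq[t] for t in tokens) / max(len(tokens), 1)

    top = set(sorted(sentences, key=score, reverse=True)[:n_sentences])
    return ' '.join(s for s in sentences if s in top)

text = ("BART is a sequence-to-sequence model. "
        "BART is pretrained by corrupting text and learning to reconstruct it. "
        "It performs well on summarization.")
print(extractive_summary(text))
```

Abstractive models replace the scoring step with a learned encoder-decoder that rewrites the content, which is why they need pre-training while this heuristic does not.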
tanyuqian/progressive-generation
NAACL 2021 - Progressive Generation of Long Text
vipulraheja/iterater
Official implementation of the paper "IteraTeR: Understanding Iterative Revision from Human-Written Text" (ACL 2022)
Junpliu/ConDigSum
Code for EMNLP 2021 paper "Topic-Aware Contrastive Learning for Abstractive Dialogue Summarization"
thu-coai/JointGT
Code for our paper "JointGT: Graph-Text Joint Representation Learning for Text Generation from Knowledge Graphs" (ACL 2021 Findings)
IndoNLP/indonlg
The first large-scale natural language generation benchmark for Indonesian, Sundanese, and Javanese, with multiple downstream tasks, pre-trained IndoGPT and IndoBART models, and starter code (EMNLP 2021)
HHousen/DocSum
A tool to automatically summarize documents abstractively using the BART or PreSumm machine learning models.
pkchat-focus/FoCus
Source code and dataset for "Call for Customized Conversation: Customized Conversation Grounding Persona and Knowledge"
ayaka14732/TrAVis
TrAVis: Visualise BERT attention in your browser
amazon-science/transformers-data-augmentation
Code associated with the "Data Augmentation using Pre-trained Transformer Models" paper
p208p2002/Transformer-QG-on-SQuAD
Implementation of a question generator using SOTA pre-trained language models (RoBERTa, BERT, GPT, BART, T5, etc.)
j-convey/BankTextCategorizer
Automated categorization of bank transaction descriptions using neural networks, reducing manual effort while preserving privacy.
ayaka14732/TransCan
An English-to-Cantonese machine translation model
vdorie/stan4bart
Uses the Stan sampler and math library to semiparametrically fit linear and multilevel models with additive Bayesian Additive Regression Tree (BART) components.
nsi319/Finetune-Transformers
Abstractive text summarization by fine-tuning seq2seq models.
cosmoquester/transformers-bart-pretrain
Script to pre-train Hugging Face Transformers BART with TensorFlow 2
laihuiyuan/pre-trained-formality-transfer
Thank you BART! Rewarding Pre-Trained Models Improves Formality Style Transfer (ACL 2021)
ayaka14732/bart-base-jax
JAX implementation of the bart-base model
Seoneun/KoBART-Question-Generation
Korean-domain question generation module for KorQuAD, based on KoBART
ImKeTT/PCAE
PyTorch implementation of "PCAE: A Framework of Plug-in Conditional Auto-Encoder for Controllable Text Generation" (Knowledge-Based Systems)
priism-center/thinkCausal_dev
Point-and-click bartCause analysis and causal inference education