Pinned Repositories
BertSum
Code for the paper "Fine-tune BERT for Extractive Summarization"
geval
Code for the paper "G-Eval: NLG Evaluation using GPT-4 with Better Human Alignment"
hiersumm
Code for the ACL 2019 paper "Hierarchical Transformers for Multi-Document Summarization"
NLP-progress
Repository to track the progress in Natural Language Processing (NLP), including the datasets and the current state-of-the-art for the most common NLP tasks.
NoisySumm
Code for the NAACL 2021 paper "Noisy Self-Knowledge Distillation for Text Summarization"
PreSumm
Code for the EMNLP 2019 paper "Text Summarization with Pretrained Encoders"
pytorch-transformers
👾 A library of state-of-the-art pretrained models for Natural Language Processing (NLP)
structured
Code for the paper "Learning Structured Text Representations"
SUMO
Code for the paper "Single Document Summarization as Tree Induction"
unilm
UniLM - Unified Language Model Pre-training / Pre-training for NLP and Beyond
nlpyang's Repositories
nlpyang/BertSum
Code for the paper "Fine-tune BERT for Extractive Summarization"
nlpyang/PreSumm
Code for the EMNLP 2019 paper "Text Summarization with Pretrained Encoders"
nlpyang/hiersumm
Code for the ACL 2019 paper "Hierarchical Transformers for Multi-Document Summarization"
nlpyang/geval
Code for the paper "G-Eval: NLG Evaluation using GPT-4 with Better Human Alignment"
nlpyang/structured
Code for the paper "Learning Structured Text Representations"
nlpyang/SUMO
Code for the paper "Single Document Summarization as Tree Induction"
nlpyang/NoisySumm
Code for the NAACL 2021 paper "Noisy Self-Knowledge Distillation for Text Summarization"
nlpyang/pytorch-transformers
👾 A library of state-of-the-art pretrained models for Natural Language Processing (NLP)
nlpyang/NLP-progress
Repository to track the progress in Natural Language Processing (NLP), including the datasets and the current state-of-the-art for the most common NLP tasks.
nlpyang/unilm
UniLM - Unified Language Model Pre-training / Pre-training for NLP and Beyond
nlpyang/DeBERTa
Implementation of DeBERTa
nlpyang/nlp-yang.github.io
nlpyang/nlp_papers
nlpyang/SNLI-decomposable-attention
PyTorch implementation of the decomposable attention model for sentence-pair classification, from the paper "A Decomposable Attention Model for Natural Language Inference" (https://arxiv.org/abs/1606.01933)