acl2021
There are 33 repositories under the acl2021 topic.
styfeng/DataAug4NLP
Collection of papers and resources for data augmentation for NLP.
monk1337/resp
Fetch Academic Research Papers from different sources
Alibaba-NLP/ACE
[ACL-IJCNLP 2021] Automated Concatenation of Embeddings for Structured Prediction
thuiar/TEXTOIR
TEXTOIR is the first open-source toolkit for text open intent recognition. (ACL 2021)
cambridgeltl/sapbert
[NAACL'21 & ACL'21] SapBERT: Self-alignment pretraining for BERT & XL-BEL: Cross-Lingual Biomedical Entity Linking.
songhaoyu/BoB
The released codes for ACL 2021 paper 'BoB: BERT Over BERT for Training Persona-based Dialogue Models from Limited Personalized Data'
jinzhuoran/CogIE
CogIE: An Information Extraction Toolkit for Bridging Text and CogNet. ACL 2021
James-Yip/QuantiDCE
Towards Quantifiable Dialogue Coherence Evaluation (ACL 2021)
tingofurro/keep_it_simple
Codebase, data and models for the Keep it Simple paper at ACL2021
mrpeerat/OSKut
Handling Cross- and Out-of-Domain Samples in Thai Word Segmentation (ACL 2021 Findings).
nc-ai/MultimodalSum
[ACL-IJCNLP 2021] Self-Supervised Multimodal Opinion Summarization
thinkwee/UniKeyphrase
[ACL 2021] A Unified Extraction and Generation Framework for Keyphrase Prediction
gabeorlanski/stackoverflow-encourages-cheating
Code for the NLP4Prog workshop paper "Reading StackOverflow Encourages Cheating: Adding Question Text Improves Extractive Code Generation"
GaryYufei/ACL2021MF
Source Code For ACL 2021 Paper "Mention Flags (MF): Constraining Transformer-based Text Generators"
ICTMCG/MTM
Official repository to release the code and datasets in the paper, "Article Reranking by Memory-enhanced Key Sentence Matching for Detecting Previously Fact-checked Claims", ACL-IJCNLP 2021.
anthonywchen/AmbER-Sets
The official repository for "Evaluating Entity Disambiguation and the Role of Popularity in Retrieval-Based NLP", published in ACL-IJCNLP 2021.
cliang1453/super-structured-lottery-tickets
Super Tickets in Pre-Trained Language Models: From Model Compression to Improving Generalization (ACL 2021)
L-Zhe/BTmPG
Code for the paper "Pushing Paraphrase Away from Original Sentence: A Multi-Round Paraphrase Generation Approach" by Zhe Lin and Xiaojun Wan, accepted to Findings of ACL'21.
SUDA-LA/wist
[ACL'21] Data for "An In-depth Study on Internal Structure of Chinese Words".
jiacheng-xu/sum-interpret
Code for Dissecting Generation Modes for Abstractive Summarization Models via Ablation and Attribution (ACL2021)
princeton-nlp/dyck-transformer
[ACL 2021] Self-Attention Networks Can Process Bounded Hierarchical Languages
sh0416/oommix
Official implementation for ACL2021 Oral Paper: "OoMMix: Out-of-manifold Regularization in Contextual Embedding Space for Text Classification"
clefourrier/CopperMT
[ACL 2021, Findings] Cognate Prediction Per Machine Translation
speedcell4/nersted
Official implementation of "Nested Named Entity Recognition via Explicitly Excluding the Influence of the Best Path" (ACL'21)
izhx/CLasDA
[ACL 21] Crowdsourcing Learning as Domain Adaptation: A Case Study on Named Entity Recognition
yulang/fine-tuning-and-composition-in-transformers
Datasets and code for "On the Interplay Between Fine-tuning and Composition in Transformers" by Lang Yu and Allyson Ettinger.
ishan00/translation-for-code-switching-acl
Official repository for the paper titled "From Machine Translation to Code-Switching: Generating High-Quality Code-Switched Text" accepted at ACL 2021
nadavborenstein/Iggy
Implementation of the paper "How Did This Get Funded?! Automatically Identifying Quirky Scientific Achievements"
fajri91/Multi_SummEval
Evaluating the Efficacy of Summarization Evaluation across Languages. In Findings of ACL 2021.
gzcsudo/MSPAN-VideoQA
Multi-Scale Progressive Attention Network for Video Question Answering
tatianabarbone/ticket-prices
Python scripts and Jupyter Notebook analysis of ACL Festival 2021 StubHub ticket data.
H-Freax/automaic_paper_downloading
Automatically batch-download ACL 2021 PDFs to a local directory.
mokekuma-git/JLeague_Matches-Bar_Graph
Make a bar graph of points each team got and will get.