NLP-related papers, especially papers and code on automatic text summarization
- LCSTS: "LCSTS: A Large Scale Chinese Short Text Summarization Dataset". EMNLP(2015) [PDF]
- LSTM: "Long Short-Term Memory". Neural Comput(1997) [PDF]
- TextRank: "TextRank: Bringing Order into Text". EMNLP(2004) [PDF] (a minimal sketch of the algorithm appears after this list)
- Sequence to Sequence: "Sequence to Sequence Learning with Neural Networks". NIPS(2014) [PDF]
- Transformer: "Attention is All you Need". NeurIPS(2017) [PDF][MyNote]
- New Log-linear: "Efficient Estimation of Word Representations in Vector Space". ICLR(2013) [PDF]
- Transformer-XL: "Transformer-XL: Attentive Language Models Beyond a Fixed-Length Context". ACL(2019) [PDF] [code] [code-official] [code-tf] [code-py] [myNote]
- Pointer Networks: "Pointer Networks". NIPS(2015) [PDF]
- GAN: "Generative Adversarial Nets". NIPS(2014) [PDF] [code]
- BERT: "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding". NAACL(2019) [PDF] [code] [myNote] ⭐⭐⭐⭐⭐
- BoostingBERT: "BoostingBERT: Integrating Multi-Class Boosting into BERT for NLP". CoRR(2019) [PDF][myNote]
- ALBERT: "ALBERT: A Lite BERT for Self-supervised Learning of Language Representations". ICLR(2020) [PDF]
- RoBERTa: "RoBERTa: A Robustly Optimized BERT Pretraining Approach". arXiv(2019) [PDF] [code]
- Big Bird: "Big Bird: Transformers for Longer Sequences". NeurIPS(2020) [PDF]
- PEGASUS: "PEGASUS: Pre-training with Extracted Gap-sentences for Abstractive Summarization". ICML(2020) [PDF] [code]
- HIBERT: "HIBERT: Document Level Pre-training of Hierarchical Bidirectional Transformers for Document Summarization". ACL(2019) [PDF][myNote]
- SummaRuNNer: "SummaRuNNer: A Recurrent Neural Network Based Sequence Model for Extractive Summarization of Documents". AAAI(2017) [PDF] [code][myNote]
- SciBERTSUM: "SciBERTSUM: Extractive Summarization for Scientific Documents". CoRR (2022)[PDF][code][myNote]
- BERT for Ad Hoc: "Simple Applications of BERT for Ad Hoc Document Retrieval". CoRR(2019)[PDF][myNote]
- REFRESH: "Ranking Sentences for Extractive Summarization with Reinforcement Learning". NAACL-HLT (2018)[PDF]
- fastNLP: "Searching for Effective Neural Extractive Summarization: What Works and What’s Next". ACL(2019) [PDF]
- Three Sentences: "Three Sentences Are All You Need: Local Path Enhanced Document Relation Extraction". ACL(2021)[PDF][code]
- Sentences and Words: "Neural Summarization by Extracting Sentences and Words". ACL(2016) [PDF]
- SWAP-NET: "Extractive Summarization with SWAP-NET: Sentences and Words from Alternating Pointer Networks". ACL(2018) [PDF][code]
- Leveraging BERT: "Leveraging BERT for Extractive Text Summarization on Lectures". CoRR(2019) [PDF][code][myNote]
- BERTSum: "Fine-tune BERT for Extractive Summarization". arXiv(2019) [PDF] [code]
- NeuSum: "Neural Document Summarization by Jointly Learning to Score and Select Sentences". ACL(2018) [PDF]
- DiscoBERT: "Discourse-Aware Neural Extractive Text Summarization". ACL(2020) [PDF][code]
- sentence-transformers: "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks". EMNLP(2019) [PDF][code] (a minimal usage sketch of the library appears after this list)
- EANN: "EANN: Event Adversarial Neural Networks for Multi-Modal Fake News Detection". KDD(2018) [PDF][code][myNote]
- FEVER: "FEVER: a Large-scale Dataset for Fact Extraction and VERification". NAACL-HLT(2018) [PDF][code]
- Claim extraction: "Claim extraction from text using transfer learning". ICON(2020) [PDF] [code](<https://github.com/ashish6630/Claim extraction.git>) [myNote]
- Biomedical Publications: "Claim Extraction in Biomedical Publications using Deep Discourse Model and Transfer Learning". CoRR(2019) [PDF][code]
- CREDO: "Neural Network Architecture for Credibility Assessment of Textual Claims". CoRR(2018) [PDF] [myNote]
- SVM-based: "Extracting Important Sentences with Support Vector Machines". COLING(2002) [PDF]
- Open Extraction: "Open Extraction of Fine-Grained Political Statements". EMNLP(2015) [PDF][code]
- LG + SR: "Credibility Assessment of Textual Claims on the Web". CIKM(2016) [PDF] [myNote]
- HoVer: "HoVer: A Dataset for Many-Hop Fact Extraction and Claim Verification". EMNLP(Findings2020) [code]
- Fake-news-reasoning: "Automatic Fake News Detection: Are Models Learning to Reason?". ACL/IJCNLP(2021) [PDF] [code]
- ClaHi-GAT: "Rumor Detection on Twitter with Claim-Guided Hierarchical Graph Attention Networks". EMNLP(2021) [PDF][myNote]
- Group Learning: "Students Who Study Together Learn Better: On the Importance of Collective Knowledge Distillation for Domain Transfer in Fact Verification". EMNLP(2021) [PDF] [myNote]
- Meet The Truth: "Meet The Truth: Leverage Objective Facts and Subjective Views for Interpretable Rumor Detection". ACL/IJCNLP(Findings2021) [PDF]
- SC-CMC-KS: "Sentiment classification model for Chinese micro-blog comments based on key sentences extraction". Soft Comput(2021) [PDF]
- nlp.stanford: "Twitter Sentiment Analysis". Entropy(2009) [PDF]
- 3-Way: "Twitter Sentiment Analysis, 3-Way Classification: Positive, Negative or Neutral?". IEEE BigData(2018) [PDF] [myNote]
- Twitter Data: "Sentiment Analysis of Twitter Data". LSM(2011) [PDF] [myNote]
- STANKER: "STANKER: Stacking Network based on Level-grained Attention-masked BERT for Rumor Detection on Social Media". EMNLP(2021) [PDF] [code] [myNote]
- SentiGAN: "SentiGAN: Generating Sentimental Texts via Mixture Adversarial Networks". IJCAI(2018) [PDF]
- CNN: "Convolutional Neural Networks for Sentence Classification". EMNLP(2014) [PDF][code]
- TBJE: "A Transformer-based joint-encoding for Emotion Recognition and Sentiment Analysis". CoRR(2020) [PDF] [code]
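
Since the list above points to TextRank for extractive summarization, here is a minimal sketch of the idea under stated assumptions: sentences are graph nodes, edges are weighted by normalized word overlap, and PageRank scores pick the summary sentences. The helper names, the naive sentence splitter, and the networkx dependency are illustrative choices, not the paper's reference implementation.

```python
# Minimal TextRank-style extractive summarizer (illustrative sketch only).
import math
import re
from itertools import combinations

import networkx as nx  # pip install networkx


def split_sentences(text):
    """Naive sentence splitter; swap in nltk/spacy for real documents."""
    return [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]


def overlap_similarity(s1, s2):
    """Shared-word count normalized by log sentence lengths."""
    w1, w2 = set(s1.lower().split()), set(s2.lower().split())
    if len(w1) < 2 or len(w2) < 2:
        return 0.0
    return len(w1 & w2) / (math.log(len(w1)) + math.log(len(w2)))


def textrank_summary(text, n_sentences=2):
    sentences = split_sentences(text)
    graph = nx.Graph()
    graph.add_nodes_from(range(len(sentences)))
    # Connect every sentence pair that shares at least one word.
    for i, j in combinations(range(len(sentences)), 2):
        weight = overlap_similarity(sentences[i], sentences[j])
        if weight > 0:
            graph.add_edge(i, j, weight=weight)
    # PageRank over the similarity graph scores sentence centrality.
    scores = nx.pagerank(graph, weight="weight")
    top = sorted(scores, key=scores.get, reverse=True)[:n_sentences]
    return [sentences[i] for i in sorted(top)]  # keep original document order


if __name__ == "__main__":
    doc = (
        "TextRank builds a graph over the sentences of a document. "
        "Edges connect sentences that share words. "
        "PageRank then scores each sentence by its connections. "
        "The highest scoring sentences form the extractive summary."
    )
    print(textrank_summary(doc, n_sentences=2))
```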
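
Similarly, for the Sentence-BERT entry, a minimal usage sketch of the released sentence-transformers library: encode sentences into dense vectors and compare them with cosine similarity. The checkpoint name "all-MiniLM-L6-v2" and the `util.cos_sim` helper assume a reasonably recent library version; they are illustrative choices, not prescribed by the paper.

```python
# Minimal Sentence-BERT usage sketch with the sentence-transformers library.
from sentence_transformers import SentenceTransformer, util

# Any SBERT checkpoint works; this small one is a common default (assumption).
model = SentenceTransformer("all-MiniLM-L6-v2")

sentences = [
    "The rumor was debunked by an official statement.",
    "An official statement refuted the rumor.",
    "Extractive summarizers select sentences from the source document.",
]

# Encode sentences into dense embeddings and compare pairs by cosine similarity.
embeddings = model.encode(sentences, convert_to_tensor=True)
print("paraphrase pair:", util.cos_sim(embeddings[0], embeddings[1]).item())
print("unrelated pair: ", util.cos_sim(embeddings[0], embeddings[2]).item())
```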