# Non-Autoregressive Generation Progress

## 2020
- [arXiv] Non-Autoregressive Neural Dialogue Generation
- [arXiv] Improving Fluency of Non-Autoregressive Machine Translation
- [arXiv] GLAT: Glancing Transformer for Non-Autoregressive Neural Machine Translation
- [arXiv] Task-Level Curriculum Learning for Non-Autoregressive Neural Machine Translation
- [arXiv] Insertion-Based Modeling for End-to-End Automatic Speech Recognition
- [ICML] Non-Autoregressive Neural Text-to-Speech
- [ACL] Learning to Recover from Multi-Modality Errors for Non-Autoregressive Neural Machine Translation
- [ACL] Jointly Masked Sequence-to-Sequence Model for Non-Autoregressive Neural Machine Translation
- [ACL] ENGINE: Energy-Based Inference Networks for Non-Autoregressive Machine Translation
- [ACL] Improving Non-autoregressive Neural Machine Translation with Monolingual Data
- [arXiv] Mask CTC: Non-Autoregressive End-to-End ASR with CTC and Mask Predict
- [ICML] Parallel Machine Translation with Disentangled Context Transformer
- [arXiv] Non-Autoregressive Machine Translation with Latent Alignments
- [ICML] Aligned Cross Entropy for Non-Autoregressive Machine Translation
- [arXiv] Semi-Autoregressive Training Improves Mask-Predict Decoding
- [arXiv] LAVA NAT: A Non-Autoregressive Translation Model with Look-Around Decoding and Vocabulary Attention
- [arXiv] Imputer: Sequence Modelling via Imputation and Dynamic Programming
- [ICML] An EM Approach to Non-autoregressive Conditional Sequence Generation
- [ICLR] Understanding Knowledge Distillation in Non-autoregressive Machine Translation
- [AAAI] Minimizing the Bag-of-Ngrams Difference for Non-Autoregressive Neural Machine Translation
- [AAAI] Latent-Variable Non-Autoregressive Neural Machine Translation with Deterministic Inference Using a Delta Posterior
- [AAAI] Fine-Tuning by Curriculum Learning for Non-Autoregressive Neural Machine Translation
## 2019
- [arXiv] Non-Autoregressive Transformer Automatic Speech Recognition
- [NeurIPS] Levenshtein Transformer
- [NeurIPS] Fast Structured Decoding for Sequence Models
- [EMNLP] Mask-Predict: Parallel Decoding of Conditional Masked Language Models
- [EMNLP] FlowSeq: Non-Autoregressive Conditional Sequence Generation with Generative Flow
- [ACL] Retrieving Sequential Information for Non-Autoregressive Neural Machine Translation
- [ACL] Imitation Learning for Non-Autoregressive Neural Machine Translation
- [AAAI] Non-Autoregressive Machine Translation with Auxiliary Regularization
- [AAAI] Non-Autoregressive Neural Machine Translation with Enhanced Decoder Input
## 2018
- [EMNLP] Deterministic Non-Autoregressive Neural Sequence Modeling by Iterative Refinement
- [EMNLP] End-to-End Non-Autoregressive Neural Machine Translation with Connectionist Temporal Classification
- [ICLR] Non-Autoregressive Neural Machine Translation
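Many of the papers above follow the iterative-refinement paradigm (e.g. Mask-Predict, 2019): decode every position in parallel, then repeatedly re-mask the least-confident tokens and re-predict them. A minimal toy sketch of that loop, with a stand-in `dummy_model` (hypothetical, not any paper's actual model):

```python
def mask_predict(model, length, iterations=3, mask="<mask>"):
    """Toy Mask-Predict loop: start fully masked, fill all positions in
    parallel, then re-mask the least-confident tokens and re-predict."""
    tokens = [mask] * length
    for t in range(iterations):
        # model returns a (token, confidence) pair for every position at once
        preds = model(tokens)
        tokens = [tok for tok, _ in preds]
        confs = [c for _, c in preds]
        # linearly decay how many tokens get re-masked each iteration
        n_mask = int(length * (1 - (t + 1) / iterations))
        if n_mask == 0:
            break
        worst = sorted(range(length), key=lambda i: confs[i])[:n_mask]
        for i in worst:
            tokens[i] = mask
    return tokens

# Stand-in "model": deterministic dummy whose confidence rises with position.
def dummy_model(tokens):
    return [(f"w{i}", 0.5 + 0.1 * i) for i in range(len(tokens))]

print(mask_predict(dummy_model, 5))
```

Unlike autoregressive decoding, each iteration predicts all positions simultaneously, so the number of sequential steps is a small constant rather than the output length.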
## Contact
Changhan Wang (wangchanghan@gmail.com)