monotonic-attention
There are 3 repositories under the monotonic-attention topic.
j-min/MoChA-pytorch
PyTorch Implementation of "Monotonic Chunkwise Attention" (ICLR 2018)
shyamupa/hma-translit
Transliteration via sequence-to-sequence transduction with hard monotonic attention, based on our EMNLP 2018 paper
George0828Zhang/ssnt_loss
Pure PyTorch implementation of the loss described in "Online Segment to Segment Neural Transduction" https://arxiv.org/abs/1609.08194
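All three repositories build on the same core idea: at each decoder step the model scans encoder positions left to right and commits to one, never moving backwards. The snippet below is a minimal illustrative sketch of greedy hard monotonic attention at inference time, not code from any of the listed repositories; the `hard_monotonic_attend` function, the energy values, and the fixed 0.5 threshold are all invented for illustration.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def hard_monotonic_attend(energies, start=0, threshold=0.5):
    """Greedy hard monotonic attention for one decoder step (sketch).

    Scans encoder positions left to right from `start` and stops at
    the first position whose selection probability sigmoid(energy)
    exceeds `threshold`. Returns that index, or None if nothing is
    selected before the end of the sequence.
    """
    for j in range(start, len(energies)):
        if sigmoid(energies[j]) > threshold:
            return j
    return None

# Hypothetical per-step attention energies over 5 encoder positions.
energies = [
    [2.0, -1.0, -1.0, -1.0, -1.0],  # step 0: selects position 0
    [-3.0, -3.0, 1.5, -3.0, -3.0],  # step 1: skips to position 2
    [-3.0, -3.0, 0.9, 0.4, 2.2],    # step 2: re-attends position 2
]

start = 0
attended = []
for e in energies:
    j = hard_monotonic_attend(e, start)
    attended.append(j)
    start = j  # monotonicity: the next scan resumes here, never earlier
print(attended)  # → [0, 2, 2]
```

The key property this sketch shows is the monotonic constraint: each step resumes scanning from the previously attended position, which is what enables online (streaming) decoding in the chunkwise and segment-to-segment variants above.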