monotonic-attention

There are 3 repositories under the monotonic-attention topic.

  • j-min/MoChA-pytorch

    PyTorch Implementation of "Monotonic Chunkwise Attention" (ICLR 2018)

    Language: Python
  • shyamupa/hma-translit

    Transliteration via sequence-to-sequence transduction with hard monotonic attention, based on our EMNLP 2018 paper

    Language: Python
  • George0828Zhang/ssnt_loss

    Pure PyTorch implementation of the loss described in "Online Segment to Segment Neural Transduction" https://arxiv.org/abs/1609.08194

    Language: Python
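The repositories above all build on the same core idea: at each decoder step, attention scans the encoder states left to right from its previous position and commits to the first one whose selection probability clears a threshold, so attention can never move backwards. A minimal sketch of that greedy inference rule, assuming sigmoid selection probabilities; `hard_monotonic_attend` is a hypothetical helper name, not an API from any of the listed repos:

```python
import math

def hard_monotonic_attend(energies, start=0, threshold=0.5):
    """Greedy hard monotonic attention for a single decoder step.

    Scans encoder positions left to right beginning at `start` (the
    position attended at the previous step) and returns the first index
    whose sigmoid(energy) reaches `threshold`. Falls back to the last
    position if nothing is selected, so the decoder always attends
    somewhere and attention indices are non-decreasing across steps.
    """
    for j in range(start, len(energies)):
        p_select = 1.0 / (1.0 + math.exp(-energies[j]))
        if p_select >= threshold:
            return j
    return len(energies) - 1
```

Calling this step by step with `start` set to the previous result yields a monotone attention path, e.g. `hard_monotonic_attend([-2.0, 0.5, 3.0], start=0)` selects index 1, and a subsequent step with `start=1` can only stay at 1 or move right.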