Attention with concat mechanism
FAhtisham opened this issue · 0 comments
FAhtisham commented
Hi, first of all, thank you for such amazing code. It gets the work done if you go through the seq2seq tutorial first.
If we want to incorporate the concat attention mechanism, how do we handle the computation over batches without a for loop?
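In case it helps, one common way to batch the concat score `v^T tanh(W [h_t; h_s])` without a loop is to broadcast the decoder state across all source positions, concatenate, and apply the linear layers to the whole `(batch, seq_len, ...)` tensor at once. A minimal PyTorch sketch (the `ConcatAttention` module name and its layer names are mine, not from the tutorial):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ConcatAttention(nn.Module):
    """Concat (additive) attention: score(h_t, h_s) = v^T tanh(W [h_t; h_s])."""
    def __init__(self, hidden_size):
        super().__init__()
        self.attn = nn.Linear(2 * hidden_size, hidden_size)  # W
        self.v = nn.Linear(hidden_size, 1, bias=False)       # v

    def forward(self, decoder_hidden, encoder_outputs):
        # decoder_hidden:  (batch, hidden)
        # encoder_outputs: (batch, seq_len, hidden)
        seq_len = encoder_outputs.size(1)
        # Broadcast the decoder state over every source position -> (batch, seq_len, hidden)
        h = decoder_hidden.unsqueeze(1).expand(-1, seq_len, -1)
        # Concat along the feature dim and score all positions in one shot
        energy = torch.tanh(self.attn(torch.cat((h, encoder_outputs), dim=2)))
        scores = self.v(energy).squeeze(2)         # (batch, seq_len)
        weights = F.softmax(scores, dim=1)         # attention distribution per example
        # Weighted sum of encoder outputs -> context vector (batch, hidden)
        context = torch.bmm(weights.unsqueeze(1), encoder_outputs).squeeze(1)
        return context, weights
```

The key step is `unsqueeze(1).expand(...)`: it replicates the decoder state across the time dimension so the concatenation and both linear layers run over the full batch and sequence simultaneously, with no Python loop.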