A question about seq2seq with attention
wking-tao opened this issue · 0 comments
wking-tao commented
Hi, I have a question about how the attention_weights are calculated.
https://github.com/graykode/nlp-tutorial/blob/master/4-2.Seq2Seq(Attention)/Seq2Seq(Attention)-Torch.py
In line 60, attn_weights is calculated from dec_output and enc_outputs in your code. Why not from dec_hidden and enc_hidden?
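
For reference, here is a minimal sketch of the kind of computation being asked about: scoring each encoder output against the current decoder output and normalizing with softmax. This assumes a dot-product-style scoring function; the shapes and the name `scorer` are illustrative, not identifiers taken from the linked file.

```python
import torch
import torch.nn.functional as F

hidden_size, n_step = 128, 5

enc_outputs = torch.randn(n_step, 1, hidden_size)  # one encoder output per source step
dec_output = torch.randn(1, 1, hidden_size)        # decoder output at the current step

# Illustrative scoring layer (an assumption, not the file's actual layer).
scorer = torch.nn.Linear(hidden_size, hidden_size, bias=False)

# Score each encoder output against the current decoder output,
# then normalize the scores into attention weights with softmax.
scores = torch.stack(
    [torch.dot(dec_output.view(-1), scorer(enc_outputs[i]).view(-1)) for i in range(n_step)]
)
attn_weights = F.softmax(scores, dim=0)  # shape: [n_step], sums to 1
print(attn_weights)
```

The question, in these terms, is why the scoring uses the decoder's output (and all encoder outputs) rather than the decoder's and encoder's hidden states.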