graykode/nlp-tutorial

A question about seq2seq with attention

wking-tao opened this issue · 0 comments

Hi, I have a question about the way attention_weights is calculated.
https://github.com/graykode/nlp-tutorial/blob/master/4-2.Seq2Seq(Attention)/Seq2Seq(Attention)-Torch.py
In line 60, attn_weights is calculated from dec_output and enc_outputs in your code. Why not from dec_hidden and enc_hidden?
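
For readers without the original screenshot, here is a minimal sketch of the kind of computation being asked about: a dot-product score between the decoder's per-step output and every encoder output, followed by a softmax over source positions. The function name, tensor shapes, and batch size of 1 are my assumptions for illustration; the exact code is in the linked file.

```python
import torch
import torch.nn.functional as F

# Minimal sketch of dot-product attention (assumed shapes, batch size 1):
#   enc_outputs: [n_step, 1, n_hidden] -- encoder RNN output at every step
#   dec_output:  [1, 1, n_hidden]      -- decoder RNN output at one step
def get_att_weight(dec_output, enc_outputs):
    n_step = enc_outputs.size(0)
    scores = torch.zeros(n_step)
    for i in range(n_step):
        # Score each source position: dot product of the decoder step
        # output with that position's encoder output.
        scores[i] = torch.dot(dec_output.view(-1), enc_outputs[i].view(-1))
    # Softmax over source positions -> attention weights, shape [1, 1, n_step]
    return F.softmax(scores, dim=0).view(1, 1, -1)
```

One piece of context relevant to the question: for a single-layer nn.RNN or nn.GRU, the output at each time step is exactly the top-layer hidden state at that step, so scoring against the per-step outputs is closely related to scoring against the hidden states.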