Seq2Seq PyTorch
DonghyungKo opened this issue · 2 comments
DonghyungKo commented
Hi
Thanks for sharing your code.
I've read your Seq2Seq implementation and I was wondering about the RNN Encoder-Decoder model.
In the paper 'Learning Phrase Representations using RNN Encoder–Decoder
for Statistical Machine Translation',
they propose a new hidden-state activation function, but I couldn't find it in your code.
Do you have any plan to add the proposed activation?
Or is it okay to just skip that part?
Thank you so much in advance.
graykode commented
Yes, it's okay to skip that part. But contributions are always open!
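For reference, the hidden unit the question refers to is the gated activation (reset gate r, update gate z) proposed in the Cho et al. (2014) paper. Below is a minimal PyTorch sketch of that unit — an illustration of the paper's equations, not code from this repository; the class and parameter names are made up for the example:

```python
import torch
import torch.nn as nn

class GatedHiddenUnit(nn.Module):
    """Sketch of the gated hidden unit from the RNN Encoder-Decoder paper."""
    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.reset = nn.Linear(input_size + hidden_size, hidden_size)
        self.update = nn.Linear(input_size + hidden_size, hidden_size)
        self.cand_x = nn.Linear(input_size, hidden_size)
        self.cand_h = nn.Linear(hidden_size, hidden_size, bias=False)

    def forward(self, x, h_prev):
        xh = torch.cat([x, h_prev], dim=-1)
        r = torch.sigmoid(self.reset(xh))    # reset gate
        z = torch.sigmoid(self.update(xh))   # update gate
        # candidate state uses the reset-gated previous hidden state
        h_tilde = torch.tanh(self.cand_x(x) + self.cand_h(r * h_prev))
        # interpolate between previous state and candidate
        return z * h_prev + (1 - z) * h_tilde

cell = GatedHiddenUnit(input_size=4, hidden_size=8)
h = cell(torch.randn(2, 4), torch.zeros(2, 8))
print(h.shape)  # torch.Size([2, 8])
```

This unit is essentially what `nn.GRU` / `nn.GRUCell` implement (PyTorch writes the interpolation with z swapped, which is equivalent), so using the built-in GRU instead of a vanilla RNN cell would recover the paper's activation without custom code.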
DonghyungKo commented
Thank you.
I was just wondering whether it matters or not.
Straightforward code, by the way — really nice work.
Thanks.