Can this be used in a seq2seq task?
cristianmtr opened this issue · 2 comments
cristianmtr commented
Can this be used in a seq2seq task, with an encoder LSTM and a decoder LSTM? The examples don't seem to cover this.
AliOsm commented
If you are using TensorFlow, you can use AttentionWrapper instead of this library; it is built in and supports both Bahdanau and Luong attention.
https://www.tensorflow.org/api_docs/python/tf/contrib/seq2seq/AttentionWrapper
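For context, a minimal sketch of how that might look with an encoder LSTM and an attention-wrapped decoder LSTM in TF 1.x. The tensor names, shapes, and sizes below are illustrative assumptions, not anything from this library, and note that tf.contrib was removed in TF 2.x (tfa.seq2seq in TensorFlow Addons is the rough equivalent there):

```python
import tensorflow as tf  # TF 1.x only; tf.contrib does not exist in TF 2.x

# Illustrative sizes and placeholders; all names here are hypothetical.
num_units, input_dim = 128, 50
encoder_inputs = tf.placeholder(tf.float32, [None, None, input_dim])  # [batch, time, features]
decoder_inputs = tf.placeholder(tf.float32, [None, None, input_dim])
source_lengths = tf.placeholder(tf.int32, [None])
target_lengths = tf.placeholder(tf.int32, [None])
batch_size = tf.shape(encoder_inputs)[0]

# Encoder LSTM.
encoder_cell = tf.nn.rnn_cell.LSTMCell(num_units)
encoder_outputs, encoder_state = tf.nn.dynamic_rnn(
    encoder_cell, encoder_inputs,
    sequence_length=source_lengths, dtype=tf.float32)

# Attention mechanism over the encoder outputs; LuongAttention is a drop-in swap.
attention = tf.contrib.seq2seq.BahdanauAttention(
    num_units, memory=encoder_outputs,
    memory_sequence_length=source_lengths)

# Decoder LSTM wrapped with attention, initialized from the final encoder state.
decoder_cell = tf.contrib.seq2seq.AttentionWrapper(
    tf.nn.rnn_cell.LSTMCell(num_units), attention,
    attention_layer_size=num_units)
initial_state = decoder_cell.zero_state(batch_size, tf.float32).clone(
    cell_state=encoder_state)

# Teacher-forced training decode.
helper = tf.contrib.seq2seq.TrainingHelper(decoder_inputs, target_lengths)
decoder = tf.contrib.seq2seq.BasicDecoder(decoder_cell, helper, initial_state)
outputs, _, _ = tf.contrib.seq2seq.dynamic_decode(decoder)
```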
stale commented
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.