CyberZHG/keras-self-attention

How to apply Attention between two LSTM layers?

Closed · 0 comments

How can I apply an attention layer between two LSTM layers in a Keras model?
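For reference, a minimal sketch of one way to do this with this repository's `SeqSelfAttention` layer, assuming a recent keras-self-attention release that works with `tensorflow.keras` and an illustrative text-classification setup (the vocabulary size, layer widths, and output head below are assumptions, not from the issue). The key point is that the first LSTM must return its full output sequence (`return_sequences=True`) so the attention layer has one vector per timestep to attend over, and `SeqSelfAttention` outputs a sequence of the same length, which the second LSTM can consume directly.

```python
from tensorflow import keras
from keras_self_attention import SeqSelfAttention

model = keras.models.Sequential()
# Illustrative vocabulary and embedding sizes; adjust to your data.
model.add(keras.layers.Embedding(input_dim=10000, output_dim=128, mask_zero=True))
# The first LSTM must return the whole sequence (return_sequences=True)
# so the attention layer receives one hidden state per timestep.
model.add(keras.layers.LSTM(units=64, return_sequences=True))
# SeqSelfAttention keeps the time dimension, so its output is still a sequence.
model.add(SeqSelfAttention(attention_activation='sigmoid'))
# The second LSTM consumes the attended sequence; without return_sequences
# it emits only its final hidden state.
model.add(keras.layers.LSTM(units=64))
# Illustrative binary-classification head.
model.add(keras.layers.Dense(units=1, activation='sigmoid'))
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])
model.summary()
```

If the second LSTM should also feed further sequence layers, set `return_sequences=True` on it as well; the attention layer itself does not change the tensor shape, so it can be dropped between any two recurrent layers that pass sequences.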