BGRU-AttentionPooling

We propose a Bidirectional Gated Recurrent Unit (BGRU) model integrated with an attention mechanism, which attends to the important information in a sequence while filtering out noise.
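A minimal sketch of the idea in PyTorch, not the released code: a bidirectional GRU encodes the sequence, and an additive attention layer scores each timestep so the pooled representation is a weighted sum dominated by the salient positions. All names and hyperparameters (embed_dim, hidden_dim, num_classes, etc.) are illustrative assumptions.

```python
import torch
import torch.nn as nn


class BGRUAttentionPooling(nn.Module):
    """Hypothetical sketch: BiGRU encoder + attention pooling + classifier."""

    def __init__(self, vocab_size, embed_dim=128, hidden_dim=64, num_classes=2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # Bidirectional GRU: each timestep yields a 2 * hidden_dim state.
        self.bgru = nn.GRU(embed_dim, hidden_dim, batch_first=True,
                           bidirectional=True)
        # Additive attention: score each timestep, then pool with the
        # softmax-normalized weights so important positions dominate.
        self.attn = nn.Linear(2 * hidden_dim, 1)
        self.classifier = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len)
        states, _ = self.bgru(self.embedding(token_ids))  # (B, T, 2H)
        scores = self.attn(torch.tanh(states))            # (B, T, 1)
        weights = torch.softmax(scores, dim=1)            # attention over time
        pooled = (weights * states).sum(dim=1)            # (B, 2H)
        return self.classifier(pooled)


if __name__ == "__main__":
    model = BGRUAttentionPooling(vocab_size=10000)
    logits = model(torch.randint(0, 10000, (4, 20)))  # 4 sequences of length 20
    print(logits.shape)  # torch.Size([4, 2])
```

The attention weights replace a fixed pooling rule (such as max or mean pooling) with a learned one, which is what lets the model downweight noisy timesteps.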

Primary Language: Python
