INTER-INTRA-attentions

An experimental custom seq2seq model for abstractive summarization that combines inter-layer attention (attention across layers) with intra-layer attention (attention to the previous hidden states of the same RNN unit).
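As a rough illustration of the intra-layer mechanism described above, the sketch below shows one common way to let an RNN step attend over its own earlier hidden states via dot-product attention. This is a minimal NumPy sketch under assumed details (dot-product scoring, the `intra_layer_attention` name, and the toy dimensions are all hypothetical, not taken from the repository):

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def intra_layer_attention(h_t, past_states):
    """Attend from the current hidden state h_t (shape (d,)) to the
    same unit's previous hidden states past_states (shape (T, d)).
    Returns a context vector (d,) and the attention weights (T,)."""
    scores = past_states @ h_t        # one score per past state, shape (T,)
    weights = softmax(scores)         # normalize scores into a distribution
    context = weights @ past_states   # weighted sum of past states, shape (d,)
    return context, weights

# toy example: 4 past hidden states of dimension 3 (hypothetical values)
rng = np.random.default_rng(0)
past = rng.standard_normal((4, 3))
h_t = rng.standard_normal(3)
ctx, w = intra_layer_attention(h_t, past)
```

The returned context vector would typically be concatenated with (or added to) the current hidden state before the next step; the actual combination used in the notebook may differ.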
