
AttentioNN

All about attention in neural networks, presented as Colab notebooks: soft attention, attention maps, local and global attention, and multi-head attention.

Notebooks

Name | Description
Attention maps | How a CNN attends to image objects
Attention in NMT | The attention mechanism in neural machine translation (sketch below)
Attention in image captioning | Attention in image captioning using soft attention and double stochastic regularization (sketch below)
Transformer I | Positional encoding, multi-head attention, and point-wise feed-forward neural networks (sketches below)
Transformer II | Masked multi-head attention with layer normalization (masking sketch below)
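
The NMT and image-captioning notebooks both revolve around soft attention: a small scoring network turns a decoder state and a set of encoder states (or image region features) into weights that sum to one, and the context vector is the corresponding weighted sum. Below is a minimal NumPy sketch of this additive (Bahdanau-style) form; the function and weight names (additive_attention, Wq, Wk, v) are illustrative, not taken from the notebooks, which carry the full implementations.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def additive_attention(query, keys, Wq, Wk, v):
    # query: (d_q,) decoder state; keys: (seq, d_k) encoder states or image region features
    # Wq: (d_q, d_a), Wk: (d_k, d_a), v: (d_a,) are learned parameters (here just arrays)
    scores = np.tanh(query @ Wq + keys @ Wk) @ v   # (seq,) alignment scores
    alpha = softmax(scores)                        # soft attention weights, sum to 1
    context = alpha @ keys                         # weighted sum of the attended states
    return context, alpha
```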
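
The two Transformer notebooks center on scaled dot-product attention run in several parallel heads, optionally with a mask. The sketch below follows the standard formulation in plain NumPy; the notebooks themselves may use a deep-learning framework, and every name here (scaled_dot_product_attention, multi_head_attention, num_heads) is illustrative.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V, mask=None):
    # Q: (..., seq_q, d_k), K: (..., seq_k, d_k), V: (..., seq_k, d_v)
    d_k = Q.shape[-1]
    scores = Q @ K.swapaxes(-1, -2) / np.sqrt(d_k)           # (..., seq_q, seq_k)
    if mask is not None:
        scores = np.where(mask, scores, -1e9)                # masked positions get ~ -inf
    weights = softmax(scores, axis=-1)                       # attention map over the keys
    return weights @ V, weights

def multi_head_attention(x, Wq, Wk, Wv, Wo, num_heads, mask=None):
    # x: (seq, d_model); Wq, Wk, Wv, Wo: (d_model, d_model)
    seq, d_model = x.shape
    d_head = d_model // num_heads
    def split(z):                                            # (seq, d_model) -> (heads, seq, d_head)
        return z.reshape(seq, num_heads, d_head).transpose(1, 0, 2)
    Q, K, V = split(x @ Wq), split(x @ Wk), split(x @ Wv)
    heads, _ = scaled_dot_product_attention(Q, K, V, mask)   # (heads, seq, d_head)
    concat = heads.transpose(1, 0, 2).reshape(seq, d_model)  # re-join the heads
    return concat @ Wo

# For the masked (decoder-side) variant in Transformer II, pass a causal mask so each
# position can only attend to itself and earlier positions:
# mask = np.tril(np.ones((seq, seq), dtype=bool))
# out = multi_head_attention(x, Wq, Wk, Wv, Wo, num_heads=8, mask=mask)
```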
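
Transformer I also covers the sinusoidal positional encoding that is added to the token embeddings so the attention layers can distinguish positions. A small self-contained sketch, again with illustrative names and assuming an even d_model:

```python
import numpy as np

def sinusoidal_positional_encoding(seq_len, d_model):
    # PE[pos, 2i]   = sin(pos / 10000^(2i / d_model))
    # PE[pos, 2i+1] = cos(pos / 10000^(2i / d_model))
    pos = np.arange(seq_len)[:, None]                   # (seq_len, 1)
    i = np.arange(d_model // 2)[None, :]                # (1, d_model / 2)
    angles = pos / np.power(10000.0, 2 * i / d_model)   # (seq_len, d_model / 2)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)
    pe[:, 1::2] = np.cos(angles)
    return pe

# added to the token embeddings before the first attention layer:
# x = token_embeddings + sinusoidal_positional_encoding(seq_len, d_model)
```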