
RAM

My implementation of "Recurrent Models of Visual Attention" (Mnih et al., 2014)

Interpretation of Gradient Flow Network in RAM

In the graph below, an arrowed line denotes the forward pass through the network. A colored oval marks the source of a loss, and a colored line shows the flow of that loss's gradient. A line marked with a block sign means the gradient does not flow through that connection.

(Figure: Gradient Flow Network)
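The blocked lines in the graph correspond to a stop-gradient: the sampled glimpse location is fed forward as data, but gradients from the classification loss must not flow back through the sampling step (the location network is trained by REINFORCE instead). The sketch below illustrates this mechanism with a tiny scalar autograd; all names are illustrative and not taken from the repository.

```python
# Minimal scalar autograd illustrating gradient blocking ("stop-gradient").
# Illustrative sketch only; names do not come from the repository code.

class Value:
    """A scalar that tracks its gradient through simple operations."""
    def __init__(self, data, parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = parents
        self._grad_fn = None  # propagates self.grad to parents

    def __mul__(self, other):
        out = Value(self.data * other.data, (self, other))
        def grad_fn():
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._grad_fn = grad_fn
        return out

    def __add__(self, other):
        out = Value(self.data + other.data, (self, other))
        def grad_fn():
            self.grad += out.grad
            other.grad += out.grad
        out._grad_fn = grad_fn
        return out

    def detach(self):
        """Forward the value but block gradient flow (the 'block sign')."""
        return Value(self.data)  # no parents, so backprop stops here

    def backward(self):
        # Topological order, then reverse-mode accumulation.
        order, seen = [], set()
        def build(v):
            if id(v) not in seen:
                seen.add(id(v))
                for p in v._parents:
                    build(p)
                order.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(order):
            if v._grad_fn:
                v._grad_fn()

# Hypothetical location emitted by the location network.
loc = Value(0.5)

# Path 1: gradient allowed (e.g. the REINFORCE surrogate uses loc directly).
loss_reinforce = loc * Value(2.0)

# Path 2: gradient blocked (the classification path treats the sampled
# location as plain data, so no gradient reaches the location network).
loss_classify = loc.detach() * Value(3.0)

total = loss_reinforce + loss_classify
total.backward()
print(loc.grad)  # only the REINFORCE path contributes: 2.0
```

In a framework such as PyTorch, the same effect is obtained by calling `.detach()` on the sampled location before passing it to the next glimpse.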

References:

[1] https://github.com/zhongwen/RAM