cmhungsteve/Awesome-Transformer-Attention

Request to add a paper

tian-qing001 opened this issue · 1 comment

Hi~ Thanks for maintaining this awesome repository!

We have a new paper, Agent Attention, which proposes a novel attention paradigm and, we believe, fits the General Vision Transformer section of your repository. Would you please consider adding it to the collection?

Agent Attention: "Agent Attention: On the Integration of Softmax and Linear Attention", arXiv, 2023 (Tsinghua). [Paper][PyTorch]
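For context, here is a minimal PyTorch-style sketch of the agent-attention idea suggested by the title (two cascaded softmax attentions routed through a small set of agent tokens, giving an overall linear-attention-like cost). The function name, tensor shapes, and the way the agent tokens are obtained are illustrative assumptions, not the paper's reference implementation.

```python
import torch
import torch.nn.functional as F

def agent_attention(q, k, v, agent):
    # Illustrative sketch, not the official implementation.
    # q, k, v: (batch, n, d); agent: (batch, m, d) with m << n.
    # Step 1: agents aggregate global context from keys/values via softmax attention.
    # Step 2: queries read from the agents via a second softmax attention.
    scale = q.shape[-1] ** -0.5
    agent_ctx = F.softmax(agent @ k.transpose(-2, -1) * scale, dim=-1) @ v    # (batch, m, d)
    out = F.softmax(q @ agent.transpose(-2, -1) * scale, dim=-1) @ agent_ctx  # (batch, n, d)
    return out

# Toy usage with hypothetical shapes (e.g. ViT-style tokens).
x = torch.randn(2, 196, 64)           # token features
agents = torch.randn(2, 16, 64)       # a small set of agent tokens (e.g. pooled from x)
y = agent_attention(x, x, x, agents)  # (2, 196, 64)
```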

Thank you very much.

Added.
Thank you for sharing.