xmu-xiaoma666/External-Attention-pytorch

Comparison and summary of the multiple attention models

chrislouis0106 opened this issue · 0 comments

Hi there,
I had a quick read through your repository and saw that it collects implementations of many attention modules. I have never quite understood the differences between these attention models. Could you give me some advice, or point me to data or references, on how to tell them apart?
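To make the question concrete, here is a minimal sketch of how I currently understand plain self-attention versus external attention, written in plain PyTorch rather than with the actual modules from this repo, so the class names, the memory size `s`, and the normalization details below are my own assumptions about the general technique:

```python
import torch
import torch.nn as nn


class SimpleSelfAttention(nn.Module):
    """Standard scaled dot-product self-attention (single head).
    Attention is computed between all pairs of tokens, so cost grows
    quadratically with the number of tokens N."""
    def __init__(self, d_model):
        super().__init__()
        self.q = nn.Linear(d_model, d_model)
        self.k = nn.Linear(d_model, d_model)
        self.v = nn.Linear(d_model, d_model)

    def forward(self, x):                                        # x: (B, N, d_model)
        q, k, v = self.q(x), self.k(x), self.v(x)
        scores = q @ k.transpose(-2, -1) / (x.size(-1) ** 0.5)   # (B, N, N)
        attn = scores.softmax(dim=-1)
        return attn @ v                                          # (B, N, d_model)


class SimpleExternalAttention(nn.Module):
    """External attention as I understand it from the paper this repo is
    named after: keys and values are two small learnable memories shared
    across all samples, so cost is linear in N."""
    def __init__(self, d_model, s=64):
        super().__init__()
        self.mk = nn.Linear(d_model, s, bias=False)              # external key memory
        self.mv = nn.Linear(s, d_model, bias=False)              # external value memory

    def forward(self, x):                                        # x: (B, N, d_model)
        attn = self.mk(x)                                        # (B, N, S)
        attn = attn.softmax(dim=1)                               # softmax over the token dim
        attn = attn / (attn.sum(dim=2, keepdim=True) + 1e-9)     # double normalization
        return self.mv(attn)                                     # (B, N, d_model)


if __name__ == "__main__":
    x = torch.randn(2, 49, 512)                                  # 2 samples, 49 tokens
    print(SimpleSelfAttention(512)(x).shape)                     # torch.Size([2, 49, 512])
    print(SimpleExternalAttention(512)(x).shape)                 # torch.Size([2, 49, 512])
```

If I have the rough idea right, the structural difference is that self-attention builds keys and values from the input itself, while external attention attends to small learnable memories shared across the dataset. But I am not sure how the other attention modules in this repository relate to these two, so a short comparison or summary table would be very helpful.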