SHI-Labs/Neighborhood-Attention-Transformer

PE added on query and key

XiaoyuShi97 opened this issue · 2 comments

Hi. I see that the current version only supports PE as a bias added to the attention map. I wonder whether a future version could support adding PE to the query and key, which is another common form of PE. Thanks again for your work and prompt replies!

Hello and thanks for the interest.

Could you possibly refer us to a paper so we can look into it further?
Our current version follows Swin in applying relative positional biases to attention weights based on the relative position of the queries and keys to each other.
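For anyone comparing the two schemes, here is a minimal NumPy sketch of the difference for a single attention head. All names and shapes are illustrative (random stand-ins for learned tables), not the library's actual implementation: (1) the Swin/NAT-style relative positional bias is added to the attention logits after the dot product, while (2) the approach asked about adds positional embeddings to the queries and keys before the dot product.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)
n, d = 4, 8  # sequence length and head dim (illustrative)
q = rng.standard_normal((n, d))
k = rng.standard_normal((n, d))
v = rng.standard_normal((n, d))

# (1) Relative positional bias added to the attention logits
# (stand-in for a learned bias table indexed by relative position).
rel_bias = rng.standard_normal((n, n))
attn_bias = softmax(q @ k.T / np.sqrt(d) + rel_bias)
out_bias = attn_bias @ v

# (2) PE added to query and key before the dot product
# (stand-in for absolute or learned positional embeddings).
pos = rng.standard_normal((n, d))
attn_pe = softmax((q + pos) @ (k + pos).T / np.sqrt(d))
out_pe = attn_pe @ v
```

In scheme (1) the positional term is independent of the content of q and k; in scheme (2) position and content interact through the dot product, which is why the two behave differently in practice.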

Closing this due to inactivity. If you still have questions feel free to open it back up.