Which file contains the EL-Attention implementation for self-attention?
Closed this issue · 1 comments
ADaBenxiong commented
Hello, thank you very much for your outstanding work.
We want to read the EL-Attention source code for self-attention, but we haven't found the relevant implementation. In which folder should we look for it?
Thanks a lot
yuyan2do commented
Please see the implementation in https://github.com/microsoft/fastseq/blob/main/fastseq/optimizer/fairseq/el_attention_optimizer.py
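For readers skimming that file, the core trick of EL-Attention is an algebraic reordering: instead of projecting hidden states into per-head keys and values (which must be cached per head and per beam), the key projection is folded into the query and the value projection is applied after the attention-weighted sum, so only the shared hidden states need to be cached. The following is a minimal NumPy sketch of that equivalence for a single head, not the fastseq implementation itself; all variable names here are illustrative.

```python
import numpy as np

np.random.seed(0)
d_model, d_head, src_len = 8, 4, 5

# Hidden states shared by all heads, plus one head's projection matrices.
H = np.random.randn(src_len, d_model)      # encoder/decoder hidden states
W_k = np.random.randn(d_model, d_head)     # key projection for this head
W_v = np.random.randn(d_model, d_head)     # value projection for this head
q = np.random.randn(d_head)                # a single query vector (one head)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Standard attention: project H into per-head K and V first, then attend.
K = H @ W_k                                # (src_len, d_head)
V = H @ W_v                                # (src_len, d_head)
standard = softmax(q @ K.T) @ V            # (d_head,)

# EL-Attention reordering: q K^T = (q W_k^T) H^T, and P V = (P H) W_v,
# so the per-head K/V caches are never materialized.
el_q = q @ W_k.T                           # (d_model,)
el = (softmax(el_q @ H.T) @ H) @ W_v       # (d_head,)

print("outputs match:", np.allclose(standard, el))
```

Both paths produce identical attention outputs, which is why the optimization is lossless; the saving comes from caching only `H` instead of `K` and `V` for every head.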