OpenNLPLab/cosFormer
[ICLR 2022] Official implementation of cosformer-attention in "cosFormer: Rethinking Softmax in Attention"
Python · Apache-2.0
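The paper referenced above replaces the softmax in attention with a ReLU feature map plus a cos-based re-weighting, which decomposes via cos(a−b) = cos a cos b + sin a sin b into two linear-attention passes. The sketch below is a minimal NumPy illustration of that idea, not the repository's actual code; the function name `cosformer_attention` and the single-head, non-causal setting are assumptions for illustration.

```python
import numpy as np

def cosformer_attention(Q, K, V):
    """Illustrative (non-causal, single-head) cosFormer-style attention.

    Scores are ReLU(Q) @ ReLU(K).T re-weighted by cos(pi*(i-j)/(2M)),
    computed in linear time by decomposing the cosine of a difference
    into separate cos/sin factors on queries and keys.
    """
    n, d = Q.shape
    m = K.shape[0]
    M = max(n, m)  # sequence-length scale for the cosine re-weighting
    Qp = np.maximum(Q, 0.0)  # ReLU feature map on queries
    Kp = np.maximum(K, 0.0)  # ReLU feature map on keys
    i = np.arange(1, n + 1)
    j = np.arange(1, m + 1)
    # cos(pi*(i-j)/(2M)) = cos(pi*i/2M)cos(pi*j/2M) + sin(pi*i/2M)sin(pi*j/2M)
    Qc = Qp * np.cos(np.pi * i / (2 * M))[:, None]
    Qs = Qp * np.sin(np.pi * i / (2 * M))[:, None]
    Kc = Kp * np.cos(np.pi * j / (2 * M))[:, None]
    Ks = Kp * np.sin(np.pi * j / (2 * M))[:, None]
    # Linear attention: contract keys with values first, O(n*d^2) not O(n^2*d)
    num = Qc @ (Kc.T @ V) + Qs @ (Ks.T @ V)
    den = Qc @ Kc.sum(axis=0) + Qs @ Ks.sum(axis=0)
    return num / np.clip(den, 1e-6, None)[:, None]
```

Because every factor is non-negative (ReLU outputs, and the cosine argument stays within (−π/2, π/2) when |i−j| < M), the re-weighted scores form a valid unnormalized attention distribution without softmax, which is what makes the kernel-trick reordering possible.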