OpenNLPLab/cosFormer
[ICLR 2022] Official implementation of cosformer-attention in cosFormer: Rethinking Softmax in Attention
Python · Apache-2.0
Stargazers
- Abyssaledge (NLPR, CASIA @BraveGroup)
- asako98
- bruinxiong (SenseTime, Xi'an, China)
- chinaliwenbo (Guangzhou)
- ClockworkShelley
- DCMMC (Tsinghua University)
- dongan-beta (Zhejiang University)
- dukebw (Hamilton, Ontario, Canada)
- dumpmemory
- dxli94
- Emiyassstar (Tencent)
- Flawless1202 (Tongji University)
- fly51fly (PRIS)
- gcuder (Syndena)
- HankYe (Duke University)
- him4318 (Delhi)
- jasongief
- KongMingxi
- leessen
- Minimong
- PWhiddy (Seattle, WA)
- SSshuishui
- sun254667307
- sustcsonglin (MIT)
- tiangarin
- tjyuyao (Tongji University)
- tky823 (Japan)
- tonylins (MIT, EECS)
- wanganzhi (Chengdu)
- wm901115nwpu
- xer250
- xfangsn (North Carolina State University)
- zanwenok
- zero0kiriyu (China)
- zyds (Beijing)
- zzf-damon (Kunming)