lucidrains/linear-attention-transformer
Transformer based on a variant of attention that has linear complexity with respect to sequence length
Python · MIT
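To illustrate the idea behind the repo, here is a minimal sketch of one common linear-attention formulation (kernelized attention with a positive feature map, computing `phi(K)^T V` before multiplying by `phi(Q)`). This is an assumed, generic formulation for illustration, not necessarily the exact variant this library implements; the `feature_map` and `linear_attention` names are hypothetical.

```python
import numpy as np

def feature_map(x):
    # ELU(x) + 1: a positive feature map often used in linear attention
    # (assumption for illustration; the library's kernel may differ)
    return np.where(x > 0, x + 1.0, np.exp(x))

def linear_attention(q, k, v):
    """O(n * d^2) attention: associate (phi(K)^T V) first, avoiding the
    n x n matrix that standard softmax attention materializes."""
    q, k = feature_map(q), feature_map(k)   # (n, d) each
    kv = k.T @ v                            # (d, d) key/value summary
    z = q @ k.sum(axis=0)                   # (n,) normalizer
    return (q @ kv) / z[:, None]            # (n, d)

rng = np.random.default_rng(0)
n, d = 16, 8
q, k, v = rng.normal(size=(3, n, d))
out = linear_attention(q, k, v)
print(out.shape)  # (16, 8)
```

Because the matrix products are reassociated, memory and time grow linearly in the sequence length `n` rather than quadratically, which is the property the repo description refers to.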
Stargazers
- AppServiceProvider (Dhaka, Bangladesh)
- aydao
- ceshine (@veritable-tech)
- cfoster0
- chenkejin (University of Chinese Academy of Sciences)
- ChristophAlt (Bayer)
- cih-y2k
- dimidd
- donglixp (Microsoft Research)
- Eurus-Holmes (UCLA)
- fly51fly (PRIS)
- gaceladri
- gaguilar (Amazon.com)
- haichao592 (Beijing)
- haozheji (Beijing, China)
- jeffhsu3 (Ivy Natal)
- jon-tow (New York, New York)
- JonathanFly (iforcedabot.com)
- kamish
- kewiuss
- kugwzk
- likicode (UCL)
- lliai
- LoicDagnas (France)
- madisonmay (@IndicoDataSolutions)
- NaxAlpha (@autifyhq)
- PhilippMarquardt
- psbots
- Sajid3
- sammyj-w (GirlsAndBoysInTech)
- Tarpelite (Beihang University)
- thinline72 (@lucidworks)
- tyoc213
- vishaal27 (University of Tübingen | University of Cambridge)
- xiaoiker (Stanford University)
- zhhongzhi