lukemelas/do-you-even-need-attention
Is the attention layer even necessary? (https://arxiv.org/abs/2105.02723)
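The question the repo explores: take a ViT-style classifier and replace each self-attention layer with a plain feed-forward (linear) layer applied across the patch dimension. Below is a minimal, hypothetical PyTorch sketch of that idea, not the repository's actual implementation; the class names (`LinearBlock`, `FeedForwardOnlyViT`), dimensions, mean-pooling head, and absence of a CLS token are assumptions made for illustration.

```python
import torch
import torch.nn as nn


class LinearBlock(nn.Module):
    """Transformer-style block with self-attention replaced by a
    feed-forward layer applied across the patch (token) dimension.
    Illustrative sketch only."""

    def __init__(self, dim, num_tokens, mlp_ratio=4.0):
        super().__init__()
        self.norm1 = nn.LayerNorm(dim)
        # Feed-forward over tokens: this is what stands in for attention.
        self.token_ff = nn.Sequential(
            nn.Linear(num_tokens, int(num_tokens * mlp_ratio)),
            nn.GELU(),
            nn.Linear(int(num_tokens * mlp_ratio), num_tokens),
        )
        self.norm2 = nn.LayerNorm(dim)
        # Standard feed-forward over channels, as in a regular ViT block.
        self.channel_ff = nn.Sequential(
            nn.Linear(dim, int(dim * mlp_ratio)),
            nn.GELU(),
            nn.Linear(int(dim * mlp_ratio), dim),
        )

    def forward(self, x):                     # x: (batch, tokens, dim)
        y = self.norm1(x).transpose(1, 2)     # (batch, dim, tokens)
        x = x + self.token_ff(y).transpose(1, 2)
        x = x + self.channel_ff(self.norm2(x))
        return x


class FeedForwardOnlyViT(nn.Module):
    """ViT-style classifier with no attention layers (hypothetical example)."""

    def __init__(self, image_size=224, patch_size=16, dim=384,
                 depth=12, num_classes=1000):
        super().__init__()
        num_patches = (image_size // patch_size) ** 2
        # Patch embedding as a strided convolution.
        self.patch_embed = nn.Conv2d(3, dim, kernel_size=patch_size,
                                     stride=patch_size)
        self.pos_embed = nn.Parameter(torch.zeros(1, num_patches, dim))
        self.blocks = nn.Sequential(
            *[LinearBlock(dim, num_patches) for _ in range(depth)]
        )
        self.norm = nn.LayerNorm(dim)
        self.head = nn.Linear(dim, num_classes)

    def forward(self, images):                # images: (batch, 3, H, W)
        x = self.patch_embed(images).flatten(2).transpose(1, 2)
        x = self.blocks(x + self.pos_embed)
        x = self.norm(x).mean(dim=1)          # global average pool over patches
        return self.head(x)


if __name__ == "__main__":
    model = FeedForwardOnlyViT(image_size=224, patch_size=16, num_classes=10)
    logits = model(torch.randn(2, 3, 224, 224))
    print(logits.shape)  # torch.Size([2, 10])
```

The token feed-forward mixes information across patches (the role attention plays in a ViT), while the channel feed-forward is the usual per-token MLP; see the paper linked above for the actual architecture and results.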
Python
Stargazers
- a-maiti (@WadhwaniAI)
- cfoster0
- christian-cahig (Philippines)
- ClashLuke (@deepjudge-ai)
- Cuda-Chen (Seeking for opportunities)
- djurkis (Prague)
- dumpmemory
- hamishdickson (London)
- howtodowtle (Zurich, Switzerland)
- iacolippo (@lightonai)
- imr555 (Neovotech)
- jaideepmurkute
- jealejandro (Málaga)
- josephrocca (Singapore)
- justinpinkney
- kosmar2011
- krmiddlebrook (San Diego)
- kumuji (RWTH Aachen University)
- lukemelas (Oxford University)
- MicPie (OpenBioML.org)
- Nachimak28 (Mumbai, India)
- narain1 (student)
- nhannguyen2709 (Ho Chi Minh)
- odulcy (Mindee)
- oscmansan (Montréal)
- SeanNaren (@NVIDIA)
- Selimonder
- Smith42 (Aspia Space)
- snnclsr
- tginart (@MetaMind)
- thirdratecyberpunk
- thomasjo (NORCE)
- vishaal27 (University of Tübingen | University of Cambridge)
- voidful (Taiwan)
- yngtodd (Knoxville, TN)
- zhjohnchan (The Chinese University of Hong Kong, Shenzhen)