kyegomez/FlashAttention20
Get down and dirty with FlashAttention 2.0 in PyTorch: plug and play, no complex CUDA kernels required.
Python · MIT License
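The repository's own API is not shown on this page, so as a hedged illustration of the idea FlashAttention 2.0 is built on, here is a minimal NumPy sketch of tiled attention with an online softmax: keys and values are processed in blocks with a running row-max and normalizer, so the full score matrix is never materialized. All function names here are illustrative, not the repo's interface.

```python
import numpy as np

def naive_attention(q, k, v):
    # Standard softmax attention: materializes the full (n, n) score matrix.
    scores = q @ k.T / np.sqrt(q.shape[-1])
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)
    return w @ v

def tiled_attention(q, k, v, block=4):
    # FlashAttention-style online softmax: process K/V in blocks,
    # keeping a running max (m) and normalizer (l) per query row so
    # the (n, n) score matrix is never stored.
    n, d = q.shape
    out = np.zeros_like(q)
    m = np.full((n, 1), -np.inf)  # running row max
    l = np.zeros((n, 1))          # running softmax denominator
    for start in range(0, k.shape[0], block):
        kb = k[start:start + block]
        vb = v[start:start + block]
        s = q @ kb.T / np.sqrt(d)                         # partial scores
        m_new = np.maximum(m, s.max(axis=-1, keepdims=True))
        p = np.exp(s - m_new)                             # rescaled weights
        scale = np.exp(m - m_new)                         # fix old accumulators
        l = l * scale + p.sum(axis=-1, keepdims=True)
        out = out * scale + p @ vb
        m = m_new
    return out / l

rng = np.random.default_rng(0)
q, k, v = (rng.standard_normal((8, 16)) for _ in range(3))
assert np.allclose(naive_attention(q, k, v), tiled_attention(q, k, v))
```

The tiled version returns the same result as the naive one; the payoff in the real kernel is that each block stays in fast on-chip memory, which is what the CUDA (or here, pure-PyTorch) implementations exploit.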