Flash-Attention-Softmax-N

CUDA and Triton implementations of Flash Attention using softmax_n in place of the standard softmax.
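The repository's kernels are in CUDA and Triton, but the underlying softmax_n operation is simple to state: it adds a constant n to the softmax denominator, so that attention weights need not sum to 1 (with n = 0 it reduces to the ordinary softmax). A minimal NumPy reference sketch, assuming the commonly used definition softmax_n(x)_i = exp(x_i) / (n + Σ_j exp(x_j)); the function name and signature here are illustrative, not the repository's API:

```python
import numpy as np

def softmax_n(x, n=1.0, axis=-1):
    # Reference (non-flash) softmax_n:
    #   softmax_n(x)_i = exp(x_i) / (n + sum_j exp(x_j))
    # With n = 0 this is the standard softmax; with n > 0 the outputs
    # sum to less than 1, letting attention "abstain" from all keys.
    m = np.max(x, axis=axis, keepdims=True)
    # Shift by the row max for numerical stability; the extra n term
    # must be rescaled by exp(-m) to stay consistent after the shift.
    e = np.exp(x - m)
    return e / (n * np.exp(-m) + np.sum(e, axis=axis, keepdims=True))
```

For example, `softmax_n(np.array([0.0]), n=1.0)` gives 0.5 rather than 1.0, since the denominator is 1 + exp(0) = 2.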

Primary language: Python. License: GNU General Public License v3.0 (GPL-3.0).
