Dao-AILab/flash-attention

Headdim==96 in FA3

wplf opened this issue

wplf commented

Hi, thank you for the really great work.

I'd like to ask: will headdim == 96 be supported in FA3 soon?

Thanks again!

Yes, you can try the decode branch. We're working on merging that branch into main.
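
For anyone else trying this before the merge, here is a minimal sketch of what exercising headdim == 96 might look like. It assumes FA3 has been built and installed from the decode branch's `hopper/` directory, and that the entry point is `flash_attn_interface.flash_attn_func` as in that directory; the exact module name and return signature may differ between branches, so treat this as an illustration rather than a confirmed API.

```python
# Hedged sketch: exercising headdim == 96 with FA3 built from the decode branch.
# Assumption: the FA3 Python interface is importable as flash_attn_interface,
# matching the hopper/ directory of the repo; adjust the import if your
# install exposes it under a different name.
import torch
from flash_attn_interface import flash_attn_func  # FA3 entry point (assumption)

# q, k, v in the usual (batch, seqlen, nheads, headdim) layout,
# with the head dimension under test set to 96.
batch, seqlen, nheads, headdim = 2, 1024, 8, 96
q = torch.randn(batch, seqlen, nheads, headdim,
                device="cuda", dtype=torch.bfloat16)  # requires a Hopper GPU
k = torch.randn_like(q)
v = torch.randn_like(q)

# Some FA3 versions return (out, softmax_lse) rather than just out,
# so unpack defensively.
result = flash_attn_func(q, k, v, causal=True)
out = result[0] if isinstance(result, tuple) else result
print(out.shape)  # expected: (batch, seqlen, nheads, headdim)
```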

wplf commented

OK, thank you very much.
Please enable GitHub Sponsors; that would help me a lot.