Dao-AILab/flash-attention

Does flash-attention2 support L40?

askcs517 opened this issue · 2 comments


Yes. The L40 is an Ada Lovelace GPU (compute capability 8.9), which FlashAttention-2 supports.

Where can I verify this? I train an LLM on 8×L40 GPUs, but when I set flashattn=true I see no speed boost. How can that be explained?
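
Not from the thread, but a minimal sketch of one way to verify this yourself: it checks the GPU's compute capability and times `flash_attn_func` against a naive PyTorch attention so the kernel can be measured in isolation. It assumes `flash-attn` is importable and uses illustrative tensor shapes:

```python
import torch
from flash_attn import flash_attn_func

# FlashAttention-2 requires Ampere (sm80) or newer; the L40 should report 8.9.
major, minor = torch.cuda.get_device_capability()
print(f"compute capability: {major}.{minor}")
assert (major, minor) >= (8, 0), "FlashAttention-2 needs Ampere (sm80) or newer"

# Illustrative problem size, not taken from the issue.
batch, seqlen, nheads, headdim = 4, 2048, 16, 64
q, k, v = (torch.randn(batch, seqlen, nheads, headdim,
                       device="cuda", dtype=torch.float16) for _ in range(3))

def naive_attention(q, k, v):
    # Reference causal attention in (batch, nheads, seqlen, headdim) layout,
    # materializing the full seqlen x seqlen score matrix.
    b, s, h, d = q.shape
    qt, kt, vt = (x.transpose(1, 2) for x in (q, k, v))
    scores = qt @ kt.transpose(-2, -1) * d ** -0.5
    mask = torch.triu(torch.ones(s, s, dtype=torch.bool, device=q.device), 1)
    out = torch.softmax(scores.masked_fill(mask, float("-inf")), dim=-1) @ vt
    return out.transpose(1, 2)

def bench(fn, iters=20):
    # Warm up, then time with CUDA events; returns ms per call.
    for _ in range(3):
        fn()
    start = torch.cuda.Event(enable_timing=True)
    end = torch.cuda.Event(enable_timing=True)
    start.record()
    for _ in range(iters):
        fn()
    end.record()
    torch.cuda.synchronize()
    return start.elapsed_time(end) / iters

print(f"naive attention: {bench(lambda: naive_attention(q, k, v)):.2f} ms")
print(f"flash_attn_func: {bench(lambda: flash_attn_func(q, k, v, causal=True)):.2f} ms")
```

Note that a large per-kernel speedup may still not show up end to end: if attention is only a small share of the training step at your sequence length, or the run is bound by data loading or inter-GPU communication, the overall step time barely changes.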