askcs517 opened this issue 5 months ago · 2 comments
Does flash-attention2 support the L40?
Yes
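For context: FlashAttention-2 requires an NVIDIA GPU with compute capability 8.0 or higher (Ampere, Ada Lovelace, or Hopper), and the L40 is an Ada Lovelace GPU with compute capability 8.9. A minimal sketch of that check (the helper `flash_attn2_supported` is hypothetical, not part of the flash-attn API):

```python
def flash_attn2_supported(major: int, minor: int) -> bool:
    """Hypothetical helper: FlashAttention-2 needs compute capability >= 8.0
    (Ampere / Ada / Hopper). Turing (7.5) and older are not supported."""
    return (major, minor) >= (8, 0)

# The L40 (Ada Lovelace) reports compute capability 8.9, so it qualifies.
print(flash_attn2_supported(8, 9))   # L40
print(flash_attn2_supported(7, 5))   # Turing, e.g. T4
```

On a real machine you can query the capability of the current device with `torch.cuda.get_device_capability()` and feed the result into a check like the one above.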
Where can I verify that? I'm training an LLM on 8×L40 GPUs, and when I set flashattn=true there is no speed boost. How can that be explained?