Dao-AILab/flash-attention

FlashAttention PyTorch Integration

DianCh opened this issue · 1 comment

Hi authors! I'm trying to experiment and make tweaks and potential upgrades to FlashAttention, and I'm wondering whether this repo or the PyTorch source code is the best place to start. Does the PyTorch integration copy-paste/pull from this original FlashAttention repo, or are there implementation changes made along with the integration? Any chance you know?

Thanks!

The kernels are copy-pasted afaik.
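For anyone landing here: a minimal sketch (not from this thread) of the two entry points being discussed. It assumes a CUDA GPU, the `flash-attn` package installed, and a recent PyTorch (≥2.3 for `torch.nn.attention.sdpa_kernel`); shapes and tolerances are illustrative only.

```python
import torch
import torch.nn.functional as F
from flash_attn import flash_attn_func  # entry point 1: the original repo's package
from torch.nn.attention import sdpa_kernel, SDPBackend  # entry point 2: PyTorch's built-in SDPA

# FlashAttention requires fp16/bf16 tensors on CUDA; shapes here are arbitrary.
batch, seqlen, nheads, headdim = 2, 1024, 8, 64
q = torch.randn(batch, seqlen, nheads, headdim, device="cuda", dtype=torch.float16)
k = torch.randn_like(q)
v = torch.randn_like(q)

# Repo API: flash_attn_func expects (batch, seqlen, nheads, headdim) layout.
out_repo = flash_attn_func(q, k, v, causal=True)

# PyTorch API: scaled_dot_product_attention expects (batch, nheads, seqlen, headdim),
# and sdpa_kernel pins the dispatch to PyTorch's vendored FlashAttention backend.
q_t, k_t, v_t = (t.transpose(1, 2) for t in (q, k, v))
with sdpa_kernel(SDPBackend.FLASH_ATTENTION):
    out_torch = F.scaled_dot_product_attention(q_t, k_t, v_t, is_causal=True)

# Since the kernels are copy-pasted, the outputs should agree up to fp16 tolerance.
torch.testing.assert_close(out_repo, out_torch.transpose(1, 2), atol=2e-3, rtol=2e-3)
```

Practical upshot of the answer above: if you want to modify the kernels themselves, this repo is the place to work, since PyTorch vendors the same kernels behind its SDPA dispatcher rather than reimplementing them.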