justSOZ opened this issue 7 months ago · 1 comment
Following a previous answer, I installed 'local-attention', but I still get an error: `ModuleNotFoundError: No module named 'fast_transformers'`.
Solved it: you need to install torch-fast-transformer.
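For reference, the `fast_transformers` module is typically provided by the `pytorch-fast-transformers` package on PyPI (the exact package name here is an assumption; check the project's README for the name your version expects):

```shell
# Install the library that provides the `fast_transformers` module
# (package name assumed to be pytorch-fast-transformers on PyPI)
pip install pytorch-fast-transformers

# Verify the import now resolves
python -c "import fast_transformers; print('ok')"
```

If the import still fails, confirm that `pip` and `python` point to the same environment (e.g. `python -m pip install ...`).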