Akegarasu/lora-scripts

Why does enabling xformers throw an error, and how can it be fixed?

myface-wang opened this issue · 3 comments

Enable xformers for U-Net
Traceback (most recent call last):
File "C:\Users\24029\Downloads\lora-scripts\sd-scripts\train_network.py", line 996, in
trainer.train(args)
File "C:\Users\24029\Downloads\lora-scripts\sd-scripts\train_network.py", line 242, in train
vae.set_use_memory_efficient_attention_xformers(args.xformers)
File "C:\Users\24029\Downloads\lora-scripts\venv\lib\site-packages\diffusers\models\modeling_utils.py", line 262, in set_use_memory_efficient_attention_xformers
fn_recursive_set_mem_eff(module)
File "C:\Users\24029\Downloads\lora-scripts\venv\lib\site-packages\diffusers\models\modeling_utils.py", line 258, in fn_recursive_set_mem_eff
fn_recursive_set_mem_eff(child)
File "C:\Users\24029\Downloads\lora-scripts\venv\lib\site-packages\diffusers\models\modeling_utils.py", line 258, in fn_recursive_set_mem_eff
fn_recursive_set_mem_eff(child)
File "C:\Users\24029\Downloads\lora-scripts\venv\lib\site-packages\diffusers\models\modeling_utils.py", line 258, in fn_recursive_set_mem_eff
fn_recursive_set_mem_eff(child)
File "C:\Users\24029\Downloads\lora-scripts\venv\lib\site-packages\diffusers\models\modeling_utils.py", line 255, in fn_recursive_set_mem_eff
module.set_use_memory_efficient_attention_xformers(valid, attention_op)
File "C:\Users\24029\Downloads\lora-scripts\venv\lib\site-packages\diffusers\models\attention_processor.py", line 273, in set_use_memory_efficient_attention_xformers
raise e
File "C:\Users\24029\Downloads\lora-scripts\venv\lib\site-packages\diffusers\models\attention_processor.py", line 267, in set_use_memory_efficient_attention_xformers
_ = xformers.ops.memory_efficient_attention(
File "C:\Users\24029\Downloads\lora-scripts\venv\lib\site-packages\xformers\ops\fmha\__init__.py", line 223, in memory_efficient_attention
return _memory_efficient_attention(
File "C:\Users\24029\Downloads\lora-scripts\venv\lib\site-packages\xformers\ops\fmha\__init__.py", line 321, in _memory_efficient_attention
return _memory_efficient_attention_forward(
File "C:\Users\24029\Downloads\lora-scripts\venv\lib\site-packages\xformers\ops\fmha\__init__.py", line 337, in _memory_efficient_attention_forward
op = _dispatch_fw(inp, False)
File "C:\Users\24029\Downloads\lora-scripts\venv\lib\site-packages\xformers\ops\fmha\dispatch.py", line 120, in _dispatch_fw
return _run_priority_list(
File "C:\Users\24029\Downloads\lora-scripts\venv\lib\site-packages\xformers\ops\fmha\dispatch.py", line 63, in _run_priority_list
raise NotImplementedError(msg)
NotImplementedError: No operator found for memory_efficient_attention_forward with inputs:
query : shape=(1, 2, 1, 40) (torch.float32)
key : shape=(1, 2, 1, 40) (torch.float32)
value : shape=(1, 2, 1, 40) (torch.float32)
attn_bias : <class 'NoneType'>
p : 0.0
decoderF is not supported because:
xFormers wasn't build with CUDA support
attn_bias type is <class 'NoneType'>
operator wasn't built - see python -m xformers.info for more info
flshattF@0.0.0 is not supported because:
xFormers wasn't build with CUDA support
dtype=torch.float32 (supported: {torch.float16, torch.bfloat16})
operator wasn't built - see python -m xformers.info for more info
tritonflashattF is not supported because:
xFormers wasn't build with CUDA support
dtype=torch.float32 (supported: {torch.float16, torch.bfloat16})
operator wasn't built - see python -m xformers.info for more info
triton is not available
cutlassF is not supported because:
xFormers wasn't build with CUDA support
operator wasn't built - see python -m xformers.info for more info
smallkF is not supported because:
max(query.shape[-1] != value.shape[-1]) > 32
xFormers wasn't build with CUDA support
operator wasn't built - see python -m xformers.info for more info
unsupported embed per head: 40
23:55:42-848789 ERROR Training failed / 训练失败

I'm hitting the same problem: No module named 'xformers', and I can only disable this option. Back in March, training still worked with xformers enabled; the problem seems to have appeared after an update.

  1. Reinstall xformers to match the CUDA version installed on your machine:
    https://github.com/facebookresearch/xformers#installing-xformers

  2. Install & update torchvision:

pip install -U torchvision
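For step 1, it helps to first check which CUDA build of PyTorch the venv actually has (run `python -c "import torch; print(torch.__version__)"` inside it). A small sketch of the mapping — the version strings below are examples, not taken from this thread: torch encodes its CUDA build in the `+cuXYZ` local version tag, and that tag names the matching wheel index at download.pytorch.org.

```python
# Sketch: derive the PyTorch wheel index URL from a torch version string such
# as "2.1.2+cu121". The "+cuXYZ" local tag names the CUDA build; "+cpu" (or no
# tag at all) means a CPU-only torch, which no CUDA xformers wheel will match.
def cuda_index_url(torch_version: str) -> str:
    tag = torch_version.split("+", 1)[1] if "+" in torch_version else "cpu"
    return f"https://download.pytorch.org/whl/{tag}"

print(cuda_index_url("2.1.2+cu121"))  # https://download.pytorch.org/whl/cu121
print(cuda_index_url("2.0.1+cpu"))    # https://download.pytorch.org/whl/cpu
```

After reinstalling, `python -m xformers.info` (the command the error output itself points to) shows whether the CUDA operators were actually built.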

(venv) (base) G:\lora-scripts>pip3 install -U xformers --index-url https://download.pytorch.org/whl/cu121
Looking in indexes: https://download.pytorch.org/whl/cu121
ERROR: Could not find a version that satisfies the requirement xformers (from versions: none)
ERROR: No matching distribution found for xformers

(venv) (base) G:\lora-scripts>pip3 install -U xformers=0.0.25 --index-url https://download.pytorch.org/whl/cu121
ERROR: Invalid requirement: 'xformers=0.0.25'
Hint: = is not a valid operator. Did you mean == ?

(venv) (base) G:\lora-scripts>pip3 install -U xformers==0.0.25 --index-url https://download.pytorch.org/whl/cu121
Looking in indexes: https://download.pytorch.org/whl/cu121
ERROR: Could not find a version that satisfies the requirement xformers==0.0.25 (from versions: none)
ERROR: No matching distribution found for xformers==0.0.25

(venv) (base) G:\lora-scripts>pip3 install -U xformers==0.0.25post1 --index-url https://download.pytorch.org/whl/cu121
Looking in indexes: https://download.pytorch.org/whl/cu121
ERROR: Could not find a version that satisfies the requirement xformers==0.0.25post1 (from versions: none)
ERROR: No matching distribution found for xformers==0.0.25post1

(venv) (base) G:\lora-scripts>pip3 install -U xformers==0.0.27 --index-url https://download.pytorch.org/whl/cu121
Looking in indexes: https://download.pytorch.org/whl/cu121
ERROR: Could not find a version that satisfies the requirement xformers==0.0.27 (from versions: none)
ERROR: No matching distribution found for xformers==0.0.27

(venv) (base) G:\lora-scripts>pip3 install -U xformers --index-url https://download.pytorch.org/whl/cu121
Looking in indexes: https://download.pytorch.org/whl/cu121
ERROR: Could not find a version that satisfies the requirement xformers (from versions: none)
ERROR: No matching distribution found for xformers

Is there any way to fix this?
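For anyone hitting the same wall: pip's "from versions: none" usually means the index hosts no wheel whose tags match the current interpreter (Python version and platform), not that the package doesn't exist at all. A quick stdlib check of what this environment reports, to compare against the `cpXY-...-win_amd64` parts of the wheel filenames on the index you are installing from:

```python
# Print the interpreter/platform details that decide which wheels pip accepts.
import sys
import sysconfig

print("python:", "%d.%d" % sys.version_info[:2])  # maps to the cpXY wheel tag
print("platform:", sysconfig.get_platform())      # e.g. win-amd64
```

If the index lists no wheel for this Python/platform pair, pip reports "from versions: none" no matter which version specifier you try.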