PixArt-alpha/PixArt-sigma

Cannot load a Lora safetensors file

ukaprch opened this issue · 5 comments

my code:

from peft import PeftModel
adapter_id = "C:/Users/kaprc/.cache/huggingface/hub/lora/add-detail-xl.safetensors"
transformer = PeftModel.from_pretrained(transformer, adapter_id)  # <=== ERROR BELOW

Message=Can't find 'adapter_config.json' at 'C:/Users/kaprc/.cache/huggingface/hub/lora/add-detail-xl.safetensors'
Source=C:\Users\kaprc\source\repos\AI\runtimes\bin\windows\Python38\Lib\site-packages\peft\config.py
StackTrace:
File "C:\Users\kaprc\source\repos\AI\runtimes\bin\windows\Python38\Lib\site-packages\peft\config.py", line 197, in _get_peft_type
config_file = hf_hub_download(
File "C:\Users\kaprc\source\repos\AI\runtimes\bin\windows\Python38\Lib\site-packages\huggingface_hub\utils\_validators.py", line 110, in _inner_fn
validate_repo_id(arg_value)
File "C:\Users\kaprc\source\repos\AI\runtimes\bin\windows\Python38\Lib\site-packages\huggingface_hub\utils\_validators.py", line 158, in validate_repo_id
raise HFValidationError(
huggingface_hub.utils._validators.HFValidationError: Repo id must be in the form 'repo_name' or 'namespace/repo_name': 'C:/Users/kaprc/.cache/huggingface/hub/lora/add-detail-xl.safetensors'. Use repo_type argument if needed.

During handling of the above exception, another exception occurred:

File "C:\Users\kaprc\source\repos\AI\runtimes\bin\windows\Python38\Lib\site-packages\peft\config.py", line 203, in _get_peft_type
raise ValueError(f"Can't find '{CONFIG_NAME}' at '{model_id}'")
File "C:\Users\kaprc\source\repos\AI\runtimes\bin\windows\Python38\Lib\site-packages\peft\peft_model.py", line 328, in from_pretrained
PeftConfig._get_peft_type(
File "C:\Users\kaprc\source\repos\AI\modules\Inpaint-Anything\stable_diffusion_inpaint.py", line 248, in text_2_image_with_pixart
transformer = PeftModel.from_pretrained(transformer, adapter_id)
File "C:\Users\kaprc\source\repos\AI\modules\Inpaint-Anything\app\app.py", line 215, in text_2_img_pixart (Current frame)
text_2_img = text_2_image_with_pixart(
File "C:\Users\kaprc\source\repos\AI\runtimes\bin\windows\Python38\Lib\site-packages\gradio\utils.py", line 661, in wrapper
response = f(*args, **kwargs)
File "C:\Users\kaprc\source\repos\AI\runtimes\bin\windows\Python38\Lib\site-packages\anyio\_backends\_asyncio.py", line 807, in run
result = context.run(func, *args)
File "C:\Users\kaprc\source\repos\AI\runtimes\bin\windows\Python38\Lib\threading.py", line 932, in _bootstrap_inner
self.run()
File "C:\Users\kaprc\source\repos\AI\runtimes\bin\windows\Python38\Lib\threading.py", line 890, in _bootstrap
self._bootstrap_inner()
ValueError: Can't find 'adapter_config.json' at 'C:/Users/kaprc/.cache/huggingface/hub/lora/add-detail-xl.safetensors'
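The traceback makes the root cause visible: `PeftModel.from_pretrained` treats its second argument as either a Hub repo id or a local *directory* containing `adapter_config.json`. A bare `.safetensors` file is neither, so the Hub lookup raises `HFValidationError` and the fallback raises the `ValueError` above. A minimal sketch of that precondition (the helper name is hypothetical, for illustration only):

```python
from pathlib import Path

def looks_like_peft_adapter(path) -> bool:
    """Return True only for what PeftModel.from_pretrained can accept
    locally: a directory that contains adapter_config.json. A single
    .safetensors file fails this check, which matches the error above."""
    path = Path(path)
    return path.is_dir() and (path / "adapter_config.json").is_file()
```

Checking the path before calling PEFT turns the two-layer traceback into a clear, early error message.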

SD 1.5 LoRAs won't work with this model.

They are SDXL LoRAs, not SD 1.5 LoRAs.

Same problem, though: PixArt isn't SDXL. LoRAs (if and when Sigma supports them) will need to be trained on the PixArt base model.
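The mismatch is usually visible in the state-dict key names: UNet-based LoRAs (SD 1.5 / SDXL, kohya-ss naming) carry `lora_unet_` prefixes, while an adapter trained for PixArt's transformer would reference transformer blocks instead. A rough, illustrative heuristic (the function name and prefixes are assumptions, not an exhaustive check):

```python
def guess_lora_base(keys) -> str:
    """Classify a LoRA state dict by its key names.
    'lora_unet_' is the common kohya-ss prefix for UNet-based models
    (SD 1.5 / SDXL); PixArt is transformer-based, so its adapter keys
    would reference transformer modules instead."""
    keys = list(keys)
    if any(k.startswith("lora_unet_") for k in keys):
        return "unet"
    if any("transformer" in k for k in keys):
        return "transformer"
    return "unknown"
```

Inspecting the keys of `add-detail-xl.safetensors` this way would have shown it targets a UNet, so it could never apply to PixArt's transformer regardless of how it is loaded.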

Released the LoRA-related code.

That's awesome! Thanks for your hard work.