Error on the "Generate an ONNX model and optimize" step
OverStache opened this issue · 5 comments
Describe the bug
Error during the "Download stable diffusion PyTorch pipeline..." step:
TypeError: stat: path should be string, bytes, os.PathLike or integer, not NoneType
no "base_model" attribute from model_info import of the huggingface_hub package
To Reproduce
Run python stable_diffusion.py --optimize
Expected behavior
The pipeline should download successfully and the script should continue to the next step (generating and optimizing the ONNX model).
Reference: https://community.amd.com/t5/ai/how-to-running-optimized-automatic1111-stable-diffusion-webui-on/ba-p/625585
Olive config
pip install olive-ai[directml]==0.4.0
git clone https://github.com/microsoft/olive --branch v0.4.0
Reference: https://github.com/microsoft/Olive/blob/main/examples/directml/stable_diffusion/README.md#setup
Olive logs
Download stable diffusion PyTorch pipeline...
Traceback (most recent call last):
File "D:\Games\compressed\yys\olive\examples\directml\stable_diffusion\stable_diffusion.py", line 355, in
optimize(args.model_id, unoptimized_model_dir, optimized_model_dir)
File "D:\Games\compressed\yys\olive\examples\directml\stable_diffusion\stable_diffusion.py", line 195, in optimize
pipeline = DiffusionPipeline.from_pretrained(base_model_id, torch_dtype=torch.float32)
File "D:\miniconda\envs\olive\lib\site-packages\huggingface_hub\utils_validators.py", line 119, in _inner_fn
return fn(*args, **kwargs)
File "D:\miniconda\envs\olive\lib\site-packages\diffusers\pipelines\pipeline_utils.py", line 665, in from_pretrained
if not os.path.isdir(pretrained_model_name_or_path):
File "D:\miniconda\envs\olive\lib\genericpath.py", line 42, in isdir
st = os.stat(s)
TypeError: stat: path should be string, bytes, os.PathLike or integer, not NoneType
Other information
- OS: Windows 10
- Olive version: 0.4.0
- ONNXRuntime package and version: onnxruntime-directml 1.17.3
- AMD Ryzen 5600X
- 16 GB RAM
- AMD Radeon RX 6700 XT 12 GB (Driver Version 23.10.18.06-230824a-395307C-INTERNAL-AMD-Software-PRO-Edition)
Additional context
I tried to trace the bug; here's what I found:
In olive\examples\directml\stable_diffusion\user_script.py, line 26,
the expression model_info(model_name).cardData.get("base_model", model_name) returns None.
I then checked by running only model_info(model_name).cardData; there isn't any attribute called "base_model".
- The model_name was the default "runwayml/stable-diffusion-v1-5".
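For reference, a hypothetical guard along these lines (a sketch based on the snippet above, not necessarily the change that landed on main) would keep base_model_id from coming back as None:

from huggingface_hub import model_info

def resolve_base_model_id(model_name: str) -> str:
    # cardData may be absent, or may carry base_model with a value of None,
    # so fall back to model_name in either case.
    card_data = model_info(model_name).cardData or {}
    return card_data.get("base_model") or model_name

With a fallback like that, DiffusionPipeline.from_pretrained would receive "runwayml/stable-diffusion-v1-5" instead of None.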
Any help would be appreciated
Hi, this issue is due to a change in the huggingface_hub API and has been fixed on main by #968
Could you update olive to the most recent version (or install from main) and use the corresponding example https://github.com/microsoft/Olive/tree/main/examples#important?
I've checked out the v0.2.0 tag,
still got the same error running the same command.
*The versions installed from main:
olive-ai 0.6.0
onnxruntime-directml 1.17.3 (installed manually: pip install olive-ai[directml])
I was using the examples/directml/stable_diffusion version. Then I checked commit e325ee7 and copied its line 26 changes into that examples/directml/stable_diffusion version; the pipeline download works now. I'll close the issue once the pipeline has succeeded.
Since you installed olive from main, please run the example by checking out main too.
Because this issue is coming from an external dependency, the old tag example will still have the issue unless you install an older version of huggingface hub without the breaking changes.
Yes, that would work too! You just need to make the changes that fix the error caused by the API change in huggingface_hub.