Akegarasu/lora-scripts

The WD1.4 tagger v3 errors out in both the trainer and the all-in-one package; I have tried switching to various versions and none of them help

yangl5619 opened this issue · 1 comment

Traceback (most recent call last):
  File "F:\sd-webui-aki-v4.6\python\lib\site-packages\gradio\routes.py", line 488, in run_predict
    output = await app.get_blocks().process_api(
  File "F:\sd-webui-aki-v4.6\python\lib\site-packages\gradio\blocks.py", line 1434, in process_api
    data = self.postprocess_data(fn_index, result["prediction"], state)
  File "F:\sd-webui-aki-v4.6\python\lib\site-packages\gradio\blocks.py", line 1297, in postprocess_data
    self.validate_outputs(fn_index, predictions)  # type: ignore
  File "F:\sd-webui-aki-v4.6\python\lib\site-packages\gradio\blocks.py", line 1272, in validate_outputs
    raise ValueError(
ValueError: An event handler (on_interrogate) didn't receive enough output values (needed: 4, received: 3).
Wanted outputs:
[textbox, label, label, html]
Received outputs:
[None, "", "

Fail: [ONNXRuntimeError] : 1 : FAIL : Load model from F:\sd-webui-aki-v4.6\.cache\huggingface\hub\models--SmilingWolf--wd-convnext-tagger-v3\snapshots\76f1f3eba9a519cb14b8ce23f54c8140a12580d7\model.onnx failed:D:\a\_work\1\s\onnxruntime\core/graph/model_load_utils.h:56 onnxruntime::model_load_utils::ValidateOpsetForDomain ONNX Runtime only guarantees support for models stamped with official released onnx opset versions. Opset 4 is under development and support for this is limited. The operator schemas and or other functionality may change before next ONNX release and in this case ONNX Runtime will not guarantee backward compatibility. Current official support for domain ai.onnx.ml is till opset 3.

Time taken: 0.1 sec.

A: 2.15 GB, R: 2.16 GB, Sys: 3.8/23.9883 GB (15.7%)

"]

Update your onnxruntime version. The v3 tagger models are stamped with ai.onnx.ml opset 4, and the onnxruntime build currently installed in your environment only supports that domain up to opset 3, so upgrading onnxruntime resolves the load failure.
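
A sketch of the fix and a quick verification, assuming the bundled interpreter sits at F:\sd-webui-aki-v4.6\python\python.exe (inferred from the site-packages path in the traceback, so adjust to your install); if you use onnxruntime-gpu, upgrade that package instead:

```python
# Upgrade onnxruntime inside the portable environment first, for example:
#   F:\sd-webui-aki-v4.6\python\python.exe -m pip install --upgrade onnxruntime

import onnxruntime as ort

# Confirm the upgraded runtime is the one being imported.
print("onnxruntime:", ort.__version__)

# After upgrading, creating an inference session for the tagger should no
# longer raise the opset validation error. Placeholder path as above.
MODEL_PATH = r"path\to\models--SmilingWolf--wd-convnext-tagger-v3\snapshots\...\model.onnx"
session = ort.InferenceSession(MODEL_PATH, providers=["CPUExecutionProvider"])
print("inputs:", [i.name for i in session.get_inputs()])
```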