microsoft/onnxruntime-inference-examples

ValueError: Expected InsertedCast_gfpgan.conv_body_first.weight to be an initializer


Hi, when I try to quantize my ONNX model with the code below:

import onnx
from onnxruntime.quantization import quantize_dynamic, QuantType

model_fp32 = 'S.onnx'      # path to the float32 model
model_quant = 'quan.onnx'  # output path for the quantized model

# First attempt with default settings (this matches line 6 in the traceback below)
quantized_model = quantize_dynamic(model_fp32, model_quant)

# Second attempt with explicit options
quantize_dynamic(
    model_input=model_fp32,
    model_output=model_quant,
    weight_type=QuantType.QInt8,
    optimize_model=True
)

It fails with the following traceback:

Traceback (most recent call last):
  File "C:\Software\PythonIDE\CodeLib\Project1\export_onnx\onnx_quan.py", line 6, in <module>
    quantized_model = quantize_dynamic(model_fp32, model_quant)
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Software\PythonIDE\Anaconda\Lib\site-packages\onnxruntime\quantization\quantize.py", line 508, in quantize_dynamic
    quantizer.quantize_model()
  File "C:\Software\PythonIDE\Anaconda\Lib\site-packages\onnxruntime\quantization\onnx_quantizer.py", line 268, in quantize_model
    op_quantizer.quantize()
  File "C:\Software\PythonIDE\Anaconda\Lib\site-packages\onnxruntime\quantization\operators\conv.py", line 131, in quantize
    self.add_bias(nodes, scaled_output_name)
  File "C:\Software\PythonIDE\Anaconda\Lib\site-packages\onnxruntime\quantization\operators\conv.py", line 36, in add_bias
    raise ValueError(f"Expected {node.input[1]} to be an initializer")
ValueError: Expected InsertedCast_gfpgan.conv_body_first.weight to be an initializer

Process finished with exit code 1

How can I solve this problem?
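
A possible workaround, sketched here only as an untested suggestion: the error message indicates that the Conv weight is produced by an inserted Cast node rather than being stored as a graph initializer, so pre-processing the model before quantization, for example with the quant_pre_process helper that recent onnxruntime releases ship in onnxruntime.quantization.shape_inference, may fold such Casts back into initializers. The intermediate file name S_preprocessed.onnx below is made up for illustration.

from onnxruntime.quantization import quantize_dynamic, QuantType
from onnxruntime.quantization.shape_inference import quant_pre_process

model_fp32 = 'S.onnx'
model_prep = 'S_preprocessed.onnx'  # hypothetical intermediate file
model_quant = 'quan.onnx'

# Run symbolic shape inference and ONNX Runtime graph optimization
# (including constant folding) so weights end up as graph initializers.
quant_pre_process(model_fp32, model_prep)

# Quantize the pre-processed model instead of the original one.
quantize_dynamic(
    model_input=model_prep,
    model_output=model_quant,
    weight_type=QuantType.QInt8,
)

If quant_pre_process is not available in the installed onnxruntime version, running the model through an external constant-folding or simplification tool before quantize_dynamic should serve the same purpose.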