ModelTC/MQBench

Converting to TVM after PTQ fails

Xrigok opened this issue · 1 comment

Converting to TVM after PTQ fails:
[image: error screenshot]
While tracing the error, I printed self.graph.initializer in MQBench-main/mqbench/deploy/common.py and found that it does not contain the fake-quantize nodes applied to the inputs. As a result, the input fake-quantize nodes present in self.onnx_model.graph.node cannot be found in self.onnx_model.initializer.
How can this be resolved? In the QAT-to-TVM path the input fake-quantize nodes are included, but in the PTQ path they are not. Could this be a problem with the ONNX export?
Below is a printout of the names in graph.initializer during PTQ; note that the input fake-quantize nodes are missing:
conv1.weight
conv1.bias
conv1.weight_fake_quant.scale
conv1.weight_fake_quant.zero_point
layer1.0.conv1.weight
layer1.0.conv1.bias
layer1.0.conv1.weight_fake_quant.scale
layer1.0.conv1.weight_fake_quant.zero_point
layer1.0.conv2.weight
layer1.0.conv2.bias
layer1.0.conv2.weight_fake_quant.scale
layer1.0.conv2.weight_fake_quant.zero_point
layer1.1.conv1.weight
layer1.1.conv1.bias
layer1.1.conv1.weight_fake_quant.scale
layer1.1.conv1.weight_fake_quant.zero_point
layer1.1.conv2.weight
layer1.1.conv2.bias
layer1.1.conv2.weight_fake_quant.scale
layer1.1.conv2.weight_fake_quant.zero_point
layer2.0.conv1.weight
layer2.0.conv1.bias
layer2.0.conv1.weight_fake_quant.scale
layer2.0.conv1.weight_fake_quant.zero_point
layer2.0.conv2.weight
layer2.0.conv2.bias
layer2.0.conv2.weight_fake_quant.scale
layer2.0.conv2.weight_fake_quant.zero_point
layer2.0.downsample.0.weight
layer2.0.downsample.0.bias
layer2.0.downsample.0.weight_fake_quant.scale
layer2.0.downsample.0.weight_fake_quant.zero_point
layer2.1.conv1.weight
layer2.1.conv1.bias
layer2.1.conv1.weight_fake_quant.scale
layer2.1.conv1.weight_fake_quant.zero_point
layer2.1.conv2.weight
layer2.1.conv2.bias
layer2.1.conv2.weight_fake_quant.scale
layer2.1.conv2.weight_fake_quant.zero_point
layer3.0.conv1.weight
layer3.0.conv1.bias
layer3.0.conv1.weight_fake_quant.scale
layer3.0.conv1.weight_fake_quant.zero_point
layer3.0.conv2.weight
layer3.0.conv2.bias
layer3.0.conv2.weight_fake_quant.scale
layer3.0.conv2.weight_fake_quant.zero_point
layer3.0.downsample.0.weight
layer3.0.downsample.0.bias
layer3.0.downsample.0.weight_fake_quant.scale
layer3.0.downsample.0.weight_fake_quant.zero_point
layer3.1.conv1.weight
layer3.1.conv1.bias
layer3.1.conv1.weight_fake_quant.scale
layer3.1.conv1.weight_fake_quant.zero_point
layer3.1.conv2.weight
layer3.1.conv2.bias
layer3.1.conv2.weight_fake_quant.scale
layer3.1.conv2.weight_fake_quant.zero_point
layer4.0.conv1.weight
layer4.0.conv1.bias
layer4.0.conv1.weight_fake_quant.scale
layer4.0.conv1.weight_fake_quant.zero_point
layer4.0.conv2.weight
layer4.0.conv2.bias
layer4.0.conv2.weight_fake_quant.scale
layer4.0.conv2.weight_fake_quant.zero_point
layer4.0.downsample.0.weight
layer4.0.downsample.0.bias
layer4.0.downsample.0.weight_fake_quant.scale
layer4.0.downsample.0.weight_fake_quant.zero_point
layer4.1.conv1.weight
layer4.1.conv1.bias
layer4.1.conv1.weight_fake_quant.scale
layer4.1.conv1.weight_fake_quant.zero_point
layer4.1.conv2.weight
layer4.1.conv2.bias
layer4.1.conv2.weight_fake_quant.scale
layer4.1.conv2.weight_fake_quant.zero_point
fc.weight
fc.bias
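A quick way to reproduce this diagnosis is to cross-check the tensors each fake-quantize node consumes against the graph's initializer names. The sketch below is a minimal illustration, not MQBench's actual code: the node and initializer structures are simplified plain-Python stand-ins for the ONNX protobuf objects, and the name `x_post_act_fake_quantizer` is an assumed example name for an input (activation) quantizer.

```python
# Simplified stand-ins for an ONNX GraphProto: in the real model these
# would come from model.graph.node and model.graph.initializer.
nodes = [
    {"op_type": "FakeQuantize",
     "inputs": ["conv1.weight",
                "conv1.weight_fake_quant.scale",
                "conv1.weight_fake_quant.zero_point"]},
    {"op_type": "FakeQuantize",
     "inputs": ["input",                              # graph input tensor
                "x_post_act_fake_quantizer.scale",    # hypothetical name
                "x_post_act_fake_quantizer.zero_point"]},
]
initializer_names = {
    "conv1.weight",
    "conv1.weight_fake_quant.scale",
    "conv1.weight_fake_quant.zero_point",
}
graph_inputs = {"input"}

# Collect fake-quantize inputs that are neither graph inputs nor
# initializers -- the symptom described above: the scale/zero_point of
# the input quantizer are missing from graph.initializer after PTQ.
missing = [
    inp
    for node in nodes if "FakeQuantize" in node["op_type"]
    for inp in node["inputs"]
    if inp not in initializer_names and inp not in graph_inputs
]
print(missing)
# -> ['x_post_act_fake_quantizer.scale', 'x_post_act_fake_quantizer.zero_point']
```

Running the same kind of check over the real `self.onnx_model.graph` should list exactly which quantization parameters the deploy pass fails to look up.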

This issue has not received any updates in 120 days. Please reply to this issue if it is still unresolved!