Model dimension error at inference time
File "torch_to_onnx.py", line 190, in
conver_bert_torch_to_onnx()
File "torch_to_onnx.py", line 104, in conver_bert_torch_to_onnx
sess = onnxruntime.InferenceSession(MODEL_ONNX_PATH)
File "/opt/conda/lib/python3.7/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 335, in init
self._create_inference_session(providers, provider_options, disabled_optimizers)
File "/opt/conda/lib/python3.7/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 379, in _create_inference_session
sess.initialize_session(providers, provider_options, disabled_optimizers)
onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node (Squeeze_1169) Op (Squeeze) [ShapeInferenceError] Dimension of input 1 must be 1 instead of 768
Has anyone run into this problem?
It may be that you picked the wrong model. The acceleration here was only implemented for that one specific model; switching to a different model requires changing the code, and I haven't generalized the code yet.
I'm using the pytorch_model.bin checkpoint. Could you tell me which part of the code needs to be changed?
Haha, it only works for that one specific model~
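
For anyone hitting the same Squeeze shape-inference failure: the error message says a Squeeze node is receiving a dimension of size 768 where the exported graph expects size 1, which usually points to shapes that were baked in for a different model. Below is a minimal, illustrative sketch (not this repository's actual torch_to_onnx.py) of exporting a standard HuggingFace BERT checkpoint with dynamic axes; the directory paths, function name, and opset version are placeholders to adapt.

```python
# Illustrative sketch only -- NOT the repository's torch_to_onnx.py.
# Assumes a standard HuggingFace BERT checkpoint directory containing
# pytorch_model.bin and config.json; paths and opset are placeholders.
import torch
import onnxruntime
from transformers import BertModel, BertTokenizer

MODEL_DIR = "./my_bert_model"          # assumption: your checkpoint directory
MODEL_ONNX_PATH = "./bert_model.onnx"  # assumption: output path

def convert_bert_torch_to_onnx():
    tokenizer = BertTokenizer.from_pretrained(MODEL_DIR)
    # return_dict=False makes the model return plain tensors,
    # which is simpler for torch.onnx.export to trace.
    model = BertModel.from_pretrained(MODEL_DIR, return_dict=False)
    model.eval()

    # Dummy inputs only fix the tensor rank; dynamic_axes keeps batch and
    # sequence dimensions symbolic so the graph is not tied to one shape.
    dummy = tokenizer("example text", return_tensors="pt")
    inputs = (dummy["input_ids"], dummy["attention_mask"], dummy["token_type_ids"])

    torch.onnx.export(
        model,
        inputs,
        MODEL_ONNX_PATH,
        input_names=["input_ids", "attention_mask", "token_type_ids"],
        output_names=["last_hidden_state", "pooler_output"],
        dynamic_axes={
            "input_ids": {0: "batch", 1: "seq"},
            "attention_mask": {0: "batch", 1: "seq"},
            "token_type_ids": {0: "batch", 1: "seq"},
            "last_hidden_state": {0: "batch", 1: "seq"},
        },
        opset_version=12,
    )

    # Creating the session runs ONNX shape inference; this is the step where
    # the "Dimension of input 1 must be 1 instead of 768" error above is
    # raised when the exported graph does not match the checkpoint's shapes.
    sess = onnxruntime.InferenceSession(MODEL_ONNX_PATH)
    print([i.name for i in sess.get_inputs()])

if __name__ == "__main__":
    convert_bert_torch_to_onnx()
```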