RizhaoCai/PyTorch_ONNX_TensorRT

AttributeError: 'NoneType' object has no attribute 'create_execution_context'

Zhang-O opened this issue · 1 comments

Connected to pydev debugger (build 181.5540.34)
Loading ONNX file from path ./models/onnx/model.onnx...
Beginning ONNX file parsing
WARNING: Your ONNX model has been generated with INT64 weights, while TensorRT does not natively support INT64. Attempting to cast down to INT32.
Successfully casted down to INT32.
Completed parsing of ONNX file
Building an engine from file ./models/onnx/model.onnx; this may take a while...
[TensorRT] ERROR: Network must have at least one output
Failed to create the engine

Hello @Zhang-O

The error `[TensorRT] ERROR: Network must have at least one output` usually means the ONNX parser failed to mark any output tensor on the network. The most common cause is a version mismatch between PyTorch's ONNX exporter and TensorRT's ONNX parser.
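To see which versions are involved, a quick sketch (it assumes nothing beyond the standard library; packages that are not installed are simply reported as missing):

```python
# Print the versions of the packages involved in the export/parse chain.
# A mismatch between the PyTorch exporter and the TensorRT parser is the
# usual suspect for this error.
import importlib

for name in ("torch", "onnx", "tensorrt"):
    try:
        mod = importlib.import_module(name)
        print(name, mod.__version__)
    except ImportError:
        print(name, "not installed")
```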
You can do two things:

  1. Please post your version information (PyTorch, ONNX, and TensorRT) so I can help you figure out the cause.
  2. Run `trtexec --onnx=${onnx_file_name} --explicitBatch` from the command line to see the output.
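If the parser really did drop the output, a common workaround is to mark the last layer's output yourself before building the engine. A minimal sketch, assuming the TensorRT 6-era Python API (`trt.Builder`, `trt.OnnxParser`, `build_cuda_engine`) and the model path from your log; the import is guarded so the sketch stays loadable where TensorRT is absent:

```python
# Workaround sketch: if the parsed network has no marked outputs
# ("Network must have at least one output"), mark the last layer's
# output manually before building the engine.
try:
    import tensorrt as trt
except ImportError:  # TensorRT not installed in this environment
    trt = None

def build_engine(onnx_path="./models/onnx/model.onnx"):
    if trt is None:
        raise RuntimeError("TensorRT is required to build an engine")
    logger = trt.Logger(trt.Logger.WARNING)
    builder = trt.Builder(logger)
    network = builder.create_network()
    parser = trt.OnnxParser(network, logger)
    with open(onnx_path, "rb") as f:
        if not parser.parse(f.read()):
            for i in range(parser.num_errors):
                print(parser.get_error(i))
            return None
    # If the parser did not mark any output, mark the output of the
    # network's last layer as the engine output.
    if network.num_outputs == 0:
        last_layer = network.get_layer(network.num_layers - 1)
        network.mark_output(last_layer.get_output(0))
    builder.max_workspace_size = 1 << 28
    return builder.build_cuda_engine(network)
```

This only papers over the mismatch; aligning the exporter and parser versions (or re-exporting with a different opset) is the proper fix.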