laugh12321/TensorRT-YOLO

[Help]: trtexec export tensorrt model failed

fungtion opened this issue · 4 comments

When running `trtexec --onnx=model.onnx --saveEngine=model.engine --fp16`, engine creation fails with: could not find any implementation for node /model/model/model.0/conv/Conv


Is model.onnx exported using export.py? What is the version of TensorRT you're using? Could you provide the model and its version for testing purposes?

Sorry for the late reply. I'm using TensorRT 8.6.1.6 + CUDA 11.8 + cuDNN 8.8 to export yolov9-c-converted.onnx to a TensorRT engine. I tested this configuration on a 4090 and a P4, where it works fine, but running the same configuration on a Titan X produces this error.

If you're encountering export failures specifically on certain devices, it could be an issue related to TensorRT. You may want to refer to NVIDIA/TensorRT#3640 for more information and potential solutions.
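As an additional diagnostic step, it may help to re-run the export with verbose logging, and to try a plain FP32 build, since Maxwell-class GPUs such as the Titan X have very limited native FP16 support. This is only a suggested sketch using standard `trtexec` flags, not a confirmed fix:

```shell
# Re-run with verbose logging to see which tactics TensorRT
# considered for the failing Conv node.
trtexec --onnx=model.onnx --saveEngine=model.engine --fp16 --verbose

# Try a plain FP32 build; if this succeeds, the failure is likely
# tied to FP16 tactic selection on this GPU.
trtexec --onnx=model.onnx --saveEngine=model.engine
```

If the FP32 engine builds, running without `--fp16` on the Titan X may be a usable workaround while the TensorRT issue is investigated.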

Thanks, I will check that.