grimoire/mmdetection-to-tensorrt

Converted model.engine does not work properly on DeepStream

xarauzo opened this issue · 0 comments

I converted an MMDetection model with this tool and got a `model.engine` file as output. However, when running inference in DeepStream (with the amirstan plugin), the results are not as expected. TensorRT reports no errors during inference. With a 0.5 confidence threshold, no detections are shown; lowering the threshold to 0.1 (just to see what happens) produces a lot of bounding boxes, but none of them are correct.

I am using DeepStream 5.0 on a Jetson Xavier NX running JetPack 4.4 (I can't change either the DeepStream or the JetPack version).
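In case it helps, here is how I am interpreting the engine outputs on the DeepStream side. I am assuming the four-tensor batched-NMS layout (num_detections, boxes, scores, classes) that TensorRT NMS plugins commonly emit; the tensor names, shapes, and sample values below are illustrative guesses, not taken from the plugin itself. If the custom bbox parser and the engine disagree on this layout, I would expect exactly the symptom I see: garbage boxes that only appear at very low thresholds.

```python
import numpy as np

# Hypothetical batched-NMS output for batch index 0; values are made up
# for illustration. Real engines pad the unused slots with zeros.
num_detections = np.array([3])                 # valid detections in this image
boxes = np.array([[0.10, 0.20, 0.40, 0.60],
                  [0.50, 0.50, 0.90, 0.95],
                  [0.00, 0.00, 0.05, 0.05],
                  [0.00, 0.00, 0.00, 0.00]])   # padded (unused) slot
scores = np.array([0.91, 0.55, 0.08, 0.00])
classes = np.array([0, 2, 1, -1])

def parse_detections(n, boxes, scores, classes, score_thresh=0.5):
    """Keep only the first n slots, then apply the confidence threshold."""
    n = int(n)
    keep = scores[:n] >= score_thresh
    return boxes[:n][keep], scores[:n][keep], classes[:n][keep]

kept_boxes, kept_scores, kept_classes = parse_detections(
    num_detections[0], boxes, scores, classes, score_thresh=0.5)
print(len(kept_boxes))  # 2 detections survive the 0.5 threshold
```

Reading all slots without honoring `num_detections`, or interpreting the boxes in the wrong coordinate convention (normalized vs. pixel, xyxy vs. xywh), would produce many spurious boxes at low thresholds like I am seeing.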