YOLOv11 inference problem
Opened this issue · 3 comments
I have a custom yolov11m model. It has 2 classes and FP16 precision. First I converted the .pt model to ONNX, then I used this repo to convert ONNX -> TRT with the following command:
python3 export.py -o yolo11m.onnx -e yolov11m.trt --end2end --v8 -p fp16
The conversion process completed successfully. I then tried trt.py in this repo as an inference test, but it fails at this line:
self.imgsz = self.engine.get_tensor_shape(self.engine.get_tensor_name(0))[2:] # get the read shape of model, in case user input it wrong
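For context, here is a minimal sketch of what that line computes, assuming a static NCHW engine input such as (1, 3, 640, 640); the shape value is illustrative, not taken from my engine:

```python
# What trt.py's imgsz line extracts, assuming a static NCHW input.
shape = (1, 3, 640, 640)   # illustrative result of engine.get_tensor_shape(input_name)
imgsz = shape[2:]          # (640, 640): the H x W the preprocessor resizes to
print(imgsz)

# Note: if the engine were built with dynamic shapes, TensorRT reports those
# dims as -1, so imgsz would be (-1, -1) and the resize step would fail.
```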
Finally, let me explain the main problem. I want to deploy the yolov11 .trt model on Triton Inference Server. I was able to deploy the model on Triton and sent an image with the client code, but the inference response is wrong: all bounding box coordinates are correct, yet every detection is classified as the person class (index 0 in the labels list). At first I thought the model training was at fault, but inference with the .pt model gives correct results, both bbox and class.
I also tried the public yolov11m model from the GitHub repo and followed the same process; the problem persisted.
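One pattern that produces exactly this symptom (correct boxes, every class = 0) is a datatype mismatch on the class output: end2end-style engines typically emit class IDs as int32, and if the Triton model config or client declares that output as a float type, the int32 bytes are reinterpreted as tiny denormal floats that truncate to 0. This is a hedged hypothesis, not confirmed from the repo; the tensor names and values below are illustrative:

```python
import numpy as np

# True class IDs as an end2end engine would emit them (int32).
class_ids = np.array([1, 1, 0, 1], dtype=np.int32)

# Wrong: the raw int32 buffer read back as float32 (datatype mismatch
# in the Triton config). Each small int becomes a denormal float...
misread = class_ids.view(np.float32)

# ...which truncates to 0 for every detection -> everything looks like class 0.
print(misread.astype(np.int32))   # [0 0 0 0]

# Right: read the buffer with the dtype the engine actually produces.
print(class_ids)                  # [1 1 0 1]
```

If this is the cause, checking the declared data type of the class output in config.pbtxt against the engine's actual output dtype should reveal it.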
Hi YuiChoi,
What setup did you use for inference? Did you deploy your model in Triton Server or run trt.py?
I used trt.py from this repo.