Passer1072/RookieAI_yolov8

After converting my own ONNX model to a TensorRT (.engine) model, calling it in the project raises an error


Exception in thread Thread-2 (main_program_loop):
Traceback (most recent call last):
File "D:\miniconda3\envs\RookieAI_yolov8-main\lib\threading.py", line 1016, in _bootstrap_inner
self.run()
File "D:\miniconda3\envs\RookieAI_yolov8-main\lib\threading.py", line 953, in run
self._target(*self._args, **self._kwargs)
File "D:\ai\RookieAI_yolov8-main\RookieAI_YOLOv8_V2.5.0.py", line 2179, in main_program_loop
results = model.predict(frame, save=False, conf=confidence, half=True, agnostic_nms=True, iou=0.7, classes=[
File "D:\miniconda3\envs\RookieAI_yolov8-main\lib\site-packages\ultralytics\engine\model.py", line 444, in predict
return self.predictor.predict_cli(source=source) if is_cli else self.predictor(source=source, stream=stream)
File "D:\miniconda3\envs\RookieAI_yolov8-main\lib\site-packages\ultralytics\engine\predictor.py", line 168, in __call__
return list(self.stream_inference(source, model, *args, **kwargs)) # merge list of Result into one
File "D:\miniconda3\envs\RookieAI_yolov8-main\lib\site-packages\torch\utils\_contextlib.py", line 35, in generator_context
response = gen.send(None)
File "D:\miniconda3\envs\RookieAI_yolov8-main\lib\site-packages\ultralytics\engine\predictor.py", line 261, in stream_inference
self.results = self.postprocess(preds, im, im0s)
File "D:\miniconda3\envs\RookieAI_yolov8-main\lib\site-packages\ultralytics\models\yolo\detect\predict.py", line 25, in postprocess
preds = ops.non_max_suppression(
File "D:\miniconda3\envs\RookieAI_yolov8-main\lib\site-packages\ultralytics\utils\ops.py", line 229, in non_max_suppression
xc = prediction[:, 4:mi].amax(1) > conf_thres # candidates
IndexError: amax(): Expected reduction dim 1 to have non-zero size.
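For context, the failing line in ultralytics slices the per-class score channels out of the raw prediction tensor and takes their maximum; the IndexError means that slice is empty, i.e. the engine's output does not have the `(batch, 4 + num_classes, num_anchors)` layout the postprocessor expects. A minimal NumPy sketch of that candidate-selection step (the shapes and the `mi` index are assumptions reconstructed from the traceback, not the exact library code):

```python
import numpy as np

def candidate_mask(prediction: np.ndarray, conf_thres: float = 0.25) -> np.ndarray:
    """Mimic the candidate selection in ops.non_max_suppression:
    prediction is (batch, 4 + nc, num_anchors); channels 0-3 are box
    coordinates, channels 4:mi are per-class scores."""
    mi = prediction.shape[1]          # 4 box channels + nc class channels
    scores = prediction[:, 4:mi]      # (batch, nc, num_anchors)
    if scores.shape[1] == 0:          # this is the failure in the traceback
        raise IndexError("amax(): Expected reduction dim 1 to have non-zero size.")
    return scores.max(axis=1) > conf_thres  # (batch, num_anchors) mask

# A well-formed YOLOv8 output: 4 box + 80 class channels, 8400 anchors.
ok = candidate_mask(np.random.rand(1, 84, 8400))
print(ok.shape)  # (1, 8400)

# An engine whose output has only the 4 box channels reproduces the error.
try:
    candidate_mask(np.random.rand(1, 4, 8400))
except IndexError as e:
    print(e)
```

So the error points at a shape mismatch between what the exported engine produces and what the predictor expects, rather than at the NMS settings themselves.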

Are there any parameter requirements for the TensorRT export? QAQ

yolo export model="your_model.pt" format=engine device=0 imgsz=640

Or

yolo export model="your_model.pt" format=engine device=0 imgsz=640 half=True

I get the same error. Is it because my imgsz isn't 640? I'm using a size of 320.
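That mismatch is a plausible cause: a TensorRT engine bakes in the input size it was exported with, so if the engine was exported at imgsz=640 but inference runs on 320-sized frames (or vice versa), the output grid sizes no longer line up with what the postprocessor expects. A quick back-of-envelope sketch, assuming the standard YOLOv8 detection-head strides of 8/16/32:

```python
def num_anchors(imgsz: int, strides=(8, 16, 32)) -> int:
    """Total predictions across the three YOLOv8 detection heads:
    one per cell of each (imgsz/stride) x (imgsz/stride) grid."""
    return sum((imgsz // s) ** 2 for s in strides)

print(num_anchors(640))  # 8400 anchors in an engine exported at imgsz=640
print(num_anchors(320))  # 2100 anchors at imgsz=320
```

If you work at 320, it is worth re-exporting with a matching size, e.g. `yolo export model="your_model.pt" format=engine device=0 imgsz=320`, and passing the same `imgsz=320` to `model.predict(...)` so the engine's fixed input shape matches your frames.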