TensorRT for SOLO (Python)
TensorRT >= 7.1
Ubuntu 18.04
Export the SOLOv2 checkpoint to ONNX:

python3 get_onnx.py --config ${SOLO_path}/configs/solov2_r101_fpn_8gpu_3x.py --checkpoint ${SOLO_path}/work_dirs/SOLOv2_R101_3x.pth --outputname solov2_r101.onnx
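
Before building the engine, it can help to sanity-check the exported graph. The sketch below is not part of this repo; it only assumes the `onnx` Python package and the file name produced by the export command above.

```python
# Minimal sketch (illustrative, not part of this repo): validate the exported
# ONNX graph and list its inputs/outputs before handing it to TensorRT.
import onnx

model = onnx.load("solov2_r101.onnx")
onnx.checker.check_model(model)  # raises if the graph is structurally invalid

# Print graph inputs/outputs so you know which bindings to expect later.
for inp in model.graph.input:
    dims = [d.dim_value for d in inp.type.tensor_type.shape.dim]
    print("input :", inp.name, dims)
for out in model.graph.output:
    dims = [d.dim_value for d in out.type.tensor_type.shape.dim]
    print("output:", out.name, dims)
```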

Build the FP16 TensorRT engine from the ONNX model and run inference on an image:

python3 inference.py --onnx_path solov2_r101.onnx --engine_path solov2_r101.engine --mode fp16 --image_path ${your_picture_path} --save --show
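
For reference, building an FP16 engine with the TensorRT Python API looks roughly like the sketch below. This is illustrative only and written against the TensorRT 7.x API listed in the requirements; `inference.py` may implement this step differently, and the file names simply reuse the ones from the commands above.

```python
# Rough sketch (illustrative, assumptions noted above): parse the ONNX file,
# enable FP16, and serialize a TensorRT engine using the TensorRT 7.x Python API.
import tensorrt as trt

logger = trt.Logger(trt.Logger.WARNING)
builder = trt.Builder(logger)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
parser = trt.OnnxParser(network, logger)

with open("solov2_r101.onnx", "rb") as f:
    if not parser.parse(f.read()):
        for i in range(parser.num_errors):
            print(parser.get_error(i))
        raise RuntimeError("failed to parse the ONNX file")

config = builder.create_builder_config()
config.max_workspace_size = 1 << 30       # 1 GiB workspace for tactic selection
if builder.platform_has_fast_fp16:
    config.set_flag(trt.BuilderFlag.FP16)  # corresponds to --mode fp16

engine = builder.build_engine(network, config)  # TensorRT 7.x builder call
with open("solov2_r101.engine", "wb") as f:
    f.write(engine.serialize())
```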

| GPU    | Model       | Mode | Inference time |
|--------|-------------|------|----------------|
| V100   | SOLOv2 R101 | FP16 | 35 ms          |
| Xavier | SOLOv2 R101 | FP16 | 150 ms         |