YOLO inference using ONNX model

Run YOLO inference in C++ or Python using an ONNX model

[Output image] Pic credit: Matthias Hangst/Getty Images

Prerequisites/Tested On

  • OpenCV 4.7.0
  • Python 3.10
  • CMake 3.5.1
  • C++ 17
  • Tested with YOLOv5 and YOLOv7 ONNX models
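
To check the Python-side versions against the tested setup, a quick sanity check such as the following can be run (assuming OpenCV's Python bindings are installed):

# Print interpreter and OpenCV versions to compare against the tested setup
import sys
import cv2

print("Python:", sys.version.split()[0])  # tested with 3.10
print("OpenCV:", cv2.__version__)         # tested with 4.7.0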

Config

  • Change the input image, class names, and model path in src/yolo_inference.cpp or src/yolo_inference.py
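
As an illustration, the values to change might look like the following; the variable names and paths here are hypothetical, so check the actual scripts for the real ones:

# Hypothetical configuration values; the real names in src/yolo_inference.py
# (and their C++ counterparts in src/yolo_inference.cpp) may differ
MODEL_PATH = "models/yolov5s.onnx"                 # exported ONNX model
IMAGE_PATH = "images/input.jpg"                    # image to run inference on
CLASS_NAMES = ["person", "bicycle", "car", "dog"]  # subset of COCO class names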

Follow the steps below for C++ or Python.

C++

Build

  • Clone the repository
  • Create a build directory
  • Run cmake
  • Run make

Steps

git clone https://github.com/kvnptl/yolo-inference-onnx.git
cd yolo-inference-onnx
mkdir build
cd build
cmake ..
make

Run

  • Go to the build directory
  • Run the executable

./yolo_inference

(OPTIONAL) Note: the header file include/yolo_inference.hpp contains the inference function, which you can call from your own code.

Python

python3 src/yolo_inference.py
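
For orientation, below is a minimal sketch of what ONNX-based YOLO inference with OpenCV's DNN module can look like for a YOLOv5-style output layout. It is an illustration only, not the repository's actual src/yolo_inference.py; the model path, image path, input size, and thresholds are assumptions.

# Minimal sketch of YOLOv5-style ONNX inference with OpenCV's DNN module
import cv2
import numpy as np

MODEL_PATH = "yolov5s.onnx"   # hypothetical model path
IMAGE_PATH = "input.jpg"      # hypothetical input image
INPUT_SIZE = 640              # assumed network input resolution
CONF_THRESHOLD, NMS_THRESHOLD = 0.4, 0.45

image = cv2.imread(IMAGE_PATH)
img_h, img_w = image.shape[:2]

# Load the ONNX model and build a normalized, resized input blob
net = cv2.dnn.readNetFromONNX(MODEL_PATH)
blob = cv2.dnn.blobFromImage(image, 1 / 255.0, (INPUT_SIZE, INPUT_SIZE),
                             swapRB=True, crop=False)
net.setInput(blob)
predictions = net.forward()[0]  # rows of [cx, cy, w, h, objectness, class scores...]

boxes, scores, class_ids = [], [], []
for row in predictions:
    obj_conf = float(row[4])
    class_id = int(np.argmax(row[5:]))
    score = obj_conf * float(row[5 + class_id])
    if score < CONF_THRESHOLD:
        continue
    # Convert center/size coordinates from network scale back to the original image
    cx, cy, bw, bh = row[:4]
    x = int((cx - bw / 2) * img_w / INPUT_SIZE)
    y = int((cy - bh / 2) * img_h / INPUT_SIZE)
    boxes.append([x, y, int(bw * img_w / INPUT_SIZE), int(bh * img_h / INPUT_SIZE)])
    scores.append(score)
    class_ids.append(class_id)

# Non-maximum suppression drops overlapping detections
indices = cv2.dnn.NMSBoxes(boxes, scores, CONF_THRESHOLD, NMS_THRESHOLD)
for i in np.array(indices).flatten():
    x, y, bw, bh = boxes[i]
    cv2.rectangle(image, (x, y), (x + bw, y + bh), (0, 255, 0), 2)
    cv2.putText(image, str(class_ids[i]), (x, y - 5),
                cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 255, 0), 2)

cv2.imwrite("output.jpg", image)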