inference_engine

Simple ONNX model inference engine.


About this repository

  • This is a simple ONNX model inference runtime.
  • This is just a toy project for my own study.

How to build sample scripts

Requirements

  • cmake (>= 3.15)
  • OpenCV (>= 4.0.0)
  • protobuf (>= 3.9.1)
  • An ONNX model file (a multilayer perceptron model for the MNIST sample, a VGG19 model for the ImageNet sample)

Build steps

MNIST + 3 MLP image classification model sample

Note that I have tested this script only with an ONNX model trained with the Chainer MNIST example and exported with onnx-chainer.

Other ONNX models are not guaranteed to work.

mkdir build
cd build
cmake -DCMAKE_BUILD_TYPE=Release ..
make
# {Download images from http://yann.lecun.com/exdb/mnist/}
# {Convert the binary to jpg image}
./example/mnist_mlp.o -i /path/to/mnist/image -m /path/to/onnx_model
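The conversion step above can be done with any tool. As a rough sketch using OpenCV (already a build requirement), the snippet below reads the raw MNIST IDX image file and writes each 28 x 28 digit as a JPEG. The input file name and output naming are assumptions for illustration, not part of this repository.

```cpp
// mnist_to_jpg.cpp -- illustrative sketch, not part of this repository.
// Reads the raw MNIST IDX image file and writes each 28x28 image as a JPEG.
#include <opencv2/opencv.hpp>
#include <cstdint>
#include <fstream>
#include <string>
#include <vector>

static uint32_t read_be32(std::ifstream& ifs) {
    unsigned char b[4];
    ifs.read(reinterpret_cast<char*>(b), 4);
    return (static_cast<uint32_t>(b[0]) << 24) | (static_cast<uint32_t>(b[1]) << 16) |
           (static_cast<uint32_t>(b[2]) << 8)  |  static_cast<uint32_t>(b[3]);
}

int main() {
    // Assumed input path -- adjust to where you downloaded the MNIST files.
    std::ifstream ifs("train-images-idx3-ubyte", std::ios::binary);
    if (!ifs) return 1;

    read_be32(ifs);                       // magic number (0x00000803)
    const uint32_t count = read_be32(ifs);
    const int rows = static_cast<int>(read_be32(ifs));
    const int cols = static_cast<int>(read_be32(ifs));

    std::vector<unsigned char> buf(static_cast<std::size_t>(rows) * cols);
    for (uint32_t i = 0; i < count; ++i) {
        ifs.read(reinterpret_cast<char*>(buf.data()), static_cast<std::streamsize>(buf.size()));
        cv::Mat img(rows, cols, CV_8UC1, buf.data());   // wrap raw pixels, no copy
        cv::imwrite("mnist_" + std::to_string(i) + ".jpg", img);
    }
    return 0;
}
```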

ImageNet + VGG19 image classification model sample

Note that I have tested this script only with a VGG19 ONNX model, using ImageNet images resized to 224 x 224.

Other ONNX models are not guaranteed to work.

mkdir build
cd build
cmake -DCMAKE_BUILD_TYPE=Release ..
make
# {Download imagenet image from http://image-net.org/synset}
# {Resize the images to 224 x 224}
./example/imagenet_vgg19.o -i /path/to/image_net/image -m /path/to/onnx_model
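The resize step can be done with any image tool. As a minimal sketch using OpenCV, the program below resizes one image to the 224 x 224 input size expected by VGG19; the program name and argument layout are assumptions for illustration.

```cpp
// resize_224.cpp -- illustrative sketch, not part of this repository.
// Resizes a single input image to 224 x 224 for the VGG19 sample.
#include <opencv2/opencv.hpp>
#include <iostream>

int main(int argc, char** argv) {
    if (argc < 3) {
        std::cerr << "usage: resize_224 <input_image> <output_image>\n";
        return 1;
    }
    cv::Mat src = cv::imread(argv[1], cv::IMREAD_COLOR);
    if (src.empty()) {
        std::cerr << "failed to read " << argv[1] << "\n";
        return 1;
    }
    cv::Mat dst;
    cv::resize(src, dst, cv::Size(224, 224));   // stretch to 224 x 224
    cv::imwrite(argv[2], dst);
    return 0;
}
```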

How to test

cd inference_engine/test
git submodule update
make

Supported Operators

  • Gemm
  • Relu
  • Convolution
  • MaxPooling
  • Dropout
  • Softmax
  • Reshape (provisional support)
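For reference, Gemm plus Relu boils down to a dense matrix product with a bias term followed by an element-wise max with zero. The snippet below is only an illustrative sketch of that math (the alpha, beta, and transpose attributes of ONNX Gemm are omitted); it is not this engine's actual implementation.

```cpp
// gemm_relu_sketch.cpp -- illustrative math only, not this engine's code.
// Gemm (alpha = beta = 1, no transposition): Y = X * W + B
// Relu: y = max(0, x), applied element-wise.
#include <algorithm>
#include <cstddef>
#include <vector>

// X: (m x k), W: (k x n), B: (n); returns Y: (m x n), all row-major.
std::vector<float> gemm(const std::vector<float>& X, const std::vector<float>& W,
                        const std::vector<float>& B,
                        std::size_t m, std::size_t k, std::size_t n) {
    std::vector<float> Y(m * n);
    for (std::size_t i = 0; i < m; ++i) {
        for (std::size_t j = 0; j < n; ++j) {
            float acc = B[j];                       // bias
            for (std::size_t p = 0; p < k; ++p)
                acc += X[i * k + p] * W[p * n + j]; // dot product of row i and column j
            Y[i * n + j] = acc;
        }
    }
    return Y;
}

// In-place Relu over a flat tensor.
void relu(std::vector<float>& v) {
    for (float& x : v) x = std::max(0.0f, x);
}
```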

License

MIT