A tutorial on how to deploy a DNN on an Android device using TFLite.
You can follow the official tutorial here. Unfortunately, only the classification demo works out of the box. When you run the object detection demo on your phone, it reports the runtime error "Object Detector not found". The reason is a missing shared library. This section explains how to fix that.
The tutorial mentioned above only offers example models such as SSD and MobileNet v1. You can also use your own model. However, there is currently no thorough, detailed tutorial online that teaches you how to perform quantization and Android deployment. This tutorial covers how to do it step by step.
- If you are using another framework such as MXNet or PyTorch, the first thing you need to do is convert your DNN model into a TensorFlow model via ONNX.
- Before 8-bit quantization, you first need to apply fake quantization to the model, using `tf.contrib.quantize.create_training_graph` to record min/max ranges and `tf.contrib.quantize.create_eval_graph` for evaluation (you can refer to the link). Then run the model on your calibration dataset.
- Use the TFLite converter to convert the fake-quantized model to a `.tflite` model. Note that the operator list needs to contain all the intermediate operators. You can do this from either the command line or Python code.
- Put your quantized model in the `asset` folder and revise the path in the code here. Please also make sure the `label.txt` in the `asset` folder corresponds to your dataset.
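Conceptually, the min/max ranges recorded by the fake-quantization nodes determine an affine mapping from floats to 8-bit integers. Below is a minimal pure-Python sketch of that mapping; the function names are my own illustration, and TensorFlow's real implementation additionally nudges the range so that 0.0 is exactly representable:

```python
# A simplified sketch of affine 8-bit quantization derived from a
# recorded [min, max] range. Not the TensorFlow API.

def quant_params(min_val, max_val, num_bits=8):
    """Derive scale and zero point from a recorded [min, max] range."""
    qmax = 2 ** num_bits - 1                   # 255 for 8 bits
    scale = (max_val - min_val) / qmax
    zero_point = int(round(-min_val / scale))  # integer that real 0.0 maps to
    return scale, zero_point

def quantize(x, scale, zero_point, num_bits=8):
    qmax = 2 ** num_bits - 1
    q = int(round(x / scale)) + zero_point
    return max(0, min(qmax, q))                # clamp to the uint8 range

def dequantize(q, scale, zero_point):
    return (q - zero_point) * scale

scale, zp = quant_params(-1.0, 1.0)
q = quantize(0.5, scale, zp)
x_hat = dequantize(q, scale, zp)               # close to 0.5, off by less than one scale step
```

The round trip loses at most half a quantization step per value, which is why a representative calibration dataset matters: the tighter the recorded min/max range, the smaller the step.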
A significant performance improvement can be seen when comparing the floating-point model with the 8-bit quantized one.

| Floating-point demo | 8-bit demo |
| --- | --- |
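Part of the improvement is simply footprint: quantized weights take 1 byte each instead of 4. A back-of-the-envelope calculation, assuming an approximate MobileNet v1 parameter count:

```python
# Rough weight-storage comparison: float32 weights are 4 bytes each,
# int8 weights are 1 byte each. The parameter count is an approximate
# figure for MobileNet v1, used only for illustration.

def weight_size_mb(num_params, bytes_per_param):
    return num_params * bytes_per_param / (1024 ** 2)

params = 4_200_000                        # approx. MobileNet v1 parameters
fp32_mb = weight_size_mb(params, 4)       # ~16 MB
int8_mb = weight_size_mb(params, 1)       # ~4 MB
print(f"float32: {fp32_mb:.1f} MB, int8: {int8_mb:.1f} MB")
```

The 4x size reduction also shrinks memory traffic at inference time, which is where much of the speedup on mobile CPUs comes from.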
I have built the example project for you in the `example` folder.
- Download Bazel 0.19 and TensorFlow 1.13 for cross-compilation. Note that the versions must match, meaning that if you change the TensorFlow version, you need to change the Bazel version accordingly.
- Install Android Studio, the NDK, and the SDK according to the guidance here, then set the corresponding environment variables to their paths.
- Run `./configure` in the tensorflow directory to set up the Android environment (this needs the environment variables from step 2).
- Run the command `bazel build -c opt --cxxopt='--std=c++11' --fat_apk_cpu=armeabi-v7a --cpu=armeabi-v7a //tensorflow/lite/examples/android:tflite_demo`. You can change the CPU target to the one you want.
- An APK will be generated in `tensorflow/bazel-bin`; untar it to get the shared library called `libtensorflow_demo.so`.
- Create a folder called `jniLibs` and put the shared library inside it.
- Change the app name in all the files under the project to `org.tensorflow.demo`.
- Open the project in Android Studio and connect your Android device to your computer via USB.
- Click `Sync project with gradle file`, then run. If you already have the APK file, you can also install it via adb:
```shell
cd /path/to/Android/Sdk
cd platform-tools/
./adb devices                                # lists attached device serials, e.g. aaabbbccc
./adb -s aaabbbccc install /path/to/your/apk
```
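If you install on several devices often, the adb steps above can be wrapped in a small script. Here is a sketch, with the output parsing separated so it can be checked without a device attached; the paths and serials are placeholders:

```python
# A small helper around `adb devices` / `adb install`. Serial numbers and
# APK paths are placeholders; parsing is split out so it is testable offline.
import subprocess

def parse_adb_devices(output):
    """Extract serials of ready devices from `adb devices` output."""
    serials = []
    for line in output.strip().splitlines()[1:]:  # skip "List of devices attached"
        parts = line.split()
        if len(parts) == 2 and parts[1] == "device":  # skip offline/unauthorized
            serials.append(parts[0])
    return serials

def install_apk(serial, apk_path, adb="adb"):
    """Install an APK on one specific device (like `adb -s <serial> install`)."""
    subprocess.run([adb, "-s", serial, "install", apk_path], check=True)

sample = """List of devices attached
aaabbbccc\tdevice
dddeeefff\toffline
"""
print(parse_adb_devices(sample))  # -> ['aaabbbccc']
```

Filtering on the `device` state matters because `adb devices` also lists `offline` and `unauthorized` entries, which an install would fail on.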