iglaweb/TFProfiler

Support for custom ops in tflite file?

hamlatzis opened this issue · 3 comments

I wanted to profile my tflite file but failed to load it because it contains custom ops.

Looking at the code, I saw that the TFLite library is downloaded from the internet. I have compiled the library from source and added my custom ops. Is there a way to use my own version of the library instead of the one from the internet?

You need to follow this guide to support custom ops: https://www.tensorflow.org/lite/guide/ops_custom#convert_to_a_tensorflow_lite_model
Also, please try your custom model with different delegates: CPU, GPU, XNNPACK.
You are welcome to propose a pull request to the repo!

#1 (comment)
@iglaweb

That's what I'm saying. I have created my own custom ops and have been using my customised library for some time now. My problem is that I don't know how to modify TFProfiler so that it uses my version of the library instead of downloading one, so that I can profile my models.

For instance, in the Android demos provided with the original library, I found where to make changes so that they don't download a library but use my own local .aar file instead. However, those demos are in Java, not Kotlin.
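For reference, the usual Gradle pattern for pointing an app at a local .aar is a plain file dependency. This is only a sketch under the assumption that the custom-built AAR has been copied into the app module's libs/ folder (the file name is illustrative, not from TFProfiler's build):

    // app/build.gradle
    dependencies {
        // Local AAR replaces the remote org.tensorflow:tensorflow-lite artifact.
        // Assumes your custom build was copied to app/libs/tensorflow-lite.aar
        implementation files('libs/tensorflow-lite.aar')
    }

The same dependency notation works in both Groovy and Kotlin DSL build scripts, so the Java-vs-Kotlin difference in the demos should not matter here.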

In the build.gradle I see there is code to download the libraries:

    implementation 'org.tensorflow:tensorflow-lite-hexagon:0.0.0-nightly' //not available in stable channel
    implementation "org.tensorflow:tensorflow-lite:$tensorflow"
    implementation "org.tensorflow:tensorflow-lite-gpu:$tensorflow"
    implementation "org.tensorflow:tensorflow-lite-select-tf-ops:$tensorflow"
    implementation 'org.tensorflow:tensorflow-lite-support:0.1.0-rc1'
    implementation 'org.tensorflow:tensorflow-lite-metadata:0.1.0-rc2'

can I replace them with something like:

implementation project(':tensorflow-lite')

to get my local version?

Yeah, that is the way to do it: replace the Maven dependency with your own library project.
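To spell that out, a minimal sketch of the project-module approach (the module name and directory path are assumptions; adjust them to wherever your custom TensorFlow Lite build lives on disk):

    // settings.gradle — register the local library module
    include ':tensorflow-lite'
    // Hypothetical path to your local checkout/build of the library
    project(':tensorflow-lite').projectDir = new File('../my-tensorflow-lite')

    // app/build.gradle — swap the remote artifact for the local module
    dependencies {
        // implementation "org.tensorflow:tensorflow-lite:$tensorflow"  // remove this line
        implementation project(':tensorflow-lite')
    }

Any of the other org.tensorflow dependencies you rebuilt (e.g. the GPU or select-TF-ops artifacts) would need the same substitution; the ones you did not rebuild can stay as remote dependencies.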