sicara/tf2-yolov4

Inference failing with exported TFLite models

m-romanenko opened this issue · 0 comments

The following code results in an error:

import tensorflow as tf
import numpy as np

# Load the TFLite model and allocate tensors.
interpreter = tf.lite.Interpreter(model_path="yolov4.tflite")
interpreter.allocate_tensors()

# Get input and output tensors.
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Test the model on random input data.
input_shape = input_details[0]['shape']
input_data = np.array(np.random.random_sample(input_shape), dtype=np.float32)
interpreter.set_tensor(input_details[0]['index'], input_data)
interpreter.invoke()

RuntimeError: Encountered unresolved custom op: CombinedNonMaxSuppression.
Node number 603 (CombinedNonMaxSuppression) failed to prepare.

An issue has been opened in the TensorFlow repo to add the CombinedNonMaxSuppression op to Flex, which will hopefully resolve this:
tensorflow/tensorflow#41012
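
In the meantime, one possible workaround is to export the model with SELECT_TF_OPS enabled, so that ops without a TFLite builtin kernel (such as CombinedNonMaxSuppression) fall back to the Flex delegate. The sketch below is not the repo's export path; it uses a small hypothetical tf.function wrapping tf.image.combined_non_max_suppression just to reproduce the op, then converts and runs it. The Python tf.lite.Interpreter that ships with the full TensorFlow pip package links the Flex delegate, so the resulting model loads and runs:

```python
import numpy as np
import tensorflow as tf

# Hypothetical stand-in for the YOLOv4 post-processing: a graph that
# contains CombinedNonMaxSuppression, the op failing in the issue.
@tf.function(input_signature=[
    tf.TensorSpec([1, 10, 1, 4], tf.float32),  # boxes: [batch, num_boxes, q, 4]
    tf.TensorSpec([1, 10, 2], tf.float32),     # scores: [batch, num_boxes, classes]
])
def postprocess(boxes, scores):
    return tf.image.combined_non_max_suppression(
        boxes, scores, max_output_size_per_class=5, max_total_size=5
    )

converter = tf.lite.TFLiteConverter.from_concrete_functions(
    [postprocess.get_concrete_function()]
)
# Allow TFLite builtins, and let everything else (including
# CombinedNonMaxSuppression) be emitted as a Flex (select TF) op.
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,
    tf.lite.OpsSet.SELECT_TF_OPS,
]
tflite_model = converter.convert()

# With the Flex delegate available, allocate_tensors() and invoke()
# no longer fail on the unresolved custom op.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
for d in interpreter.get_input_details():
    interpreter.set_tensor(
        d['index'], np.random.random_sample(d['shape']).astype(np.float32)
    )
interpreter.invoke()
```

Note that SELECT_TF_OPS grows the binary on mobile targets, since the Flex delegate pulls in TensorFlow kernels; it is a stopgap until the op is supported natively.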