problems during inference when using converted tflite model #169

Open
lukqw opened this issue Oct 18, 2021 · 1 comment
lukqw commented Oct 18, 2021

Hi there, I recently tried to convert your model to TFLite and run inference on it, but I am running into some errors.

I'm using the following code to convert the model to TFLite:

import tensorflow as tf

model = Load_Yolo_model()
model.save("./yolo_model")

# Convert the SavedModel to TFLite
model_converter = tf.lite.TFLiteConverter.from_saved_model("./yolo_model")
model_lite = model_converter.convert()

with open("./yolo.tflite", "wb") as f:
    f.write(model_lite)

This creates both the ./yolo_model folder and the .tflite file, which I should then be able to use like this:

import numpy as np
import tensorflow as tf

model = tf.lite.Interpreter(model_path="./yolo.tflite")
model.allocate_tensors()

input_details = model.get_input_details()
output_details = model.get_output_details()

# Feed a random tensor matching the expected input shape
input_shape = input_details[0]['shape']
input_data = np.array(np.random.random_sample(input_shape), dtype=np.float32)
model.set_tensor(input_details[0]['index'], input_data)

model.invoke()
output = model.get_tensor(output_details[0]['index'])

But I am getting the following error message:

external/org_tensorflow/tensorflow/lite/kernels/reshape.cc:58 stretch_dim != -1 (0 != -1)
Node number 35 (RESHAPE) failed to prepare.

Did anyone else run into this issue? I'm not entirely sure what I am doing wrong.


lukqw commented Oct 21, 2021

I managed to solve this problem by converting the model in the following way instead:
(taken from here)

import tensorflow as tf

batch_size = 1
model = Load_Yolo_model()
model.save("./yolo_model")

# Pin the dynamic batch dimension to a concrete size before converting
input_shape = model.inputs[0].shape.as_list()
input_shape[0] = batch_size
func = tf.function(model).get_concrete_function(
    tf.TensorSpec(input_shape, model.inputs[0].dtype))

model_converter = tf.lite.TFLiteConverter.from_concrete_functions([func])
model_lite = model_converter.convert()

with open("./yolo_model.tflite", "wb") as f:
    f.write(model_lite)
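A plausible explanation (my reading, not confirmed in the thread): the SavedModel keeps a dynamic batch dimension (None), which the TFLite RESHAPE kernel cannot resolve when preparing the graph, hence the stretch_dim error. The fix above works by pinning every dynamic dimension to a concrete size before conversion. The shape-fixing step alone amounts to this sketch (the 416x416x3 input shape is a hypothetical YOLO default, not taken from the repo):

```python
# Hypothetical YOLO input shape with a dynamic batch dimension (None)
input_shape = [None, 416, 416, 3]

batch_size = 1
# Replace every dynamic (None) dimension with a concrete size
fixed_shape = [batch_size if dim is None else dim for dim in input_shape]
print(fixed_shape)  # [1, 416, 416, 3]
```

With a fully concrete shape, the converter can trace a graph whose reshape targets are static, so the RESHAPE node prepares cleanly.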
