
I need to export a custom object detection model, fine-tuned on a custom dataset, to TensorFlow Lite, so that it can run on Android devices.

I'm using TensorFlow 2.4.1 on Ubuntu 18.04, and so far this is what I did:

  1. fine-tuned an 'ssd_mobilenet_v2_fpnlite_640x640_coco17_tpu-8' model, using a dataset of new images. I used the 'model_main_tf2.py' script from the repository;
  2. I exported the model using 'exporter_main_v2.py':
python exporter_main_v2.py --input_type image_tensor --pipeline_config_path .\models\custom_model\pipeline.config --trained_checkpoint_dir .\models\custom_model\ --output_directory .\exported-models\custom_model

which produced a SavedModel (saved_model.pb);
  3. I tested the exported model for inference, and everything works fine. In the detection routine, I used:

def get_model_detection_function(model):
    """Get a tf.function for detection."""

    @tf.function
    def detect_fn(image):
        """Detect objects in image."""
        image, shapes = model.preprocess(image)
        prediction_dict = model.predict(image, shapes)
        detections = model.postprocess(prediction_dict, shapes)
        return detections, prediction_dict, tf.reshape(shapes, [-1])
    return detect_fn

and the shape of the produced image object is 640x640, as expected.
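
For reference, this is roughly how I call the function (a minimal sketch; detection_model and image_np are placeholders for the model restored from my fine-tuned checkpoint and for a test image, neither shown here):

import numpy as np
import tensorflow as tf

# 'detection_model' is assumed to be built from pipeline.config and restored
# from the fine-tuned checkpoint; 'image_np' is an HxWx3 uint8 test image.
detect_fn = get_model_detection_function(detection_model)

input_tensor = tf.convert_to_tensor(np.expand_dims(image_np, 0), dtype=tf.float32)
detections, prediction_dict, shapes = detect_fn(input_tensor)

boxes = detections['detection_boxes'][0].numpy()
scores = detections['detection_scores'][0].numpy()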

Then I tried to convert this SavedModel to tflite. After updating to the nightly version of TensorFlow (with the stable version I got an error), I was able to produce a .tflite file using this code:

import tensorflow as tf
from tflite_support import metadata as _metadata

saved_model_dir = 'exported-models/custom_model/'

# Convert the model.
converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.experimental_new_converter = True
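# SELECT_TF_OPS lets the converter fall back to regular TensorFlow kernels for
# ops that have no builtin TFLite implementation; at runtime such a model needs
# the Flex delegate.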
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS, tf.lite.OpsSet.SELECT_TF_OPS]
tflite_model = converter.convert()

# Save the model.
with open('tflite/custom_model.tflite', 'wb') as f:
    f.write(tflite_model)

I tried to use this model in Android Studio, following the instructions given here.

However, I'm getting a couple of errors:

  1. something about 'Not a valid TensorFlow Lite model' (I still need to look into this in more detail);
  2. the error:
java.lang.IllegalArgumentException: Cannot copy to a TensorFlowLite tensor (serving_default_input_tensor:0) with 3 bytes from a Java Buffer with 270000 bytes.

The second error suggests something is off with the input the tflite model expects: 3 bytes is exactly what a 1x1x1x3 uint8 tensor would occupy. I examined the file with Netron, and this is what I got:

[Netron screenshot of the converted model, showing the input tensor]

The input is indeed expected to have a 1x1x1x3 shape, or am I misinterpreting the graph? Should I somehow set the input tensor size when converting to tflite?
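
To double-check without Netron, I can also read the input signature back with the TFLite interpreter (a minimal sketch, assuming the custom_model.tflite written above):

import tensorflow as tf

# Print the input/output details of the converted model; the reported input
# shape should match what Netron shows.
interpreter = tf.lite.Interpreter(model_path='tflite/custom_model.tflite')
print(interpreter.get_input_details())
print(interpreter.get_output_details())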

Anyway, what is the right way to export my custom model so that it can run on Android?


1 Answer


TF ops are only supported via the Flex delegate, and I bet that is the problem. If you want to check whether that is the case, you can do the following:

  1. Download the benchmark binary with Flex delegate support for TF ops. You can find it in the 'Native benchmark binary' section here: https://www.tensorflow.org/lite/performance/measurement. For example, for Android it is https://storage.googleapis.com/tensorflow-nightly-public/prod/tensorflow/release/lite/tools/nightly/latest/android_aarch64_benchmark_model_plus_flex

  2. Connect your phone to your computer and, from the folder where you downloaded the binary, run adb push <binary_name> /data/local/tmp

  3. Push your model: adb push <tflite_model> /data/local/tmp

  4. Open a shell with adb shell and go to the folder with cd /data/local/tmp. Then run the benchmark with ./<binary_name> --graph=<tflite_model>

Info from:
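
As an additional quick check (a rough sketch, assuming the custom_model.tflite produced in the question): the desktop TensorFlow pip package already links the Flex delegate, so the same file should run with the Python interpreter. If it does, the file itself is valid and the failures are on the Android side, where the select-TF-ops runtime has to be included as well.

import numpy as np
import tensorflow as tf

# The full TensorFlow pip package links the Flex delegate, so a model converted
# with SELECT_TF_OPS can be invoked here even though a plain TFLite runtime
# without Flex support would reject it.
interpreter = tf.lite.Interpreter(model_path='custom_model.tflite')
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
# Feed a dummy input using whatever default shape and dtype the model reports.
dummy = np.zeros(input_details[0]['shape'], dtype=input_details[0]['dtype'])
interpreter.set_tensor(input_details[0]['index'], dummy)
interpreter.invoke()
print('Invoke succeeded; Flex ops are available in this runtime.')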