11 votes

The TensorFlow Lite Android demo works with the original model it provides, mobilenet_quant_v1_224.tflite. See: https://github.com/tensorflow/tensorflow/tree/master/tensorflow/contrib/lite

They also provide other pretrained lite models here: https://github.com/tensorflow/tensorflow/blob/master/tensorflow/contrib/lite/g3doc/models.md

However, I downloaded one of the smaller models from the above link, for example mobilenet_v1_0.25_224.tflite, and replaced the original model in the demo app by just changing MODEL_PATH = "mobilenet_v1_0.25_224.tflite"; in ImageClassifier.java. The app crashes with:

12-11 12:52:34.222 17713-17729/? E/AndroidRuntime: FATAL EXCEPTION: CameraBackground
    Process: android.example.com.tflitecamerademo, PID: 17713
    java.lang.IllegalArgumentException: Failed to get input dimensions. 0-th input should have 602112 bytes, but found 150528 bytes.
        at org.tensorflow.lite.NativeInterpreterWrapper.getInputDims(Native Method)
        at org.tensorflow.lite.NativeInterpreterWrapper.run(NativeInterpreterWrapper.java:82)
        at org.tensorflow.lite.Interpreter.runForMultipleInputsOutputs(Interpreter.java:112)
        at org.tensorflow.lite.Interpreter.run(Interpreter.java:93)
        at com.example.android.tflitecamerademo.ImageClassifier.classifyFrame(ImageClassifier.java:108)
        at com.example.android.tflitecamerademo.Camera2BasicFragment.classifyFrame(Camera2BasicFragment.java:663)
        at com.example.android.tflitecamerademo.Camera2BasicFragment.access$900(Camera2BasicFragment.java:69)
        at com.example.android.tflitecamerademo.Camera2BasicFragment$5.run(Camera2BasicFragment.java:558)
        at android.os.Handler.handleCallback(Handler.java:751)
        at android.os.Handler.dispatchMessage(Handler.java:95)
        at android.os.Looper.loop(Looper.java:154)
        at android.os.HandlerThread.run(HandlerThread.java:61)

The reason seems to be that the input size required by the model is four times larger than the image buffer the demo supplies, so I changed DIM_BATCH_SIZE = 1 to DIM_BATCH_SIZE = 4. Now the error is:

FATAL EXCEPTION: CameraBackground
    Process: android.example.com.tflitecamerademo, PID: 18241
    java.lang.IllegalArgumentException: Cannot convert an TensorFlowLite tensor with type FLOAT32 to a Java object of type [[B (which is compatible with the TensorFlowLite type UINT8)
        at org.tensorflow.lite.Tensor.copyTo(Tensor.java:36)
        at org.tensorflow.lite.Interpreter.runForMultipleInputsOutputs(Interpreter.java:122)
        at org.tensorflow.lite.Interpreter.run(Interpreter.java:93)
        at com.example.android.tflitecamerademo.ImageClassifier.classifyFrame(ImageClassifier.java:108)
        at com.example.android.tflitecamerademo.Camera2BasicFragment.classifyFrame(Camera2BasicFragment.java:663)
        at com.example.android.tflitecamerademo.Camera2BasicFragment.access$900(Camera2BasicFragment.java:69)
        at com.example.android.tflitecamerademo.Camera2BasicFragment$5.run(Camera2BasicFragment.java:558)
        at android.os.Handler.handleCallback(Handler.java:751)
        at android.os.Handler.dispatchMessage(Handler.java:95)
        at android.os.Looper.loop(Looper.java:154)
        at android.os.HandlerThread.run(HandlerThread.java:61)
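
For reference, the byte counts in the first error differ by exactly a factor of four; here is a quick check of the arithmetic (just a sketch, assuming the demo's 224x224 RGB input):

public class InputSizeCheck {
  public static void main(String[] args) {
    int bytesSupplied = 224 * 224 * 3;      // what the demo's ByteBuffer holds: 150528
    int bytesRequired = bytesSupplied * 4;  // what the model reports needing:   602112
    System.out.println(bytesSupplied + " vs " + bytesRequired);
  }
}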

My question is: how do I get a reduced MobileNet .tflite model to work with the TF Lite Android demo?

(I actually tried other things, like converting a TF frozen graph to a TF Lite model using the provided tool, even using exactly the same example code as in https://github.com/tensorflow/tensorflow/blob/master/tensorflow/contrib/lite/toco/g3doc/cmdline_examples.md, but the converted .tflite model still does not work in the Android demo.)

Can you please state a clear question in the body of the post (not just the title)? Please have a look at this. – compor
Just a note that I'm also experiencing this. Curiously, those same retrained models work fine for me when I drop them into the demo app for Tensorflow for Poets 2 Lite (which shares a lot of code with the Tensorflow-Android Lite demo referenced by the OP): github.com/googlecodelabs/tensorflow-for-poets-2/tree/master/… – Ash Eldritch

2 Answers

4 votes

The ImageClassifier.java included with the TensorFlow Lite Android demo expects a quantized model. As of right now, only one of the MobileNet models is provided in quantized form: Mobilenet 1.0 224 Quant.

To use the other (float) models, swap in the ImageClassifier.java from the TensorFlow for Poets TF Lite demo source, which is written for float models: https://github.com/googlecodelabs/tensorflow-for-poets-2/blob/master/android/tflite/app/src/main/java/com/example/android/tflitecamerademo/ImageClassifier.java

Do a diff and you'll see there are several important differences in implementation.
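
Roughly, the differences come down to how the input buffer is filled and how the output is read back. Here is a minimal sketch of the two conventions (names and sizes are illustrative, not the demo's exact code):

import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class QuantVsFloatSketch {
  static final int SIZE = 224, CHANNELS = 3, NUM_LABELS = 1001;  // illustrative values

  public static void main(String[] args) {
    int pixel = 0xFF8040;  // example packed RGB pixel

    // Quantized model: 1 byte per channel, raw 0-255 values, UINT8 output array.
    ByteBuffer quantInput =
        ByteBuffer.allocateDirect(SIZE * SIZE * CHANNELS).order(ByteOrder.nativeOrder());
    quantInput.put((byte) ((pixel >> 16) & 0xFF));
    byte[][] quantOutput = new byte[1][NUM_LABELS];

    // Float model: 4 bytes per channel, mean/std-normalized values, FLOAT32 output array.
    ByteBuffer floatInput =
        ByteBuffer.allocateDirect(4 * SIZE * SIZE * CHANNELS).order(ByteOrder.nativeOrder());
    floatInput.putFloat((((pixel >> 16) & 0xFF) - 128) / 128.0f);
    float[][] floatOutput = new float[1][NUM_LABELS];

    System.out.println(quantOutput.length + " " + floatOutput.length);
  }
}

Mixing the two is exactly what produces the errors in the question: the input size check fails first, and the UINT8 output array fails once the input sizes happen to match.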

Another option to consider is converting the float models to quantized form using TOCO: https://github.com/tensorflow/tensorflow/blob/master/tensorflow/contrib/lite/toco/g3doc/cmdline_examples.md

2 votes

I was also getting the same errors as Seedling. I created a new image classifier wrapper for the MobileNet float model, and it works fine now. You can add this class directly to the image classifier demo and use it to create the classifier in Camera2BasicFragment:

classifier = new ImageClassifierFloatMobileNet(getActivity());

Below is the image classifier wrapper class for the MobileNet float model:

package com.example.android.tflitecamerademo;

import android.app.Activity;
import java.io.IOException;

/**
 * This classifier works with the float MobileNet model.
 */
public class ImageClassifierFloatMobileNet extends ImageClassifier {

  /**
   * An array to hold inference results, to be fed into TensorFlow Lite as outputs.
   * This isn't part of the super class, because we need a primitive array here.
   */
  private float[][] labelProbArray = null;

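  // Normalization constants for the float model: map each channel from [0, 255] to roughly [-1, 1].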
  private static final int IMAGE_MEAN = 128;
  private static final float IMAGE_STD = 128.0f;

  /**
   * Initializes an {@code ImageClassifier}.
   *
   * @param activity
   */
  public ImageClassifierFloatMobileNet(Activity activity) throws IOException {
    super(activity);
    labelProbArray = new float[1][getNumLabels()];
  }

  @Override
  protected String getModelPath() {
    // The quantized model used by the original demo can be downloaded from
    // https://storage.googleapis.com/download.tensorflow.org/models/tflite/mobilenet_v1_224_android_quant_2017_11_08.zip
    // return "mobilenet_quant_v1_224.tflite";
    return "retrained.tflite";  // the retrained float model in the app's assets
  }

  @Override
  protected String getLabelPath() {
//    return "labels_mobilenet_quant_v1_224.txt";
    return "retrained_labels.txt";
  }

  @Override
  public int getImageSizeX() {
    return 224;
  }

  @Override
  public int getImageSizeY() {
    return 224;
  }

  @Override
  protected int getNumBytesPerChannel() {
    // the float model uses 4 bytes (float32) per channel
    return 4;
  }

  @Override
  protected void addPixelValue(int val) {
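    // Extract R, G, and B from the packed pixel and normalize each channel
    // from [0, 255] to approximately [-1, 1] before writing it as a float.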
    imgData.putFloat((((val >> 16) & 0xFF) - IMAGE_MEAN) / IMAGE_STD);
    imgData.putFloat((((val >> 8) & 0xFF) - IMAGE_MEAN) / IMAGE_STD);
    imgData.putFloat(((val & 0xFF) - IMAGE_MEAN) / IMAGE_STD);
  }

  @Override
  protected float getProbability(int labelIndex) {
    return labelProbArray[0][labelIndex];
  }

  @Override
  protected void setProbability(int labelIndex, Number value) {
    labelProbArray[0][labelIndex] = value.floatValue(); // float model: keep the float value (byteValue() would truncate it)
  }

  @Override
  protected float getNormalizedProbability(int labelIndex) {
    return labelProbArray[0][labelIndex];
  }

  @Override
  protected void runInference() {
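    // Feeds the preprocessed image buffer to the interpreter; the FLOAT32 results land in labelProbArray.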
    tflite.run(imgData, labelProbArray);
  }
}
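
Finally, since the constructor throws IOException, the call in Camera2BasicFragment has to sit in a try/catch. A minimal sketch of the swap (the exact spot and surrounding code in the fragment may differ in your copy of the demo):

try {
  // was: classifier = new ImageClassifier(getActivity());
  classifier = new ImageClassifierFloatMobileNet(getActivity());
} catch (IOException e) {
  Log.e(TAG, "Failed to initialize the image classifier.", e);
}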