0
votes

I'm running the code from this tutorial, and when I try to make an online prediction with a JSON that contains my picture in base64 encoding, I get a message saying it expected uint8 but got a hexadecimal string instead. I already checked the JSON and its format is OK. What could be the problem?

I'm using Google's gcloud CLI for the prediction, and both my model version and runtime version are TensorFlow 1.10. I used the TensorFlow Object Detection API with Faster R-CNN.

Code to convert the images to base64 and write them into the JSON and TFRecord files:

import base64
import io
import json

from PIL import Image
import tensorflow as tf

width = 1024
height = 768
predict_instance_json = "inputs.json"
predict_instance_tfr = "inputs.tfr"

with tf.python_io.TFRecordWriter(predict_instance_tfr) as tfr_writer:
  # Open the JSON file in text mode; one JSON instance per line
  with open(predict_instance_json, "w") as fp:
    for image in ["image1.jpg", "image2.jpg"]:
      img = Image.open(image)
      img = img.resize((width, height), Image.ANTIALIAS)
      output_str = io.BytesIO()
      img.save(output_str, "JPEG")
      # b64encode returns bytes under Python 3; decode to str so
      # json.dumps accepts it
      fp.write(
          json.dumps(
              {"b64": base64.b64encode(output_str.getvalue()).decode("utf-8")})
          + "\n")
      tfr_writer.write(output_str.getvalue())
      output_str.close()
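As a side check on the JSON side of this (independent of the model issue), each line written to inputs.json must be a `{"b64": "<base64 string>"}` instance, and under Python 3 the base64 payload has to be decoded to a `str` before `json.dumps` will accept it. A minimal round-trip sketch, using a stand-in byte string instead of a real JPEG:

```python
import base64
import json

# Hypothetical raw bytes standing in for output_str.getvalue()
jpeg_bytes = b"\xff\xd8\xff\xe0fakejpegdata\xff\xd9"

# One prediction instance as written to inputs.json, one per line
instance_line = json.dumps(
    {"b64": base64.b64encode(jpeg_bytes).decode("utf-8")})

# Decoding the JSON line must recover the original bytes exactly
decoded = base64.b64decode(json.loads(instance_line)["b64"])
assert decoded == jpeg_bytes
```

If the round trip fails, the problem is in how the file was written rather than in the model.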

Command for prediction:

gcloud ml-engine predict --model=${YOUR_MODEL} --version=${YOUR_VERSION} --json-instances=inputs.json

I already tested my model locally and created a Docker container with TensorFlow Serving, and it works fine, but no success on Cloud ML.

The error reads:

"error": "Prediction failed: Error processing input: Expected uint8, got '\\xff\\xd8\\...

\\xff\\xd9' of type 'str' instead."
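Note that the bytes quoted in the error are recognizably an encoded JPEG: `\xff\xd8` and `\xff\xd9` are the JPEG start-of-image and end-of-image markers, so the service is handing the raw encoded image string to a graph whose input expects a decoded uint8 tensor. A quick illustration with a stand-in payload:

```python
# \xff\xd8 / \xff\xd9 are the JPEG start-of-image / end-of-image markers,
# so a value bracketed by them is the raw encoded JPEG, not a decoded tensor.
jpeg_bytes = b"\xff\xd8\xff\xe0fakejpeg\xff\xd9"  # hypothetical stand-in
assert jpeg_bytes.startswith(b"\xff\xd8")
assert jpeg_bytes.endswith(b"\xff\xd9")
```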
Please give the full traceback - roganjosh
Not as a comment; please edit the original question to include it, properly formatted - roganjosh

1 Answer

2
votes

The issue was with the way the graph was exported: it is very important to pass the --input_type flag when calling the export_inference_graph.py script; otherwise the exported model takes its input as UINT8 instead of an encoded-image string.

--input_type encoded_image_string_tensor
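For reference, a sketch of the export call with the flag set, using the Object Detection API's export_inference_graph.py; the config, checkpoint, and output paths here are placeholders for your own files:

```shell
python export_inference_graph.py \
    --input_type encoded_image_string_tensor \
    --pipeline_config_path path/to/pipeline.config \
    --trained_checkpoint_prefix path/to/model.ckpt-NNNN \
    --output_directory path/to/exported_model
```

With --input_type encoded_image_string_tensor the exported SavedModel accepts the base64 `{"b64": ...}` instances that gcloud ml-engine predict sends.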