Batch prediction on Google Cloud ML Engine throws an exception.
I have a TensorFlow model deployed on Google Cloud ML Engine. The model runs inference on an image, but the input is not a plain array: the model only accepts the base64-encoded string of the image.
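For context, my export uses a serving input function roughly like the sketch below (tensor names, image size, and preprocessing are simplified approximations, not my exact code); the decode_jpeg inside map_fn is what shows up as encoder/map/while/DecodeJpeg in the trace further down.

import tensorflow as tf

def serving_input_receiver_fn():
    # A batch of JPEG byte strings, one per instance in the input file.
    image_bytes = tf.placeholder(dtype=tf.string, shape=[None], name='image_bytes')

    def decode_and_preprocess(img_bytes):
        # Decode a single JPEG and resize it; this decode op is the one
        # reported as encoder/map/while/DecodeJpeg in the error.
        img = tf.image.decode_jpeg(img_bytes, channels=3)
        img = tf.image.convert_image_dtype(img, tf.float32)
        return tf.image.resize_images(img, [224, 224])

    images = tf.map_fn(decode_and_preprocess, image_bytes, dtype=tf.float32)
    return tf.estimator.export.ServingInputReceiver(
        features={'image': images},
        receiver_tensors={'image': image_bytes})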
My input is a text file hosted in a Google Cloud Storage bucket. The file looks like this:
{"image": "b64 encoded image string", "key": "0"}
{"image": "b64 encoded image string", "key": "1"}
{"image": "b64 encoded image string", "key": "2"}
The output should be the inference for all of the images, but instead I get the following exception:
('Exception during running the graph: Expected image (JPEG, PNG, or GIF), got unknown format starting with \'{\\"image\\": \\"/9j/4\'\n\t [[node encoder/map/while/DecodeJpeg (defined at /usr/local/lib/python2.7/dist-packages/google/cloud/ml/prediction/frameworks/tf_prediction_lib.py:210) ]]', 1)