I deployed an object detection model on Google Cloud ML Engine. I am able to make online predictions, but it FAILS on batch prediction, with the error below in the Stackdriver logs:
Exception during running the graph: assertion failed: [Unable to decode bytes as JPEG, PNG, GIF, or BMP] [[Node: map/while/decode_image/cond_jpeg/cond_png/cond_gif/Assert_1/Assert = Assert[T=[DT_STRING], summarize=3, _device="/job:localhost/replica:0/task:0/device:CPU:0"](map/while/decode_image/cond_jpeg/cond_png/cond_gif/is_bmp, map/while/decode_image/cond_jpeg/cond_png/cond_gif/Assert_1/Assert/data_0)]]
I tried both the gcloud command and the Python API, but no luck. This is the request.json file for online prediction:
{"inputs": {"b64": "/9j/4SurRXhpZgAATU0AKgAAAAgACgEPAAIAAAAHAAAAhgEQAAIAAAAFAAAAjgEaAAUAAAABAAAAlAEbAAUAAAABAAAAnAEoAAMAAAABAAIAAAExAAA2gITAAMAAAABAAEAAIdpAAQAAAABAAAA7oglAAQAAAABAAAC0gAAAyhYaWFvbWkAAE1.....}}
It is already base64 encoded, and it works fine with online prediction:
gcloud ml-engine predict --model object_detector --version v2 --json-instances request.json
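For reference, I build request.json with a small helper along these lines (the file paths here are placeholders, not my actual files):

```python
import base64
import json

def make_request_json(image_path, out_path):
    """Base64-encode one image and write the single-instance request file."""
    with open(image_path, 'rb') as f:
        image_bytes = f.read()
    # Wrap the encoded bytes in the {"inputs": {"b64": ...}} structure
    # that the deployed model's serving signature expects.
    instance = {'inputs': {'b64': base64.b64encode(image_bytes).decode('utf-8')}}
    with open(out_path, 'w') as f:
        json.dump(instance, f)
```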
BUT batch prediction fails. Below are two rows of the batch_request.json file:
{'instances': [{"inputs": {"b64": "/9j/4SurRXhpZgAATU0AKgAAAAgACgEPAAIAHAAAAhgEQAAIAAAAFAAAAjgEaAAUAAAABAAAAlAEbAAUAAAABAAAAnAEoAAMAAAABAAIAAAExAAIAAAA1AAAApAEyAAIAAAAUA...}}]}
{'instances': [{"inputs": {"b64": "/9j/4SurRXhpZgAATU0AKgAAAAgACgEPAAIAAAAAAhgEQAAIAAAAFAAAAjgEaAAUAAAABAAAAlAEbAAUAAAABAAAAnAEoAAMAAAABAAIAAAExAAIAAAA1AAAApAEyAAIAAAAUA...}}]}
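I generate that batch file the same way, one JSON object per line, with a sketch like this (paths are again illustrative):

```python
import base64
import json

def make_batch_request(image_paths, out_path):
    """Write one JSON object per line, mirroring the batch_request.json rows above."""
    with open(out_path, 'w') as f:
        for path in image_paths:
            with open(path, 'rb') as img:
                b64 = base64.b64encode(img.read()).decode('utf-8')
            # Each line wraps a single image in an 'instances' list.
            line = {'instances': [{'inputs': {'b64': b64}}]}
            f.write(json.dumps(line) + '\n')
```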
Body of the Python API request made for the batch prediction job:
{'jobId': 'mycloud_machine_object_detector_115252',
 'predictionInput': {'dataFormat': 'TEXT',
                     'inputPaths': 'gs://my-bucket/object-detection/batch_request.json',
                     'outputPath': 'gs://my-bucket/object-detection/',
                     'region': 'us-central1',
                     'versionName': 'projects/mycloud_machine/models/object_detector/versions/v2'}}
I used the Python code from the Google Cloud docs to submit the batch request:
from googleapiclient import discovery, errors

project_id = 'projects/{}'.format(project_name)
ml = discovery.build('ml', 'v1', credentials=credentials)
request = ml.projects().jobs().create(parent=project_id,
                                      body=body_fn())
try:
    response = request.execute()
    print('Job requested.')
    # The state returned will almost always be QUEUED.
    print('state : {}'.format(response['state']))
except errors.HttpError as err:
    # Something went wrong, print out some information.
    print('There was an error getting the prediction results. '
          'Check the details:')
    print(err._get_reason())