I have successfully trained a DNNLinearCombinedClassifier from the ai-platform-samples template and run predictions against it locally.
When I run pip freeze | grep tensorflow on my local PC, I get:
tensorflow==1.15.0
tensorflow-datasets==1.2.0
tensorflow-estimator==1.15.1
tensorflow-hub==0.6.0
tensorflow-io==0.8.0
tensorflow-metadata==0.15.1
tensorflow-model-analysis==0.15.4
tensorflow-probability==0.8.0
tensorflow-serving-api==1.15.0
When I run saved_model_cli show
for my saved model, I get this output:
The given SavedModel SignatureDef contains the following input(s):
  inputs['Sector'] tensor_info:
      dtype: DT_STRING
      shape: (-1)
      name: Placeholder_2:0
  inputs['announcement_type_simple'] tensor_info:
      dtype: DT_STRING
      shape: (-1)
      name: Placeholder_1:0
  inputs['market_cap'] tensor_info:
      dtype: DT_FLOAT
      shape: (-1)
      name: Placeholder_3:0
  inputs['sens_content'] tensor_info:
      dtype: DT_STRING
      shape: (-1)
      name: Placeholder:0
The given SavedModel SignatureDef contains the following output(s):
  outputs['all_class_ids'] tensor_info:
      dtype: DT_INT32
      shape: (-1, 3)
      name: head/predictions/Tile:0
  outputs['all_classes'] tensor_info:
      dtype: DT_STRING
      shape: (-1, 3)
      name: head/predictions/Tile_1:0
  outputs['class_ids'] tensor_info:
      dtype: DT_INT64
      shape: (-1, 1)
      name: head/predictions/ExpandDims_2:0
  outputs['classes'] tensor_info:
      dtype: DT_STRING
      shape: (-1, 1)
      name: head/predictions/str_classes:0
  outputs['logits'] tensor_info:
      dtype: DT_FLOAT
      shape: (-1, 3)
      name: dnn/logits/BiasAdd:0
  outputs['probabilities'] tensor_info:
      dtype: DT_FLOAT
      shape: (-1, 3)
      name: head/predictions/probabilities:0
Method name is: tensorflow/serving/predict
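(For reference, output like the above can be produced with an invocation along these lines; ${MODEL_DIR} is just a placeholder for the export directory, and --all lists every tag-set and SignatureDef in the SavedModel:)

saved_model_cli show --dir ${MODEL_DIR} --all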
The inputs are consistent with what I put into my JSON file, which contains the following:
{"sens_content": "RFG 201411130005A Trading Statement Rhodes Food Group", "announcement_type_simple": "trade statement", "Sector": "Consumer, Non-cyclical","market_cap": 4377615219.88}
The model inferred correctly with gcloud ai-platform local predict.
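(The local invocation was along these lines; ${MODEL_DIR} stands in for the export directory:)

gcloud ai-platform local predict \
  --model-dir=${MODEL_DIR} \
  --json-instances=data/new-data.json \
  --framework=tensorflow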
When I run gcloud ai-platform predict --model=${MODEL_NAME} --version=${MODEL_VERSION} --json-instances=data/new-data.json --verbosity debug --log-http
it creates the following POST request:
==== request start ====
uri: https://ml.googleapis.com/v1/projects/simon-teraflow-project/models/tensorflow_sens1/versions/v3:predict
method: POST
== headers start ==
Authorization: --- Token Redacted ---
Content-Type: application/json
user-agent: gcloud/270.0.0 command/gcloud.ai-platform.predict invocation-id/f01f2f4b8c494082abfc38e19499019b environment/GCE environment-version/None interactive/True from-script/False python/2.7.13 term/xterm (Linux 4.9.0-11-amd64)
== headers end ==
== body start ==
{"instances": [{"Sector": "Consumer, Non-cyclical", "announcement_type_simple": "trade statement", "market_cap": 4377615219.88, "sens_content": "RFG 201411130005A Trading Statement Rhodes Food Group"}]}
== body end ==
==== request end ====
You can see that the input is consistent with what is required. Below is the response:
Traceback (most recent call last):
  File "/usr/lib/google-cloud-sdk/lib/googlecloudsdk/calliope/cli.py", line 984, in Execute
    resources = calliope_command.Run(cli=self, args=args)
  File "/usr/lib/google-cloud-sdk/lib/googlecloudsdk/calliope/backend.py", line 798, in Run
    resources = command_instance.Run(args)
  File "/usr/lib/google-cloud-sdk/lib/surface/ai_platform/predict.py", line 110, in Run
    signature_name=args.signature_name)
  File "/usr/lib/google-cloud-sdk/lib/googlecloudsdk/api_lib/ml_engine/predict.py", line 77, in Predict
    response_body)
HttpRequestFailError: HTTP request failed. Response: {
  "error": {
    "code": 400,
    "message": "Bad Request",
    "status": "INVALID_ARGUMENT"
  }
}
ERROR: (gcloud.ai-platform.predict) HTTP request failed. Response: {
  "error": {
    "code": 400,
    "message": "Bad Request",
    "status": "INVALID_ARGUMENT"
  }
}
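(To take the gcloud client out of the equation, the same body could also be POSTed directly to the REST endpoint shown in the log above; a sketch, assuming request.json holds the {"instances": [...]} body verbatim:)

curl -X POST \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: application/json" \
  -d @request.json \
  https://ml.googleapis.com/v1/projects/simon-teraflow-project/models/tensorflow_sens1/versions/v3:predict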
I tried the same thing on the AI Platform "Test your model" page and got the same result:
[screenshot: predict on AI Platform GUI]
I have checked that the runtime version of the deployed model is 1.15, which is consistent with the local predict, and the Python versions are consistent as well.
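(For what it's worth, the deployed version's runtime and Python versions can be read back with something like the following; the relevant fields in the output are runtimeVersion and pythonVersion:)

gcloud ai-platform versions describe ${MODEL_VERSION} --model=${MODEL_NAME}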
I've searched for a similar case and have found nothing. Any advice would be greatly appreciated.