
I have successfully trained and locally predicted DNNLinearCombinedClassifier from the ai-platform-samples template.

When I run pip freeze | grep tensorflow on my local PC:

tensorflow==1.15.0
tensorflow-datasets==1.2.0
tensorflow-estimator==1.15.1
tensorflow-hub==0.6.0
tensorflow-io==0.8.0
tensorflow-metadata==0.15.1
tensorflow-model-analysis==0.15.4
tensorflow-probability==0.8.0
tensorflow-serving-api==1.15.0

When I run saved_model_cli show for my saved model, I get this output:

The given SavedModel SignatureDef contains the following input(s):
  inputs['Sector'] tensor_info:
      dtype: DT_STRING
      shape: (-1)
      name: Placeholder_2:0
  inputs['announcement_type_simple'] tensor_info:
      dtype: DT_STRING
      shape: (-1)
      name: Placeholder_1:0
  inputs['market_cap'] tensor_info:
      dtype: DT_FLOAT
      shape: (-1)
      name: Placeholder_3:0
  inputs['sens_content'] tensor_info:
      dtype: DT_STRING
      shape: (-1)
      name: Placeholder:0
The given SavedModel SignatureDef contains the following output(s):
  outputs['all_class_ids'] tensor_info:
      dtype: DT_INT32
      shape: (-1, 3)
      name: head/predictions/Tile:0
  outputs['all_classes'] tensor_info:
      dtype: DT_STRING
      shape: (-1, 3)
      name: head/predictions/Tile_1:0
  outputs['class_ids'] tensor_info:
      dtype: DT_INT64
      shape: (-1, 1)
      name: head/predictions/ExpandDims_2:0
  outputs['classes'] tensor_info:
      dtype: DT_STRING
      shape: (-1, 1)
      name: head/predictions/str_classes:0
  outputs['logits'] tensor_info:
      dtype: DT_FLOAT
      shape: (-1, 3)
      name: dnn/logits/BiasAdd:0
  outputs['probabilities'] tensor_info:
      dtype: DT_FLOAT
      shape: (-1, 3)
      name: head/predictions/probabilities:0
Method name is: tensorflow/serving/predict

The inputs are consistent with what I provide in my JSON file, which is the following:

{"sens_content": "RFG 201411130005A Trading Statement Rhodes Food Group", "announcement_type_simple": "trade statement", "Sector": "Consumer, Non-cyclical","market_cap": 4377615219.88}

The model inferred successfully with gcloud ai-platform local predict.
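For reference, my understanding is that gcloud treats each line of the --json-instances file as one instance and wraps the parsed instances into the request body itself; a minimal sketch of that transformation (the literal here is just my instance copied from the file above):

```python
import json

# One line of the --json-instances file: a single instance as newline-delimited JSON.
line = ('{"sens_content": "RFG 201411130005A Trading Statement Rhodes Food Group", '
        '"announcement_type_simple": "trade statement", '
        '"Sector": "Consumer, Non-cyclical", "market_cap": 4377615219.88}')

# gcloud wraps the parsed instance(s) into the body the predict endpoint expects.
body = {"instances": [json.loads(line)]}
print(json.dumps(body, sort_keys=True))
```

The printed body matches what --log-http shows below, which is why I believe the request format itself is correct.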

When I run gcloud ai-platform predict --model=${MODEL_NAME} --version=${MODEL_VERSION} --json-instances=data/new-data.json --verbosity debug --log-http, it sends the following POST request:

==== request start ====
uri: https://ml.googleapis.com/v1/projects/simon-teraflow-project/models/tensorflow_sens1/versions/v3:predict
method: POST
== headers start ==
Authorization: --- Token Redacted ---
Content-Type: application/json
user-agent: gcloud/270.0.0 command/gcloud.ai-platform.predict invocation-id/f01f2f4b8c494082abfc38e19499019b environment/GCE environment-version/None interactive/True from-script/False python/2.7.13 term/xterm (Linux 4.9.0-11-amd64)
== headers end ==
== body start ==
{"instances": [{"Sector": "Consumer, Non-cyclical", "announcement_type_simple": "trade statement", "market_cap": 4377615219.88, "sens_content": "RFG 201411130005A Trading Statement Rhodes Food Group"}]}
== body end ==
==== request end ====

You can see that the input is consistent with what is required. Below is the response:

Traceback (most recent call last):
  File "/usr/lib/google-cloud-sdk/lib/googlecloudsdk/calliope/cli.py", line 984, in Execute
    resources = calliope_command.Run(cli=self, args=args)
  File "/usr/lib/google-cloud-sdk/lib/googlecloudsdk/calliope/backend.py", line 798, in Run
    resources = command_instance.Run(args)
  File "/usr/lib/google-cloud-sdk/lib/surface/ai_platform/predict.py", line 110, in Run
    signature_name=args.signature_name)
  File "/usr/lib/google-cloud-sdk/lib/googlecloudsdk/api_lib/ml_engine/predict.py", line 77, in Predict
    response_body)
HttpRequestFailError: HTTP request failed. Response: {
  "error": {
    "code": 400,
    "message": "Bad Request",
    "status": "INVALID_ARGUMENT"
  }
}

ERROR: (gcloud.ai-platform.predict) HTTP request failed. Response: {
  "error": {
    "code": 400,
    "message": "Bad Request",
    "status": "INVALID_ARGUMENT"
  }
} 

I tried the same thing in the AI Platform "Test your model" UI, with the same result:
(screenshot: predict on AI Platform GUI)

I have checked that the runtime version is 1.15, which is consistent with the local predict, and the Python versions are consistent as well.

I've searched for a similar case and have found nothing. Any advice would be greatly appreciated.
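To rule out a type mismatch on my side, I also sanity-checked each instance against the dtypes shown by saved_model_cli with a small throwaway helper (this is just my own validation script, not part of gcloud or AI Platform):

```python
import json

# Expected input dtypes, taken from the saved_model_cli output above.
EXPECTED = {
    "Sector": str,                     # DT_STRING
    "announcement_type_simple": str,   # DT_STRING
    "market_cap": float,               # DT_FLOAT
    "sens_content": str,               # DT_STRING
}

def validate_instance(instance):
    """Return a list of problems: missing keys, wrong types, or unexpected keys."""
    problems = []
    for key, expected_type in EXPECTED.items():
        if key not in instance:
            problems.append("missing key: %s" % key)
        elif not isinstance(instance[key], expected_type):
            problems.append("wrong type for %s: %s" % (key, type(instance[key]).__name__))
    for key in instance:
        if key not in EXPECTED:
            problems.append("unexpected key: %s" % key)
    return problems

instance = json.loads('{"sens_content": "RFG 201411130005A Trading Statement '
                      'Rhodes Food Group", "announcement_type_simple": '
                      '"trade statement", "Sector": "Consumer, Non-cyclical", '
                      '"market_cap": 4377615219.88}')
print(validate_instance(instance))  # an empty list means the instance matches
```

It reports no problems for my instance, so the keys and types appear to line up with the SignatureDef.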

Could you explain how you are parsing the JSON file? (Code snippet if possible.) The error is most likely caused by your JSON request not matching how you parse it within your code. Run the prediction locally using the exact JSON format sent for online prediction (top level being "instances") and you will most likely encounter the same error. – Gurkomal

1 Answer


You can try the following:

1) Save your model locally; you can use the following snippet [1], adapted to your model.

2) Test it using Docker

3) Deploy the model to GCP and make a request to it [2] (adapted to your model), using the gcloud command instead of the GCP UI.

[1]

========Code snippet===============
import tensorflow as tf
import tensorflow_hub as hub

MODEL_NAME = <MODEL NAME>
VERSION = <MODEL VERSION>
SERVE_PATH = './models/{}/{}'.format(MODEL_NAME, VERSION)

use_model = "https://tfhub.dev/google/<MODEL NAME>/<MODEL VERSION>"

with tf.Graph().as_default():
  # Load the hub module and wire a string placeholder through it.
  module = hub.Module(use_model, name=MODEL_NAME)
  text = tf.placeholder(tf.string, [None])
  embedding = module(text)

  # Initialize variables and lookup tables before exporting.
  init_op = tf.group([tf.global_variables_initializer(), tf.tables_initializer()])

  with tf.Session() as session:
    session.run(init_op)

    # Export a SavedModel with a single text input and an embedding output.
    tf.saved_model.simple_save(
        session,
        SERVE_PATH,
        inputs={"text": text},
        outputs={"embedding": embedding},
        legacy_init_op=tf.tables_initializer()
    )
========/ Code snippet===============

[2]

Replace <Project_name>, <model_name>, <bucket_name> and <model_version>

    $ gcloud ai-platform models create <model_name> --project <Project_name>
    $ gcloud beta ai-platform versions create v1 --project <Project_name> --model <model_name> --origin=/location/of/model/dir/<model_name>/<model_version> --staging-bucket gs://<bucket_name> --runtime-version=1.15 --machine-type=n1-standard-8
    $ echo '{"text": "cat"}' > instances.json
    $ gcloud ai-platform predict --project <Project_name> --model <model_name> --version v1 --json-instances=instances.json
    $ curl -X POST -v -k -H "Content-Type: application/json" -d '{"instances": [{"text": "cat"}]}'  -H "Authorization: Bearer `gcloud auth print-access-token`" "https://ml.googleapis.com/v1/projects/<Project_name>/models/<model_name>/versions/v1:predict"
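The curl call above can also be built from Python with the standard library, which makes it easy to inspect the exact request before sending it. A minimal sketch (build_predict_request is a hypothetical helper; the project/model/token values are placeholders, and the token would come from e.g. gcloud auth print-access-token):

```python
import json
import urllib.request

def build_predict_request(project, model, version, instances, token):
    """Build a urllib Request for the AI Platform online-predict REST endpoint."""
    url = ("https://ml.googleapis.com/v1/projects/%s/models/%s/versions/%s:predict"
           % (project, model, version))
    body = json.dumps({"instances": instances}).encode("utf-8")
    headers = {
        "Content-Type": "application/json",
        "Authorization": "Bearer %s" % token,
    }
    return urllib.request.Request(url, data=body, headers=headers, method="POST")

req = build_predict_request("<Project_name>", "<model_name>", "v1",
                            [{"text": "cat"}], "<token>")
# urllib.request.urlopen(req) would send it; omitted here since it needs
# real credentials and a deployed model.
```

Comparing the printed req.data against the body shown by --log-http is a quick way to confirm both paths send the same payload.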