I'm trying to remotely serve a model trained from this tutorial on our own CSV dataset, where each row is composed of 4 int values (the last being the label), using TensorFlow Serving.
TensorFlow Serving is running remotely in Docker, and my dev environment is Windows with Python 3.6.
I export my model using the following code, similar to the example given here:
import tensorflow as tf

# Features expected at serving time (the label column is not part of the request)
feature_spec = {'firstInt': tf.FixedLenFeature([1], tf.int64),
                'secondInt': tf.FixedLenFeature([1], tf.int64),
                'thirdInt': tf.FixedLenFeature([1], tf.int64)}

def serving_input_receiver_fn():
    # The served model receives serialized tf.Example protos as strings
    serialized_tf_example = tf.placeholder(dtype=tf.string,
                                           shape=[None],
                                           name='input_example_tensor')
    receiver_tensors = {'examples': serialized_tf_example}
    # Parse the serialized protos into the feature tensors the model expects
    features = tf.parse_example(serialized_tf_example, feature_spec)
    return tf.estimator.export.ServingInputReceiver(features, receiver_tensors)

# 'classifier' is the Estimator trained earlier in the tutorial
classifier.export_savedmodel(
    '.\\SaveLC\\save_dir',
    serving_input_receiver_fn=serving_input_receiver_fn)
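Since the receiver function above parses serialized tf.Example protos passed under the 'examples' key, I'm assuming that (once the model is actually loaded) a prediction request would have to be built roughly like this; the host, port and model name are only guesses on my part:

import base64
import json
import requests
import tensorflow as tf

# Build a serialized tf.Example that matches the feature_spec used at export time
example = tf.train.Example(features=tf.train.Features(feature={
    'firstInt': tf.train.Feature(int64_list=tf.train.Int64List(value=[1])),
    'secondInt': tf.train.Feature(int64_list=tf.train.Int64List(value=[2])),
    'thirdInt': tf.train.Feature(int64_list=tf.train.Int64List(value=[3])),
}))

# String tensors carrying binary data must be base64-encoded in the REST API's JSON
payload = {'instances': [
    {'examples': {'b64': base64.b64encode(example.SerializeToString()).decode('utf-8')}}
]}

# Host, port and model name ('save_dir_new') are assumptions on my part
resp = requests.post('http://localhost:8501/v1/models/save_dir_new:predict',
                     data=json.dumps(payload))
print(resp.json())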
I'm honestly not sure what to expect as a result, but in this guide, the half_plus_two model returns a predictions array like this
"predictions": [
2.5,
3,
4.5
]
when it's sent a POST request like this
{"instances": [1.0, 2.0, 5.0]}
So I imagine something similar should be returned, but instead I'm told the servable doesn't even exist. Of note, the same thing seems to happen with other models provided in the guide as well.
"error": "Servable not found for request: Latest(save_dir_new)"
What is this servable, and how is it meant to be exported, given that my current method doesn't seem to work?
Thanks