
I'm serving my model using an export produced as this:

features_placeholder = tf.placeholder(tf.float32, None)
labels_placeholder = tf.placeholder(tf.float32, None)

# Training loop code
......
# Train is finished.

# Export model
tf.saved_model.simple_save(sess,param.logs_dir + 'model_export', 
            {"features": features_placeholder}, {"binary_classif": labels_placeholder})

Then, I'm making the following POST request (raw body):

{"instances" : [1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0]}

The error I get is the following:

{ "error": "You must feed a value for placeholder tensor \'Placeholder_1\' with dtype float\n\t [[Node: Placeholder_1 = Placeholder_output_shapes=[], dtype=DT_FLOAT, shape=, _device=\"/job:localhost/replica:0/task:0/device:CPU:0\"]]" }

Does anyone know what I'm doing wrong?

Could you provide the code for the three points you mentioned, so we can trace the cause of the error? - Aziz
I've edited accordingly - David Cruz
Hi @DavidCruz were you able to solve your problem? I'm experiencing a similar one. This link has my question: stackoverflow.com/questions/52151309/… - user3276768
@DavidCruz I am also getting somewhat similar error. Could you resolve it? - tired and bored dev
I've updated my question with a possible answer. Feel free to try it out - David Cruz

2 Answers


You have to make sure that the shapes of your two placeholders, features_placeholder and labels_placeholder, match the shapes of the two values you feed them (features_feed and labels_feed). A mismatch there produces this error when the feed dictionary is evaluated.
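As a sketch of what matching shapes look like (not from the original answer), assume 18 input features and a single binary label per example; the names features_feed and labels_feed are hypothetical training arrays. tf.compat.v1 is used here so the snippet also runs under TensorFlow 2.x:

```python
import numpy as np
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

# Explicit shapes instead of None: rows of 18 features, one label each.
features_placeholder = tf.placeholder(tf.float32, shape=[None, 18], name='features')
labels_placeholder = tf.placeholder(tf.float32, shape=[None, 1], name='labels')

# The fed values must have matching shapes:
features_feed = np.ones((4, 18), dtype=np.float32)  # 4 examples, 18 features each
labels_feed = np.ones((4, 1), dtype=np.float32)     # 4 labels

with tf.Session() as sess:
    out = sess.run(features_placeholder,
                   feed_dict={features_placeholder: features_feed})
    print(out.shape)  # (4, 18)
```

Feeding an array whose shape does not match the placeholder's declared shape is exactly what raises the "You must feed a value for placeholder tensor" family of errors at run time.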


For those seeking an answer to this problem, here is my attempt.

When exporting the model, the simple_save function expects references to the actual graph tensors, not the placeholders. One way to get those references is to name your tensors when defining the model, like this:

def inference(features):
    layer_1 = nn_layer(features, get_num_features(), get_num_hidden1(), 'layer1', act=tf.nn.relu)
    logits = nn_layer(layer_1, get_num_hidden1(), get_num_classes(), 'out', act=tf.identity)
    logits = tf.identity(logits, name='predictions')
    return logits

Since I've named my logits tensor 'predictions', I can now retrieve it from the graph before saving the model:

# Assumes the input placeholder was created with name='features'
features = graph.get_tensor_by_name('features:0')
predictions = graph.get_tensor_by_name('predictions:0')
tf.saved_model.simple_save(sess, param.logs_dir + 'model_export',
                           {"features": features},
                           {"predictions": predictions})

Note: the TensorFlow documentation is very sparse, especially regarding the simple_save function. This is the only way I could make it work, but I'm not 100% sure it's the correct approach.
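To tie this together: assuming the input placeholder is exported under the key "features" with a shape of [None, 18] (an assumption based on the 18 values in the question's request), each example must be its own row inside "instances", so the REST request body would look like:

```json
{
  "instances": [
    [1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0,
     1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0]
  ]
}
```

A flat list like the one in the question is interpreted by TensorFlow Serving as 18 separate scalar instances, not one 18-feature instance. The response should then contain a "predictions" key, matching the output name passed to simple_save.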