
I have been fighting with TensorFlow's SavedModel builder to serve my model, and I am now trying to feed data to my classifier after the model has been served.

My question is: how do I feed input to the model? I have seen the code used by Google's Inception tutorial and have tried to implement it:

    classify_inputs_tensor_info = utils.build_tensor_info(
        serialized_tf_example)
    classes_output_tensor_info = utils.build_tensor_info(classes)
    scores_output_tensor_info = utils.build_tensor_info(values)

    classification_signature = signature_def_utils.build_signature_def(
        inputs={
            signature_constants.CLASSIFY_INPUTS: classify_inputs_tensor_info
        },
        outputs={
            signature_constants.CLASSIFY_OUTPUT_CLASSES:
                classes_output_tensor_info,
            signature_constants.CLASSIFY_OUTPUT_SCORES:
                scores_output_tensor_info
        },
        method_name=signature_constants.CLASSIFY_METHOD_NAME)

From what I understand, the input is passed to a tensor called serialized_tf_example which, as the name suggests, is the input serialized to a string. It is then described with tf.FixedLenFeature (which I don't understand), parsed with tf.parse_example, and the result is assigned to x, which is used within the model. However, I would like to pass the input to a classifier that accepts arrays, and I don't know how to go about this.
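
For context, my understanding of that parsing step boils down to something like the sketch below (the feature name and shape are just illustrative, taken from my own code):

    import tensorflow as tf

    # tf.parse_example turns serialized tf.train.Example protos into dense
    # tensors; the FixedLenFeature spec gives each feature's shape and dtype.
    serialized = tf.placeholder(tf.string, shape=[None], name='tf_example')
    feature_spec = {'audio/encoded': tf.FixedLenFeature(shape=[193], dtype=tf.float32)}
    parsed = tf.parse_example(serialized, feature_spec)
    x = parsed['audio/encoded']  # float tensor of shape [batch_size, 193]

    # Round trip: build one serialized Example and feed it in.
    example = tf.train.Example(features=tf.train.Features(feature={
        'audio/encoded': tf.train.Feature(
            float_list=tf.train.FloatList(value=[0.0] * 193))
    }))

    with tf.Session() as sess:
        print(sess.run(x, feed_dict={serialized: [example.SerializeToString()]}))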

While trying to implement this, I wrote the following:

    serialized_tf_example = tf.placeholder(tf.string, name='tf_example')
    feature_configs = {
        'audio/encoded': tf.FixedLenFeature(
            shape=[193], dtype=tf.float32, default_value=input_x),
    }
    tf_example = tf.parse_example(serialized_tf_example, feature_configs)
    x = tf_example['audio/encoded']

    sess = tf.InteractiveSession()
    sess.run(tf.global_variables_initializer())

    # Define the dimensions in the feature columns
    feature_columns = [tf.contrib.layers.real_valued_column("", dimension=5)]

    classifier = tf.contrib.learn.DNNLinearCombinedClassifier(
        dnn_feature_columns=feature_columns, dnn_hidden_units=[200, 300],
        n_classes=10,
        dnn_optimizer=tf.train.GradientDescentOptimizer(
            learning_rate=0.01
        )
    )

    # run training
    classifier.fit(input_fn=get_train_inputs, steps=100)
    # testing
    accuracy_score = classifier.evaluate(input_fn=get_test_inputs, steps=10)["accuracy"]
    print('Test accuracy : ', format(accuracy_score))

    prediction = format(list(classifier.predict_classes(x, as_iterable=True)))

But x is a tensor and so cannot be read directly. When I try to use run or .eval() on it, it asks me to feed a value to serialized_tf_example:

    InvalidArgumentError (see above for traceback): You must feed a value for placeholder tensor 'tf_example' with dtype string
    [[Node: tf_example = Placeholder[dtype=DT_STRING, shape=[], _device="/job:localhost/replica:0/task:0/cpu:0"]()]]

When I use prediction = format(list(classifier.predict_classes(np.array(x), as_iterable=True))) I get:

    InvalidArgumentError (see above for traceback): Shape in shape_and_slice spec [1,200] does not match the shape stored in checkpoint: [193,200]
    [[Node: save/RestoreV2_1 = RestoreV2[dtypes=[DT_FLOAT], _device="/job:localhost/replica:0/task:0/cpu:0"](_recv_save/Const_0, save/RestoreV2_1/tensor_names, save/RestoreV2_1/shape_and_slices)]]

1 Answer


You can (and should) use classifier.predict without tf.Example. Your input_fn for training and evaluation returns x, y; you can write a predict_input_fn in the same way as the other input functions.

    prediction = next(classifier.predict_classes(input_fn=predict_input_fn))

Note that if you want to collect all predictions with list(), the input function has to be finite, i.e. it must end by signalling that the data is exhausted. Have a look at tf.estimator.inputs.numpy_input_fn.
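
For example, a minimal sketch of such a predict_input_fn, assuming the 193-element feature vector and the empty-string feature column name from your code:

    import numpy as np
    import tensorflow as tf

    # One sample to classify; replace with your real feature vector.
    sample = np.zeros((1, 193), dtype=np.float32)

    # With num_epochs=1 the input pipeline is finite, so iterating over the
    # predictions terminates cleanly.
    predict_input_fn = tf.estimator.inputs.numpy_input_fn(
        x={"": sample},   # the key must match the feature column name
        num_epochs=1,
        shuffle=False)

    prediction = next(classifier.predict_classes(input_fn=predict_input_fn))

This feeds plain arrays to the classifier and avoids the serialized tf.Example path entirely.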