I'm converting a tensorflow.keras model to an Estimator and calling estimator.export_saved_model(serving_input_receiver_fn=serving_input_receiver_fn) to get my model ready for TensorFlow Serving. Here's what my serving_input_receiver_fn looks like:
```python
def image_preprocessing(image):
    image = tf.expand_dims(image, 0)
    image = tf.image.resize_bilinear(image, [HEIGHT, WIDTH], align_corners=False)
    image = tf.squeeze(image, axis=[0])
    image = tf.cast(image, dtype=tf.uint8)
    return image
```
```python
def serving_input_receiver_fn():
    def prepare_image(image_str_tensor):
        image = tf.image.decode_jpeg(image_str_tensor, channels=CHANNELS)
        return image_preprocessing(image)

    input_ph = tf.placeholder(tf.string, shape=[None], name='image_binary')
    images_tensor = tf.map_fn(prepare_image, input_ph, back_prop=False, dtype=tf.uint8)
    images_tensor = tf.image.convert_image_dtype(images_tensor, dtype=tf.float32)
    return tf.estimator.export.ServingInputReceiver(
        {model.input_names[0]: images_tensor},
        {'image_bytes': input_ph})
```
Is there a way to keep accepting uint8 as input, but convert it to float32 and then apply a tensorflow.keras preprocessing function such as tensorflow.keras.applications.xception.preprocess_input?

I'm not sure how to normalize this input according to the mean/std expected by a tensorflow.keras.applications model. Before adding the above, my model accepted JSON-serialized lists of NumPy arrays, and I normalized them with Keras on the client side. Now that it accepts base64-encoded byte strings decoded to uint8, I'm not sure how to move the Keras normalization into this function.
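For context, this is the normalization I was doing client-side, sketched standalone in NumPy. My understanding is that Xception's preprocess_input uses "tf" mode, i.e. a pure rescale to [-1, 1] with no per-channel mean/std subtraction; the helper name below is mine, not a Keras API:

```python
import numpy as np

def xception_style_rescale(images_uint8):
    # Hypothetical helper mirroring what I believe
    # tf.keras.applications.xception.preprocess_input does in "tf" mode:
    # map uint8 pixels in [0, 255] to float32 in [-1, 1].
    x = images_uint8.astype(np.float32)
    return x / 127.5 - 1.0

batch = np.array([[0, 128, 255]], dtype=np.uint8)
print(xception_style_rescale(batch))  # 0 -> -1.0, 255 -> 1.0
```

If that understanding is right, the question boils down to where in the serving graph this rescale (or an equivalent call into the Keras preprocessing function) should happen, given the input arrives as uint8.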