
TL;DR: I get a ValueError when running

tf.contrib.lite.TocoConverter.from_saved_model()

Aims: I am trying to convert a TensorFlow SavedModel to .tflite for deployment on mobile devices via Firebase. I can train the model and export a SavedModel, but I am having trouble converting it to .tflite with the Python TOCO interface. Any help would be greatly appreciated. I would also appreciate any comment on whether the tflite conversion will capture the hub.text_embedding_column() input processing that I rely on: will the mobile deployment execute this with raw input text, or do I need to deploy that part separately?

Question: here is the code I am running:

INPUTS:

import tensorflow as tf
import tensorflow_hub as hub

train_input_fn = tf.estimator.inputs.pandas_input_fn(
    train_df, train_df["target_var"], num_epochs=None, shuffle=True
)

predict_train_input_fn = tf.estimator.inputs.pandas_input_fn(
    train_df, train_df["target_var"], shuffle=False
)

predict_test_input_fn = tf.estimator.inputs.pandas_input_fn(
    test_df, test_df["target_var"], shuffle=False)

embedded_text_feature_column = hub.text_embedding_column(
    key="text", 
    module_spec="https://tfhub.dev/google/nnlm-en-dim128/1"
)

TRAIN AND EVALUATE:

estimator = tf.estimator.DNNClassifier(
    hidden_units=[500, 100],
    feature_columns=[embedded_text_feature_column],
    n_classes=2,
    optimizer=tf.train.AdagradOptimizer(learning_rate=0.003),
    model_dir="my-model"
)

estimator.train(input_fn=train_input_fn, steps=1000)

train_eval_result = estimator.evaluate(input_fn=predict_train_input_fn)
test_eval_result = estimator.evaluate(input_fn=predict_test_input_fn)
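For reference, evaluate() returns a dict of metrics; with the default binary classification head these include an "accuracy" key, so a minimal sanity check (a sketch, assuming the standard head metrics) looks like:

# Sketch: read the accuracy metric out of the evaluation result dicts.
print("Training set accuracy:", train_eval_result["accuracy"])
print("Test set accuracy:", test_eval_result["accuracy"])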

SAVE MODEL:

feature_spec = tf.feature_column.make_parse_example_spec([embedded_text_feature_column])

serve_input_fun = tf.estimator.export.build_parsing_serving_input_receiver_fn(
    feature_spec,
    default_batch_size=None
)

estimator.export_savedmodel(
    export_dir_base="my-model",
    serving_input_receiver_fn=serve_input_fun,
    as_text=False,
    checkpoint_path="my-model/model.ckpt-1000",
)
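Note that build_parsing_serving_input_receiver_fn() creates a tf.string placeholder named input_example_tensor that expects serialized tf.Example protos, so that string tensor is what the exported serving signature presents to any converter. A minimal sketch for inspecting the exported signature (using the timestamped export directory from my run):

import tensorflow as tf

saved_model_dir = "my-model/1529320265/"

with tf.Session(graph=tf.Graph()) as sess:
    # Load the SavedModel with the "serve" tag and read its default signature.
    meta_graph = tf.saved_model.loader.load(
        sess, [tf.saved_model.tag_constants.SERVING], saved_model_dir)
    signature = meta_graph.signature_def["serving_default"]
    for name, tensor_info in signature.inputs.items():
        print("input :", name, tensor_info.name, tf.DType(tensor_info.dtype))
    for name, tensor_info in signature.outputs.items():
        print("output:", name, tensor_info.name, tf.DType(tensor_info.dtype))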

CONVERT MODEL:

converter = tf.contrib.lite.TocoConverter.from_saved_model("my-model/1529320265/") 
tflite_model = converter.convert()

Error

When running the last line I get the following error:

ValueError: Tensors input_example_tensor:0 not known type tf.string

And the full trace is:

ValueError                                Traceback (most recent call last)
<ipython-input> in <module>()
      1 converter = tf.contrib.lite.TocoConverter.from_saved_model("my-model/1529320265/")
----> 2 tflite_model = converter.convert()

/media/rmn/data/projects/anaconda3/envs/monily_tf19/lib/python3.6/site-packages/tensorflow/contrib/lite/python/lite.py in convert(self)
    307         reorder_across_fake_quant=self.reorder_across_fake_quant,
    308         change_concat_input_ranges=self.change_concat_input_ranges,
--> 309         allow_custom_ops=self.allow_custom_ops)
    310     return result
    311

/media/rmn/data/projects/anaconda3/envs/monily_tf19/lib/python3.6/site-packages/tensorflow/contrib/lite/python/convert.py in toco_convert(input_data, input_tensors, output_tensors, inference_type, inference_input_type, input_format, output_format, quantized_input_stats, default_ranges_stats, drop_control_dependency, reorder_across_fake_quant, allow_custom_ops, change_concat_input_ranges)
    204   else:
    205     raise ValueError("Tensors %s not known type %r" % (input_tensor.name,
--> 206                                                        input_tensor.dtype))
    207
    208   input_array = model.input_arrays.add()

ValueError: Tensors input_example_tensor:0 not known type tf.string

Details

train_df and test_df are pandas DataFrames, each consisting of a single input text column ("text") and a binary target variable ("target_var"). I am using Python 3.6.5 and TensorFlow r1.9.
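For anyone who wants to reproduce this without my data, a hypothetical stand-in for the two DataFrames (column names taken from the code above, contents made up) would look like:

import pandas as pd

# Hypothetical stand-in data: a "text" column of raw strings and a binary
# "target_var" label, matching the columns referenced in the code above.
train_df = pd.DataFrame({
    "text": ["great product", "terrible service", "works as expected", "would not buy again"],
    "target_var": [1, 0, 1, 0],
})
test_df = pd.DataFrame({
    "text": ["excellent", "awful"],
    "target_var": [1, 0],
})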

1 Answer


This issue is fixed on TensorFlow's master branch (commit d3931c8). To pick up the fix, build and install a pip package from source by following TensorFlow's documentation: https://www.tensorflow.org/install/install_sources.
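Once a build containing that commit is installed, the conversion from the question should be retried as-is. The sketch below assumes the contrib.lite API is unchanged in the newer build, and also writes the resulting flatbuffer to disk (the output filename is arbitrary) so it can be uploaded for the Firebase/mobile deployment step:

import tensorflow as tf

# Re-run the same conversion call from the question with the newer build.
converter = tf.contrib.lite.TocoConverter.from_saved_model("my-model/1529320265/")
tflite_model = converter.convert()

# convert() returns the serialized TFLite model as bytes; write it out.
with open("model.tflite", "wb") as f:
    f.write(tflite_model)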