
I have a tf.estimator model

model = tf.estimator.DNNClassifier(hidden_units=[5, 4], feature_columns=feat_cols, n_classes=2)

exported via

feature_spec = tf.feature_column.make_parse_example_spec(feat_cols)
serving_input_receiver_fn = tf.estimator.export.build_parsing_serving_input_receiver_fn(feature_spec)
export_dir = model.export_savedmodel('export', serving_input_receiver_fn)

I am able to load and use it in my notebook via

predict_fn = tf.contrib.predictor.from_saved_model(export_dir)
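For completeness, the call looks roughly like this. The feature name 'x' is a stand-in for my actual feature columns; the 'examples' key is the default input name for a parsing serving signature:

import tensorflow as tf

# Sketch only: 'x' stands in for my real feature names.
example = tf.train.Example(features=tf.train.Features(feature={
    'x': tf.train.Feature(float_list=tf.train.FloatList(value=[1.0])),
}))
predictions = predict_fn({'examples': [example.SerializeToString()]})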

When I run tensorflowjs_converter

tensorflowjs_converter --input_format=tf_saved_model --output_format=tensorflowjs ./1553869899 ./web_model

I get

ValueError: Unsupported Ops in the model before optimization
ParseExample, AsString

I've done some looking around, and I understand that ParseExample and AsString are unsupported ops. My code is very vanilla and never calls ParseExample or AsString directly; as far as I can tell, the ParseExample op comes from build_parsing_serving_input_receiver_fn itself, which wraps the graph so it accepts serialized tf.Example protos. I'm not about to rewrite parts of TensorFlow, which is what seems to be called for in other answers to unsupported-ops questions.

Question: Is there a way around this? Do I need to abandon tf.estimator and code it up via the lower-level API? Is there a different way of exporting a tf.estimator model, or of converting it, that would work?
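One idea I've been looking at but haven't verified against the converter: export with a raw serving input receiver, so the graph takes raw tensors instead of serialized tf.Example protos and never contains ParseExample. A minimal sketch, assuming a single numeric feature named 'x' (my real features differ):

import tensorflow as tf

# Sketch only: placeholders matching the raw feature tensors.
features = {
    'x': tf.placeholder(dtype=tf.float32, shape=[None, 1], name='x'),
}
raw_serving_input_receiver_fn = (
    tf.estimator.export.build_raw_serving_input_receiver_fn(features))
export_dir = model.export_savedmodel('export_raw', raw_serving_input_receiver_fn)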

Thanks.

Update: I implemented an equivalent model in Keras and was able to save it and load it in TensorFlow.js. The original question is still relevant, though less urgent. - Bill Needels
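For reference, the Keras equivalent was roughly the following; the layer sizes mirror the hidden_units above, and input_dim=1 is a stand-in for my real feature count:

import tensorflow as tf

# Sketch of the equivalent Keras model.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(5, activation='relu', input_dim=1),
    tf.keras.layers.Dense(4, activation='relu'),
    tf.keras.layers.Dense(1, activation='sigmoid'),  # binary classifier
])
model.compile(optimizer='adam', loss='binary_crossentropy')
model.save('keras_model.h5')

converted with

tensorflowjs_converter --input_format=keras keras_model.h5 ./web_model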