
I am running into a problem similar to "Loading saved_model causes 'Failed to compile fragment shader' for gather op".

const MODEL_URL = './web_model/tensorflowjs_model.pb';
const WEIGHTS_URL = './web_model/weights_manifest.json';
async function predict(){
  const model = await tf.loadFrozenModel(MODEL_URL, WEIGHTS_URL);
  var input_x = tf.tensor([[2714,    0,    0,   0,    0,    0,    0,    0,    0,    0,    0,    0,    0,    0,
    0,    0,    0,    0,    0,    0,    0,    0,    0,    0,    0,    0,    0,    0,
    0,    0,    0,    0,    0,    0,    0,    0,    0,    0,    0,    0,    0,    0,
    0,    0,    0,    0,    0,    0,    0,    0]], [1, 50], 'int32');  // shape and dtype are positional arguments in tf.js
  var dropout_keep_prob = tf.scalar(1.0);  // model.execute expects tensors, not plain numbers
  var output = model.execute({dropout_keep_prob: dropout_keep_prob, input_x: input_x});
  console.log(output);
}
predict();

Here is the relevant part of my model:

self.input_x = tf.placeholder(tf.int32, [None, sequence_length], name="input_x")
self.input_y = tf.placeholder(tf.float32, [None, num_classes], name="input_y")
self.dropout_keep_prob = tf.placeholder(tf.float32, name="dropout_keep_prob")
self.scores = tf.nn.xw_plus_b(self.h_drop, W, b, name="scores")
self.predictions = tf.argmax(self.scores, 1, name="predictions")
tf.saved_model.simple_save(sess, "./saved_model",
                           inputs={"input_x": cnn.input_x, }, outputs={"predictions": cnn.predictions,"scores":cnn.scores,})

I converted my saved_model following the instructions on that page.

This is how the input is built on the Python side; the model was originally a Python script that I now want to convert to TensorFlow.js:

inputs_major = np.zeros(shape=[1,max_seq_length], dtype=np.int32)
for v in range(len(vec)):
    inputs_major[0][v] = vec[v]
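
For reference, this is roughly how I would double-check the SavedModel directly in Python before converting (a minimal sketch; the tensor names "input_x:0", "dropout_keep_prob:0" and "scores:0" are assumed from the name= arguments above, "serve" is the tag simple_save writes by default, and max_seq_length = 50 is assumed to match training):

import numpy as np
import tensorflow as tf

max_seq_length = 50  # assumed: same sequence_length used at training time

# Same zero-padded int32 input the JS code feeds.
inputs_major = np.zeros(shape=[1, max_seq_length], dtype=np.int32)
inputs_major[0][0] = 2714

with tf.Session(graph=tf.Graph()) as sess:
    # simple_save exports the graph under the "serve" tag.
    tf.saved_model.loader.load(sess, ["serve"], "./saved_model")
    scores = sess.run(
        "scores:0",
        feed_dict={"input_x:0": inputs_major,
                   "dropout_keep_prob:0": 1.0})
    print(scores)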

I updated tf.js, but loadFrozenModel has been removed, so I changed it to loadGraphModel. The error is: "Uncaught (in promise) Error: Failed to parse model JSON of response from ./web_model/tensorflowjs_model.pb. Your path contains a .pb file extension. Support for .pb models have been removed in TensorFlow.js 1.0 in favor of .json models. You can re-convert your Python TensorFlow model using the TensorFlow.js 1.0 conversion scripts or you can convert your .pb models with the 'pb2json' NPM script in the tensorflow/tfjs-converter repository."

So I tried tensorflowjs_converter 1.0.1 instead of 0.8; my TensorFlow version is 1.13.

The error is "tensorflow.python.eager.lift_to_graph.UnliftableError: Unable to lift tensor because it depends transitively on placeholder via at least one path, e.g.: IdentityN (IdentityN) <- scores (BiasAdd) <- scores/MatMul (MatMul) <- dropout/dropout/mul (Mul) <- dropout/dropout/Floor (Floor) <- dropout/dropout/add (Add) <- dropout_keep_prob (Placeholder)"
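
If I read the error correctly, scores depends on the dropout_keep_prob placeholder, which I did not export as an input, so the converter cannot fold the graph into constants. One workaround I am considering (an untested sketch; it changes how the graph is built, so the model would have to be re-saved) is to give the keep probability a default value so nothing has to be fed at inference time:

# Untested sketch: give the dropout keep-probability a default value so the
# graph can be frozen without feeding it (TF 1.x API).
self.dropout_keep_prob = tf.placeholder_with_default(
    tf.constant(1.0, dtype=tf.float32),  # default used when nothing is fed
    shape=[],
    name="dropout_keep_prob")

During training a smaller keep probability can still be fed through feed_dict, so the dropout behaviour itself should not change.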

I think this is because of how I exported my saved_model:

tf.saved_model.simple_save(sess, "./saved_model",
                           inputs={"input_x": cnn.input_x,},
                           outputs={"predictions": cnn.predictions, "scores": cnn.scores, })

So I changed my export code:

tf.saved_model.simple_save(sess, "./saved_model",
                           inputs={"input_x": cnn.input_x, "dropout_keep_prob":cnn.dropout_keep_prob,},
                           outputs={"predictions": cnn.predictions, "scores": cnn.scores, })

When saving, I get this warning: "WARNING:tensorflow:From D:\Python\Python35\lib\site-packages\tensorflow\python\saved_model\signature_def_utils_impl.py:205: build_tensor_info (from tensorflow.python.saved_model.utils_impl) is deprecated and will be removed in a future version. Instructions for updating: This function will only be available through the v1 compatibility library as tf.compat.v1.saved_model.utils.build_tensor_info or tf.compat.v1.saved_model.build_tensor_info."

Then running the converter fails with:

Limited tf.compat.v2.summary API due to missing TensorBoard installation
2019-03-20 19:43:18.894836: I tensorflow/core/grappler/devices.cc:53] Number of eligible GPUs (core count >= 8): 0 (Note: TensorFlow was not compiled with CUDA support)
2019-03-20 19:43:18.909183: I tensorflow/core/grappler/clusters/single_machine.cc:359] Starting new session
2019-03-20 19:43:18.931823: I tensorflow/core/platform/cpu_feature_guard.cc:142] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2
2019-03-20 19:43:18.989768: E tensorflow/core/grappler/grappler_item_builder.cc:636] Init node embedding/W/Assign doesn't exist in graph
Traceback (most recent call last):
  File "d:\anaconda3\lib\site-packages\tensorflow\python\grappler\tf_optimizer.py", line 43, in OptimizeGraph
    verbose, graph_id, status)
SystemError: <built-in function TF_OptimizeGraph> returned NULL without setting an error

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "d:\anaconda3\lib\runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "d:\anaconda3\lib\runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "D:\Anaconda3\Scripts\tensorflowjs_converter.exe__main__.py", line 9, in <module>
  File "d:\anaconda3\lib\site-packages\tensorflowjs\converters\converter.py", line 358, in main
    strip_debug_ops=FLAGS.strip_debug_ops)
  File "d:\anaconda3\lib\site-packages\tensorflowjs\converters\tf_saved_model_conversion_v2.py", line 271, in convert_tf_saved_model
    concrete_func)
  File "d:\anaconda3\lib\site-packages\tensorflow\python\framework\convert_to_constants.py", line 140, in convert_variables_to_constants_v2
    graph_def = _run_inline_graph_optimization(func)
  File "d:\anaconda3\lib\site-packages\tensorflow\python\framework\convert_to_constants.py", line 59, in _run_inline_graph_optimization
    return tf_optimizer.OptimizeGraph(config, meta_graph)
  File "d:\anaconda3\lib\site-packages\tensorflow\python\grappler\tf_optimizer.py", line 43, in OptimizeGraph
    verbose, graph_id, status)
  File "d:\anaconda3\lib\site-packages\tensorflow\python\framework\errors_impl.py", line 548, in __exit__
    c_api.TF_GetCode(self.status.status))
tensorflow.python.framework.errors_impl.InvalidArgumentError: Failed to import metagraph, check error log for more info.
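
The part that looks suspicious to me is "Init node embedding/W/Assign doesn't exist in graph". One thing I may try next (an untested sketch; "./runs/checkpoints/model" is a hypothetical checkpoint prefix from my training run) is to rebuild the graph in a fresh session from the training checkpoint and re-export it, so that every variable's init/Assign node is present in the MetaGraph the converter imports:

import tensorflow as tf

# Untested sketch: restore the trained graph into a fresh session and
# re-export it with simple_save (checkpoint prefix is hypothetical).
with tf.Session(graph=tf.Graph()) as sess:
    saver = tf.train.import_meta_graph("./runs/checkpoints/model.meta")
    saver.restore(sess, "./runs/checkpoints/model")

    g = sess.graph
    tf.saved_model.simple_save(
        sess, "./saved_model_clean",
        inputs={
            "input_x": g.get_tensor_by_name("input_x:0"),
            "dropout_keep_prob": g.get_tensor_by_name("dropout_keep_prob:0"),
        },
        outputs={
            "predictions": g.get_tensor_by_name("predictions:0"),
            "scores": g.get_tensor_by_name("scores:0"),
        })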

I hope you can give me some help; this problem is nearly driving me crazy. Thanks!

It looks like you are using a very old version of TF.js (0.11.2). Can you try again with the latest version (1.0.1)? – David Soergel
Also, to be clear, please re-run tensorflowjs_converter using the latest version. Thanks! – David Soergel
Thanks, I updated tf.js, but loadFrozenModel has been removed, so I changed it to loadGraphModel; the error is "Your path contains a .pb file extension. Support for .pb models have been removed in TensorFlow.js 1.0 in favor of .json models." So I tried the latest tensorflowjs_converter instead of 0.8, and my tf version is 1.13. But I get "Init node embedding/W/Assign doesn't exist in graph", SystemError: <built-in function TF_OptimizeGraph> returned NULL without setting an error, and tensorflow.python.framework.errors_impl.InvalidArgumentError: Failed to import metagraph, check error log for more info. – Jackie Swocky
I'm trying to use a keras model and giving up on tf saved_model now, thank you anyway. – Jackie Swocky

1 Answer


In the end I gave up on this approach, since the latest version has removed loadFrozenModel and support for frozen models is limited. I switched to a Keras model instead, and it works.
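
Roughly, the Keras route looks like this (a minimal sketch; "text_cnn.h5" is a hypothetical file name for the re-trained Keras model, and tensorflowjs here is the Python package, version 1.0.1):

import tensorflowjs as tfjs
from tensorflow import keras

# Load the Keras model that replaces the old TF 1.x graph (hypothetical file name).
model = keras.models.load_model("text_cnn.h5")

# Write model.json plus weight shards that TensorFlow.js 1.0 can load.
tfjs.converters.save_keras_model(model, "./web_model")

In the browser the resulting ./web_model/model.json is then loaded with tf.loadLayersModel instead of loadGraphModel.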