6
votes

I tried to run https://glitch.com/~tar-understood-exoplanet, but the model would fail to load and I wasn't able to enable the webcam.

Has anyone had the same issue?

While the program is running, in the console I get the following:

tfjs:2 Uncaught (in promise) Error: The dtype of dict['image_tensor'] provided in model.execute(dict) must be int32, but was float32
    at Object.b [as assert] (tfjs:2)
    at tfjs:2
    at Array.forEach (<anonymous>)
    at t.checkInputShapeAndType (tfjs:2)
    at t.<anonymous> (tfjs:2)
    at tfjs:2
    at Object.next (tfjs:2)
    at tfjs:2
    at new Promise (<anonymous>)
    at Zv (tfjs:2)

I'm on a MacBook Pro, and some other people on Windows also had issues running the model. We also tried different browsers, Safari and Chrome.

SUCCESS! After switching to coco-ssd 2.0.2:

I specified version 2.0.2 on line 62 as follows:

<script src="https://cdn.jsdelivr.net/npm/@tensorflow-models/coco-ssd@2.0.2"></script>
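
For anyone who wants to verify the fix, this is roughly how the example consumes the library once the updated script tag above is in place. This is only a sketch, not the exact glitch code; the 'webcam' element id is an assumption:

    // Sketch: load the updated coco-ssd model and detect objects in the webcam feed.
    // cocoSsd is the global exposed by the script tag above.
    async function runDetection() {
      const video = document.getElementById('webcam'); // assumed element id
      const model = await cocoSsd.load();               // picks up the 2.0.2 build
      const predictions = await model.detect(video);    // [{bbox, class, score}, ...]
      console.log(predictions);
    }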
3
Thanks for reporting this Claire - I shall ask if any of the team know the answer on Monday! – Jason Mayes
For Chrome, it could be caused by the SameSite cookie policy they rolled out on 04/04. – RainCast

3 Answers

3
votes

Same error here; it just started occurring Friday night (04/03/2020). The TF model had worked fine for the past few weeks.

3
votes

This is caused by the warmup run of coco-ssd, which uses a tf.zeros tensor. The default dtype for tf.zeros is 'float32' in the recent TFJS release. I have put out a new version with fixes. It should work if you use the latest version of coco-ssd (2.0.2) in the glitch example (index.html), as follows.

    <!-- Load the coco-ssd model to use to recognize things in images -->
    <script src="https://cdn.jsdelivr.net/npm/@tensorflow-models/coco-ssd@2.0.2"></script>
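
To make the dtype point concrete, here is a minimal illustration of why the warmup broke; the shape is illustrative and this is not the library's actual source, just what the fix amounts to:

    // tf.zeros defaults to float32, which the SSD graph rejects for its image input:
    const badWarmup = tf.zeros([1, 300, 300, 3]);           // dtype 'float32' -> triggers the error
    // Requesting int32 explicitly satisfies model.execute(), presumably what the patched version does:
    const goodWarmup = tf.zeros([1, 300, 300, 3], 'int32'); // dtype 'int32'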
0
votes

I got the same error.

My scenario: I trained a pre-trained model from the TensorFlow model zoo via transfer learning with the TensorFlow API, exported it as a SavedModel (model.pb file), and converted it into TFJS format (model.json and sharded .bin files).

When I tried running this model.json in JavaScript (on the web), it gave the error below:

Uncaught (in promise) Error: The dtype of dict['input_tensor'] provided in model.execute(dict) must be int32, but was float32

When I tried someone else's working converted model (model.json and sharded .bin files) in my JavaScript (web) code, it worked.

Conclusion: there is something wrong with my converted model. I converted it using tensorflowjs_converter. My original model (model.pb) also works accurately in Python.

I'm still trying to convert my model.pb file with different tensorflowjs_converter versions, as it seems to be a converter versioning issue.
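
In case it helps others hitting the same error with their own converted model, one workaround is to cast the input to int32 before running the graph. This is a sketch under the assumptions that the input signature is named 'input_tensor' (as in the error above) and that the converted files sit under web_model/:

    // Load the converted graph model and feed it an int32 image tensor.
    async function detectOnImage(imgElement) {
      const model = await tf.loadGraphModel('web_model/model.json'); // path is an assumption
      const input = tf.browser.fromPixels(imgElement)  // int32 pixel tensor
        .expandDims(0)                                 // add batch dimension
        .toInt();                                      // keep dtype as int32
      const output = await model.executeAsync({ input_tensor: input });
      input.dispose();
      return output;
    }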