0
votes

Model summary:

Layer (type)                 Output Shape              Param #   
=================================================================
dense_1 (Dense)              (None, 195)               38220     
_________________________________________________________________
dense_2 (Dense)              (None, 400)               78400     
_________________________________________________________________
dropout_1 (Dropout)          (None, 400)               0         
_________________________________________________________________
dense_3 (Dense)              (None, 200)               80200     
_________________________________________________________________
dropout_2 (Dropout)          (None, 200)               0         
_________________________________________________________________
dense_4 (Dense)              (None, 3)                 603       
=================================================================

Here dense_4 (Dense) has the output shape (None, 3); this last layer is the output layer. Because of 'None', I am facing an error during Flask app development. This is the error in Flask:

raise ValueError("Tensor %s is not an element of this graph." % obj) ValueError: Tensor Tensor("dense_8/Softmax:0", shape=(?, 3), dtype=float32) is not an element of this graph.

I tried adding this piece of code:

global graph
graph = tf.get_default_graph()

and inside the predict API the following code:

with graph.as_default():
    y_hat = model.predict(x_test, batch_size=1, verbose=1)
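For context on why capturing the graph is needed at all: in TF1, the "default graph" is tracked per-thread, so a graph that is the default in the main thread is invisible to Flask's worker threads unless it is captured and re-entered there. A minimal pure-Python analogue of that behavior (no TensorFlow required; the `Graph` class here is an illustrative stand-in, not the real API):

```python
import threading

class Graph:
    """Illustrative stand-in for TF1's Graph: the 'default graph' is
    stored per-thread, so a graph made default in the main thread is
    not the default inside a worker thread."""
    _default = threading.local()

    def make_default(self):
        Graph._default.value = self

    @classmethod
    def get_default(cls):
        return getattr(cls._default, "value", None)

main_graph = Graph()
main_graph.make_default()        # set as default in the main thread

seen = {}

def handle_request():
    # A fresh worker thread has no per-thread default yet...
    seen["implicit"] = Graph.get_default()
    # ...so re-enter the captured graph explicitly, which is what
    # `with graph.as_default():` does in the real API.
    main_graph.make_default()
    seen["explicit"] = Graph.get_default()

t = threading.Thread(target=handle_request)
t.start()
t.join()
print(seen["implicit"] is None, seen["explicit"] is main_graph)  # True True
```

This is why wrapping `model.predict` in `with graph.as_default():` helps: the request handler runs in a different thread from the one that loaded the model.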

Later I got another error:

tensorflow.python.framework.errors_impl.FailedPreconditionError: Error while reading resource variable dense_6/kernel from Container: localhost. This could mean that the variable was uninitialized. Not found: Resource localhost/dense_6/kernel/class tensorflow::Var does not exist.
[[{{node dense_6/MatMul/ReadVariableOp}}]]

Any idea why? Full error trace:

here classifier model loaded
127.0.0.1 - - [08/Jan/2020 13:13:19] "POST /predict HTTP/1.1" 500 -
Traceback (most recent call last):
  File "C:\Users\user1\AppData\Local\Continuum\anaconda3\lib\site-packages\flask\app.py", line 2463, in __call__
    return self.wsgi_app(environ, start_response)
  File "C:\Users\user1\AppData\Local\Continuum\anaconda3\lib\site-packages\flask\app.py", line 2449, in wsgi_app
    response = self.handle_exception(e)
  File "C:\Users\user1\AppData\Local\Continuum\anaconda3\lib\site-packages\flask\app.py", line 1866, in handle_exception
    reraise(exc_type, exc_value, tb)
  File "C:\Users\user1\AppData\Local\Continuum\anaconda3\lib\site-packages\flask\_compat.py", line 39, in reraise
    raise value
  File "C:\Users\user1\AppData\Local\Continuum\anaconda3\lib\site-packages\flask\app.py", line 2446, in wsgi_app
    response = self.full_dispatch_request()
  File "C:\Users\user1\AppData\Local\Continuum\anaconda3\lib\site-packages\flask\app.py", line 1951, in full_dispatch_request
    rv = self.handle_user_exception(e)
  File "C:\Users\user1\AppData\Local\Continuum\anaconda3\lib\site-packages\flask\app.py", line 1820, in handle_user_exception
    reraise(exc_type, exc_value, tb)
  File "C:\Users\user1\AppData\Local\Continuum\anaconda3\lib\site-packages\flask\_compat.py", line 39, in reraise
    raise value
  File "C:\Users\user1\AppData\Local\Continuum\anaconda3\lib\site-packages\flask\app.py", line 1949, in full_dispatch_request
    rv = self.dispatch_request()
  File "C:\Users\user1\AppData\Local\Continuum\anaconda3\lib\site-packages\flask\app.py", line 1935, in dispatch_request
    return self.view_functions[rule.endpoint](**req.view_args)
  File "C:\Users\user1\Desktop\flask_apps\app.py", line 147, in predict
    y = model.predict(X_test, batch_size=1, verbose=1)
  File "C:\Users\user1\AppData\Roaming\Python\Python37\site-packages\tensorflow\python\keras\engine\training.py", line 1078, in predict
    callbacks=callbacks)
  File "C:\Users\user1\AppData\Roaming\Python\Python37\site-packages\tensorflow\python\keras\engine\training_arrays.py", line 363, in model_iteration
    batch_outs = f(ins_batch)
  File "C:\Users\user1\AppData\Roaming\Python\Python37\site-packages\tensorflow\python\keras\backend.py", line 3292, in __call__
    run_metadata=self.run_metadata)
  File "C:\Users\user1\AppData\Roaming\Python\Python37\site-packages\tensorflow\python\client\session.py", line 1458, in __call__
    run_metadata_ptr)
tensorflow.python.framework.errors_impl.FailedPreconditionError: Error while reading resource variable dense_6/kernel from Container: localhost. This could mean that the variable was uninitialized. Not found: Resource localhost/dense_6/kernel/class tensorflow::Var does not exist.
  [[{{node dense_6/MatMul/ReadVariableOp}}]]

Comments:

Can you add your full error trace? – Vivek Mehta
Sure @VivekMehta – Anjana K
_make_predict_function() is called only after a call to predict(). I believe this is a flaw in the Keras design: this code is not synchronous and not thread-ready. That's why I need to call this function before threading. It goes in conjunction with self.default_graph.finalize() # avoid modifications. You can find more information here: github.com/jaromiru/AI-blog/issues/2 – Anjana K
Somehow, it turns out to be a design error in the TensorFlow backend with Keras 2.3. When I downgraded to 2.2.5, this tensor issue was solved. Also used this code: # on thread 1 session = tf.Session(graph=tf.Graph()) with session.graph.as_default(): k.backend.set_session(session) model = k.models.load_model(filepath) # on thread 2 with session.graph.as_default(): k.backend.set_session(session) model.predict(x, **kwargs) – Anjana K
Could you post this as an answer? It will be useful for future visitors. – Vivek Mehta

1 Answer

0
votes

Somehow, it turns out to be a design error in the TensorFlow backend with Keras 2.3. When I downgraded to 2.2.5, this tensor issue was solved. I also used this code:

# on thread 1
session = tf.Session(graph=tf.Graph())
with session.graph.as_default():
    k.backend.set_session(session)
    model = k.models.load_model(filepath)

# on thread 2
with session.graph.as_default():
    k.backend.set_session(session)
    model.predict(x, **kwargs)
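The same idea also lies behind calling _make_predict_function() up front, as mentioned in the comments: build the expensive resource eagerly once, before request threads start, instead of relying on lazy, non-thread-safe initialization on the first predict(). A pure-Python sketch of that pattern (no TensorFlow required; the Model class and the doubling function are illustrative stand-ins, not the Keras API):

```python
import threading

class Model:
    """Illustrative stand-in for a Keras model with lazy setup."""
    def __init__(self):
        self._predict_fn = None

    def _make_predict_function(self):
        # Simulates Keras building the predict function on first use;
        # doing this lazily from several threads at once is racy.
        self._predict_fn = lambda x: x * 2

    def predict(self, x):
        if self._predict_fn is None:     # lazy path: unsafe under threads
            self._make_predict_function()
        return self._predict_fn(x)

model = Model()
model._make_predict_function()           # eager init before serving threads

results = []
threads = [threading.Thread(target=lambda: results.append(model.predict(21)))
           for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(results)  # [42, 42, 42, 42]
```

Because initialization happens once before the threads start, every request thread finds the predict function already built and never enters the racy lazy branch.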

Refer to https://github.com/jaromiru/AI-blog/issues/2