
I have code that uses the low-level TF API, and I want to add some code using Keras to it. I ran into a cryptic error in the simplest possible scenario: I have a Keras model that loads and predicts correctly, but if I call tf.reset_default_graph() before load_model, the load fails:

tf.reset_default_graph()
model = load_model("model.h5")

I'm getting: ValueError: Tensor Tensor("Placeholder:0", shape=(40, 80), dtype=float32) is not an element of this graph.

The problem reproduces with the following minimal code:

import tensorflow as tf
from keras.models import load_model

model = load_model("model.h5")
model.summary()

# Either of these resets triggers the error on the second load:
# tf.reset_default_graph()
tf.keras.backend.clear_session()

model = load_model("model.h5")  # raises the ValueError above
model.summary()
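
As a sanity check (just a quick sketch, assuming TF 1.x semantics with standalone Keras and the same model.h5), the first model's input tensor is indeed no longer part of the default graph after the reset, which is presumably what the second load_model trips over:

import tensorflow as tf
from keras.models import load_model

model = load_model("model.h5")
# Right after loading, the model's tensors live in the current default graph.
print(model.inputs[0].graph is tf.get_default_graph())  # True

tf.reset_default_graph()
# A new, empty graph is now the default; the loaded model's tensors
# still belong to the old one.
print(model.inputs[0].graph is tf.get_default_graph())  # False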

1 Answer


Debugging revealed the problem: Keras uses the default session if one exists, and if some initialization has already been done in that session, resetting the default graph confuses Keras, because it expects that the session's state will not change and that the session's graph will not be reset. I found neither of these behaviours in the documentation, and the issue cost me a few hours.

So if I want to load the model once and then use it multiple times, with calls to reset_default_graph in between, I need to keep the session, together with its graph, around like this:

import numpy as np
import tensorflow as tf
from keras.models import load_model

def load():
    # Build a dedicated graph and a session bound to it, and load the model
    # inside both so they can be kept around and re-entered later.
    with tf.Graph().as_default() as g:
        config = tf.ConfigProto(log_device_placement=False)
        config.gpu_options.allow_growth = True
        sess = tf.Session(graph=g, config=config)
        with sess.as_default():
            model = load_model("model.h5")
            model.summary()
            # Run one prediction now so Keras builds its predict function
            # against this graph and session.
            X = np.random.normal(0, 1, (20, 2))
            pred = model.predict(X[np.newaxis])
            print(pred)

            return model, sess

model, sess = load()

with sess.as_default():  # re-enter the saved session for later predictions
    X = np.random.normal(0, 1, (20, 2))
    pred = model.predict(X[np.newaxis])
    print(pred)
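
This keeps working even if some unrelated low-level TF code calls reset_default_graph in between, as long as the saved session is re-entered for each prediction. The sketch below assumes model.predict was already called once inside load(), so Keras's predict function has been built against that graph:

tf.reset_default_graph()  # e.g. done by unrelated low-level TF code

with sess.as_default():
    # The model's tensors and the session's graph were kept together,
    # so the reset above does not affect prediction.
    X = np.random.normal(0, 1, (20, 2))
    pred = model.predict(X[np.newaxis])
    print(pred)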