27 votes

I'm deploying a Keras model and sending the test data to the model via a Flask API. I have two files:

First, my Flask app:

# Imports needed by the snippet below
import os
import numpy as np
from flask import Flask, request, jsonify
from keras.models import model_from_json

# Let's start up the Flask application
app = Flask(__name__)

# Reload the model from JSON:
print('Load model...')
json_file = open('models/model_temp.json', 'r')
loaded_model_json = json_file.read()
json_file.close()
keras_model_loaded = model_from_json(loaded_model_json)
print('Model loaded...')

# Reload the weights from .h5 into the model
print('Load weights...')
keras_model_loaded.load_weights("models/Model_temp.h5")
print('Weights loaded...')

# URL that we'll use to make predictions using get and post
@app.route('/predict',methods=['GET','POST'])
def predict():
    data = request.get_json(force=True)
    predict_request = [data["month"],data["day"],data["hour"]] 
    predict_request = np.array(predict_request)
    predict_request = predict_request.reshape(1,-1)
    y_hat = keras_model_loaded.predict(predict_request, batch_size=1, verbose=1)
    return jsonify({'prediction': str(y_hat)}) 

if __name__ == "__main__":
    # Choose the port
    port = int(os.environ.get('PORT', 9000))
    # Run locally
    app.run(host='127.0.0.1', port=port)

Second, the file I'm using to send the JSON data to the API endpoint:

# Imports needed by the client snippet
import datetime
import json
import requests as rq

response = rq.get('api url has been removed')
data = response.json()
currentDT = datetime.datetime.now()
month = currentDT.month
day = currentDT.day
hour = currentDT.hour

url= "http://127.0.0.1:9000/predict"
post_data = json.dumps({'month': month, 'day': day, 'hour': hour,})
r = rq.post(url, data=post_data)

I'm getting this response from Flask regarding TensorFlow:

ValueError: Tensor Tensor("dense_6/BiasAdd:0", shape=(?, 1), dtype=float32) is not an element of this graph.

My Keras model is a simple 6-dense-layer model and trains with no errors.

Any ideas?

EDIT: For anyone else that might have this problem in the future, changing to a Theano backend fixed the issue. – DataGuy
Thank you so much bro!! It will save many lives. – Prashant Gupta

7 Answers

36 votes

Flask uses multiple threads. The problem you are running into is that the TensorFlow model is not loaded and used in the same thread. One workaround is to force TensorFlow to use the global default graph.

Add this after you load your model:

import tensorflow as tf

global graph
graph = tf.get_default_graph()

And inside your predict function:

with graph.as_default():
    y_hat = keras_model_loaded.predict(predict_request, batch_size=1, verbose=1)
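
Put together with the app from the question, the relevant parts might look roughly like the sketch below (a sketch only; the file names, route and request fields are taken from the question):

import os
import numpy as np
import tensorflow as tf
from flask import Flask, request, jsonify
from keras.models import model_from_json

app = Flask(__name__)

# Load the model architecture and weights once at startup
with open('models/model_temp.json', 'r') as json_file:
    keras_model_loaded = model_from_json(json_file.read())
keras_model_loaded.load_weights("models/Model_temp.h5")

# Capture the default graph the model was loaded into
global graph
graph = tf.get_default_graph()

@app.route('/predict', methods=['GET', 'POST'])
def predict():
    data = request.get_json(force=True)
    predict_request = np.array([data["month"], data["day"], data["hour"]]).reshape(1, -1)
    # Run the prediction inside the same graph the model was loaded in
    with graph.as_default():
        y_hat = keras_model_loaded.predict(predict_request, batch_size=1, verbose=1)
    return jsonify({'prediction': str(y_hat)})

if __name__ == "__main__":
    app.run(host='127.0.0.1', port=int(os.environ.get('PORT', 9000)))
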
18 votes

It's much simpler to wrap your Keras model in a class that keeps track of its own graph and session. This prevents the problems that having multiple threads/processes/models can cause, which is almost certainly the cause of your issue. While other solutions will work, this is by far the most general, scalable catch-all. Use this one:

import os
from keras.models import model_from_json
from keras import backend as K
import tensorflow as tf
import logging

logger = logging.getLogger('root')


class NeuralNetwork:
    def __init__(self):
        self.session = tf.Session()
        self.graph = tf.get_default_graph()
        # the folder in which the model and weights are stored
        self.model_folder = os.path.join(os.path.abspath("src"), "static")
        self.model = None
        # for some reason in a flask app the graph/session needs to be used in the init else it hangs on other threads
        with self.graph.as_default():
            with self.session.as_default():
                logger.info("neural network initialised")

    def load(self, file_name=None):
        """
        :param file_name: [model_file_name, weights_file_name]
        :return:
        """
        with self.graph.as_default():
            with self.session.as_default():
                try:
                    model_name = file_name[0]
                    weights_name = file_name[1]

                    if model_name is not None:
                        # load the model
                        json_file_path = os.path.join(self.model_folder, model_name)
                        json_file = open(json_file_path, 'r')
                        loaded_model_json = json_file.read()
                        json_file.close()
                        self.model = model_from_json(loaded_model_json)
                    if weights_name is not None:
                        # load the weights
                        weights_path = os.path.join(self.model_folder, weights_name)
                        self.model.load_weights(weights_path)
                    logger.info("Neural Network loaded: ")
                    logger.info('\t' + "Neural Network model: " + model_name)
                    logger.info('\t' + "Neural Network weights: " + weights_name)
                    return True
                except Exception as e:
                    logger.exception(e)
                    return False

    def predict(self, x):
        with self.graph.as_default():
            with self.session.as_default():
                y = self.model.predict(x)
        return y
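
For example, wired into a Flask app like the one in the question, usage might look roughly like this (the model/weights file names are placeholders taken from the question, and the class expects to find them under src/static):

import numpy as np
from flask import Flask, request, jsonify

app = Flask(__name__)

# Create the wrapper and load model + weights once at startup
nn = NeuralNetwork()
nn.load(["model_temp.json", "Model_temp.h5"])

@app.route('/predict', methods=['GET', 'POST'])
def predict():
    data = request.get_json(force=True)
    x = np.array([data["month"], data["day"], data["hour"]]).reshape(1, -1)
    # predict() runs inside the wrapper's own graph and session
    y_hat = nn.predict(x)
    return jsonify({'prediction': str(y_hat)})
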
5 votes

Just after loading the model, add model._make_predict_function():

# Model reload from jSON:
print('Load model...')
json_file = open('models/model_temp.json', 'r')
loaded_model_json = json_file.read()
json_file.close()
keras_model_loaded = model_from_json(loaded_model_json)
print('Model loaded...')

# Weights reloaded from .h5 inside the model
print('Load weights...')
keras_model_loaded.load_weights("models/Model_temp.h5")
print('Weights loaded...')
# Build the predict function up front so it can be used from the request-handling threads
keras_model_loaded._make_predict_function()
1 vote

It turns out this approach does not need a clear_session call and is configuration-friendly at the same time: take the graph object from the configured session (session = tf.Session(config=_config); self.graph = session.graph) and make the prediction with that graph as the default (with self.graph.as_default():). This offers a clean approach:

import tensorflow as tf
from keras.models import load_model
from keras.backend.tensorflow_backend import set_session
...
def __init__(self):
    config = self.keras_resource()
    self.init_model(config)

def init_model(self, _config, *args):
    session = tf.Session(config=_config)
    self.graph = session.graph
    #set configured session 
    set_session(session)
    self.model = load_model(file_path)  # file_path: path to the saved model file (.h5)

def keras_resource(self):
    config = tf.ConfigProto()
    config.gpu_options.allow_growth = True
    return config

def predict_target(self, to_predict):
    with self.graph.as_default():
        predict = self.model.predict(to_predict)
    return predict
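
Assuming these methods sit inside a wrapper class (ModelWrapper is a purely illustrative name here) and file_path points at your saved model file, usage might look roughly like this:

import numpy as np

wrapper = ModelWrapper()  # __init__ builds the configured session and loads the model
x = np.array([[7, 23, 14]], dtype=float)  # e.g. month, day, hour, as in the question
print(wrapper.predict_target(x))
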
1 vote

I had the same problem. It was resolved by changing from TensorFlow 1 to TensorFlow 2: just uninstall version 1 and install version 2.
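
For reference, under TF 2.x the model can be loaded through tf.keras and, with eager execution on by default, there is no graph/session juggling at predict time. A rough sketch, reusing the file names from the question:

import numpy as np
import tensorflow as tf

# TF 2.x: load architecture and weights via tf.keras; no default-graph handling needed
with open('models/model_temp.json', 'r') as f:
    model = tf.keras.models.model_from_json(f.read())
model.load_weights('models/Model_temp.h5')

y_hat = model.predict(np.array([[7, 23, 14]]))
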

0 votes

Yes, there is a bug when you predict from a model with Keras: Keras will not be able to build the graph due to some error. Try to predict from the model with the help of TensorFlow instead. Just replace this line of code:

Keras code:

features = model_places.predict( img )

TensorFlow code:

import tensorflow as tf

graph = tf.get_default_graph()

Import this library in your code and replace the prediction call:

with graph.as_default():
    features = model_places.predict( img ).tolist()

If the problem is still not solved, try to refresh the graph.

As your code is fine, running in a clean environment should solve it:

Clear the Keras cache at ~/.keras/

Run in a new environment with the right packages (this can be done easily with Anaconda)

Make sure you are in a fresh session; keras.backend.clear_session() should remove all existing TF graphs.

Keras Code:

keras.backend.clear_session()
features = model_places.predict( img )

TensorFlow Code:

import tensorflow as tf
with tf.Session() as sess:
    tf.reset_default_graph()
0 votes

The simplest solution is to use TensorFlow 2.0. Run your code in a TensorFlow 2.0 environment and it will work.

I was facing the same issue while exposing a pre-trained model via a REST server. I was loading the model at server startup and later using the loaded model to make predictions via POST/GET requests. While predicting, it generated an error because the session was not preserved between predict calls, although when I loaded the model every time a prediction was made, it worked fine.

To avoid this issue with the session, I just ran the code in a TF 2.0 environment and it ran fine.