
I'm using Keras to build a model, and writing the optimization code and everything else in TensorFlow. When I was using fairly simple layers like Dense or Conv2D, everything was straightforward. But adding a BatchNormalization layer to my Keras model complicates things.

Since the BatchNormalization layer behaves differently in the training phase and the testing phase, I figured out that I need K.learning_phase(): True in my feed_dict. But the following code is not working well: it runs without errors, but the model's performance doesn't improve.

import keras.backend as K
...
x_train, y_train = get_data()
sess.run(train_op, feed_dict={x: x_train, y: y_train, K.learning_phase(): True})

When I trained the Keras model with Keras's fit function instead, it worked well.

What should I do to train a Keras model with a BatchNormalization layer in TensorFlow?


1 Answer


It turns out this duplicates a question I hadn't seen.

I found the answer there; it consists of passing a special argument to the BatchNormalization layer call.
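A minimal sketch of what that presumably looks like, assuming the "special argument" is the `training` flag accepted when the layer is called on its input (shown here with tf.keras in eager mode for brevity; the same call-time argument applies when building a graph-mode model):

```python
import numpy as np
import tensorflow as tf

# Deterministic toy batch: 4 samples, 3 features, far from zero mean.
x = (np.arange(12, dtype="float32") * 10).reshape(4, 3)

bn = tf.keras.layers.BatchNormalization()

# training=True: normalize with the current batch's mean/variance
# (and update the moving statistics as a side effect).
y_train = bn(x, training=True)

# training=False: normalize with the stored moving statistics,
# which are still near their initial values (mean 0, variance 1) here.
y_infer = bn(x, training=False)

print(float(tf.reduce_mean(y_train)))  # close to 0: batch statistics used
print(float(tf.reduce_mean(y_infer)))  # close to the raw data mean
```

Fixing `training=True` at call time forces batch-statistics behavior regardless of `K.learning_phase()`, which is why it sidesteps the feed_dict issue in the question.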