3
votes

I have a Keras model that works fine when I import its submodules (layers, backend functions) from keras. However, the exact same model breaks if I import them from tensorflow.keras.

Here is an example that illustrates the issue:

if True:
  from keras import backend as K
  from keras.models import Model
  from keras.layers import Input, Lambda
  from keras.preprocessing.image import ImageDataGenerator
else:
  from tensorflow.keras import backend as K
  from tensorflow.keras.models import Model
  from tensorflow.keras.layers import Input, Lambda
  from tensorflow.keras.preprocessing.image import ImageDataGenerator 

def ex_add(inputs):
  """Made-up example that illustrates the problem"""
  ones = K.ones(K.shape(inputs))
  return inputs + ones

img_input = Input(shape=(512, 512, 3))

ex = Lambda(ex_add)(img_input)

model = Model(inputs=[img_input], outputs=ex)
model.compile(optimizer='Adam', loss='mse')

test_generator = ... # data_generator.flow_from_directory() using ImageDataGenerator
img = next(test_generator)[0]
pconv_predict = model.predict(img)

When importing from keras, everything works fine. Importing from tensorflow.keras leads to this (when I call model.predict or model.fit_generator):

ERROR:tensorflow:==================================
Object was never used (type ):
If you want to mark it as used call its "mark_used()" method.
It was originally created here:
  File "/usr/lib/python3.6/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)

...

(long message removed for brevity)

...

---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
<ipython-input> in <module>()
     14 #model.summary()
     15 img = next(train_generator)[0][0]
---> 16 pconv_predict = model.predict(img)

/usr/local/lib/python3.6/dist-packages/tensorflow/python/keras/engine/training.py in predict(self, x, batch_size, verbose, steps, max_queue_size, workers, use_multiprocessing)
   1876     else:
   1877       return training_arrays.predict_loop(
-> 1878           self, x, batch_size=batch_size, verbose=verbose, steps=steps)
   1879
   1880   def train_on_batch(self, x, y=None, sample_weight=None, class_weight=None):

...

(long message removed for brevity)

...

/usr/local/lib/python3.6/dist-packages/tensorflow/python/ops/variables.py in is_variable_initialized(variable)
   2897     initialized, False otherwise.
   2898   """
-> 2899   return state_ops.is_variable_initialized(variable)
   2900
   2901

/usr/local/lib/python3.6/dist-packages/tensorflow/python/ops/state_ops.py in is_variable_initialized(ref, name)
    129     return gen_state_ops.is_variable_initialized(ref=ref, name=name)
    130   # Handle resource variables.
--> 131   return ref.is_initialized(name=name)
    132
    133

AttributeError: 'Tensor' object has no attribute 'is_initialized'

The problem is rooted in the Lambda layer, specifically in the K.shape(inputs) call. When I replace it with the static shape (4, 512, 512, 3), everything works fine regardless of how I import Keras. Has anyone seen this problem before, and how can I solve it?
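For reference, the static-shape variant described above can be sketched like this (a minimal, self-contained sketch using tf.keras and a smaller input shape for speed; the hard-coded batch size is the main drawback of this fix):

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras.layers import Input, Lambda
from tensorflow.keras.models import Model

# Static-shape workaround: hard-code the full shape instead of calling
# K.ones(K.shape(inputs)). A small 8x8x3 shape is used here for brevity;
# the batch size (4) must match the data exactly, which is the limitation.
def ex_add_static(inputs):
    ones = tf.ones((4, 8, 8, 3))
    return inputs + ones

img_input = Input(shape=(8, 8, 3))
model = Model(inputs=[img_input], outputs=Lambda(ex_add_static)(img_input))

img = np.zeros((4, 8, 8, 3), dtype=np.float32)
pred = model.predict(img)  # each element is 0 + 1 = 1
```

This only works because the batch size is known ahead of time; with a generator whose last batch may be smaller, the hard-coded shape would fail.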

Note: I run this code on Colaboratory. The versions are keras 2.2.4 and tensorflow.keras 2.1.6-tf.


1 Answer

2
votes

This is an open issue: https://github.com/tensorflow/tensorflow/issues/24938. It occurs only in graph mode; in eager mode the code may work. However, tf.zeros works fine in both graph and eager mode.
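A concrete workaround sketch along these lines (my own, not from the issue thread): building the ones tensor with tf.ones_like instead of K.ones(K.shape(inputs)) sidesteps the problem entirely, since the shape is inferred directly from the input tensor and no dynamically shaped K.ones node is created:

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras.layers import Input, Lambda
from tensorflow.keras.models import Model

# Workaround sketch: tf.ones_like infers the dynamic shape from the input
# tensor itself, so there is no K.ones(K.shape(...)) op to trip over.
def ex_add(inputs):
    return inputs + tf.ones_like(inputs)

img_input = Input(shape=(8, 8, 3))  # smaller than 512x512x3, for speed
model = Model(inputs=[img_input], outputs=Lambda(ex_add)(img_input))

img = np.zeros((2, 8, 8, 3), dtype=np.float32)
pred = model.predict(img)  # every element is 0 + 1 = 1
```

Unlike the static-shape fix, this keeps working for any batch size, including a smaller final batch from a generator.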