
I am building my own image encoder based on the "Manipulate complex graph topologies" example (a model that takes two separate inputs). My TensorFlow version is 2.2.0.

The model compiled successfully (see the summary at the end).

My input data looks like the following:

train_top_reduce, train_left_reduce = ( <list of numpy - 2d matrix>, <list of numpy - 2d matrix>)
train_x = {"top_reduce":train_top_reduce, "left_reduce":train_left_reduce}
train_y = <list of numpy.asarray( PIL's image ) >
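
For illustration, the data is built roughly like this (shapes, sizes and dtypes here are placeholders, not the real ones):

import numpy as np
from PIL import Image

# Plain Python lists: one 2D numpy matrix per sample for each input,
# and one numpy array (converted PIL image) per sample for the target.
train_top_reduce = [np.zeros((256, 50), dtype="float32") for _ in range(8)]
train_left_reduce = [np.zeros((256, 50), dtype="float32") for _ in range(8)]
train_x = {"top_reduce": train_top_reduce, "left_reduce": train_left_reduce}
train_y = [np.asarray(Image.new("L", (256, 256))) for _ in range(8)]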

When I try:

history = model.fit(train_x, train_y)

I get an exception:

AttributeError: 'tuple' object has no attribute '_keras_mask'

c:\python\python37\lib\site-packages\tensorflow\python\keras\engine\training.py:571 train_function  *
    outputs = self.distribute_strategy.run(

c:\python\python37\lib\site-packages\tensorflow\python\distribute\distribute_lib.py:951 run  **
    return self._extended.call_for_each_replica(fn, args=args, kwargs=kwargs)

c:\python\python37\lib\site-packages\tensorflow\python\distribute\distribute_lib.py:2290 call_for_each_replica
    return self._call_for_each_replica(fn, args, kwargs)

c:\python\python37\lib\site-packages\tensorflow\python\distribute\distribute_lib.py:2649 _call_for_each_replica
    return fn(*args, **kwargs)

c:\python\python37\lib\site-packages\tensorflow\python\keras\engine\training.py:531 train_step  **
    y_pred = self(x, training=True)

c:\python\python37\lib\site-packages\tensorflow\python\keras\engine\base_layer.py:927 __call__
    outputs = call_fn(cast_inputs, *args, **kwargs)

c:\python\python37\lib\site-packages\tensorflow\python\keras\engine\network.py:719 call
    convert_kwargs_to_constants=base_layer_utils.call_context().saving)

c:\python\python37\lib\site-packages\tensorflow\python\keras\engine\network.py:832 _run_internal_graph
    input_t._keras_mask = mask

Model summary:


__________________________________________________________________________________________________
Layer (type)                    Output Shape         Param #     Connected to
==================================================================================================
top_reduce (InputLayer)         [(None, None, 256, 5 0
__________________________________________________________________________________________________
left_reduce (InputLayer)        [(None, None, 256, 5 0
__________________________________________________________________________________________________
top_dence (Dense)               (None, None, 256, 32 1632        top_reduce[0][0]
__________________________________________________________________________________________________
left_dence (Dense)              (None, None, 256, 32 1632        left_reduce[0][0]
__________________________________________________________________________________________________
concatenate (Concatenate)       (None, None, 256, 64 0           top_dence[0][0]
                                                                 left_dence[0][0]
__________________________________________________________________________________________________
conv2d (Conv2D)                 (None, None, 256, 32 2080        concatenate[0][0]
==================================================================================================

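For reference, here is a minimal sketch that reproduces this topology. The layer names are taken from the summary; the input feature size (50) and the 1x1 convolution kernel are guesses reconstructed from the Param # column, not the original code:

import tensorflow as tf
from tensorflow.keras import layers

# Two inputs with variable height, width 256 and 50 features
# (Dense(32) with 1632 params implies 50 inputs: (50 + 1) * 32 = 1632).
top_in = tf.keras.Input(shape=(None, 256, 50), name="top_reduce")
left_in = tf.keras.Input(shape=(None, 256, 50), name="left_reduce")

top = layers.Dense(32, name="top_dence")(top_in)
left = layers.Dense(32, name="left_dence")(left_in)

# Concatenate along the channel axis: 32 + 32 = 64 channels.
x = layers.Concatenate(name="concatenate")([top, left])

# 1x1 convolution: 1 * 1 * 64 * 32 + 32 = 2080 params, matching the summary.
out = layers.Conv2D(32, kernel_size=1, name="conv2d")(x)

model = tf.keras.Model(inputs=[top_in, left_in], outputs=out)
model.compile(optimizer="adam", loss="mse")
model.summary()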

1 Answer


After some blind trial and error I found that the problem was the <list of numpy - 2d matrix> inputs. I replaced each Python list with the result of numpy.stack(list_of_matrix), which converts a list of 2D matrices into a single numpy array with one extra leading dimension.
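
A minimal sketch of the fix (variable names as in the question):

import numpy as np

# numpy.stack turns a list of N arrays of shape S into one array of shape (N, *S),
# which Keras can consume directly instead of a plain Python list.
train_x = {
    "top_reduce": np.stack(train_top_reduce),
    "left_reduce": np.stack(train_left_reduce),
}
train_y = np.stack(train_y)

history = model.fit(train_x, train_y)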