
I have tried to run a simple Keras model with one embedding layer for each of 9 inputs, but I always get one of two errors, depending on which layer follows the embedding. I tried two different representations of the data and got the same result. Here is what I have:

1. I'm using my own fit generator, which yields data:

(list of shapes of input data) -
[(25,), (25,), (25,), (25, 24), (25, 11), (25, 10), (25, 28), (25, 8), (25, 7)] 
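For context, here is a minimal sketch of a generator that yields batches with exactly these shapes. The real my_gen_v2, batch_folder, and data loading are not shown in the question, so the random data and the unused arguments below are assumptions made purely for illustration:

import numpy as np

def my_gen_v2(batch_size, batch_folder, steps):
    # Hypothetical stand-in: yields a list of 9 input arrays with the shapes
    # listed above, plus a regression target. batch_folder and steps are
    # ignored in this sketch.
    feature_lengths = [1, 1, 1, 24, 11, 10, 28, 8, 7]
    while True:
        x_batch = []
        for length in feature_lengths:
            shape = (batch_size,) if length == 1 else (batch_size, length)
            # integer indices in [0, length) so they are valid embedding inputs
            x_batch.append(np.random.randint(0, length, size=shape))
        y_batch = np.random.rand(batch_size, 1)  # target for the MSE loss
        yield x_batch, y_batch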


features = [['id1', 1], ['id2', 1], ['id3', 1], ['id4', 24],
            ['id5', 11], ['id6', 10], ['id7', 28], ['id8', 8], ['id9', 7]]

import numpy as np
from keras.layers import Input, Embedding, Flatten, Concatenate, Dense, Dropout
from keras.models import Model

embeddings = []
inputs = []
for idx, feature in enumerate(features):
    meta_input = Input(shape=(feature[1],), name=feature[0] + '_input')
    sqrt = int(np.sqrt(feature[1]))

    embedding = Embedding(feature[1], 1, input_length=1, name=feature[0] + '_embed')(meta_input)
    fl = Flatten()(embedding)
    embeddings.append(fl)
    inputs.append(meta_input)

x = Concatenate()(embeddings)
dense_meta_1 = Dense(256, activation='relu')(x)
drop_meta = Dropout(0.2)(dense_meta_1)
dense_meta_2 = Dense(1)(drop_meta)

model = Model(inputs, dense_meta_2)
model.compile(optimizer='Adam', loss='mean_squared_error', metrics=['mae'])
history = model.fit_generator(my_gen_v2(batch_size, batch_folder, steps), epochs=1,
                              steps_per_epoch=steps, max_queue_size=1)

When I use Flatten layers, I get this message (partial traceback):

InvalidArgumentError: Matrix size-incompatible: In[0]: [25,91], In[1]: [9,256] [[node dense_25/MatMul (defined at /home/human/anaconda3/lib/python3.6/site-packages/keras/backend/tensorflow_backend.py:1076) = MatMul[T=DT_FLOAT, _class=["loc:@training_7/Adam/gradients/dense_25/MatMul_grad/MatMul"], transpose_a=false, transpose_b=false, _device="/job:localhost/replica:0/task:0/device:GPU:0"](concatenate_16/concat, dense_25/kernel/read)]] [[{{node metrics_11/mean_absolute_error/Mean_1/_1219}} = _Recvclient_terminated=false, recv_device="/job:localhost/replica:0/task:0/device:CPU:0", send_device="/job:localhost/replica:0/task:0/device:GPU:0", send_device_incarnation=1, tensor_name="edge_1116_metrics_11/mean_absolute_error/Mean_1", tensor_type=DT_FLOAT, _device="/job:localhost/replica:0/task:0/device:CPU:0"]]

But when I use a Reshape layer instead:

embedding = Reshape(target_shape=(1,), name = feature[0] + '_reshape')(embedding)

I get this:

InvalidArgumentError: Input to reshape is a tensor with 600 values, but the requested shape has 25 [[node race_reshape/Reshape (defined at /home/human/anaconda3/lib/python3.6/site-packages/keras/backend/tensorflow_backend.py:1898) = Reshape[T=DT_FLOAT, Tshape=DT_INT32, _device="/job:localhost/replica:0/task:0/device:GPU:0"](race_embed_16/GatherV2, race_reshape/Reshape/shape)]] [[{{node metrics_12/mean_absolute_error/Mean_1/_1417}} = _Recvclient_terminated=false, recv_device="/job:localhost/replica:0/task:0/device:CPU:0", send_device="/job:localhost/replica:0/task:0/device:GPU:0", send_device_incarnation=1, tensor_name="edge_1098_metrics_12/mean_absolute_error/Mean_1", tensor_type=DT_FLOAT, _device="/job:localhost/replica:0/task:0/device:CPU:0"]]

There are no similar questions on Stack Overflow, only ones about image shapes. Please help me resolve this; I have spent a lot of time on it.


1 Answer


The problem was resolved by changing input_length in the Embedding layer to the input shape of the feature (feature[1] in my example).
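In other words, the embedding line in the loop becomes (a sketch of the fix described above, with the rest of the loop unchanged):

embedding = Embedding(feature[1], 1, input_length=feature[1],
                      name=feature[0] + '_embed')(meta_input)
fl = Flatten()(embedding)  # now flattens feature[1] values per sample

With input_length=feature[1], each Flatten output has length feature[1], so the concatenated vector has 1+1+1+24+11+10+28+8+7 = 91 features and the first Dense layer's kernel is built with a matching shape, instead of the 9 x 256 kernel from the error message.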