
My network consists of an LSTM part and a Dense part, joined together by another Dense part, and I cannot concatenate tensors of shape [(1, 8), (None, 32)]. Reshape and Flatten do not work.

Here's the architecture:

import keras
from keras.layers import Input, LSTM, Dense, BatchNormalization, Dropout
from keras.models import Model

def build_model_equal(dropout_rate=0.25):

    curve_input_1 = Input(batch_shape=(1, None, 1), name='curve_input_1')
    lstm_1 = LSTM(256, return_sequences=True, dropout=0.1)(curve_input_1)
    lstm_1 = LSTM(64, dropout=0.1)(lstm_1)
    lstm_out = Dense(8)(lstm_1)        # symbolic shape (1, 8)

    metadata_input = Input(shape=(31,), name='metadata_input')

    dense_1 = Dense(512, activation='relu')(metadata_input)
    dense_1 = BatchNormalization()(dense_1)
    dense_1 = Dropout(dropout_rate)(dense_1)

    dense_out = Dense(32)(dense_1)     # symbolic shape (None, 32)

    x = keras.layers.concatenate([lstm_out, dense_out], axis=1)

    output_hidden = Dense(64)(x)
    output_hidden = BatchNormalization()(output_hidden)
    output_hidden = Dropout(dropout_rate)(output_hidden)

    # n_classes is defined elsewhere in my script
    output = Dense(n_classes, activation='softmax', name='output')(output_hidden)

    model = Model(inputs=[curve_input_1, metadata_input], outputs=output)
    return model

When I train this model via

model.fit([x_train, x_metadata], y_train,
          validation_data=([x_valid, x_metadata_val], y_valid),
          epochs=n_epoch,
          batch_size=n_batch, shuffle=True,
          verbose=2, callbacks=[checkPoint])

I get an error:

ValueError: A Concatenate layer requires inputs with matching shapes except for the concat axis. Got inputs shapes: [(1, 8), (None, 32)]
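As far as I can tell, the mismatch comes from the batch dimensions: lstm_out inherits the hard-coded batch size 1 from batch_shape=(1, None, 1), while dense_out keeps the unspecified batch size from shape=(31,). Printing the symbolic shapes inside build_model_equal, just before the concatenate, confirms this (a quick check, assuming a TensorFlow-backed Keras):

from keras import backend as K

print(K.int_shape(lstm_out))   # (1, 8): batch dim fixed by batch_shape=(1, None, 1)
print(K.int_shape(dense_out))  # (None, 32): batch dim left open by shape=(31,)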

When I add a Reshape layer, like this:

dense_out = Dense(32)(dense_1)
dense_out = Reshape((1, 32))(dense_out)

x = keras.layers.concatenate([lstm_out, dense_out], axis=1)

I get:

ValueError: A Concatenate layer requires inputs with matching shapes except for the concat axis. Got inputs shapes: [(1, 8), (None, 1, 32)]

Passing input_shape=(32,) or input_shape=(None, 32) to the Reshape layer does not change the situation; the error and the shapes stay the same.
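As far as I understand, this is because Reshape's target shape never includes the batch axis, so it cannot change it. A minimal sketch illustrating that (the shapes are the symbolic ones reported by Keras):

from keras import backend as K
from keras.layers import Input, Reshape

x = Input(shape=(31,))    # symbolic shape (None, 31)
y = Reshape((1, 31))(x)   # symbolic shape (None, 1, 31): batch axis untouched
print(K.int_shape(y))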

Adding a Reshape to the LSTM branch, like

curve_input_1 = Input(batch_shape=(1, None, 1), name='curve_input_1')
lstm_first_1 = LSTM(256, return_sequences=True, dropout=0.1, name='lstm_first_1')(curve_input_1)
lstm_second_1 = LSTM(64, dropout=0.1, name='lstm_second_1')(lstm_first_1)
lstm_out = Dense(8)(lstm_second_1)
lstm_out = Reshape((None, 8))(lstm_out)

produces an error:

ValueError: Tried to convert 'shape' to a tensor and failed. Error: None values not supported.

Changing the concatenate axis parameter to 0, 1, or -1 doesn't help.

Changing the Dense part's input shape doesn't help either. Using metadata_input = Input(shape=(1, 31), name='metadata_input') instead of metadata_input = Input(shape=(31,), name='metadata_input') produces the same error, with shapes [(1, 8), (None, 1, 32)].

My guess is that I need to bring the data either to shape [(1, 8), (1, 32)] or to [(None, 8), (None, 32)], but the Reshape and Flatten layers didn't help.

There should be an easy way to do this that I missed.


1 Answer


I think the problem could be the use of batch_shape for the first Input and shape for the second one.

With the first input, your batch size is hardcoded as 1, and each sample has two more dimensions: None (an unspecified number of timesteps) and 1 (a single feature per timestep).

For the second input, since you are using shape, you are declaring that the batch size is unspecified and that each sample is a vector of 31 values.

Note that using shape=(31,) is the same as using batch_shape=(None, 31).
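You can check this equivalence by printing the symbolic shapes (a quick sketch):

from keras import backend as K
from keras.layers import Input

a = Input(shape=(31,))             # batch dim left unspecified
b = Input(batch_shape=(None, 31))  # the same thing, written explicitly
print(K.int_shape(a), K.int_shape(b))  # both print (None, 31)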

Aligning both works for me, at least at model-declaration time (I was unable to run fit, though, so I may be missing something and this solution may not fit your use case).

So, to summarize, you can try:

curve_input_1 = Input(batch_shape=(1, None, 1), name='curve_input_1')
metadata_input = Input(batch_shape=(1, 31), name='metadata_input')

Or:

curve_input_1 = Input(batch_shape=(None, None, 1), name='curve_input_1')
metadata_input = Input(batch_shape=(None, 31), name='metadata_input')

Which is equivalent to:

curve_input_1 = Input(shape=(None, 1, ), name='curve_input_1')
metadata_input = Input(shape=(31, ), name='metadata_input')
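For completeness, here is a sketch of your whole model with the first option (batch dimension pinned to 1). The function name build_model_fixed and the n_classes default are placeholders of mine, and note that a hard-coded batch dimension of 1 also means fit will need batch_size=1:

from keras.layers import (Input, LSTM, Dense, BatchNormalization,
                          Dropout, concatenate)
from keras.models import Model

def build_model_fixed(dropout_rate=0.25, n_classes=14):  # n_classes value is a placeholder
    # Both inputs pin the batch dimension to 1, so Concatenate
    # sees (1, 8) and (1, 32) and can join them along axis 1.
    curve_input_1 = Input(batch_shape=(1, None, 1), name='curve_input_1')
    lstm_1 = LSTM(256, return_sequences=True, dropout=0.1)(curve_input_1)
    lstm_1 = LSTM(64, dropout=0.1)(lstm_1)
    lstm_out = Dense(8)(lstm_1)                     # (1, 8)

    metadata_input = Input(batch_shape=(1, 31), name='metadata_input')
    dense_1 = Dense(512, activation='relu')(metadata_input)
    dense_1 = BatchNormalization()(dense_1)
    dense_1 = Dropout(dropout_rate)(dense_1)
    dense_out = Dense(32)(dense_1)                  # (1, 32)

    x = concatenate([lstm_out, dense_out], axis=1)  # (1, 40)

    output_hidden = Dense(64)(x)
    output_hidden = BatchNormalization()(output_hidden)
    output_hidden = Dropout(dropout_rate)(output_hidden)
    output = Dense(n_classes, activation='softmax', name='output')(output_hidden)

    return Model(inputs=[curve_input_1, metadata_input], outputs=output)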

Please let me know if this worked or led you in a good direction!