I would like to build this type of neural network architecture: 2D-CNN + GRU. The input is a 4D tensor (batch_size, 1, 1500, 40), followed by three 2D-CNN layers (each with batch norm, ReLU, max pooling and dropout). The output of the third CNN layer is a 4D tensor (None, 120, 1500, 1). Here is my issue: how do I connect the GRU layer to an input of this shape? I tried a Reshape in Keras (so it becomes (None, 1500, 120)) and fed the result through a GRU layer, but something is wrong... Note also that my training labels are a 3D tensor (batch_size, 1500, 2). I copy here the Keras model (with the summary() call at the end):
from keras.layers import (Input, Conv2D, BatchNormalization, Dropout,
                          MaxPooling2D, Reshape, GRU, Dense)
from keras.models import Model

input_data = Input(shape=(1, 1500, 40))
x = input_data

n_filters = [32, 96, 120]
pool_sizes = [(1, 5), (1, 4), (1, 2)]
for i in range(len(n_filters)):
    x = Conv2D(filters=n_filters[i],
               kernel_size=(5, 5),
               activation='relu',
               padding='same')(x)
    x = BatchNormalization(axis=3)(x)
    x = Dropout(0.3)(x)
    x = MaxPooling2D(pool_size=pool_sizes[i],
                     data_format="channels_first")(x)
x = Reshape((1500, 120))(x)
x = GRU(units=120,
        activation='tanh',
        recurrent_activation='hard_sigmoid',
        dropout=0.3,
        recurrent_dropout=0.3)(x)
predictions = Dense(2, activation='softmax')(x)
network = Model(input_data, predictions)
network.summary()
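To make the shape flow I'm aiming for explicit, here is a minimal sketch of just the CNN-to-GRU connection. The Input here is only a stand-in for the (None, 120, 1500, 1) tensor coming out of my third CNN block, and the rest is my assumption about what is needed: return_sequences=True on the GRU so there is one output per time step (so the final Dense gives (None, 1500, 2) to match my labels), and a Permute before the Reshape so the 1500 axis stays the time axis:

input_data = Input(shape=(1,1500,40))
from keras.layers import Input, Permute, Reshape, GRU, TimeDistributed, Dense
from keras.models import Model

# Stand-in for the output of my third CNN block: shape (None, 120, 1500, 1).
cnn_out = Input(shape=(120, 1500, 1))

# Move the time axis (1500) in front of the feature axis (120), then drop the
# trailing singleton dimension -> (None, 1500, 120).
x = Permute((2, 1, 3))(cnn_out)
x = Reshape((1500, 120))(x)

# return_sequences=True keeps one GRU output per time step -> (None, 1500, 120).
x = GRU(units=120,
        activation='tanh',
        recurrent_activation='hard_sigmoid',
        dropout=0.3,
        recurrent_dropout=0.3,
        return_sequences=True)(x)

# The Dense layer is applied to every time step -> (None, 1500, 2),
# matching the label tensor (batch_size, 1500, 2).
predictions = TimeDistributed(Dense(2, activation='softmax'))(x)

sketch = Model(cnn_out, predictions)
sketch.summary()

If this is the right idea, I would just plug the real CNN output in place of the stand-in Input.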
Can you help me? Thank you