8 votes

I have a sequence input in this shape: (6000, 64, 100, 50)

The 6000 is just the number of sample sequences. Each sequence is 64 timesteps long.

I plan to feed this input into an LSTM using Keras.

I set up my input this way:

input = Input(shape=(64, 100, 50))

This gives me an input shape of (?, 64, 100, 50)

However, when I feed this input into my LSTM like so:

x = LSTM(256, return_sequences=True)(input)

I get this error:

Input 0 is incompatible with layer lstm_37: expected ndim=3, found ndim=4

This would have worked if my input shape were something like (?, 64, 100), but not when I have a 4th dimension.

Does this mean that an LSTM can only take 3-dimensional input? How can I feed a 4- or even higher-dimensional input into an LSTM using Keras?


2 Answers

4 votes

The answer is you can't.

The Keras documentation provides the following information for recurrent layers:

Input shape

3D tensor with shape (batch_size, timesteps, input_dim).

In your case you have 64 timesteps, where each step has shape (100, 50). The easiest way to get the model working is to flatten each step into a single vector of length 100*50.

Numpy provides an easy function to do so:

import numpy

X = numpy.zeros((6000, 64, 100, 50), dtype=numpy.uint8)   # dummy data with the question's shape
X = numpy.reshape(X, (6000, 64, 100 * 50))                 # each timestep becomes a flat 5000-dim vector

Whether this is reasonable or not depends highly on your data.
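
For completeness, here is a minimal sketch of how the reshaped data could then be fed to the question's LSTM(256) layer. The optimizer and loss passed to compile ('adam', 'mse') are just placeholders, not something the question prescribes:

from keras.layers import Input, LSTM
from keras.models import Model

inp = Input(shape=(64, 100 * 50))            # 64 timesteps, each a flat 5000-dim vector
x = LSTM(256, return_sequences=True)(inp)    # input is now ndim=3, so the LSTM accepts it

model = Model(inp, x)
model.compile('adam', 'mse')                 # placeholder optimizer/loss
model.summary()

model.summary() should then report an LSTM output shape of (None, 64, 256).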

2 votes

You can also consider wrapping the LSTM in TimeDistributed(LSTM(...)):

from keras.layers import Input, LSTM, TimeDistributed
from keras.models import Model

inp = Input(shape=(64, 100, 50))
x = TimeDistributed(LSTM(256, return_sequences=True))(inp)   # apply the LSTM to each of the 64 slices of shape (100, 50)

model = Model(inp, x)
model.compile('adam', 'mse')
model.summary()
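
With return_sequences=True, model.summary() should report an output shape of (None, 64, 100, 256): the LSTM runs independently over each of the 64 slices of shape (100, 50), rather than over the 64 timesteps themselves, so pick this approach only if that is what you want.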