2
votes

I'm trying to create a Keras LSTM to predict time series. My x_train has shape (3000, 15, 10) (examples, timesteps, features), y_train has shape (3000, 15, 1), and I'm trying to build a many-to-many model (10 input features at each timestep produce 1 output per timestep).

The code I'm using is this:

from keras.models import Sequential
from keras.layers import LSTM, Dropout, Dense

model = Sequential()

model.add(LSTM(
    10,
    input_shape=(15, 10),
    return_sequences=True))
model.add(Dropout(0.2))

model.add(LSTM(
    100,
    return_sequences=True))
model.add(Dropout(0.2))
model.add(Dense(1, activation='linear'))
model.compile(loss="mse", optimizer="rmsprop")
model.fit(
    X_train, y_train,
    batch_size=512, nb_epoch=1, validation_split=0.05)

However, I can't fit the model when using:

model.add(Dense(1, activation='linear'))
>> Error when checking model target: expected dense_1 to have 2 dimensions, but got array with shape (3000, 15, 1)

or when formatting it this way:

model.add(Dense(1))
model.add(Activation("linear"))
>> Error when checking model target: expected activation_1 to have 2 dimensions, but got array with shape (3000, 15, 1)

I already tried flattening the output (model.add(Flatten())) before adding the dense layer, but that just gives me ValueError: Input 0 is incompatible with layer flatten_1: expected ndim >= 3, found ndim=2. This confuses me, because my data actually is 3-dimensional, isn't it?

The code originated from https://github.com/Vict0rSch/deep_learning/tree/master/keras/recurrent

As it's many-to-many, why have you set return_sequences=False? Try setting it to True in the second LSTM. - Marcin Możejko
Hi Marcin, I changed it to True but still getting the same error. - sbz
Could you update your code snippet? And which version of Keras are you using? Are you 100% sure it's the same message? - Marcin Możejko
Updated the snippet and I'm using 1.2.2 (with Python 2.7.5). The error is Error when checking model target: expected activation_1 to have 2 dimensions, but got array with shape (3000, 15, 1) and I'm using the model.add(Dense(1)) model.add(Activation("linear")) formatting. - sbz
Now I see. You are using a relatively old version of Keras. Try: model.add(TimeDistributed(Dense(1))). - Marcin Możejko

2 Answers

2
votes

In case of Keras < 2.0: you need to wrap the Dense layer in a TimeDistributed wrapper so that it is applied element-wise to every timestep of the sequence.

In case of Keras >= 2.0: a Dense layer is applied element-wise (to the last axis) by default, so no wrapper is needed.
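A minimal sketch of the TimeDistributed fix, assuming TensorFlow's bundled Keras (tensorflow.keras) and the questioner's shapes; in Keras >= 2.0 the wrapper is optional but harmless:

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dropout, Dense, TimeDistributed

model = Sequential()
model.add(LSTM(10, input_shape=(15, 10), return_sequences=True))
model.add(Dropout(0.2))
model.add(LSTM(100, return_sequences=True))
model.add(Dropout(0.2))
# TimeDistributed applies the same Dense(1) to each of the 15 timesteps,
# so the model output matches a target of shape (samples, 15, 1).
model.add(TimeDistributed(Dense(1, activation='linear')))
model.compile(loss='mse', optimizer='rmsprop')

x = np.ones((32, 15, 10))
print(model.predict(x, verbose=0).shape)  # (32, 15, 1)
```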

0
votes

Since you updated your Keras version and your error messages changed, here is what works on my machine (Keras 2.0.x).

This works:

model = Sequential()

model.add(LSTM(10, input_shape=(15, 10), return_sequences=True))
model.add(Dropout(0.2))
model.add(LSTM(100, return_sequences=True))
model.add(Dropout(0.2))
model.add(Dense(1, activation='linear'))

This also works:

model = Sequential()

model.add(LSTM(10, input_shape=(15, 10), return_sequences=True))
model.add(Dropout(0.2))
model.add(LSTM(100, return_sequences=True))
model.add(Dropout(0.2))
model.add(LSTM(1, return_sequences=True, activation='linear'))

Testing with:

import numpy as np

x = np.ones((3000, 15, 10))
y = np.ones((3000, 15, 1))

Compiling and training with:

model.compile(optimizer='adam',loss='mse')
model.fit(x,y,epochs=4,verbose=2)
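As a sanity check that the Dense(1) variant really emits one value per timestep, here is a condensed, self-contained version of the above (a sketch assuming Keras >= 2.0 via TensorFlow's bundled Keras, with smaller dummy data for speed):

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

# In Keras >= 2.0, Dense acts on the last axis, so after a
# return_sequences=True LSTM it is applied to every timestep.
model = Sequential()
model.add(LSTM(10, input_shape=(15, 10), return_sequences=True))
model.add(Dense(1, activation='linear'))

model.compile(optimizer='adam', loss='mse')
x = np.ones((8, 15, 10))
y = np.ones((8, 15, 1))
model.fit(x, y, epochs=1, verbose=0)
print(model.predict(x, verbose=0).shape)  # (8, 15, 1)
```

The prediction shape matches the (samples, 15, 1) target, which is exactly what the error message was complaining about in Keras 1.x.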