
I'm currently working on a Keras neural network for fun. I'm just learning the basics, but I can't get past this dimension problem:

So my input data (X) should be one 12x6 matrix per sample, with 12 timestamps and 6 different data values for every timestamp:

import numpy as np

X = np.zeros([2867, 12, 6])
Y = np.zeros([2867, 3])

My output (Y) should be a one-hot encoded vector of length 3 for each sample.
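Just to illustrate the format: each row of Y is meant to look like [1, 0, 0], [0, 1, 0] or [0, 0, 1]. For example it could be built with to_categorical from integer labels (class_ids below is a placeholder, not part of my actual code):

from tensorflow.keras.utils import to_categorical

# class_ids: hypothetical array of integer labels in {0, 1, 2}, one per sample
Y = to_categorical(class_ids, num_classes=3)   # shape (2867, 3)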

Now I want to feed this data through the following LSTM model.

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

model = Sequential()
model.add(LSTM(30, activation="softsign", return_sequences=True, input_shape=(12, 6)))
model.add(Dense(3))
model.summary()
model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])
model.fit(x=X, y=Y, batch_size=100, epochs=1000, verbose=2, validation_split=0.2)

The summary looks like this:

Model: "sequential"


Layer (type)                 Output Shape              Param #   
=================================================================
lstm (LSTM)                  (None, 12, 30)            4440      
_________________________________________________________________
dense (Dense)                (None, 12, 3)             93        
=================================================================
Total params: 4,533
Trainable params: 4,533
Non-trainable params: 0
_________________________________________________________________

When I run this program, I get this error: ValueError: Shapes (None, 3) and (None, 12, 3) are incompatible.

I already tried reshaping my data into a 72x1 vector, but that doesn't work either.

Maybe someone can help me shape my input data correctly :).


1 Answer


Since you used the categorical_crossentropy loss function, you probably need to define your model as follows:

model = Sequential()
model.add(LSTM(30, activation="softsign",
               return_sequences=False, input_shape=(12, 6)))
model.add(Dense(3, activation='softmax'))
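
With return_sequences=False the LSTM returns only its last output, so the model produces shape (None, 3) instead of (None, 12, 3), which matches your Y of shape (2867, 3). A minimal end-to-end sketch, assuming tensorflow.keras and the dummy arrays from your question, would look like this:

import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

X = np.zeros([2867, 12, 6])   # 2867 samples, 12 timesteps, 6 features each
Y = np.zeros([2867, 3])       # one length-3 one-hot vector per sample

model = Sequential()
# return_sequences=False: the LSTM emits only the last timestep, shape (None, 30)
model.add(LSTM(30, activation="softsign", return_sequences=False, input_shape=(12, 6)))
# softmax gives a probability distribution over the 3 classes, as categorical_crossentropy expects
model.add(Dense(3, activation='softmax'))
model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])
model.summary()
model.fit(x=X, y=Y, batch_size=100, epochs=1000, verbose=2, validation_split=0.2)

If you actually wanted a prediction for every timestep, you would instead keep return_sequences=True and supply targets of shape (2867, 12, 3).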