I am using an LSTM in Keras, with a Reshape layer before it, in the hope that I don't have to specify the shape on the LSTM layer itself.
The input is 86400 x 6:
86400 time steps (one per second) covering the 2 months, and 6 metrics/labels I'm measuring throughout that period.
So far I have:
model = tf.keras.models.Sequential()
model.add(tf.keras.layers.Reshape((86400, 1, 6), input_shape=(86400, 6)))
model.add(tf.keras.layers.LSTM(128, activation='relu',
                               input_shape=(x_train.shape), return_sequences=True))
model.add(tf.keras.layers.Dense(10, activation='softmax'))
which throws an error:
ValueError: Input 0 of layer lstm is incompatible with the layer: expected ndim=3, found ndim=4. Full shape received: [None, 86400, 1, 6]
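To make the dimension count concrete, here is a NumPy sketch of the shapes as Keras would see them (the batch size of 2 is just a placeholder, not my real batch size):

```python
import numpy as np

batch = np.zeros((2, 86400, 6))             # what the model receives: (batch, 86400, 6)
reshaped = batch.reshape((2, 86400, 1, 6))  # what Reshape((86400, 1, 6)) produces per batch
print(reshaped.ndim)                        # 4 -- but LSTM expects ndim == 3
```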
This is understandable: the batch dimension plus the three dimensions from the Reshape makes 4. However, when I change the Reshape from
model.add(tf.keras.layers.Reshape((86400, 1, 6), input_shape=(86400, 6)))
vvvvvvv
model.add(tf.keras.layers.Reshape((86400, 6), input_shape=(86400, 6)))
it throws
ValueError: Error when checking input: expected reshape_input to have 3 dimensions, but got array with shape (86400, 6)
Now Keras seems to treat my whole array as a single 2-D input rather than inserting a batch dimension for it, so I jump from 4 dimensions straight down to 2.
The problem is that the LSTM takes 3 dimensions as input, and I can't seem to produce that. Ideally I want an 86400 x 1 x 6 array/tensor, so that it becomes 86400 examples of 1 x 6 data.
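One thing I have considered, though I'm not sure it's the right approach: reshaping the NumPy array itself before it ever reaches the model, instead of using a Reshape layer (x_train below is a zero-filled stand-in for my real data):

```python
import numpy as np

x_train = np.zeros((86400, 6))         # stand-in for my real 86400 x 6 data
x_train = x_train.reshape((-1, 1, 6))  # -> (86400, 1, 6): 86400 samples, 1 timestep, 6 features
print(x_train.shape)                   # (86400, 1, 6)
# the LSTM could then take input_shape=(1, 6), with no Reshape layer at all
```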
Thank you very much!