Question
How do you batch-train a multi-step LSTM in Keras for single-label, multi-class classification (> 2 classes) at each time step?
Current Error
Each target batch is a 3-dimensional array with shape (batch_size, n_time_steps, n_classes), but Keras expects a 2-dimensional array.
Example/Context
Suppose we have daily closing prices for N stocks and, for each day and stock, m features plus one of three actions: "bought", "held", or "sold". If there are 30 days' worth of data per stock, we can train an LSTM to predict the action on each day, for each stock, as follows.
For each batch of n samples, where n << N, X_train has shape (n, 30, m), i.e. n samples, 30 time steps, and m features. After one-hot encoding "bought", "held", and "sold", Y_train has shape (n, 30, 3), i.e. it is a 3-dimensional array.
The problem is that Keras raises an error because it expects Y_train to be 2-dimensional.
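For concreteness, batches with these shapes can be reproduced with a stand-in generator like the one below. This is only a sketch: the name BatchGenerator is taken from the snippet further down, while the random data and the use of keras.utils.to_categorical to one-hot encode the actions are illustrative assumptions, not the actual data pipeline.

import numpy as np
from keras.utils import to_categorical

n, n_time_steps, m, n_classes = 256, 30, 700, 3

def BatchGenerator():
    # Stand-in: returns one (X_train, Y_train) batch with random contents
    # but the shapes described above.
    X_train = np.random.rand(n, n_time_steps, m)               # (n, 30, m)
    # one integer action per stock per day: 0=bought, 1=held, 2=sold
    actions = np.random.randint(0, n_classes, size=(n, n_time_steps))
    Y_train = to_categorical(actions, num_classes=n_classes)   # (n, 30, 3)
    return X_train, Y_train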
Here is a code snippet:
from keras.models import Sequential
from keras.layers import LSTM, Dense

n_time_steps = 30
n_ftrs = 700
n_neurons = 100
n_classes = 3
batch_size = 256
n_epochs = 500

model = Sequential()
model.add(LSTM(n_neurons, input_shape=(n_time_steps, n_ftrs)))
model.add(Dense(n_classes, activation='sigmoid'))
model.compile(loss='categorical_crossentropy', optimizer='adam',
              metrics=['accuracy'])

for e in range(n_epochs):
    # BatchGenerator yields one batch of training data per call
    X_train, Y_train = BatchGenerator()
    # Y_train.shape = (256, 30, 3)
    model.fit(X_train, Y_train, batch_size=batch_size, epochs=1)
Error
Error when checking target: expected dense_20 to have 2 dimensions,
but got array with shape (256, 30, 3)
Comment
"… (None, n_time_steps, n_ftrs) without setting return_sequences to True?" - Novak
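As the comment suggests, getting one prediction per time step requires the LSTM to return its full output sequence. Below is a minimal sketch of that change, reusing the hyper-parameters from the snippet above; TimeDistributed applies the same Dense classifier at every time step, so the model's output shape becomes (None, 30, 3) and matches Y_train. The switch from sigmoid to softmax is an assumed change, being the usual pairing with categorical_crossentropy for single-label multi-class targets.

from keras.models import Sequential
from keras.layers import LSTM, Dense, TimeDistributed

n_time_steps, n_ftrs, n_neurons, n_classes = 30, 700, 100, 3

model = Sequential()
# return_sequences=True makes the LSTM output the whole sequence:
# (None, n_time_steps, n_neurons) instead of (None, n_neurons)
model.add(LSTM(n_neurons, return_sequences=True,
               input_shape=(n_time_steps, n_ftrs)))
# TimeDistributed applies the Dense classifier at each time step,
# so the model output is (None, n_time_steps, n_classes) = (None, 30, 3)
model.add(TimeDistributed(Dense(n_classes, activation='softmax')))
model.compile(loss='categorical_crossentropy', optimizer='adam',
              metrics=['accuracy'])

With this output shape, a Y_train of shape (batch_size, 30, 3) is accepted directly by model.fit.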