
I'm loading a pre-trained model that was trained on data with shapes ((23094, 71, 768), (23094, 19, 282)).

When I pass my new X values with shape (29116, 72, 768), it shows: "Error when checking input: expected lstm_1_input to have shape (71, 768) but got array with shape (72, 768)".

This is my model summary:

Model: "sequential_1"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
lstm_1 (LSTM)                (None, 71, 256)           1049600
_________________________________________________________________
lstm_2 (LSTM)                (None, 71, 64)            82176
_________________________________________________________________
lstm_3 (LSTM)                (None, 32)                12416
_________________________________________________________________
dense_1 (Dense)              (None, 5358)              176814
_________________________________________________________________
reshape_1 (Reshape)          (None, 19, 282)           0
_________________________________________________________________
activation_1 (Activation)    (None, 19, 282)           0
=================================================================
Total params: 1,321,006
Trainable params: 1,321,006
Non-trainable params: 0
_________________________________________________________________
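
For reference, this summary corresponds to a Sequential model along these lines (the final activation function is not shown in the summary, so 'softmax' below is only an assumption):

    from tensorflow.keras.models import Sequential
    from tensorflow.keras.layers import LSTM, Dense, Reshape, Activation

    # Reconstruction of the architecture shown in the summary above.
    model = Sequential([
        LSTM(256, return_sequences=True, input_shape=(71, 768)),  # lstm_1
        LSTM(64, return_sequences=True),                           # lstm_2
        LSTM(32),                                                   # lstm_3
        Dense(19 * 282),                                             # dense_1 (5358 units)
        Reshape((19, 282)),                                          # reshape_1
        Activation("softmax"),                                       # activation_1 (assumed)
    ])
    model.summary()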



1 Answer


An LSTM needs a 3D array (batch size, timesteps, features). There are several ways to train with a variable temporal length (axis 1, in your case 71).
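
For example, a single batch for this model has to be a 3D array laid out as (batch_size, timesteps, features); a placeholder sketch:

    import numpy as np

    # A batch in the (batch_size, timesteps, features) layout that lstm_1 expects
    x_batch = np.zeros((32, 71, 768), dtype="float32")  # 32 samples, 71 timesteps, 768 features
    print(x_batch.shape)  # (32, 71, 768)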

In your case, comparing the training input (23094, 71, 768) with the new input (29116, 72, 768):

sample size: 23094 vs 29116 (this doesn't need to match)

temporal dimension: 71 vs 72 (this can differ only if the model accepts variable-length input and all inputs within a single batch have the same length; your model summary shows the temporal dimension fixed at 71 rather than None, so you must feed inputs with exactly 71 timesteps, see the sketch after this list)

feature dimension: 768 vs 768 (this must be the same, and it already is)
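
If you do want variable-length input, the model has to be built with None as the temporal dimension; a minimal sketch, assuming you rebuild the model (and retrain it, or transfer the weights) rather than use the saved model as-is:

    from tensorflow.keras.models import Sequential
    from tensorflow.keras.layers import LSTM

    # Rebuild with a variable temporal dimension: None instead of the fixed 71.
    # Each individual batch must still have a uniform length, but different
    # batches may have different lengths.
    model = Sequential()
    model.add(LSTM(256, return_sequences=True, input_shape=(None, 768)))
    # ... add the remaining layers exactly as in the summary above ...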

To make the shapes match, zero-pad sequences that are too short or truncate sequences that are too long. Keras also provides padding layers, see https://www.tensorflow.org/api_docs/python/tf/keras/layers/ZeroPadding2D (ZeroPadding1D is the variant for sequence data).
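
In your case the new data is longer than the model expects (72 > 71), so the extra timestep has to be dropped rather than padded; a minimal NumPy sketch, where X_new and model are placeholder names for your new array and the loaded model:

    import numpy as np

    TIMESTEPS = 71  # what lstm_1_input expects

    def fit_timesteps(x, timesteps=TIMESTEPS):
        """Truncate or zero-pad axis 1 of a (samples, timesteps, features) array."""
        if x.shape[1] >= timesteps:
            return x[:, :timesteps, :]                 # drop extra timesteps (72 -> 71)
        pad = timesteps - x.shape[1]
        return np.pad(x, ((0, 0), (0, pad), (0, 0)))   # zero-pad at the end

    X_new_fixed = fit_timesteps(X_new)                 # (29116, 72, 768) -> (29116, 71, 768)
    predictions = model.predict(X_new_fixed)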

Also, please provide your full code to get a code-based answer.