My question is rather simple but seems to be unsolved.
Input: (bs, timesteps, input_dim) --> Tensor("stack:0", shape=(?, 4, 400), dtype=float32)
Layer: output = LSTM(100, input_shape=(timesteps, input_feature), return_sequences=True)(input)
Expect: (bs, timesteps, output_dim) --> Tensor("gru_20/transpose_1:0", shape=(?, 4, 100), dtype=float32)
Output: Tensor("gru_20/transpose_1:0", shape=(?, ?, 100), dtype=float32)
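For reference, a minimal sketch of the setup described above (variable names are placeholders, Keras functional API):

    from keras.layers import Input, LSTM

    timesteps = 4
    input_dim = 400

    # Batch size stays dynamic; only (timesteps, input_dim) is declared
    inp = Input(shape=(timesteps, input_dim))

    # return_sequences=True keeps one output vector per timestep
    output = LSTM(100, return_sequences=True)(inp)

    # Prints shape=(?, ?, 100) instead of the expected (?, 4, 100)
    print(output)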
Why does Keras not infer the number of timesteps, even though it receives an input_shape? The model summary shows the correct output shape:
lstm_2 (LSTM) (None, 4, 100) 3232
But not during construction. So when I want to unstack the tensor into a list of tensors, one per timestep, i.e. timesteps * (bs, 100), using tf.unstack(output, axis=1), I of course receive this error: ValueError: Cannot infer num from shape (?, ?, 100)
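The failing call looks roughly like the sketch below, reusing the output tensor from the snippet above. tf.unstack does accept an explicit num argument, which would side-step the error, but that still does not explain why the static shape is lost:

    import tensorflow as tf

    # This raises the error because the static size of axis 1 is unknown:
    #   step_tensors = tf.unstack(output, axis=1)
    #   ValueError: Cannot infer num from shape (?, ?, 100)

    # Passing the known timestep count via num side-steps the error
    step_tensors = tf.unstack(output, num=4, axis=1)  # list of 4 tensors, each (bs, 100)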
Where is my mistake?
BTW, adding TimeDistributed(Dense(100))(questions) results in the correct output dim: Tensor("time_distributed_17/Reshape_1:0", shape=(?, 4, 100), dtype=float32), but it is not an option because of the shared weights. If there is no direct fix, what is the workaround?
Reshape layer. – jdehesa
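A minimal sketch of that suggestion, assuming the known timestep count can be hard-coded: the Keras Reshape layer re-attaches the static shape (the batch dimension is excluded from target_shape):

    from keras.layers import Reshape

    # Restores the static (timesteps, features) shape on the same tensor
    output_fixed = Reshape((4, 100))(output)
    # Tensor(..., shape=(?, 4, 100), dtype=float32)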