I want to take a (two-layer) pretrained LSTM model and add a new LSTM layer before the final Dense layer, so the layer I add will be the third LSTM. Since the second LSTM of the pretrained model was built with return_sequences=False, I am unable to add a third LSTM layer on top of it. How can I change the configuration of a layer of a pretrained LSTM model so that I can add another LSTM layer? I am not keen on making another copy of the model and copying the weights over; I want to make the change in the existing model itself.
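For concreteness, the model looks roughly like this (a sketch; the layer sizes, input shape, and Dense head are illustrative, not the real values, and the second LSTM is wrapped in Bidirectional, as noted further down):

from keras.models import Sequential
from keras.layers import LSTM, Bidirectional, Dense

model = Sequential()
model.add(LSTM(64, return_sequences=True, input_shape=(None, 16)))
model.add(Bidirectional(LSTM(32, return_sequences=False)))  # second LSTM; the flag I need to flip
model.add(Dense(1))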
I am trying it as:
model.layers[-1].return_sequences = True
This statement does not raise any error, but the layer configuration still shows return_sequences=False.
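My layer is a Bidirectional wrapper (see the note at the end), and I suspect the assignment only lands on the wrapper object, not on the wrapped LSTMs or their serialized config. A quick check, assuming the wrapper exposes forward_layer and backward_layer as Keras's Bidirectional does:

layer = model.layers[-1]
layer.return_sequences = True
print(layer.forward_layer.return_sequences)   # still False
print(layer.backward_layer.return_sequences)  # still False
print(layer.get_config()['layer']['config']['return_sequences'])  # still False

Even if the inner flags were set, I assume the already-built graph would keep its old output shape, since it was wired up when the model was constructed.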
I also tried changing the configuration of the layer explicitly:
config = model.layers[-1].get_config()
config['layer']['config']['return_sequences'] = True
This changes the value of return_sequences in the config dictionary, but I do not know how to apply the modified config back to the layer. Something like the following does not work:
model.layers[-1] = LSTM.from_config(config)
It gives: __init__() takes at least two arguments.
Note that model.layers[-1] is actually a Bidirectional-wrapped LSTM.
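Given that, I assume the error comes from deserializing the wrapper's config with the inner class: the Bidirectional config dictionary has no units entry, so LSTM.__init__() is missing its required argument. A sketch of what the wrapper-level call would look like (it builds a fresh, unbuilt layer, but as far as I can tell still does not rewire the existing model):

from keras.layers import Bidirectional

new_layer = Bidirectional.from_config(config)  # deserializes the nested LSTM config, return_sequences now True
model.layers[-1] = new_layer  # no effect on computation: the graph was already wired when the model was built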
return_sequences=True is the only way to add an LSTM after another. Are you sure this is your problem? - Daniel Möller
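A minimal sketch of what that comment means, assuming Keras: the lower LSTM must return its full sequence (3D output) for another LSTM to consume it.

from keras.models import Sequential
from keras.layers import LSTM

stacked = Sequential()
stacked.add(LSTM(32, return_sequences=True, input_shape=(None, 8)))  # outputs (batch, timesteps, 32)
stacked.add(LSTM(16))  # needs a 3D input, so the layer below must return sequences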