
I want to use a (two-layer) pretrained LSTM model and add a new LSTM layer before the final Dense layer, so the added layer would be the third LSTM. Since the second LSTM of the pretrained model was built with return_sequences=False, I am unable to stack a third LSTM layer on top of it. How can I change the configuration of a layer in the pretrained model so that another LSTM layer can be added? I am not keen on making another copy of the model and copying the weights over; I want to change the existing model itself.
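For reference, a minimal sketch of the kind of pretrained model being described (the layer sizes, input shape and layer choices are invented for illustration, and the final Dense head is left out so that model.layers[-1] is the Bidirectional LSTM mentioned below):

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Input, LSTM, Bidirectional

# Hypothetical stand-in for the pretrained two-layer recurrent stack.
model = Sequential([
    Input(shape=(100, 16)),                            # (timesteps, features), made up
    LSTM(64, return_sequences=True),
    # Returns only the last timestep, which is what blocks stacking a third LSTM.
    Bidirectional(LSTM(32, return_sequences=False)),   # model.layers[-1]
])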

I am trying it as:

model.layers[-1].return_sequences = True

This statement does not generate any error, but the layer configuration still shows return_sequences=False.

I also tried changing the configuration of the layer explicitly:

config = model.layers[-1].get_config()
config['layers']['config']['return_sequences'] = True

This changes the value of return_sequences in the config dictionary, but I do not know how to apply the change back to the layer. Something like the following does not work:

model.layers[-1] = LSTM.from_config(config)

It gives: __init__ takes at least two arguments.

layers[-1] of the model is actually a Bidirectional-wrapped LSTM.
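For reference, in recent Keras versions a Bidirectional wrapper's get_config() nests the inner LSTM's settings under a 'layer' key, and the top-level dictionary describes the wrapper itself with no 'units' entry, which is presumably what LSTM.from_config is tripping over; a small sketch (key names may differ in older versions):

config = model.layers[-1].get_config()
# The wrapped LSTM's own settings sit one level down:
print(config['layer']['config']['return_sequences'])   # -> False
# The top-level dict holds wrapper keys such as 'merge_mode' but no 'units',
# so handing it to LSTM.from_config cannot construct an LSTM.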

return_sequences = True is the only way to add an LSTM after another. Are you sure this is your problem? - Daniel Möller
By the way, changing an existing model to add a layer in between is harder in Keras than building a new model with two layers. - Daniel Möller
I want to change return_sequences of the pretrained model from False to True without building another copy. - shaifali Gupta

1 Answer


I think making another copy of the model and copying the weights is your best bet. If you study the source code you'd probably be able to figure out a way to hack another layer on, but that would take effort, would potentially not work, and might break in the future.
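To make that concrete, here is a hedged sketch of the rebuild-and-copy approach, assuming a Sequential model shaped like the one in the question (model is the pretrained model; the layer sizes are placeholders):

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Input, LSTM, Bidirectional, Dense

# Rebuild the architecture, flipping return_sequences to True on the old
# last recurrent layer so a third LSTM can be stacked after it.
new_model = Sequential([
    Input(shape=(100, 16)),
    LSTM(64, return_sequences=True),
    Bidirectional(LSTM(32, return_sequences=True)),    # was return_sequences=False
    LSTM(32),                                          # the newly added third LSTM
    Dense(10, activation='softmax'),
])

# Flipping return_sequences does not change the weight shapes, so the
# pretrained weights can be copied layer by layer.
new_model.layers[0].set_weights(model.layers[0].get_weights())
new_model.layers[1].set_weights(model.layers[1].get_weights())
# The new LSTM and the Dense head keep their fresh initialization (the
# Dense layer's input size has changed anyway).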