
When we add a bidirectional RNN layer, I understand that we have to concatenate the forward and backward hidden states. If we use a bidirectional RNN layer in an encoder-decoder model, do we have to train the bidirectional RNN layer separately?

1 Answer


No. To quote from the abstract of Bidirectional Recurrent Neural Networks by Schuster and Paliwal:

The BRNN can be trained without the limitation of using input information just up to a preset future frame. This is accomplished by training it simultaneously in positive and negative time direction.

I guess you are referring to tf.nn.static_bidirectional_rnn.
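In other words, in an encoder-decoder model the bidirectional layer is not pre-trained or trained in a separate phase: gradients from the decoder's loss flow back through both the forward and backward passes of the encoder, so everything is updated together. Below is a minimal sketch using tf.keras (rather than the older tf.nn.static_bidirectional_rnn API) of a bidirectional encoder wired into a decoder and trained end to end with a single loss; all layer sizes and names are hypothetical, chosen just for illustration.

import tensorflow as tf
from tensorflow.keras import layers, Model

# Hypothetical sizes for illustration.
vocab_size, embed_dim, units, T = 1000, 64, 128, 20

# --- Encoder with a bidirectional RNN layer ---
enc_in = layers.Input(shape=(T,))
enc_emb = layers.Embedding(vocab_size, embed_dim)(enc_in)
# The forward and backward LSTMs live inside one Bidirectional layer
# and receive gradients together; nothing is trained separately.
enc_out, fh, fc, bh, bc = layers.Bidirectional(
    layers.LSTM(units, return_sequences=True, return_state=True))(enc_emb)
# Concatenate forward/backward states so they match the decoder's state size.
state_h = layers.Concatenate()([fh, bh])
state_c = layers.Concatenate()([fc, bc])

# --- Decoder (unidirectional, initialized from the encoder states) ---
dec_in = layers.Input(shape=(T,))
dec_emb = layers.Embedding(vocab_size, embed_dim)(dec_in)
dec_out, _, _ = layers.LSTM(
    2 * units, return_sequences=True, return_state=True)(
        dec_emb, initial_state=[state_h, state_c])
logits = layers.Dense(vocab_size)(dec_out)

model = Model([enc_in, dec_in], logits)
model.compile(
    optimizer="adam",
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True))
# A single model.fit() call then trains the encoder (both directions)
# and the decoder jointly via backpropagation through the whole graph.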