
I am building a multi-layer RNN with the same setup as in this question (using MultiRNNCell to wrap the cells and then calling dynamic_rnn):

Outputs and State of MultiRNNCell in Tensorflow

As described in the question above, dynamic_rnn returns outputs, state = tf.nn.dynamic_rnn(...)

The outputs tensor only contains, I assume, the outputs of the top layer (since its shape is batch_size x steps x state_size). However, state returns the final state of every layer (a tuple with num_layers elements, each holding the last state of the corresponding layer).

(1) Is there a simple way to access the outputs from all time steps for each layer (not just the last layer returned by dynamic_rnn), without running a one-step RNN recursively and reading the state at each step?

(2) Does the returned output correspond to the last (top) layer?


1 Answer


Based on the documentation of tf.nn.rnn_cell.MultiRNNCell, you should be safe doing the following:

import tensorflow as tf  # TF 1.x API

X = tf.random_normal([4, 10, 5])  # example input: batch x steps x features

# Stack the layers manually instead of wrapping them in a MultiRNNCell,
# so the per-step outputs of every layer are exposed.
cell_1 = tf.nn.rnn_cell.GRUCell(7, name="gru1")
cell_2 = tf.nn.rnn_cell.GRUCell(7, name="gru2")
outputs_1, states_1 = tf.nn.dynamic_rnn(cell_1, X, dtype=tf.float32)
outputs_2, states_2 = tf.nn.dynamic_rnn(cell_2, outputs_1, dtype=tf.float32)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    first_layer_outputs = sess.run(outputs_1)   # batch x steps x 7
    second_layer_outputs = sess.run(outputs_2)  # batch x steps x 7

As for the outputs returned by tf.nn.dynamic_rnn, they are indeed from the top layer when the cell provided is a tf.nn.rnn_cell.MultiRNNCell.