
It seems TensorFlow does not support a variable batch size for bidirectional RNNs. In this example, sequence_length is tied to batch_size, which is a fixed Python integer:

  _seq_len = tf.fill([batch_size], tf.constant(n_steps, dtype=tf.int64))
  outputs, state1, state2 = rnn.bidirectional_rnn(rnn_fw_cell, rnn_bw_cell, input,
                                                  dtype="float",
                                                  sequence_length=_seq_len)

How can I use different batch sizes for training and testing?

1 Answer


The bidirectional RNN code does work with variable batch sizes. For example, take a look at this test code, which creates a tf.placeholder(..., shape=(None, input_size)), where None means that the batch size can vary between calls to Session.run().

You can convert your code snippet to work with variable batch sizes with a small modification:

# Compute the batch size based on the shape of the (presumably fed-in) `input`
# tensor. (Assumes that `input = tf.placeholder(..., shape=[None, input_size])`.)
batch_size = tf.shape(input)[0]

_seq_len = tf.fill(tf.expand_dims(batch_size, 0),
                   tf.constant(n_steps, dtype=tf.int64))
outputs, state1, state2 = rnn.bidirectional_rnn(rnn_fw_cell, rnn_bw_cell, input,
                                                dtype=tf.float32,
                                                sequence_length=_seq_len)