5
votes

Is there a way to use the shape of a tensor with dynamic shape in tensorflow operations without evaluating it in a session? For instance, consider the following:

activation = tf.nn.relu(conv_plus_b, name=scope.name)  # has shape [None, None, 700, 1]
conv_len = activation.shape[1]
pool = tf.nn.max_pool(activation, ksize=[1, conv_len, 1, 1], strides=[1, 1, 1, 1], padding='VALID')

Running this code throws the error: TypeError: Expected int for argument 'ksize' not Dimension(None).

So my question is: Is there any way of using such dynamic shapes to define the shape parameter of tensorflow operations without evaluating it in a session?

I found a similar issue at https://groups.google.com/a/tensorflow.org/forum/#!topic/discuss/BlguDbTxCAk, where the following solution was proposed for performing resize operations in TensorFlow with a dynamic shape:

n = tf.shape(foo)[0]
tf.reshape(foo, tf.pack([n, 1]))

tf.pack has since been deprecated. I'm not sure whether tf.stack would work in the tf.reshape operation, but using it in tf.nn.max_pool throws the error: TypeError: Expected list for attr ksize.
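
For reference, the tf.stack equivalent of that snippet would look something like this (a minimal sketch, assuming a hypothetical 1-D placeholder input):

import tensorflow as tf

foo = tf.placeholder(tf.float32, shape=[None])  # hypothetical 1-D input
n = tf.shape(foo)[0]                            # dynamic length as a scalar tensor
reshaped = tf.reshape(foo, tf.stack([n, 1]))    # tf.stack in place of the deprecated tf.pack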

I understand that there are different variations of the shape function. I've tried activation.get_shape()[1] (which I heard works for static shapes), activation.shape[1], and tf.shape(activation)[1]; they all throw errors.

Thanks a lot for looking into this.


1 Answer

3
votes

tf.nn.max_pool() does not support a dynamic size: ksize needs to be a constant. Instead, you can use tf.reduce_max():

    import tensorflow as tf
    conv_plus_b = tf.placeholder(tf.float32, shape=[None, None, 700, 1])
    activation = tf.nn.relu(conv_plus_b)  # has shape [None, None, 700, 1]
    # conv_len = tf.shape(conv_plus_b)[1]
    # pool = tf.nn.max_pool(activation, ksize=[1,conv_len,1,1], strides=[1,1,1,1], padding='VALID')
    # reduce_max over axis 1 is equivalent to max-pooling over the full (dynamic) second dimension
    pool = tf.reduce_max(activation, axis=1, keep_dims=True)
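
As a quick sanity check, you can feed a dummy batch to the placeholder (the batch and sequence sizes below are only illustrative, assuming TF 1.x session semantics):

    import numpy as np
    with tf.Session() as sess:
        out = sess.run(pool, feed_dict={conv_plus_b: np.random.rand(2, 5, 700, 1)})
        print(out.shape)  # (2, 1, 700, 1): the dynamic second dimension is reduced to 1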

For more details, check the following links:

Tensorflow maxpool with dynamic ksize

https://github.com/tensorflow/tensorflow/issues/9394

https://github.com/tensorflow/tensorflow/issues/4746