1
votes

I have a tensor whose first dimension is None, corresponding to the batch size, for example:

tensor = tf.placeholder(tf.float32, shape=[None, 256, 256, 3], name="placeholder_input")

I also have a function "myfunc" that acts on a tensor of size [256, 256, 3], and I want to apply it once per batch element to obtain an output of size [None, 256, 256, 3]. If the shape were not dynamic, I would simply do:

output_tensor = tf.stack([myfunc(tensor[k, :, :, :]) for k in range(BATCH_SIZE)])

How can I do this with a dynamic shape?

There is a function named batch_slice in the Mask_RCNN repo that does the same thing: github.com/matterport/Mask_RCNN/blob/master/mrcnn/utils.py#L801. - keineahnung2345
I don't think this solves my problem, as it requires a known batch size, which I do not have with a dynamic shape and a variable batch size. - Stringer Bell

2 Answers

0
votes

If you really want to do that, you can use tf.map_fn.
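For example, a minimal sketch of the tf.map_fn approach, where myfunc is a stand-in for your own per-image function (here just a left-right flip, purely for illustration):

import tensorflow as tf
tf.compat.v1.disable_eager_execution()  # placeholders require graph mode in TF 2.x

def myfunc(image):
    # stand-in for your real function: operates on a single [256, 256, 3] tensor
    return tf.image.flip_left_right(image)

tensor = tf.compat.v1.placeholder(tf.float32, shape=[None, 256, 256, 3])
# tf.map_fn applies myfunc to every slice along axis 0, whatever the batch size turns out to be
output_tensor = tf.map_fn(myfunc, tensor)  # shape (None, 256, 256, 3)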

Otherwise, you can try to work directly with the original tensor (with first dimension None) and perform the operations on the proper axes, with no need to loop.
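For instance, if myfunc only uses element-wise or broadcasting ops, the same computation can be written on the full [None, 256, 256, 3] tensor directly (a sketch, assuming a simple per-image mean subtraction as the operation):

import tensorflow as tf
tf.compat.v1.disable_eager_execution()

tensor = tf.compat.v1.placeholder(tf.float32, shape=[None, 256, 256, 3])
# reduce over the per-image axes (1, 2, 3) and keep dims so the result broadcasts back over the batch
mean = tf.reduce_mean(tensor, axis=[1, 2, 3], keepdims=True)
output_tensor = tensor - mean  # shape (None, 256, 256, 3), no loop over the batch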

0
votes

Iterating over a None dimension is possible in TensorFlow using tf.map_fn and tf.scan, but make sure that eager execution is disabled before building the graph.

That can be done with:

import tensorflow as tf
tf.compat.v1.disable_eager_execution()

Then you could do something like:

tensor = tf.compat.v1.placeholder(dtype=tf.float32, shape=[None, 256, 256, 3], name="placeholder_input")
output_tensor = tf.map_fn(lambda x: x, elems=tensor)

Output:

<tf.Tensor 'map/TensorArrayV2Stack/TensorListStack:0' shape=(None, 256, 256, 3) dtype=float32>

tf.stack is not required here because tf.map_fn automatically stacks the returned tensors.
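To check that the batch size really is resolved at run time, you can feed batches of different sizes into the same graph (a sketch continuing the snippet above; the NumPy arrays are just dummy data):

import numpy as np

with tf.compat.v1.Session() as sess:
    for batch_size in (2, 5):
        batch = np.zeros((batch_size, 256, 256, 3), dtype=np.float32)
        out = sess.run(output_tensor, feed_dict={tensor: batch})
        print(out.shape)  # (2, 256, 256, 3), then (5, 256, 256, 3)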