I have a tensor whose first dimension is None, corresponding to the batch size, for example:
tensor = tf.placeholder(tf.float32, shape=[None, 256, 256, 3], name="placeholder_input")
I also have a function "myfunc" that acts on a tensor of size [256, 256, 3], and I want to apply it to every element of the batch so that the result has size [None, 256, 256, 3]. If the batch size were static, I would simply do:
output_tensor = tf.stack([myfunc(tensor[k, :, :, :]) for k in range(BATCH_SIZE)])
How can I do this when the shape is dynamic?
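For reference, here is a minimal runnable sketch of the static-shape version described above. `myfunc` here is just a placeholder (a simple elementwise operation) standing in for the real per-image function:

```python
import tensorflow as tf

BATCH_SIZE = 4  # assumed fixed batch size for the static-shape case


def myfunc(image):
    # Stand-in for the real per-image function; takes a [256, 256, 3] tensor.
    return image * 2.0


# Static-shape version: the batch size is known, so a Python loop over it works.
tensor = tf.placeholder(tf.float32, shape=[BATCH_SIZE, 256, 256, 3],
                        name="placeholder_input")
output_tensor = tf.stack([myfunc(tensor[k, :, :, :]) for k in range(BATCH_SIZE)])
# output_tensor has shape [BATCH_SIZE, 256, 256, 3]
```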