0
votes

I am new to TensorFlow, so this might be an easy question, but it has really got me stuck.
I am trying to implement this paper in Keras, with TensorFlow as the backend.
In the first stage of training, the author uses softmax_pair.
Suppose we get this output from the last fc layer,
where the vertical axis is the batch size (which is None):

x11 x12 x13 x14...   
x21 x22 x23 x24...   
x31 x32 x33 x34...
...

and we take the exponential, so we have

e11 e12 e13 e14...   
e21 e22 e23 e24...   
e31 e32 e33 e34...
...

and then I am stuck here; what I want is:

e11/(e11+e12) e12/(e11+e12) e13/(e13+e14) e14/(e13+e14)...
e21/(e21+e22) e22/(e21+e22) e23/(e23+e24) e24/(e23+e24)...
e31/(e31+e32) e32/(e31+e32) e33/(e33+e34) e34/(e33+e34)...
...

I don't know how to do the pairwise addition.
tf.transpose and tf.segment_sum might work,
but after some research I found that transpose is expensive.
Furthermore, after tf.segment_sum I only have a tensor of half the size, and I don't know how to double it.
I am also wondering how to produce the segment_ids.

So how can I do this calculation? Thanks!!
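For reference, the segment-sum route described above can be sketched in pure NumPy (tf.segment_sum sums along the first axis, so in TensorFlow you would transpose first; the data here is illustrative). It shows how to produce the segment_ids and how to "double" the half-size tensor back to full width:

```python
import numpy as np

# exponentiated fc output, shape (R, C)
E = np.array([[1.0, 2.0, 3.0, 4.0],
              [5.0, 6.0, 7.0, 8.0]])
R, C = E.shape

# segment_ids pairing adjacent columns: [0, 0, 1, 1, ...]
segment_ids = np.arange(C) // 2

# pairwise sums over adjacent columns, shape (R, C/2)
pair_sums = np.add.reduceat(E, np.arange(0, C, 2), axis=1)

# "double" the half-size tensor: repeat each sum for both columns of its pair
denom = np.repeat(pair_sums, 2, axis=1)  # shape (R, C) again

# each entry divided by the sum of its pair, e.g. Y[0,0] = e11/(e11+e12)
Y = E / denom
```

The same pairing trick (`np.repeat` after a half-width reduction) is what solves the "half size of tensor" problem: the denominator is broadcast back to full width before the division.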

---------- update

The part of the paper I am talking about is Fig. 3.
The fc output is P2c-1 and P2c, which are the probabilities of class c appearing or not appearing in the image,
for c = 1, 2, 3, ..., number of classes.

Is transpose not expensive? Sometimes I see claims that it is, e.g. in the comment below; perhaps I misunderstood this?

The TensorFlow docs for tf.transpose state that, unlike NumPy, TensorFlow returns a new tensor -> extra memory.

1
What part of the paper are you talking about? Is the number of columns always 4? Why do you think transpose is expensive? - sygi

1 Answer

0
votes

Assuming X is your tensor of size R x C:

_, C = X.get_shape().as_list()          # static number of columns
X_split = tf.split(X, C // 2, axis=1)   # C/2 subtensors, each of shape R x 2
Y_split = [tf.nn.softmax(part) for part in X_split]  # softmax over each pair
Y = tf.concat(Y_split, axis=1)          # join back into an R x C tensor

C will be the number of columns, X_split will be a list of subtensors, each with two columns, Y_split computes a regular softmax over each of those subtensors, and Y joins the results of the softmaxes back together. Since each softmax is applied to a pair of columns, each adjacent pair in Y sums to 1, which is exactly the pairwise normalization asked for.
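As a sanity check (pure NumPy, illustrative data): the same pairwise softmax can also be obtained with a single reshape to (R, C/2, 2) and a softmax over the last axis, which avoids both splitting and transposing:

```python
import numpy as np

# fc output, shape (R, C); values are arbitrary example logits
X = np.array([[0.5, -1.0,  2.0, 0.0],
              [1.0,  1.0, -2.0, 3.0]])
R, C = X.shape

# group adjacent columns into pairs: shape (R, C/2, 2)
pairs = X.reshape(R, C // 2, 2)

# numerically stabilized softmax over the last axis (each pair)
e = np.exp(pairs - pairs.max(axis=2, keepdims=True))
Y = (e / e.sum(axis=2, keepdims=True)).reshape(R, C)
```

In TensorFlow the analogue would be a tf.reshape to (-1, C // 2, 2), tf.nn.softmax over the last axis, and a tf.reshape back, with no transpose needed.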