Newbie here, so I am sorry if this question is silly, but I couldn't find anything about it online. I am getting an unexpected output shape from tf.squared_difference. I would expect to obtain a Tensor with shape=(100, ?) as the loss from the following snippet:
for logits, labels in zip(logits_series, labels_series):
    print("Logits", logits, "#Labels", labels, "LOSS", tf.squared_difference(labels, logits))
However, it produces (100, 100). Output:
Logits Tensor("add_185:0", shape=(100, 1), dtype=float32) #Labels Tensor("unstack_29:0", shape=(100,), dtype=float32) LOSS Tensor("SquaredDifference_94:0", shape=(100, 100), dtype=float32)
Logits Tensor("add_186:0", shape=(100, 1), dtype=float32) #Labels Tensor("unstack_29:1", shape=(100,), dtype=float32) LOSS Tensor("SquaredDifference_95:0", shape=(100, 100), dtype=float32)
I have tested another example with the following code, and it gives the expected output shape:
myTESTX = tf.placeholder(tf.float32, [100, None])
myTESTY = tf.placeholder(tf.float32, [100, 1])
print("Test diff X-Y", tf.squared_difference(myTESTX, myTESTY))
print("Test diff Y-X", tf.squared_difference(myTESTY, myTESTX))
Test diff X-Y Tensor("SquaredDifference_92:0", shape=(100, ?), dtype=float32)
Test diff Y-X Tensor("SquaredDifference_93:0", shape=(100, ?), dtype=float32)
I can't work out why these two snippets produce different output shapes.
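For what it's worth, here is a minimal sketch of the shapes involved, using NumPy (assuming TensorFlow's elementwise ops broadcast the same way NumPy does). The tensor shapes are copied from the printed output above; the zero values are just placeholders:

```python
import numpy as np

# Shapes from the first snippet: logits is rank 2, labels (from
# tf.unstack) is rank 1.
logits = np.zeros((100, 1), dtype=np.float32)
labels = np.zeros((100,), dtype=np.float32)

# (100,) broadcasts against (100, 1) to give (100, 100).
print(np.square(labels - logits).shape)  # (100, 100)

# Shapes from the second snippet: both placeholders are rank 2,
# [100, None] and [100, 1]. A stand-in width of 5 replaces None.
x = np.zeros((100, 5), dtype=np.float32)
y = np.zeros((100, 1), dtype=np.float32)

# (100, 5) broadcasts against (100, 1) to give (100, 5).
print(np.square(x - y).shape)  # (100, 5)
```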