I have an image multilabel classification problem that I would like to solve with TensorFlow.
I'm trying to construct a proper loss function and a "proper" final layer for a CNN.
What kind of arguments does the
tf.nn.sigmoid_cross_entropy_with_logits(labels, logits)
function expect?
Am I safe to assume that:
- labels are vectors with binary values {0, 1}
- logits are vectors with the same dimension as labels, with values from the whole range (-∞, ∞)
Therefore I should skip the ReLU in the last layer (to ensure the final output can be negative).
Or maybe logits are bounded and represent a probability?
I'm not 100% sure on this.
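To illustrate what the function computes, here is a NumPy sketch of the per-element formula stated in the tf.nn.sigmoid_cross_entropy_with_logits documentation (the numerically stable form max(x, 0) - x*z + log(1 + exp(-|x|))). The example labels and logits are made up for illustration; in TensorFlow you would pass the same two arrays directly to the function.

```python
import numpy as np

# Stable per-element formula from the tf.nn.sigmoid_cross_entropy_with_logits
# docs: max(x, 0) - x*z + log(1 + exp(-|x|)),
# where x = logits (any real value) and z = labels in {0, 1}.
def sigmoid_cross_entropy_with_logits(labels, logits):
    x = np.asarray(logits, dtype=float)
    z = np.asarray(labels, dtype=float)
    return np.maximum(x, 0) - x * z + np.log1p(np.exp(-np.abs(x)))

# One image, three independent labels (e.g. elephant, cat, dog):
labels = np.array([1.0, 0.0, 1.0])   # binary ground truth, one entry per class
logits = np.array([2.3, -1.7, 0.4])  # raw last-layer outputs, may be negative

per_label_loss = sigmoid_cross_entropy_with_logits(labels, logits)

# Equivalent to the naive binary cross entropy on sigmoid probabilities:
# -[z*log(sigmoid(x)) + (1-z)*log(1 - sigmoid(x))]
p = 1.0 / (1.0 + np.exp(-logits))
naive = -(labels * np.log(p) + (1 - labels) * np.log(1 - p))
assert np.allclose(per_label_loss, naive)
```

Note the logits here are unbounded raw scores, not probabilities: the sigmoid is applied inside the loss, which is why the last layer should have no activation.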

labels is not a one-hot vector but just a scalar for binary classification, unless there can be multiple labels within one training sample, such as both elephant and cat appearing in one image; then labels will be a vector. - thinkdeep