5 votes

Library: Keras, backend: TensorFlow

I am training a single-class/binary classification model, where my final layer has a single node with a sigmoid activation. I compile the model with binary cross-entropy loss. When I run the training code, I notice that the loss is greater than 1. Is that right, or am I going wrong somewhere? I have checked the labels; they are all 0s and 1s.

Is it possible to have the binary cross entropy loss greater than 1?

What exactly is a "single class/binary classification"? How can you have a single class for binary classification? – OneRaynyDay
The output is basically either 1 or 0, so it either belongs to the class or doesn't. For example, cat image classification. – Blue
What are the values you're getting? – OneRaynyDay
In the range 6–8. I have not trained it for multiple epochs, because from what I know the loss should be in the range 0–1. – Blue
If your data is also between 0 and 1, I think you should not worry much. – Daniel Möller

2 Answers

9 votes

Keras `binary_crossentropy` first converts your predicted probabilities back to logits, then uses `tf.nn.sigmoid_cross_entropy_with_logits` to compute the cross entropy and returns the mean. Mathematically speaking, if your label is 1 and your predicted probability is low (say 0.1), the cross entropy can be greater than 1, e.g. `losses.binary_crossentropy(tf.constant([1.]), tf.constant([0.1]))`.
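You can verify this with plain Python, mirroring the binary cross-entropy formula (including the small epsilon clipping Keras applies to avoid `log(0)`) without needing TensorFlow installed:

```python
import math

def binary_crossentropy(y_true, y_pred, eps=1e-7):
    """Binary cross entropy for a single example:
    -(y*log(p) + (1-y)*log(1-p)), with p clipped away from 0 and 1."""
    p = min(max(y_pred, eps), 1.0 - eps)
    return -(y_true * math.log(p) + (1.0 - y_true) * math.log(1.0 - p))

# Label 1, predicted probability 0.1:
loss = binary_crossentropy(1.0, 0.1)
print(loss)  # ≈ 2.303, already well above 1
```

So a loss above 1 just means the model's predicted probabilities are still far from the true labels.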

6 votes

Yes, that's correct: cross-entropy is not bounded to any specific range; it is only guaranteed to be non-negative (≥ 0).
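To see the unboundedness concretely: for a true label of 1, the loss reduces to `-log(p)`, which grows without limit as the predicted probability `p` approaches 0:

```python
import math

# Loss -log(p) for a true label of 1, as the prediction gets worse:
for p in (0.5, 0.1, 0.01, 0.001):
    print(f"p = {p:>5}  loss = {-math.log(p):.3f}")
# The loss climbs from ~0.69 past 4 and 6 with no upper bound.
```

Values in the 6–8 range (as in the question) simply correspond to very confident wrong predictions early in training.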