
I used a histogram loss as the loss function for my model, but it produces a None gradient. Code snippet (loss function):

import tensorflow as tf
from tensorflow.keras import backend as K

def histogram_loss(y_true, y_pred):
    # Bin both tensors into 20 buckets over the range [-1, 1]
    h_true = tf.histogram_fixed_width(y_true, value_range=(-1., 1.), nbins=20)
    h_pred = tf.histogram_fixed_width(y_pred, value_range=(-1., 1.), nbins=20)
    # Histogram counts are integers; cast to float for the squared error
    h_true = tf.cast(h_true, dtype=tf.dtypes.float32)
    h_pred = tf.cast(h_pred, dtype=tf.dtypes.float32)
    return K.mean(K.square(h_true - h_pred))

Error message:

ValueError: An operation has `None` for gradient. Please make sure that all of your ops have a gradient defined (i.e. are differentiable). Common ops without gradient: K.argmax, K.round, K.eval.

Why do I get this ValueError (None gradient)?


1 Answer


The gradient of tf.histogram_fixed_width is None because it is not a differentiable operation: binning is piecewise constant, so its gradient is zero almost everywhere and undefined at the bin edges, and TensorFlow registers no gradient for it. You can verify this directly:

import numpy as np
import tensorflow as tf

x = tf.Variable(np.random.uniform(0, 10, 100), dtype=tf.float32)

with tf.GradientTape() as tape:
    hist = tf.histogram_fixed_width(x, value_range=(-1., 1.), nbins=20)
    hist = tf.cast(hist, dtype=tf.dtypes.float32)

# No gradient is registered for histogram_fixed_width, so this is None
grads = tape.gradient(hist, x)
print(grads)  # None
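If you still want a histogram-matching loss, a common workaround is a "soft" histogram that replaces hard 0/1 binning with kernel weights, so gradients can flow. Here is a minimal sketch; the soft_histogram helper, the sigma bandwidth, and the 20-bin setup are illustrative assumptions, not part of the TensorFlow API:

def soft_histogram(values, value_range=(-1., 1.), nbins=20, sigma=0.05):
    # Differentiable histogram: each value contributes a Gaussian
    # weight to every bin centre instead of a hard count.
    lo, hi = value_range
    half_bin = (hi - lo) / (2. * nbins)
    centers = tf.linspace(lo + half_bin, hi - half_bin, nbins)
    # Pairwise differences between values and bin centres: (n_values, nbins)
    diffs = tf.reshape(values, [-1, 1]) - tf.reshape(centers, [1, -1])
    weights = tf.exp(-0.5 * tf.square(diffs / sigma))
    return tf.reduce_sum(weights, axis=0)

def histogram_loss(y_true, y_pred):
    h_true = soft_histogram(tf.reshape(y_true, [-1]))
    h_pred = soft_histogram(tf.reshape(y_pred, [-1]))
    return tf.reduce_mean(tf.square(h_true - h_pred))

With a small sigma the soft counts approximate the hard histogram, but the loss is now built from tf.exp and tf.square, which both have gradients, so tape.gradient no longer returns None.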