0
votes

I'm having trouble debugging my custom loss function in keras.

import keras.backend as kb

def custom_loss_wrapper(input_tensor):
    def custom_loss(y_true, y_pred):
        diff = y_pred - y_true
        diff = kb.print_tensor(diff)  # prints every time the loss is evaluated
        print(input_tensor)           # plain Python print: runs only once, at graph-construction time
        return kb.square(diff)
    return custom_loss

model.compile(optimizer='adam', loss=custom_loss_wrapper(model.input))
model.fit(x=training_set,y=target_set,epochs=100)

Output:

Tensor("lstm_29_input:0", shape=(?, 1, 5), dtype=float32)
Epoch 1/100 61550/61550 [==============================] - 13s 217us/step - loss: 0.0049
etc.

In particular, I can't seem to make prints of any kind work while the model is training. I tried tf.Print and Theano's Print, but to no avail. When I use a normal print, it is printed only once (when the function is compiled, I presume). I also tried to access the values of input_tensor (with various methods such as kb.eval, converting to a NumPy array, etc.), but input_tensor appears to be only a placeholder tensor. It contains no value because custom_loss is executed during compilation, I presume. How do I access input_tensor at runtime?

1
If it's TF2 use tf.print (with a non-capital p) - Mohammad Jafar Mashhadi
Already tried it but still not working - elia mantoet
I tried your code in colab and it didn't have any problems. It prints the input tensor in the loss function. Whatever the problem is it's not reproducible with the snippet you provided - Mohammad Jafar Mashhadi

1 Answer

1
votes

You can wrap model.fit in a loop, running one epoch at a time, and print all the required fields after each epoch, like below -

for epoch in range(1, 5):
    model.fit(x, y, batch_size=64, epochs=epoch, initial_epoch=epoch - 1,
              verbose=1, validation_split=0.2, shuffle=True)
    inputs = model.model._feed_inputs + model.model._feed_targets + model.model._feed_sample_weights
    print(model.input)
    print(model.total_loss)
    print(model.trainable_weights)
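As noted in the comments, if you are on TF 2.x the operation to use is tf.print (lowercase): it fires at every evaluation of the loss, whereas Python's built-in print runs only once, when the function is traced. A minimal sketch of such a loss (the tensor values shown are illustrative, not from the question):

    import tensorflow as tf

    def custom_loss(y_true, y_pred):
        diff = y_pred - y_true
        # tf.print executes every time the loss is evaluated, even inside
        # a compiled graph, unlike Python's print which only runs at trace time.
        tf.print("diff:", diff)
        return tf.square(diff)

To inspect actual input batches rather than the symbolic model.input, it is usually easier to pass run_eagerly=True to model.compile (so the loss runs as ordinary Python) or to use a Keras Callback, rather than threading a placeholder into the loss closure.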