I'm having trouble debugging my custom loss function in Keras.
import keras.backend as kb

def custom_loss_wrapper(input_tensor):
    def custom_loss(y_true, y_pred):
        diff = y_pred - y_true
        diff = kb.print_tensor(diff)
        print(input_tensor)
        return kb.square(diff)
    return custom_loss

model.compile(optimizer='adam', loss=custom_loss_wrapper(model.input))
model.fit(x=training_set, y=target_set, epochs=100)
Output:
Tensor("lstm_29_input:0", shape=(?, 1, 5), dtype=float32)
Epoch 1/100 61550/61550 [==============================] - 13s 217us/step - loss: 0.0049
etc.
In particular, I can't seem to make prints of any kind work while the model is training. I tried tf.Print and Theano's Print, but to no avail. A plain Python print gets executed only once, presumably when the loss function is compiled. I also tried to access the values of input_tensor (via various methods such as kb.eval, converting to a NumPy array, etc.), but input_tensor seems to be only a placeholder tensor: it holds no value, again because custom_loss is executed during compilation, I presume. How do I access the values of input_tensor at runtime?
tf.print (with a lowercase p). – Mohammad Jafar Mashhadi
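A minimal sketch of how the comment's suggestion could look in the wrapper pattern above, swapping the backend print for tf.print. Unlike Python's print, which fires only once while the graph is being traced, tf.print is an op that executes every time the loss is evaluated, so values show up during training. (The model/compile wiring is assumed to match the question's setup.)

```python
import tensorflow as tf

def custom_loss_wrapper(input_tensor):
    def custom_loss(y_true, y_pred):
        diff = y_pred - y_true
        # tf.print runs at graph execution time, so these lines emit
        # actual tensor values on every training step.
        tf.print("input:", input_tensor, summarize=3)
        tf.print("diff:", diff, summarize=3)
        return tf.square(diff)
    return custom_loss
```

The returned closure can still be passed to model.compile(loss=custom_loss_wrapper(model.input)) exactly as in the question; only the printing mechanism changes.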