1 vote

I'm trying to add a CRF layer to my functional model, but I get this error, which I cannot solve:

ValueError: ('Could not interpret loss function identifier:', )

The CRF layer comes from the keras-contrib package.

Model:

from keras.models import Model
from keras.layers import Input, Embedding, LSTM, Dense, TimeDistributed, concatenate, add
from keras_contrib.layers import CRF


inputs = Input(shape=(MAX_LENGTH,))

embedding = Embedding(VOCAB_SIZE + 1, EMBEDDING_SIZE, mask_zero=True)(inputs)

left = LSTM(HIDDEN_SIZE, return_sequences=True)(embedding)
right = LSTM(HIDDEN_SIZE, go_backwards=True, return_sequences=True)(embedding)
left_right = concatenate([left, right])

left2 = LSTM(HIDDEN_SIZE, return_sequences=True)(embedding)
right2 = LSTM(HIDDEN_SIZE, go_backwards=True, return_sequences=True)(embedding)
left_right2 = concatenate([left2, right2])

left_right_combi = add([left_right, left_right2])

left_right_combii = TimeDistributed(Dense(NUM_LABELS, activation='softmax'))(left_right_combi)


crf = CRF(NUM_LABELS, sparse_target=True)(left_right_combii)

combined_model = Model(inputs=inputs, outputs=crf)
combined_model.compile(loss=CRF.loss_function, optimizer='adam', metrics=[CRF.accuracy])

If I use a "normal" loss function and metric instead, I get this error:

combined_model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])

"ValueError: An operation has None for gradient. Please make sure that all of your ops have a gradient defined (i.e. are differentiable). Common ops without gradient: K.argmax, K.round, K.eval."

Any ideas on how I can use the CRF layer correctly?

Thank you :)

We cannot run this example; the class CRF is undefined. You need to explain the problem better for anyone to provide an answer. – Dr. Snoopy

@Matias Valdenegro Unfortunately I can't provide the data, so you cannot run the code anyway. If the class CRF is undefined, you have to install it from the keras-contrib package. – L.Berlanda

2 Answers

1 vote

You should import crf_loss and crf_accuracy to use the CRF layer properly.

To wrap it up, it will look like this:

from keras_contrib.losses import crf_loss
from keras_contrib.metrics import crf_accuracy
# ... (rest of the model definition) ...
model.compile(optimizer="adam", loss=crf_loss, metrics=[crf_accuracy])

You can also see the example in the keras-contrib GitHub repository.
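Applied to the model from the question, the full compile step would look like this sketch (reusing the variable names inputs, left_right_combii, and NUM_LABELS from the question):

from keras_contrib.layers import CRF
from keras_contrib.losses import crf_loss
from keras_contrib.metrics import crf_accuracy

# ... build the layers exactly as in the question, up to left_right_combii ...
crf = CRF(NUM_LABELS, sparse_target=True)(left_right_combii)
combined_model = Model(inputs=inputs, outputs=crf)

# pass the loss and metric functions from keras_contrib instead of the
# unbound CRF class attributes
combined_model.compile(optimizer='adam', loss=crf_loss, metrics=[crf_accuracy])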

0 votes

The CRF layer of keras-contrib expects crf_loss when used in learn_mode='join' (the default mode). If you want to use a normal loss function instead, say categorical cross-entropy, you should set learn_mode='marginal' when instantiating the layer.

crf = CRF(NUM_LABELS, learn_mode='marginal')
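A minimal sketch of how that would fit the model from the question (reusing its variable names); in marginal mode the CRF outputs per-timestep probabilities, so a standard loss can be applied:

from keras_contrib.layers import CRF

# marginal mode: the layer outputs per-timestep class probabilities,
# so categorical cross-entropy can be used as the loss
crf = CRF(NUM_LABELS, learn_mode='marginal')(left_right_combii)
combined_model = Model(inputs=inputs, outputs=crf)
combined_model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])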