
My deep RNN model was working about a month ago. I left it as a different project took over. Now, coming back and trying to run training, I get the following error:

Traceback (most recent call last):
  File "/home/matiss/.local/share/JetBrains/Toolbox/apps/PyCharm-P/ch-0/201.7223.92/plugins/python/helpers/pydev/_pydevd_bundle/pydevd_exec2.py", line 3, in Exec
    exec(exp, global_vars, local_vars)
  File "<string>", line 1, in <module>
  File "/home/matiss/Documents/python_work/PycharmProjects/NectCleave/functions.py", line 358, in weighted_model
  File "/usr/local/lib/python3.8/dist-packages/keras/engine/training.py", line 1213, in fit
    self._make_train_function()
  File "/usr/local/lib/python3.8/dist-packages/keras/engine/training.py", line 314, in _make_train_function
    training_updates = self.optimizer.get_updates(
  File "/usr/local/lib/python3.8/dist-packages/keras/legacy/interfaces.py", line 91, in wrapper
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.8/dist-packages/keras/backend/tensorflow_backend.py", line 75, in symbolic_fn_wrapper
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.8/dist-packages/keras/optimizers.py", line 504, in get_updates
    grads = self.get_gradients(loss, params)
  File "/usr/local/lib/python3.8/dist-packages/keras/optimizers.py", line 93, in get_gradients
    raise ValueError('An operation has None for gradient. '
ValueError: An operation has None for gradient. Please make sure that all of your ops have a gradient defined (i.e. are differentiable). Common ops without gradient: K.argmax, K.round, K.eval.

My model architecture:

def make_model(metrics=None, output_bias=None, timesteps=None, features=None):
    from keras import backend as K
    from keras import regularizers
    from keras.models import Sequential
    from keras.layers import Bidirectional, Dense, Dropout, LSTM
    from keras.initializers import Constant
    from keras.losses import BinaryCrossentropy
    from keras.optimizers import Adam

    if output_bias is not None:
        output_bias = Constant(output_bias)
    K.clear_session()
    model = Sequential()

    # First LSTM layer
    model.add(
        Bidirectional(LSTM(units=50, return_sequences=True, recurrent_dropout=0.1),
                      input_shape=(timesteps, features)))
    model.add(Dropout(0.5))

    # Second LSTM layer
    model.add(Bidirectional(LSTM(units=50, return_sequences=True)))
    model.add(Dropout(0.5))

    # Third LSTM layer
    model.add(Bidirectional(LSTM(units=50, return_sequences=True)))
    model.add(Dropout(0.5))

    # Fourth LSTM layer
    model.add(Bidirectional(LSTM(units=50, return_sequences=False)))
    model.add(Dropout(0.5))

    # First dense layer
    model.add(Dense(units=128, kernel_initializer='he_normal', activation='relu'))
    model.add(Dropout(0.5))

    # Output layer
    if output_bias is None:
        model.add(Dense(units=1, activation='sigmoid',
                        kernel_regularizer=regularizers.l2(0.001)))
    else:
        model.add(Dense(units=1, activation='sigmoid',
                        bias_initializer=output_bias,
                        kernel_regularizer=regularizers.l2(0.001)))
    # https://keras.io/api/losses/
    model.compile(optimizer=Adam(lr=1e-3), loss=BinaryCrossentropy(), metrics=metrics)

    return model
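For context, I build and fit the model roughly like this (the shapes, batch size, and data here are just illustrative, not my real dataset):

import numpy as np

# Illustrative: 100 windows of 20 timesteps x 8 features, binary labels
X = np.random.rand(100, 20, 8)
y = np.random.randint(0, 2, size=(100, 1))

model = make_model(metrics=['accuracy'], timesteps=20, features=8)
model.fit(X, y, epochs=2, batch_size=32)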

Please help. Why is this happening?


1 Answer


Okay, so after half a day of googling and checking things I could not find a solution. Then I decided to just set up a new Python virtual environment and install all the required packages, and boom: it works again. I have no idea what the issue was or how it came about, but it works now.
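If anyone wants to diagnose this rather than rebuild the environment, comparing package versions between the broken and the fresh environment would be a reasonable first step, since the traceback shows standalone Keras on the TensorFlow backend, which is sensitive to version mismatches (a minimal check, assuming that setup):

import keras
import tensorflow as tf

# Compare these between the broken and the freshly created environment;
# a standalone Keras / TensorFlow version mismatch can break training.
print('keras:', keras.__version__)
print('tensorflow:', tf.__version__)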

Hope this saves some time for others with the same problem.