How do I add a Keras Dropout layer? Unfortunately, I don't know where exactly I would have to add this layer. I looked at these two links:
- https://keras.io/api/layers/regularization_layers/dropout/
- https://machinelearningmastery.com/dropout-regularization-deep-learning-models-keras/
For example, I've seen this:
model.add(Dense(60, input_dim=60, activation='relu', kernel_constraint=maxnorm(3)))
model.add(Dropout(0.2))
model.add(Dense(30, activation='relu', kernel_constraint=maxnorm(3)))
model.add(Dropout(0.2))
model.add(Dense(1, activation='sigmoid'))
The Dense layers in my model are created in a loop, as I understand it, so I'm not sure where the Dropout layers should go (my guess is shown after the code below).
def get_Model(...):
    # build dense layers for the model
    for i in range(1, len(dense_layers)):
        layer = Dense(dense_layers[i],
                      activity_regularizer=l2(reg_layers[i]),
                      activation='relu',
                      name='layer%d' % i)
        mlp_vector = layer(mlp_vector)

    predict_layer = Concatenate()([mf_cat_latent, mlp_vector])
    result = Dense(1, activation='sigmoid',
                   kernel_initializer='lecun_uniform', name='result')

    model = Model(inputs=[input_user, input_item], outputs=result(predict_layer))
    return model
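Based on the first example, my guess is that the Dropout goes inside the loop, right after each Dense layer is applied to mlp_vector, something like the sketch below (the 0.2 rate and the 'dropout%d' names are just placeholders I picked), but I'm not sure if that is the right spot:

from tensorflow.keras.layers import Dropout  # assuming tf.keras is being used

# build dense layers for the model, with a Dropout after each one (my guess)
for i in range(1, len(dense_layers)):
    layer = Dense(dense_layers[i],
                  activity_regularizer=l2(reg_layers[i]),
                  activation='relu',
                  name='layer%d' % i)
    mlp_vector = layer(mlp_vector)
    # drop 20% of this layer's outputs during training (placeholder rate)
    mlp_vector = Dropout(0.2, name='dropout%d' % i)(mlp_vector)

Is that the correct placement, or should the Dropout go somewhere else, for example just before the final sigmoid layer?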