I understand that dropout does not have the same effect on the individual kernel weights of a convolutional filter as it does on FC layers.
But does the same hold if you drop out a whole filter, i.e., zero out its entire output feature map?
Let's assume a network structure like: Input, Conv2D, Conv2D, ..., Conv2D, Conv2D, Sigmoid, so there is no fully connected layer anywhere in the network.
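For concreteness, here is a minimal sketch of the kind of fully convolutional network I mean (the input shape, filter counts, and kernel sizes are just placeholders, not my actual settings):

```python
from tensorflow import keras
from tensorflow.keras import layers

# Fully convolutional network: stacked Conv2D blocks with a final
# sigmoid activation and no Dense/FC layer anywhere.
model = keras.Sequential([
    keras.Input(shape=(64, 64, 3)),              # placeholder input shape
    layers.Conv2D(32, 3, padding="same", activation="relu"),
    layers.Conv2D(64, 3, padding="same", activation="relu"),
    layers.Conv2D(64, 3, padding="same", activation="relu"),
    layers.Conv2D(1, 1, activation="sigmoid"),   # sigmoid output layer
])
model.summary()
```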
Question 1: Is it reasonable to drop out whole convolutional filters to reduce co-adaptation between filters, with the goal of improving the results of filter visualization?
Question 2: Is there a quick way to drop out whole filters in Keras?
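The closest thing I have found is `SpatialDropout2D`, which, as I understand it, drops entire feature maps instead of individual activations. A quick check that seems to confirm this, assuming channels-last data and TF 2.x:

```python
import numpy as np
from tensorflow.keras import layers

# SpatialDropout2D should zero whole feature maps (one per filter)
# rather than individual activations.
x = np.ones((1, 4, 4, 8), dtype="float32")   # 8 feature maps of size 4x4
drop = layers.SpatialDropout2D(rate=0.5)
y = drop(x, training=True).numpy()

# Each channel is either all zeros (dropped) or uniformly scaled by
# 1 / (1 - rate) = 2.0 (inverted dropout); never partially zeroed.
print(y[0].reshape(-1, 8).max(axis=0))   # e.g. [2. 0. 2. 2. 0. 0. 2. 2.]
print(y[0].reshape(-1, 8).min(axis=0))   # matches the max per channel
```

If that is correct, inserting `layers.SpatialDropout2D(rate)` between the Conv2D layers in the sketch above would seem to be the "quick way" I'm asking about, but I'd like confirmation that this is the intended tool for dropping whole filters.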