1 vote

This seems to be a more theoretical question, and I hope someone knows the answer. I'm using TensorFlow to train a fully connected deep neural network. I apply dropout to my hidden layers, and I'm investigating its effect in some cases.

I know that dropout is only applied to the input and hidden layers, and that for evaluation of the network the keep probability should be 1.0. For the case where I want to train my network without dropout: can I just set the keep probability to 1.0 on the hidden layers during training, or do I have to remove it completely from my source code?

Greetings


1 Answer

2 votes

You can keep your code as is; a keep probability of 1.0 is indeed equal to no dropout, since all activations are kept.
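A minimal sketch of that pattern, assuming TensorFlow 1.x-style graph code and a hypothetical input size of 128; the keep probability is fed in at run time, so the same graph serves dropout training, no-dropout training, and evaluation:

```python
import tensorflow as tf

# Placeholder so the keep probability can differ per session run.
keep_prob = tf.placeholder(tf.float32, name="keep_prob")

x = tf.placeholder(tf.float32, shape=[None, 128], name="features")
hidden = tf.layers.dense(x, 64, activation=tf.nn.relu)

# With keep_prob = 1.0 this op passes the activations through unchanged,
# so it is equivalent to having no dropout layer at all.
hidden_dropped = tf.nn.dropout(hidden, keep_prob=keep_prob)

# feed_dict={keep_prob: 0.5}  -> training with dropout
# feed_dict={keep_prob: 1.0}  -> training without dropout, or evaluation
```

Feeding the value through a placeholder (rather than hard-coding it) is what lets you switch dropout on and off without touching the graph definition.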