
I am working on binary classification. I have created my network like: Conv1, ReLU1, Pool1 - Conv2, ReLU2, Pool2 - Conv3, ReLU3, Pool3 - Conv4, ReLU4 - Conv5, ReLU5, Dropout 0.5 - FC, Dropout 0.5 - SoftmaxLossLayer

All conv layers use 3x3 filters.

The default weight decay is 0.0005, and with it I get: training accuracy 98%, testing accuracy 88%.

[plot: training/testing curves with weight decay 0.0005]

The same network was then trained with weight decay 0.005:

[plot: training/testing curves with weight decay 0.005]

Can anyone explain why the results change like this when the weight decay value is changed?


1 Answer


Weight decay penalizes model complexity, so it is used to trade off the model's variance against its bias. If you penalize complexity too much, the model will not learn anything useful, because it becomes too simple. That is what you are seeing: raising the weight decay from 0.0005 to 0.005 over-regularizes the network.
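To see the mechanism concretely, here is a minimal NumPy sketch (assumed setup, not your MatConvNet code): with L2 weight decay, each SGD step becomes w ← w − lr·(∂L/∂w + wd·w), so the wd·w term continually shrinks the weights toward zero, and a 10× larger wd shrinks them 10× harder per step.

```python
import numpy as np

def sgd_step(w, grad, lr=0.1, wd=0.0005):
    """One SGD step with L2 weight decay folded into the gradient."""
    return w - lr * (grad + wd * w)

# Illustrative weights with zero data gradient, so only decay acts.
w = np.ones(4)
zero_grad = np.zeros(4)

w_small = sgd_step(w, zero_grad, wd=0.0005)  # your default decay
w_large = sgd_step(w, zero_grad, wd=0.005)   # the 10x larger decay

# Stronger decay pulls the weights closer to zero on every step.
print(w_small)  # each entry: 1 - 0.1*0.0005 = 0.99995
print(w_large)  # each entry: 1 - 0.1*0.005  = 0.9995
```

Over many thousands of iterations this compounding shrinkage is what keeps the effective capacity of the network low.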

For other methods of regularizing neural networks, see the notes for Hinton's Coursera course.