I want to build a neural network classifier using the caret package. I have specified a tuning grid with the hyperparameter values I want to test to find the best accuracy.
However, when I run the model, the train function always falls back to the default decay and size values instead of using my grid. Is this a bug within caret, or is there an issue with my code?
Code:
nnet_grid <- expand.grid(.decay = c(0.5, 0.1, 1e-2, 1e-3, 1e-4, 1e-5, 1e-6, 1e-7),
                         .size = c(3, 5, 10, 20))

features.nn <- train(label ~ .,
                     method = "nnet",
                     trControl = trControl,
                     data = features,
                     tunegrid = nnet_grid,
                     verbose = FALSE)
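For completeness, here is how `trControl` is set up. This is a sketch reconstructed from the output below: only `method = "cv"` and `number = 5` are taken from the "Cross-Validated (5 fold)" line; every other setting is left at its default.

```r
library(caret)

# 5-fold cross-validation, matching "Resampling: Cross-Validated (5 fold)"
# in the output; all other trainControl settings are defaults (an assumption).
trControl <- trainControl(method = "cv", number = 5)
```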
Output:
No pre-processing
Resampling: Cross-Validated (5 fold)
Summary of sample sizes: 1680, 1680, 1680, 1680, 1680
Resampling results across tuning parameters:
  size  decay  Accuracy    Kappa
  1     0e+00  0.10904762  0.0645
  1     1e-04  0.10142857  0.0565
  1     1e-01  0.14380952  0.1010
  3     0e+00  0.09571429  0.0505
  3     1e-04  0.05523810  0.0080
  3     1e-01  0.19190476  0.1515
  5     0e+00  0.13000000  0.0865
  5     1e-04  0.14761905  0.1050
  5     1e-01  0.31809524  0.2840
Accuracy was used to select the optimal model using the largest value.
The final values used for the model were size = 5 and decay = 0.1.