I have a data frame with some dummy variables that I want to use as a training set for glmnet. Since I'm using glmnet, I want to center and scale the features via the preProcess option of the caret train function. However, I don't want this transformation to be applied to the dummy variables as well. Is there a way to prevent these variables from being transformed?
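A minimal sketch of the setup I mean (column names and data are made up for illustration): x1 and x2 are continuous, d1 is a dummy variable, and preProcess = c("center", "scale") gets applied to all of them, including d1.

```r
library(caret)
library(glmnet)

set.seed(1)
df <- data.frame(
  x1 = rnorm(100),
  x2 = rnorm(100, mean = 50, sd = 10),
  d1 = rbinom(100, 1, 0.5),   # dummy variable that should stay 0/1
  y  = rnorm(100)
)

## preProcess is applied to every predictor, so the dummy column d1
## is centered and scaled too, which is what I want to avoid.
fit <- train(
  y ~ ., data = df,
  method     = "glmnet",
  preProcess = c("center", "scale"),
  trControl  = trainControl(method = "cv", number = 5)
)
```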
It seems this hasn't been implemented in caret::train and caret::trainControl yet, and the current status is the same as in this question from 2012. So for now a "hacky" workaround is probably the way to go... – geekoverdose
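One possible workaround along those lines (a sketch, not an official caret feature, reusing the made-up df, x1, x2, d1 columns from above): center and scale only the continuous columns with caret::preProcess yourself, then call train() without the preProcess argument so the dummy variables are left untouched.

```r
library(caret)

num_cols <- c("x1", "x2")   # continuous predictors only, dummies excluded

## Fit the centering/scaling on the continuous columns and apply it
pp <- preProcess(df[, num_cols], method = c("center", "scale"))
df_scaled <- df
df_scaled[, num_cols] <- predict(pp, df[, num_cols])

## No preProcess here, so d1 stays 0/1
fit2 <- train(
  y ~ ., data = df_scaled,
  method    = "glmnet",
  trControl = trainControl(method = "cv", number = 5)
)
```

The caveat is that any new data passed to predict() later must be transformed with the same pp object first, since train() no longer handles that step for you.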