7
votes

I have a data frame with some dummy variables that I want to use as training set for glmnet.

Since I'm using glmnet, I want to center and scale the features using the preProcess option in the caret train function. However, I don't want this transformation to be applied to the dummy variables as well.

Is there a way to prevent the transformation of these variables?

1
Good question. We are having the same issue in my group and are trying to avoid hacky solutions. I will keep you updated in case something comes out. – Gianmario Spacagna
AFAIK this is not addressed in caret::train and caret::trainControl yet, and the current status is the same as in this question from 2012. So a "hacky" workaround is the way to go at the moment... – geekoverdose

1 Answer

1
vote

There's not (currently) a way to do this besides writing a custom model to do so (see the example with PLS and RF near the end).

I'm working on a method to specify which variables get which pre-processing method. However, with dummy variables, this is tough since you might need to specify the names of a lot of predictors whose columns are not in the current data set. The idea is to be able to use wildcards (e.g. Species* to capture Speciesversicolor and Speciesvirginica) but the code isn't quite there yet.
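In the meantime, one common workaround (a sketch, not the method described above) is to run preProcess manually on only the continuous columns before calling train, and then call train without a preProcess argument. The data frame and column names below are invented for illustration:

```r
library(caret)

# Hypothetical data: two continuous predictors and one 0/1 dummy variable
set.seed(1)
df <- data.frame(
  x1     = rnorm(100, mean = 10, sd = 3),
  x2     = runif(100, 0, 50),
  dummy1 = rbinom(100, 1, 0.5),
  y      = rnorm(100)
)

# Center and scale only the continuous columns
num_cols <- c("x1", "x2")
pp <- preProcess(df[, num_cols], method = c("center", "scale"))
df[, num_cols] <- predict(pp, df[, num_cols])

# dummy1 is still 0/1; train glmnet with no preProcess argument,
# so caret applies no further transformation
fit <- train(y ~ ., data = df, method = "glmnet")
```

The caveat with this approach is that any resampling in train sees already-scaled data, so the centering/scaling is not re-estimated within each resample; whether that matters depends on how much you care about strictly honest resampling estimates.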

Max