Suppose you have a data frame with a large number of columns (1000 factors, each with 15 levels). You'd like to create a dummy-variable data set, but since it would be very sparse, you would like to keep the dummies in sparse matrix format.
My data set is quite big, and the fewer steps there are, the better for me. I know how to do the two steps described below, but I couldn't get my head around creating that sparse matrix directly from the initial data set, i.e. in one step instead of two. Any ideas?
EDIT: Some comments asked for further elaboration, so here it is:
Here X is my original data set with 1000 columns and 50,000 records, each column having 15 levels.

Step 1: Create dummy variables from the original data set with code like:
# Number of levels per column
l <- 15
# Create an empty matrix to hold the dummy variables
dummified <- matrix(NA, nrow(X), l * ncol(X))
# Fill in a 0/1 indicator for each column and each level within that column
for (i in 1:ncol(X)) {
  colFactr <- factor(X[, i], exclude = NULL)
  for (j in 1:l) {
    lvl <- levels(colFactr)[j]
    indx <- ((i - 1) * l) + j
    dummified[, indx] <- ifelse(colFactr == lvl, 1, 0)
  }
}
Step 2: Convert that huge matrix into a sparse matrix with code like:

library(Matrix)
sparse.dummified <- Matrix(dummified, sparse = TRUE)
But this approach still creates the large intermediate dense matrix, which takes a lot of time and memory, so I am asking for a direct method (if there is one).
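(Sketch of one possible direct construction, not from the original post: build the row/column indices of the non-zero entries straight from the factor codes and pass them to Matrix::sparseMatrix(), so the dense matrix is never materialized. It reuses X and l from the code above and assumes every column really has the same l = 15 levels, so the column mapping (i-1)*l + level matches the loop version.)

library(Matrix)

l <- 15
n <- nrow(X)
# One non-zero entry per (record, original column): row indices repeat once per column block
row.idx <- rep(seq_len(n), times = ncol(X))
# Column index: level j of original column i goes to dummy column (i-1)*l + j
col.idx <- unlist(lapply(seq_len(ncol(X)), function(i) {
  f <- factor(X[, i], exclude = NULL)
  (i - 1) * l + as.integer(f)
}))
# Assemble the sparse 0/1 matrix directly from the triplet indices
sparse.dummified <- sparseMatrix(i = row.idx, j = col.idx, x = 1,
                                 dims = c(n, l * ncol(X)))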
model.matrix(~ -1 + ., data = yourdata). Is this what you want? – user20650

Matrix(model.matrix(~ -1 + ., data = df, contrasts.arg = lapply(df, contrasts, contrasts = FALSE)), sparse = TRUE) – user20650
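(For reference, a minimal runnable version of the comments above; the toy data frame df is made up purely for illustration, and the final sparse.model.matrix() call is not from the comments but is the Matrix package's one-step variant that skips the dense model.matrix() result entirely.)

library(Matrix)

# Toy stand-in for X: a few factor columns with 15 levels each
set.seed(1)
df <- data.frame(
  f1 = factor(sample(letters[1:15], 100, replace = TRUE)),
  f2 = factor(sample(letters[1:15], 100, replace = TRUE)),
  f3 = factor(sample(letters[1:15], 100, replace = TRUE))
)

# Keep all levels for every factor (no reference level dropped)
full.contrasts <- lapply(df, contrasts, contrasts = FALSE)

# Commenter's suggestion: dense model matrix, then coerce to sparse
sparse.dummified <- Matrix(
  model.matrix(~ -1 + ., data = df, contrasts.arg = full.contrasts),
  sparse = TRUE
)

# One-step alternative that never builds the dense matrix
sparse.dummified2 <- sparse.model.matrix(~ -1 + ., data = df,
                                         contrasts.arg = full.contrasts)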