5
votes

I am performing lasso regression in R using the glmnet package:

library(glmnet)
fit.lasso <- glmnet(x, y)
plot(fit.lasso, xvar = "lambda", label = TRUE)

[Plot: fit.lasso coefficient paths against log(lambda)]

Then using cross-validation:

cv.lasso <- cv.glmnet(x, y)
plot(cv.lasso)

[Plot: cross-validated MSE against log(lambda)]

One tutorial (last slide) suggests the following for R^2:

R_Squared =  1 - cv.lasso$cvm/var(y)

But it did not work.

I want to understand the model's efficiency/performance in fitting the data, just as we usually get R^2 and adjusted R^2 from the lm() function in R.


3 Answers

4
votes

If you are using the "gaussian" family, you can access the R-squared value (the fraction of deviance explained) via

fit.lasso$dev.ratio

or, from the cross-validated object, cv.lasso$glmnet.fit$dev.ratio.
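
dev.ratio is a vector with one value per lambda on the regularization path, so you pick the entry for the lambda you care about. A minimal sketch, assuming fit.lasso and cv.lasso are the objects from the question:

# index of the path lambda closest to the cross-validated lambda.min
i.min <- which.min(abs(fit.lasso$lambda - cv.lasso$lambda.min))
fit.lasso$dev.ratio[i.min]   # R^2-like value (fraction of deviance explained) at lambda.min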

3
votes

I will use the example data that ships with glmnet to demonstrate it.

library(glmnet)

Load the data:

data(BinomialExample)
# in recent glmnet versions the example data is a list, so pull out x and y
x <- BinomialExample$x
y <- BinomialExample$y
head(x)
head(y)

For cross-validation:

cvfit <- cv.glmnet(x, y, family = "binomial", type.measure = "class")
rsq <- 1 - cvfit$cvm / var(y)
plot(cvfit$lambda, rsq)

[Plot: rsq against lambda]
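
For the gaussian case in the original question, cvm is the mean cross-validated MSE, so the same formula gives a cross-validated R^2 per lambda. A sketch, assuming x and y are the continuous-response data from the question rather than BinomialExample:

cv.gauss <- cv.glmnet(x, y)             # family = "gaussian" is the default
rsq.cv <- 1 - cv.gauss$cvm / var(y)     # cross-validated R^2 at each lambda
plot(log(cv.gauss$lambda), rsq.cv, xlab = "log(lambda)", ylab = "CV R-squared")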

1
vote

First, fit the lasso model with the selected lambda:

...

lasso.model <- glmnet(x = X, y = Y, family = "binomial", alpha = 1, lambda = cv.model$lambda.min)

Then you can get the pseudo-R^2 from the fitted model:

lasso.model$dev.ratio

This value gives the fraction of the null deviance explained by the model (i.e. 1 - residual deviance / null deviance).
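
Put together, the workflow might look like the sketch below. X, Y, and cv.model follow this answer's names; reusing BinomialExample for the data and the cv.glmnet call for the elided "..." step are assumptions.

library(glmnet)
data(BinomialExample)                  # example data, as in the answer above
X <- BinomialExample$x
Y <- BinomialExample$y
# cross-validate to choose lambda (assumed content of the elided step)
cv.model <- cv.glmnet(x = X, y = Y, family = "binomial", alpha = 1)
# refit at the selected lambda and read off the pseudo-R^2
lasso.model <- glmnet(x = X, y = Y, family = "binomial", alpha = 1, lambda = cv.model$lambda.min)
lasso.model$dev.ratio                  # 1 - residual deviance / null deviance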