I think you forgot to show the code that creates sample_df. Assuming it looks something like the following, you can access the coefficients as shown below:
library(caret)

# Simulated stand-in for sample_df: 100 observations, 10 predictors
# (note: x and y must have the same number of rows)
set.seed(1)
x = matrix(rnorm(1000), ncol = 10)
y = rnorm(100)
sample_df = cbind.data.frame(y, x)
control = trainControl(
  method = "LOOCV",
  allowParallel = TRUE,
  number = nrow(sample_df),  # ignored for LOOCV, but harmless
  verboseIter = FALSE,
  returnData = FALSE
)
my_elasticnet <- train(sample_df[, 2:11], sample_df$y,
                       method = "glmnet",
                       preProc = c("center", "scale"),
                       trControl = control)
my_elasticnet$finalModel$beta
If you just look at names(), you'll see everything the final model exposes:
> names(my_elasticnet$finalModel)
 [1] "a0"          "beta"        "df"          "dim"         "lambda"      "dev.ratio"   "nulldev"
 [8] "npasses"     "jerr"        "offset"      "call"        "nobs"        "lambdaOpt"   "xNames"
[15] "problemType" "tuneValue"   "obsLevels"   "param"
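Besides the final model, the train object itself records how the tuning went. As a sketch (assuming my_elasticnet was fit as above), you can inspect which alpha/lambda pair caret chose and the resampled performance for the whole grid:

```r
# bestTune holds the (alpha, lambda) pair caret selected by resampling
my_elasticnet$bestTune

# results holds the full tuning grid with its cross-validated error metrics
head(my_elasticnet$results)
```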
EDIT: In Response to comment
The coefficients depend on which values of alpha and lambda you select; the glmnet path stored in the final model contains 66 such candidate fits. If you want the one the machine thinks is best, you can do:
coef(my_elasticnet$finalModel, my_elasticnet$bestTune$lambda)
That will give you just the 11x1 coefficient vector (intercept plus ten predictors) you're looking for.
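Note that coef() on a glmnet fit returns a sparse matrix (dgCMatrix), not a plain vector. If that's inconvenient, a small sketch to flatten it into a named numeric vector and keep only the predictors the elastic net retained (again assuming my_elasticnet from above):

```r
# coef() returns a sparse 11x1 matrix: intercept plus one row per predictor
best_coefs <- coef(my_elasticnet$finalModel, my_elasticnet$bestTune$lambda)

# Convert to an ordinary named numeric vector for easier inspection
coef_vec <- setNames(as.numeric(best_coefs), rownames(best_coefs))

# Drop the coefficients the penalty shrank exactly to zero
nonzero <- coef_vec[coef_vec != 0]
nonzero
```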