
I am new to R and trying to do hyperparameter tuning for xgboost (binary classification), but I am getting the error below. I would appreciate it if someone could help me.

Error in as.matrix(cv.res)[, 3] : subscript out of bounds
In addition: Warning message: 'early.stop.round' is deprecated. Use 'early_stopping_rounds' instead. See help("Deprecated") and help("xgboost-deprecated").

Please find the code snippet below. I would also appreciate it if someone could suggest an alternative approach in R.

library(Matrix)   # provides the dgCMatrix sparse matrix class
library(xgboost)

X_Train <- as(X_train, "dgCMatrix")


GS_LogLoss = data.frame("Rounds" = numeric(), 
                        "Depth" = numeric(),
                        "r_sample" = numeric(),
                        "c_sample" = numeric(), 
                        "minLogLoss" = numeric(),
                        "best_round" = numeric())

for (rounds in seq(50,100, 25)) {
  
  for (depth in c(4, 6, 8, 10)) {
    
    for (r_sample in c(0.5, 0.75, 1)) {
      
      for (c_sample in c(0.4, 0.6, 0.8, 1)) {
        
        for (imb_scale_pos_weight in c(5, 10, 15, 20, 25))	{
          
          for (wt_gamma in c(5, 7, 10)) {
            
            for (wt_max_delta_step in c(5,7,10)) {
              
              for (wt_min_child_weight in c(5,7,10,15))	{
                
                
                set.seed(1024)
                eta_val = 2 / rounds
                cv.res = xgb.cv(data = X_Train, nfold = 2, label = y_train, 
                                nrounds = rounds, 
                                eta = eta_val, 
                                max_depth = depth,
                                subsample = r_sample, 
                                colsample_bytree = c_sample,
                                early.stop.round = 0.5*rounds,
                                scale_pos_weight= imb_scale_pos_weight,
                                max_delta_step = wt_max_delta_step,
                                gamma = wt_gamma,
                                objective='binary:logistic', 
                                eval_metric = 'auc',
                                verbose = FALSE)
                
                print(paste(rounds, depth, r_sample, c_sample, min(as.matrix(cv.res)[,3]) ))
                GS_LogLoss[nrow(GS_LogLoss)+1, ] = c(rounds, 
                                                     depth, 
                                                     r_sample, 
                                                     c_sample, 
                                                     min(as.matrix(cv.res)[,3]), 
                                                     which.min(as.matrix(cv.res)[,3]))
                
              }
            }
          }
        }	
      }
    }
  }	
}


Are you sure the output from xgb.cv is a matrix? I would suggest solving the issue for one parameter first; you can extend the test to the others once that works. – StupidWolf
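As the comment hints, the "subscript out of bounds" error comes from treating the return value of xgb.cv() as a matrix. In recent versions of the xgboost R package it returns a list-like object whose per-round metrics live in the evaluation_log data.table, so a hedged sketch of reading the cross-validated AUC (column names assume eval_metric = 'auc' with nfold cross-validation, as in the question) would be:

# xgb.cv() does not return a matrix, so as.matrix(cv.res)[, 3] fails.
# The per-round metrics are stored in cv.res$evaluation_log instead.
str(cv.res, max.level = 1)                       # inspect what xgb.cv actually returns
eval_log   <- cv.res$evaluation_log              # data.table of per-round metrics
best_auc   <- max(eval_log$test_auc_mean)        # AUC: higher is better
best_round <- which.max(eval_log$test_auc_mean)
# Note: 'early.stop.round' is deprecated; pass an integer to early_stopping_rounds.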

1 Answer


To do your hyperparameter selection, you could use the tidymodels metapackage, especially the packages parsnip, rsample, yardstick and tune.

A workflow like this would work:

library(tidyverse)
library(tidymodels)

# Specify the model and the parameters to tune (parnsip)
model <-
  boost_tree(tree_depth = tune(), mtry = tune()) %>% 
  set_mode("classification") %>% 
  set_engine("xgboost")

# Specify the resampling method (rsample)
splits <- vfold_cv(X_train, v = 2)  # X_train is assumed to also contain the outcome column Y

# Specify the metrics to optimize (yardstick)
metrics <- metric_set(roc_auc)

# Specify the parameters grid (or you can use dials to automate your grid search)
grid <- expand_grid(tree_depth = c(4, 6, 8, 10),
                    mtry = c(2, 10, 50)) # You can add others
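
# (Hedged alternative: dials can build the grid for you. tree_depth() and mtry()
#  are dials parameter objects; mtry() must be finalized with the predictor data
#  so it knows its upper bound. Uncomment to use instead of expand_grid() above.)
# grid <- grid_regular(tree_depth(range = c(4L, 10L)),
#                      finalize(mtry(), select(X_train, -Y)),
#                      levels = 3)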

# Run each model (tune)
# Note: in recent versions of tune, the model spec comes first and the formula second
tuned <- tune_grid(model,
                   Y ~ .,
                   resamples = splits,
                   grid = grid,
                   metrics = metrics,
                   control = control_grid(verbose = TRUE))

# Check results
show_best(tuned)
autoplot(tuned)
select_best(tuned)
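# collect_metrics() returns the full table of results for every grid point
collect_metrics(tuned)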

# Update model
tuned_model <- 
  model %>% 
  finalize_model(select_best(tuned)) %>% 
  fit(Y ~ ., data = X_train)

# Make prediction 
predict(tuned_model, X_train)
predict(tuned_model, X_test)
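
To score the tuned model on held-out data, a minimal sketch with yardstick (this assumes X_test also contains the outcome column Y as a factor):

# Parsnip returns class predictions in a .pred_class column
test_preds <- predict(tuned_model, X_test) %>% 
  bind_cols(X_test %>% select(Y))

accuracy(test_preds, truth = Y, estimate = .pred_class)

# For ROC AUC, predict probabilities instead:
# predict(tuned_model, X_test, type = "prob") returns one .pred_<level> column per class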

Please note that the parameter names used in the model specification may differ from the original names in xgboost, because parsnip is a unified interface with consistent names across several models; see the parsnip documentation for details.
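
If you want to check exactly which xgboost arguments these names map to, translate() prints the call template parsnip will generate for the engine, for example:

# Show how tree_depth and mtry are passed down to the xgboost engine
boost_tree(tree_depth = tune(), mtry = tune()) %>% 
  set_mode("classification") %>% 
  set_engine("xgboost") %>% 
  translate()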