2
votes

For the function below, I am not getting the number of estimators as the output; instead I get the following type error.

cv() got an unexpected keyword argument 'show_progress'

Even though the documentation mentions the flag, I am getting the type error. I am following this blog for parameter tuning. Could anyone point out where I am going wrong? Is there any other way to get the number of estimators as the output?

def modelfit(alg, dtrain, predictors, useTrainCV=True, cv_folds=5, early_stopping_rounds=50):
    if useTrainCV:
        xgb_param = alg.get_xgb_params()
        xgtrain = xgb.DMatrix(dtrain[predictors].values, label=dtrain[target].values, silent=False)
        cvresult = xgb.cv(xgb_param, xgtrain, num_boost_round=alg.get_params()['n_estimators'],
                          nfold=cv_folds, metrics='auc',
                          early_stopping_rounds=early_stopping_rounds, show_progress=True)
        alg.set_params(n_estimators=cvresult.shape[0])

    # Fit the algorithm on the data
    alg.fit(dtrain[predictors], dtrain[target], eval_metric='auc')

    # Predict training set:
    dtrain_predictions = alg.predict(dtrain[predictors])
    dtrain_predprob = alg.predict_proba(dtrain[predictors])[:, 1]

    # Print model report:
    print "\nModel Report"
    print "Accuracy : %.4g" % metrics.accuracy_score(dtrain[target].values, dtrain_predictions)
    print "AUC Score (Train): %f" % metrics.roc_auc_score(dtrain[target], dtrain_predprob)

    feat_imp = pd.Series(alg.booster().get_fscore()).sort_values(ascending=False)
    feat_imp.plot(kind='bar', title='Feature Importances')
    plt.ylabel('Feature Importance Score')
Which version of xgboost do you have? For which version was that tutorial written? Why are you putting ** in front of show_progress? It's not there in that tutorial. – Vivek Kumar
@VivekKumar I removed the **; I only put it there to highlight that argument. Could you guide me on how to check the version? – Niranjan Agnihotri
pip freeze | grep xgboost will show you the version you are using. I was able to reproduce your error, and most likely it is because you are using 1.0 or above. The tutorial is fairly old, so it may be using older releases. – Carlo Mazzaferro
If the above doesn't work, then in a Python terminal: import xgboost, then print xgboost.__version__ – Vivek Kumar
With pip freeze | grep xgboost I get 0.6. – Niranjan Agnihotri
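
For reference, a minimal sketch of both version checks suggested in the comments:

# From a shell:
#   pip freeze | grep xgboost
# Or from a Python 2 prompt (matching the question's code style):
import xgboost
print xgboost.__version__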

2 Answers

7
votes

The latest version (0.6) of xgboost has these options for xgb.cv:

xgboost.cv(params, dtrain, num_boost_round=10, nfold=3,
           stratified=False, folds=None, metrics=(), obj=None, feval=None,
           maximize=False, early_stopping_rounds=None, fpreproc=None,
           as_pandas=True, verbose_eval=None, show_stdv=True, seed=0,
           callbacks=None, shuffle=True)

show_progress has been deprecated in favor of verbose_eval. See here
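
So the cv call from the question should work once show_progress is swapped for verbose_eval. A sketch using the variables from the question's modelfit (verbose_eval=True prints the evaluation results as the rounds run):

cvresult = xgb.cv(xgb_param, xgtrain,
                  num_boost_round=alg.get_params()['n_estimators'],
                  nfold=cv_folds, metrics='auc',
                  early_stopping_rounds=early_stopping_rounds,
                  verbose_eval=True)  # replaces the removed show_progress flag
# xgb.cv returns a DataFrame (as_pandas=True by default) with one row per
# boosting round actually run, so its row count is the number of estimators
# chosen by early stopping
alg.set_params(n_estimators=cvresult.shape[0])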

-1
votes

Remove show_progress=True and it should run.
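
That is, the cv call in the question would become something like this (a sketch using the same variables as in modelfit; progress simply won't be printed):

cvresult = xgb.cv(xgb_param, xgtrain,
                  num_boost_round=alg.get_params()['n_estimators'],
                  nfold=cv_folds, metrics='auc',
                  early_stopping_rounds=early_stopping_rounds)
# cvresult.shape[0] still gives the number of estimators found by early stopping
alg.set_params(n_estimators=cvresult.shape[0])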