For the function below, I am not getting the number of estimators as the output; instead I get the following TypeError:

cv() got an unexpected keyword argument 'show_progress'

Even though the documentation contains this flag, I still get the type error. I am following this blog for parameter tuning. Could anyone point out where I am going wrong? Is there any other way to get the number of estimators as the output?
def modelfit(alg, dtrain, predictors, useTrainCV=True, cv_folds=5, early_stopping_rounds=50):
    if useTrainCV:
        xgb_param = alg.get_xgb_params()
        xgtrain = xgb.DMatrix(dtrain[predictors].values, label=dtrain[target].values, silent=False)
        cvresult = xgb.cv(xgb_param, xgtrain, num_boost_round=alg.get_params()['n_estimators'],
                          nfold=cv_folds, metrics='auc',
                          early_stopping_rounds=early_stopping_rounds, show_progress=True)
        alg.set_params(n_estimators=cvresult.shape[0])

    # Fit the algorithm on the data
    alg.fit(dtrain[predictors], dtrain[target], eval_metric='auc')

    # Predict training set:
    dtrain_predictions = alg.predict(dtrain[predictors])
    dtrain_predprob = alg.predict_proba(dtrain[predictors])[:, 1]

    # Print model report:
    print "\nModel Report"
    print "Accuracy : %.4g" % metrics.accuracy_score(dtrain[target].values, dtrain_predictions)
    print "AUC Score (Train): %f" % metrics.roc_auc_score(dtrain[target], dtrain_predprob)

    feat_imp = pd.Series(alg.booster().get_fscore()).sort_values(ascending=False)
    feat_imp.plot(kind='bar', title='Feature Importances')
    plt.ylabel('Feature Importance Score')
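One way to keep this call working across xgboost releases is to try the old keyword and fall back to the newer one: older releases accepted `show_progress`, while later ones renamed it `verbose_eval`. The wrapper below is a minimal sketch of that idea; the helper name `cv_with_progress` is my own, not part of the xgboost API.

```python
def cv_with_progress(cv_func, *args, **kwargs):
    # Older xgboost releases printed per-round progress via
    # `show_progress=True`; newer ones renamed it `verbose_eval`.
    # Try the old name first and fall back if the installed
    # version rejects it with a TypeError.
    try:
        return cv_func(*args, show_progress=True, **kwargs)
    except TypeError:
        return cv_func(*args, verbose_eval=True, **kwargs)
```

With this in place, `cvresult = cv_with_progress(xgb.cv, xgb_param, xgtrain, num_boost_round=..., nfold=cv_folds, ...)` would work regardless of which keyword the installed version expects. Note the try/except will also mask a genuine TypeError raised for other reasons, so it is a convenience shim rather than a robust solution.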
Comments:

The `**` in front of `show_progress` — it's not there in that tutorial. – Vivek Kumar

`pip freeze | grep xgboost` will show you the version you are using. I was able to reproduce your error, and most likely it is because you are on a newer release. The tutorial is fairly old, so it may be using older releases. – Carlo Mazzaferro

`import xgboost`, then `print xgboost.__version__` – Vivek Kumar

I am getting 0.6 – Niranjan Agnihotri
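Independently of which progress keyword the installed version accepts, the number of estimators itself comes from the shape of the DataFrame that `xgb.cv` returns: with `early_stopping_rounds` set, the result is truncated at the best iteration, so its row count is the number of boosting rounds to keep. A mock DataFrame below stands in for a real `xgb.cv` result just to illustrate the shape-based lookup (the column names follow xgboost's `<metric>-mean`/`<metric>-std` convention):

```python
import pandas as pd

# Stand-in for the DataFrame xgb.cv returns: one row per boosting
# round, truncated at the best iteration when early stopping fires.
cvresult = pd.DataFrame({
    'test-auc-mean': [0.71, 0.78, 0.82, 0.83],
    'test-auc-std':  [0.02, 0.02, 0.01, 0.01],
})

# The chosen number of estimators is simply the number of rows.
n_estimators = cvresult.shape[0]
print(n_estimators)  # -> 4
```

This is exactly what `alg.set_params(n_estimators=cvresult.shape[0])` does in the function above, so once the `show_progress` keyword issue is resolved, the estimator count falls out of the cv result with no extra flags needed.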