I'm building a grid search over multiple classifiers and want to use recursive feature elimination with cross-validation. I started from the code in "Recursive feature elimination and grid search using scikit-learn". Below is my working code:
from sklearn.feature_selection import RFECV
from sklearn.linear_model import LogisticRegression
from sklearn import grid_search  # moved to sklearn.model_selection in 0.18

# One grid entry per single setting; everything not named stays at its default.
param_grid = [{'C': 0.001}, {'C': 0.01}, {'C': 0.1}, {'C': 1.0}, {'C': 10.0},
              {'C': 100.0}, {'fit_intercept': True}, {'fit_intercept': False},
              {'penalty': 'l1'}, {'penalty': 'l2'}]

estimator = LogisticRegression()
selector = RFECV(estimator, step=1, cv=5, scoring="roc_auc")
# "estimator_params" forwards each candidate dict to the wrapped LogisticRegression.
clf = grid_search.GridSearchCV(selector, {"estimator_params": param_grid},
                               cv=5, n_jobs=-1)
clf.fit(X, y)  # X, y: my feature matrix and labels (stand-in below)

print(clf.best_estimator_.estimator_)
print(clf.best_estimator_.ranking_)
print(clf.best_estimator_.score(X, y))
This runs, but it raises a DeprecationWarning because the estimator_params parameter is being removed in 0.18, and I'm trying to figure out the correct replacement syntax for the GridSearchCV call.
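For reference, the snippets here should run end to end against a synthetic stand-in for my data, along these lines (make_classification is just a placeholder for my real X and y):

from sklearn.datasets import make_classification

# Synthetic binary classification data so the snippets are self-contained.
X, y = make_classification(n_samples=500, n_features=25,
                           n_informative=5, random_state=0)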
Trying...
param_grid = [{'C': 0.001}, {'C': 0.01}, {'C': 0.1}, {'C': 1.0}, {'C': 10.0},
              {'C': 100.0}, {'fit_intercept': True}, {'fit_intercept': False},
              {'penalty': 'l1'}, {'penalty': 'l2'}]
clf = grid_search.GridSearchCV(selector, param_grid,
                               cv=5, n_jobs=-1)
This returns ValueError: Parameter values should be a list.
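If I'm reading that right, GridSearchCV wants every value in a grid dict to be a list of candidates, even a list of one, so the list-of-dicts form would presumably have to look like the sketch below (this fixes the shape complaint, though the names are still addressed to RFECV rather than to the wrapped LogisticRegression, which is exactly the error the next attempt runs into):

# Sketch: the same ten one-setting grids, each value wrapped in a list.
param_grid = [{'C': [0.001]}, {'C': [0.01]}, {'C': [0.1]}, {'C': [1.0]},
              {'C': [10.0]}, {'C': [100.0]},
              {'fit_intercept': [True]}, {'fit_intercept': [False]},
              {'penalty': ['l1']}, {'penalty': ['l2']}]

Next, trying a single dict of lists...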
param_grid = {"penalty": ["l1","l2"],
"C": [.001,.01,.1,1,10,100],
"fit_intercept": [True, False]}
clf = grid_search.GridSearchCV(selector, param_grid,
cv=5, n_jobs=-1)
This returns ValueError: Invalid parameter penalty for estimator RFECV. Check the list of available parameters with estimator.get_params().keys(). Checking the keys shows that C, fit_intercept and penalty are all there, but nested under an estimator__ prefix.
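This is how I listed them (the exact key set may differ by version):

# The grid-searchable names on the RFECV wrapper: its own parameters plus
# the wrapped LogisticRegression's, each prefixed with "estimator__".
print(sorted(selector.get_params().keys()))
# e.g. ['cv', 'estimator', 'estimator__C', 'estimator__fit_intercept',
#       'estimator__penalty', ..., 'scoring', 'step']

So, trying the double-underscore names...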
param_grid = {"estimator__C": [.001,.01,.1,1,10,100],
"estimator__fit_intercept": [True, False],
"estimator__penalty": ["l1","l2"]}
clf = grid_search.GridSearchCV(selector, param_grid,
cv=5, n_jobs=-1)
This one never completes execution, so I'm guessing that type of parameter assignment is not supported, though it may simply be slow: the search now fits 24 parameter combinations across 5 outer folds, and every one of those fits runs RFECV's own 5-fold elimination loop.
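To tell slow apart from stuck, a scaled-down run with verbose output should show whether fits are progressing (same double-underscore syntax, a single candidate, single-threaded so the progress messages are visible):

# Scaled-down sanity check: one candidate value, verbose progress output.
small_grid = {"estimator__C": [1.0]}
check = grid_search.GridSearchCV(selector, small_grid,
                                 cv=5, n_jobs=1, verbose=3)
check.fit(X, y)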
For now I'm set up to ignore the warnings, but I'd like to update the code with the appropriate syntax for 0.18. Any assistance would be appreciated!
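For completeness, the warning suppression I'm using in the meantime is roughly this (the standard library warnings filter, nothing scikit-learn specific):

# Temporary: silence the estimator_params deprecation notice.
import warnings
warnings.filterwarnings("ignore", category=DeprecationWarning)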