Question
I am using recursive feature elimination with cross-validation (RFECV) as the feature selection step inside GridSearchCV.
My code is as follows:
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import RFECV
from sklearn.model_selection import train_test_split, StratifiedKFold, GridSearchCV

X = df[my_features_all]
y = df['gold_standard']
x_train, x_test, y_train, y_test = train_test_split(X, y, random_state=0)
k_fold = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
clf = RandomForestClassifier(random_state=42, class_weight="balanced")
rfecv = RFECV(estimator=clf, step=1, cv=k_fold, scoring='roc_auc')
param_grid = {'estimator__n_estimators': [200, 500],
'estimator__max_features': ['auto', 'sqrt', 'log2'],
'estimator__max_depth' : [3,4,5]
}
CV_rfc = GridSearchCV(estimator=rfecv, param_grid=param_grid, cv=k_fold, scoring='roc_auc', verbose=10, n_jobs=5)
CV_rfc.fit(x_train, y_train)
print("Finished feature selection and parameter tuning")
Now I want to get the optimal number of features and the selected features from the above code. For that, I ran the code below:
#feature selection results
print("Optimal number of features : %d" % rfecv.n_features_)
features=list(X.columns[rfecv.support_])
print(features)
However, I got the following error:
AttributeError: 'RFECV' object has no attribute 'n_features_'
Is there any other way of getting these details?
I am happy to provide more details if needed.
Answer 1:
The rfecv object that you passed to GridSearchCV is never fitted by it. GridSearchCV first clones the estimator, then fits those clones to the data and evaluates them for every combination of hyperparameters; your original rfecv is left untouched, which is why it has no n_features_ attribute.
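You can see this cloning behavior directly with sklearn.base.clone, which is what GridSearchCV uses internally (the tiny estimator here is just an illustration):

```python
# Sketch: clone() copies hyperparameters but discards any fitted state,
# so the object you passed in never gains fitted attributes.
from sklearn.base import clone
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import RFECV

clf = RandomForestClassifier(random_state=42)
rfecv = RFECV(estimator=clf, step=1)

rfecv_clone = clone(rfecv)

print(rfecv_clone is rfecv)           # False: a distinct, unfitted copy
print(hasattr(rfecv, 'n_features_'))  # False: the original stays unfitted
```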
So to access the selected features, you need the best_estimator_ attribute of the fitted GridSearchCV, which holds the best-scoring fitted clone:
CV_rfc.fit(x_train, y_train)
print("Finished feature selection and parameter tuning")
print("Optimal number of features : %d" % CV_rfc.best_estimator_.n_features_)
features=list(X.columns[CV_rfc.best_estimator_.support_])
print(features)
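Here is the fix as a self-contained end-to-end sketch, with make_classification standing in for your dataframe (the synthetic data, the 3-fold CV, and the reduced parameter grid are assumptions to keep it fast):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import RFECV
from sklearn.model_selection import GridSearchCV, StratifiedKFold

# Synthetic stand-in for the real data: 8 features, 3 of them informative.
X, y = make_classification(n_samples=200, n_features=8,
                           n_informative=3, random_state=0)

k_fold = StratifiedKFold(n_splits=3, shuffle=True, random_state=0)
clf = RandomForestClassifier(random_state=42, class_weight="balanced")
rfecv = RFECV(estimator=clf, step=1, cv=k_fold, scoring='roc_auc')

# Small grid so the example finishes quickly.
param_grid = {'estimator__n_estimators': [10, 20]}
CV_rfc = GridSearchCV(estimator=rfecv, param_grid=param_grid,
                      cv=k_fold, scoring='roc_auc')
CV_rfc.fit(X, y)

# best_estimator_ is the fitted RFECV clone, so its attributes exist.
best_rfecv = CV_rfc.best_estimator_
print("Optimal number of features : %d" % best_rfecv.n_features_)
print("Selected feature indices:", list(best_rfecv.get_support(indices=True)))
```

With a dataframe, replace the index lookup with list(X.columns[best_rfecv.support_]) to recover the column names, as in the answer above.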
Source: https://stackoverflow.com/questions/55650782/how-to-get-the-selected-features-in-gridsearchcv-in-sklearn-in-python