Question
I would like to use AdaBoostClassifier with LinearSVC as the base estimator. I want to run a grid search on some of the parameters of LinearSVC, and I also have to scale my features.
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import GridSearchCV, StratifiedKFold
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import RobustScaler
from sklearn.svm import LinearSVC

p_grid = {'base_estimator__C': np.logspace(-5, 3, 10)}
n_splits = 5
inner_cv = StratifiedKFold(n_splits=n_splits, shuffle=True, random_state=5)
SVC_Kernel = LinearSVC(multi_class='crammer_singer', tol=10e-3,
                       max_iter=10000, class_weight='balanced')
ABC = AdaBoostClassifier(base_estimator=SVC_Kernel, n_estimators=600,
                         learning_rate=1.5, algorithm="SAMME")

# kk is the outer CV splitter; input and target hold the features and labels (defined elsewhere).
for train_index, test_index in kk.split(input):
    X_train, X_test = input[train_index], input[test_index]
    y_train, y_test = target[train_index], target[test_index]
    pipe_SVC = Pipeline([('scaler', RobustScaler()),
                         ('AdaBoostClassifier', ABC)])
    clfSearch = GridSearchCV(estimator=pipe_SVC, param_grid=p_grid,
                             cv=inner_cv, scoring='f1_macro', iid=False, n_jobs=-1)
    clfSearch.fit(X_train, y_train)
The following error occurs:
ValueError: Invalid parameter base_estimator for estimator Pipeline(memory=None,
steps=[('scaler',
RobustScaler(copy=True, quantile_range=(25.0, 75.0),
with_centering=True, with_scaling=True)),
('AdaBoostClassifier',
AdaBoostClassifier(algorithm='SAMME',
base_estimator=LinearSVC(C=1.0,
class_weight='balanced',
dual=True,
fit_intercept=True,
intercept_scaling=1,
loss='squared_hinge',
max_iter=10000,
multi_class='crammer_singer',
penalty='l2',
random_state=None,
tol=0.01,
verbose=0),
learning_rate=1.5, n_estimators=600,
random_state=None))],
verbose=False). Check the list of available parameters with `estimator.get_params().keys()`.
Without the AdaBoostClassifier the pipeline works, so I think the problem lies there.
Answer 1:
I think your p_grid should be defined as follows:

p_grid = {'AdaBoostClassifier__base_estimator__C': np.logspace(-5, 3, 10)}

Inside a Pipeline, every parameter name has to be prefixed with the name of the step it belongs to, so the C of the base estimator is reached through the 'AdaBoostClassifier' step. Try pipe_SVC.get_params().keys() if you are not sure about the exact name of a parameter.
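Below is a minimal runnable sketch of the corrected setup. It uses make_classification as a stand-in for the asker's input/target arrays (the real data and the outer kk splitter are not shown in the question), fewer estimators than the original 600 just to keep it quick, and it assumes a scikit-learn version that still accepts AdaBoostClassifier's base_estimator argument (renamed to estimator in 1.2); the iid argument is left out because it was removed from GridSearchCV in later releases.

import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import GridSearchCV, StratifiedKFold
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import RobustScaler
from sklearn.svm import LinearSVC

# Synthetic stand-in data (assumption: the real input/target come from elsewhere).
X, y = make_classification(n_samples=300, n_features=10, random_state=0)

svc = LinearSVC(multi_class='crammer_singer', tol=10e-3,
                max_iter=10000, class_weight='balanced')
abc = AdaBoostClassifier(base_estimator=svc, n_estimators=50,  # 600 in the question
                         learning_rate=1.5, algorithm="SAMME")
pipe_SVC = Pipeline([('scaler', RobustScaler()),
                     ('AdaBoostClassifier', abc)])

# print(sorted(pipe_SVC.get_params().keys()))  # lists every valid grid key
# The grid key follows the pattern <step name>__<nested estimator>__<parameter>.
p_grid = {'AdaBoostClassifier__base_estimator__C': np.logspace(-5, 3, 10)}

inner_cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=5)
clfSearch = GridSearchCV(estimator=pipe_SVC, param_grid=p_grid,
                         cv=inner_cv, scoring='f1_macro', n_jobs=-1)
clfSearch.fit(X, y)
print(clfSearch.best_params_)

The only change relative to the question's grid is the 'AdaBoostClassifier__' prefix on the parameter key.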
Source: https://stackoverflow.com/questions/58540137/adaboost-in-pipeline-with-gridsearch-sklearn