AdaBoostClassifier with different base learners

Submitted by ↘锁芯ラ on 2019-12-17 23:18:13

Question


I am trying to use AdaBoostClassifier with a base learner other than DecisionTree. I have tried SVM and KNeighborsClassifier, but I get errors. Can someone point out which classifiers can be used with AdaBoostClassifier?


Answer 1:


OK, there is a systematic way to find all of the base learners supported by AdaBoostClassifier: a compatible base learner's fit method must accept sample_weight. The list can be obtained by running the following code:

import inspect
from sklearn.utils import all_estimators  # older scikit-learn versions: from sklearn.utils.testing import all_estimators

for name, clf in all_estimators(type_filter='classifier'):
    # keep only classifiers whose fit() accepts per-sample weights
    if 'sample_weight' in inspect.signature(clf.fit).parameters:
        print(name)

This results in the following output (with the scikit-learn version current at the time of the answer; newer versions list additional classifiers): AdaBoostClassifier, BernoulliNB, DecisionTreeClassifier, ExtraTreeClassifier, ExtraTreesClassifier, MultinomialNB, NuSVC, Perceptron, RandomForestClassifier, RidgeClassifierCV, SGDClassifier, SVC.

If the classifier doesn't implement predict_proba, you will have to set the AdaBoostClassifier parameter algorithm='SAMME'.
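
As a minimal sketch of that point (the dataset, kernel, and n_estimators here are illustrative, not from the original answer; the estimator parameter is named base_estimator on scikit-learn versions before 1.2), SVC can be plugged in like this:

from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.svm import SVC

X, y = make_classification(n_samples=200, random_state=0)

# SVC has no predict_proba by default, so the discrete SAMME algorithm is required
base = SVC(kernel='linear')
clf = AdaBoostClassifier(estimator=base,      # base_estimator= on scikit-learn < 1.2
                         algorithm='SAMME',
                         n_estimators=10)
clf.fit(X, y)
print(clf.score(X, y))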

Thanks to Andreas for showing how to list all estimators.




Answer 2:


You should not use SVM with AdaBoost. AdaBoost is meant to be used with weak classifiers; using a strong classifier such as SVM can result in overfitting.
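
For reference, a minimal sketch of the conventional weak-learner setup is a depth-1 decision stump, which is also what AdaBoostClassifier defaults to; the dataset and parameters below are illustrative, and the estimator parameter is named base_estimator on scikit-learn versions before 1.2:

from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=200, random_state=0)

# A depth-1 decision tree ("stump") is the classic weak learner for AdaBoost
stump = DecisionTreeClassifier(max_depth=1)
clf = AdaBoostClassifier(estimator=stump, n_estimators=50)  # base_estimator= on scikit-learn < 1.2
clf.fit(X, y)
print(clf.score(X, y))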




Answer 3:


Any classifier that supports passing sample weights should work. SVC is one such classifier. What specific error message (and traceback) do you get? Can you provide a minimal reproduction case for this error (e.g. as a gist at http://gist.github.com)?
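
For context, a minimal reproduction along those lines might look like the following sketch (dataset and parameters are illustrative). KNeighborsClassifier.fit does not accept sample_weight, so AdaBoostClassifier typically raises a ValueError at fit time; the exact message varies by scikit-learn version:

from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.neighbors import KNeighborsClassifier

X, y = make_classification(n_samples=100, random_state=0)

# KNeighborsClassifier.fit has no sample_weight parameter, so boosting cannot reweight samples
clf = AdaBoostClassifier(estimator=KNeighborsClassifier(),  # base_estimator= on scikit-learn < 1.2
                         algorithm='SAMME')
try:
    clf.fit(X, y)
except ValueError as exc:
    # typically something like "KNeighborsClassifier doesn't support sample_weight."
    print(exc)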



Source: https://stackoverflow.com/questions/18306416/adaboostclassifier-with-different-base-learners
