Ensemble Learning
The three main ensemble families are Bagging, Boosting, and Stacking. Paired with decision trees as the base learner, they give the most common ensemble models:

(1) Bagging + decision trees = Random Forest
(2) AdaBoost + decision trees = boosted trees
(3) Gradient Boosting + decision trees = GBDT

(Minimal code sketches for each family are collected at the end of this note.)

1. Boosting

1) AdaBoost
https://www.cnblogs.com/willnote/p/6801496.html

2) Gradient Boosting
https://www.cnblogs.com/massquantity/p/9174746.html

2. Bagging (bootstrap aggregating)

https://www.cnblogs.com/zongfa/p/9304353.html

Unlike AdaBoost, Bagging does not assign weights to samples or base learners: each base learner is trained on an independent bootstrap sample, and the predictions are combined by voting or averaging.

3. Stacking

https://blog.csdn.net/maqunfi/article/details/82220115
https://www.kaggle.com/arthurtok/introduction-to-ensembling-stacking-in-python (Introduction to Ensembling/Stacking in Python)
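As a quick illustration of the two Boosting variants in section 1, here is a minimal sketch. It assumes scikit-learn and a synthetic dataset from make_classification; neither is part of the linked posts, which describe the algorithms themselves.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, GradientBoostingClassifier
from sklearn.model_selection import train_test_split

# Toy binary-classification data standing in for a real dataset.
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# AdaBoost: after each round the misclassified samples get larger weights,
# so later trees focus on the hard cases. The default base learner is a
# depth-1 decision tree (a "stump"), i.e. AdaBoost + decision tree = boosted trees.
ada = AdaBoostClassifier(n_estimators=100, random_state=0)
ada.fit(X_train, y_train)

# Gradient Boosting: each new tree is fit to the negative gradient (residuals)
# of the loss of the current ensemble; with tree base learners this is GBDT.
gbdt = GradientBoostingClassifier(n_estimators=100, learning_rate=0.1,
                                  max_depth=3, random_state=0)
gbdt.fit(X_train, y_train)

print("AdaBoost test accuracy:", ada.score(X_test, y_test))
print("GBDT test accuracy:    ", gbdt.score(X_test, y_test))
```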
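The Bagging idea from section 2 can be sketched the same way, again assuming scikit-learn. BaggingClassifier trains each (unweighted) tree on a bootstrap sample of the training set; RandomForestClassifier additionally restricts every split to a random subset of features.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier, RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

# Plain bagging: bootstrap samples, equal-weight base learners, majority vote.
# The default base learner here is a (full-depth) decision tree.
bagging = BaggingClassifier(n_estimators=100, bootstrap=True, random_state=0)

# Random Forest = bagging of decision trees + random feature subset per split.
forest = RandomForestClassifier(n_estimators=100, max_features="sqrt", random_state=0)

print("Bagging 5-fold CV accuracy:      ", cross_val_score(bagging, X, y, cv=5).mean())
print("Random Forest 5-fold CV accuracy:", cross_val_score(forest, X, y, cv=5).mean())
```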
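Finally, a minimal Stacking sketch in the same setting. It leans on scikit-learn's StackingClassifier as a convenience; the Kaggle kernel linked in section 3 walks through a hand-rolled version of the same idea using out-of-fold predictions.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import (GradientBoostingClassifier, RandomForestClassifier,
                              StackingClassifier)
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Level-0 (base) models: their cross-validated predictions become the features
# of the level-1 model, which avoids leaking the training labels.
base_learners = [
    ("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
    ("gbdt", GradientBoostingClassifier(n_estimators=100, random_state=0)),
]

# Level-1 (meta) model learns how to combine the base models' predictions.
stack = StackingClassifier(estimators=base_learners,
                           final_estimator=LogisticRegression(),
                           cv=5)
stack.fit(X_train, y_train)
print("Stacking test accuracy:", stack.score(X_test, y_test))
```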