Which is the loss function for multi-class classification in XGBoost?

Submitted by こ雲淡風輕ζ on 2020-01-22 19:52:05

Question


I'm trying to find out which loss function XGBoost uses for multi-class classification. I found the loss function for binary logistic classification in this question.

I had thought that for the multi-class case it might be the same as in GBM (for K classes), which can be seen here, where y_k = 1 if x's label is k and 0 otherwise, and p_k(x) is the softmax function. However, I have derived the first- and second-order gradients of this loss function, and the Hessian doesn't match the one defined in the code here (in the GetGradient function of SoftmaxMultiClassObj): it is off by a constant factor of 2.
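Here is a minimal numeric check of what I mean (the softmax and cross-entropy helpers below are my own sketch, not XGBoost code). Assuming the loss is the standard softmax cross-entropy L(z) = -sum_k y_k log p_k(z), the analytic gradient with respect to the margin z_j is p_j - y_j and the diagonal Hessian is p_j * (1 - p_j), whereas the code linked above uses 2 * p * (1 - p):

    import numpy as np

    def softmax(z):
        e = np.exp(z - z.max())
        return e / e.sum()

    def loss(z, k):
        # Cross-entropy of the softmax for one point whose true class is k.
        return -np.log(softmax(z)[k])

    rng = np.random.default_rng(0)
    z = rng.normal(size=4)  # raw margins for 4 classes
    k = 2                   # true class of this point
    p = softmax(z)
    eps = 1e-4

    for j in range(4):
        d = np.zeros(4)
        d[j] = eps
        # Central-difference gradient and diagonal Hessian.
        g_num = (loss(z + d, k) - loss(z - d, k)) / (2 * eps)
        h_num = (loss(z + d, k) - 2 * loss(z, k) + loss(z - d, k)) / eps**2
        print(f"grad: {g_num:.6f} vs {p[j] - (j == k):.6f}   "
              f"hess: {h_num:.6f} vs {p[j] * (1 - p[j]):.6f}")

    # XGBoost's GetGradient in SoftmaxMultiClassObj uses h = 2 * p * (1 - p),
    # i.e. the cross-entropy Hessian above scaled by a constant factor of 2.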

Could you please tell me which loss function is used?

Thank you in advance.


Answer 1:


The loss function used for multiclass is, as you suspected, the softmax objective function. As of now, the only multiclass options are the two shown in the quote below; multi:softprob returns the probabilities for all classes instead of just the probability of the most likely class. A minimal usage sketch follows the quote.

“multi:softmax” – set XGBoost to do multiclass classification using the softmax objective; you also need to set num_class (number of classes)

“multi:softprob” – same as softmax, but output a vector of ndata * nclass, which can be further reshaped to an ndata, nclass matrix. The result contains the predicted probability of each data point belonging to each class.

See https://xgboost.readthedocs.io/en/latest//parameter.html#learning-task-parameters.
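For context, a minimal sketch of setting one of these objectives through the Python API (the synthetic data and parameter values here are illustrative, not from the question):

    import numpy as np
    import xgboost as xgb

    # Synthetic 3-class data, just for illustration.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))
    y = rng.integers(0, 3, size=200)
    dtrain = xgb.DMatrix(X, label=y)

    params = {
        "objective": "multi:softprob",  # or "multi:softmax" for hard labels
        "num_class": 3,                 # required for both multiclass objectives
    }
    bst = xgb.train(params, dtrain, num_boost_round=10)

    proba = bst.predict(dtrain)
    print(proba.shape)  # (ndata, nclass); older versions returned a flat
                        # ndata * nclass vector that you reshape yourself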




Answer 2:


A good example is given at:

http://machinelearningmastery.com/avoid-overfitting-by-early-stopping-with-xgboost-in-python/

You can choose the evaluation metric with the "eval_metric" parameter (the default is rmse for regression and error for classification). Strictly speaking, eval_metric controls the metric reported on the evaluation set; the training loss itself is chosen with the "objective" parameter. A sketch combining both with early stopping follows the links below.

A description of the "error" metric is given in the official GitHub repo:

""error": Binary classification error rate. It is calculated as #(wrong cases)/#(all cases). For the predictions, the evaluation will regard the instances with prediction value larger than 0.5 as positive instances, and the others as negative instances."

A full list of the available eval metrics can be found in the "Learning Task Parameters" section of https://github.com/dmlc/xgboost/blob/master/doc/parameter.md
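Here is a minimal sketch of that pattern, following the early-stopping idea in the tutorial linked above (the synthetic data and split sizes are illustrative assumptions):

    import numpy as np
    import xgboost as xgb

    # Synthetic binary data, just for illustration.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(300, 5))
    y = (X[:, 0] + rng.normal(scale=0.5, size=300) > 0).astype(int)

    dtrain = xgb.DMatrix(X[:200], label=y[:200])
    dvalid = xgb.DMatrix(X[200:], label=y[200:])

    params = {
        "objective": "binary:logistic",  # the training loss
        "eval_metric": "error",          # the metric reported on the eval set
    }
    bst = xgb.train(
        params,
        dtrain,
        num_boost_round=100,
        evals=[(dvalid, "valid")],
        early_stopping_rounds=10,  # stop when "error" stops improving
    )
    print(bst.best_iteration)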

Hope that answers your question. Best of luck!



Source: https://stackoverflow.com/questions/41983544/which-is-the-loss-function-for-multi-class-classification-in-xgboost
