I am trying out multi-class classification with xgboost and I've built it using this code,
clf = xgb.XGBClassifier(max_depth=7, n_estimators=1000)
clf.fit(byte
By default, XGBClassifier uses objective='binary:logistic'
. When this objective is applied to a problem with more than two classes, it falls back to a one-vs-rest
(also known as one-vs-all) strategy, training one binary model per class
. That may not be the right choice for your problem at hand.
When you use objective='multi:softprob'
, the output is a matrix of predicted probabilities with one row per data point and one column per class, which costs more time and memory than predicting a single label per data point.
Try setting objective='multi:softmax'
in your code. It is better suited to a multi-class classification task, since predict() then returns the class label directly.