XGBoost: How to get class probabilities from xgb.dump (multi:softprob objective)

南方客 · 2021-01-24 02:59

I've got a 3-class classification model trained with XGBoost. The next step is to take the tree model (printed by xgb.dump()) and use it in a .NET production system. I really do not understand how to turn the dumped trees into class probabilities.

1 Answer

北海茫月 · 2021-01-24 03:40

    This took a while to figure out. Once you get your trees, the steps to follow are:

    1. Figure out the leaf values for each booster. The boosters are interleaved by class: the first tree is for class 0, the next for class 1, the next for class 2, then class 0 again, and so on. So if you have num_round = 10 with 3 classes, you will see 30 boosters (see the sketch at the end of this step).

      Be careful about "missing". If you have not explicitly set a missing value on the DMatrix, xgb can treat the value 0 as missing. So when you walk down a tree, you may need to jump to the node denoted by missing=x whenever the feature value at that node is 0. One way around this confusion is to always set an explicit missing value on the DMatrix, for both training and prediction. I use a value that cannot occur in my data, and I also replace NA-type values with some (non-zero) value before I train or predict. Of course, 0 may genuinely mean missing for you, in which case that's fine. You may notice this issue on categorical features that only take 0 or 1 in your data, where a tree node ends up with a nonsensical condition on a very small negative number, etc.
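
      A minimal sketch of this step in Python (the toy data, seed, and variable names are my own assumptions, not from the question; only the xgb calls are the library's actual API):

          import numpy as np
          import xgboost as xgb

          # Toy 3-class data, purely illustrative.
          rng = np.random.default_rng(0)
          X = rng.random((100, 4))
          y = rng.integers(0, 3, size=100)

          # Pin "missing" to NaN explicitly so 0.0 is never treated as missing.
          dtrain = xgb.DMatrix(X, label=y, missing=np.nan)

          params = {
              "objective": "multi:softprob",
              "num_class": 3,
              "base_score": 0.5,  # pin the bias so the 0.5 in step 2 holds
          }
          bst = xgb.train(params, dtrain, num_boost_round=10)

          # 10 rounds x 3 classes = 30 trees, interleaved by class:
          # tree i belongs to class i % num_class.
          trees = bst.get_dump()
          print(len(trees))  # 30
          for i in range(6):
              print(f"tree {i} -> class {i % 3}")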

    2. Let's say you have 3 rounds. Then you will end up with leaf values l1_0, l2_0, l3_0 for class 0; l1_1, l2_1, l3_1 for class 1; and l1_2, l2_2, l3_2 for class 2.

      Now, a great way of making sure your logic is right is to turn on output_margin and pred_leaf, one at a time. With pred_leaf on, you get a matrix showing exactly which leaf a given instance hits in every tree, for all of its classes. With output_margin on, you get the sums of leaf values that xgb is calculating (see the sketch after this step).

      This sum is 0.5 + l1_0 + l2_0 + l3_0 for class 0, and so on. You can cross-verify it against the predict response with output_margin on. Here 0.5 is the bias (XGBoost's default base_score).
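
      Continuing the sketch from step 1, the cross-check might look like this (dtest is my own name):

          # One instance; keep the same explicit "missing" as at training time.
          dtest = xgb.DMatrix(X[:1], missing=np.nan)

          leaves  = bst.predict(dtest, pred_leaf=True)      # shape (1, 30): leaf hit in each tree
          margins = bst.predict(dtest, output_margin=True)  # shape (1, 3): raw per-class sums
          print(leaves)
          print(margins)  # margins[0, k] should equal 0.5 + sum of the class-k leaf values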

    3. Now say you got v0, v1 and v2 as the bias + leaf-value summation for each class. Then your probability for class 0 is

          p(class0) = exp(v0)/(exp(v0)+exp(v1)+exp(v2))
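
      In code, continuing the same sketch, this softmax over the margins should reproduce xgb's own multi:softprob output:

          # Softmax over the raw margins; subtracting the max is only for
          # numerical stability and does not change the result.
          v = margins[0]
          e = np.exp(v - v.max())
          probs = e / e.sum()
          print(probs)
          print(bst.predict(dtest))  # should match probs up to float precision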
      
