If the number of trees equals the number of boosting rounds (e.g. binary:logistic):
score = sum of the predicted leaf values over all trees + 0.5 (the default base_score)
prob = exp(score)/(1+exp(score)) = 1/(1+exp(-score))
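A minimal sketch of the binary case, assuming the per-tree leaf values for one example have already been read off an xgb.dump text dump (the values below are made up for illustration, and 0.5 is assumed to be the default base_score):

```python
import math

def binary_prob_from_leaves(leaf_values, base_score=0.5):
    """Combine per-tree leaf values into a probability for binary:logistic.

    leaf_values: one predicted leaf value per tree (one per boosting round),
    e.g. read off an xgb.dump text dump for a single example.
    base_score: XGBoost's default global bias (0.5).
    """
    score = sum(leaf_values) + base_score
    return 1.0 / (1.0 + math.exp(-score))

# Hypothetical leaf values for one example, one per boosting round.
print(binary_prob_from_leaves([0.12, -0.05, 0.08]))
```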
If the number of trees equals the number of boosting rounds * number of classes (multi:softprob), the trees are interleaved by class, so tree t belongs to class t % num_class:
score_i = sum of the predicted leaf values over all trees for class i + 0.5, for each class i
prob_i = exp(score_i)/sum_j exp(score_j)
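A similar sketch for the multiclass case, summing leaf values per class before the softmax. It assumes the dump order interleaves classes (tree t belongs to class t % num_class); the leaf values are hypothetical:

```python
import math

def softprob_from_leaves(leaf_values, num_class, base_score=0.5):
    """Combine per-tree leaf values into class probabilities for multi:softprob.

    leaf_values: one predicted leaf value per tree, in dump order; with
    multi:softprob the dump holds num_rounds * num_class trees, and tree t
    is assumed to belong to class t % num_class.
    """
    scores = [base_score] * num_class
    for t, value in enumerate(leaf_values):
        scores[t % num_class] += value
    # Softmax over the per-class scores.
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical leaf values for one example: 2 rounds x 3 classes = 6 trees.
print(softprob_from_leaves([0.2, -0.1, 0.05, 0.15, -0.2, 0.1], num_class=3))
```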
References:
- https://stackoverflow.com/questions/39858916/xgboost-how-to-get-probabilities-of-class-from-xgb-dump-multisoftprob-objecti/40632862#40632862
- https://stackoverflow.com/questions/37193953/how-to-use-xgboost-r-tree-dump-to-compute-or-do-predictions?rq=1
- https://homes.cs.washington.edu/~tqchen/pdf/BoostedTree.pdf
- https://stats.stackexchange.com/questions/245537/calculate-probabilities-from-xgb-dump-for-multisoftprob-objective-formula
- http://xgboost.readthedocs.io/en/latest/parameter.html
- https://www.youtube.com/watch?v=Vly8xGnNiWs