XGBoost
XGBoost is an end-to-end, scalable tree boosting system. Its split finding is sparsity-aware: it learns a default direction for missing values at each split. (Chen and Guestrin, n.d.)
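As an illustration, a minimal use of the xgboost Python package is sketched below, with missing entries encoded as np.nan so the sparsity-aware split finding can route them along a learned default direction. The toy data and parameter values are made up for illustration:

```python
import numpy as np
import xgboost as xgb

# Toy regression data with missing entries (np.nan); the sparsity-aware
# split finding learns a default branch direction for missing values.
X = np.array([[1.0, np.nan],
              [2.0, 3.0],
              [np.nan, 1.0],
              [4.0, 2.0]])
y = np.array([1.0, 2.0, 3.0, 4.0])

dtrain = xgb.DMatrix(X, label=y, missing=np.nan)
params = {"objective": "reg:squarederror", "max_depth": 2}
booster = xgb.train(params, dtrain, num_boost_round=5)
print(booster.predict(dtrain))
```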
Regularized Learning Objective
For a given data set with $n$ examples and $m$ features, $\mathcal{D} = \{(\mathbf{x}_i, y_i)\}$ ($|\mathcal{D}| = n$, $\mathbf{x}_i \in \mathbb{R}^m$, $y_i \in \mathbb{R}$), a tree ensemble model uses $K$ additive functions to predict the output:

$$\hat{y}_i = \phi(\mathbf{x}_i) = \sum_{k=1}^{K} f_k(\mathbf{x}_i), \quad f_k \in \mathcal{F},$$

where $\mathcal{F} = \{ f(\mathbf{x}) = w_{q(\mathbf{x})} \}$ is the space of regression trees: $q : \mathbb{R}^m \to \{1, \dots, T\}$ maps an example to a leaf index, $T$ is the number of leaves, and $w \in \mathbb{R}^T$ holds the leaf weights.

We wish to minimise the regularized objective:

$$\mathcal{L}(\phi) = \sum_i l(\hat{y}_i, y_i) + \sum_k \Omega(f_k), \qquad \Omega(f) = \gamma T + \tfrac{1}{2} \lambda \lVert w \rVert^2,$$

where $l$ is a differentiable convex loss function that measures the difference between the prediction $\hat{y}_i$ and the target $y_i$, and $\Omega$ penalizes model complexity through the number of leaves $T$ and the magnitude of the leaf weights.
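To make the penalty concrete, here is a small sketch that evaluates $\Omega(f)$ for one tree with hypothetical leaf weights; the values of $\gamma$ and $\lambda$ are arbitrary:

```python
import numpy as np

def omega(leaf_weights, gamma=1.0, lam=1.0):
    """Complexity penalty Omega(f) = gamma * T + 0.5 * lambda * ||w||^2."""
    T = len(leaf_weights)  # number of leaves
    return gamma * T + 0.5 * lam * np.sum(np.square(leaf_weights))

w = np.array([0.4, -0.2, 0.1])  # hypothetical leaf weights
print(omega(w))                 # 3 * 1.0 + 0.5 * 1.0 * 0.21 = 3.105
```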
Gradient Tree Boosting
Since the regularized objective contains functions as parameters, it cannot be optimized using traditional methods in Euclidean space. Instead the model is trained additively: let $\hat{y}_i^{(t)}$ be the prediction for the $i$-th instance at iteration $t$, and greedily add the $f_t$ that most improves

$$\mathcal{L}^{(t)} = \sum_{i=1}^{n} l\left(y_i, \hat{y}_i^{(t-1)} + f_t(\mathbf{x}_i)\right) + \Omega(f_t).$$

We can perform a second-order Taylor expansion and, after removing constant terms, obtain:

$$\tilde{\mathcal{L}}^{(t)} = \sum_{i=1}^{n} \left[ g_i f_t(\mathbf{x}_i) + \tfrac{1}{2} h_i f_t^2(\mathbf{x}_i) \right] + \Omega(f_t),$$

where $g_i = \partial_{\hat{y}^{(t-1)}} l(y_i, \hat{y}^{(t-1)})$ and $h_i = \partial^2_{\hat{y}^{(t-1)}} l(y_i, \hat{y}^{(t-1)})$ are the first- and second-order gradient statistics of the loss.

If we define $I_j = \{ i \mid q(\mathbf{x}_i) = j \}$ as the instance set of leaf $j$ and expand $\Omega$,

$$\tilde{\mathcal{L}}^{(t)} = \sum_{j=1}^{T} \left[ \left( \sum_{i \in I_j} g_i \right) w_j + \tfrac{1}{2} \left( \sum_{i \in I_j} h_i + \lambda \right) w_j^2 \right] + \gamma T,$$

and we find that for a fixed structure $q(\mathbf{x})$ the optimal weight of leaf $j$ is

$$w_j^{*} = -\frac{\sum_{i \in I_j} g_i}{\sum_{i \in I_j} h_i + \lambda}.$$

This can then be used to score a tree structure $q$:

$$\tilde{\mathcal{L}}^{(t)}(q) = -\tfrac{1}{2} \sum_{j=1}^{T} \frac{\left( \sum_{i \in I_j} g_i \right)^2}{\sum_{i \in I_j} h_i + \lambda} + \gamma T.$$
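The closed forms above translate directly into code. The sketch below evaluates $w_j^*$ and the structure score for hypothetical per-leaf gradient statistics; all numbers are invented for illustration:

```python
import numpy as np

def leaf_weight(g_sum, h_sum, lam=1.0):
    """Optimal weight w_j* = -G_j / (H_j + lambda) for leaf j."""
    return -g_sum / (h_sum + lam)

def structure_score(G, H, lam=1.0, gamma=1.0):
    """Score -1/2 * sum_j G_j^2 / (H_j + lambda) + gamma * T for structure q."""
    G, H = np.asarray(G), np.asarray(H)
    return -0.5 * np.sum(G ** 2 / (H + lam)) + gamma * len(G)

# Hypothetical per-leaf sums of gradients G_j and hessians H_j:
G = [-4.0, 3.0]
H = [ 5.0, 7.0]
print([leaf_weight(g, h) for g, h in zip(G, H)])  # [0.666..., -0.375]
print(structure_score(G, H))                      # lower is better
```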
Implementing Distributed XGBoost
The distributed implementation of XGBoost is built on RABIT, an Allreduce framework. At each boosting iteration, XGBoost must aggregate gradient and Hessian statistics from every distributed worker. This fits the allreduce pattern, which reduces (e.g., sums) all workers’ local results and distributes the combined result back to every process. (Chen, Cano, and Zhou, n.d.)
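To see why this maps onto allreduce, the following self-contained sketch simulates the aggregation step rather than using RABIT’s real API: each worker holds local sums of gradients and Hessians over its data partition, and a sum-allreduce leaves every worker with identical global statistics:

```python
import numpy as np

def allreduce_sum(local_buffers):
    """Simulated allreduce: reduce (sum) all workers' buffers, then give
    every worker a copy of the combined result."""
    total = np.sum(local_buffers, axis=0)
    return [total.copy() for _ in local_buffers]

# Each worker's local [sum(g), sum(h)] over its partition (made-up numbers).
worker_stats = [np.array([1.5, 4.0]),
                np.array([-0.5, 3.0]),
                np.array([2.0, 5.0])]

global_stats = allreduce_sum(worker_stats)
# Every worker now sees the same global [G, H] = [3.0, 12.0] and can
# compute identical split decisions and leaf weights.
print(global_stats[0])
```

Because every worker ends each allreduce with the same result, a restarted worker can recover the aggregate from any healthy peer, which is the basis of RABIT’s fault tolerance.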
Bibliography
Chen, Tianqi, and Carlos Guestrin. n.d. “XGBoost: A Scalable Tree Boosting System.” http://arxiv.org/abs/1603.02754v3.
Chen, Tianqi, Ignacio Cano, and Tianyi Zhou. n.d. “RABIT: A Reliable Allreduce and Broadcast Interface.”