4.1.1. Extreme Gradient Boosting (XGBoost)

XGBoost, which was proposed by Chen and Guestrin [50], is a scalable machine learning model for tree boosting. It has been widely used for forecasting tasks such as STLF and store sales forecasting [51]. The basic principle of XGBoost is boosting, which combines many weak base learners into a strong learner in an iterative fashion [52]. At each boosting iteration, a new learner is fitted to the residuals of the previous predictor so as to optimize the specified loss function. XGBoost provides faster learning and better scalability than earlier boosting techniques through parallel and distributed computing. It establishes an objective function that measures model performance by adding a regularization term to the loss function. In addition, missing values can be handled easily because they are recognized automatically and assigned a default branch direction during tree construction. XGBoost gradually increases the depth of the tree during learning; if the information gain obtained by deepening the tree is smaller than the gamma value, the depth stops increasing.
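The residual-fitting principle described above can be illustrated with a minimal gradient-boosting sketch in plain Python. The dataset, the decision-stump base learner, and the hyperparameters (`n_rounds`, `lr`) are illustrative assumptions, not the configuration used in this work; a real XGBoost model additionally applies regularization, gamma-based pruning, and parallelized tree construction.

```python
# Minimal gradient-boosting sketch (squared loss) illustrating the
# residual-fitting principle behind XGBoost. All names and values here
# are illustrative, not the authors' actual model configuration.

def fit_stump(x, residuals):
    """Fit a one-split regression stump to the current residuals."""
    best = None
    for t in x:  # try each observed value as a split threshold
        left = [r for xi, r in zip(x, residuals) if xi <= t]
        right = [r for xi, r in zip(x, residuals) if xi > t]
        if not left or not right:
            continue
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        sse = (sum((r - lm) ** 2 for r in left)
               + sum((r - rm) ** 2 for r in right))
        if best is None or sse < best[0]:
            best = (sse, t, lm, rm)
    _, t, lm, rm = best
    return lambda xi: lm if xi <= t else rm

def boost(x, y, n_rounds=50, lr=0.1):
    """Iteratively fit stumps to residuals, as in gradient boosting."""
    base = sum(y) / len(y)           # initial constant prediction
    pred = [base] * len(y)
    stumps = []
    for _ in range(n_rounds):
        # For squared loss, the negative gradient is simply the residual.
        residuals = [yi - pi for yi, pi in zip(y, pred)]
        stump = fit_stump(x, residuals)
        stumps.append(stump)
        pred = [pi + lr * stump(xi) for pi, xi in zip(pred, x)]
    return base, stumps

def predict(model, xi, lr=0.1):
    base, stumps = model
    return base + lr * sum(s(xi) for s in stumps)
```

Each round corrects the errors left by the previous rounds, so the ensemble's predictions approach the targets as boosting proceeds.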
