A. XGBOOST
The XGBoost algorithm is based on GBDT [27]. Compared with GBDT, XGBoost has the advantage that it supports linear classifiers and performs a second-order Taylor expansion of the cost function, introducing the second derivative to make the results more accurate. The principles of XGBoost are as follows. The XGBoost model uses an additive training method to optimize the objective function, which means that the optimization at each step relies on the result of the previous step. The objective function at the t-th round can be expressed as

obj^{(t)} = \sum_{i=1}^{n} l\big(y_i, \hat{y}_i^{(t-1)} + f_t(x_i)\big) + \Omega(f_t) + \text{constant} (1)

where l represents the loss term of the t-th round, constant represents a constant term, and \Omega is the regularization term of the model, given by

\Omega(f_t) = \gamma T + \frac{1}{2}\lambda \sum_{j=1}^{T} w_j^2 (2)

where T is the number of leaves, w_j is the weight of the j-th leaf, and both \gamma and \lambda are user-defined parameters. Generally, the larger these two values are, the simpler the structure of the tree, so the problem of overfitting can be effectively mitigated.
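As a minimal illustration of the regularization term in Eq. (2) (a sketch, not XGBoost's internal implementation), the penalty \Omega(f_t) = \gamma T + (1/2)\lambda \sum_j w_j^2 can be computed from a tree's leaf weights as follows; the function name and arguments are hypothetical:

```python
def tree_regularization(leaf_weights, gamma=1.0, lam=1.0):
    """Compute Omega(f) = gamma * T + 0.5 * lam * sum(w_j^2), Eq. (2).

    leaf_weights -- list of leaf weights w_j of one tree
    gamma, lam   -- user-defined complexity parameters (gamma, lambda)
    """
    T = len(leaf_weights)                  # T: number of leaves
    l2 = sum(w * w for w in leaf_weights)  # sum of squared leaf weights
    return gamma * T + 0.5 * lam * l2

# Larger gamma/lam penalize trees with many leaves or large weights more
# heavily, steering the learner toward simpler structures (less overfitting).
print(tree_regularization([0.5, -0.3, 0.2], gamma=1.0, lam=1.0))
```

In the xgboost library these two parameters correspond to `gamma` (minimum loss reduction per split) and `reg_lambda` (L2 regularization on leaf weights).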