eXtreme Gradient Boosting regression. Calls xgboost::xgb.train() from package xgboost.

Format

R6::R6Class() inheriting from mlr3::LearnerRegr.

Construction

LearnerRegrXgboost$new()
mlr3::mlr_learners$get("regr.xgboost")
mlr3::lrn("regr.xgboost")
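A minimal construction sketch (not part of the original page): hyperparameters listed under learner$param_set$ids() can be passed directly to mlr3::lrn(); the values for nrounds and eta below are illustrative, not recommendations.

# Assumes mlr3 and mlr3learners are installed; mlr3learners registers
# "regr.xgboost" in the learner dictionary on load.
library(mlr3)
library(mlr3learners)
learner = lrn("regr.xgboost", nrounds = 100, eta = 0.1)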

References

Chen T, Guestrin C (2016). "XGBoost: A Scalable Tree Boosting System." In Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 785-794. ACM. doi:10.1145/2939672.2939785.

Examples

learner = mlr3::lrn("regr.xgboost")
print(learner)
#> <LearnerRegrXgboost:regr.xgboost>
#> * Model: -
#> * Parameters: nrounds=1, verbose=0
#> * Packages: xgboost
#> * Predict Type: response
#> * Feature types: integer, numeric
#> * Properties: importance, missings, weights

# available parameters:
learner$param_set$ids()
#>  [1] "booster"                "watchlist"              "eta"
#>  [4] "gamma"                  "max_depth"              "min_child_weight"
#>  [7] "subsample"              "colsample_bytree"       "colsample_bylevel"
#> [10] "num_parallel_tree"      "lambda"                 "lambda_bias"
#> [13] "alpha"                  "objective"              "eval_metric"
#> [16] "base_score"             "max_delta_step"         "missing"
#> [19] "monotone_constraints"   "tweedie_variance_power" "nthread"
#> [22] "nrounds"                "feval"                  "verbose"
#> [25] "print_every_n"          "early_stopping_rounds"  "maximize"
#> [28] "sample_type"            "normalize_type"         "rate_drop"
#> [31] "skip_drop"              "callbacks"
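A further hedged sketch (not part of the original output): training and prediction follow the standard mlr3 workflow. The mtcars task, the nrounds value, and the MSE measure below are illustrative choices, not part of this page.

# Train on the built-in mtcars regression task, predict, and score.
# Assumes mlr3 and mlr3learners are attached; scores vary by version and seed.
task = mlr3::tsk("mtcars")
learner = mlr3::lrn("regr.xgboost", nrounds = 20)
learner$train(task)
prediction = learner$predict(task)
prediction$score(mlr3::msr("regr.mse"))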