eXtreme Gradient Boosting survival model. Calls xgboost::xgb.train() from package xgboost.

Custom mlr3 defaults

  • nrounds:

    • Actual default: no default.

    • Adjusted default: 1.

    • Reason for change: Without a default, constructing the learner would error, so 1 is set merely as a placeholder. nrounds needs to be tuned by the user (see the sketch after this list).

  • nthread:

    • Actual value: Undefined, triggering auto-detection of the number of CPUs.

    • Adjusted value: 1.

    • Reason for change: Conflicts with parallelization via the future framework.

  • verbose:

    • Actual default: 1.

    • Adjusted default: 0.

    • Reason for change: Reduce verbosity.

  • objective:

    • Actual default: reg:squarederror

    • Adjusted default: survival:cox

    • Reason for change: Changed to a survival objective.
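
The adjusted values above are only placeholders. Below is a minimal sketch of overriding them at construction time, assuming the package providing this learner is loaded so that "surv.xgboost" is registered in mlr_learners; the parameter values are illustrative, not recommendations:

learner = mlr3::lrn(
  "surv.xgboost",
  nrounds = 100,     # replace the placeholder default of 1; should be tuned
  eta = 0.1,         # illustrative learning rate
  max_depth = 3,     # illustrative tree depth
  nthread = 1        # keep at 1 to avoid conflicts with future-based parallelization
)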

Dictionary

This Learner can be instantiated via the dictionary mlr_learners or with the associated sugar function lrn():

mlr_learners$get("surv.xgboost")
lrn("surv.xgboost")

References

Chen, Tianqi, Guestrin, Carlos (2016). “Xgboost: A scalable tree boosting system.” In Proceedings of the 22nd ACM SIGKDD Conference on Knowledge Discovery and Data Mining, 785--794. ACM. doi:10.1145/2939672.2939785.

Super classes

mlr3::Learner -> mlr3proba::LearnerSurv -> LearnerSurvXgboost

Methods

Public methods

Method new()

Creates a new instance of this R6 class.

Usage

LearnerSurvXgboost$new()


Method importance()

The importance scores are calculated with xgboost::xgb.importance().

Usage

LearnerSurvXgboost$importance()

Returns

Named numeric().
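
A hedged sketch of retrieving importance scores after training. It assumes xgboost and mlr3proba (which provides the "rats" example task) are installed and that this learner is registered in mlr_learners; the factor feature is dropped because the learner only supports integer and numeric features, and nrounds = 20 is illustrative.

library(mlr3)
library(mlr3proba)

task = tsk("rats")
task$select(c("litter", "rx"))   # keep only the integer features

learner = lrn("surv.xgboost", nrounds = 20)
learner$train(task)
learner$importance()             # named numeric vector of importance scores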


Method clone()

The objects of this class are cloneable with this method.

Usage

LearnerSurvXgboost$clone(deep = FALSE)

Arguments

deep

Whether to make a deep clone.
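
A short sketch of cloning a configured learner; with deep = TRUE the copy receives its own parameter set, so modifying the clone does not affect the original (the parameter values are illustrative):

l1 = mlr3::lrn("surv.xgboost", nrounds = 50)
l2 = l1$clone(deep = TRUE)
l2$param_set$values$nrounds = 100
l1$param_set$values$nrounds      # still 50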

Examples

if (requireNamespace("xgboost")) {
  learner = mlr3::lrn("surv.xgboost")
  print(learner)

  # available parameters:
  learner$param_set$ids()
}
#> <LearnerSurvXgboost:surv.xgboost>
#> * Model: -
#> * Parameters: nrounds=1, nthread=1, verbose=0
#> * Packages: xgboost
#> * Predict Type: crank
#> * Feature types: integer, numeric
#> * Properties: importance, missings, weights
#>  [1] "booster"                      "watchlist"
#>  [3] "eta"                          "gamma"
#>  [5] "max_depth"                    "min_child_weight"
#>  [7] "subsample"                    "colsample_bytree"
#>  [9] "colsample_bylevel"            "colsample_bynode"
#> [11] "num_parallel_tree"            "lambda"
#> [13] "lambda_bias"                  "alpha"
#> [15] "aft_loss_distribution"        "aft_loss_distribution_scale"
#> [17] "objective"                    "base_score"
#> [19] "max_delta_step"               "missing"
#> [21] "monotone_constraints"         "tweedie_variance_power"
#> [23] "nthread"                      "nrounds"
#> [25] "feval"                        "verbose"
#> [27] "print_every_n"                "early_stopping_rounds"
#> [29] "maximize"                     "save_period"
#> [31] "save_name"                    "xgb_model"
#> [33] "sample_type"                  "normalize_type"
#> [35] "rate_drop"                    "skip_drop"
#> [37] "one_drop"                     "tree_method"
#> [39] "grow_policy"                  "max_leaves"
#> [41] "max_bin"                      "callbacks"
#> [43] "sketch_eps"                   "scale_pos_weight"
#> [45] "updater"                      "refresh_leaf"
#> [47] "feature_selector"             "top_k"
#> [49] "predictor"                    "ntreelimit"
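
A hedged end-to-end sketch of training and predicting with this learner, under the same assumptions as in the importance() sketch above (mlr3proba provides the "rats" task, xgboost is installed, and the learner is registered); row indices and parameter values are illustrative only:

library(mlr3)
library(mlr3proba)

task = tsk("rats")
task$select(c("litter", "rx"))   # drop the factor feature; only integer/numeric features are supported

learner = lrn("surv.xgboost", nrounds = 50, eta = 0.1)
learner$train(task, row_ids = 1:200)
prediction = learner$predict(task, row_ids = 201:300)
prediction$crank[1:5]            # continuous ranking scores (predict type "crank")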