R/LearnerSurvGlmnet.R
mlr_learners_surv.glmnet.Rd
Generalized linear models with elastic net regularization. Calls glmnet::glmnet() from package glmnet. The default for the hyperparameter family is set to "cox".
Caution: This learner differs from cv_glmnet in that it does not use the internal optimization of lambda; the parameter must be tuned by the user. Essentially, one needs to tune the parameter s, which is used at predict time. See https://stackoverflow.com/questions/50995525/ for more information.
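Tuning s can be done with the standard mlr3tuning workflow; the sketch below is illustrative (the task, tuner, search range, and budget are example choices, not package defaults):

```r
library(mlr3)
library(mlr3proba)   # survival tasks and measures
library(mlr3tuning)  # tuning infrastructure

learner = lrn("surv.glmnet")
# Mark the predict-time parameter `s` (the lambda used for prediction)
# for tuning over a log-scaled range; the bounds are illustrative.
learner$param_set$values$s = to_tune(1e-4, 1, logscale = TRUE)

instance = tune(
  tuner = tnr("random_search"),
  task = tsk("rats"),                 # example survival task from mlr3proba
  learner = learner,
  resampling = rsmp("cv", folds = 3),
  measures = msr("surv.cindex"),
  term_evals = 10
)
instance$result_learner_param_vals
```

After tuning, the selected value of s can be set on the learner before training on the full task.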
This Learner can be instantiated via the dictionary mlr_learners or with the associated sugar function lrn():

mlr_learners$get("surv.glmnet")
lrn("surv.glmnet")
Friedman J, Hastie T, Tibshirani R (2010). “Regularization Paths for Generalized Linear Models via Coordinate Descent.” Journal of Statistical Software, 33(1), 1--22. doi:10.18637/jss.v033.i01.
mlr3::Learner -> mlr3proba::LearnerSurv -> LearnerSurvGlmnet
new()
Creates a new instance of this R6 class.
LearnerSurvGlmnet$new()
clone()
The objects of this class are cloneable with this method.
LearnerSurvGlmnet$clone(deep = FALSE)
deep
Whether to make a deep clone.
if (requireNamespace("glmnet")) {
  learner = mlr3::lrn("surv.glmnet")
  print(learner)

  # available parameters:
  learner$param_set$ids()
}
#> <LearnerSurvGlmnet:surv.glmnet>
#> * Model: -
#> * Parameters: list()
#> * Packages: glmnet
#> * Predict Type: crank
#> * Feature types: logical, integer, numeric
#> * Properties: weights
#>  [1] "alpha"            "offset"           "nlambda"          "lambda.min.ratio"
#>  [5] "lambda"           "standardize"      "intercept"        "thresh"
#>  [9] "dfmax"            "pmax"             "exclude"          "penalty.factor"
#> [13] "lower.limits"     "upper.limits"     "maxit"            "mxitnr"
#> [17] "epsnr"            "type.logistic"    "type.multinomial" "fdev"
#> [21] "devmax"           "eps"              "big"              "mnlam"
#> [25] "pmin"             "exmx"             "prec"             "mxit"
#> [29] "relax"            "trace.it"         "s"                "exact"
#> [33] "newoffset"