Ordinary linear regression. Calls stats::lm().

Dictionary

This mlr3::Learner can be instantiated via the dictionary mlr3::mlr_learners or with the associated sugar function mlr3::lrn():

mlr_learners$get("regr.lm")
lrn("regr.lm")

Meta Information

  • Task type: “regr”

  • Predict Types: “response”, “se”

  • Feature Types: “logical”, “integer”, “numeric”, “character”, “factor”

  • Required Packages: mlr3, mlr3learners, stats
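
For example, the “se” predict type listed above can be requested at construction. A minimal sketch (the task choice is only illustrative):

library(mlr3)
library(mlr3learners)

# Request standard errors in addition to point predictions
learner = lrn("regr.lm", predict_type = "se")
learner$train(tsk("mtcars"))
prediction = learner$predict(tsk("mtcars"))
head(prediction$se)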

Parameters

Id             Type       Default   Levels                                 Range
df             numeric    Inf       -                                      (-∞, ∞)
interval       character  -         none, confidence, prediction           -
level          numeric    0.95      -                                      (-∞, ∞)
model          logical    TRUE      TRUE, FALSE                            -
offset         logical    -         TRUE, FALSE                            -
pred.var       untyped    -         -                                      -
qr             logical    TRUE      TRUE, FALSE                            -
scale          numeric    NULL      -                                      (-∞, ∞)
singular.ok    logical    TRUE      TRUE, FALSE                            -
x              logical    FALSE     TRUE, FALSE                            -
y              logical    FALSE     TRUE, FALSE                            -
rankdeficient  character  -         warnif, simple, non-estim, NA, NAwarn  -
tol            numeric    1e-07     -                                      (-∞, ∞)
verbose        logical    FALSE     TRUE, FALSE                            -
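
Hyperparameters can be set at construction or later through the learner's param_set; a brief sketch using parameters from the table above:

library(mlr3)
library(mlr3learners)

# Set a parameter at construction time ...
learner = lrn("regr.lm", singular.ok = FALSE)

# ... or modify/add values afterwards
learner$param_set$values$tol = 1e-08
learner$param_set$values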

Contrasts

To ensure reproducibility, this learner always uses the default contrasts: contr.treatment() for unordered factors and contr.poly() for ordered factors.

Setting the option "contrasts" does not have any effect. Instead, set the respective hyperparameter or use mlr3pipelines to create dummy features.
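
The mlr3pipelines route is sketched below (assuming mlr3pipelines is installed; the task is only illustrative): factor features are converted to treatment-coded dummy columns before the linear model is fitted.

library(mlr3)
library(mlr3learners)
library(mlr3pipelines)

# Encode factor features as dummy columns, then fit the linear model
graph = po("encode", method = "treatment") %>>% lrn("regr.lm")
glearner = as_learner(graph)
glearner$train(tsk("mtcars"))  # mtcars has no factors; shown only to illustrate the pipeline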

See also

Other Learner: mlr_learners_classif.cv_glmnet, mlr_learners_classif.glmnet, mlr_learners_classif.kknn, mlr_learners_classif.lda, mlr_learners_classif.log_reg, mlr_learners_classif.multinom, mlr_learners_classif.naive_bayes, mlr_learners_classif.nnet, mlr_learners_classif.qda, mlr_learners_classif.ranger, mlr_learners_classif.svm, mlr_learners_classif.xgboost, mlr_learners_regr.cv_glmnet, mlr_learners_regr.glmnet, mlr_learners_regr.kknn, mlr_learners_regr.km, mlr_learners_regr.nnet, mlr_learners_regr.ranger, mlr_learners_regr.svm, mlr_learners_regr.xgboost

Super classes

mlr3::Learner -> mlr3::LearnerRegr -> LearnerRegrLM
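
As a quick check, this chain is visible in the class attribute of a constructed learner:

library(mlr3)
library(mlr3learners)

# Expected to contain "LearnerRegrLM", "LearnerRegr", "Learner", and "R6"
class(lrn("regr.lm"))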

Methods

Inherited methods (see mlr3::Learner and mlr3::LearnerRegr)


Method new()

Creates a new instance of this R6 class.

Usage

LearnerRegrLM$new()

Method clone()

The objects of this class are cloneable with this method.

Usage

LearnerRegrLM$clone(deep = FALSE)

Arguments

deep

Whether to make a deep clone.
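
For illustration (object names are arbitrary), a deep clone yields an independent copy whose settings can be changed without touching the original:

library(mlr3)
library(mlr3learners)

learner = lrn("regr.lm")
copy = learner$clone(deep = TRUE)

# Changing the copy's hyperparameters leaves the original untouched
copy$param_set$values$singular.ok = FALSE
learner$param_set$values$singular.ok  # still NULL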

Examples

if (requireNamespace("stats", quietly = TRUE)) {
# Attach the packages that provide lrn(), tsk(), and partition()
library(mlr3)
library(mlr3learners)

# Define the Learner and set parameter values
learner = lrn("regr.lm")
print(learner)

# Define a Task
task = tsk("mtcars")

# Create train and test set
ids = partition(task)

# Train the learner on the training ids
learner$train(task, row_ids = ids$train)

# Print the fitted model
print(learner$model)

# Print importance scores (if the learner supports them)
if ("importance" %in% learner$properties) print(learner$importance())

# Make predictions for the test rows
predictions = learner$predict(task, row_ids = ids$test)

# Score the predictions
predictions$score()
}
#> <LearnerRegrLM:regr.lm>: Linear Model
#> * Model: -
#> * Parameters: list()
#> * Packages: mlr3, mlr3learners, stats
#> * Predict Types:  [response], se
#> * Feature Types: logical, integer, numeric, character, factor
#> * Properties: weights
#> 
#> Call:
#> stats::lm(formula = task$formula(), data = task$data())
#> 
#> Coefficients:
#> (Intercept)           am         carb          cyl         disp         drat  
#>    16.30218      3.76393      0.13314     -0.81045      0.02792      0.90348  
#>        gear           hp         qsec           vs           wt  
#>     0.14509     -0.03059      0.80309      0.36197     -4.11977  
#> 
#> regr.mse 
#> 4.351582