
Classification via logistic regression. Calls stats::glm() with family set to "binomial".

Internal Encoding

Starting with mlr3 v0.5.0, the order of class labels is reversed prior to model fitting to comply with the stats::glm() convention that the negative class is provided as the first factor level.
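The convention can be illustrated with base R alone. This is a sketch on made-up toy data: stats::glm() models the probability of the second factor level, so the first level acts as the negative class, which is why mlr3 reverses the label order before fitting.

```r
# stats::glm() models Pr(second factor level); the first level is the
# negative class -- hence mlr3's reversal of the label order.
set.seed(1)
x <- rnorm(200)
y <- factor(ifelse(x + rnorm(200) > 0, "pos", "neg"),
            levels = c("neg", "pos"))  # "neg" first = negative class
fit <- stats::glm(y ~ x, family = "binomial")
coef(fit)[["x"]]  # positive: larger x pushes towards the second level, "pos"
```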

Weights

It is not advisable to change the weights of a logistic regression. For more details, see this question on Cross Validated.

Initial parameter values

  • model:

    • Actual default: TRUE.

    • Adjusted default: FALSE.

    • Reason for change: Save some memory.
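The memory saving can be checked in base R. This sketch uses a toy formula on the built-in mtcars data: with model = FALSE, glm() does not keep a copy of the model frame, so the fitted object is smaller.

```r
# With model = FALSE, glm() drops the stored model frame from the result,
# which shrinks the fitted object.
fit_big   <- stats::glm(am ~ wt + hp, data = mtcars, family = "binomial")
fit_small <- stats::glm(am ~ wt + hp, data = mtcars, family = "binomial",
                        model = FALSE)
object.size(fit_small) < object.size(fit_big)  # TRUE
```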

Dictionary

This Learner can be instantiated via the dictionary mlr_learners or with the associated sugar function lrn():

mlr_learners$get("classif.log_reg")
lrn("classif.log_reg")

Meta Information

  • Task type: “classif”

  • Predict Types: “response”, “prob”

  • Feature Types: “logical”, “integer”, “numeric”, “character”, “factor”, “ordered”

  • Required Packages: mlr3, mlr3learners, stats

Parameters

Id           Type     Default  Levels       Range
dispersion   untyped  -        -            -
epsilon      numeric  1e-08    -            \((-\infty, \infty)\)
etastart     untyped  -        -            -
maxit        numeric  25       -            \((-\infty, \infty)\)
model        logical  TRUE     TRUE, FALSE  -
mustart      untyped  -        -            -
offset       untyped  -        -            -
singular.ok  logical  TRUE     TRUE, FALSE  -
start        untyped  -        -            -
trace        logical  FALSE    TRUE, FALSE  -
x            logical  FALSE    TRUE, FALSE  -
y            logical  TRUE     TRUE, FALSE  -

Contrasts

To ensure reproducibility, this learner always uses the default contrasts: contr.treatment() for unordered factors, and contr.poly() for ordered factors.

Setting the option "contrasts" does not have any effect. Instead, set the respective hyperparameter or use mlr3pipelines to create dummy features.
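For reference, R's factory-default contrasts can be inspected with base R. This is a sketch that assumes a fresh session in which the "contrasts" option has not been altered:

```r
# The factory defaults that the learner pins, regardless of the option:
getOption("contrasts")
# In a fresh session this is "contr.treatment" for unordered factors and
# "contr.poly" for ordered factors.
```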

See also

Other Learner: mlr_learners_classif.cv_glmnet, mlr_learners_classif.glmnet, mlr_learners_classif.kknn, mlr_learners_classif.lda, mlr_learners_classif.multinom, mlr_learners_classif.naive_bayes, mlr_learners_classif.nnet, mlr_learners_classif.qda, mlr_learners_classif.ranger, mlr_learners_classif.svm, mlr_learners_classif.xgboost, mlr_learners_regr.cv_glmnet, mlr_learners_regr.glmnet, mlr_learners_regr.kknn, mlr_learners_regr.km, mlr_learners_regr.lm, mlr_learners_regr.nnet, mlr_learners_regr.ranger, mlr_learners_regr.svm, mlr_learners_regr.xgboost

Super classes

mlr3::Learner -> mlr3::LearnerClassif -> LearnerClassifLogReg

Methods

Inherited methods


Method new()

Creates a new instance of this R6 class.

Usage

LearnerClassifLogReg$new()

Method loglik()

Extract the log-likelihood (e.g., via stats::logLik()) from the fitted model.

Usage

LearnerClassifLogReg$loglik()
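A minimal sketch of extracting the log-likelihood after training, assuming mlr3 and mlr3learners are installed (the sonar task is the same one used in the Examples below):

```r
library(mlr3)
library(mlr3learners)

learner <- lrn("classif.log_reg")
learner$train(tsk("sonar"))  # may warn about non-convergence, as in the example
learner$loglik()             # a stats::logLik object for the fitted glm
```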


Method clone()

The objects of this class are cloneable with this method.

Usage

LearnerClassifLogReg$clone(deep = FALSE)

Arguments

deep

Whether to make a deep clone.

Examples

if (requireNamespace("stats", quietly = TRUE)) {
# Define the Learner and set parameter values
learner = lrn("classif.log_reg")
print(learner)

# Define a Task
task = tsk("sonar")

# Create train and test set
ids = partition(task)

# Train the learner on the training ids
learner$train(task, row_ids = ids$train)

# print the model
print(learner$model)

# importance method
if ("importance" %in% learner$properties) print(learner$importance())

# Make predictions for the test rows
predictions = learner$predict(task, row_ids = ids$test)

# Score the predictions
predictions$score()
}
#> <LearnerClassifLogReg:classif.log_reg>: Logistic Regression
#> * Model: -
#> * Parameters: list()
#> * Packages: mlr3, mlr3learners, stats
#> * Predict Types:  [response], prob
#> * Feature Types: logical, integer, numeric, character, factor, ordered
#> * Properties: loglik, twoclass
#> Warning: glm.fit: algorithm did not converge
#> Warning: glm.fit: fitted probabilities numerically 0 or 1 occurred
#> 
#> Call:  stats::glm(formula = task$formula(), family = "binomial", data = data, 
#>     model = FALSE)
#> 
#> Coefficients:
#> (Intercept)           V1          V10          V11          V12          V13  
#>     -176.26       588.50        52.72       345.46       -87.86       189.40  
#>         V14          V15          V16          V17          V18          V19  
#>     -159.18       158.36       -46.23      -273.22       452.43      -362.29  
#>          V2          V20          V21          V22          V23          V24  
#>     -194.86       295.70       -40.53       -54.92       -93.21       406.53  
#>         V25          V26          V27          V28          V29           V3  
#>     -455.48       208.07       -19.19       109.75       -29.02      -738.98  
#>         V30          V31          V32          V33          V34          V35  
#>      -13.82        25.10       -98.01       -28.21      -105.71       292.66  
#>         V36          V37          V38          V39           V4          V40  
#>     -215.92      -182.32       151.03        77.98       394.23      -317.26  
#>         V41          V42          V43          V44          V45          V46  
#>      364.23      -273.15        99.74       344.30       -68.79       102.12  
#>         V47          V48          V49           V5          V50          V51  
#>      483.67        52.95       789.56       -91.66     -3003.95      -876.98  
#>         V52          V53          V54          V55          V56          V57  
#>    -1140.04      2345.33      1860.91      -895.43     -1977.31       413.35  
#>         V58          V59           V6          V60           V7           V8  
#>    -1492.07      3315.70       299.66      -759.55      -302.51        29.88  
#>          V9  
#>     -299.89  
#> 
#> Degrees of Freedom: 138 Total (i.e. Null);  78 Residual
#> Null Deviance:	    192.1 
#> Residual Deviance: 5.845e-09 	AIC: 122
#> classif.ce 
#>  0.3043478