Multinomial log-linear models via neural networks. Calls nnet::multinom() from package nnet.

Dictionary

This Learner can be instantiated via the dictionary mlr_learners or with the associated sugar function lrn():

mlr_learners$get("classif.multinom")
lrn("classif.multinom")

Meta Information

  • Task type: “classif”

  • Predict Types: “response”, “prob”

  • Feature Types: “logical”, “integer”, “numeric”, “factor”

  • Required Packages: mlr3, mlr3learners, nnet
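
Predicted probabilities are available in addition to hard class labels. A minimal sketch of selecting the predict type, using only the standard mlr3 accessors and the fields listed above:

library(mlr3)
library(mlr3learners)

# request class probabilities at construction time ...
learner = lrn("classif.multinom", predict_type = "prob")

# ... or switch an existing learner later
learner$predict_type = "prob"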

Parameters

Id        Type       Default  Levels       Range
Hess      logical    FALSE    TRUE, FALSE  -
abstol    numeric    1e-04    -            \((-\infty, \infty)\)
censored  logical    FALSE    TRUE, FALSE  -
decay     numeric    0        -            \((-\infty, \infty)\)
entropy   logical    FALSE    TRUE, FALSE  -
mask      untyped    -        -            -
maxit     integer    100      -            \([1, \infty)\)
MaxNWts   integer    1000     -            \([1, \infty)\)
model     logical    FALSE    TRUE, FALSE  -
linout    logical    FALSE    TRUE, FALSE  -
rang      numeric    0.7      -            \((-\infty, \infty)\)
reltol    numeric    1e-08    -            \((-\infty, \infty)\)
size      integer    -        -            \([1, \infty)\)
skip      logical    FALSE    TRUE, FALSE  -
softmax   logical    FALSE    TRUE, FALSE  -
summ      character  0        0, 1, 2, 3   -
trace     logical    TRUE     TRUE, FALSE  -
Wts       untyped    -        -            -
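
These hyperparameters are passed on to nnet::multinom() (and, where applicable, further to the underlying nnet fit). A hedged sketch of setting a few of them; the chosen values are purely illustrative, not tuned recommendations:

# set hyperparameters when constructing the learner
learner = lrn("classif.multinom", decay = 0.1, maxit = 200, trace = FALSE)

# or adjust them afterwards through the parameter set
learner$param_set$values$Hess = TRUE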

See also

Other Learner: mlr_learners_classif.cv_glmnet, mlr_learners_classif.glmnet, mlr_learners_classif.kknn, mlr_learners_classif.lda, mlr_learners_classif.log_reg, mlr_learners_classif.naive_bayes, mlr_learners_classif.nnet, mlr_learners_classif.qda, mlr_learners_classif.ranger, mlr_learners_classif.svm, mlr_learners_classif.xgboost, mlr_learners_regr.cv_glmnet, mlr_learners_regr.glmnet, mlr_learners_regr.kknn, mlr_learners_regr.km, mlr_learners_regr.lm, mlr_learners_regr.nnet, mlr_learners_regr.ranger, mlr_learners_regr.svm, mlr_learners_regr.xgboost

Super classes

mlr3::Learner -> mlr3::LearnerClassif -> LearnerClassifMultinom

Methods

Method new()

Creates a new instance of this R6 class.

Usage

LearnerClassifMultinom$new()


Method loglik()

Extract the log-likelihood (e.g., via stats::logLik()) from the fitted model.

Usage

LearnerClassifMultinom$loglik()
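
As a brief sketch of the intended use (assuming a trained learner; the "sonar" task mirrors the example below), the method returns the stats::logLik() value of the fitted model:

learner = lrn("classif.multinom")
learner$train(tsk("sonar"))
learner$loglik()  # log-likelihood of the fitted multinomial model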


Method clone()

The objects of this class are cloneable with this method.

Usage

LearnerClassifMultinom$clone(deep = FALSE)

Arguments

deep

Whether to make a deep clone.
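
For example, a deep clone is the usual way to obtain a copy that can be reconfigured independently of the original learner (a sketch, assuming learner was constructed as above):

learner2 = learner$clone(deep = TRUE)
learner2$param_set$values$decay = 0.01  # leaves the original learner untouched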

Examples

if (requireNamespace("nnet", quietly = TRUE)) {
# Define the Learner and set parameter values
learner = lrn("classif.multinom")
print(learner)

# Define a Task
task = tsk("sonar")

# Create train and test set
ids = partition(task)

# Train the learner on the training ids
learner$train(task, row_ids = ids$train)

# print the model
print(learner$model)

# print variable importance, if the learner supports it
if ("importance" %in% learner$properties) print(learner$importance())

# Make predictions for the test rows
predictions = learner$predict(task, row_ids = ids$test)

# Score the predictions
predictions$score()
}
#> <LearnerClassifMultinom:classif.multinom>: Multinomial Log-Linear Model
#> * Model: -
#> * Parameters: list()
#> * Packages: mlr3, mlr3learners, nnet
#> * Predict Types:  [response], prob
#> * Feature Types: logical, integer, numeric, factor
#> * Properties: loglik, multiclass, twoclass, weights
#> # weights:  62 (61 variable)
#> initial  value 96.347458 
#> iter  10 value 38.786551
#> iter  20 value 22.471243
#> iter  30 value 13.581243
#> iter  40 value 7.736340
#> iter  50 value 0.247483
#> iter  60 value 0.000373
#> final  value 0.000053 
#> converged
#> Call:
#> nnet::multinom(formula = Class ~ ., data = task$data())
#> 
#> Coefficients:
#>  (Intercept)           V1          V10          V11          V12          V13 
#>  1081.161826 -1356.181896  1078.127360 -2385.055112   -29.076346  -210.112377 
#>          V14          V15          V16          V17          V18          V19 
#>   605.062145   269.065587   178.889236   924.056301 -1136.634534   159.508691 
#>           V2          V20          V21          V22          V23          V24 
#> -1431.805191  -262.762888  -453.202414   646.307232  -298.961479 -1237.841911 
#>          V25          V26          V27          V28          V29           V3 
#>  1190.395974  -591.152935   157.847356  1217.665983 -1319.670629   772.115941 
#>          V30          V31          V32          V33          V34          V35 
#>  -979.695501  2485.901435 -2084.595660   302.307586  1768.814500 -1382.201262 
#>          V36          V37          V38          V39           V4          V40 
#>  1398.090281    30.161110  -108.066689 -1489.245524 -3562.950392  2146.106385 
#>          V41          V42          V43          V44          V45          V46 
#>   481.115652 -1390.270427    -5.979684 -1038.968011 -1973.241727  1514.632577 
#>          V47          V48          V49           V5          V50          V51 
#>   148.823013 -2954.938647 -1737.648668 -1605.542043  1398.348991 -1390.681420 
#>          V52          V53          V54          V55          V56          V57 
#> -1124.294015  -398.799166  -816.731143   -11.730199   -78.529447   334.354179 
#>          V58          V59           V6          V60           V7           V8 
#>  -549.981547  -353.167252    61.399793  -165.782124  1454.099928   761.936107 
#>           V9 
#>  -944.507513 
#> 
#> Residual Deviance: 0.0001060298 
#> AIC: 122.0001 
#> classif.ce 
#>  0.2463768