
Multinomial log-linear models via neural networks. Calls nnet::multinom() from package nnet.

Dictionary

This mlr3::Learner can be instantiated via the dictionary mlr3::mlr_learners or with the associated sugar function mlr3::lrn():

mlr_learners$get("classif.multinom")
lrn("classif.multinom")

Meta Information

  • Task type: “classif”

  • Predict Types: “response”, “prob” (a sketch of switching the predict type follows this list)

  • Feature Types: “logical”, “integer”, “numeric”, “factor”

  • Required Packages: mlr3, mlr3learners, nnet
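
By default the learner predicts hard class labels ("response"). Below is a minimal sketch of requesting class probabilities instead; it assumes mlr3 and mlr3learners are attached and uses the builtin "sonar" task purely for illustration:

library(mlr3)
library(mlr3learners)

# request probability predictions instead of hard class labels
learner = lrn("classif.multinom", predict_type = "prob")

task = tsk("sonar")
learner$train(task)

# the prediction object now holds a probability matrix with one column per class
prediction = learner$predict(task)
prediction$prob[1:3, ]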

Parameters

Id        Type       Default  Levels       Range
Hess      logical    FALSE    TRUE, FALSE  -
abstol    numeric    1e-04    -            \((-\infty, \infty)\)
censored  logical    FALSE    TRUE, FALSE  -
decay     numeric    0        -            \((-\infty, \infty)\)
entropy   logical    FALSE    TRUE, FALSE  -
mask      untyped    -        -            -
maxit     integer    100      -            \([1, \infty)\)
MaxNWts   integer    1000     -            \([1, \infty)\)
model     logical    FALSE    TRUE, FALSE  -
linout    logical    FALSE    TRUE, FALSE  -
rang      numeric    0.7      -            \((-\infty, \infty)\)
reltol    numeric    1e-08    -            \((-\infty, \infty)\)
size      integer    -        -            \([1, \infty)\)
skip      logical    FALSE    TRUE, FALSE  -
softmax   logical    FALSE    TRUE, FALSE  -
summ      character  0        0, 1, 2, 3   -
trace     logical    TRUE     TRUE, FALSE  -
Wts       untyped    -        -            -
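
These hyperparameters are forwarded to nnet::multinom(). The snippet below is a minimal sketch of setting a few of them at construction time and inspecting the stored configuration, assuming mlr3 and mlr3learners are attached; the chosen values are illustrative, not recommendations:

# set weight decay, raise the iteration limit and silence the optimizer trace
learner = lrn("classif.multinom", decay = 0.01, maxit = 200, trace = FALSE)

# the values are recorded in the learner's parameter set
learner$param_set$values

# parameters can also be changed after construction
learner$param_set$values$Hess = TRUE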

See also

Other Learner: mlr_learners_classif.cv_glmnet, mlr_learners_classif.glmnet, mlr_learners_classif.kknn, mlr_learners_classif.lda, mlr_learners_classif.log_reg, mlr_learners_classif.naive_bayes, mlr_learners_classif.nnet, mlr_learners_classif.qda, mlr_learners_classif.ranger, mlr_learners_classif.svm, mlr_learners_classif.xgboost, mlr_learners_regr.cv_glmnet, mlr_learners_regr.glmnet, mlr_learners_regr.kknn, mlr_learners_regr.km, mlr_learners_regr.lm, mlr_learners_regr.nnet, mlr_learners_regr.ranger, mlr_learners_regr.svm, mlr_learners_regr.xgboost

Super classes

mlr3::Learner -> mlr3::LearnerClassif -> LearnerClassifMultinom

Methods

Inherited methods


Method new()

Creates a new instance of this R6 class.

Usage

LearnerClassifMultinom$new()


Method loglik()

Extract the log-likelihood (e.g., via stats::logLik()) from the fitted model.

Usage

LearnerClassifMultinom$loglik()
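
A minimal sketch of calling this method after training, assuming mlr3 and mlr3learners are attached and nnet is installed:

# train on a builtin task, then extract the log-likelihood of the fitted model
learner = lrn("classif.multinom", trace = FALSE)
learner$train(tsk("sonar"))
learner$loglik()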


Method clone()

The objects of this class are cloneable with this method.

Usage

LearnerClassifMultinom$clone(deep = FALSE)

Arguments

deep

Whether to make a deep clone.

Examples

library(mlr3)
library(mlr3learners)

if (requireNamespace("nnet", quietly = TRUE)) {
# Define the Learner and set parameter values
learner = lrn("classif.multinom")
print(learner)

# Define a Task
task = tsk("sonar")

# Create train and test set
ids = partition(task)

# Train the learner on the training ids
learner$train(task, row_ids = ids$train)

# print the model
print(learner$model)

# importance method
if("importance" %in% learner$properties) print(learner$importance)

# Make predictions for the test rows
predictions = learner$predict(task, row_ids = ids$test)

# Score the predictions
predictions$score()
}
#> <LearnerClassifMultinom:classif.multinom>: Multinomial Log-Linear Model
#> * Model: -
#> * Parameters: list()
#> * Packages: mlr3, mlr3learners, nnet
#> * Predict Types:  [response], prob
#> * Feature Types: logical, integer, numeric, factor
#> * Properties: loglik, multiclass, twoclass, weights
#> # weights:  62 (61 variable)
#> initial  value 96.347458 
#> iter  10 value 28.060727
#> iter  20 value 1.899489
#> iter  30 value 0.003516
#> final  value 0.000058 
#> converged
#> Call:
#> nnet::multinom(formula = Class ~ ., data = task$data())
#> 
#> Coefficients:
#> (Intercept)          V1         V10         V11         V12         V13 
#>   606.48499  -292.62216  -142.11791  -588.13233   -88.06081  -314.66330 
#>         V14         V15         V16         V17         V18         V19 
#>  -134.59359   380.41387   -93.35268   203.69452    83.21463    30.36906 
#>          V2         V20         V21         V22         V23         V24 
#>  -208.73362  -213.45852  -335.38329   104.26954   -54.17037  -522.82205 
#>         V25         V26         V27         V28         V29          V3 
#>   700.21783  -201.29489  -279.93755   276.00581  -122.70963   -55.82210 
#>         V30         V31         V32         V33         V34         V35 
#>    39.08240    57.62629   -86.84656  -265.38407   345.24797  -149.54434 
#>         V36         V37         V38         V39          V4         V40 
#>   165.80103   677.99313  -323.04585  -410.21814  -506.70203   468.79101 
#>         V41         V42         V43         V44         V45         V46 
#>   -74.72936  -118.46404  -278.18165  -225.21560  -453.56885  -472.40222 
#>         V47         V48         V49          V5         V50         V51 
#>    29.90197  -336.13828  -343.42369  -451.68864   137.03675  -117.66505 
#>         V52         V53         V54         V55         V56         V57 
#>   -92.13127   -31.72623   -40.72676    44.68332    46.71611    53.03701 
#>         V58         V59          V6         V60          V7          V8 
#>   -34.35565   -51.57852  -168.28882    10.24126   163.11532   640.56455 
#>          V9 
#>   -57.59101 
#> 
#> Residual Deviance: 0.0001159859 
#> AIC: 122.0001 
#> classif.ce 
#>  0.2463768
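
The same learner can also be assessed by resampling instead of a single holdout split; the sketch below continues from the task and learner defined in the example above, with the fold count and measure as illustrative choices rather than part of the original example:

# 3-fold cross-validation on the same task, aggregated to mean accuracy
rr = resample(task, learner, rsmp("cv", folds = 3))
rr$aggregate(msr("classif.acc"))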