Multinomial log-linear learner via neural networks
Source: R/LearnerClassifMultinom.R, mlr_learners_classif.multinom.Rd
Multinomial log-linear models via neural networks.
Calls nnet::multinom() from package nnet.
Dictionary
This mlr3::Learner can be instantiated via the dictionary mlr3::mlr_learners or with the associated sugar function mlr3::lrn():

mlr_learners$get("classif.multinom")
lrn("classif.multinom")
Meta Information
Task type: “classif”
Predict Types: “response”, “prob”
Feature Types: “logical”, “integer”, “numeric”, “factor”
Required Packages: mlr3, mlr3learners, nnet
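The same meta information can be queried from the learner object at runtime. A minimal sketch (assuming mlr3 and mlr3learners are attached):

learner = lrn("classif.multinom")
learner$predict_types  # "response" "prob"
learner$feature_types  # "logical" "integer" "numeric" "factor"
learner$packages       # "mlr3" "mlr3learners" "nnet"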
Parameters
Id | Type | Default | Levels | Range |
Hess | logical | FALSE | TRUE, FALSE | - |
abstol | numeric | 1e-04 | - | \((-\infty, \infty)\) |
censored | logical | FALSE | TRUE, FALSE | - |
decay | numeric | 0 | - | \((-\infty, \infty)\) |
entropy | logical | FALSE | TRUE, FALSE | - |
mask | untyped | - | - | - |
maxit | integer | 100 | - | \([1, \infty)\) |
MaxNWts | integer | 1000 | - | \([1, \infty)\) |
model | logical | FALSE | TRUE, FALSE | - |
linout | logical | FALSE | TRUE, FALSE | - |
rang | numeric | 0.7 | - | \((-\infty, \infty)\) |
reltol | numeric | 1e-08 | - | \((-\infty, \infty)\) |
size | integer | - | - | \([1, \infty)\) |
skip | logical | FALSE | TRUE, FALSE | - |
softmax | logical | FALSE | TRUE, FALSE | - |
summ | character | 0 | 0, 1, 2, 3 | - |
trace | logical | TRUE | TRUE, FALSE | - |
Wts | untyped | - | - | - |
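Hyperparameters from the table above can be set at construction or later via the learner's param_set. A minimal sketch (assuming mlr3 and mlr3learners are attached; the chosen values are illustrative only):

# set values at construction time
learner = lrn("classif.multinom", decay = 0.1, maxit = 200, trace = FALSE)

# or update an existing learner and inspect the result
learner$param_set$values$Hess = TRUE
learner$param_set$values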
See also
Chapter in the mlr3book: https://mlr3book.mlr-org.com/chapters/chapter2/data_and_basic_modeling.html#sec-learners
Package mlr3extralearners for more learners.
as.data.table(mlr_learners) for a table of available Learners in the running session (depending on the loaded packages).
mlr3pipelines to combine learners with pre- and postprocessing steps.
Extension packages for additional task types:
mlr3proba for probabilistic supervised regression and survival analysis.
mlr3cluster for unsupervised clustering.
mlr3tuning for tuning of hyperparameters, mlr3tuningspaces for established default tuning spaces.
Other Learner:
mlr_learners_classif.cv_glmnet, mlr_learners_classif.glmnet, mlr_learners_classif.kknn, mlr_learners_classif.lda, mlr_learners_classif.log_reg, mlr_learners_classif.naive_bayes, mlr_learners_classif.nnet, mlr_learners_classif.qda, mlr_learners_classif.ranger, mlr_learners_classif.svm, mlr_learners_classif.xgboost, mlr_learners_regr.cv_glmnet, mlr_learners_regr.glmnet, mlr_learners_regr.kknn, mlr_learners_regr.km, mlr_learners_regr.lm, mlr_learners_regr.nnet, mlr_learners_regr.ranger, mlr_learners_regr.svm, mlr_learners_regr.xgboost
Super classes
mlr3::Learner -> mlr3::LearnerClassif -> LearnerClassifMultinom
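The inheritance chain can be checked directly. A minimal sketch:

# the class vector mirrors the hierarchy above (plus the R6 base class)
class(lrn("classif.multinom"))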
Examples
if (requireNamespace("nnet", quietly = TRUE)) {
# Define the Learner and set parameter values
learner = lrn("classif.multinom")
print(learner)
# Define a Task
task = tsk("sonar")
# Create train and test set
ids = partition(task)
# Train the learner on the training ids
learner$train(task, row_ids = ids$train)
# print the model
print(learner$model)
# importance method
if("importance" %in% learner$properties) print(learner$importance)
# Make predictions for the test rows
predictions = learner$predict(task, row_ids = ids$test)
# Score the predictions
predictions$score()
}
#>
#> ── <LearnerClassifMultinom> (classif.multinom): Multinomial Log-Linear Model ───
#> • Model: -
#> • Parameters: list()
#> • Packages: mlr3, mlr3learners, and nnet
#> • Predict Types: [response] and prob
#> • Feature Types: logical, integer, numeric, and factor
#> • Encapsulation: none (fallback: -)
#> • Properties: multiclass, twoclass, and weights
#> • Other settings: use_weights = 'use'
#> # weights: 62 (61 variable)
#> initial value 96.347458
#> iter 10 value 29.798498
#> iter 20 value 11.198546
#> iter 30 value 5.075010
#> iter 40 value 0.275305
#> iter 50 value 0.002855
#> final value 0.000057
#> converged
#> Call:
#> nnet::multinom(formula = Class ~ ., data = task$data())
#>
#> Coefficients:
#> (Intercept) V1 V10 V11 V12 V13
#> 1665.10612 -1881.84170 -1012.45299 -1544.70196 -1579.26284 -237.57831
#> V14 V15 V16 V17 V18 V19
#> -1965.84311 923.95870 45.30015 1976.62242 58.04681 633.93594
#> V2 V20 V21 V22 V23 V24
#> -448.94781 -811.92042 -1438.30926 -446.14325 533.76321 -3600.95559
#> V25 V26 V27 V28 V29 V3
#> 4691.39151 -701.57480 -1552.72546 1629.63646 -412.61476 777.61673
#> V30 V31 V32 V33 V34 V35
#> -294.99085 572.61023 -708.39364 -794.78207 1966.38381 -1802.32843
#> V36 V37 V38 V39 V4 V40
#> 1373.19784 3169.37153 -1213.36662 -1595.59491 -3248.85121 1578.02919
#> V41 V42 V43 V44 V45 V46
#> -1608.74882 1248.58616 -1831.67228 -2137.74375 -531.31338 -3161.98597
#> V47 V48 V49 V5 V50 V51
#> 589.26086 -2785.01296 -1697.40005 -1424.51982 674.63525 -856.48427
#> V52 V53 V54 V55 V56 V57
#> -45.65690 -161.85493 -209.51030 540.77070 285.13956 199.97640
#> V58 V59 V6 V60 V7 V8
#> -201.92567 -591.57092 920.87997 -570.09044 1987.50407 4093.80125
#> V9
#> -1361.00540
#>
#> Residual Deviance: 0.0001144722
#> AIC: 122.0001
#> classif.ce
#> 0.2318841
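Since "prob" is a supported predict type, the example above can be extended to class probabilities. A minimal sketch (assuming nnet is installed; classif.logloss is one possible probability-based measure):

learner = lrn("classif.multinom", predict_type = "prob")
task = tsk("sonar")
ids = partition(task)
learner$train(task, row_ids = ids$train)
predictions = learner$predict(task, row_ids = ids$test)

# matrix of per-class probabilities for the test rows
head(predictions$prob)

# score with a measure that consumes probabilities
predictions$score(msr("classif.logloss"))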