Multinomial log-linear learner via neural networks
Source: R/LearnerClassifMultinom.R
mlr_learners_classif.multinom.Rd
Multinomial log-linear models via neural networks. Calls nnet::multinom() from package nnet.
Dictionary
This mlr3::Learner can be instantiated via the dictionary mlr3::mlr_learners or with the associated sugar function mlr3::lrn():

mlr_learners$get("classif.multinom")
lrn("classif.multinom")
Meta Information
Task type: “classif”
Predict Types: “response”, “prob”
Feature Types: “logical”, “integer”, “numeric”, “factor”
Required Packages: mlr3, mlr3learners, nnet
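Since "prob" is listed among the predict types above, the learner can return class probabilities instead of hard labels. A minimal sketch using the standard mlr3 constructor:

library(mlr3)
library(mlr3learners)

# Request class probabilities instead of hard class labels
learner = lrn("classif.multinom", predict_type = "prob")
learner$predict_type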
Parameters
Id       | Type      | Default | Levels      | Range
Hess     | logical   | FALSE   | TRUE, FALSE | -
abstol   | numeric   | 1e-04   | -           | \((-\infty, \infty)\)
censored | logical   | FALSE   | TRUE, FALSE | -
decay    | numeric   | 0       | -           | \((-\infty, \infty)\)
entropy  | logical   | FALSE   | TRUE, FALSE | -
mask     | untyped   | -       | -           | -
maxit    | integer   | 100     | -           | \([1, \infty)\)
MaxNWts  | integer   | 1000    | -           | \([1, \infty)\)
model    | logical   | FALSE   | TRUE, FALSE | -
linout   | logical   | FALSE   | TRUE, FALSE | -
rang     | numeric   | 0.7     | -           | \((-\infty, \infty)\)
reltol   | numeric   | 1e-08   | -           | \((-\infty, \infty)\)
size     | integer   | -       | -           | \([1, \infty)\)
skip     | logical   | FALSE   | TRUE, FALSE | -
softmax  | logical   | FALSE   | TRUE, FALSE | -
summ     | character | 0       | 0, 1, 2, 3  | -
trace    | logical   | TRUE    | TRUE, FALSE | -
Wts      | untyped   | -       | -           | -
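Any hyperparameter from the table can be set at construction time or changed later through the learner's parameter set. A minimal sketch (the values decay = 0.1, trace = FALSE, and maxit = 200 are purely illustrative, not recommended defaults):

library(mlr3)
library(mlr3learners)

# Set hyperparameters at construction ...
learner = lrn("classif.multinom", decay = 0.1, trace = FALSE)

# ... or modify them later via the parameter set
learner$param_set$values$maxit = 200
learner$param_set$values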
See also
Chapter in the mlr3book: https://mlr3book.mlr-org.com/chapters/chapter2/data_and_basic_modeling.html#sec-learners
Package mlr3extralearners for more learners.
as.data.table(mlr_learners) for a table of available Learners in the running session (depending on the loaded packages).
mlr3pipelines to combine learners with pre- and postprocessing steps.
Extension packages for additional task types:
mlr3proba for probabilistic supervised regression and survival analysis.
mlr3cluster for unsupervised clustering.
mlr3tuning for tuning of hyperparameters, mlr3tuningspaces for established default tuning spaces.
Other Learner:
mlr_learners_classif.cv_glmnet, mlr_learners_classif.glmnet, mlr_learners_classif.kknn, mlr_learners_classif.lda, mlr_learners_classif.log_reg, mlr_learners_classif.naive_bayes, mlr_learners_classif.nnet, mlr_learners_classif.qda, mlr_learners_classif.ranger, mlr_learners_classif.svm, mlr_learners_classif.xgboost, mlr_learners_regr.cv_glmnet, mlr_learners_regr.glmnet, mlr_learners_regr.kknn, mlr_learners_regr.km, mlr_learners_regr.lm, mlr_learners_regr.nnet, mlr_learners_regr.ranger, mlr_learners_regr.svm, mlr_learners_regr.xgboost
Super classes
mlr3::Learner -> mlr3::LearnerClassif -> LearnerClassifMultinom
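The same chain is visible at runtime via base R's class(); a quick check:

# Class hierarchy of a constructed learner
class(lrn("classif.multinom"))
# "LearnerClassifMultinom" "LearnerClassif" "Learner" "R6"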
Methods
Method loglik()
Extract the log-likelihood (e.g., via stats::logLik()) from the fitted model.
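A minimal sketch of calling this method on a trained learner (the task choice mirrors the example below):

library(mlr3)
library(mlr3learners)

learner = lrn("classif.multinom")
learner$train(tsk("sonar"))

# Log-likelihood of the fitted multinomial model
learner$loglik()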
Examples
if (requireNamespace("nnet", quietly = TRUE)) {
  library(mlr3)
  library(mlr3learners)

  # Define the Learner and set parameter values
  learner = lrn("classif.multinom")
  print(learner)

  # Define a Task
  task = tsk("sonar")

  # Create train and test set
  ids = partition(task)

  # Train the learner on the training ids
  learner$train(task, row_ids = ids$train)

  # Print the model
  print(learner$model)

  # Importance method (skipped if the learner lacks the property)
  if ("importance" %in% learner$properties) print(learner$importance())

  # Make predictions for the test rows
  predictions = learner$predict(task, row_ids = ids$test)

  # Score the predictions
  predictions$score()
}
#> <LearnerClassifMultinom:classif.multinom>: Multinomial Log-Linear Model
#> * Model: -
#> * Parameters: list()
#> * Packages: mlr3, mlr3learners, nnet
#> * Predict Types: [response], prob
#> * Feature Types: logical, integer, numeric, factor
#> * Properties: loglik, multiclass, twoclass, weights
#> # weights: 62 (61 variable)
#> initial value 96.347458
#> iter 10 value 38.786551
#> iter 20 value 22.471243
#> iter 30 value 13.581243
#> iter 40 value 7.736340
#> iter 50 value 0.247483
#> iter 60 value 0.000373
#> final value 0.000053
#> converged
#> Call:
#> nnet::multinom(formula = Class ~ ., data = task$data())
#>
#> Coefficients:
#> (Intercept) V1 V10 V11 V12 V13
#> 1081.161826 -1356.181896 1078.127360 -2385.055112 -29.076346 -210.112377
#> V14 V15 V16 V17 V18 V19
#> 605.062145 269.065587 178.889236 924.056301 -1136.634534 159.508691
#> V2 V20 V21 V22 V23 V24
#> -1431.805191 -262.762888 -453.202414 646.307232 -298.961479 -1237.841911
#> V25 V26 V27 V28 V29 V3
#> 1190.395974 -591.152935 157.847356 1217.665983 -1319.670629 772.115941
#> V30 V31 V32 V33 V34 V35
#> -979.695501 2485.901435 -2084.595660 302.307586 1768.814500 -1382.201262
#> V36 V37 V38 V39 V4 V40
#> 1398.090281 30.161110 -108.066689 -1489.245524 -3562.950392 2146.106385
#> V41 V42 V43 V44 V45 V46
#> 481.115652 -1390.270427 -5.979684 -1038.968011 -1973.241727 1514.632577
#> V47 V48 V49 V5 V50 V51
#> 148.823013 -2954.938647 -1737.648668 -1605.542043 1398.348991 -1390.681420
#> V52 V53 V54 V55 V56 V57
#> -1124.294015 -398.799166 -816.731143 -11.730199 -78.529447 334.354179
#> V58 V59 V6 V60 V7 V8
#> -549.981547 -353.167252 61.399793 -165.782124 1454.099928 761.936107
#> V9
#> -944.507513
#>
#> Residual Deviance: 0.0001060298
#> AIC: 122.0001
#> classif.ce
#> 0.2463768