Multinomial log-linear learner via neural networks
Source: R/LearnerClassifMultinom.R
mlr_learners_classif.multinom.Rd
Multinomial log-linear models via neural networks.
Calls nnet::multinom()
from package nnet.
Dictionary
This mlr3::Learner can be instantiated via the dictionary mlr3::mlr_learners or with the associated sugar function mlr3::lrn():
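lrn("classif.multinom")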
Meta Information
Task type: “classif”
Predict Types: “response”, “prob”
Feature Types: “logical”, “integer”, “numeric”, “factor”
Required Packages: mlr3, mlr3learners, nnet
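The first predict type listed ("response") is the default. To obtain class probabilities instead, the predict type can be set at construction, for example:

learner = lrn("classif.multinom", predict_type = "prob")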
Parameters
Id | Type | Default | Levels | Range |
Hess | logical | FALSE | TRUE, FALSE | - |
abstol | numeric | 1e-04 | - | \((-\infty, \infty)\) |
censored | logical | FALSE | TRUE, FALSE | - |
decay | numeric | 0 | - | \((-\infty, \infty)\) |
entropy | logical | FALSE | TRUE, FALSE | - |
mask | untyped | - | - | - |
maxit | integer | 100 | - | \([1, \infty)\) |
MaxNWts | integer | 1000 | - | \([1, \infty)\) |
model | logical | FALSE | TRUE, FALSE | - |
linout | logical | FALSE | TRUE, FALSE | - |
rang | numeric | 0.7 | - | \((-\infty, \infty)\) |
reltol | numeric | 1e-08 | - | \((-\infty, \infty)\) |
size | integer | - | - | \([1, \infty)\) |
skip | logical | FALSE | TRUE, FALSE | - |
softmax | logical | FALSE | TRUE, FALSE | - |
summ | character | 0 | 0, 1, 2, 3 | - |
trace | logical | TRUE | TRUE, FALSE | - |
Wts | untyped | - | - | - |
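Hyperparameters can be passed to the sugar function at construction or changed afterwards via the learner's param_set; the values below are purely illustrative, not recommended settings:

learner = lrn("classif.multinom", decay = 0.01, maxit = 200)
learner$param_set$values$trace = FALSE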
See also
Chapter in the mlr3book: https://mlr3book.mlr-org.com/chapters/chapter2/data_and_basic_modeling.html#sec-learners
Package mlr3extralearners for more learners.
as.data.table(mlr_learners)
for a table of available Learners in the running session (depending on the loaded packages).
mlr3pipelines to combine learners with pre- and postprocessing steps.
Extension packages for additional task types:
mlr3proba for probabilistic supervised regression and survival analysis.
mlr3cluster for unsupervised clustering.
mlr3tuning for tuning of hyperparameters, mlr3tuningspaces for established default tuning spaces.
Other Learner:
mlr_learners_classif.cv_glmnet,
mlr_learners_classif.glmnet,
mlr_learners_classif.kknn,
mlr_learners_classif.lda,
mlr_learners_classif.log_reg,
mlr_learners_classif.naive_bayes,
mlr_learners_classif.nnet,
mlr_learners_classif.qda,
mlr_learners_classif.ranger,
mlr_learners_classif.svm,
mlr_learners_classif.xgboost,
mlr_learners_regr.cv_glmnet,
mlr_learners_regr.glmnet,
mlr_learners_regr.kknn,
mlr_learners_regr.km,
mlr_learners_regr.lm,
mlr_learners_regr.nnet,
mlr_learners_regr.ranger,
mlr_learners_regr.svm,
mlr_learners_regr.xgboost
Super classes
mlr3::Learner -> mlr3::LearnerClassif -> LearnerClassifMultinom
Methods
Method loglik()
Extract the log-likelihood (e.g., via stats::logLik()) from the fitted model.
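For a trained learner (see the Examples below), the log-likelihood of the underlying nnet::multinom() fit can be extracted directly:

learner$loglik()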
Examples
if (requireNamespace("nnet", quietly = TRUE)) {
# Define the Learner and set parameter values
learner = lrn("classif.multinom")
print(learner)
# Define a Task
task = tsk("sonar")
# Create train and test set
ids = partition(task)
# Train the learner on the training ids
learner$train(task, row_ids = ids$train)
# Print the model
print(learner$model)
# Importance method, if supported by the learner
if ("importance" %in% learner$properties) print(learner$importance())
# Make predictions for the test rows
predictions = learner$predict(task, row_ids = ids$test)
# Score the predictions
predictions$score()
}
#> <LearnerClassifMultinom:classif.multinom>: Multinomial Log-Linear Model
#> * Model: -
#> * Parameters: list()
#> * Packages: mlr3, mlr3learners, nnet
#> * Predict Types: [response], prob
#> * Feature Types: logical, integer, numeric, factor
#> * Properties: multiclass, twoclass, weights
#> # weights: 62 (61 variable)
#> initial value 96.347458
#> iter 10 value 28.060727
#> iter 20 value 1.899489
#> iter 30 value 0.003516
#> final value 0.000058
#> converged
#> Call:
#> nnet::multinom(formula = Class ~ ., data = task$data())
#>
#> Coefficients:
#> (Intercept) V1 V10 V11 V12 V13
#> 606.48499 -292.62216 -142.11791 -588.13233 -88.06081 -314.66330
#> V14 V15 V16 V17 V18 V19
#> -134.59359 380.41387 -93.35268 203.69452 83.21463 30.36906
#> V2 V20 V21 V22 V23 V24
#> -208.73362 -213.45852 -335.38329 104.26954 -54.17037 -522.82205
#> V25 V26 V27 V28 V29 V3
#> 700.21783 -201.29489 -279.93755 276.00581 -122.70963 -55.82210
#> V30 V31 V32 V33 V34 V35
#> 39.08240 57.62629 -86.84656 -265.38407 345.24797 -149.54434
#> V36 V37 V38 V39 V4 V40
#> 165.80103 677.99313 -323.04585 -410.21814 -506.70203 468.79101
#> V41 V42 V43 V44 V45 V46
#> -74.72936 -118.46404 -278.18165 -225.21560 -453.56885 -472.40222
#> V47 V48 V49 V5 V50 V51
#> 29.90197 -336.13828 -343.42369 -451.68864 137.03675 -117.66505
#> V52 V53 V54 V55 V56 V57
#> -92.13127 -31.72623 -40.72676 44.68332 46.71611 53.03701
#> V58 V59 V6 V60 V7 V8
#> -34.35565 -51.57852 -168.28882 10.24126 163.11532 640.56455
#> V9
#> -57.59101
#>
#> Residual Deviance: 0.0001159859
#> AIC: 122.0001
#> classif.ce
#> 0.2463768