Single Layer Neural Network. Calls nnet::nnet.formula() from package nnet.

Note that modern neural networks with multiple layers are available via the package mlr3torch.

Dictionary

This mlr3::Learner can be instantiated via the dictionary mlr3::mlr_learners or with the associated sugar function mlr3::lrn():

mlr_learners$get("classif.nnet")
lrn("classif.nnet")

Meta Information

  • Task type: “classif”

  • Predict Types: “response”, “prob”

  • Feature Types: “logical”, “integer”, “numeric”, “factor”, “ordered”

  • Required Packages: mlr3, mlr3learners, nnet
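
These fields can also be inspected programmatically on a constructed learner (a short illustrative sketch):

learner = lrn("classif.nnet")
learner$predict_types   # "response" "prob"
learner$feature_types   # feature types the learner accepts
learner$packages        # required packages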

Parameters

Id        | Type    | Default | Levels      | Range
----------|---------|---------|-------------|----------
Hess      | logical | FALSE   | TRUE, FALSE | -
MaxNWts   | integer | 1000    | -           | [1, ∞)
Wts       | untyped | -       | -           | -
abstol    | numeric | 1e-04   | -           | (-∞, ∞)
censored  | logical | FALSE   | TRUE, FALSE | -
contrasts | untyped | NULL    | -           | -
decay     | numeric | 0       | -           | (-∞, ∞)
mask      | untyped | -       | -           | -
maxit     | integer | 100     | -           | [1, ∞)
na.action | untyped | -       | -           | -
rang      | numeric | 0.7     | -           | (-∞, ∞)
reltol    | numeric | 1e-08   | -           | (-∞, ∞)
size      | integer | 3       | -           | [0, ∞)
skip      | logical | FALSE   | TRUE, FALSE | -
subset    | untyped | -       | -           | -
trace     | logical | TRUE    | TRUE, FALSE | -
formula   | untyped | -       | -           | -
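
As an illustrative sketch (parameter names and types are taken from the table above), hyperparameters can be set at construction time via lrn() or adjusted later through the learner's param_set:

# set hyperparameters at construction ...
learner = lrn("classif.nnet", size = 8, decay = 1e-4)

# ... or modify them afterwards
learner$param_set$values$maxit = 500    # allow more training iterations
learner$param_set$values$trace = FALSE  # silence the optimizer output
learner$param_set$values                # inspect the current values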

Initial parameter values

  • size:

    • Adjusted default: 3L.

    • Reason for change: no default in nnet().

Custom mlr3 parameters

  • formula: if not provided, the formula is set to task$formula().
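
A minimal sketch of supplying a custom formula instead (Class and V1-V3 refer to the sonar task used in the example below; by default the full task$formula() is used):

task = tsk("sonar")
learner = lrn("classif.nnet")
# restrict the model to a subset of features via a custom formula
learner$param_set$values$formula = Class ~ V1 + V2 + V3
learner$train(task)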

References

Ripley BD (1996). Pattern Recognition and Neural Networks. Cambridge University Press. doi:10.1017/cbo9780511812651.

See also

Other Learner: mlr_learners_classif.cv_glmnet, mlr_learners_classif.glmnet, mlr_learners_classif.kknn, mlr_learners_classif.lda, mlr_learners_classif.log_reg, mlr_learners_classif.multinom, mlr_learners_classif.naive_bayes, mlr_learners_classif.qda, mlr_learners_classif.ranger, mlr_learners_classif.svm, mlr_learners_classif.xgboost, mlr_learners_regr.cv_glmnet, mlr_learners_regr.glmnet, mlr_learners_regr.kknn, mlr_learners_regr.km, mlr_learners_regr.lm, mlr_learners_regr.nnet, mlr_learners_regr.ranger, mlr_learners_regr.svm, mlr_learners_regr.xgboost

Super classes

mlr3::Learner -> mlr3::LearnerClassif -> LearnerClassifNnet
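
This inheritance chain is reflected in the R6 class attribute of a constructed learner (illustrative sketch):

class(lrn("classif.nnet"))
# c("LearnerClassifNnet", "LearnerClassif", "Learner", "R6")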

Methods

Inherited methods


Method new()

Creates a new instance of this R6 class.

Usage

LearnerClassifNnet$new()

Method clone()

The objects of this class are cloneable with this method.

Usage

LearnerClassifNnet$clone(deep = FALSE)

Arguments

deep

Whether to make a deep clone.
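
For example (a short sketch), a deep clone yields an independent copy, so changing hyperparameters on the copy does not affect the original learner:

learner = lrn("classif.nnet", size = 3)
learner2 = learner$clone(deep = TRUE)
learner2$param_set$values$size = 10
learner$param_set$values$size   # still 3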

Examples

if (requireNamespace("nnet", quietly = TRUE)) {
# Define the Learner and set parameter values
learner = lrn("classif.nnet")
print(learner)

# Define a Task
task = tsk("sonar")

# Create train and test set
ids = partition(task)

# Train the learner on the training ids
learner$train(task, row_ids = ids$train)

# print the model
print(learner$model)

# importance method
if("importance" %in% learner$properties) print(learner$importance)

# Make predictions for the test rows
predictions = learner$predict(task, row_ids = ids$test)

# Score the predictions
predictions$score()
}
#> <LearnerClassifNnet:classif.nnet>: Single Layer Neural Network
#> * Model: -
#> * Parameters: size=3
#> * Packages: mlr3, mlr3learners, nnet
#> * Predict Types:  response, [prob]
#> * Feature Types: logical, integer, numeric, factor, ordered
#> * Properties: multiclass, twoclass, weights
#> # weights:  187
#> initial  value 98.709611 
#> iter  10 value 51.544891
#> iter  20 value 25.722922
#> iter  30 value 25.198270
#> iter  40 value 24.829482
#> iter  50 value 24.422203
#> iter  60 value 24.409340
#> iter  70 value 21.790075
#> iter  80 value 20.729026
#> iter  90 value 20.151006
#> iter 100 value 20.002287
#> final  value 20.002287 
#> stopped after 100 iterations
#> a 60-3-1 network with 187 weights
#> inputs: V1 V10 V11 V12 V13 V14 V15 V16 V17 V18 V19 V2 V20 V21 V22 V23 V24 V25 V26 V27 V28 V29 V3 V30 V31 V32 V33 V34 V35 V36 V37 V38 V39 V4 V40 V41 V42 V43 V44 V45 V46 V47 V48 V49 V5 V50 V51 V52 V53 V54 V55 V56 V57 V58 V59 V6 V60 V7 V8 V9 
#> output(s): Class 
#> options were - entropy fitting 
#> classif.ce 
#>  0.1449275
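
As a follow-up sketch, predictions$score() above uses the default classification measure (classif.ce); other mlr3 measures can be passed explicitly:

predictions$score(msr("classif.acc"))   # classification accuracy
predictions$confusion                   # confusion matrix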