Single Layer Neural Network.
Calls nnet::nnet.formula()
from package nnet.
Note that modern neural networks with multiple layers are available via package mlr3torch.
Dictionary
This mlr3::Learner can be instantiated via the dictionary mlr3::mlr_learners or with the associated sugar function mlr3::lrn():
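For example, using the learner key shown in the Examples below:

lrn("regr.nnet")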
Meta Information
Task type: “regr”
Predict Types: “response”
Feature Types: “logical”, “integer”, “numeric”, “factor”, “ordered”
Required Packages: mlr3, mlr3learners, nnet
Parameters
Id | Type | Default | Levels | Range
Hess | logical | FALSE | TRUE, FALSE | -
MaxNWts | integer | 1000 | - | \([1, \infty)\)
Wts | untyped | - | - | -
abstol | numeric | 1e-04 | - | \((-\infty, \infty)\)
censored | logical | FALSE | TRUE, FALSE | -
contrasts | untyped | NULL | - | -
decay | numeric | 0 | - | \((-\infty, \infty)\)
mask | untyped | - | - | -
maxit | integer | 100 | - | \([1, \infty)\)
na.action | untyped | - | - | -
rang | numeric | 0.7 | - | \((-\infty, \infty)\)
reltol | numeric | 1e-08 | - | \((-\infty, \infty)\)
size | integer | 3 | - | \([0, \infty)\)
skip | logical | FALSE | TRUE, FALSE | -
subset | untyped | - | - | -
trace | logical | TRUE | TRUE, FALSE | -
formula | untyped | - | - | -
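Hyperparameters can be passed to lrn() at construction or changed later via the learner's param_set. A minimal sketch (the values size = 5, decay = 0.1, and maxit = 200 are illustrative choices, not recommendations):

# Construct the learner with explicit hyperparameter values (illustrative)
learner = lrn("regr.nnet", size = 5, decay = 0.1, trace = FALSE)
# Change a value after construction
learner$param_set$values$maxit = 200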
References
Ripley BD (1996). Pattern Recognition and Neural Networks. Cambridge University Press. doi:10.1017/cbo9780511812651.
See also
Chapter in the mlr3book: https://mlr3book.mlr-org.com/chapters/chapter2/data_and_basic_modeling.html#sec-learners
Package mlr3extralearners for more learners.
as.data.table(mlr_learners) for a table of available Learners in the running session (depending on the loaded packages).
mlr3pipelines to combine learners with pre- and postprocessing steps.
Extension packages for additional task types:
mlr3proba for probabilistic supervised regression and survival analysis.
mlr3cluster for unsupervised clustering.
mlr3tuning for tuning of hyperparameters, mlr3tuningspaces for established default tuning spaces.
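As a sketch of the tuning workflow mentioned above, assuming the mlr3tuning tune() helper; the ranges for size and decay and the budget of 20 evaluations are illustrative assumptions, not defaults of this learner:

library(mlr3)
library(mlr3learners)
library(mlr3tuning)
library(paradox)

# Mark size and decay as tunable (illustrative ranges); trace = FALSE silences optimizer output
learner = lrn("regr.nnet",
  size  = to_tune(1, 10),
  decay = to_tune(0, 0.5),
  trace = FALSE
)

# Random search over the tuning space, evaluated by cross-validated MSE
instance = tune(
  tuner = tnr("random_search"),
  task = tsk("mtcars"),
  learner = learner,
  resampling = rsmp("cv", folds = 3),
  measures = msr("regr.mse"),
  term_evals = 20
)

# Best configuration found
instance$result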
Other Learner:
mlr_learners_classif.cv_glmnet, mlr_learners_classif.glmnet, mlr_learners_classif.kknn, mlr_learners_classif.lda, mlr_learners_classif.log_reg, mlr_learners_classif.multinom, mlr_learners_classif.naive_bayes, mlr_learners_classif.nnet, mlr_learners_classif.qda, mlr_learners_classif.ranger, mlr_learners_classif.svm, mlr_learners_classif.xgboost, mlr_learners_regr.cv_glmnet, mlr_learners_regr.glmnet, mlr_learners_regr.kknn, mlr_learners_regr.km, mlr_learners_regr.lm, mlr_learners_regr.ranger, mlr_learners_regr.svm, mlr_learners_regr.xgboost
Super classes
mlr3::Learner -> mlr3::LearnerRegr -> LearnerRegrNnet
Examples
if (requireNamespace("nnet", quietly = TRUE)) {
# Define the Learner and set parameter values
learner = lrn("regr.nnet")
print(learner)
# Define a Task
task = tsk("mtcars")
# Create train and test set
ids = partition(task)
# Train the learner on the training ids
learner$train(task, row_ids = ids$train)
# Print the model
print(learner$model)
# Importance method
if ("importance" %in% learner$properties) print(learner$importance())
# Make predictions for the test rows
predictions = learner$predict(task, row_ids = ids$test)
# Score the predictions
predictions$score()
}
#> <LearnerRegrNnet:regr.nnet>: Single Layer Neural Network
#> * Model: -
#> * Parameters: size=3
#> * Packages: mlr3, mlr3learners, nnet
#> * Predict Types: [response]
#> * Feature Types: logical, integer, numeric, factor, ordered
#> * Properties: weights
#> # weights: 37
#> initial value 8445.242250
#> iter 10 value 346.375403
#> iter 20 value 176.439132
#> iter 30 value 158.345415
#> iter 40 value 139.968094
#> iter 50 value 82.148219
#> iter 60 value 34.713929
#> iter 70 value 34.119941
#> iter 80 value 34.108462
#> iter 80 value 34.108462
#> iter 80 value 34.108462
#> final value 34.108462
#> converged
#> a 10-3-1 network with 37 weights
#> inputs: am carb cyl disp drat gear hp qsec vs wt
#> output(s): mpg
#> options were - linear output units
#> regr.mse
#> 11.74087