Naive Bayes Classification Learner
Source: R/LearnerClassifNaiveBayes.R (mlr_learners_classif.naive_bayes.Rd)
Naive Bayes classification. Calls e1071::naiveBayes() from package e1071.
Dictionary
This Learner can be instantiated via the dictionary mlr_learners or with the associated sugar function lrn():
lrn("classif.naive_bayes")
Meta Information
Task type: “classif”
Predict Types: “response”, “prob”
Feature Types: “logical”, “integer”, “numeric”, “factor”
Required Packages: mlr3, mlr3learners, e1071
Parameters
Id | Type | Default | Range |
eps | numeric | 0 | \((-\infty, \infty)\) |
laplace | numeric | 0 | \([0, \infty)\) |
threshold | numeric | 0.001 | \((-\infty, \infty)\) |
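How these parameters act can be sketched in base R. This is a minimal illustration of the smoothing and clamping arithmetic, not the e1071 internals; the helper names and example counts are made up:

```r
# laplace: additive smoothing for a categorical feature's class-conditional
# probabilities, P(x = v | y) = (count + laplace) / (n_class + laplace * n_levels).
laplace_smooth <- function(counts, laplace = 0) {
  (counts + laplace) / (sum(counts) + laplace * length(counts))
}

counts <- c(a = 3, b = 0, c = 7)  # observed level counts within one class

# Without smoothing, the unseen level "b" gets probability 0, which would
# zero out the whole product of per-feature likelihoods.
p0 <- laplace_smooth(counts, laplace = 0)
p1 <- laplace_smooth(counts, laplace = 1)

# eps and threshold act at predict time: probabilities at or below eps are
# replaced by threshold, so a single zero cell cannot dominate the product.
clamp <- function(p, eps = 0, threshold = 0.001) ifelse(p <= eps, threshold, p)
clamp(p0)
```

With the defaults above (eps = 0, threshold = 0.001), the zero cell for "b" is replaced by 0.001 while the other probabilities pass through unchanged.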
See also
Chapter in the mlr3book: https://mlr3book.mlr-org.com/chapters/chapter2/data_and_basic_modeling.html#sec-learners
Package mlr3extralearners for more learners.
Dictionary of Learners: mlr_learners
as.data.table(mlr_learners) for a table of available Learners in the running session (depending on the loaded packages).
mlr3pipelines to combine learners with pre- and postprocessing steps.
Extension packages for additional task types:
mlr3proba for probabilistic supervised regression and survival analysis.
mlr3cluster for unsupervised clustering.
mlr3tuning for tuning of hyperparameters, mlr3tuningspaces for established default tuning spaces.
Other Learner:
mlr_learners_classif.cv_glmnet,
mlr_learners_classif.glmnet,
mlr_learners_classif.kknn,
mlr_learners_classif.lda,
mlr_learners_classif.log_reg,
mlr_learners_classif.multinom,
mlr_learners_classif.nnet,
mlr_learners_classif.qda,
mlr_learners_classif.ranger,
mlr_learners_classif.svm,
mlr_learners_classif.xgboost,
mlr_learners_regr.cv_glmnet,
mlr_learners_regr.glmnet,
mlr_learners_regr.kknn,
mlr_learners_regr.km,
mlr_learners_regr.lm,
mlr_learners_regr.nnet,
mlr_learners_regr.ranger,
mlr_learners_regr.svm,
mlr_learners_regr.xgboost
Super classes
mlr3::Learner -> mlr3::LearnerClassif -> LearnerClassifNaiveBayes
Examples
if (requireNamespace("e1071", quietly = TRUE)) {
# Define the Learner and set parameter values
learner = lrn("classif.naive_bayes")
print(learner)
# Define a Task
task = tsk("sonar")
# Create train and test set
ids = partition(task)
# Train the learner on the training ids
learner$train(task, row_ids = ids$train)
# print the model
print(learner$model)
# importance method
if ("importance" %in% learner$properties) print(learner$importance())
# Make predictions for the test rows
predictions = learner$predict(task, row_ids = ids$test)
# Score the predictions
predictions$score()
}
#> <LearnerClassifNaiveBayes:classif.naive_bayes>: Naive Bayes
#> * Model: -
#> * Parameters: list()
#> * Packages: mlr3, mlr3learners, e1071
#> * Predict Types: [response], prob
#> * Feature Types: logical, integer, numeric, factor
#> * Properties: multiclass, twoclass
#>
#> Naive Bayes Classifier for Discrete Predictors
#>
#> Call:
#> naiveBayes.default(x = x, y = y)
#>
#> A-priori probabilities:
#> y
#> M R
#> 0.5323741 0.4676259
#>
#> Conditional probabilities:
#> V1
#> y [,1] [,2]
#> M 0.03206351 0.02340116
#> R 0.02248615 0.01408958
#>
#> V10
#> y [,1] [,2]
#> M 0.2512081 0.1330358
#> R 0.1629446 0.1080144
#>
#> V11
#> y [,1] [,2]
#> M 0.2970432 0.1272075
#> R 0.1751123 0.1084565
#>
#> V12
#> y [,1] [,2]
#> M 0.3164054 0.1303706
#> R 0.1853969 0.1308910
#>
#> V13
#> y [,1] [,2]
#> M 0.3313986 0.1402126
#> R 0.2248800 0.1461785
#>
#> V14
#> y [,1] [,2]
#> M 0.3380622 0.1762075
#> R 0.2766308 0.1842206
#>
#> V15
#> y [,1] [,2]
#> M 0.3517676 0.2047663
#> R 0.3159354 0.2355087
#>
#> V16
#> y [,1] [,2]
#> M 0.3927203 0.224722
#> R 0.3791708 0.271190
#>
#> V17
#> y [,1] [,2]
#> M 0.4320932 0.2545494
#> R 0.4172846 0.3045279
#>
#> V18
#> y [,1] [,2]
#> M 0.4763662 0.2688041
#> R 0.4441815 0.2743434
#>
#> V19
#> y [,1] [,2]
#> M 0.5527419 0.2723451
#> R 0.4628092 0.2494252
#>
#> V2
#> y [,1] [,2]
#> M 0.04327027 0.03316476
#> R 0.03079231 0.02490068
#>
#> V20
#> y [,1] [,2]
#> M 0.6280135 0.2570207
#> R 0.4964754 0.2519864
#>
#> V21
#> y [,1] [,2]
#> M 0.6912041 0.2342915
#> R 0.5320277 0.2393104
#>
#> V22
#> y [,1] [,2]
#> M 0.7100932 0.2193231
#> R 0.5604154 0.2513247
#>
#> V23
#> y [,1] [,2]
#> M 0.7130351 0.2196259
#> R 0.6184062 0.2435786
#>
#> V24
#> y [,1] [,2]
#> M 0.7205770 0.2096406
#> R 0.6715662 0.2343353
#>
#> V25
#> y [,1] [,2]
#> M 0.7000405 0.2279788
#> R 0.6857862 0.2460989
#>
#> V26
#> y [,1] [,2]
#> M 0.7220365 0.2278832
#> R 0.7135338 0.2263837
#>
#> V27
#> y [,1] [,2]
#> M 0.7365189 0.2558385
#> R 0.6957062 0.2147518
#>
#> V28
#> y [,1] [,2]
#> M 0.7099216 0.2673797
#> R 0.6678338 0.1939931
#>
#> V29
#> y [,1] [,2]
#> M 0.6345919 0.2535190
#> R 0.6238954 0.2451006
#>
#> V3
#> y [,1] [,2]
#> M 0.05017162 0.03687514
#> R 0.03696615 0.02809303
#>
#> V30
#> y [,1] [,2]
#> M 0.5571338 0.2153258
#> R 0.5742369 0.2388500
#>
#> V31
#> y [,1] [,2]
#> M 0.4661635 0.2169021
#> R 0.5289785 0.2063389
#>
#> V32
#> y [,1] [,2]
#> M 0.4078986 0.2088024
#> R 0.4549215 0.2211376
#>
#> V33
#> y [,1] [,2]
#> M 0.3740797 0.1826422
#> R 0.4587677 0.2240939
#>
#> V34
#> y [,1] [,2]
#> M 0.3541797 0.1885155
#> R 0.4740569 0.2630218
#>
#> V35
#> y [,1] [,2]
#> M 0.3254595 0.2311782
#> R 0.4869446 0.2711054
#>
#> V36
#> y [,1] [,2]
#> M 0.3101419 0.2402579
#> R 0.4841785 0.2741363
#>
#> V37
#> y [,1] [,2]
#> M 0.3084324 0.2102877
#> R 0.4252062 0.2499377
#>
#> V38
#> y [,1] [,2]
#> M 0.3136189 0.1849106
#> R 0.3481862 0.2179329
#>
#> V39
#> y [,1] [,2]
#> M 0.3216797 0.1667914
#> R 0.3021892 0.2051456
#>
#> V4
#> y [,1] [,2]
#> M 0.06165270 0.04513934
#> R 0.04273385 0.03249057
#>
#> V40
#> y [,1] [,2]
#> M 0.2934878 0.1489260
#> R 0.3141569 0.1816461
#>
#> V41
#> y [,1] [,2]
#> M 0.2862216 0.1652392
#> R 0.2824815 0.1607351
#>
#> V42
#> y [,1] [,2]
#> M 0.2975486 0.1628150
#> R 0.2520508 0.1511561
#>
#> V43
#> y [,1] [,2]
#> M 0.2726892 0.1364089
#> R 0.2125492 0.1121381
#>
#> V44
#> y [,1] [,2]
#> M 0.2350203 0.14203277
#> R 0.1765123 0.09007585
#>
#> V45
#> y [,1] [,2]
#> M 0.2243284 0.16494355
#> R 0.1438062 0.08776186
#>
#> V46
#> y [,1] [,2]
#> M 0.1754068 0.13217894
#> R 0.1214508 0.08112291
#>
#> V47
#> y [,1] [,2]
#> M 0.13614189 0.08070572
#> R 0.09014462 0.05884216
#>
#> V48
#> y [,1] [,2]
#> M 0.10608108 0.06107323
#> R 0.06707231 0.04499084
#>
#> V49
#> y [,1] [,2]
#> M 0.06339865 0.03515377
#> R 0.03741231 0.02828350
#>
#> V5
#> y [,1] [,2]
#> M 0.08241216 0.05477640
#> R 0.06454769 0.05000945
#>
#> V50
#> y [,1] [,2]
#> M 0.02261757 0.01341814
#> R 0.01836462 0.01213459
#>
#> V51
#> y [,1] [,2]
#> M 0.01858919 0.012159923
#> R 0.01235538 0.008673632
#>
#> V52
#> y [,1] [,2]
#> M 0.01419865 0.00794023
#> R 0.01024154 0.00702731
#>
#> V53
#> y [,1] [,2]
#> M 0.010494595 0.006159632
#> R 0.008803077 0.005747390
#>
#> V54
#> y [,1] [,2]
#> M 0.012454054 0.008331985
#> R 0.009787692 0.005574510
#>
#> V55
#> y [,1] [,2]
#> M 0.009444595 0.008025749
#> R 0.008469231 0.004951102
#>
#> V56
#> y [,1] [,2]
#> M 0.008594595 0.005684198
#> R 0.007173846 0.004111428
#>
#> V57
#> y [,1] [,2]
#> M 0.007405405 0.004743904
#> R 0.008009231 0.005182022
#>
#> V58
#> y [,1] [,2]
#> M 0.009263514 0.006887264
#> R 0.006029231 0.004118166
#>
#> V59
#> y [,1] [,2]
#> M 0.008804054 0.007103355
#> R 0.007049231 0.005094731
#>
#> V6
#> y [,1] [,2]
#> M 0.1126743 0.05004777
#> R 0.1029723 0.06939848
#>
#> V60
#> y [,1] [,2]
#> M 0.007345946 0.006354595
#> R 0.005501538 0.003084310
#>
#> V7
#> y [,1] [,2]
#> M 0.1306716 0.05569400
#> R 0.1218062 0.06649418
#>
#> V8
#> y [,1] [,2]
#> M 0.1474838 0.07985112
#> R 0.1184292 0.08291581
#>
#> V9
#> y [,1] [,2]
#> M 0.2053284 0.11081163
#> R 0.1407769 0.09794916
#>
#> classif.ce
#> 0.3478261