k-Nearest-Neighbor Classification Learner
Source: R/LearnerClassifKKNN.R
mlr_learners_classif.kknn.Rd
k-Nearest-Neighbor classification.
Calls kknn::kknn() from package kknn.
Note
There is no training step for k-NN models; the training data is merely stored and then processed during the predict step.
Therefore, $model returns a list with the following elements:
formula: Formula for calling kknn::kknn() during $predict().
data: Training data for calling kknn::kknn() during $predict().
pv: Training parameters for calling kknn::kknn() during $predict().
kknn: Model as returned by kknn::kknn(), only available after $predict() has been called. This is not stored by default; you must set the hyperparameter store_model to TRUE.
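As an illustration of the store_model flag described above, the following minimal sketch (reusing the sonar task from the example further down) keeps the fitted kknn object so that $model$kknn is populated after prediction:

library(mlr3)
library(mlr3learners)

# Opt in to storing the kknn object that is created at predict time
learner = lrn("classif.kknn", store_model = TRUE)
task = tsk("sonar")

learner$train(task)
learner$predict(task)

# Only populated after $predict(); it stays NULL otherwise
learner$model$kknn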
Dictionary
This mlr3::Learner can be instantiated via the dictionary mlr3::mlr_learners or with the associated sugar function mlr3::lrn():
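lrn("classif.kknn")

Equivalently, the learner can be retrieved from the dictionary directly (a small illustrative sketch):

mlr_learners$get("classif.kknn")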
Meta Information
Task type: “classif”
Predict Types: “response”, “prob”
Feature Types: “logical”, “integer”, “numeric”, “factor”, “ordered”
Required Packages: mlr3, mlr3learners, kknn
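Both predict types listed above can be selected when constructing the learner; for example, to request class probabilities instead of the default response labels:

# A short sketch: switch the predict type to "prob"
learner = lrn("classif.kknn", predict_type = "prob")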
Parameters
Id | Type | Default | Levels | Range |
k | integer | 7 | - | \([1, \infty)\) |
distance | numeric | 2 | - | \([0, \infty)\) |
kernel | character | optimal | rectangular, triangular, epanechnikov, biweight, triweight, cos, inv, gaussian, rank, optimal | - |
scale | logical | TRUE | TRUE, FALSE | - |
ykernel | untyped | NULL | - | - |
store_model | logical | FALSE | TRUE, FALSE | - |
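Hyperparameters from the table above can be set when the learner is constructed with mlr3::lrn() or changed later through its param_set; a small sketch:

# Set hyperparameters at construction time
learner = lrn("classif.kknn", k = 10, distance = 1, kernel = "gaussian")

# ... or modify them on an existing learner
learner$param_set$values$k = 15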
References
Hechenbichler, Klaus, Schliep, Klaus (2004). “Weighted k-nearest-neighbor techniques and ordinal classification.” Technical Report Discussion Paper 399, SFB 386, Ludwig-Maximilians University Munich. doi:10.5282/ubm/epub.1769.
Samworth, Richard J (2012). “Optimal weighted nearest neighbour classifiers.” The Annals of Statistics, 40(5), 2733--2763. doi:10.1214/12-AOS1049.
Cover, Thomas, Hart, Peter (1967). “Nearest neighbor pattern classification.” IEEE Transactions on Information Theory, 13(1), 21--27. doi:10.1109/TIT.1967.1053964.
See also
Chapter in the mlr3book: https://mlr3book.mlr-org.com/chapters/chapter2/data_and_basic_modeling.html#sec-learners
Package mlr3extralearners for more learners.
as.data.table(mlr_learners) for a table of available Learners in the running session (depending on the loaded packages).
mlr3pipelines to combine learners with pre- and postprocessing steps.
Extension packages for additional task types:
mlr3proba for probabilistic supervised regression and survival analysis.
mlr3cluster for unsupervised clustering.
mlr3tuning for tuning of hyperparameters, mlr3tuningspaces for established default tuning spaces.
Other Learner:
mlr_learners_classif.cv_glmnet,
mlr_learners_classif.glmnet,
mlr_learners_classif.lda,
mlr_learners_classif.log_reg,
mlr_learners_classif.multinom,
mlr_learners_classif.naive_bayes,
mlr_learners_classif.nnet,
mlr_learners_classif.qda,
mlr_learners_classif.ranger,
mlr_learners_classif.svm,
mlr_learners_classif.xgboost,
mlr_learners_regr.cv_glmnet,
mlr_learners_regr.glmnet,
mlr_learners_regr.kknn,
mlr_learners_regr.km,
mlr_learners_regr.lm,
mlr_learners_regr.nnet,
mlr_learners_regr.ranger,
mlr_learners_regr.svm,
mlr_learners_regr.xgboost
Super classes
mlr3::Learner -> mlr3::LearnerClassif -> LearnerClassifKKNN
Examples
if (requireNamespace("kknn", quietly = TRUE)) {
# Define the Learner and set parameter values
learner = lrn("classif.kknn")
print(learner)
# Define a Task
task = tsk("sonar")
# Create train and test set
ids = partition(task)
# Train the learner on the training ids
learner$train(task, row_ids = ids$train)
# Print the model
print(learner$model)
# Print variable importance, if the learner supports it
if ("importance" %in% learner$properties) print(learner$importance())
# Make predictions for the test rows
predictions = learner$predict(task, row_ids = ids$test)
# Score the predictions
predictions$score()
}
#> <LearnerClassifKKNN:classif.kknn>: k-Nearest-Neighbor
#> * Model: -
#> * Parameters: k=7
#> * Packages: mlr3, mlr3learners, kknn
#> * Predict Types: [response], prob
#> * Feature Types: logical, integer, numeric, factor, ordered
#> * Properties: multiclass, twoclass
#> $formula
#> Class ~ .
#> NULL
#>
#> $data
#> Class V1 V10 V11 V12 V13 V14 V15 V16 V17
#> <fctr> <num> <num> <num> <num> <num> <num> <num> <num> <num>
#> 1: R 0.0200 0.2111 0.1609 0.1582 0.2238 0.0645 0.0660 0.2273 0.3100
#> 2: R 0.0262 0.6194 0.6333 0.7060 0.5544 0.5320 0.6479 0.6931 0.6759
#> 3: R 0.0762 0.4459 0.4152 0.3952 0.4256 0.4135 0.4528 0.5326 0.7306
#> 4: R 0.0286 0.3039 0.2988 0.4250 0.6343 0.8198 1.0000 0.9988 0.9508
#> 5: R 0.0317 0.3513 0.1786 0.0658 0.0513 0.3752 0.5419 0.5440 0.5150
#> ---
#> 135: M 0.0335 0.2660 0.3188 0.3553 0.3116 0.1965 0.1780 0.2794 0.2870
#> 136: M 0.0272 0.3997 0.3941 0.3309 0.2926 0.1760 0.1739 0.2043 0.2088
#> 137: M 0.0323 0.2154 0.3085 0.3425 0.2990 0.1402 0.1235 0.1534 0.1901
#> 138: M 0.0303 0.2354 0.2898 0.2812 0.1578 0.0273 0.0673 0.1444 0.2070
#> 139: M 0.0260 0.2354 0.2720 0.2442 0.1665 0.0336 0.1302 0.1708 0.2177
#> V18 V19 V2 V20 V21 V22 V23 V24 V25 V26
#> <num> <num> <num> <num> <num> <num> <num> <num> <num> <num>
#> 1: 0.2999 0.5078 0.0371 0.4797 0.5783 0.5071 0.4328 0.5550 0.6711 0.6415
#> 2: 0.7551 0.8929 0.0582 0.8619 0.7974 0.6737 0.4293 0.3648 0.5331 0.2413
#> 3: 0.6193 0.2032 0.0666 0.4636 0.4148 0.4292 0.5730 0.5399 0.3161 0.2285
#> 4: 0.9025 0.7234 0.0453 0.5122 0.2074 0.3985 0.5890 0.2872 0.2043 0.5782
#> 5: 0.4262 0.2024 0.0956 0.4233 0.7723 0.9735 0.9390 0.5559 0.5268 0.6826
#> ---
#> 135: 0.3969 0.5599 0.0258 0.6936 0.7969 0.7452 0.8203 0.9261 0.8810 0.8814
#> 136: 0.2678 0.2434 0.0378 0.1839 0.2802 0.6172 0.8015 0.8313 0.8440 0.8494
#> 137: 0.2429 0.2120 0.0101 0.2395 0.3272 0.5949 0.8302 0.9045 0.9888 0.9912
#> 138: 0.2645 0.2828 0.0353 0.4293 0.5685 0.6990 0.7246 0.7622 0.9242 1.0000
#> 139: 0.3175 0.3714 0.0363 0.4552 0.5700 0.7397 0.8062 0.8837 0.9432 1.0000
#> V27 V28 V29 V3 V30 V31 V32 V33 V34 V35
#> <num> <num> <num> <num> <num> <num> <num> <num> <num> <num>
#> 1: 0.7104 0.8080 0.6791 0.0428 0.3857 0.1307 0.2604 0.5121 0.7547 0.8537
#> 2: 0.5070 0.8533 0.6036 0.1099 0.8514 0.8512 0.5045 0.1862 0.2709 0.4232
#> 3: 0.6995 1.0000 0.7262 0.0481 0.4724 0.5103 0.5459 0.2881 0.0981 0.1951
#> 4: 0.5389 0.3750 0.3411 0.0277 0.5067 0.5580 0.4778 0.3299 0.2198 0.1407
#> 5: 0.5713 0.5429 0.2177 0.1321 0.2149 0.5811 0.6323 0.2965 0.1873 0.2969
#> ---
#> 135: 0.9301 0.9955 0.8576 0.0398 0.6069 0.3934 0.2464 0.1645 0.1140 0.0956
#> 136: 0.9168 1.0000 0.7896 0.0488 0.5371 0.6472 0.6505 0.4959 0.2175 0.0990
#> 137: 0.9448 1.0000 0.9092 0.0298 0.7412 0.7691 0.7117 0.5304 0.2131 0.0928
#> 138: 0.9979 0.8297 0.7032 0.0490 0.7141 0.6893 0.4961 0.2584 0.0969 0.0776
#> 139: 0.9375 0.7603 0.7123 0.0136 0.8358 0.7622 0.4567 0.1715 0.1549 0.1641
#> V36 V37 V38 V39 V4 V40 V41 V42 V43 V44
#> <num> <num> <num> <num> <num> <num> <num> <num> <num> <num>
#> 1: 0.8507 0.6692 0.6097 0.4943 0.0207 0.2744 0.0510 0.2834 0.2825 0.4256
#> 2: 0.3043 0.6116 0.6756 0.5375 0.1083 0.4719 0.4647 0.2587 0.2129 0.2222
#> 3: 0.4181 0.4604 0.3217 0.2828 0.0394 0.2430 0.1979 0.2444 0.1847 0.0841
#> 4: 0.2856 0.3807 0.4158 0.4054 0.0174 0.3296 0.2707 0.2650 0.0723 0.1238
#> 5: 0.5163 0.6153 0.4283 0.5479 0.1408 0.6133 0.5017 0.2377 0.1957 0.1749
#> ---
#> 135: 0.0080 0.0702 0.0936 0.0894 0.0570 0.1127 0.0873 0.1020 0.1964 0.2256
#> 136: 0.0434 0.1708 0.1979 0.1880 0.0848 0.1108 0.1702 0.0585 0.0638 0.1391
#> 137: 0.1297 0.1159 0.1226 0.1768 0.0564 0.0345 0.1562 0.0824 0.1149 0.1694
#> 138: 0.0364 0.1572 0.1823 0.1349 0.0608 0.0849 0.0492 0.1367 0.1552 0.1548
#> 139: 0.1869 0.2655 0.1713 0.0959 0.0272 0.0768 0.0847 0.2076 0.2505 0.1862
#> V45 V46 V47 V48 V49 V5 V50 V51 V52 V53
#> <num> <num> <num> <num> <num> <num> <num> <num> <num> <num>
#> 1: 0.2641 0.1386 0.1051 0.1343 0.0383 0.0954 0.0324 0.0232 0.0027 0.0065
#> 2: 0.2111 0.0176 0.1348 0.0744 0.0130 0.0974 0.0106 0.0033 0.0232 0.0166
#> 3: 0.0692 0.0528 0.0357 0.0085 0.0230 0.0590 0.0046 0.0156 0.0031 0.0054
#> 4: 0.1192 0.1089 0.0623 0.0494 0.0264 0.0384 0.0081 0.0104 0.0045 0.0014
#> 5: 0.1304 0.0597 0.1124 0.1047 0.0507 0.1674 0.0159 0.0195 0.0201 0.0248
#> ---
#> 135: 0.1814 0.2012 0.1688 0.1037 0.0501 0.0529 0.0136 0.0130 0.0120 0.0039
#> 136: 0.0638 0.0581 0.0641 0.1044 0.0732 0.1127 0.0275 0.0146 0.0091 0.0045
#> 137: 0.0954 0.0080 0.0790 0.1255 0.0647 0.0760 0.0179 0.0051 0.0061 0.0093
#> 138: 0.1319 0.0985 0.1258 0.0954 0.0489 0.0167 0.0241 0.0042 0.0086 0.0046
#> 139: 0.1439 0.1470 0.0991 0.0041 0.0154 0.0214 0.0116 0.0181 0.0146 0.0129
#> V54 V55 V56 V57 V58 V59 V6 V60 V7 V8
#> <num> <num> <num> <num> <num> <num> <num> <num> <num> <num>
#> 1: 0.0159 0.0072 0.0167 0.0180 0.0084 0.0090 0.0986 0.0032 0.1539 0.1601
#> 2: 0.0095 0.0180 0.0244 0.0316 0.0164 0.0095 0.2280 0.0078 0.2431 0.3771
#> 3: 0.0105 0.0110 0.0015 0.0072 0.0048 0.0107 0.0649 0.0094 0.1209 0.2467
#> 4: 0.0038 0.0013 0.0089 0.0057 0.0027 0.0051 0.0990 0.0062 0.1201 0.1833
#> 5: 0.0131 0.0070 0.0138 0.0092 0.0143 0.0036 0.1710 0.0103 0.0731 0.1401
#> ---
#> 135: 0.0053 0.0062 0.0046 0.0045 0.0022 0.0005 0.1091 0.0031 0.1709 0.1684
#> 136: 0.0043 0.0043 0.0098 0.0054 0.0051 0.0065 0.1103 0.0103 0.1349 0.2337
#> 137: 0.0135 0.0063 0.0063 0.0034 0.0032 0.0062 0.0958 0.0067 0.0990 0.1018
#> 138: 0.0126 0.0036 0.0035 0.0034 0.0079 0.0036 0.1354 0.0048 0.1465 0.1123
#> 139: 0.0047 0.0039 0.0061 0.0040 0.0036 0.0061 0.0338 0.0115 0.0655 0.1400
#> V9
#> <num>
#> 1: 0.3109
#> 2: 0.5598
#> 3: 0.3564
#> 4: 0.2105
#> 5: 0.2083
#> ---
#> 135: 0.1865
#> 136: 0.3113
#> 137: 0.1030
#> 138: 0.1945
#> 139: 0.1843
#>
#> $pv
#> $pv$k
#> [1] 7
#>
#>
#> $kknn
#> NULL
#>
#> classif.ce
#> 0.1884058