k-Nearest-Neighbor Classification Learner
Source: R/LearnerClassifKKNN.R
mlr_learners_classif.kknn.Rd
k-Nearest-Neighbor classification.
Calls kknn::kknn() from package kknn.
Note
There is no separate training step for k-NN models; the training data is merely stored and processed during the predict step. Therefore, $model returns a list with the following elements:
formula: Formula for calling kknn::kknn() during $predict().
data: Training data for calling kknn::kknn() during $predict().
pv: Training parameters for calling kknn::kknn() during $predict().
kknn: Model as returned by kknn::kknn(); only available after $predict() has been called. It is not stored by default; set the hyperparameter store_model to TRUE to keep it.
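For illustration, the sketch below trains the learner, inspects the stored model list, and keeps the fitted kknn object by enabling store_model (it assumes the "sonar" task shipped with mlr3; element names follow the list above):

library(mlr3)
library(mlr3learners)

task = tsk("sonar")
learner = lrn("classif.kknn", store_model = TRUE)

# "training" only stores the formula, data and parameter values
learner$train(task)
names(learner$model)

# the fitted kknn object becomes available after predicting,
# because store_model was set to TRUE
learner$predict(task)
class(learner$model$kknn)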
Dictionary
This Learner can be instantiated via the dictionary mlr_learners or with the associated sugar function lrn():
mlr_learners$get("classif.kknn")
lrn("classif.kknn")
Meta Information
Task type: “classif”
Predict Types: “response”, “prob”
Feature Types: “logical”, “integer”, “numeric”, “factor”, “ordered”
Required Packages: mlr3, mlr3learners, kknn
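For example, class probabilities instead of hard labels can be requested via the predict type listed above (a minimal sketch using standard mlr3 accessors):

library(mlr3)
library(mlr3learners)

# switch from the default "response" to probability predictions
learner = lrn("classif.kknn", predict_type = "prob")
learner$predict_type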
Parameters
Id | Type | Default | Levels | Range |
k | integer | 7 | - | \([1, \infty)\) |
distance | numeric | 2 | - | \([0, \infty)\) |
kernel | character | optimal | rectangular, triangular, epanechnikov, biweight, triweight, cos, inv, gaussian, rank, optimal | - |
scale | logical | TRUE | TRUE, FALSE | - |
ykernel | untyped | - | - | - |
store_model | logical | FALSE | TRUE, FALSE | - |
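As a brief illustration (parameter names as in the table above; the chosen values are arbitrary), hyperparameters can be set at construction time or later through the parameter set:

library(mlr3)
library(mlr3learners)

# set hyperparameters when constructing the learner ...
learner = lrn("classif.kknn", k = 10, kernel = "gaussian", distance = 1)

# ... or modify them afterwards via the parameter set
learner$param_set$values$k = 15
learner$param_set$values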
References
Hechenbichler, Klaus, Schliep, Klaus (2004). “Weighted k-nearest-neighbor techniques and ordinal classification.” Technical Report Discussion Paper 399, SFB 386, Ludwig-Maximilians University Munich. doi:10.5282/ubm/epub.1769.
Samworth, Richard J (2012). “Optimal weighted nearest neighbour classifiers.” The Annals of Statistics, 40(5), 2733--2763. doi:10.1214/12-AOS1049.
Cover, Thomas, Hart, Peter (1967). “Nearest neighbor pattern classification.” IEEE Transactions on Information Theory, 13(1), 21--27. doi:10.1109/TIT.1967.1053964.
See also
Chapter in the mlr3book: https://mlr3book.mlr-org.com/chapters/chapter2/data_and_basic_modeling.html#sec-learners
Package mlr3extralearners for more learners.
Dictionary of Learners: mlr_learners
as.data.table(mlr_learners) for a table of available Learners in the running session (depending on the loaded packages).
mlr3pipelines to combine learners with pre- and postprocessing steps.
Extension packages for additional task types:
mlr3proba for probabilistic supervised regression and survival analysis.
mlr3cluster for unsupervised clustering.
mlr3tuning for tuning of hyperparameters, mlr3tuningspaces for established default tuning spaces.
Other Learner:
mlr_learners_classif.cv_glmnet, mlr_learners_classif.glmnet, mlr_learners_classif.lda, mlr_learners_classif.log_reg, mlr_learners_classif.multinom, mlr_learners_classif.naive_bayes, mlr_learners_classif.nnet, mlr_learners_classif.qda, mlr_learners_classif.ranger, mlr_learners_classif.svm, mlr_learners_classif.xgboost, mlr_learners_regr.cv_glmnet, mlr_learners_regr.glmnet, mlr_learners_regr.kknn, mlr_learners_regr.km, mlr_learners_regr.lm, mlr_learners_regr.nnet, mlr_learners_regr.ranger, mlr_learners_regr.svm, mlr_learners_regr.xgboost
Super classes
mlr3::Learner -> mlr3::LearnerClassif -> LearnerClassifKKNN
Examples
if (requireNamespace("kknn", quietly = TRUE)) {
learner = mlr3::lrn("classif.kknn")
print(learner)
# available parameters:
learner$param_set$ids()
}
#> <LearnerClassifKKNN:classif.kknn>: k-Nearest-Neighbor
#> * Model: -
#> * Parameters: k=7
#> * Packages: mlr3, mlr3learners, kknn
#> * Predict Types: [response], prob
#> * Feature Types: logical, integer, numeric, factor, ordered
#> * Properties: multiclass, twoclass
#> [1] "k" "distance" "kernel" "scale" "ykernel"
#> [6] "store_model"