R/LearnerClassifKKNN.R
mlr_learners_classif.kknn.Rd
k-Nearest-Neighbor classification.
Calls kknn::kknn() from package kknn.
There is no separate training step for k-NN models; the learner merely stores the
training data and processes it during the predict step.
Therefore, $model returns a list with the following elements (a short usage sketch follows the list):
formula: Formula for calling kknn::kknn() during $predict().
data: Training data for calling kknn::kknn() during $predict().
pars: Training parameters for calling kknn::kknn() during $predict().
kknn: Model as returned by kknn::kknn(), only available after $predict() has been called.
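A minimal sketch of how this plays out in practice, assuming the built-in "iris" example task from mlr3 (the task choice is arbitrary and the comments only restate the behavior described above):

library(mlr3)
library(mlr3learners)

task = tsk("iris")            # any classification task works here
learner = mlr3::lrn("classif.kknn")

learner$train(task)
names(learner$model)          # "formula", "data", "pars" -- no fitted kknn object yet

learner$predict(task)
names(learner$model)          # per the list above, now also contains "kknn"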
This Learner can be instantiated via the dictionary mlr_learners or with the associated sugar function lrn():
mlr_learners$get("classif.kknn")
lrn("classif.kknn")
Cover, Thomas, Hart, Peter (1967). “Nearest neighbor pattern classification.” IEEE Transactions on Information Theory, 13(1), 21--27. doi: 10.1109/TIT.1967.1053964.

Hechenbichler, Klaus, Schliep, Klaus (2004). “Weighted k-nearest-neighbor techniques and ordinal classification.” Technical Report Discussion Paper 399, SFB 386, Ludwig-Maximilians University Munich. doi: 10.5282/ubm/epub.1769.

Samworth, Richard J (2012). “Optimal weighted nearest neighbour classifiers.” The Annals of Statistics, 40(5), 2733--2763. doi: 10.1214/12-AOS1049.
mlr3::Learner -> mlr3::LearnerClassif -> LearnerClassifKKNN
new(): Creates a new instance of this R6 class.
LearnerClassifKKNN$new()
clone(): The objects of this class are cloneable with this method.
LearnerClassifKKNN$clone(deep = FALSE)
deep: Whether to make a deep clone.
if (requireNamespace("kknn")) {
  learner = mlr3::lrn("classif.kknn")
  print(learner)

  # available parameters:
  learner$param_set$ids()
}
#> <LearnerClassifKKNN:classif.kknn>
#> * Model: -
#> * Parameters: list()
#> * Packages: kknn
#> * Predict Type: response
#> * Feature types: logical, integer, numeric, factor, ordered
#> * Properties: multiclass, twoclass
#> [1] "k"        "distance" "kernel"   "scale"    "ykernel"
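Beyond printing the learner, the hyperparameters listed above can be set at construction time and the learner evaluated like any other mlr3 learner. The following is a hedged sketch: the parameter values and the "sonar" example task are arbitrary illustrations, not recommendations.

library(mlr3)
library(mlr3learners)

# set two of the listed hyperparameters (values chosen for illustration only)
learner = lrn("classif.kknn", k = 5, kernel = "rectangular")
task = tsk("sonar")

# 3-fold cross-validation, aggregated misclassification error
rr = resample(task, learner, rsmp("cv", folds = 3))
rr$aggregate(msr("classif.ce"))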