k-Nearest-Neighbor regression.
Calls kknn::kknn() from package kknn.
There is no real training step for k-NN models; training only stores the training data, which is processed during the predict step.
Therefore, $model returns a list with the following elements (illustrated in the sketch after this list):
formula: Formula for calling kknn::kknn() during $predict().
data: Training data for calling kknn::kknn() during $predict().
pars: Training parameters for calling kknn::kknn() during $predict().
kknn: Model as returned by kknn::kknn(), only available after $predict() has been called.
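A minimal sketch of this behaviour, assuming mlr3, the package providing this learner, and kknn are installed and attached, and using the built-in "mtcars" regression task purely for illustration:

library(mlr3)

task = tsk("mtcars")                 # illustrative regression task
learner = lrn("regr.kknn", k = 5)

learner$train(task)
names(learner$model)                 # the stored formula, data and pars

learner$predict(task)
learner$model$kknn                   # per the list above, the kknn::kknn() fit is available only after $predict()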
This Learner can be instantiated via the dictionary mlr_learners or with the associated sugar function lrn():

mlr_learners$get("regr.kknn")
lrn("regr.kknn")
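Hyperparameters such as k, distance or kernel can be passed directly to lrn() during construction; a small sketch, assuming this package and mlr3 are attached (the parameter names are those listed in the example output at the bottom of this page):

# lrn() constructs the learner and sets hyperparameters in one call
learner = lrn("regr.kknn", k = 7, kernel = "optimal")
learner$param_set$values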
Cover T, Hart P (1967). “Nearest neighbor pattern classification.” IEEE Transactions on Information Theory, 13(1), 21--27. doi: 10.1109/TIT.1967.1053964.

Hechenbichler K, Schliep K (2004). “Weighted k-nearest-neighbor techniques and ordinal classification.” Technical Report, Discussion Paper 399, SFB 386, Ludwig-Maximilians University Munich. doi: 10.5282/ubm/epub.1769.

Samworth RJ (2012). “Optimal weighted nearest neighbour classifiers.” The Annals of Statistics, 40(5), 2733--2763. doi: 10.1214/12-AOS1049.
mlr3::Learner -> mlr3::LearnerRegr -> LearnerRegrKKNN
new(): Creates a new instance of this R6 class.
LearnerRegrKKNN$new()

clone(): The objects of this class are cloneable with this method.
LearnerRegrKKNN$clone(deep = FALSE)
deep: Whether to make a deep clone.
if (requireNamespace("kknn")) {
  learner = mlr3::lrn("regr.kknn")
  print(learner)

  # available parameters:
  learner$param_set$ids()
}
#> <LearnerRegrKKNN:regr.kknn>
#> * Model: -
#> * Parameters: list()
#> * Packages: kknn
#> * Predict Type: response
#> * Feature types: logical, integer, numeric, factor, ordered
#> * Properties: -
#> [1] "k"        "distance" "kernel"   "scale"    "ykernel"
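A short end-to-end sketch building on the example above, assuming mlr3 and this package are attached and kknn is installed; the task and resampling choices are illustrative only:

library(mlr3)

task = tsk("mtcars")
learner = lrn("regr.kknn", k = 5)

# 3-fold cross-validation, aggregated with root mean squared error
rr = resample(task, learner, rsmp("cv", folds = 3))
rr$aggregate(msr("regr.rmse"))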