

Comparison study of K-nearest neighborhood classification algorithms
Journal of the Korean Data & Information Science Society 2019;30:977-85
Published online September 30, 2019.
© 2019 Korean Data and Information Science Society.

Yongsuk Jang1 · Beomjin Park2 · Changyi Park3

1,2,3Department of Statistics, University of Seoul
Correspondence to: Professor, Department of Statistics, University of Seoul, Seoul 02504, Korea.

This research was supported by the 2019 sabbatical year research grant of the University of Seoul.
Received July 22, 2019; Revised September 2, 2019; Accepted September 11, 2019.
This is an Open Access article distributed under the terms of the Creative Commons Attribution Non-Commercial License, which permits unrestricted non-commercial use, distribution, and reproduction in any medium, provided the original work is properly cited.
The K-nearest neighbor (K-NN) classifier has been adopted in various classification tasks such as image classification because its classification accuracy is generally acceptable given the simplicity of its implementation. While the weighted K-NN algorithm, which is based on the kernel smoothing technique used in local regression, makes the resulting decision boundary smooth, the kernel K-NN algorithm, which uses the kernel trick from kernel machines, makes the decision boundary more complex. For the kernel K-NN algorithm, we propose adopting a geometry-based criterion for selecting the tuning parameter of the Gaussian kernel, because selecting the tuning parameter via cross-validation can be computationally burdensome. Through analyses of simulated and real data, we compare the performances of the K-NN algorithms.
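To make the kernel K-NN idea concrete: under the kernel trick, the squared distance between two points in the feature space induced by a kernel k can be computed as k(x, x) - 2 k(x, z) + k(z, z), so nearest neighbors can be found in feature space without ever constructing the feature map. The sketch below is a minimal illustration of this mechanism with a Gaussian kernel, not the authors' implementation; the function names and the default value of the tuning parameter sigma are assumptions for illustration only (the paper selects sigma via a geometry-based criterion rather than fixing it).

```python
import numpy as np

def gaussian_kernel(x, z, sigma=1.0):
    # Gaussian (RBF) kernel; sigma is the tuning parameter discussed in the paper
    return np.exp(-np.sum((x - z) ** 2) / (2 * sigma ** 2))

def kernel_knn_predict(X_train, y_train, x, k=3, sigma=1.0):
    """Classify x by majority vote among its k nearest neighbors,
    with distances measured in the kernel-induced feature space."""
    # kernel-trick squared distance: k(x,x) - 2 k(x,z) + k(z,z)
    d2 = np.array([
        gaussian_kernel(x, x, sigma)
        - 2.0 * gaussian_kernel(x, z, sigma)
        + gaussian_kernel(z, z, sigma)
        for z in X_train
    ])
    nn = np.argsort(d2)[:k]                      # indices of k nearest neighbors
    labels, counts = np.unique(y_train[nn], return_counts=True)
    return labels[np.argmax(counts)]             # majority vote

# Toy usage: two well-separated clusters
X = np.array([[0.0, 0.0], [0.0, 1.0], [5.0, 5.0], [5.0, 6.0]])
y = np.array([0, 0, 1, 1])
print(kernel_knn_predict(X, y, np.array([0.0, 0.5]), k=3))
```

Note that with a single global Gaussian kernel this distance is a monotone function of the Euclidean distance, so the neighbor sets coincide with ordinary K-NN; the value of the kernel formulation lies in allowing other kernels or locally varying tuning parameters.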
Keywords: Geometry-based criterion, kernel K-nearest neighbor algorithm, weighted K-nearest neighbor algorithm.