The k-nearest neighbor (k-NN) pattern classifier is a simple yet effective learner. However, it has several drawbacks, one of which is its large model size: the classifier must store the entire training set. A number of algorithms can condense the k-NN model at the expense of accuracy, so boosting is desirable for recovering the accuracy of these condensed models. Unfortunately, no existing boosting algorithm works well with k-NN directly. We present a direct boosting algorithm for the k-NN classifier that creates an ensemble of models with locally modified distance weighting. An empirical study conducted on 10 standard databases from the UCI repository shows that this new Boosted k-NN algorithm increases generalization accuracy on the majority of the datasets and never performs worse than standard k-NN.
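To make the starting point concrete, the following is a minimal sketch of a plain distance-weighted k-NN classifier, the baseline the thesis builds on. This is an illustrative, hypothetical implementation, not the thesis's Boosted k-NN: the thesis additionally learns local modifications to these weights across an ensemble, which is not reproduced here.

```python
# Minimal distance-weighted k-NN sketch (illustrative only; the thesis's
# Boosted k-NN further warps the distance weighting locally per model).
import math
from collections import defaultdict

def knn_predict(train, query, k=3):
    """Classify `query` by an inverse-distance-weighted vote of its
    k nearest neighbors. `train` is a list of ((features...), label) pairs."""
    # Distance to every stored training point -- k-NN keeps the whole
    # training set, which is the "large model size" drawback noted above.
    dists = sorted((math.dist(x, query), y) for x, y in train)
    votes = defaultdict(float)
    for d, label in dists[:k]:
        votes[label] += 1.0 / (d + 1e-9)  # closer neighbors vote more
    return max(votes, key=votes.get)

train = [((0.0, 0.0), "a"), ((0.1, 0.2), "a"),
         ((1.0, 1.0), "b"), ((0.9, 1.1), "b")]
print(knn_predict(train, (0.2, 0.1)))  # → a
```

Condensing algorithms shrink the `train` list to a few prototypes; the boosting described above then compensates for the accuracy lost in that condensation.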
College and Department
Physical and Mathematical Sciences; Computer Science
BYU ScholarsArchive Citation
Neo, TohKoon, "A Direct Boosting Algorithm for the K-Nearest-Neighbor Classifier via Local Warping of the Distance Metric" (2007). All Theses and Dissertations. 1248.
Keywords
computer science, machine learning, k nearest neighbor, knn, boosting