Statistics and Its Interface
Volume 15 (2022)
Number 2
A new $k$-nearest neighbors classifier for functional data
Pages: 247 – 260
DOI: https://dx.doi.org/10.4310/20-SII650
Abstract
For supervised classification of functional data, several classifiers have been proposed in the literature, including the well-known classic $k$-nearest neighbors (kNN) classifier. The classic kNN classifier selects the $k$ nearest neighbors of a new observation and determines its class membership by a majority vote. A difficulty arises when two classes receive the same largest number of votes. To overcome this difficulty, we propose a new kNN classifier that selects the $k$ nearest neighbors of a new observation from each class. The class membership of the new observation is then determined by the minimum average distance or semi-distance between these $k$ nearest neighbors and the new observation. Good performance of the new kNN classifier is demonstrated in three simulation studies and two real data examples.
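To make the idea concrete, the following is a minimal Python sketch of the per-class kNN rule described above; it is not the authors' implementation. It assumes each curve is observed on a common grid and uses the ordinary Euclidean (discretized $L^2$) distance; the function name, arguments, and the choice of distance are illustrative, and a semi-distance (e.g., one based on functional principal component scores) could be substituted for the distance computation.

import numpy as np

def classify_knn_per_class(x_new, X_train, y_train, k=5):
    """Assign x_new to the class whose k nearest training curves
    have the smallest average distance to x_new.

    x_new   : (T,) array, new curve evaluated on a common grid
    X_train : (n, T) array of training curves on the same grid
    y_train : (n,) array of class labels
    """
    best_label, best_avg = None, np.inf
    for label in np.unique(y_train):
        curves = X_train[y_train == label]
        # Euclidean distance between the new curve and each curve in this class
        dists = np.sqrt(((curves - x_new) ** 2).sum(axis=1))
        # average distance to the k nearest neighbors within this class;
        # guard against classes with fewer than k members
        k_eff = min(k, len(dists))
        avg = np.sort(dists)[:k_eff].mean()
        if avg < best_avg:
            best_label, best_avg = label, avg
    return best_label

Because every class contributes exactly $k$ neighbors and the decision is a minimum over class-wise average distances, the majority-vote ties of the classic kNN classifier cannot occur (up to exact equality of averages), which is the motivation stated in the abstract.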
Keywords
functional data analysis, supervised classification, functional dissimilarity measures, $k$-nearest neighbors classifier, ties broken, class imbalance problem
2010 Mathematics Subject Classification
Primary 62H30. Secondary 62M99.
This work was financially supported by the National University of Singapore academic research grant R-155-000-212-114.
Received 28 February 2020
Accepted 18 October 2020
Published 11 January 2022