Article

Is the k-NN classifier in high dimensions affected by the curse of dimensionality?

Journal

COMPUTERS & MATHEMATICS WITH APPLICATIONS
Volume 65, Issue 10, Pages 1427-1437

Publisher

PERGAMON-ELSEVIER SCIENCE LTD
DOI: 10.1016/j.camwa.2012.09.011

Keywords

Nearest neighbour search; The curse of dimensionality; Approximate k-NN classifier; Borel dimensionality reduction


There is an increasing body of evidence suggesting that exact nearest neighbour search in high-dimensional spaces is affected by the curse of dimensionality at a fundamental level. Does this necessarily mean that the same is true for learning algorithms based on k nearest neighbours, such as the k-NN classifier? We analyse this question at a number of levels and show that the answer is different at each of them. As our first main observation, we show the consistency of a k approximate nearest neighbour classifier. However, the performance of the classifier in very high dimensions is provably unstable. As our second main observation, we point out that the existing model of statistical learning is oblivious to the dimension of the domain, and so every learning problem admits a universally consistent deterministic reduction to the one-dimensional case by means of a Borel isomorphism. (C) 2012 Elsevier Ltd. All rights reserved.
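
The two ingredients named in the abstract, a k-NN majority-vote classifier and a reduction of a multi-dimensional learning problem to the one-dimensional case via a measurable map into the reals, can be illustrated with a short sketch. The Python code below is not the paper's construction: the digit-interleaving map is a standard textbook example of a Borel-type embedding of [0,1)^d into [0,1), and all names and parameters (knn_predict, interleave_digits, n_digits, the toy data) are illustrative assumptions.

import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x, k=5):
    # Exact brute-force k-NN: majority vote among the k closest training points.
    dists = np.linalg.norm(X_train - x, axis=1)
    nearest = np.argsort(dists)[:k]
    return Counter(y_train[nearest]).most_common(1)[0][0]

def interleave_digits(x, n_digits=12, base=10):
    # Map a point of [0,1)^d to a single number in [0,1) by interleaving the
    # base-10 digits of its coordinates.  This is a standard example of a
    # Borel-type reduction to one dimension, not the paper's exact construction.
    frac = np.asarray(x, dtype=float).copy()
    rounds = []
    for _ in range(n_digits):
        frac *= base
        digit = np.floor(frac).astype(int)
        rounds.append(digit)
        frac -= digit
    out, scale = 0.0, 1.0 / base
    for digits in rounds:          # round 0 of every coordinate, then round 1, ...
        for d in digits:
            out += d * scale
            scale /= base
    return out

# Toy usage: classify a query point in the original space and in its 1-D image.
rng = np.random.default_rng(0)
X = rng.random((200, 5))                    # 200 points in [0,1)^5
y = (X[:, 0] + X[:, 1] > 1.0).astype(int)   # synthetic labels
x_new = rng.random(5)

pred_exact = knn_predict(X, y, x_new, k=5)
X_1d = np.array([[interleave_digits(row)] for row in X])
pred_1d = knn_predict(X_1d, y, np.array([interleave_digits(x_new)]), k=5)
print(pred_exact, pred_1d)

The interleaved representation keeps points distinct, but distances in the one-dimensional image behave very differently from those in the original space; this is consistent with the abstract's point that the reduction is measure-theoretic rather than metric.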
