Article

Locality constrained representation-based K-nearest neighbor classification

Journal

KNOWLEDGE-BASED SYSTEMS
Volume 167, Pages 38-52

Publisher

ELSEVIER SCIENCE BV
DOI: 10.1016/j.knosys.2019.01.016

Keywords

K-nearest neighbor rule; Local mean vector; Representation-based distance; Pattern recognition

Funding

  1. National Natural Science Foundation of China [61502208, 61762021, 61402122]
  2. Natural Science Foundation of Jiangsu Province of China [BK20150522]
  3. International Postdoctoral Exchange Fellowship Program of China Postdoctoral Council [20180051]
  4. Research Foundation for Talented Scholars of Jiangsu University [14JDG037]
  5. China Postdoctoral Science Foundation [2015M570411]
  6. Open Foundation of Artificial Intelligence Key Laboratory of Sichuan Province [2017RYJ04]
  7. Natural Science Foundation of Guizhou Province [[2017]1130, [2017]5726-32]


The k-nearest neighbor rule (KNN) is one of the most widely used methods in pattern recognition. However, KNN-based classification performance is severely affected by sensitivity to the neighborhood size k and by the simple majority voting within the k-neighborhood, especially for small sample sizes in the presence of outliers. To overcome these issues, we propose two locality-constrained representation-based k-nearest neighbor rules that further improve KNN-based classification performance. The first is the weighted representation-based k-nearest neighbor rule (WRKNN). In WRKNN, the test sample is represented as a linear combination of its k nearest neighbors from each class, and the localities of the k nearest neighbors per class serve as weights that constrain the corresponding representation coefficients. Using the per-class representation coefficients, the representation-based distance between the test sample and the class-specific k nearest neighbors is calculated and used as the classification decision rule. The second is the weighted local mean representation-based k-nearest neighbor rule (WLMRKNN). In WLMRKNN, k local mean vectors of the k nearest neighbors per class are first calculated and then used to represent the test sample. In this linear combination of the class-specific k local mean vectors, the localities of the local mean vectors per class act as weights that constrain their representation coefficients. These coefficients define the classification decision rule, namely the class-specific representation-based distance between the test sample and the k local mean vectors per class. To demonstrate the effectiveness of the proposed methods, we conduct extensive experiments on UCI and UCR data sets and on face databases, comparing against seven related competitive KNN-based methods.
The experimental results show that the proposed methods perform better and are less sensitive to k, especially for small sample sizes. (C) 2019 Elsevier B.V. All rights reserved.
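The WRKNN decision rule described in the abstract can be sketched as follows. This is a minimal illustrative implementation, not the authors' code: the function name `wrknn_predict`, the regularization parameter `lam`, and the exact form of the locality penalty (squared neighbor distances on the diagonal of a ridge-style regularizer) are assumptions about one plausible way to realize "locality-constrained representation coefficients"; the paper's precise formulation may differ.

```python
import numpy as np

def wrknn_predict(X_train, y_train, x_test, k=5, lam=1e-2):
    """Sketch of a weighted representation-based KNN rule (WRKNN-style).

    For each class: take the k training samples nearest to x_test,
    solve a locality-weighted regularized least-squares problem to
    represent x_test as a linear combination of those neighbors, and
    assign the class with the smallest representation residual.
    """
    classes = np.unique(y_train)
    residuals = []
    for c in classes:
        Xc = X_train[y_train == c]                  # samples of class c
        d = np.linalg.norm(Xc - x_test, axis=1)     # distances to the test sample
        idx = np.argsort(d)[:k]                     # indices of k nearest neighbors
        N = Xc[idx].T                               # (dim, k) neighbor matrix
        w = d[idx]                                  # locality weights (assumed penalty)
        # Regularized least squares: coefficients of farther neighbors
        # are penalized more strongly (the locality constraint).
        A = N.T @ N + lam * np.diag(w ** 2)
        beta = np.linalg.solve(A, N.T @ x_test)
        # Class-specific representation-based distance (residual norm).
        residuals.append(np.linalg.norm(x_test - N @ beta))
    return classes[int(np.argmin(residuals))]
```

A WLMRKNN-style variant would follow the same template, but would first replace the neighbor matrix `N` with the k cumulative local mean vectors of the sorted neighbors before solving for the coefficients.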


