Article

EEkNN: k-Nearest Neighbor Classifier with an Evidential Editing Procedure for Training Samples

Journal

ELECTRONICS
Volume 8, Issue 5, Pages: -

Publisher

MDPI
DOI: 10.3390/electronics8050592

Keywords

pattern classification; k-nearest-neighbor classifier; fuzzy editing; evidential editing; belief function theory

Funding

  1. National Natural Science Foundation of China [61790552, 61801386]
  2. Natural Science Basic Research Plan in Shaanxi Province of China [2018JQ6043]
  3. China Postdoctoral Science Foundation [2019M653743]
  4. Aerospace Science and Technology Foundation of China

Abstract

The k-nearest neighbor (kNN) rule is one of the most popular classification algorithms applied in many fields because it is very simple to understand and easy to design. However, one of the major problems encountered in using the kNN rule is that all of the training samples are considered equally important in the assignment of the class label to the query pattern. In this paper, an evidential editing version of the kNN rule is developed within the framework of belief function theory. The proposal is composed of two procedures. An evidential editing procedure is first proposed to re-label the original training samples with new labels represented by an evidential membership structure, which provides a general representation model of the class membership of the training samples. After editing, a classification procedure specifically designed for evidentially edited training samples is developed in the belief function framework to handle the more general situation in which the edited training samples are assigned dependent evidential labels. Three synthetic datasets and six real datasets collected from various fields were used to evaluate the performance of the proposed method. The reported results show that the proposal achieves better performance than the other considered kNN-based methods, especially for datasets with high imprecision ratios.
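The abstract outlines a two-step pipeline: an evidential editing step that replaces each training label with a mass function (an evidential membership over the classes plus an ignorance term), and a classification step that pools the evidential labels of a query's nearest neighbors within belief function theory. The sketch below only illustrates that general idea and is not the paper's method: the neighbor-based mass construction, the discounting parameters `alpha` and `gamma`, the use of plain Dempster combination (the paper's classification rule is specifically designed for dependent evidential labels), and all function names are assumptions.

```python
import numpy as np

def dempster_combine(m1, m2):
    """Dempster's rule for masses restricted to singleton classes plus the
    full frame (last entry = ignorance)."""
    n = len(m1) - 1
    out, conflict = np.zeros(n + 1), 0.0
    for a in range(n):
        for b in range(n):
            if a == b:
                out[a] += m1[a] * m2[b]          # {a} ∩ {a} = {a}
            else:
                conflict += m1[a] * m2[b]        # {a} ∩ {b} = ∅
        out[a] += m1[a] * m2[n] + m1[n] * m2[a]  # {a} ∩ frame = {a}
    out[n] = m1[n] * m2[n]                        # frame ∩ frame = frame
    return out / (1.0 - conflict) if conflict < 1.0 else out

def neighbor_mass(dist, class_idx, n_classes, alpha=0.95, gamma=1.0):
    """Mass induced by one neighbor: support for its class discounted by
    distance, with the remaining mass assigned to the frame (ignorance)."""
    m = np.zeros(n_classes + 1)
    m[class_idx] = alpha * np.exp(-gamma * dist ** 2)
    m[n_classes] = 1.0 - m[class_idx]
    return m

def evidential_edit(X, y, k=5):
    """Editing step (sketch): give every training sample an evidential label
    by combining the evidence carried by its k nearest neighbors."""
    classes = np.unique(y)
    n, c = len(X), len(classes)
    labels = np.zeros((n, c + 1))
    for i in range(n):
        d = np.linalg.norm(X - X[i], axis=1)
        order = np.argsort(d)[1:k + 1]           # skip the sample itself
        m = np.zeros(c + 1); m[c] = 1.0          # start from the vacuous mass
        for j in order:
            cls = int(np.searchsorted(classes, y[j]))
            m = dempster_combine(m, neighbor_mass(d[j], cls, c))
        labels[i] = m
    return classes, labels

def classify(x, X, classes, labels, k=5):
    """Classification step (sketch): pool the evidential labels of the query's
    k nearest edited neighbors and pick the largest pooled singleton mass."""
    c = len(classes)
    d = np.linalg.norm(X - x, axis=1)
    m = np.zeros(c + 1); m[c] = 1.0
    for j in np.argsort(d)[:k]:
        w = np.exp(-d[j] ** 2)                   # discount by distance to the query
        mj = np.zeros(c + 1)
        mj[:c] = w * labels[j, :c]
        mj[c] = 1.0 - mj[:c].sum()
        m = dempster_combine(m, mj)
    return classes[int(np.argmax(m[:c]))]

# Toy usage (hypothetical data):
# classes, labels = evidential_edit(X_train, y_train, k=5)
# y_hat = classify(x_query, X_train, classes, labels, k=5)
```

In this toy form, `evidential_edit` runs once over the training set and `classify` is called per query; the actual editing and combination rules in the paper are more elaborate and account for the dependence between the edited labels.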
