Article

A decision cognizant Kullback-Leibler divergence

Journal

PATTERN RECOGNITION
Volume 61, Pages 470-478

Publisher

ELSEVIER SCI LTD
DOI: 10.1016/j.patcog.2016.08.018

Keywords

Kullback-Leibler divergence; Divergence clutter; Classifier incongruence

Funding

  1. EPSRC [EP/K014307/1, EP/K014307/2]
  2. FACER2VM [EP/N007743/1]
  3. FAPESP [2015/24652-2, 2015/13504-2]

Abstract

In decision-making systems involving multiple classifiers there is a need to assess classifier (in)congruence, that is, to gauge the degree of agreement between their outputs. A commonly used measure for this purpose is the Kullback-Leibler (KL) divergence. We propose a variant of the KL divergence, named decision cognizant Kullback-Leibler divergence (DC-KL), to reduce the contribution of the minority classes, which obscure the true degree of classifier incongruence. We investigate the properties of the novel divergence measure analytically and by simulation studies. The proposed measure is demonstrated to be more robust to minority class clutter. Its sensitivity to estimation noise is also shown to be considerably lower than that of the classical KL divergence. These properties render the DC-KL divergence a much better statistic for discriminating between classifier congruence and incongruence in pattern recognition systems. (C) 2016 Elsevier Ltd. All rights reserved.
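The abstract states what DC-KL achieves but not how it is constructed. The sketch below is a hedged illustration, not the paper's definition: it assumes the decision-cognizant variant keeps the two classifiers' decision classes (the argmax of each output distribution) and pools all remaining minority classes into one compound class before applying the classical KL formula. The function names kl_divergence and dc_kl_divergence and the example distributions are hypothetical.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Classical KL divergence D(p || q) for discrete probability vectors."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

def dc_kl_divergence(p, q, eps=1e-12):
    """Hypothetical decision-cognizant KL: retain each classifier's
    decision class and pool every other (minority) class into a single
    compound class, then evaluate the classical KL on the reduced vectors."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    decisions = sorted({int(np.argmax(p)), int(np.argmax(q))})
    others = [i for i in range(len(p)) if i not in decisions]
    p_red = np.append(p[decisions], p[others].sum())  # pooled minority mass
    q_red = np.append(q[decisions], q[others].sum())
    return kl_divergence(p_red, q_red, eps)

# Two classifiers that agree on the dominant class but distribute
# probability mass differently over the minority classes:
p = [0.70, 0.10, 0.05, 0.05, 0.05, 0.05]
q = [0.68, 0.02, 0.10, 0.08, 0.06, 0.06]
print(f"KL    = {kl_divergence(p, q):.4f}")    # ~0.1048, inflated by clutter
print(f"DC-KL = {dc_kl_divergence(p, q):.4f}")  # ~0.0009, near zero
```

In this example the two outputs are congruent at the decision level, yet the classical KL rises to about 0.10 purely from minority-class disagreement, while the pooled variant stays near zero. This is the behaviour the abstract attributes to DC-KL, namely robustness to minority class clutter.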
