4.7 Article

Learning With Auxiliary Less-Noisy Labels

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TNNLS.2016.2546956

Keywords

Maximum likelihood approach; noisy degrees; noisy labels; soft constraints

Funding

  1. NSFC [61379098]

Abstract

Obtaining a sufficient number of accurate labels to form a training set for learning a classifier can be difficult because access to reliable label sources is limited. Instead, in real-world applications, less-accurate labels, such as labels from nonexpert labelers, are often used. However, learning with less-accurate labels can lead to serious performance deterioration because of the high noise rate. Although several learning methods (e.g., noise-tolerant classifiers) have been proposed to improve classification performance in the presence of label noise, only a few of them take the noise rate into account and exploit both noisy but easily accessible labels and less-noisy labels, a small amount of which can be obtained at an acceptable extra cost in time and expense. In this brief, we propose a learning method in which not only noisy labels but also auxiliary less-noisy labels, available for a small portion of the training data, are taken into account. Based on a flipping-probability noise model and a logistic regression classifier, the method estimates the noise rate parameters, infers the ground-truth labels, and learns the classifier simultaneously in a maximum likelihood manner. The proposed method yields three learning algorithms, corresponding to three states of prior knowledge about the less-noisy labels. Experiments show that the proposed method is tolerant to label noise and outperforms classifiers that do not explicitly consider the auxiliary less-noisy labels.
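The abstract describes joint maximum-likelihood estimation of a logistic regression classifier, flipping-probability noise rates, and the hidden ground-truth labels. The sketch below illustrates one way such a procedure could look as an EM-style loop for the binary case, in the setting where the flip rates of both label sources are unknown (one of the three prior-knowledge states mentioned above). All names (fit, rho1, tau1, etc.), the initialization values, and the update scheme are illustrative assumptions for exposition, not the authors' implementation.

```python
# Minimal EM-style sketch: learn a logistic regression classifier from noisy
# labels on all points plus auxiliary less-noisy labels on a small subset,
# under a label-flipping noise model (illustrative, not the paper's code).
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-np.clip(z, -30, 30)))

def flip_likelihood(obs, y, rho_pos, rho_neg):
    """P(observed label | true label y) with flip rates:
    rho_pos = P(flip | y=1), rho_neg = P(flip | y=0)."""
    if y == 1:
        return np.where(obs == 1, 1 - rho_pos, rho_pos)
    return np.where(obs == 0, 1 - rho_neg, rho_neg)

def fit(X, y_noisy, y_aux, aux_mask, n_iter=50, lr=0.1, lam=1e-3):
    """X: (n, d) features; y_noisy: noisy labels for all n points;
    y_aux: auxiliary less-noisy labels, valid only where aux_mask is True."""
    n, d = X.shape
    w = np.zeros(d)
    rho1, rho0 = 0.2, 0.2      # assumed initial flip rates of the noisy labels
    tau1, tau0 = 0.05, 0.05    # assumed initial flip rates of the less-noisy labels
    for _ in range(n_iter):
        # E-step: posterior of the true label given x and all observed labels
        p1 = sigmoid(X @ w)
        like1 = p1 * flip_likelihood(y_noisy, 1, rho1, rho0)
        like0 = (1 - p1) * flip_likelihood(y_noisy, 0, rho1, rho0)
        like1[aux_mask] *= flip_likelihood(y_aux[aux_mask], 1, tau1, tau0)
        like0[aux_mask] *= flip_likelihood(y_aux[aux_mask], 0, tau1, tau0)
        q = like1 / (like1 + like0 + 1e-12)        # P(y = 1 | everything)
        # M-step (classifier): gradient step on the expected log-likelihood
        grad = X.T @ (q - p1) / n - lam * w
        w += lr * grad
        # M-step (noise rates): expected fraction of flipped labels
        rho1 = np.sum(q * (y_noisy == 0)) / (np.sum(q) + 1e-12)
        rho0 = np.sum((1 - q) * (y_noisy == 1)) / (np.sum(1 - q) + 1e-12)
        qa, ya = q[aux_mask], y_aux[aux_mask]
        tau1 = np.sum(qa * (ya == 0)) / (np.sum(qa) + 1e-12)
        tau0 = np.sum((1 - qa) * (ya == 1)) / (np.sum(1 - qa) + 1e-12)
    return w, (rho1, rho0), (tau1, tau0)

if __name__ == "__main__":
    # Synthetic demo: 30% flips on all labels, 5% flips on a 10% auxiliary subset.
    rng = np.random.default_rng(0)
    n, d = 1000, 5
    X = rng.normal(size=(n, d))
    y_true = (X @ rng.normal(size=d) > 0).astype(int)
    y_noisy = np.where(rng.random(n) < 0.30, 1 - y_true, y_true)
    aux_mask = rng.random(n) < 0.10
    y_aux = np.where(rng.random(n) < 0.05, 1 - y_true, y_true)
    w_hat, noisy_rates, aux_rates = fit(X, y_noisy, y_aux, aux_mask)
    print("estimated flip rates (noisy, auxiliary):", noisy_rates, aux_rates)
```

In this sketch the E-step combines the classifier's prediction with likelihood terms from both label sources; where auxiliary less-noisy labels are available, their smaller flip rates give them more weight in the posterior, which in turn anchors the estimates of the noisy-label flip rates.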
