Journal
IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS
Volume 33, Issue 11, Pages 6775-6788
Publisher
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TNNLS.2021.3083397
Keywords
Phase locked loops; Training; Fasteners; Faces; Computational modeling; Predictive models; Prediction algorithms; Classification; multiclass; partial label; top-k error
To deal with ambiguities in partial label learning (PLL), existing PLL methods implement disambiguation, either by identifying the ground-truth label or by averaging the candidate labels. However, these methods can be easily misled by the false-positive labels in the candidate label set. We find that these ambiguities often originate from noise caused by highly correlated or overlapping candidate labels, which makes it difficult to identify the ground-truth label on the first attempt. To give the trained models more tolerance, we first propose the top-k partial loss and the convex top-k partial hinge loss. Based on these losses, we present a novel top-k partial label machine (TPLM) for partial label classification. An efficient optimization algorithm is proposed based on accelerated proximal stochastic dual coordinate ascent (Prox-SDCA) and linear programming (LP). Moreover, we present a theoretical analysis of the generalization error for TPLM. Comprehensive experiments on both controlled UCI datasets and real-world partial label datasets demonstrate that the proposed method is superior to state-of-the-art approaches.
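The abstract's core idea, that a prediction should count as correct when at least one candidate label lands among the k highest-scoring classes, can be sketched in a few lines. This is only an illustrative sketch, not the paper's exact formulation: the function names `topk_partial_loss` and `topk_partial_hinge` are hypothetical, and the hinge-style surrogate below is a simplified margin variant rather than the convex top-k partial hinge loss defined in the paper.

```python
import numpy as np

def topk_partial_loss(scores, candidates, k=2):
    """Top-k partial 0-1 loss (illustrative sketch): the prediction is
    counted as correct when at least one candidate label appears among
    the k highest-scoring classes."""
    topk = np.argsort(scores)[::-1][:k]  # indices of the k largest scores
    return 0.0 if set(topk) & set(candidates) else 1.0

def topk_partial_hinge(scores, candidates, k=2):
    """Hinge-style surrogate (hypothetical, simplified): penalize the margin
    between the best candidate score and the k-th largest rival score."""
    best_cand = max(scores[c] for c in candidates)
    rivals = np.sort(np.delete(scores, candidates))[::-1]
    rival = rivals[min(k, len(rivals)) - 1]  # k-th largest non-candidate score
    return max(0.0, 1.0 + rival - best_cand)
```

For example, with scores `[0.1, 0.5, 0.3, 0.9]` and candidate set `{1, 2}`, the top-2 classes are 3 and 1, so the top-2 partial 0-1 loss is 0, while the top-1 loss is 1 (only class 3 is predicted). This illustrates the extra tolerance the top-k criterion grants when candidate labels are noisy.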