4.6 Article

Combining example selection with instance selection to speed up multiple-instance learning

Journal

NEUROCOMPUTING
Volume 129, Issue -, Pages 504-515

Publisher

ELSEVIER
DOI: 10.1016/j.neucom.2013.09.008

Keywords

Multiple-instance learning; Artificial immune systems; Example selection; Instance selection; Support vector machines

Abstract

Recently, several instance selection-based methods have been presented to solve the multiple-instance learning (MIL) problem. The basic idea is to convert MIL into standard supervised learning by selecting representative instance prototypes from the training set. However, in MIL the training examples are not single instances but bags composed of one or more instances, so the computational complexity is often very high. Previous methods consider this issue only from the perspective of instance selection, not from that of example selection. In this paper, we address this issue by combining example selection with instance selection. Three general example selection methods are derived by adapting three immune-inspired algorithms to MIL. Additionally, we propose a simple instance selection method for MIL based on the probability that an instance is positive given a set of negative instances. Our example selection methods are combined with the new MIL method, as well as with previous instance selection-based ones, as a preprocessing step. The theoretical analysis and empirical results show that our MIL method is competitive with the state of the art and that the proposed example selection methods can significantly speed up various instance selection-based MIL methods while only slightly weakening their performance, or even strengthening it. (C) 2013 Elsevier B.V. All rights reserved.
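
The sketch below is a rough, hypothetical illustration of the instance selection idea summarized in the abstract, not the authors' exact algorithm: each instance in a positive bag is scored by a Gaussian kernel density estimate over the pooled negative instances, the least "negative-looking" instance per positive bag is kept as a prototype, every bag is embedded by its maximum similarity to the selected prototypes, and a standard SVM is trained on the embedded bags. The bandwidth sigma, the synthetic toy bags, and all function names are assumptions made for illustration.

# Hypothetical sketch of instance selection + bag embedding for MIL; the kernel
# density scoring, bandwidth, and toy data are assumptions, not the paper's method.
import numpy as np
from sklearn.svm import SVC

def negative_density(x, negatives, sigma=1.0):
    # Gaussian kernel density of instance x under the pooled negative instances.
    d2 = np.sum((negatives - x) ** 2, axis=1)
    return np.mean(np.exp(-d2 / (2.0 * sigma ** 2)))

def select_prototypes(positive_bags, negative_bags, sigma=1.0):
    # From each positive bag, keep the instance least explained by the negatives,
    # i.e. the one most likely to be a truly positive instance.
    negatives = np.vstack(negative_bags)
    return np.array([bag[int(np.argmin([negative_density(x, negatives, sigma)
                                        for x in bag]))]
                     for bag in positive_bags])

def embed_bag(bag, prototypes, sigma=1.0):
    # Map a bag to a fixed-length vector: max similarity to each prototype.
    return np.array([np.exp(-np.sum((bag - p) ** 2, axis=1)
                            / (2.0 * sigma ** 2)).max()
                     for p in prototypes])

# Toy usage with synthetic 2-D bags (1 = positive bag, 0 = negative bag).
rng = np.random.default_rng(0)
pos_bags = [rng.normal([3.0, 3.0], 1.0, (5, 2)) for _ in range(10)]
neg_bags = [rng.normal([0.0, 0.0], 1.0, (5, 2)) for _ in range(10)]
prototypes = select_prototypes(pos_bags, neg_bags)
X = np.array([embed_bag(b, prototypes) for b in pos_bags + neg_bags])
y = np.array([1] * len(pos_bags) + [0] * len(neg_bags))
clf = SVC(kernel="rbf").fit(X, y)  # standard supervised learner on embedded bags
print("training accuracy:", clf.score(X, y))

Example selection, as described in the abstract, would additionally prune the set of training bags before this step; that component is not sketched here because the immune-inspired selection procedures are not detailed in this excerpt.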

