Article

Generalized Pinball Loss SVMs

Journal

NEUROCOMPUTING
Volume 322, Issue -, Pages 151-165

Publisher

ELSEVIER SCIENCE BV
DOI: 10.1016/j.neucom.2018.08.079

Keywords

Support Vector Machine; Pinball loss; epsilon insensitive zone; VC dimension; Sparsity

Abstract

The Support Vector Machine (SVM) is a well-known and efficient classification technique that appropriately trades off between the training error and the generalization ability of the classifier. However, SVMs are known to be sensitive to noise and unstable with respect to re-sampling. Recently, Huang et al. (2014) proposed a novel SVM formulation with the pinball loss function (Pin-SVM) to handle noise sensitivity and instability under re-sampling. The Pin-SVM model in its pure form loses sparsity, and therefore Huang et al. (2014) proposed a new pinball SVM model termed the ε-insensitive zone Pin-SVM. The ε-insensitive zone Pin-SVM formulation requires the value of ε to be pre-specified. Taking motivation from these developments, we propose a modified (ε1, ε2)-insensitive zone Pin-SVM ((ε1, ε2)-Mod-Pin-SVM) model in which the asymmetric spread of the insensitive zone is optimized and is therefore data-driven. An interesting feature of our proposed (ε1, ε2)-Mod-Pin-SVM model is that it appropriately trades off the generalization ability, the sparsity, and the noise insensitivity. The (ε1, ε2)-Mod-Pin-SVM model is very general and subsumes a number of popular SVM-type models. Its relation with the Minimal Complexity Machine (MCM) introduced by Jayadeva (2015) is interesting and novel. Experimental results on several benchmark datasets demonstrate the effectiveness of our proposed (ε1, ε2)-Mod-Pin-SVM formulation compared with other state-of-the-art classification algorithms. (C) 2018 Elsevier B.V. All rights reserved.
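For illustration, below is a minimal NumPy sketch of an asymmetric (ε1, ε2)-insensitive pinball loss of the kind the abstract describes: zero inside the zone [-ε2, ε1], unit slope above ε1, and slope τ below -ε2. The function name, the argument u (e.g., a margin residual), and the exact piecewise form are assumptions made for illustration, not the paper's definitive formulation; in the proposed (ε1, ε2)-Mod-Pin-SVM the zone widths ε1 and ε2 are optimized rather than fixed.

```python
import numpy as np

def asym_pinball_loss(u, tau=0.5, eps1=0.1, eps2=0.1):
    """Illustrative asymmetric epsilon-insensitive pinball loss (not the
    paper's exact formulation): zero on [-eps2, eps1], slope 1 above eps1,
    slope tau below -eps2."""
    u = np.asarray(u, dtype=float)
    loss = np.zeros_like(u)
    upper = u > eps1              # residuals beyond the upper edge of the zone
    lower = u < -eps2             # residuals beyond the lower edge, scaled by tau
    loss[upper] = u[upper] - eps1
    loss[lower] = tau * (-u[lower] - eps2)
    return loss

# Example: residuals inside the zone incur no loss, the others a linear penalty.
print(asym_pinball_loss([-0.5, 0.0, 0.05, 0.8], tau=0.5, eps1=0.1, eps2=0.2))
# -> [0.15 0.   0.   0.7 ]
```

Setting eps1 = eps2 = 0 recovers an ordinary pinball-style loss, while tau = 0 with eps1 = eps2 gives the usual ε-insensitive behavior, which is consistent with the claim that the model subsumes several popular SVM-type formulations.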
