Article

Non-parallel support vector classifiers with different loss functions

Journal

NEUROCOMPUTING
Volume 143, Issue -, Pages 294-301

Publisher

ELSEVIER
DOI: 10.1016/j.neucom.2014.05.063

Keywords

Non-parallel classifiers; Least squares loss; Pinball loss; Hinge loss; Kernel trick

Funding

  1. European Research Council under the European Union's Seventh Framework Programme (FP7) / ERC AdG A-DATADRIVE-B
  2. Research Council KUL [GOA/10/09 MaNet, CoE PFV/10/002, BIL12/11T]
  3. PhD/Postdoc grants Flemish Government
  4. FWO [G.0377.12, G.088114N]
  5. IWT [100031]
  6. Belgian Federal Science Policy Office [IUAP P7/19]

Abstract

This paper introduces a general framework of non-parallel support vector machines, which involves a regularization term, a scatter loss and a misclassification loss. For binary problems, the framework with suitable losses covers several existing non-parallel classifiers, such as the multisurface proximal support vector machine via generalized eigenvalues, twin support vector machines, and their least squares versions. The possibility of incorporating different existing scatter and misclassification loss functions into the general framework is discussed. Moreover, in contrast to the mentioned methods, which rely on kernel-generated surfaces, we apply the kernel trick directly in the dual and thereby obtain nonparametric models. Consequently, one does not need to formulate two different primal problems for the linear and nonlinear kernels, respectively. In addition, experimental results are given to illustrate the performance of different loss functions. (C) 2014 Elsevier B.V. All rights reserved.
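The three losses named in the keywords (least squares, hinge, pinball) can be sketched as simple functions of a margin variable. This is an illustrative sketch only: the function names and the convention that `u` denotes the margin `y * f(x)` are assumptions for exposition, not definitions taken from the paper.

```python
import numpy as np

def least_squares_loss(u):
    """Least squares (scatter) loss: squared deviation, penalizing
    points symmetrically on both sides."""
    return u ** 2

def hinge_loss(u):
    """Hinge (misclassification) loss: max(0, 1 - u), zero once the
    margin u = y * f(x) exceeds 1."""
    return np.maximum(0.0, 1.0 - u)

def pinball_loss(u, tau=0.5):
    """Pinball loss: an asymmetric generalization of the hinge loss.
    For r = 1 - u it charges r when r >= 0 and -tau * r otherwise,
    so correctly classified points with large margins are still
    (mildly) penalized, which relates to quantile behavior."""
    r = 1.0 - u
    return np.where(r >= 0.0, r, -tau * r)
```

With `tau = 0` the pinball loss reduces to the hinge loss, which is one way to see the three losses as instances of a single family of framework losses.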

