Article

An approximation of the Gaussian RBF kernel for efficient classification with SVMs

Journal

PATTERN RECOGNITION LETTERS
Volume 84, Issue -, Pages 107-113

Publisher

ELSEVIER
DOI: 10.1016/j.patrec.2016.08.013

Keywords

Support vector machines; Kernel methods; Feature map; Approximation

Funding

  1. Deutsche Telekom Stiftung

Abstract

In theory, kernel support vector machines (SVMs) can be reformulated as linear SVMs. This reformulation can speed up SVM classification considerably, in particular when the number of support vectors is high. For the widely used Gaussian radial basis function (RBF) kernel, however, this reformulation is impracticable because the reproducing kernel Hilbert space (RKHS) of this kernel is infinite-dimensional. Therefore, we derive a finite-dimensional approximative feature map, based on an orthonormal basis of the kernel's RKHS, to enable the reformulation of Gaussian RBF SVMs as linear SVMs. We show that the error of this approximative feature map decreases factorially as the approximation quality is increased linearly. Experimental evaluations demonstrate that the approximative feature map achieves considerable speed-ups (about 18-fold on average), mostly without loss of classification accuracy. The proposed approximative feature map therefore provides an efficient SVM evaluation method with minimal loss of precision. (C) 2016 Elsevier B.V. All rights reserved.
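The abstract does not spell out the authors' construction, but the idea of replacing the infinite-dimensional RKHS with a truncated, explicit feature map, and the factorial decay of the truncation error, can be illustrated with a minimal 1-D sketch based on the Taylor expansion of exp(2*gamma*x*z). The function name `rbf_feature_map_1d` and the truncation parameter `degree` below are illustrative assumptions, not the paper's actual method (which uses an orthonormal RKHS basis and handles multivariate inputs).

```python
import numpy as np
from math import factorial

def rbf_feature_map_1d(x, gamma, degree):
    """Truncated explicit feature map phi(x) for the 1-D Gaussian RBF kernel
    k(x, z) = exp(-gamma * (x - z)**2).

    Sketch only: it uses the Taylor expansion
        k(x, z) = exp(-gamma*x**2) * exp(-gamma*z**2) * sum_n (2*gamma*x*z)**n / n!
    so the n-th coordinate is
        phi_n(x) = exp(-gamma*x**2) * (2*gamma)**(n/2) * x**n / sqrt(n!).
    """
    x = np.asarray(x, dtype=float)            # input points, shape (m,)
    n = np.arange(degree + 1)                 # expansion orders 0..degree
    coeff = (2.0 * gamma) ** (n / 2.0) / np.sqrt(
        np.array([factorial(k) for k in n], dtype=float)
    )
    # Each row is the (degree + 1)-dimensional feature vector of one input point.
    return np.exp(-gamma * x[:, None] ** 2) * coeff * x[:, None] ** n

# The inner product of the truncated maps converges to the exact kernel value;
# for fixed inputs the remainder shrinks roughly like 1/(degree + 1)!, consistent
# with the factorial error decay stated in the abstract.
x, z, gamma = 0.7, -0.3, 1.5
exact = np.exp(-gamma * (x - z) ** 2)
for degree in (2, 4, 8):
    phi_x = rbf_feature_map_1d([x], gamma, degree)
    phi_z = rbf_feature_map_1d([z], gamma, degree)
    print(degree, exact, (phi_x @ phi_z.T).item())
```

With such an explicit map, a kernel SVM's decision function can be collapsed into a single weight vector in the approximate feature space, which is the source of the speed-up described above; the multivariate case requires multi-index monomials and grows the feature dimension accordingly.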
