Article

Improved Bounds for the Nystrom Method With Application to Kernel Classification

Journal

IEEE TRANSACTIONS ON INFORMATION THEORY
Volume 59, Issue 10, Pages 6939-6949

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TIT.2013.2271378

Keywords

Approximation error; concentration inequality; kernel methods; Nystrom method; random matrix theory

Funding

  1. Office of Navy Research (ONR) [N00014-09-1-0663, N00014-12-10431]
  2. National Fundamental Research Program of China [2010CB327903]
  3. National Science Foundation of China [61073097, 61021062]

Abstract

We develop two approaches for analyzing the approximation error bound for the Nystrom method, which approximates a positive semidefinite (PSD) matrix by sampling a small set of columns: one based on a concentration inequality for integral operators, and one based on random matrix theory. We show that the approximation error, measured in the spectral norm, can be improved from O(N/√m) to O(N/m^{1-ρ}) in the case of a large eigengap, where N is the total number of data points, m is the number of sampled data points, and ρ ∈ (0, 1/2) is a positive constant that characterizes the eigengap. When the eigenvalues of the kernel matrix follow a p-power law, our analysis based on random matrix theory further improves the bound to O(N/m^{p-1}) under an incoherence assumption. We present a kernel classification approach based on the Nystrom method and derive its generalization performance using the improved bound. We show that when the eigenvalues of the kernel matrix follow a p-power law, we can reduce the number of support vectors to N^{2p/(p²-1)}, which is sublinear in N when p > 1 + √2, without seriously sacrificing the generalization performance.
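The standard Nystrom approximation analyzed in the abstract can be sketched as follows: sample m columns of the N x N PSD kernel matrix K, form the column block C and the m x m intersection block W, and reconstruct K_hat = C W⁺ Cᵀ. This is a minimal illustrative sketch with uniform sampling on hypothetical toy data, not the paper's specific sampling scheme or bound construction.

```python
import numpy as np

def nystrom_approx(K, m, rng=None):
    """Nystrom approximation of an N x N PSD matrix K from m sampled columns.

    Returns K_hat = C @ pinv(W) @ C.T, where C holds the sampled columns
    and W is the m x m block of K at the sampled row/column indices.
    """
    rng = np.random.default_rng(rng)
    N = K.shape[0]
    idx = rng.choice(N, size=m, replace=False)  # uniform column sampling
    C = K[:, idx]                               # N x m sampled columns
    W = K[np.ix_(idx, idx)]                     # m x m intersection block
    return C @ np.linalg.pinv(W) @ C.T

# Toy demo: Gaussian (RBF) kernel on random data (hypothetical, for illustration)
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-0.5 * sq)                           # PSD kernel matrix, N = 200
K_hat = nystrom_approx(K, m=50, rng=1)
err = np.linalg.norm(K - K_hat, 2)              # spectral-norm approximation error
```

The spectral-norm error `err` is the quantity whose decay in m (e.g., O(N/√m) versus O(N/m^{1-ρ})) the paper's bounds characterize; RBF kernels typically have fast-decaying eigenvalues, which is the favorable regime described above.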
