Review

A review of Nyström methods for large-scale machine learning

Journal

INFORMATION FUSION
Volume 26, Pages 36-48

Publisher

ELSEVIER
DOI: 10.1016/j.inffus.2015.03.001

Keywords

Low-rank approximation; Nyström method; Sampling method; Machine learning

Funding

  1. National Natural Science Foundation of China [61370175]
  2. Shanghai Knowledge Service Platform Project [ZF1213]

Abstract

Generating a low-rank matrix approximation is very important in large-scale machine learning applications. The standard Nyström method is one of the state-of-the-art techniques for generating such an approximation, and it has developed rapidly since it was first applied to Gaussian process regression. Several enhanced Nyström methods, such as ensemble Nyström, modified Nyström and SS-Nyström, have been proposed, and many sampling methods have also been developed. In this paper, we review the Nyström methods for large-scale machine learning. First, we introduce the various Nyström methods. Second, we review different sampling methods for the Nyström methods and summarize them from the perspectives of both theoretical analysis and practical performance. Then, we list several typical machine learning applications that utilize Nyström methods. Finally, we draw our conclusions after discussing some open machine learning problems related to Nyström methods. (C) 2015 Elsevier B.V. All rights reserved.
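
To make the technique concrete, below is a minimal sketch of the standard Nyström approximation as it is commonly described in the literature (this is not code from the paper): sample m landmark columns of an n x n positive semidefinite kernel matrix K, form the column block C and the intersection block W, and reconstruct K as C * pinv(W) * C^T. The function name nystrom_approximation, the NumPy implementation, and the uniform sampling scheme are illustrative assumptions; the review surveys many alternative sampling strategies.

    import numpy as np

    def nystrom_approximation(K, m, seed=None):
        # Standard Nystrom low-rank approximation of an n x n PSD kernel matrix K.
        # Illustrative sketch: uniform column sampling is only one of the sampling
        # schemes discussed in the review.
        rng = np.random.default_rng(seed)
        n = K.shape[0]
        idx = rng.choice(n, size=m, replace=False)  # m landmark indices, sampled uniformly
        C = K[:, idx]                               # n x m block of sampled columns
        W = C[idx, :]                               # m x m intersection block
        return C @ np.linalg.pinv(W) @ C.T          # rank-at-most-m approximation of K

    # Usage (hypothetical data): approximate an RBF kernel matrix with 50 landmarks.
    X = np.random.default_rng(0).standard_normal((500, 10))
    sq = np.sum(X ** 2, axis=1)
    K = np.exp(-(sq[:, None] + sq[None, :] - 2.0 * X @ X.T))  # RBF kernel, gamma = 1
    K_hat = nystrom_approximation(K, m=50, seed=0)
    print(np.linalg.norm(K - K_hat, "fro") / np.linalg.norm(K, "fro"))  # relative error

The practical appeal is that only m columns of K need to be evaluated and only an m x m matrix is inverted, rather than working with the full n x n kernel matrix.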
