4.4 Article

Learning gradients on manifolds

Journal

Bernoulli
Volume 16, Issue 1, Pages 181-207

Publisher

International Statistical Institute
DOI: 10.3150/09-BEJ206

Keywords

classification; feature selection; manifold learning; regression; shrinkage estimator; Tikhonov regularization

Funding

  1. Research Grants Council of Hong Kong [CityU 104007]
  2. National Science Fund for Distinguished Young Scholars, China [10529101]
  3. NSF [DMS-07-32260]
  4. NIH [R01 CA12317501A1, P50 GM 081883]

Abstract

A common belief in high-dimensional data analysis is that data are concentrated on a low-dimensional manifold. This motivates simultaneous dimension reduction and regression on manifolds. We provide an algorithm for learning gradients on manifolds for dimension reduction of high-dimensional data with few observations. We obtain generalization error bounds for the gradient estimates and show that the convergence rate depends on the intrinsic dimension of the manifold and not on the dimension of the ambient space. We illustrate the efficacy of this approach empirically on simulated and real data and compare the method to other dimension reduction procedures.
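To make the idea concrete, below is a minimal Python sketch of gradient-based dimension reduction. It is not the authors' estimator: the paper learns a vector-valued gradient in a reproducing kernel Hilbert space with Tikhonov regularization, whereas this sketch substitutes Gaussian-weighted ridge regression at each sample point and then takes the top eigenvectors of the averaged gradient outer product matrix as candidate predictive directions. The function name gradient_outer_product_directions and the bandwidth and lam parameters are illustrative assumptions, not quantities from the paper.

import numpy as np

def gradient_outer_product_directions(X, y, d=2, bandwidth=1.0, lam=1e-2):
    # Illustrative stand-in for gradient learning: at each sample, approximate the
    # gradient of the regression function by Gaussian-weighted, Tikhonov-regularized
    # (ridge) regression on local displacements, then take the top eigenvectors of
    # the averaged gradient outer product matrix as dimension-reduction directions.
    n, p = X.shape
    G = np.zeros((p, p))                                         # gradient outer product accumulator
    for i in range(n):
        diff = X - X[i]                                          # displacements x_j - x_i, shape (n, p)
        w = np.exp(-np.sum(diff**2, axis=1) / (2 * bandwidth**2))  # Gaussian locality weights
        dy = y - y[i]                                            # response differences y_j - y_i
        A = diff.T @ (w[:, None] * diff) + lam * np.eye(p)       # regularized weighted normal equations
        b = diff.T @ (w * dy)
        g = np.linalg.solve(A, b)                                # local gradient estimate at x_i
        G += np.outer(g, g) / n
    eigvals, eigvecs = np.linalg.eigh(G)                         # eigenvalues in ascending order
    return eigvecs[:, ::-1][:, :d], eigvals[::-1]

# Toy check: the response depends on only one direction of a 10-dimensional input.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
beta = np.zeros(10)
beta[0] = 1.0
y = np.sin(X @ beta) + 0.1 * rng.normal(size=200)
directions, spectrum = gradient_outer_product_directions(X, y, d=1)
print(np.round(directions[:, 0], 2))                             # should load mainly on the first coordinate

In this toy run the recovered direction should concentrate on the first coordinate, mirroring how the eigenstructure of the gradient outer product exposes the few directions along which the regression function actually varies.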
