Journal
BERNOULLI
Volume 16, Issue 1, Pages 181-207
Publisher
INT STATISTICAL INST
DOI: 10.3150/09-BEJ206
Keywords
classification; feature selection; manifold learning; regression; shrinkage estimator; Tikhonov regularization
Funding
- Research Grants Council of Hong Kong [CityU 104007]
- National Science Fund for Distinguished Young Scholars, China [10529101]
- NSF [DMS-07-32260]
- NIH [R01 CA12317501A1, P50 GM 081883]
Abstract
A common belief in high-dimensional data analysis is that data are concentrated on a low-dimensional manifold. This motivates simultaneous dimension reduction and regression on manifolds. We provide an algorithm for learning gradients on manifolds for dimension reduction of high-dimensional data with few observations. We obtain generalization error bounds for the gradient estimates and show that the convergence rate depends on the intrinsic dimension of the manifold and not on the dimension of the ambient space. We illustrate the efficacy of this approach empirically on simulated and real data and compare the method to other dimension reduction procedures.
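The paper's method estimates gradients of the regression function and uses them to find dimension-reduction directions. As a rough illustration of the general idea (not the authors' algorithm, which learns gradients in a reproducing kernel Hilbert space with Tikhonov regularization), the sketch below estimates gradients by regularized local linear fits and takes the top eigenvectors of the averaged gradient outer-product matrix as the reduced directions; all function and parameter names here are hypothetical:

```python
import numpy as np

def gradient_dimension_reduction(X, y, d, k=10, reg=1e-3):
    """Illustrative sketch, not the paper's RKHS gradient-learning algorithm:
    estimate the gradient of the regression function at each sample by a
    Tikhonov-regularized local linear fit over k nearest neighbours, then
    return the top-d eigenvectors of the averaged gradient outer-product
    matrix as estimated dimension-reduction directions."""
    n, p = X.shape
    G = np.zeros((p, p))  # accumulated gradient outer products
    for i in range(n):
        # k nearest neighbours of X[i], excluding the point itself
        dists = np.linalg.norm(X - X[i], axis=1)
        nbrs = np.argsort(dists)[1:k + 1]
        dX = X[nbrs] - X[i]   # local coordinate differences
        dy = y[nbrs] - y[i]   # local response differences
        # regularized local linear fit: solve (dX'dX + reg*I) g = dX'dy
        g = np.linalg.solve(dX.T @ dX + reg * np.eye(p), dX.T @ dy)
        G += np.outer(g, g)
    # eigenvectors of G/n with the largest eigenvalues span the
    # estimated effective dimension-reduction subspace
    vals, vecs = np.linalg.eigh(G / n)
    return vecs[:, ::-1][:, :d]
```

For a regression function that depends on only one coordinate, the leading estimated direction should align with that coordinate axis, regardless of the ambient dimension.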