Article

Convergence rates for kernel regression in infinite-dimensional spaces

Publisher: SPRINGER HEIDELBERG
DOI: 10.1007/s10463-018-0697-2

Keywords

Adaptive estimate; Bias-variance decomposition; Gaussian process; Maximum likelihood regression; Mean square error; Optimal bandwidth; Small ball probability; t process

Abstract

We consider a nonparametric regression setup, where the covariate is a random element in a complete separable metric space, and the parameter of interest associated with the conditional distribution of the response lies in a separable Banach space. We derive the optimum convergence rate for the kernel estimate of the parameter in this setup. The small ball probability in the covariate space plays a critical role in determining the asymptotic variance of kernel estimates. Unlike the case of finite-dimensional covariates, we show that the asymptotic orders of the bias and the variance of the estimate achieving the optimum convergence rate may be different for infinite-dimensional covariates. Also, the bandwidth, which balances the bias and the variance, may lead to an estimate with suboptimal mean square error for infinite-dimensional covariates. We describe a data-driven adaptive choice of the bandwidth and derive the asymptotic behavior of the adaptive estimate.
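The bias-variance argument in the abstract follows the usual heuristic for kernel regression with function-valued covariates: with bandwidth h, the variance of the estimate at a point x scales like 1/(n * phi_x(h)), where phi_x(h) = P(d(X, x) <= h) is the small ball probability, while the bias is governed by the smoothness of the regression function (e.g., of order h^beta under a Hölder-type condition). The sketch below illustrates these ingredients with a Nadaraya-Watson estimator for curve-valued covariates. The L2 metric on discretized curves, the indicator kernel, the function names, and the leave-one-out bandwidth rule are all illustrative assumptions: a simple stand-in for the paper's adaptive bandwidth choice, not its actual procedure.

# Minimal sketch of a functional Nadaraya-Watson kernel estimator with a
# data-driven bandwidth chosen by leave-one-out cross-validation.
# Names (nw_estimate, loocv_bandwidth) and the L2 metric between discretized
# curves are illustrative assumptions, not the paper's method.
import numpy as np

def nw_estimate(x0, X, y, h):
    """Nadaraya-Watson estimate at a covariate x0.

    X : (n, p) array of n covariates, each a curve discretized on p grid points
    y : (n,) array of responses
    h : bandwidth
    """
    # Metric on the covariate space: discrete L2 (root mean square) distance.
    d = np.sqrt(np.mean((X - x0) ** 2, axis=1))
    # Indicator kernel: the expected fraction of points in the ball is exactly
    # the small ball probability P(d(X, x0) <= h), which drives the variance.
    w = (d <= h).astype(float)
    if w.sum() == 0.0:
        return np.nan  # empty ball: no observation within distance h
    return np.dot(w, y) / w.sum()

def loocv_bandwidth(X, y, grid):
    """Pick h from `grid` by leave-one-out cross-validation (one simple
    data-driven rule; the paper's adaptive choice is more refined)."""
    n = len(y)
    best_h, best_err = grid[0], np.inf
    for h in grid:
        preds = np.array([
            nw_estimate(X[i], np.delete(X, i, axis=0), np.delete(y, i), h)
            for i in range(n)
        ])
        ok = ~np.isnan(preds)
        if ok.sum() == 0:
            continue  # bandwidth too small everywhere: all balls empty
        err = np.mean((y[ok] - preds[ok]) ** 2)
        if err < best_err:
            best_h, best_err = h, err
    return best_h

# Toy usage: random curves X_i on a grid of p points; the response depends
# on the curve's mean level.
rng = np.random.default_rng(0)
n, p = 200, 50
X = rng.normal(size=(n, 1)) + 0.3 * rng.normal(size=(n, p))
y = np.mean(X, axis=1) ** 2 + 0.1 * rng.normal(size=n)
h = loocv_bandwidth(X, y, grid=np.linspace(0.2, 2.0, 10))
print("selected bandwidth:", h)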
