Article
Computer Science, Interdisciplinary Applications
J-L Akian, L. Bonnet, H. Owhadi, E. Savin
Summary: This paper introduces algorithms for selecting or designing kernels in Gaussian process regression and kriging surrogate modeling. It presents two classes of algorithms: Kernel Flows, which select the best kernel by minimizing the loss of accuracy incurred when a portion of the dataset is removed, and spectral kernel ridge regression, which selects the best kernel by minimizing the norm of the function to be approximated in the associated RKHS. The effectiveness of both approaches is demonstrated through numerical examples.
JOURNAL OF COMPUTATIONAL PHYSICS
(2022)
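The Kernel Flows criterion summarized above can be sketched in a few lines. This is a minimal illustration under stated assumptions (an RBF kernel, a single random half-split, and a small jitter term), not the authors' implementation:

```python
import numpy as np

def rbf_kernel(X, Y, lengthscale=1.0):
    # Gaussian (RBF) kernel matrix between point sets X and Y
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * lengthscale**2))

def kernel_flow_rho(X, y, lengthscale, rng):
    """One-sample Kernel Flow loss: the relative RKHS energy lost when the
    interpolant is rebuilt from a random half of the data,
    rho = 1 - ||u_half||^2 / ||u_full||^2, with ||u||^2 = y^T K^{-1} y."""
    n = len(y)
    idx = rng.choice(n, size=n // 2, replace=False)
    K_full = rbf_kernel(X, X, lengthscale) + 1e-8 * np.eye(n)
    K_half = rbf_kernel(X[idx], X[idx], lengthscale) + 1e-8 * np.eye(n // 2)
    e_full = y @ np.linalg.solve(K_full, y)
    e_half = y[idx] @ np.linalg.solve(K_half, y[idx])
    return 1.0 - e_half / e_full

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(40, 1))       # illustrative synthetic data
y = np.sin(3 * X[:, 0])
rho = kernel_flow_rho(X, y, lengthscale=0.5, rng=rng)  # in [0, 1]; smaller is better
```

In practice the loss is averaged over many random splits and minimized over the kernel hyperparameters (here, `lengthscale`).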
Article
Automation & Control Systems
Peter Koepernik, Florian Pfaff
Summary: This paper investigates the formal consistency of Gaussian process (GP) regression on general metric spaces, demonstrating that the variance of the posterior GP converges to zero almost surely and the posterior mean converges pointwise in L2 to the unknown function.
JOURNAL OF MACHINE LEARNING RESEARCH
(2021)
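The consistency statement above, posterior variance shrinking to zero as observations accumulate, is easy to observe numerically. A minimal sketch assuming a zero-mean GP with an RBF kernel on [0, 1] and a small jitter term (illustrative, not the paper's metric-space setting):

```python
import numpy as np

def gp_posterior_variance(X_train, x_star, lengthscale=0.1, noise=1e-6):
    """Posterior variance of a zero-mean GP with RBF kernel at a test point."""
    def k(a, b):
        return np.exp(-((a[:, None] - b[None, :]) ** 2) / (2 * lengthscale**2))
    K = k(X_train, X_train) + noise * np.eye(len(X_train))
    k_star = k(X_train, np.array([x_star]))[:, 0]
    prior_var = 1.0  # k(x*, x*) for the RBF kernel
    return prior_var - k_star @ np.linalg.solve(K, k_star)

rng = np.random.default_rng(1)
X = rng.uniform(0, 1, 200)
var_10 = gp_posterior_variance(X[:10], 0.5)   # conditioning on 10 points
var_200 = gp_posterior_variance(X, 0.5)       # conditioning on all 200 points
```

Because the conditioning sets are nested, the posterior variance can only decrease as more points are used, mirroring the almost-sure convergence to zero established in the paper.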
Article
Automation & Control Systems
George Wynne, Francois-Xavier Briol, Mark Girolami
Summary: Gaussian processes are widely used in machine learning, statistics, and applied mathematics, given their flexibility in approximating functions and quantifying uncertainty. However, when the smoothness of the model or of the likelihood function is misspecified, the accuracy of Gaussian process approximations can degrade; adjusting experimental designs and choosing kernels and hyperparameters appropriately can help alleviate this issue.
JOURNAL OF MACHINE LEARNING RESEARCH
(2021)
Article
Automation & Control Systems
Wenjia Wang, Bing-Yi Jing
Summary: This paper investigates the consequences of misspecified correlation function in Gaussian process regression and derives upper and lower error bounds. The study reveals that optimal convergence rate can be achieved even when the smoothness of the imposed correlation function exceeds that of the true correlation function, under a quasi-uniform sampling scheme. The paper also provides convergence rates for kernel ridge regression with misspecified kernel function.
JOURNAL OF MACHINE LEARNING RESEARCH
(2022)
Article
Computer Science, Artificial Intelligence
Kaito Watanabe, Kotaro Sakamoto, Ryo Karakida, Sho Sonoda, Shun-ichi Amari
Summary: The paper investigates the neural fields formed by a biological neural network in the cortex, specifically focusing on their supervised learning in a multilayer architecture. The field model is compared with randomly connected deep networks, and it is found that the neural field model exhibits better robustness and slight improvement in generalization ability compared to conventional models.
Article
Automation & Control Systems
Paul Scharnhorst, Emilio T. Maddalena, Yuning Jiang, Colin N. Jones
Summary: This study considers the problem of estimating out-of-sample bounds for an unknown ground-truth function. The main framework is that of kernels and their associated Hilbert spaces, together with an observational model featuring bounded measurement noise. The noise can come from any compactly supported distribution, and no independence assumptions are made about the available data. The study shows how solving parametric quadratically constrained linear programs yields tight, finite-sample uncertainty bounds. The properties of the approach are established, and its relationship with another method is studied through numerical experiments.
IEEE TRANSACTIONS ON AUTOMATIC CONTROL
(2023)
Article
Mathematics, Applied
Palle Jorgensen, James Tian
Summary: We have developed a new harmonic analysis of reproducing kernels for various applications including feature fitting in machine learning algorithms, dynamics in large networks, spline approximation, determinantal point processes, stochastic processes, and fractals.
APPLIED AND COMPUTATIONAL HARMONIC ANALYSIS
(2021)
Article
Computer Science, Artificial Intelligence
Jiamin Liu, Wangli Xu, Fode Zhang, Heng Lian
Summary: Kernel Fisher discriminant (KFD) is a popular nonlinear extension of Fisher's linear discriminant, but its asymptotic properties have rarely been studied. In this study, we propose an operator-theoretical formulation of KFD and establish the convergence of the KFD solution to its population target. We also introduce a sketched estimation approach based on an m × n sketching matrix, which retains the asymptotic properties even when m is much smaller than n. Numerical results demonstrate the effectiveness of the sketched estimator.
IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE
(2023)
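The key ingredient in this summary, an m × n random sketching matrix with m much smaller than n, is easiest to see on a plain least-squares problem. The sketch-and-solve scheme below is a generic illustration of the idea, not the paper's sketched KFD estimator; problem sizes and the Gaussian sketch are assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)
n, d, m = 2000, 5, 200                     # n samples, d features, sketch size m << n
A = rng.normal(size=(n, d))
x_true = rng.normal(size=d)
b = A @ x_true + 0.01 * rng.normal(size=n)

# full-data least-squares solution
x_full = np.linalg.lstsq(A, b, rcond=None)[0]

# sketch-and-solve: compress the n equations to m random linear combinations
S = rng.normal(size=(m, n)) / np.sqrt(m)   # m x n Gaussian sketching matrix
x_sk = np.linalg.lstsq(S @ A, S @ b, rcond=None)[0]

rel_err = np.linalg.norm(x_sk - x_full) / np.linalg.norm(x_full)
```

With a well-conditioned design and small residual, the sketched solution tracks the full solution closely even though only a tenth of the rows (in compressed form) are used.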
Article
Mathematics, Applied
Toni Karvonen, Simo Sarkka, Ken'ichiro Tanaka
Summary: In this study, we construct approximate Fekete point sets for kernel-based interpolation by maximising the determinant of a kernel Gram matrix obtained via truncation of an orthonormal expansion of the kernel. Uniform error estimates are proved for kernel interpolants at the resulting points. For the Gaussian kernel, it is shown that the approximate Fekete points in one dimension are the solution to a convex optimisation problem and that the interpolants converge with a super-exponential rate. Numerical examples are provided for the Gaussian kernel.
NUMERICAL ALGORITHMS
(2021)
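A common way to realize the determinant maximisation described above is greedy selection, which coincides with pivoted Cholesky on the Gram matrix: each step adds the candidate with the largest Schur complement (squared power function). The sketch below uses the Gaussian kernel directly on a 1-D grid and is illustrative only; the paper instead truncates an orthonormal expansion of the kernel:

```python
import numpy as np

def greedy_fekete(candidates, k_fn, n_points):
    """Greedily select points that maximise det of the kernel Gram matrix
    via pivoted Cholesky: each pivot is the largest residual diagonal."""
    K = k_fn(candidates, candidates)
    sel = []
    diag = np.diag(K).copy()          # residual diagonal = squared power function
    V = np.zeros((len(candidates), n_points))
    for j in range(n_points):
        i = int(np.argmax(diag))
        sel.append(i)
        pivot = np.sqrt(diag[i])
        col = (K[:, i] - V[:, :j] @ V[i, :j]) / pivot
        V[:, j] = col
        diag = np.clip(diag - col**2, 0.0, None)
    return np.array(sel)

def gauss_k(X, Y, eps=1.0):
    # Gaussian kernel on the real line (eps is an assumed shape parameter)
    return np.exp(-eps * (X[:, None] - Y[None, :]) ** 2)

grid = np.linspace(-1, 1, 201)
idx = greedy_fekete(grid, gauss_k, 8)
pts = np.sort(grid[idx])              # approximate Fekete points, spread over [-1, 1]
```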
Article
Mathematics, Applied
Wei Qu, Tao Qian, Haichou Li, Kehe Zhu
Summary: This study explores the best kernel approximation problem for analytic functions on the unit disk D in the reproducing kernel Hilbert space H, proving the existence of the best kernel approximation for weighted Bergman spaces with standard weights.
APPLIED MATHEMATICS AND COMPUTATION
(2022)
Article
Mathematics
Guan-Tie Deng, Yun Huang, Tao Qian
Summary: In this paper, the theory of Bergman kernel is extended to the weighted case, obtaining a general form of weighted Bergman reproducing kernel which can be used to calculate concrete Bergman kernel functions for specific weights and domains.
JOURNAL OF GEOMETRIC ANALYSIS
(2021)
Article
Statistics & Probability
Rui Tuo, Shiyuan He, Arash Pourhabib, Yu Ding, Jianhua Z. Huang
Summary: This article presents a frequentist solution to the functional calibration problem, using reproducing kernel Hilbert spaces (RKHS) to model the optimal calibration function and estimating it through penalized least squares with an RKHS-norm penalty and physical data. The proposed method also includes an uncertainty quantification procedure and provides theoretical guarantees of prediction consistency and estimation of the optimal calibration function. It outperforms existing parametric functional calibration and Bayesian methods in prediction and uncertainty quantification.
JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION
(2023)
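The estimator described above, penalized least squares with an RKHS-norm penalty, reduces in its simplest form to kernel ridge regression via the representer theorem. A minimal sketch (the RBF kernel, synthetic data, and hyperparameters are illustrative assumptions, not the article's calibration setup):

```python
import numpy as np

def krr_fit_predict(X, y, X_new, lengthscale=0.3, lam=1e-3):
    """Kernel ridge regression: min_f sum_i (y_i - f(x_i))^2 + lam * ||f||_RKHS^2.
    By the representer theorem f = sum_i alpha_i k(., x_i) with
    alpha = (K + lam I)^{-1} y."""
    def k(a, b):
        return np.exp(-((a[:, None] - b[None, :]) ** 2) / (2 * lengthscale**2))
    alpha = np.linalg.solve(k(X, X) + lam * np.eye(len(X)), y)
    return k(X_new, X) @ alpha

rng = np.random.default_rng(3)
X = rng.uniform(0, 1, 80)
y = np.sin(2 * np.pi * X) + 0.05 * rng.normal(size=80)  # noisy synthetic target
X_new = np.linspace(0, 1, 50)
pred = krr_fit_predict(X, y, X_new)
rmse = np.sqrt(np.mean((pred - np.sin(2 * np.pi * X_new)) ** 2))
```

The penalty weight `lam` trades data fit against the RKHS norm of the estimate, which is the same trade-off the article's penalized least-squares formulation exploits.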
Article
Engineering, Electrical & Electronic
Haojie Wang, Xifeng Li, Dongjie Bi, Xuan Xie, Yongle Xie
Summary: This paper introduces a kernel adaptive filter based on the Student's t distribution in an RKHS, which utilizes a special reproducing kernel function and a Strengthened Surprise Criterion to achieve higher accuracy and more compact network structures than traditional algorithms.
IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS II-EXPRESS BRIEFS
(2021)
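For context, the classic kernel least-mean-squares (KLMS) filter that such methods build on can be sketched as follows. The Student's t-based kernel and the Strengthened Surprise Criterion of the paper are not reproduced here; the RBF kernel, step size, and synthetic signal are illustrative assumptions:

```python
import numpy as np

def klms(x, d, eta=0.2, sigma=0.5):
    """Kernel LMS: online gradient descent in an RKHS.
    Prediction at step n: y_n = eta * sum_{i<n} e_i * k(x_n, x_i)."""
    centers, errors, preds = [], [], []
    for xn, dn in zip(x, d):
        c, e = np.array(centers), np.array(errors)
        yn = eta * np.sum(e * np.exp(-(xn - c) ** 2 / (2 * sigma**2))) if centers else 0.0
        en = dn - yn            # prediction error becomes the new expansion weight
        centers.append(xn)
        errors.append(en)
        preds.append(yn)
    return np.array(preds)

rng = np.random.default_rng(4)
x = rng.uniform(-1, 1, 400)
d = np.sin(3 * x) + 0.02 * rng.normal(size=400)  # nonlinear target plus noise
preds = klms(x, d)
mse_early = np.mean((d[:50] - preds[:50]) ** 2)
mse_late = np.mean((d[-50:] - preds[-50:]) ** 2)
```

Note the center dictionary grows without bound here; sparsification rules such as the paper's surprise-based criterion exist precisely to keep the network structure compact.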
Article
Mathematics, Interdisciplinary Applications
Dah-Chin Luor, Liang-Yu Hsieh
Summary: This paper investigates the connections between fractal interpolation functions (FIFs) and reproducing kernel Hilbert spaces (RKHSs). By establishing a fractal-type positive semi-definite kernel, it is shown that the span of linearly independent smooth FIFs is the corresponding RKHS. Furthermore, the nth derivatives of these FIFs, properties of related positive semi-definite kernels, and the importance of subspaces in curve-fitting applications are studied.
FRACTAL AND FRACTIONAL
(2023)
Article
Mathematics
Zeyuan Song, Zuoren Sun
Summary: The central problem of this study is to represent any holomorphic, square-integrable function on the Kepler manifold in series form based on Fourier analysis. Three different domains on the Kepler manifold are considered, and the weak pre-orthogonal adaptive Fourier decomposition (POAFD) is proposed. The weak maximal selection principle is shown to select the coefficients of the series, and a convergence theorem is proved to demonstrate the accuracy of the method.