Article

DATA-DRIVEN POLYNOMIAL RIDGE APPROXIMATION USING VARIABLE PROJECTION

Journal

SIAM JOURNAL ON SCIENTIFIC COMPUTING
Volume 40, Issue 3, Pages A1566-A1589

Publisher

SIAM PUBLICATIONS
DOI: 10.1137/17M1117690

Keywords

active subspaces; emulator; Grassmann manifold; response surface; ridge function; variable projection

Funding

  1. Department of Defense, Defense Advanced Research Projects Agency's program Enabling Quantification of Uncertainty in Physical Systems
  2. U.S. Department of Energy Office of Science, Office of Advanced Scientific Computing Research, Applied Mathematics program [DE-SC-0011077]

Abstract

Inexpensive surrogates are useful for reducing the cost of science and engineering studies involving large-scale, complex computational models with many input parameters. A ridge approximation is one class of surrogate that models a quantity of interest as a nonlinear function of a few linear combinations of the input parameters. When used in parameter studies (e.g., optimization or uncertainty quantification), ridge approximations allow the low-dimensional structure to be exploited, reducing the effective dimension. We introduce a new, fast algorithm for constructing a ridge approximation where the nonlinear function is a polynomial. This polynomial ridge approximation is chosen to minimize the least-squares mismatch between the surrogate and the quantity of interest on a given set of inputs. Naively, this would require optimizing both the polynomial coefficients and the linear combination weights, the latter of which define a low-dimensional subspace of the input space. However, given a fixed subspace, the optimal polynomial can be found by solving a linear least-squares problem. Hence, using variable projection, the polynomial can be implicitly defined, leaving an optimization problem over the subspace alone. Here we develop an algorithm that finds this polynomial ridge approximation by minimizing over the Grassmann manifold of low-dimensional subspaces using a Gauss-Newton method. Our Gauss-Newton method has superior theoretical guarantees and faster convergence on our numerical examples than the alternating approach for polynomial ridge approximation earlier proposed by Constantine, Eftekhari, Hokanson, and Ward [Comput. Methods Appl. Mech. Engrg., 326 (2017), pp. 402-421], which alternates between (i) optimizing the polynomial coefficients given the subspace and (ii) optimizing the subspace given the coefficients.
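The variable projection idea in the abstract can be sketched in a few lines: for a fixed subspace, the optimal polynomial coefficients solve a linear least-squares problem, so the polynomial is defined implicitly and only the subspace is optimized. Below is a minimal, hypothetical illustration for a one-dimensional subspace (Gr(1, n), i.e., a direction on the unit sphere), with a generic quasi-Newton solver standing in for the paper's Gauss-Newton method on the Grassmann manifold; the toy data and all names are assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical toy problem: f(x) = g(u^T x) for a cubic g and a single
# direction u, so the ridge subspace is one-dimensional.
rng = np.random.default_rng(0)
n, m, degree = 5, 200, 3
u_true = np.array([1.0, 2.0, 0.0, 0.0, 0.0]) / np.sqrt(5.0)
X = rng.standard_normal((m, n))                    # sampled input parameters
f = np.polyval([1.0, -0.5, 0.3, 0.1], X @ u_true)  # quantity of interest

def residual(u):
    # Variable projection: for a fixed direction u, the optimal polynomial
    # coefficients solve a linear least-squares problem, so the polynomial
    # never appears as an explicit optimization variable.
    u = u / np.linalg.norm(u)            # representative point on the sphere
    V = np.vander(X @ u, degree + 1)     # polynomial basis in the projected input
    c, *_ = np.linalg.lstsq(V, f, rcond=None)
    return f - V @ c

obj = lambda u: 0.5 * np.sum(residual(u) ** 2)

# A generic BFGS solver replaces the paper's Grassmann Gauss-Newton method;
# a few random restarts guard against local minima in this sketch.
best = min((minimize(obj, rng.standard_normal(n), method="BFGS")
            for _ in range(5)), key=lambda r: r.fun)
u_est = best.x / np.linalg.norm(best.x)

# The direction is identified only up to sign (the subspace is what matters).
error = min(np.linalg.norm(u_est - u_true), np.linalg.norm(u_est + u_true))
```

The paper's actual algorithm exploits the Jacobian of the residual and the geometry of the Grassmann manifold to take Gauss-Newton steps along geodesics, which is what yields the convergence advantages over the alternating scheme; the sketch above only shows the variable projection structure of the objective.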



Recommended

Article Mathematics, Applied

A LIPSCHITZ MATRIX FOR PARAMETER REDUCTION IN COMPUTATIONAL SCIENCE

Jeffrey M. Hokanson, Paul G. Constantine

Summary: The Lipschitz matrix is a generalization of the scalar Lipschitz constant for functions with many inputs, providing a function-dependent metric and improving performance in computational science tasks. The Lipschitz matrix reduces worst-case cost and dimensionality curse, with the ability to define uncertainty and perform parameter reduction in complex computational models.

SIAM JOURNAL ON SCIENTIFIC COMPUTING (2021)

Article Mathematics, Applied

A DATA-DRIVEN MCMILLAN DEGREE LOWER BOUND

Jeffrey M. Hokanson

SIAM JOURNAL ON SCIENTIFIC COMPUTING (2020)

Article Mathematics, Applied

H2-OPTIMAL MODEL REDUCTION USING PROJECTED NONLINEAR LEAST SQUARES

Jeffrey M. Hokanson, Caleb C. Magruder

SIAM JOURNAL ON SCIENTIFIC COMPUTING (2020)
