4.4 Article

Nonlinear gradient denoising: Finding accurate extrema from inaccurate functional derivatives

Journal

INTERNATIONAL JOURNAL OF QUANTUM CHEMISTRY
Volume 115, Issue 16, Pages 1102-1114

Publisher

WILEY
DOI: 10.1002/qua.24937

Keywords

density functional theory; machine learning; nonlinear gradient denoising; orbital-free density functional theory

Funding

  1. NSF [CHE-1240252]
  2. Alexander von Humboldt Foundation
  3. EU PASCAL2
  4. DFG
  5. Einstein Foundation
  6. National Research Foundation of Korea [2012-005741]

Abstract

A method for nonlinear optimization with machine learning (ML) models, called nonlinear gradient denoising (NLGD), is developed and applied with ML approximations to the kinetic energy density functional in orbital-free density functional theory. Because the gradients of ML models are systematically inaccurate, in particular when the data are very high dimensional, the optimization must be constrained to the data manifold. We use nonlinear kernel principal component analysis (PCA) to locally reconstruct the manifold, enabling a projected gradient descent along it. A thorough analysis of the method is given via a simple model designed to clarify the concepts presented. Additionally, NLGD is compared with the local PCA method used in previous work. Our method is shown to be superior in cases where the data manifold is highly nonlinear and high dimensional. Further applications of the method in both density functional theory and ML are discussed. (c) 2015 Wiley Periodicals, Inc.
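
The abstract describes the projected-gradient idea only in words. The sketch below is a hedged illustration of that idea, not the paper's algorithm: it uses scikit-learn's global KernelPCA reconstruction as a stand-in for the paper's local manifold reconstruction, a kernel-ridge surrogate in place of the ML kinetic-energy functional, and synthetic data. The kernel parameters, step size, and toy manifold are all illustrative assumptions.

```python
# Hedged sketch of manifold-projected gradient descent in the spirit of NLGD.
# Assumptions (not from the paper): synthetic data, a global KernelPCA
# reconstruction instead of a local one, and a kernel-ridge "energy" surrogate.
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(0)

# Training "densities": a 1-D nonlinear manifold embedded in 20 dimensions.
t = rng.uniform(-1.0, 1.0, size=200)
X = np.stack([np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)]
             + [0.05 * t ** (k + 1) for k in range(18)], axis=1)
y = t ** 2  # toy "energy" to minimize; its minimum lies near t = 0

# Surrogate energy model (stand-in for the ML kinetic-energy functional).
model = KernelRidge(kernel="rbf", gamma=0.5, alpha=1e-6).fit(X, y)

# Kernel PCA reconstruction of the data manifold, with a learned inverse map.
kpca = KernelPCA(n_components=2, kernel="rbf", gamma=0.5,
                 fit_inverse_transform=True, alpha=1e-4).fit(X)

def project(x):
    """Denoise a point by projecting it back onto the learned manifold."""
    return kpca.inverse_transform(kpca.transform(x[None, :]))[0]

def num_grad(f, x, h=1e-4):
    """Central finite-difference gradient of a scalar function."""
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2 * h)
    return g

energy = lambda x: float(model.predict(x[None, :])[0])

x = X[0].copy()  # start the search from a training point
for _ in range(100):
    g = num_grad(energy, x)
    x = project(x - 0.1 * g)  # gradient step, then projection onto the manifold

print("final energy:", energy(x))
```

The projection step is the point of the example: without it, the raw gradient of the surrogate quickly pushes the iterate off the data manifold, where its predictions (and gradients) are unreliable; constraining each step to the manifold keeps the optimization in the region where the model was trained.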
