Article

Improved accuracy and transferability of molecular-orbital-based machine learning: Organics, transition-metal complexes, non-covalent interactions, and transition states

Journal

JOURNAL OF CHEMICAL PHYSICS
Volume 154, Issue 6, Pages -

Publisher

AIP Publishing
DOI: 10.1063/5.0032362

Keywords

-

Funding

  1. U.S. Army Research Laboratory [W911NF-12-2-0023]
  2. U.S. Department of Energy (DOE) [DE-SC0019390]
  3. Caltech DeLogi Fund
  4. Camille and Henry Dreyfus Foundation [ML-20-196]
  5. Early Postdoc.Mobility Fellowship, Swiss National Science Foundation (SNF) [P2EZP2_184234]
  6. Molecular Sciences Software Institute (MolSSI)
  7. DOE Office of Science [DE-AC02-05CH11231]


MOB-ML is a molecular-orbital-based machine-learning approach that accurately predicts correlation energies. By preserving physical constraints, MOB-ML achieves numerical improvements across several datasets. Trained on only 1% of the QM7b-T dataset, it predicts the total energies of the remaining 99% more accurately than competing methods.
Molecular-orbital-based machine learning (MOB-ML) provides a general framework for the prediction of accurate correlation energies at the cost of obtaining molecular orbitals. The application of Nesbet's theorem makes it possible to recast a typical extrapolation task, training on correlation energies for small molecules and predicting correlation energies for large molecules, into an interpolation task based on the properties of orbital pairs. We demonstrate the importance of preserving physical constraints, including invariance conditions and size consistency, when generating the input for the machine learning model. Numerical improvements are demonstrated for different datasets covering total and relative energies for thermally accessible organic and transition-metal-containing molecules, non-covalent interactions, and transition-state energies. MOB-ML requires training data from only 1% of the QM7b-T dataset (i.e., only 70 organic molecules with seven or fewer heavy atoms) to predict the total energy of the remaining 99% of this dataset with sub-kcal/mol accuracy. This MOB-ML model is significantly more accurate than other methods when transferred to a dataset comprising molecules with 13 heavy atoms, exhibiting no loss of accuracy on a size-intensive (i.e., per-electron) basis. It is shown that MOB-ML also extrapolates well to transition-state structures, predicting the barrier region for the intramolecular proton transfer of malonaldehyde to within 0.35 kcal/mol when trained only on reactant/product-like structures. Finally, the Gaussian process variance enables an active learning strategy for extending the MOB-ML model to new regions of chemical space with minimal effort. We demonstrate this active learning strategy by extending a QM7b-T model to describe non-covalent interactions in the protein backbone-backbone interaction dataset to an accuracy of 0.28 kcal/mol.
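The orbital-pair decomposition the abstract refers to can be sketched as follows; this is a schematic summary of Nesbet's theorem as used in the MOB-ML framework, not an equation reproduced from this paper:

```latex
% Nesbet's theorem: the correlation energy is a sum of pair energies
% over occupied molecular orbitals i, j.
E_{\mathrm{corr}} = \sum_{i \le j} \varepsilon_{ij}
% MOB-ML learns each pair energy as a machine-learned functional of
% invariant, size-consistent orbital-pair features f_{ij}:
\varepsilon_{ij} \approx \varepsilon^{\mathrm{ML}}\!\left(\mathbf{f}_{ij}\right)
```

Because the features describe orbital pairs rather than whole molecules, predicting a large molecule reduces to interpolating among pair environments already seen in small-molecule training data, which is what makes the small-to-large transfer described above possible.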
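The variance-driven active learning mentioned at the end of the abstract can be illustrated with a toy Gaussian process on synthetic data. Everything below is an illustrative assumption: the RBF kernel, the synthetic "molecules" and "energies", and the loop sizes are not taken from the paper, which uses MOB features and its own kernel choices. The sketch only shows the generic strategy of querying the point with the largest posterior variance:

```python
import numpy as np

def rbf_kernel(A, B, length_scale=1.0):
    """Squared-exponential kernel between two sets of feature vectors."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-0.5 * d2 / length_scale**2)

def gp_predict(X_train, y_train, X_test, noise=1e-6):
    """Posterior mean and standard deviation of a zero-mean GP regressor."""
    K = rbf_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    Ks = rbf_kernel(X_train, X_test)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.ones(len(X_test)) - (v**2).sum(axis=0)  # k(x, x) = 1 for RBF
    return mean, np.sqrt(np.clip(var, 0.0, None))

rng = np.random.default_rng(0)
X_pool = rng.uniform(-3.0, 3.0, size=(200, 2))        # stand-in "molecules"
y_pool = np.sin(X_pool[:, 0]) * np.cos(X_pool[:, 1])  # stand-in "energies"

labelled = list(range(5))  # small initial training set, as in the 1% regime
for _ in range(15):
    # Active-learning step: label the pool point the GP is least sure about.
    _, std = gp_predict(X_pool[labelled], y_pool[labelled], X_pool)
    std[labelled] = -1.0                # never re-query labelled points
    labelled.append(int(np.argmax(std)))

mean, _ = gp_predict(X_pool[labelled], y_pool[labelled], X_pool)
rmse = float(np.sqrt(np.mean((mean - y_pool) ** 2)))
print(f"labelled: {len(labelled)}, pool RMSE: {rmse:.4f}")
```

The design choice mirrored here is that the Gaussian process supplies its own uncertainty estimate for free, so new training data can be requested only where the model is uncertain rather than by labelling the whole new region of chemical space.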

