Journal
JOURNAL OF PETROLEUM SCIENCE AND ENGINEERING
Volume 205
Publisher
ELSEVIER
DOI: 10.1016/j.petrol.2021.108975
Keywords
Numerical simulation; Machine learning; Surrogate modeling; Uncertainty estimation; Hyperparameter optimization; Model goodness
This study proposes a method for improving the tuning of the dropout frequency in deep learning models by integrating model uncertainty and assessing the goodness of the entire uncertainty model based on prediction realizations. Testing on a subsurface flow surrogate model shows that the proposed method results in accurate and precise uncertainty models. The robustness and generalization of the method are also tested with out-of-distribution testing data.
Machine learning hyperparameter tuning generally relies on minimizing the prediction error and maximizing the accuracy at withheld testing data locations. However, for many subsurface applications, prediction accuracy alone is not sufficient; we must consider the goodness, i.e., the accuracy and precision, of the entire uncertainty model. Bayesian deep neural networks provide a robust framework to model uncertainty, but at a high computational cost and with difficulty scaling to high-dimensional parameter spaces. Dropout offers a practical alternative for calculating prediction model realizations that represent the uncertainty model. We propose a method to assess the entire uncertainty model from the prediction realizations and to add a single objective, known as uncertainty model goodness, to the loss function to integrate model uncertainty for improved tuning of the dropout frequency in deep learning models. We test our method on a subsurface flow surrogate model posed as an image-to-image regression problem. Implementing the proposed method with the uncertainty model goodness results in accurate and precise uncertainty models at the tuned dropout hyperparameter. We also test the robustness and generalization of the proposed method with out-of-distribution testing data.
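The abstract's core mechanism, keeping dropout active at prediction time to draw stochastic forward passes as realizations of an uncertainty model, can be sketched as below. This is a minimal illustrative example of Monte Carlo dropout on a toy two-layer network, not the authors' surrogate model or exact goodness objective; the weights, input, and the coverage-style check at the end are all assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def mc_dropout_predict(x, W1, W2, p_drop, n_realizations):
    """Monte Carlo dropout: keep dropout active at prediction time and
    draw repeated stochastic forward passes as prediction realizations."""
    realizations = []
    for _ in range(n_realizations):
        h = np.maximum(x @ W1, 0.0)              # ReLU hidden layer
        mask = rng.random(h.shape) > p_drop      # Bernoulli dropout mask
        h = h * mask / (1.0 - p_drop)            # inverted-dropout scaling
        realizations.append(h @ W2)
    return np.stack(realizations)                # (n_realizations, n, 1)

# Toy weights and inputs (illustrative only)
W1 = rng.normal(size=(3, 16))
W2 = rng.normal(size=(16, 1))
x = rng.normal(size=(5, 3))

reals = mc_dropout_predict(x, W1, W2, p_drop=0.2, n_realizations=200)
mean, std = reals.mean(axis=0), reals.std(axis=0)

# Goodness-style check (in the spirit of the paper's objective): for a
# symmetric p-probability interval of the realization distribution, the
# fraction of reference values falling inside should be close to p.
# Here the deterministic (no-dropout) forward pass stands in for truth.
y_ref = np.maximum(x @ W1, 0.0) @ W2
p = 0.8
lo = np.quantile(reals, (1 - p) / 2, axis=0)
hi = np.quantile(reals, 1 - (1 - p) / 2, axis=0)
coverage = np.mean((y_ref >= lo) & (y_ref <= hi))
```

Sweeping `p_drop` and scoring each candidate by how closely such coverage fractions track their nominal probabilities is one simple way to fold an uncertainty-goodness term into hyperparameter tuning, as the abstract describes for the dropout frequency.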