Article

Data-driven modeling of coarse mesh turbulence for reactor transient analysis using convolutional recurrent neural networks

Journal

NUCLEAR ENGINEERING AND DESIGN
Volume 390, Article 111716

Publisher

ELSEVIER SCIENCE SA
DOI: 10.1016/j.nucengdes.2022.111716

Keywords

Machine learning; Deep neural network; Thermal mixing and stratification; Convolutional recurrent neural networks

Funding

  1. Laboratory Directed Research and Development (LDRD) program at Argonne National Laboratory, which is operated for the Office of Science of the U.S. Department of Energy [DE-AC02-06CH11357]


Abstract

Advanced nuclear reactors often exhibit complex thermal-fluid phenomena during transients. To accurately capture such phenomena, a coarse-mesh three-dimensional (3-D) modeling capability is desired for modern nuclear system codes. In the coarse-mesh 3-D modeling of advanced-reactor transients that involve flow and heat transfer, accurately predicting the turbulent viscosity is a challenging task that requires an accurate and computationally efficient model to capture the unresolved fine-scale turbulence. In this paper, we propose a data-driven coarse-mesh turbulence model based on local flow features for the transient analysis of thermal mixing and stratification in a sodium-cooled fast reactor. The model has a coarse-mesh setup to ensure computational efficiency, while it is trained on fine-mesh computational fluid dynamics (CFD) data to ensure accuracy. A novel neural network architecture, combining a densely connected convolutional network and a long short-term memory (LSTM) network, is developed to efficiently learn from the spatial-temporal CFD transient simulation results. The neural network model was trained and optimized on a loss-of-flow transient and demonstrated high accuracy in predicting the turbulent viscosity field throughout the whole transient. The trained model's generalization capability was also investigated on two other transients with different inlet conditions. The study demonstrates the potential of applying the proposed data-driven approach to support the coarse-mesh multi-dimensional modeling of advanced reactors.
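The page does not include the model's implementation details, but the architecture described in the abstract (a densely connected convolutional encoder feeding an LSTM that tracks the transient evolution and regresses turbulent viscosity on a coarse mesh) can be illustrated with a minimal sketch. The PyTorch code below is such a sketch under stated assumptions: the class names, channel counts, number of dense layers, and the choice of four local flow features per coarse-mesh cell are illustrative, not the authors' actual configuration.

```python
# Hypothetical sketch of a convolutional recurrent surrogate for coarse-mesh
# turbulent viscosity. Layer sizes, growth rate, and the four input flow
# features are illustrative assumptions, not the paper's configuration.
import torch
import torch.nn as nn


class DenseBlock3D(nn.Module):
    """Densely connected 3-D convolutional block: each layer receives the
    concatenation of all preceding feature maps (DenseNet-style)."""

    def __init__(self, in_channels: int, growth_rate: int = 8, n_layers: int = 3):
        super().__init__()
        self.layers = nn.ModuleList()
        channels = in_channels
        for _ in range(n_layers):
            self.layers.append(
                nn.Sequential(
                    nn.Conv3d(channels, growth_rate, kernel_size=3, padding=1),
                    nn.ReLU(inplace=True),
                )
            )
            channels += growth_rate
        self.out_channels = channels

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        for layer in self.layers:
            x = torch.cat([x, layer(x)], dim=1)  # dense connectivity
        return x


class ConvLSTMSurrogate(nn.Module):
    """Encode each coarse-mesh snapshot with a dense conv block, then let an
    LSTM track the transient and regress turbulent viscosity per cell."""

    def __init__(self, n_features: int = 4, hidden: int = 64):
        super().__init__()
        self.encoder = DenseBlock3D(n_features)
        self.lstm = nn.LSTM(self.encoder.out_channels, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)  # one viscosity value per cell and step

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, features, nx, ny, nz) local flow features per cell
        b, t, c, nx, ny, nz = x.shape
        feats = self.encoder(x.reshape(b * t, c, nx, ny, nz))
        # Illustrative simplification: treat each coarse cell as an
        # independent sequence over time for the recurrent part.
        feats = feats.reshape(b, t, -1, nx * ny * nz).permute(0, 3, 1, 2)
        feats = feats.reshape(b * nx * ny * nz, t, -1)
        out, _ = self.lstm(feats)
        nu_t = self.head(out).reshape(b, nx * ny * nz, t).permute(0, 2, 1)
        return nu_t.reshape(b, t, nx, ny, nz)  # predicted turbulent viscosity


if __name__ == "__main__":
    model = ConvLSTMSurrogate()
    demo = torch.randn(2, 10, 4, 8, 8, 8)  # 2 transients, 10 time steps
    print(model(demo).shape)  # torch.Size([2, 10, 8, 8, 8])
```

In a workflow of the kind the abstract describes, such a model would be trained on fine-mesh CFD results projected onto the coarse mesh (here, for one transient) and then evaluated on other transients with different inlet conditions to probe generalization.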

