1. Ivanov E, De Saint-Jean C, Sobes V. Nuclear data assimilation, scientific basis and current status. EPJ Nuclear Sciences & Technologies 2021. DOI: 10.1051/epjn/2021008

Abstract
Data assimilation methodologies, also known as data adjustment, link the results of theoretical and experimental studies, improving the accuracy of simulation models and giving confidence to designers and regulatory bodies. From a mathematical point of view, data assimilation produces an optimized fit to experimental data, inferring unknown causes from known consequences, which is crucial for data calibration and validation. It adds value to the nuclear data (ND) evaluation process in three ways: by adjusting nuclear data to a particular application, providing a so-called optimized design-oriented library; by calibrating nuclear data against integral experiments (IEs), since theories and differential experiments provide only relative values; and by supplying an evidence-based foundation for the validation of nuclear data libraries, substantiating the uncertainty quantification (UQ) process. It likewise valorizes experimental data and the experiments themselves, bringing legacy and newly commissioned experiments into scientific use by extracting the essential information they inherently contain, and by helping prioritize dedicated basic experimental programs. A number of popular algorithms, deterministic ones such as the Generalized Linear Least Squares (GLLS) methodology and stochastic ones such as Backward Monte Carlo, Total Monte Carlo, and Hierarchic Monte Carlo, differ in their particular numerical formalism but share a common Bayesian theoretical basis. They have demonstrated sufficient maturity, providing optimized design-oriented data libraries or evidence-based foundations for science-driven validation of general-purpose libraries across a wide range of practical applications.
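The deterministic GLLS adjustment mentioned above has a compact closed form in the linear-Gaussian case. The sketch below illustrates it on a hypothetical toy problem (two parameters, two integral responses; all matrices and numbers are illustrative, not taken from the article):

```python
import numpy as np

# Hypothetical toy problem: 2 nuclear-data parameters, 2 integral responses.
p0 = np.array([1.0, 2.0])              # prior parameters
M = np.diag([0.04, 0.09])              # prior parameter covariance
S = np.array([[1.0, 0.5],
              [0.2, 1.0]])             # sensitivity matrix dC/dp
V = np.diag([0.01, 0.01])              # experimental covariance
C = S @ p0                             # calculated responses (linear model)
E = np.array([2.1, 2.3])               # "measured" integral responses

# GLLS / Bayesian update, linear-Gaussian case:
K = M @ S.T @ np.linalg.inv(S @ M @ S.T + V)   # gain matrix
p_post = p0 + K @ (E - C)                      # adjusted parameters
M_post = M - K @ S @ M                         # reduced posterior covariance
```

The gain matrix weighs prior parameter uncertainty against experimental uncertainty; the posterior covariance is always reduced relative to the prior, which is the mechanism by which assimilation substantiates the UQ process.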
2. Siefman D, Hursin M, Sjostrand H, Schnabel G, Rochman D, Pautz A. Data assimilation of post-irradiation examination data for fission yields from GEF. EPJ Nuclear Sciences & Technologies 2020. DOI: 10.1051/epjn/2020015
Abstract
Nuclear data, especially fission yields, create uncertainties in the predicted concentrations of fission products in spent fuel which can exceed engineering target accuracies. Herein, we present a new framework that extends data assimilation methods to burnup simulations by using post-irradiation examination experiments. The adjusted fission yields lowered the bias and reduced the uncertainty of the simulations. Our approach adjusts the model parameters of the code GEF. We compare the BFMC and MOCABA approaches to data assimilation, focusing especially on the effects of the non-normality of GEF's fission yields. In the application presented here, the best data assimilation framework decreased the average bias of the simulations from 26% to 14%, and the average relative standard deviation from 21% to 14%. The GEF fission yields after data assimilation agreed better with those in JEFF-3.3. For Pu-239 thermal fission, the average relative difference from JEFF-3.3 was 16% before data assimilation and 12% after. The standard deviations of GEF's fission yields were 100% larger than JEFF-3.3's before data assimilation and only 4% larger after. The inconsistency of the integral data had an important effect on MOCABA, as shown with the Marginal Likelihood Optimization method: when the method was not applied, MOCABA's adjusted fission yields worsened the bias of the simulations by 30%, whereas BFMC inherently accounted for this inconsistency. Applying Marginal Likelihood Optimization with BFMC gave a 2% lower bias than not applying it, but the results were more poorly converged.
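The sampling-based BFMC approach handles non-normal parameter distributions by weighting sampled parameter sets according to their agreement with the integral data. A minimal sketch, assuming a hypothetical one-parameter toy model in place of GEF and the exp(-chi2/chi2_min) weighting commonly associated with BFMC (all names and numbers are illustrative, not from the article):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy stand-in for GEF: one model parameter mapped to two
# predicted fission-product concentrations (the real code is far richer).
def model(theta):
    return np.array([theta, 0.5 * theta**2])

E = np.array([1.0, 0.55])          # "measured" PIE concentrations
V = np.diag([0.05**2, 0.05**2])    # experimental covariance
Vinv = np.linalg.inv(V)

# Sample model parameters from a (possibly non-normal) prior.
thetas = rng.lognormal(mean=0.0, sigma=0.3, size=5000)
chi2 = np.array([(model(t) - E) @ Vinv @ (model(t) - E) for t in thetas])

# BFMC-style weights: exp(-chi2/chi2_min), no normality assumption needed.
w = np.exp(-chi2 / chi2.min())
w /= w.sum()

theta_post = np.sum(w * thetas)                    # adjusted parameter
var_post = np.sum(w * (thetas - theta_post)**2)    # posterior variance
```

Because the posterior is represented by weighted samples rather than a mean and covariance, skewed or otherwise non-Gaussian parameter distributions are carried through the adjustment unchanged, which is the property the comparison with MOCABA probes.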
3. Siefman D, Hursin M, Pautz A. Data assimilation of post irradiation examination experiments to adjust fission yields. EPJ Web of Conferences 2020. DOI: 10.1051/epjconf/202023913004
Abstract
Nuclear data, especially fission yields, create uncertainties in the predicted concentrations of fission products in spent fuel. Herein, we present a new framework that extends data assimilation methods to burnup simulations by using data from post-irradiation examination experiments. The adjusted fission yields improve the bias and reduce the uncertainty of predicted fission product concentrations in spent fuel. Our approach modifies fission yields by adjusting the model parameters of the code GEF with post-irradiation examination experiments. We used the BFMC data assimilation method to account for the non-normality of GEF's fission yields. In the application that we present, the assimilation decreased the average bias of the predicted fission product concentrations from 26% to 15%. The average relative standard deviation decreased from 21% to 14%. The GEF fission yields after data assimilation agreed better with those in ENDF/B-VIII.0. For Pu-239 thermal fission, the average relative difference from ENDF/B-VIII.0 was 16% before data assimilation and 11% after. For the standard deviations of the fission yields, GEF's were, on average, 16% larger than those from ENDF/B-VIII.0 before data assimilation and 15% smaller after.