101. Albani RAS, Albani VVL, Migon HS, Silva Neto AJ. Uncertainty quantification and atmospheric source estimation with a discrepancy-based and a state-dependent adaptative MCMC. Environ Pollut 2021; 290:118039. PMID: 34467885. DOI: 10.1016/j.envpol.2021.118039.
Abstract
We address the source characterization of atmospheric releases using adaptive strategies in Bayesian inference in combination with the numerical solution of the dispersion problem by a stabilized finite element method and uncertainty quantification in the measurements. The adaptive techniques accelerate the convergence of Markov chain Monte Carlo (MCMC) algorithms, leading to accurate reconstructions of the source parameters. Such accuracy is illustrated by comparison with results from previous works. Moreover, the technique used to simulate the corresponding dispersion problem allowed us to introduce relevant meteorological information. The uncertainty quantification also improves the quality of the reconstructions. Numerical examples using data from the Copenhagen experimental campaign illustrate the effectiveness of the proposed methodology. We found reconstruction errors ranging from 0.11% to 8.67% of the size of the search region, similar to results found in previous works using deterministic techniques, with comparable computational time.
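As a rough illustration of the adaptive ingredient, the sketch below implements a Haario-style adaptive random-walk Metropolis sampler in Python; the Gaussian target, adaptation schedule, and tuning constants are illustrative assumptions, not the paper's discrepancy-based, state-dependent algorithm.

```python
import numpy as np

def log_post(theta):
    # Placeholder log-posterior: standard bivariate Gaussian (assumption).
    return -0.5 * np.dot(theta, theta)

def adaptive_metropolis(n_iter=5000, d=2, t0=500, eps=1e-6, seed=0):
    rng = np.random.default_rng(seed)
    sd = 2.38 ** 2 / d                        # classical proposal scaling
    chain = np.zeros((n_iter, d))
    theta, lp = np.zeros(d), log_post(np.zeros(d))
    cov = np.eye(d)
    for t in range(n_iter):
        if t > t0:                            # adapt proposal to chain history
            cov = np.cov(chain[:t].T) + eps * np.eye(d)
        prop = rng.multivariate_normal(theta, sd * cov)
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:   # Metropolis accept/reject
            theta, lp = prop, lp_prop
        chain[t] = theta
    return chain

samples = adaptive_metropolis()
print(samples.mean(axis=0), samples.std(axis=0))   # should approach (0, 0) and (1, 1)
```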
102. Bickel DR. Propagating clade and model uncertainty to confidence intervals of divergence times and branch lengths. Mol Phylogenet Evol 2021; 167:107357. PMID: 34785383. DOI: 10.1016/j.ympev.2021.107357.
Abstract
Confidence intervals of divergence times and branch lengths do not reflect uncertainty about their clades or about the prior distributions and other model assumptions on which they are based. Uncertainty about the clade may be propagated to a confidence interval by multiplying its confidence level by the bootstrap proportion of its clade, or by another probability that the clade is correct. (If the confidence level is 95% and the bootstrap proportion is 90%, then the uncertainty-adjusted confidence level is (0.95)(0.90) ≈ 86%.) Uncertainty about the model can be propagated to the confidence interval by reporting the union of the confidence intervals from all the plausible models. Unless the confidence intervals are disjoint, this yields an uncertainty-adjusted interval whose lower and upper limits are the most extreme limits across the models. The proposed methods of uncertainty quantification may be used together.
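The two propagation rules are simple enough to state in a few lines. A minimal sketch, with illustrative numbers rather than values from the paper:

```python
def adjusted_level(conf_level, clade_prob):
    """Discount the confidence level by the probability the clade is correct."""
    return conf_level * clade_prob

def interval_union(intervals):
    """Union of overlapping per-model intervals: most extreme lower/upper limits."""
    lowers, uppers = zip(*intervals)
    return min(lowers), max(uppers)

print(adjusted_level(0.95, 0.90))                    # 0.855, i.e. ~86%
print(interval_union([(10.2, 14.8), (11.0, 16.3)]))  # (10.2, 16.3)
```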
103. Nilsen GK, Munthe-Kaas AZ, Skaug HJ, Brun M. Epistemic uncertainty quantification in deep learning classification by the Delta method. Neural Netw 2021; 145:164-176. PMID: 34749029. DOI: 10.1016/j.neunet.2021.10.014.
Abstract
The Delta method is a classical procedure for quantifying epistemic uncertainty in statistical models, but its direct application to deep neural networks is prevented by the large number of parameters P. We propose a low-cost approximation of the Delta method applicable to L2-regularized deep neural networks, based on the top K eigenpairs of the Fisher information matrix. We address efficient computation of full-rank approximate eigendecompositions in terms of the exact inverse Hessian, the inverse outer-products-of-gradients approximation, and the so-called Sandwich estimator. Moreover, we provide bounds on the approximation error for the uncertainty of the predictive class probabilities. We show that when the smallest computed eigenvalue of the Fisher information matrix is near the L2-regularization rate, the approximation error will be close to zero even when K≪P. A demonstration of the methodology is presented using a TensorFlow implementation, and we show that meaningful rankings of images based on predictive uncertainty can be obtained for two LeNet- and ResNet-based neural networks using the MNIST and CIFAR-10 datasets. Further, we observe that false positives have on average a higher predictive epistemic uncertainty than true positives. This suggests that the uncertainty measure carries supplementary information not captured by the classification alone.
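A minimal numpy sketch of the low-rank idea, under the abstract's own assumption that the unseen eigenvalues sit near the L2-regularization rate; the synthetic matrix with a fast-decaying spectrum and the random gradient are stand-ins for a real network's Fisher information matrix and class-probability gradient.

```python
import numpy as np

rng = np.random.default_rng(1)
P, K, lam_reg = 200, 20, 1e-2                           # params, kept eigenpairs, L2 rate

Q, _ = np.linalg.qr(rng.standard_normal((P, P)))        # random orthonormal basis
spectrum = np.arange(1, P + 1) ** -3.0 + lam_reg        # fast-decaying eigenvalues
F = (Q * spectrum) @ Q.T                                # stand-in Fisher matrix
g = rng.standard_normal(P)                              # stand-in gradient d p_class / d weights

eigval, eigvec = np.linalg.eigh(F)
top_val, top_vec = eigval[-K:], eigvec[:, -K:]          # top-K eigenpairs

# Treat all unseen eigenvalues as lam_reg: exact on the top-K subspace,
# 1/lam_reg on its orthogonal complement.
proj = top_vec.T @ g
var_lowrank = (proj ** 2 / top_val).sum() + (g @ g - proj @ proj) / lam_reg

var_exact = g @ np.linalg.solve(F, g)                   # Delta method: g^T F^{-1} g
print(var_lowrank, var_exact)                           # close when K is large enough
```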
104. Harbrecht H, Schmidlin M. Multilevel quadrature for elliptic problems on random domains by the coupling of FEM and BEM. Stochastic Partial Differential Equations: Analysis and Computations 2021; 10:1619-1650. PMID: 36324998. PMCID: PMC9617976. DOI: 10.1007/s40072-021-00214-w.
Abstract
Elliptic boundary value problems which are posed on a random domain can be mapped to a fixed, nominal domain. The randomness is thus transferred to the diffusion matrix and the loading. While this domain mapping method is quite efficient for theory and practice, since only a single domain discretisation is needed, it also requires knowledge of the domain mapping. However, in certain applications, the random domain is only described by its random boundary, while the quantity of interest is defined on a fixed, deterministic subdomain. In this setting, it becomes necessary to compute a random domain mapping on the whole domain, such that the mapping is the identity on the fixed subdomain and maps the boundary of the chosen fixed, nominal domain onto the random boundary. To avoid having to compute such a mapping, we couple the finite element method on the fixed subdomain with the boundary element method on the random boundary. We verify, on the one hand, the regularity of the solution with respect to the random domain mapping required for many multilevel quadrature methods, such as the multilevel quasi-Monte Carlo quadrature using Halton points, the multilevel sparse anisotropic Gauss-Legendre and Clenshaw-Curtis quadratures, and multilevel interlaced polynomial lattice rules. On the other hand, we derive the coupling formulation and show by numerical results that the approach is feasible.
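For orientation, the generic multilevel quadrature estimator that such regularity results justify combines spatial discretisation levels with quadrature levels in a telescoping sum; the sketch below shows the standard form of the scheme, with notation assumed here, not the paper's specific FEM-BEM formulation.

```latex
% u_ell: FEM/BEM solution on spatial refinement level ell, with u_{-1} := 0;
% Q_ell: quadrature rule of accuracy level ell in the stochastic parameter.
% Pairing fine spatial levels with coarse quadratures (and vice versa)
% equilibrates the two error contributions.
\mathbb{E}[u] \;\approx\; \sum_{\ell=0}^{L} Q_{L-\ell}\left[\, u_{\ell} - u_{\ell-1} \right]
```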
105. Ceresa L, Guadagnini A, Porta GM, Riva M. Formulation and probabilistic assessment of reversible biodegradation pathway of Diclofenac in groundwater. Water Res 2021; 204:117466. PMID: 34530227. DOI: 10.1016/j.watres.2021.117466.
Abstract
We present a conceptual and mathematical framework leading to the development of a biodegradation model capable of interpreting the observed reversibility of the pharmaceutical sodium Diclofenac along its biological degradation pathway in groundwater. Diclofenac occurrence in water bodies poses major concerns due to its persistent (and bioactive) nature and its detection in surface waters and aquifer systems. Despite some evidence of its biodegradability under given reducing conditions, Diclofenac attenuation is often interpreted with overly streamlined models, thus potentially hampering appropriate quantification of its fate. In this context, we propose a modeling framework based on a conceptualization of the molecular mechanisms of Diclofenac biodegradation, which we then embed in a stochastic context, thus enabling one to quantify predictive uncertainty. We consider reference environmental conditions (biotic and denitrifying) associated with a set of batch experiments that evidence the occurrence of a reversible biotransformation pathway, a feature that is fully captured by our model. The latter is then calibrated in a Bayesian framework through an Acceptance-Rejection Sampling approach. By doing so, we quantify the uncertainty associated with model parameters and predicted Diclofenac concentrations. We discuss the probabilistic nature of the uncertain model parameters and the challenges posed by their calibration with the available data. Our results are consistent with the recalcitrant behavior exhibited by Diclofenac in groundwater and documented through experimental data, and support the observation that unbiased estimates of the hazard posed by Diclofenac to water resources should be obtained through a modeling strategy which fully embeds uncertainty quantification.
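A minimal sketch of posterior sampling by acceptance-rejection: draw a candidate from the prior and accept it with probability proportional to its likelihood. The one-parameter first-order decay model, synthetic data, and prior bounds are illustrative assumptions, not the paper's Diclofenac pathway model.

```python
import numpy as np

rng = np.random.default_rng(2)
t_obs = np.array([0.0, 1.0, 2.0, 4.0])            # sampling times (illustrative)
c_obs = np.array([1.00, 0.62, 0.40, 0.17])        # synthetic concentrations
sigma = 0.05                                      # measurement noise std

def log_like(k):
    resid = c_obs - np.exp(-k * t_obs)            # first-order decay model
    return -0.5 * np.sum((resid / sigma) ** 2)

# Upper bound on the likelihood over the prior support, found on a grid.
log_like_max = max(log_like(k) for k in np.linspace(0.0, 2.0, 2001))

posterior = []
while len(posterior) < 1000:
    k = rng.uniform(0.0, 2.0)                     # draw from the uniform prior
    if np.log(rng.uniform()) < log_like(k) - log_like_max:
        posterior.append(k)                       # accept w.p. L(k)/L_max

print(np.mean(posterior), np.percentile(posterior, [2.5, 97.5]))
```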
106. Henyš P, Vořechovský M, Kuchař M, Heinemann A, Kopal J, Ondruschka B, Hammer N. Bone mineral density modeling via random field: Normality, stationarity, sex and age dependence. Comput Methods Programs Biomed 2021; 210:106353. PMID: 34500142. DOI: 10.1016/j.cmpb.2021.106353.
Abstract
BACKGROUND AND OBJECTIVE Capturing the population variability of bone properties is of paramount importance to biomedical engineering. The aim of the present paper is to describe variability and correlations in bone mineral density with a spatial random field inferred from routine computed tomography data. METHODS Random fields were simulated by transforming pairwise uncorrelated Gaussian random variables into correlated variables through the spectral decomposition of an age-detrended correlation matrix. The validity of the random field model was demonstrated in a spatiotemporal analysis of bone mineral density. The similarity between the computed tomography samples and those generated via random fields was analyzed with the energy distance metric. RESULTS The random field of bone mineral density was found to be approximately Gaussian, slightly left-skewed, or strongly right-skewed, depending on location. However, average bone density could be simulated well with the proposed Gaussian random field, for which the energy distance, i.e., a measure that quantifies discrepancies between two distribution functions, is convergent with respect to the number of correlation eigenpairs. CONCLUSIONS The proposed random field model allows computational biomechanical models to be enhanced with variability in bone mineral density, which could increase their usability and provides a step forward in in-silico medicine.
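The simulation step named in the METHODS can be sketched in a few lines: transform uncorrelated standard Gaussians into correlated ones via the spectral decomposition of a correlation matrix. The exponential correlation model on a 1D grid is an illustrative stand-in for the age-detrended BMD correlation matrix.

```python
import numpy as np

rng = np.random.default_rng(3)
x = np.linspace(0.0, 1.0, 50)                          # field locations
corr = np.exp(-np.abs(x[:, None] - x[None, :]) / 0.2)  # correlation matrix

eigval, eigvec = np.linalg.eigh(corr)                  # spectral decomposition
eigval = np.clip(eigval, 0.0, None)                    # guard tiny negative values
L = eigvec * np.sqrt(eigval)                           # so that L @ L.T == corr

z = rng.standard_normal((50, 1000))                    # pairwise uncorrelated samples
fields = L @ z                                         # correlated Gaussian fields

print(np.corrcoef(fields)[0, 1], corr[0, 1])           # sample vs. target correlation
```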
107. Wang D, Yu J, Chen L, Li X, Jiang H, Chen K, Zheng M, Luo X. A hybrid framework for improving uncertainty quantification in deep learning-based QSAR regression modeling. J Cheminform 2021; 13:69. PMID: 34544485. PMCID: PMC8454160. DOI: 10.1186/s13321-021-00551-x.
Abstract
Reliable uncertainty quantification for statistical models is crucial in various downstream applications, especially in drug design and discovery, where mistakes may incur a large cost. This topic has therefore attracted much attention, and a plethora of methods have been proposed over the past years. The approaches reported so far fall mainly into two classes: distance-based approaches and Bayesian approaches. Although these methods have been widely used in many scenarios and have shown promising performance, each with its own strengths, overconfidence on out-of-distribution examples still poses challenges for their deployment in real-world applications. In this study we investigated a number of consensus strategies for combining distance-based and Bayesian approaches, together with post-hoc calibration, for improved uncertainty quantification in QSAR (Quantitative Structure-Activity Relationship) regression modeling. We employed a set of criteria to quantitatively assess the ranking and calibration ability of these models. Experiments based on 24 bioactivity datasets were designed to critically compare the proposed model with other well-studied baseline models. Our findings indicate that the proposed hybrid framework robustly enhances the model's ability to rank absolute errors. Together with post-hoc calibration on the validation set, we show that well-calibrated uncertainty quantification results can be obtained in domain-shift settings. The complementarity between different methods is also conceptually analyzed.
108. McNulty MJ, Kelada K, Paul D, Nandi S, McDonald KA. Techno-economic process modelling and Monte Carlo simulation data of uncertainty quantification in field-grown plant-based manufacturing. Data Brief 2021; 38:107317. PMID: 34485647. PMCID: PMC8405912. DOI: 10.1016/j.dib.2021.107317.
Abstract
This data article is related to the research article, “M.J. McNulty, K. Kelada, D. Paul, S. Nandi, and K.A. McDonald, Introducing uncertainty quantification to techno-economic models of manufacturing field-grown plant-made products, Food Bioprod. Process. 128 (2021) 153–165.” The raw and analyzed data presented are related to generation, analysis, and optimization of ultra-large-scale field-grown plant-based manufacturing of high-value recombinant protein under uncertainty. The data have been acquired using deterministic techno-economic process model simulation in SuperPro Designer integrated with stochastic Monte Carlo-based simulation in Microsoft Excel using the Crystal Ball plug-in. The purpose of the article is to make techno-economic and associated uncertainty data available to be leveraged and adapted for other research purposes.
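A minimal sketch of the kind of Monte Carlo propagation the data describe: sample uncertain inputs, push them through a deterministic techno-economic quantity, and summarize the output distribution. The cost expression and input distributions are illustrative, not those of the SuperPro Designer/Crystal Ball model.

```python
import numpy as np

rng = np.random.default_rng(9)
n = 100_000
yield_kg = rng.triangular(50, 100, 130, n)      # product yield per batch, kg (assumed)
opex = rng.normal(2.0e5, 2.5e4, n)              # operating cost per batch, $ (assumed)
cogs = opex / yield_kg                          # cost of goods sold, $/kg

print(np.percentile(cogs, [5, 50, 95]))         # uncertainty band on unit cost
```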
109. Mancini T, Calvo-Pardo H, Olmo J. Extremely randomized neural networks for constructing prediction intervals. Neural Netw 2021; 144:113-128. PMID: 34487958. DOI: 10.1016/j.neunet.2021.08.020.
Abstract
The aim of this paper is to propose a novel prediction model based on an ensemble of deep neural networks, adapting the extremely randomized trees method originally developed for random forests. The extra randomness introduced in the ensemble reduces the variance of the predictions and improves out-of-sample accuracy. As a byproduct, we are able to compute the uncertainty about our model predictions and construct interval forecasts. Because no data resampling is performed, some of the limitations associated with bootstrap-based algorithms are overcome, making the methodology suitable in low- and mid-dimensional settings or when the i.i.d. assumption does not hold. An extensive Monte Carlo simulation exercise shows the good performance of this novel prediction method in terms of mean square prediction error and the accuracy of the prediction intervals in terms of out-of-sample coverage probabilities. The approach advanced here delivers better out-of-sample accuracy in experimental settings, improving upon state-of-the-art methods like MC dropout and bootstrap procedures.
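A minimal sketch of the ensemble idea: train many randomized learners on the same data (no bootstrap resampling) and form prediction intervals from the spread of their predictions. Random-Fourier-feature ridge regressors stand in for the paper's deep networks; all settings are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)
x = np.sort(rng.uniform(-3, 3, 200))
y = np.sin(x) + 0.2 * rng.standard_normal(200)        # noisy training data

def fit_random_member(x, y, seed, n_feat=50, ridge=1e-2):
    r = np.random.default_rng(seed)
    w = r.normal(scale=2.0, size=n_feat)              # random frequencies
    b = r.uniform(0, 2 * np.pi, n_feat)               # random phases
    Phi = np.cos(np.outer(x, w) + b)                  # randomized features
    beta = np.linalg.solve(Phi.T @ Phi + ridge * np.eye(n_feat), Phi.T @ y)
    return lambda xq: np.cos(np.outer(xq, w) + b) @ beta

members = [fit_random_member(x, y, s) for s in range(100)]
xq = np.linspace(-3, 3, 101)
preds = np.stack([m(xq) for m in members])            # (members, query points)

mean = preds.mean(axis=0)
lo, hi = np.percentile(preds, [2.5, 97.5], axis=0)    # ensemble prediction interval
print(mean[:3], lo[:3], hi[:3])
```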
110. Oala L, Heiß C, Macdonald J, März M, Kutyniok G, Samek W. Detecting failure modes in image reconstructions with interval neural network uncertainty. Int J Comput Assist Radiol Surg 2021; 16:2089-2097. PMID: 34480723. PMCID: PMC8616888. DOI: 10.1007/s11548-021-02482-2.
Abstract
Purpose The quantitative detection of failure modes is important for making deep neural networks reliable and usable at scale. We consider three examples of common failure modes in image reconstruction and demonstrate the potential of uncertainty quantification as a fine-grained alarm system. Methods We propose a deterministic, modular and lightweight approach called the Interval Neural Network (INN) that produces fast, easy-to-interpret uncertainty scores for deep neural networks. Importantly, INNs can be constructed post hoc for already trained prediction networks. We compare it against state-of-the-art baseline methods (MCDrop, ProbOut). Results We demonstrate on controlled, synthetic inverse problems the capacity of INNs to capture uncertainty due to noise as well as directional error information. On a real-world inverse problem with human CT scans, we show that INNs produce uncertainty scores which improve the detection of all considered failure modes compared to the baseline methods. Conclusion Interval Neural Networks offer a promising tool to expose weaknesses of deep image reconstruction models and ultimately make them more reliable. The fact that they can be applied post hoc to equip already trained deep neural network models with uncertainty scores makes them particularly interesting for deployment.
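The basic operation behind interval networks is interval arithmetic through a layer. A minimal sketch for one linear-plus-ReLU layer with point weights (random stand-ins here; the paper's INN additionally learns interval-valued parameters):

```python
import numpy as np

rng = np.random.default_rng(5)
W = rng.standard_normal((4, 3))                # stand-in layer weights
b = rng.standard_normal(4)

x_lo = np.array([-1.0, 0.0, 0.5])              # elementwise input lower bounds
x_hi = np.array([1.0, 0.5, 1.5])               # elementwise input upper bounds

W_pos, W_neg = np.clip(W, 0, None), np.clip(W, None, 0)
z_lo = W_pos @ x_lo + W_neg @ x_hi + b         # smallest achievable pre-activation
z_hi = W_pos @ x_hi + W_neg @ x_lo + b         # largest achievable pre-activation

y_lo, y_hi = np.maximum(z_lo, 0), np.maximum(z_hi, 0)  # ReLU is monotone
print(y_lo, y_hi)                              # guaranteed output bounds
```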
111. Gomes J, Kong J, Kurc T, Melo ACMA, Ferreira R, Saltz JH, Teodoro G. Building robust pathology image analyses with uncertainty quantification. Comput Methods Programs Biomed 2021; 208:106291. PMID: 34333205. DOI: 10.1016/j.cmpb.2021.106291.
Abstract
BACKGROUND AND OBJECTIVE Computerized pathology image analysis is an important tool in research and clinical settings, which enables quantitative tissue characterization and can assist a pathologist's evaluation. The aim of our study is to systematically quantify and minimize uncertainty in the output of computer-based pathology image analysis. METHODS Uncertainty quantification (UQ) and sensitivity analysis (SA) methods, such as Variance-Based Decomposition (VBD) and Morris One-At-a-Time (MOAT), are employed to track and quantify uncertainty in a real-world application with large Whole Slide Imaging datasets - 943 Breast Invasive Carcinoma (BRCA) and 381 Lung Squamous Cell Carcinoma (LUSC) patients. Because these studies are compute intensive, high-performance computing systems and efficient UQ/SA methods were combined to provide efficient execution. UQ/SA was able to highlight the application parameters that impact the results, as well as the nuclear features that carry most of the uncertainty. Using this information, we built a method for selecting stable features that minimize application output uncertainty. RESULTS The results show that input parameter variations significantly impact all stages (segmentation, feature computation, and survival analysis) of the use-case application. We then identified and classified features according to their robustness to parameter variation; using the proposed feature selection strategy, for instance, patient grouping stability in survival analysis was improved by 17% and 34% for BRCA and LUSC, respectively. CONCLUSIONS This strategy created more robust analyses, demonstrating that SA and UQ are important methods that may increase confidence in digital pathology.
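As a reference point for the MOAT screening mentioned above, the sketch below implements textbook Morris one-at-a-time elementary effects on a toy function with unequal sensitivities; the trajectory settings are illustrative, not those of the pathology pipeline.

```python
import numpy as np

def model(x):                                # toy model with unequal sensitivities
    return 3.0 * x[0] + 0.5 * x[1] ** 2 + 0.01 * x[2]

def morris(model, k=3, r=20, delta=0.25, seed=6):
    rng = np.random.default_rng(seed)
    effects = np.zeros((r, k))
    for i in range(r):
        x = rng.uniform(0, 1 - delta, k)     # trajectory base point in [0,1]^k
        fx = model(x)
        for j in rng.permutation(k):         # perturb one factor at a time
            x_new = x.copy()
            x_new[j] += delta
            f_new = model(x_new)
            effects[i, j] = (f_new - fx) / delta
            x, fx = x_new, f_new
    return np.abs(effects).mean(axis=0), effects.std(axis=0)  # mu*, sigma

mu_star, sigma = morris(model)
print(mu_star)    # factor 0 should dominate, factor 2 be negligible
```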
112. Antonuccio MN, Mariotti A, Fanni BM, Capellini K, Capelli C, Sauvage E, Celi S. Effects of Uncertainty of Outlet Boundary Conditions in a Patient-Specific Case of Aortic Coarctation. Ann Biomed Eng 2021; 49:3494-3507. PMID: 34431017. PMCID: PMC8671284. DOI: 10.1007/s10439-021-02841-9.
Abstract
Computational Fluid Dynamics (CFD) simulations of blood flow are widely used to compute a variety of hemodynamic indicators such as velocity, time-varying wall shear stress, pressure drop, and energy losses. One of the major advantages of this approach is that it is non-invasive. The accuracy of cardiovascular simulations depends directly on the level of certainty in the input parameters, owing to modelling assumptions and computational settings. Physiologically suitable boundary conditions at the inlet and outlet of the computational domain are needed to perform a patient-specific CFD analysis. These conditions are often affected by uncertainties, whose impact can be quantified through a stochastic approach. A methodology based on full propagation of the uncertainty from clinical data to model results is proposed here, making it possible to estimate the confidence associated with model predictions, unlike in deterministic simulations. We evaluated the effect of using three-element Windkessel models as the outflow boundary conditions of a patient-specific aortic coarctation model. A parameter was introduced to calibrate the resistances of the Windkessel model at the outlets. The generalized Polynomial Chaos method was adopted to perform the stochastic analysis, starting from a few deterministic simulations. Our results show that the uncertainty in the input parameter produced considerable variability in the volume flow rate waveform at the systolic peak when simulating the pre-treatment conditions. The same uncertain parameter had a smaller effect on other quantities of interest, such as the pressure gradient. Furthermore, the results highlight that fine-tuning of the Windkessel resistances is not necessary to simulate the post-stenting scenario.
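A minimal sketch of a three-element Windkessel outlet model: a proximal resistance Rp in series with a distal resistance Rd and compliance C in parallel, with a scaling factor mimicking the calibration parameter described above. Parameter values and the half-sine flow waveform are illustrative assumptions (pressures in mmHg, flow in mL/s).

```python
import numpy as np

def windkessel(q, dt, Rp, Rd, C, p_d0=80.0):
    """Return outlet pressure for a prescribed flow waveform q(t)."""
    p = np.zeros_like(q)
    p_d = p_d0                                 # pressure across the compliance
    for i, qi in enumerate(q):
        p[i] = Rp * qi + p_d                   # proximal resistance adds in series
        p_d += dt * (qi - p_d / Rd) / C        # forward-Euler compliance update
    return p

t = np.arange(0.0, 3.0, 1e-3)                  # three 1-second cardiac cycles
q = np.maximum(np.sin(2 * np.pi * t), 0.0) * 300.0   # half-sine flow, mL/s

scale = 1.1                                    # uncertain calibration factor (assumed)
p = windkessel(q, 1e-3, Rp=0.05 * scale, Rd=1.0 * scale, C=1.2)
print(p.min(), p.max())                        # diastolic/systolic pressure range
```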
113. Browning AP, Maclaren OJ, Buenzli PR, Lanaro M, Allenby MC, Woodruff MA, Simpson MJ. Model-based data analysis of tissue growth in thin 3D printed scaffolds. J Theor Biol 2021; 528:110852. PMID: 34358535. DOI: 10.1016/j.jtbi.2021.110852.
Abstract
Tissue growth in three-dimensional (3D) printed scaffolds enables exploration and control of cell behaviour in more biologically realistic geometries than those allowed by traditional 2D cell culture. Cell proliferation and migration in these experiments have yet to be explicitly characterised, limiting the ability of experimentalists to determine the effects of various experimental conditions, such as scaffold geometry, on cell behaviour. We consider tissue growth by osteoblastic cells in melt electrowritten scaffolds that comprise thin square pores with sizes that were deliberately increased between experiments. We collect highly detailed temporal measurements of the average cell density, tissue coverage, and tissue geometry. To quantify tissue growth in terms of the underlying cell proliferation and migration processes, we introduce and calibrate a mechanistic mathematical model based on the Porous-Fisher reaction-diffusion equation. Parameter estimates and uncertainty quantification through profile likelihood analysis reveal consistency in the rate of cell proliferation and steady-state cell density between pore sizes. This analysis also serves as an important model verification tool: while the use of reaction-diffusion models in biology is widespread, the appropriateness of these models to describe tissue growth in 3D scaffolds has yet to be explored. We find that the Porous-Fisher model is able to capture features relating to the cell density and tissue coverage, but is not able to capture geometric features relating to the circularity of the tissue interface. Our analysis identifies two distinct stages of tissue growth, suggests several areas for model refinement, and provides guidance for future experimental work that explores tissue growth in 3D printed scaffolds.
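For reference, the Porous-Fisher equation named above takes the following form, with u the cell density, D the diffusivity, λ the proliferation rate, and K the carrying-capacity density; the unit exponent on the degenerate diffusion term is the common choice, assumed here.

```latex
\frac{\partial u}{\partial t}
  = D \,\nabla \cdot \left( \frac{u}{K}\,\nabla u \right)
  + \lambda\, u \left( 1 - \frac{u}{K} \right)
```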
114. Jebeile J, Crucifix M. Value management and model pluralism in climate science. Stud Hist Philos Sci 2021; 88:120-127. PMID: 34166920. DOI: 10.1016/j.shpsa.2021.06.004.
Abstract
Non-epistemic values pervade climate modelling, as is now well documented and widely discussed in the philosophy of climate science. Recently, Parker and Winsberg have drawn attention to what can be termed "epistemic inequality": the risk that climate models might more accurately represent the future climates of the geographical regions prioritised by the values of the modellers. In this paper, we promote value management as a way of overcoming epistemic inequality. We argue that value management can be seriously considered as soon as the value-free ideal and inductive risk arguments commonly used to frame discussions of value influence in climate science are replaced by alternative social accounts of objectivity. We consider objectivity in Longino's sense, as well as strong objectivity in Harding's sense, to be relevant options here, because they offer concrete proposals that can guide scientific practice in evaluating and designing so-called multi-model ensembles and, ultimately, improve their capacity to quantify and express uncertainty in climate projections.
115. El Mohtar S, Ait-El-Fquih B, Knio O, Lakkis I, Hoteit I. Bayesian identification of oil spill source parameters from image contours. Mar Pollut Bull 2021; 169:112514. PMID: 34091253. DOI: 10.1016/j.marpolbul.2021.112514.
Abstract
Oil spills at sea pose a serious threat to coastal environments. Identifying oil pollution sources could help to investigate unreported spills, and satellite imagery can be an effective tool for this purpose. We present a Bayesian approach to estimate the source parameters of a spill from contours of oil slicks detected in remotely sensed images. Five parameters of interest are estimated: the 2D coordinates of the source of release, the time and duration of the spill, and the quantity of oil released. Two synthetic experiments of a spill released from a fixed point source are investigated: a contour is fully observed in the first case, while two contours are partially observed at two different times in the second. In both experiments, the proposed method is able to provide good estimates of the parameters, along with confidence levels reflected in the estimated uncertainties.
116. Stochastic simulation of the FDA centrifugal blood pump benchmark. Biomech Model Mechanobiol 2021; 20:1871-1887. PMID: 34191187. DOI: 10.1007/s10237-021-01482-0.
Abstract
In the present study, the effect of physical and operational uncertainties on the hydrodynamic and hemocompatibility characteristics of a centrifugal blood pump designed by the U.S. Food and Drug Administration is investigated. Physical uncertainties include the randomness in the blood density and viscosity, while the operational uncertainties comprise the pump rotational speed, mass flow rate, and turbulence intensity. Non-intrusive polynomial chaos expansion has been employed to conduct the uncertainty quantification analysis. Additionally, to assess each stochastic parameter's influence on the quantities of interest, sensitivity analysis is carried out through Sobol' indices. For the numerical simulation of the pump's blood flow, the SST k-ω turbulence model and a power-law model of hemolysis were employed. The pump's velocity field is profoundly affected by the rotational speed in the bladed regions and by the mass flow rate in other zones. Furthermore, the hemolysis index is dominantly sensitive to blood viscosity. According to the results, the pump's hydraulic characteristics (i.e., head and efficiency) show more robust behavior than the hemocompatibility characteristics (i.e., hemolysis index) with respect to the operational and physical uncertainties. Finally, it was found that the probability distribution function of the hemolysis index covers the experimental measurements.
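The Sobol' indices used for the sensitivity ranking can be illustrated with a direct Monte Carlo pick-freeze estimator on the cheap Ishigami test function; the CFD model itself is far too expensive for this brute-force route, which is one motivation for the polynomial chaos surrogate.

```python
import numpy as np

def ishigami(x, a=7.0, b=0.1):
    # Standard sensitivity-analysis test function with known Sobol' indices.
    return np.sin(x[:, 0]) + a * np.sin(x[:, 1]) ** 2 + b * x[:, 2] ** 4 * np.sin(x[:, 0])

rng = np.random.default_rng(7)
N, k = 100_000, 3
A = rng.uniform(-np.pi, np.pi, (N, k))
B = rng.uniform(-np.pi, np.pi, (N, k))
fA, fB = ishigami(A), ishigami(B)
var = np.var(np.concatenate([fA, fB]))

for i in range(k):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                      # "pick-freeze": swap one column
    S_i = np.mean(fB * (ishigami(ABi) - fA)) / var
    print(f"S_{i + 1} ≈ {S_i:.3f}")          # analytic values: ≈ 0.31, 0.44, 0.00
```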
117. Phylogenetic uncertainty and the inference of patterns in community ecology and comparative studies. Oecologia 2021; 196:633-647. PMID: 34146131. DOI: 10.1007/s00442-021-04972-1.
Abstract
Progress in phylogenetic community ecology is often limited by the availability of phylogenetic information and the lack of appropriate methods and solutions to deal with this problem. We estimate the effect of the lack of phylogenetic information on the relations among taxa as measured by phylogenetic metrics commonly used in comparative studies and community ecology, namely: Blomberg's K phylogenetic signal, Faith's Phylogenetic Diversity (PD), Mean Phylogenetic Distance (MPD), and Mean Nearest Taxon Distance (MNTD). To overcome this problem, we tested two possible solutions: Polytomic trees and Operational trees. Our results show that the effects on K values strongly depended on the level of phylogenetic signal. In the case of the community metrics, the effects were insensitive to the patterns of species distribution in the communities. Community metrics tended to be overestimated with both Polytomic and Operational trees, but the overestimation was higher with Polytomic trees. The PD and MPD metrics were less biased than the MNTD metric. We show that the lack of phylogenetic resolution is not necessarily problematic for all analyses and that its effect will depend on the chosen metric and on the solutions used to deal with the problem. Based on our results, we suggest that ecologists should prefer the Operational tree solution for removing polytomies in the phylogenetic tree, and should take care when designing experiments and when analyzing and interpreting the results of phylogenetic metrics.
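Two of the community metrics studied here reduce to simple operations on a pairwise phylogenetic distance matrix. A minimal sketch with an illustrative four-taxon matrix (Faith's PD and Blomberg's K need the tree itself and are omitted):

```python
import numpy as np

# Illustrative pairwise phylogenetic distances among four co-occurring taxa.
D = np.array([[0.0, 2.0, 6.0, 6.0],
              [2.0, 0.0, 6.0, 6.0],
              [6.0, 6.0, 0.0, 3.0],
              [6.0, 6.0, 3.0, 0.0]])

iu = np.triu_indices_from(D, k=1)
mpd = D[iu].mean()                      # Mean Phylogenetic Distance: all pairs

D_off = D.copy()
np.fill_diagonal(D_off, np.inf)         # ignore self-distances
mntd = D_off.min(axis=1).mean()         # Mean Nearest Taxon Distance

print(mpd, mntd)                        # 4.83..., 2.5
```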
118. Laloy E, Rogiers B, Bielen A, Boden S. Bayesian inference of 1D activity profiles from segmented gamma scanning of a heterogeneous radioactive waste drum. Appl Radiat Isot 2021; 175:109803. PMID: 34118589. DOI: 10.1016/j.apradiso.2021.109803.
Abstract
We present a Bayesian approach to probabilistically infer vertical activity profiles within a radioactive waste drum from segmented gamma scanning (SGS) measurements. Our approach resorts to Markov chain Monte Carlo (MCMC) sampling using the state-of-the-art Hamiltonian Monte Carlo (HMC) technique and accounts for two important sources of uncertainty: the measurement uncertainty and the uncertainty in the source distribution within the drum. In addition, our efficiency model simulates the contributions of all considered segments to each count measurement. Our approach is first demonstrated with a synthetic example, after which it is used to resolve the vertical activity distribution of 5 nuclides in a real waste package.
119. Albi G, Pareschi L, Zanella M. Control with uncertain data of socially structured compartmental epidemic models. J Math Biol 2021; 82:63. PMID: 34023964. PMCID: PMC8141280. DOI: 10.1007/s00285-021-01617-y.
Abstract
The adoption of containment measures to reduce the amplitude of the epidemic peak is a key aspect in tackling the rapid spread of an epidemic. Classical compartmental models must be modified and studied to correctly describe the effects of forced external actions to reduce the impact of the disease. The importance of social structure, such as the age dependence that proved essential in the recent COVID-19 pandemic, must be considered; in addition, the available data are often incomplete and heterogeneous, so a high degree of uncertainty must be incorporated into the model from the beginning. In this work we address these aspects through an optimal control formulation of a socially structured epidemic model in the presence of uncertain data. After introducing the optimal control problem, we formulate an instantaneous approximation of the control that allows us to derive new feedback-controlled compartmental models capable of describing the epidemic peak reduction. The need for long-term interventions shows that alternative actions based on the social structure of the system can be as effective as the more expensive global strategy. The timing and intensity of interventions, however, are particularly relevant in the case of uncertainty about the actual number of infected people. Simulations related to data from the first wave of the recent COVID-19 outbreak in Italy are presented and discussed.
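As a toy analogue of the feedback-controlled models derived here, the sketch below adds an instantaneous feedback to a plain SIR model, reducing the contact rate as prevalence grows; the model form, parameters, and feedback law are illustrative assumptions, not the paper's socially structured formulation.

```python
import numpy as np

def sir_feedback(beta0=0.3, gamma=0.1, kappa=20.0, days=200, dt=0.1):
    s, i, r = 0.99, 0.01, 0.0
    traj = []
    for _ in range(int(days / dt)):
        beta = beta0 / (1.0 + kappa * i)      # feedback: distancing grows with I
        ds = -beta * s * i
        di = beta * s * i - gamma * i
        s, i, r = s + dt * ds, i + dt * di, r + dt * gamma * i
        traj.append(i)
    return np.array(traj)

print(sir_feedback().max())     # peak prevalence under feedback control
```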
120. Yin J, Medellín-Azuara J, Escriva-Bou A, Liu Z. Bayesian machine learning ensemble approach to quantify model uncertainty in predicting groundwater storage change. Sci Total Environ 2021; 769:144715. PMID: 33736244. DOI: 10.1016/j.scitotenv.2020.144715.
Abstract
Agricultural water demand, groundwater extraction, surface water delivery, and climate have complex nonlinear relationships with groundwater storage in agricultural regions. As an alternative to elaborate, computationally intensive physical models, machine learning methods are often adopted as surrogates to capture such complex relationships because of their high computational efficiency. Inevitably, using only one machine learning model is prone to underestimating prediction uncertainty and subject to poor accuracy. This study presents a novel machine learning-based groundwater ensemble modeling framework, in conjunction with a Bayesian model averaging approach, to predict groundwater storage change and improve overall predictive reliability. Three different machine learning models were developed, namely an artificial neural network, a support vector machine, and response surface regression. To explicitly quantify uncertainty from machine learning model parameters and structure, Bayesian model averaging is employed to produce a forecast distribution associated with each machine learning prediction. Model weights and variances are obtained based on model performance to construct the ensemble models. The developed individual and Bayesian model averaging ensemble models are then applied, evaluated, and validated at different spatial scales, including subregional and regional scales, in an overdrafted agricultural region, the San Joaquin River Basin, through independent training and testing datasets. Results show that the machine learning models have remarkable predictive capability, achieving higher computational efficiency without sacrificing accuracy. Compared to a single-model approach, the ensemble model is able to produce consistently reliable predictions across the basin, yet it does not always outperform the best model in the ensemble. Additionally, model results suggest that groundwater pumping for agricultural irrigation is the primary driving force of groundwater storage change across the region. The modeling framework can serve as an alternative approach to simulating groundwater response, especially in agricultural regions where a lack of subsurface data hinders physically based modeling.
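The combination step of Bayesian model averaging is compact enough to sketch: given each member's predictive mean and variance plus a performance-based weight, form the mixture mean and the mixture variance (within-model plus between-model spread). The member predictions and scores below are illustrative numbers, not outputs of the paper's three models.

```python
import numpy as np

mu = np.array([12.0, 10.5, 13.2])        # member predictive means
var = np.array([1.0, 2.25, 1.44])        # member predictive variances
score = np.array([-4.1, -5.0, -4.6])     # e.g. log-likelihood on held-out data

w = np.exp(score - score.max())
w /= w.sum()                             # normalized BMA weights

mean_bma = w @ mu
var_bma = w @ (var + mu**2) - mean_bma**2   # law of total variance for the mixture
print(w, mean_bma, np.sqrt(var_bma))
```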
121. Ninos G, Bartzis V, Merlemis N, Sarris IE. Uncertainty quantification implementations in human hemodynamic flows. Comput Methods Programs Biomed 2021; 203:106021. PMID: 33721602. DOI: 10.1016/j.cmpb.2021.106021.
Abstract
BACKGROUND AND OBJECTIVE Human hemodynamic modeling is usually affected by uncertainties arising from the considerable unavailability of information on the boundary conditions and physical properties used in numerical models. Calculating the effect of these uncertainties on numerical findings along the cardiovascular system is a demanding process due to the complexity of the morphology of the body and of the flow dynamics. To cope with these difficulties, Uncertainty Quantification (UQ) methods are an ideal tool. RESULTS This study analyzes and summarizes recent research efforts and directions in implementing UQ in human hemodynamic flows, reviewing 139 research papers. Initially, the suitability of applying this approach is analyzed and demonstrated. Then, an overview of the most significant research work in various fields of biomedical hemodynamic engineering is presented. Finally, possible forthcoming directions for research and methodological progress of UQ in the biomedical sciences are identified. CONCLUSION This review concludes that finding the best statistical methods and parameters to represent the propagated uncertainties, while achieving a good interpretation of the input-output interaction, is crucial for implementing UQ in the biomedical sciences.
122. Olivares A, Staffetti E. Uncertainty quantification of a mathematical model of COVID-19 transmission dynamics with mass vaccination strategy. Chaos Solitons Fractals 2021; 146:110895. PMID: 33814733. PMCID: PMC7998051. DOI: 10.1016/j.chaos.2021.110895.
Abstract
In this paper, uncertainty quantification and sensitivity analysis of a mathematical model of SARS-CoV-2 virus transmission dynamics with a mass vaccination strategy have been carried out. More specifically, a compartmental epidemic model has been considered, in which vaccination, social distancing measures, and testing of susceptible individuals have been included. Since the application of these mitigation measures entails a degree of uncertainty, the effects of uncertainty in the application of social distancing actions and testing of susceptible individuals on the disease transmission have been quantified, under the assumption of a mass vaccination program deployment. A spectral approach has been employed, which allows the uncertainty propagation through the epidemic model to be represented by means of the polynomial chaos expansion of the output random variables. In particular, a statistical moment-based polynomial chaos expansion has been implemented, which provides a surrogate model for the compartments of the epidemic model and allows the statistics and probability distributions of the output variables of interest at a given time instant to be estimated, and the sensitivity analysis to be conducted. The purpose of the sensitivity analysis is to understand which uncertain parameters have most influence on a given output random variable of the model at a given time instant. Several numerical experiments have been conducted, whose results show that the proposed spectral approach to uncertainty quantification and sensitivity analysis of epidemic models provides a useful tool for controlling and mitigating the effects of the COVID-19 pandemic when it comes to healthcare resource planning.
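A minimal sketch of a one-dimensional polynomial chaos surrogate: expand a model output in Hermite polynomials of a standard-normal input, fit the coefficients by least squares on a handful of model runs, and read the mean and variance off the coefficients. The exponential toy output is an assumed stand-in for an epidemic-model compartment, not the paper's moment-based construction.

```python
import numpy as np
from math import factorial
from numpy.polynomial.hermite_e import hermevander

def model(xi):                        # stand-in for one epidemic-model output
    return np.exp(0.3 * xi)

rng = np.random.default_rng(8)
xi = rng.standard_normal(200)         # samples of the standard-normal input
deg = 4
Psi = hermevander(xi, deg)            # probabilists' Hermite basis He_0..He_4
coef, *_ = np.linalg.lstsq(Psi, model(xi), rcond=None)

norms = np.array([factorial(n) for n in range(deg + 1)])  # E[He_n^2] = n!
mean_pce = coef[0]                    # the He_0 coefficient is the mean
var_pce = np.sum(coef[1:] ** 2 * norms[1:])
print(mean_pce, var_pce)              # exact: mean ≈ 1.046, variance ≈ 0.103
```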
123. Trucchia A, Frunzo L. Surrogate based Global Sensitivity Analysis of ADM1-based Anaerobic Digestion Model. J Environ Manage 2021; 282:111456. PMID: 33441259. DOI: 10.1016/j.jenvman.2020.111456.
Abstract
In order to calibrate the model parameters, sensitivity analysis routines are mandatory, ranking the parameters by their relevance and fixing the least influential factors to nominal values. Despite the high number of works based on ADM1, very few are related to sensitivity analysis. In this study, Global Sensitivity Analysis (GSA) and Uncertainty Quantification (UQ) for an ADM1-based anaerobic digestion model have been performed. The modified version of the ADM1-based model selected in this study was presented by Esposito and co-authors in 2013. Unlike the first version of ADM1, which focused on sewage sludge degradation, the model of Esposito focuses on digestion of the organic fraction of municipal solid waste. It is recalled that in many applications hydrolysis is considered the bottleneck of the overall anaerobic digestion process when the input substrate consists of complex organic matter. In Esposito's model, a surface-based kinetic approach for the disintegration of complex organic matter is introduced. This approach allows better modelling of the disintegration step, taking into account the effect of particle size distribution on the digestion process. The model thus needs GSA and UQ to pave the way for further improvements and to reach a deep understanding of the main processes and leading input factors. Due to the large number of parameters to be analyzed, a preliminary screening analysis with the Morris method was first conducted. Since two quantities of interest (QoI) were considered, the initial screening was performed twice, obtaining two sets of parameters containing the most influential factors in determining the value of each QoI. A surrogate of the ADM1 model was defined, making use of the two quantities of interest. The output results from the surrogate model were analyzed with Sobol' indices for the quantitative GSA. Finally, uncertainty quantification was performed: by adopting kernel smoothing techniques, the probability density functions of each quantity of interest were obtained.
124. Denéchère R, Delpierre N, Apostol EN, Berveiller D, Bonne F, Cole E, Delzon S, Dufrêne E, Gressler E, Jean F, Lebourgeois F, Liu G, Louvet JM, Parmentier J, Soudani K, Vincent G. The within-population variability of leaf spring and autumn phenology is influenced by temperature in temperate deciduous trees. Int J Biometeorol 2021; 65:369-379. PMID: 31352524. DOI: 10.1007/s00484-019-01762-6.
Abstract
Leaf phenology is a major driver of ecosystem functioning in temperate forests and a robust indicator of climate change. Both the inter-annual and inter-population variability of leaf phenology have received much attention in the literature; in contrast, the within-population variability of leaf phenology has been far less studied. Beyond its impact on individual tree physiological processes, the within-population variability of leaf phenology can affect the estimation of the average budburst or leaf senescence dates at the population scale. Here, we monitored the progress of spring and autumn leaf phenology in 14 tree populations (9 tree species) in six European forests over the period 2011 to 2018 (yielding 16 site-years of data for spring and 14 for autumn). We monitored 27 to 512 (median 62) individuals per population. We quantified the within-population variability of leaf phenology as the standard deviation of the distribution of individual dates of budburst or leaf senescence (SDBBi and SDLSi, respectively). Given the natural variability of phenological dates in our tree populations, we estimated from the data that a minimum sample size of 28 (resp. 23) individuals is required to estimate SDBBi (resp. SDLSi) with a precision of 3 (resp. 7) days. The within-population variability of leaf senescence (average SDLSi = 8.5 days) was on average two times larger than that of budburst (average SDBBi = 4.0 days). We found that warmer temperatures during the budburst period and a late average budburst date were associated with a lower SDBBi, as a result of a quicker spread of budburst in tree populations, with a strong species effect. Regarding autumn phenology, we observed that later senescence and warm temperatures during the senescence period were linked with a high SDLSi, again with a strong species effect. The shares of variance explained by our models were modest, suggesting that other factors likely influence the within-population variation in leaf phenology; for instance, a detailed analysis revealed that summer temperatures were negatively correlated with SDLSi.
125. Perez-Raya I, Fathi MF, Baghaie A, Sacho R, D'Souza RM. Modeling and Reducing the Effect of Geometric Uncertainties in Intracranial Aneurysms with Polynomial Chaos Expansion, Data Decomposition, and 4D-Flow MRI. Cardiovasc Eng Technol 2021; 12:127-143. PMID: 33415699. DOI: 10.1007/s13239-020-00511-w.
Abstract
PURPOSE Variations in the vessel radius of segmented surfaces of intracranial aneurysms significantly influence the fluid velocities given by computer simulations. It is important to generate models that capture the effect of these variations in order to better interpret the numerically predicted hemodynamics. It is also highly relevant to develop methods that combine experimental observations with uncertainty modeling to approximate blood flow behavior more closely. METHODS This work applies polynomial chaos expansion to model the effect of geometric uncertainties on the simulated fluid velocities of intracranial aneurysms. The radius of the vessel is defined as the uncertain variable. Proper orthogonal decomposition is applied to characterize the solution space of fluid velocities. Next, projecting the 4D-Flow MRI velocities onto the basis vectors, followed by coefficient mapping using generalized dynamic mode decomposition, enables the merging of 4D-Flow MRI with the uncertainty-propagated fluid velocities. RESULTS Polynomial chaos expansion propagates the fluid velocities with an error of 2% in velocity magnitude relative to computer simulations. The bifurcation region (or impingement location) shows a standard deviation of 0.17 m/s (since an available reported variance in the vessel radius is adopted to model the uncertainty, the expected standard deviation may differ). Numerical phantom experiments indicate that the proposed approach reconstructs the fluid velocities with 0.3% relative error in the presence of geometric uncertainties. CONCLUSION Polynomial chaos expansion is an effective approach to propagating the effect of the uncertain variable in the blood flow velocities of intracranial aneurysms. Merging 4D-Flow MRI with the uncertainty-propagated fluid velocities leads to more realistic flow trends than ignoring the uncertainty in the vessel radius.
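The proper orthogonal decomposition step can be sketched via the SVD: stack snapshots as columns, take the leading left singular vectors as the basis, and project a new field onto them (the projection is the first half of the 4D-Flow merging step described above). Synthetic 1D "velocity" snapshots stand in for the aneurysm CFD fields.

```python
import numpy as np

rng = np.random.default_rng(10)
x = np.linspace(0, 2 * np.pi, 128)
snapshots = np.stack([np.sin(x + p) + 0.1 * rng.standard_normal(128)
                      for p in np.linspace(0, np.pi, 40)], axis=1)  # (space, time)

U, s, Vt = np.linalg.svd(snapshots, full_matrices=False)
r = 2                                       # keep two POD modes
basis = U[:, :r]                            # orthonormal spatial basis vectors

new_field = np.sin(x + 0.3)                 # a field not in the snapshot set
coeffs = basis.T @ new_field                # project onto the POD basis
recon = basis @ coeffs
print(np.linalg.norm(new_field - recon) / np.linalg.norm(new_field))  # small
```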