1. McAndrew T, Cambeiro J, Besiroglu T. Aggregating human judgment probabilistic predictions of the safety, efficacy, and timing of a COVID-19 vaccine. Vaccine 2022;40:2331-2341. PMID: 35292162; PMCID: PMC8882426; DOI: 10.1016/j.vaccine.2022.02.054.
Abstract
Safe, efficacious vaccines were developed to reduce the transmission of SARS-CoV-2 during the COVID-19 pandemic. In mid-2020, however, vaccine effectiveness, safety, and the timeline for when a vaccine would be approved and distributed to the public were uncertain. To support public health decision making, we solicited trained forecasters and experts in vaccinology and infectious disease to provide monthly probabilistic predictions, from July to September of 2020, of the efficacy, safety, timing, and delivery of a COVID-19 vaccine. We found that, despite sparse historical data, a linear pool (a combination of human judgment probabilistic predictions) can quantify the uncertainty in the clinical significance and timing of a potential vaccine. The linear pool underestimated how fast a therapy would show a survival benefit and the high efficacy of the approved COVID-19 vaccines; however, it did make an accurate prediction of when a vaccine would be approved by the FDA. Compared with individual forecasters, the linear pool consistently ranked above the median of the most accurate forecasts. A linear pool is a fast and versatile method for building probabilistic predictions about a developing vaccine that is robust to poor individual predictions. Though experts and trained forecasters underestimated the speed of development and the high efficacy of a SARS-CoV-2 vaccine, linear pool predictions can improve situational awareness for public health officials and make the risks, rewards, and timing of a vaccine clearer to the public.
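Mechanically, a linear pool is just a probability-weighted mixture of the individual forecasters' predictive distributions. A minimal sketch (not the authors' code; the forecaster probabilities below are invented for illustration):

```python
# Minimal linear-pool sketch: an equally weighted mixture of probabilistic
# predictions over the same outcome bins. All numbers are invented.

def linear_pool(predictions, weights=None):
    """Combine forecasters' probability vectors (same bins) into one vector."""
    n = len(predictions)
    weights = weights or [1.0 / n] * n
    bins = len(predictions[0])
    return [sum(w * p[b] for w, p in zip(weights, predictions))
            for b in range(bins)]

# Three forecasters' probabilities that, e.g., a vaccine is approved in each
# of four consecutive quarters (hypothetical values).
forecasts = [
    [0.10, 0.30, 0.40, 0.20],
    [0.05, 0.25, 0.50, 0.20],
    [0.15, 0.35, 0.30, 0.20],
]
pooled = linear_pool(forecasts)
print([round(p, 3) for p in pooled])  # still a valid distribution: sums to 1
```

Because the pool averages probabilities rather than point estimates, a single badly calibrated forecaster moves the combined distribution only in proportion to its weight, which is the robustness property the abstract describes.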
2. Rheinberger CM. A Unified Probabilistic Framework for Cancer Risk Management. Risk Analysis 2021;41:584-595. PMID: 33340129; DOI: 10.1111/risa.13666.
Abstract
Cancer risk assessments in the regulatory realm are often deterministic. Probabilistic approaches that allow characterizing and propagating uncertainty and variability are better suited to predict the socioeconomic impacts of regulating carcinogens. In this article, I present a unified framework for cancer risk management consisting of (i) a probabilistic exposure model that takes into account variability in individual exposure to the substance of concern; (ii) a probabilistic dose-response model that accounts for differences in individual cancer susceptibility; (iii) an impact assessment model that quantifies individuals' excess lifetime cancer risk; and (iv) a welfare model that values changes in disability-adjusted life expectancy based on workers' willingness to pay and aggregates individual valuations across the population at risk. I illustrate the framework with data on occupational exposure to hexavalent chromium in France. In a cohort of 10,000 synthetic workers, about one-third of those exposed benefit from the introduction of a binding occupational exposure limit (BOEL). Limiting hexavalent chromium exposure to the BOEL reduces the statistical worker's excess lifetime risk of fatal and nonfatal lung cancer by 4.7E-3 and 1.5E-3, respectively. At the cohort level, the risk reduction corresponds to 738.4 full and 30.7 disability-adjusted life years saved. The expected welfare gain of introducing the BOEL is close to €30 million. A major advantage of the framework is its ability to visualize the uncertainty and variability inherent in cancer risk assessment. Notwithstanding some implementation challenges, the framework provides a transparent characterization of regulatory impacts that supports informed risk management decisions.
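The chained structure of components (i)-(iv) lends itself to Monte Carlo simulation: sample an individual exposure, sample a susceptibility-adjusted dose-response slope, and aggregate excess risk across the synthetic cohort. The sketch below shows only this propagation pattern; the distributions and parameter values are invented for illustration, not the ones fitted to the French hexavalent-chromium data.

```python
import random

# Illustrative propagation of variability (exposure) and uncertainty
# (dose-response slope) to a cohort-average excess lifetime risk.
# All distributions and parameters are invented, not the paper's.
random.seed(1)

def excess_lifetime_risk(n_workers=10_000):
    risks = []
    for _ in range(n_workers):
        exposure = random.lognormvariate(0.0, 0.8)   # individual exposure (variability)
        slope = random.lognormvariate(-7.0, 0.5)     # risk per unit dose (uncertainty)
        risks.append(min(1.0, exposure * slope))     # linear no-threshold form, capped
    return sum(risks) / n_workers                    # "statistical worker" average risk

mean_risk = excess_lifetime_risk()
```

Running the same cohort with and without a simulated exposure limit, and differencing the two risk estimates, would give the kind of risk-reduction figure the abstract reports.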
Affiliation(s)
- Christoph M Rheinberger, Risk Management Directorate, European Chemicals Agency, P.O. Box 400, 00121 Helsinki, Finland

3. Finkel AM, Gray GM. The Pebble Remains in the Master's Hand: Two Careers Spent Learning (Still) from John Evans. Risk Analysis 2021;41:678-693. PMID: 33325061; DOI: 10.1111/risa.13649.
Abstract
In this article, we discuss four vexing problems in risk-based decision making that John Evans has addressed over nearly 40 years, and about which he has perennially challenged the two of us and others to think. We tackle the role in decision making of potential thresholds in dose-response functions, how the lack of health reference values for many chemicals may distort risk management, the challenge of model uncertainty for risk characterization, and the yet-untapped potential for value-of-information analysis to enhance public health decision making. Our theme is that work remains to be done on each of these, but that some of that work would merely involve listening to ideas that John has already offered.
Affiliation(s)
- Adam M Finkel, Department of Environmental Health Sciences, University of Michigan School of Public Health, Ann Arbor, MI, USA
- George M Gray, Department of Environmental and Occupational Health, George Washington University Milken Institute School of Public Health, Washington, DC, USA

4. McAndrew T, Wattanachit N, Gibson GC, Reich NG. Aggregating predictions from experts: a review of statistical methods, experiments, and applications. Wiley Interdisciplinary Reviews: Computational Statistics 2021;13:e1514. PMID: 33777310; PMCID: PMC7996321; DOI: 10.1002/wics.1514.
Abstract
Forecasts support decision making in a variety of applications. Statistical models can produce accurate forecasts given abundant training data, but when data are sparse or rapidly changing, statistical models may not be able to make accurate predictions. Expert judgmental forecasts, which combine expert-generated predictions into a single forecast, can make predictions when training data are limited by relying on human intuition. Researchers have proposed a wide array of algorithms to combine expert predictions into a single forecast, but there is no consensus on an optimal aggregation model. This review surveyed recent literature on aggregating expert-elicited predictions. We gathered common terminology, aggregation methods, and forecasting performance metrics, and we offer guidance to strengthen this rapidly growing area of future work.
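One widely used family of aggregation methods in this literature weights each expert by past predictive performance before pooling. A minimal sketch of inverse-error weighting, with invented error histories and predictions:

```python
# Performance-weighted pooling sketch: experts with lower historical error
# get higher weight. Error histories and predictions are invented.

def performance_weights(past_errors):
    """Higher historical error -> lower weight (inverse-error weighting)."""
    inv = [1.0 / e for e in past_errors]
    total = sum(inv)
    return [w / total for w in inv]

def weighted_pool(probs, weights):
    """Weighted average of experts' probabilities for one event."""
    return sum(w * p for w, p in zip(weights, probs))

past_errors = [0.10, 0.25, 0.40]   # expert 1 has been most accurate
probs = [0.70, 0.50, 0.20]         # current predictions of one event
weights = performance_weights(past_errors)
combined = weighted_pool(probs, weights)
```

Equal weighting (a plain linear pool) is the special case where all past errors are treated as identical; many of the algorithms the review surveys differ mainly in how these weights are learned.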
Affiliation(s)
- Thomas McAndrew, Nutcha Wattanachit, Graham C. Gibson, and Nicholas G. Reich: Department of Biostatistics and Epidemiology, School of Public Health and Health Sciences, University of Massachusetts at Amherst, Amherst, Massachusetts, USA

5. Imeah B, Penz E, Rana M, Trask C. Economic analysis of new workplace technology including productivity and injury: The case of needle-less injection in swine. PLoS One 2020;15:e0233599. PMID: 32555636; PMCID: PMC7299390; DOI: 10.1371/journal.pone.0233599.
Abstract
Increasing intensification in swine production has led to new and specialized technologies, but the occupational health and safety impacts are rarely quantified in the business plans for adoption. Needle-less injection has the potential to increase productivity and eliminate needle-stick injury in workers, but it is not clear whether these benefits offset high capital investment and potential increases in musculoskeletal loads. This economic evaluation employed probabilistic scenario analysis using injury, cost, and production data gathered from interviews with swine producers in Manitoba and Saskatchewan. After adoption of needle-less injection, rates of needle-stick injury went down with no measurable effect on upper limb musculoskeletal disorders, resulting in lower health and safety costs for needle-less injectors. Needle-less injection was 40% faster once workers acclimatized, but large start-up costs mean that economic benefits are realized only after the first year. The incremental benefit-cost ratio favored adoption of needle-less injectors over conventional needles for the base case of a 1200-sow barn; the conventional method is beneficial for barns with 600 sows or fewer. Findings indicate that well-designed technologies have the potential to achieve the dual ergonomics goals of enhancing human wellbeing and system performance. We anticipate that the economic and decision models developed in this study can be applied to other new technologies in agriculture and animal production.
Affiliation(s)
- Biaka Imeah, Canadian Centre for Health and Safety in Agriculture, College of Medicine, University of Saskatchewan, Saskatoon, Saskatchewan, Canada
- Erika Penz, College of Medicine, University of Saskatchewan, Saskatoon, Saskatchewan, Canada
- Masud Rana, Collaborative Program in Biostatistics, University of Saskatchewan, Saskatoon, Saskatchewan, Canada
- Catherine Trask, Canadian Centre for Health and Safety in Agriculture, College of Medicine, University of Saskatchewan, Saskatoon, Saskatchewan, Canada

6. Clayton RH, Aboelkassem Y, Cantwell CD, Corrado C, Delhaas T, Huberts W, Lei CL, Ni H, Panfilov AV, Roney C, dos Santos RW. An audit of uncertainty in multi-scale cardiac electrophysiology models. Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences 2020;378:20190335. PMID: 32448070; PMCID: PMC7287340; DOI: 10.1098/rsta.2019.0335.
Abstract
Models of electrical activation and recovery in cardiac cells and tissue have become valuable research tools, and are beginning to be used in safety-critical applications including guidance for clinical procedures and for drug safety assessment. As a consequence, there is an urgent need for a more detailed and quantitative understanding of the ways that uncertainty and variability influence model predictions. In this paper, we review the sources of uncertainty in these models at different spatial scales, discuss how uncertainties are communicated across scales, and begin to assess their relative importance. We conclude by highlighting important challenges that continue to face the cardiac modelling community, identifying open questions, and making recommendations for future studies. This article is part of the theme issue 'Uncertainty quantification in cardiac and cardiovascular modelling and simulation'.
Affiliation(s)
- Richard H. Clayton, Insigneo Institute for in-silico Medicine and Department of Computer Science, University of Sheffield, Sheffield, UK
- Yasser Aboelkassem, Department of Bioengineering, University of California, San Diego, CA, USA
- Cesare Corrado, Division of Imaging Sciences and Biomedical Engineering, King's College London, London, UK
- Tammo Delhaas, School of Cardiovascular Diseases, Maastricht University, Maastricht, The Netherlands
- Wouter Huberts, School of Cardiovascular Diseases, Maastricht University, Maastricht, The Netherlands; Department of Biomedical Engineering, Eindhoven University of Technology, Eindhoven, The Netherlands
- Chon Lok Lei, Computational Biology and Health Informatics, Department of Computer Science, University of Oxford, Oxford, UK
- Haibo Ni, Department of Pharmacology, University of California, Davis, CA, USA
- Alexander V. Panfilov, Department of Physics and Astronomy, University of Gent, Gent, Belgium; Laboratory of Computational Biology and Medicine, Ural Federal University, Ekaterinburg, Russia
- Caroline Roney, Division of Imaging Sciences and Biomedical Engineering, King's College London, London, UK

7. Experts know more than just facts: eliciting functional understanding to help prioritise weed biological control targets. Biol Invasions 2016. DOI: 10.1007/s10530-016-1175-5.
8. Butler AJ, Thomas MK, Pintar KDM. Systematic review of expert elicitation methods as a tool for source attribution of enteric illness. Foodborne Pathog Dis 2015;12:367-382. PMID: 25826450; DOI: 10.1089/fpd.2014.1844.
Abstract
Expert elicitation is a useful tool to explore sources of uncertainty and to answer questions where data are expensive or difficult to collect. It has been used across a variety of disciplines and represents an important method for estimating source attribution for enteric illness. A systematic review was undertaken to explore published expert elicitation studies, identify key considerations, and make recommendations for designing an expert elicitation in the context of enteric illness source attribution. Fifty-nine studies were reviewed. Five key themes were identified: the expert panel, including composition and recruitment; the pre-elicitation material, which clarifies the research question and provides training in uncertainty and probability; the choice of elicitation tool and method (e.g., questionnaires, surveys, and interviews); research design; and analysis of elicited data. Careful consideration of these themes is critical in designing and implementing an expert elicitation in order to reduce bias and produce the best possible results. While there are various epidemiological and microbiological methods available to explore source attribution of enteric illness, expert elicitation provides an opportunity to identify gaps in our understanding and, where such studies are not feasible or available, represents the only possible method for synthesizing knowledge about transmission.
Affiliation(s)
- Ainslie J Butler, Centre for Foodborne, Environmental, and Zoonotic Infectious Diseases, Public Health Agency of Canada, Guelph, Canada

9. Morgan MG. Use (and abuse) of expert elicitation in support of decision making for public policy. Proc Natl Acad Sci U S A 2014;111:7176-7184. PMID: 24821779; PMCID: PMC4034232; DOI: 10.1073/pnas.1319946111.
Abstract
The elicitation of scientific and technical judgments from experts, in the form of subjective probability distributions, can be a valuable addition to other forms of evidence in support of public policy decision making. This paper explores when it is sensible to perform such elicitation and how that can best be done. A number of key issues are discussed, including topics on which there are, and are not, experts who have knowledge that provides a basis for making informed predictive judgments; the inadequacy of only using qualitative uncertainty language; the role of cognitive heuristics and of overconfidence; the choice of experts; the development, refinement, and iterative testing of elicitation protocols that are designed to help experts to consider systematically all relevant knowledge when they make their judgments; the treatment of uncertainty about model functional form; diversity of expert opinion; and when it does or does not make sense to combine judgments from different experts. Although it may be tempting to view expert elicitation as a low-cost, low-effort alternative to conducting serious research and analysis, it is neither. Rather, expert elicitation should build on and use the best available research and analysis and be undertaken only when, given those, the state of knowledge will remain insufficient to support timely informed assessment and decision making.
Affiliation(s)
- M Granger Morgan, Department of Engineering and Public Policy, Carnegie Mellon University, Pittsburgh, PA 15213, USA

10. Borek A, Parlikad AK, Woodall P, Tomasella M. A risk based model for quantifying the impact of information quality. Computers in Industry 2014. DOI: 10.1016/j.compind.2013.12.004.
11. Fischer K, Lewandowski D, Janssen MP. Estimating unknown parameters in haemophilia using expert judgement elicitation. Haemophilia 2013;19:e282-e288. PMID: 23672712; DOI: 10.1111/hae.12166.
Abstract
The increasing attention to healthcare costs and treatment efficiency has led to an increasing demand for quantitative data concerning patient and treatment characteristics in haemophilia. However, most of these data are difficult to obtain. The aim of this study was to use expert judgement elicitation (EJE) to estimate currently unavailable key parameters for treatment models in severe haemophilia A. Using a formal expert elicitation procedure, 19 international experts provided information on (i) natural bleeding frequency according to age and onset of bleeding, (ii) treatment of bleeds, (iii) time needed to control bleeding after starting secondary prophylaxis, (iv) dose requirements for secondary prophylaxis according to onset of bleeding, and (v) life expectancy. For each parameter, experts provided their quantitative estimates (median, P10, P90), which were combined using a graphical method. In addition, information was obtained concerning key decision parameters of haemophilia treatment. There was most agreement between experts regarding bleeding frequencies for patients treated on demand with an average onset of joint bleeding (1.7 years): a median of 12 joint bleeds per year (95% confidence interval 0.9-36) for patients aged 18 or younger, and 11 (0.8-61) for adult patients. Less agreement was observed concerning the estimated effective dose for secondary prophylaxis in adults: a median of 2000 IU every other day. The majority (63%) of experts expected that a single minor joint bleed could cause irreversible damage, and would accept up to three minor joint bleeds or one trauma-related joint bleed annually on prophylaxis. Expert judgement elicitation allowed structured capturing of quantitative expert estimates. It generated novel data to be used in computer modelling, clinical care, and trial design.
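Elicited (P10, median, P90) triplets like these are commonly converted into parametric distributions and pooled across experts. The sketch below assumes lognormal shapes and simple quantile averaging; it is an illustration with invented numbers, not the graphical combination method the paper used.

```python
import math

# Sketch: turn each expert's elicited (P10, median, P90) for an annual
# bleed count into a lognormal, after pooling by averaging quantiles.
# Method and example values are illustrative, not the paper's procedure.
Z90 = 1.2816  # standard normal 90th percentile

def lognormal_from_quantiles(p10, p50, p90):
    """Fit lognormal mu from the median; sigma from the two elicited tails."""
    mu = math.log(p50)
    # average the sigma implied by each tail (they differ if judgments are skewed)
    sigma = 0.5 * ((math.log(p90) - mu) / Z90 + (mu - math.log(p10)) / Z90)
    return mu, sigma

def pooled_quantiles(experts):
    """experts: list of (p10, median, p90); pool by averaging each quantile."""
    n = len(experts)
    return tuple(sum(e[i] for e in experts) / n for i in range(3))

experts = [(2.0, 12.0, 30.0), (1.0, 10.0, 36.0), (3.0, 14.0, 40.0)]
p10, p50, p90 = pooled_quantiles(experts)
mu, sigma = lognormal_from_quantiles(p10, p50, p90)
```

Averaging quantiles (rather than averaging densities) preserves the median of a "typical" expert, which is one reason it is a popular pooling choice for elicited triplets.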
Affiliation(s)
- K Fischer, Julius Center for Health Sciences and Primary Care, University Medical Center, Utrecht, The Netherlands

12. Roman HA, Hammitt JK, Walsh TL, Stieb DM. Expert elicitation of the value per statistical life in an air pollution context. Risk Analysis 2012;32:2133-2151. PMID: 22571466; DOI: 10.1111/j.1539-6924.2012.01826.x.
Abstract
The monetized value of avoided premature mortality typically dominates the calculated benefits of air pollution regulations; therefore, characterization of the uncertainty surrounding these estimates is key to good policymaking. Formal expert judgment elicitation methods are one means of characterizing this uncertainty. They have been applied to characterize uncertainty in the mortality concentration-response function, but have yet to be used to characterize uncertainty in the economic values placed on avoided mortality. We report the findings of a pilot expert judgment study for Health Canada designed to elicit quantitative probabilistic judgments of uncertainties in Value-per-Statistical-Life (VSL) estimates for use in an air pollution context. The two-stage elicitation addressed uncertainties in both a base case VSL for a reduction in mortality risk from traumatic accidents and in benefits transfer-related adjustments to the base case for an air quality application (e.g., adjustments for age, income, and health status). Results for each expert were integrated to develop example quantitative probabilistic uncertainty distributions for VSL that could be incorporated into air quality models.
Affiliation(s)
- Henry A Roman, Industrial Economics, Incorporated, 2067 Massachusetts Ave, Cambridge, MA 02140, USA

13. Williams PR, Holicky KC, Paustenbach DJ. Current Methods for Evaluating Children's Exposures for Use in Health Risk Assessment. 2011. DOI: 10.3109/713610246.
14. Sahmel J, Devlin K, Paustenbach D, Hollins D, Gaffney S. The role of exposure reconstruction in occupational human health risk assessment: current methods and a recommended framework. Crit Rev Toxicol 2010;40:799-843. PMID: 20722488; DOI: 10.3109/10408444.2010.501052.
Abstract
Exposure reconstruction for substances of interest to human health is a process that has been used, with various levels of sophistication, as far back as the 1930s. The importance of robust and high-quality exposure reconstruction has been recognized by many researchers. It has been noted that misclassification of reconstructed exposures is relatively common and can result in potentially significant effects on the conclusions of a human health risk assessment or epidemiology study. In this analysis, a review of the key exposure reconstruction approaches described in over 400 papers in the peer-reviewed literature is presented. These approaches have been critically evaluated and classified according to quantitative, semiquantitative, and qualitative approaches. Our analysis indicates that much can still be done to improve the overall quality and consistency of exposure reconstructions and that a systematic framework would help to standardize the exposure reconstruction process in the future. The seven recommended steps in the exposure reconstruction process include identifying the goals of the reconstruction, organizing and ranking the available data, identifying key data gaps, selecting the best information sources and methodology for the reconstruction, incorporating probabilistic methods into the reconstruction, conducting an uncertainty analysis, and validating the results of the reconstruction. Influential emerging techniques, such as Bayesian data analysis, are highlighted. Important issues that will likely influence the conduct of exposure reconstruction into the future include improving statistical analysis methods, addressing the issue of chemical mixtures, evaluating aggregate exposures, and ensuring transparency with respect to variability and uncertainty in the reconstruction effort.
15. Knol AB, Slottje P, van der Sluijs JP, Lebret E. The use of expert elicitation in environmental health impact assessment: a seven step procedure. Environ Health 2010;9:19. PMID: 20420657; PMCID: PMC2879247; DOI: 10.1186/1476-069x-9-19.
Abstract
BACKGROUND: Environmental health impact assessments often have to deal with substantial uncertainties. Typically, the knowledge base is limited, with incomplete or inconsistent evidence and missing or ambiguous data. Consulting experts can help to identify and address uncertainties.
METHODS: Formal expert elicitation is a structured approach to systematically consult experts on uncertain issues. It is most often used to quantify ranges for poorly known parameters, but may also be useful to further develop qualitative issues such as definitions, assumptions, or conceptual (causal) models. A thorough preparation and systematic design and execution of an expert elicitation process may increase the validity of its outcomes and the transparency and trustworthiness of its conclusions. Various expert elicitation protocols and methods exist. However, these are often not universally applicable and need customization to suit the needs of a specific study. In this paper, we set out to develop a widely applicable method for the use of expert elicitation in environmental health impact assessment.
RESULTS: We present a practical yet flexible seven step procedure for organising expert elicitation in the context of environmental health impact assessment, based on existing protocols. We describe how customization for specific applications is always necessary. In particular, three issues affect the choice of methods for a particular application: the types of uncertainties considered, the intended use of the elicited information, and the available resources. We outline how these three considerations guide choices regarding the design and execution of expert elicitation. We present signposts to sources where the issues are discussed in more depth to give the newcomer the insights needed to make the protocol work. The seven step procedure is illustrated using examples from earlier published elicitations in the field of environmental health research.
CONCLUSIONS: We conclude that, despite some known criticism of its validity, formal expert elicitation can support environmental health research in various ways. Its main purpose is to provide a temporary summary of the limited available knowledge, which can serve as a provisional basis for policy until further research has been carried out.
Affiliation(s)
- Anne B Knol, National Institute for Public Health and the Environment (RIVM), Bilthoven, the Netherlands
- Pauline Slottje, Institute for Risk Assessment Sciences, University of Utrecht, Utrecht, the Netherlands
- Jeroen P van der Sluijs, Institute for Risk Assessment Sciences, University of Utrecht, Utrecht, the Netherlands; Copernicus Institute for Sustainable Development and Innovation, Utrecht University, the Netherlands; Recherches en Economie-Ecologie, Eco-innovation et ingénierie du Développement Soutenable, Université de Versailles, Saint-Quentin-en-Yvelines, France
- Erik Lebret, National Institute for Public Health and the Environment (RIVM), Bilthoven, the Netherlands; Institute for Risk Assessment Sciences, University of Utrecht, Utrecht, the Netherlands

16. Choi JY, Ramachandran G. Review of the OSHA framework for oversight of occupational environments. J Law Med Ethics 2009;37:633-650. PMID: 20122106; DOI: 10.1111/j.1748-720x.2009.00437.x.
Abstract
The OSHA system for oversight of chemicals in the workplace was evaluated to derive lessons for oversight of nanotechnology. Criteria relating to the development, attributes, evolution, and outcomes of the system were used for evaluation that was based upon quantitative expert elicitation and historical literature analysis. The oversight system had inadequate resources in terms of finances, expertise, and personnel, and insufficient incentive for compliance. The system showed a lack of flexibility in novel situations. There were minimal requirements on companies for data on health and safety of their products. These factors have a strong influence on public confidence and health and safety. The oversight system also scored low on attributes such as public input, transparency, empirical basis, conflict of interest, and informed consent. The experts in our sample tend to believe that the current oversight system for chemicals in the workplace is neither adequate nor effective. It is very likely that the performance of the OSHA oversight system for nanomaterials will be equally inadequate.
Affiliation(s)
- Jae-Young Choi, Division of Health Policy and Management, School of Public Health, University of Minnesota, USA

17. Small MJ. Methods for assessing uncertainty in fundamental assumptions and associated models for cancer risk assessment. Risk Analysis 2008;28:1289-1308. PMID: 18844862; DOI: 10.1111/j.1539-6924.2008.01134.x.
Abstract
The distributional approach for uncertainty analysis in cancer risk assessment is reviewed and extended. The method considers a combination of bioassay study results, targeted experiments, and expert judgment regarding biological mechanisms to predict a probability distribution for uncertain cancer risks. Probabilities are assigned to alternative model components, including the determination of human carcinogenicity, mode of action, the dosimetry measure for exposure, the mathematical form of the dose-response relationship, the experimental data set(s) used to fit the relationship, and the formula used for interspecies extrapolation. Alternative software platforms for implementing the method are considered, including Bayesian belief networks (BBNs) that facilitate assignment of prior probabilities, specification of relationships among model components, and identification of all output nodes on the probability tree. The method is demonstrated using the application of Evans, Sielken, and co-workers for predicting cancer risk from formaldehyde inhalation exposure. Uncertainty distributions are derived for maximum likelihood estimate (MLE) and 95th percentile upper confidence limit (UCL) unit cancer risk estimates, and the effects of resolving selected model uncertainties on these distributions are demonstrated, considering both perfect and partial information for these model components. A method for synthesizing the results of multiple mechanistic studies is introduced, considering the assessed sensitivities and selectivities of the studies for their targeted effects. A highly simplified example is presented illustrating assessment of genotoxicity based on studies of DNA damage response caused by naphthalene and its metabolites. The approach can provide a formal mechanism for synthesizing multiple sources of information using a transparent and replicable weight-of-evidence procedure.
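The core of the distributional approach is a probability tree over alternative model components. A toy enumeration (the component choices, probabilities, and unit-risk values below are all invented) shows how component probabilities multiply along branches to give a probability distribution over risk estimates:

```python
from itertools import product

# Toy probability tree over two uncertain model components: the form of the
# dose-response relationship and the interspecies extrapolation formula.
# Each alternative carries a probability and its implied unit-risk factor.
# All choices and numbers are hypothetical, for illustration only.
dose_response = [("linear", 0.6, 1.0e-4), ("sublinear", 0.4, 2.0e-5)]
extrapolation = [("body-weight", 0.5, 1.0), ("surface-area", 0.5, 3.0)]

outcomes = []  # (probability, unit risk) at each leaf of the tree
for (dr, p_dr, slope), (ex, p_ex, factor) in product(dose_response, extrapolation):
    outcomes.append((p_dr * p_ex, slope * factor))

total_p = sum(p for p, _ in outcomes)                 # leaves must sum to 1
expected_risk = sum(p * r for p, r in outcomes)       # mean of the distribution
```

Resolving a model uncertainty (e.g., learning the true mode of action) corresponds to pruning branches and renormalizing the remaining leaf probabilities, which is how the value of such information can be quantified.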
Affiliation(s)
- Mitchell J Small
- Civil & Environmental Engineering and Engineering & Public Policy, Carnegie Mellon University, Pittsburgh, PA 15213, USA.
18
Belzer RB, Bus JS, Cavalieri EL, Lewis SC, North DW, Pleus RC. The naphthalene state of the science symposium: objectives, organization, structure, and charge. Regul Toxicol Pharmacol 2008; 51:S1-S5. [DOI: 10.1016/j.yrtph.2007.10.017]
19
Marino S, Hogue IB, Ray CJ, Kirschner DE. A methodology for performing global uncertainty and sensitivity analysis in systems biology. J Theor Biol 2008; 254:178-196. [PMID: 18572196] [DOI: 10.1016/j.jtbi.2008.04.011]
Abstract
Accuracy of results from mathematical and computer models of biological systems is often complicated by the presence of uncertainties in experimental data that are used to estimate parameter values. Current mathematical modeling approaches typically use either single-parameter or local sensitivity analyses. However, these methods do not accurately assess uncertainty and sensitivity in the system as, by default, they hold all other parameters fixed at baseline values. Using techniques described within we demonstrate how a multi-dimensional parameter space can be studied globally so all uncertainties can be identified. Further, uncertainty and sensitivity analysis techniques can help to identify and ultimately control uncertainties. In this work we develop methods for applying existing analytical tools to perform analyses on a variety of mathematical and computer models. We compare two specific types of global sensitivity analysis indexes that have proven to be among the most robust and efficient. Through familiar and new examples of mathematical and computer models, we provide a complete methodology for performing these analyses, in both deterministic and stochastic settings, and propose novel techniques to handle problems encountered during these types of analyses.
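The abstract does not name its indexes, but a standard pairing for this kind of global analysis is Latin hypercube sampling (LHS) for the parameter space with partial rank correlation coefficients (PRCC) for sensitivity ranking. The sketch below uses a toy three-parameter model of my own construction, purely to show the mechanics.

```python
import numpy as np

rng = np.random.default_rng(0)

def latin_hypercube(n_samples, n_params, rng):
    """One stratified draw per equal-probability bin, shuffled independently per parameter."""
    u = (rng.random((n_samples, n_params)) + np.arange(n_samples)[:, None]) / n_samples
    for j in range(n_params):
        u[:, j] = rng.permutation(u[:, j])
    return u

def prcc(X, y):
    """Partial rank correlation of each column of X with output y."""
    Xr = np.argsort(np.argsort(X, axis=0), axis=0).astype(float)  # rank transform
    yr = np.argsort(np.argsort(y)).astype(float)
    out = []
    for j in range(X.shape[1]):
        others = np.column_stack([np.ones(len(y)), np.delete(Xr, j, axis=1)])
        # Correlate the residuals after regressing out the other (ranked) parameters.
        res_x = Xr[:, j] - others @ np.linalg.lstsq(others, Xr[:, j], rcond=None)[0]
        res_y = yr - others @ np.linalg.lstsq(others, yr, rcond=None)[0]
        out.append(np.corrcoef(res_x, res_y)[0, 1])
    return np.array(out)

# Toy model: output depends strongly on p0, weakly on p1, not at all on p2.
X = latin_hypercube(500, 3, rng)
y = 5.0 * X[:, 0] + 0.5 * X[:, 1] + rng.random(500)
coeffs = prcc(X, y)
```

Because PRCC works on ranks and partials out the other inputs, it stays informative for monotone nonlinear models where a plain correlation on raw values would mislead.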
Affiliation(s)
- Simeone Marino
- Department of Microbiology and Immunology, University of Michigan Medical School, Ann Arbor, MI 48109-0620, USA
20
Roman HA, Walker KD, Walsh TL, Conner L, Richmond HM, Hubbell BJ, Kinney PL. Expert judgment assessment of the mortality impact of changes in ambient fine particulate matter in the U.S. Environ Sci Technol 2008; 42:2268-2274. [PMID: 18504952] [DOI: 10.1021/es0713882]
Abstract
In this paper, we present findings from a multiyear expert judgment study that comprehensively characterizes uncertainty in estimates of mortality reductions associated with decreases in fine particulate matter (PM2.5) in the U.S. Appropriate characterization of uncertainty is critical because mortality-related benefits represent up to 90% of the monetized benefits reported in the Environmental Protection Agency's (EPA's) analyses of proposed air regulations. Numerous epidemiological and toxicological studies have evaluated the PM2.5-mortality association and investigated issues that may contribute to uncertainty in the concentration-response (C-R) function, such as exposure misclassification and potential confounding from other pollutant exposures. EPA's current uncertainty analysis methods rely largely on standard errors in published studies. However, no one study can capture the full suite of issues that arise in quantifying the C-R relationship. Therefore, EPA has applied state-of-the-art expert judgment elicitation techniques to develop probabilistic uncertainty distributions that reflect the broader array of uncertainties in the C-R relationship. These distributions, elicited from 12 of the world's leading experts on this issue, suggest both potentially larger central estimates of mortality reductions for decreases in long-term PM2.5 exposure in the U.S. and a wider distribution of uncertainty than currently employed in EPA analyses.
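One generic way to combine elicited distributions like these is an equal-weight linear pool (a mixture of the experts' distributions). The sketch below invents four experts' normal distributions for a mortality coefficient; these are illustrative placeholders, not the study's elicited values or its pooling method.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

# Hypothetical elicited judgments (NOT the study's values): each expert's
# uncertainty about the mortality coefficient, as a normal (mean, sd).
experts = [(0.6, 0.20), (1.0, 0.30), (0.3, 0.40), (0.8, 0.15)]

# Equal-weight linear pool: pick an expert uniformly at random, then sample
# from that expert's distribution; the pooled sample is the four-way mixture.
idx = rng.integers(len(experts), size=n)
means = np.array([m for m, _ in experts])[idx]
sds = np.array([s for _, s in experts])[idx]
pooled = rng.normal(means, sds)

central = np.median(pooled)
lo, hi = np.percentile(pooled, [2.5, 97.5])
```

The pooled interval is wider than any single expert's, which is the point: disagreement between experts shows up as extra spread in the combined distribution.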
Affiliation(s)
- Henry A Roman
- Industrial Economics, Incorporated, Cambridge, Massachusetts 02140, USA.
21
Williams P, Paustenbach D, Balzer JL, Mangold C. Retrospective exposure assessment of airborne asbestos related to skilled craftsmen at a petroleum refinery in Beaumont, Texas (1940-2006). J Toxicol Environ Health A 2007; 70:1076-1107. [PMID: 17558804] [DOI: 10.1080/15287390701208305]
Abstract
Despite efforts over the past 50 or more years to estimate airborne dust or fiber concentrations for specific job tasks within different industries, there have been no known attempts to reconstruct historical asbestos exposures for the many types of trades employed in various nonmanufacturing settings. In this paper, 8-h time-weighted average (TWA) asbestos exposures were estimated for 12 different crafts from the 1940s to the present day at a large petroleum refinery in Beaumont, TX. The crafts evaluated were insulators, pipefitters, boilermakers, masons, welders, sheet-metal workers, millwrights, electricians, carpenters, painters, laborers, and maintenance workers. This analysis quantitatively accounts for (1) the historical use of asbestos-containing materials at the refinery, (2) the typical workday of the different crafts and specific opportunities for exposure to asbestos, (3) industrial hygiene asbestos air monitoring data collected at this refinery and similar facilities since the early 1970s, (4) published and unpublished data sets on task-specific dust or fiber concentrations encountered in various industrial settings since the late 1930s, and (5) the evolution of respirator use and other workplace practices that occurred as the hazards of asbestos became better understood over time. Due to limited air monitoring data for most crafts, 8-h TWA fiber concentrations were calculated only for insulators, while all other crafts were estimated to have experienced 8-h TWA fiber concentrations at some fraction of that experienced by insulators. A probabilistic (Monte Carlo) model was used to account for potential variability in the various data sets and the uncertainty in our knowledge of selected input parameters used to estimate exposure. Significant reliance was also placed on our collective professional experiences working in the fields of industrial hygiene, exposure assessment, and process engineering over the last 40 yr. 
Insulators at this refinery were estimated to have experienced 50th (and 95th) percentile 8-h TWA asbestos exposures (which incorporated 8-h TWA fiber concentrations, respirator use and effectiveness, and time spent working with asbestos-containing materials) of 9 (16) fibers/cc (cubic centimeter) from 1940 to 1950, 8 (13) fibers/cc from 1951 to 1965, 2 (5) fibers/cc from 1966 to 1971, 0.3 (0.5) fibers/cc from 1972 to 1975, and 0.005 (0.02) fibers/cc from 1976 to 1985 (estimated exposures were <0.001 fibers/cc after 1985). Estimated 8-h TWA exposures for all other crafts were at least 50- to 100-fold less than that of insulators, with the exception of laborers, whose estimated 8-h TWA exposures were approximately one-fifth to one-tenth of those of insulators. In spite of the data gaps, the available evidence indicates that our estimates of 8-h TWA asbestos exposures reasonably characterize the typical range of values for these categories of workers over time.
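A probabilistic TWA exposure model of the kind described above can be sketched as follows. The lognormal concentration, time-on-task, and respirator assumptions are illustrative stand-ins, not the paper's fitted inputs.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Illustrative input distributions (NOT the paper's fitted values):
conc = rng.lognormal(mean=np.log(10.0), sigma=0.5, size=n)  # task-level fibers/cc
frac_time = rng.uniform(0.2, 0.6, size=n)          # fraction of the 8-h day on asbestos tasks
wears_respirator = rng.random(n) < 0.3             # assumed 30% respirator usage
protection = np.where(wears_respirator, 0.1, 1.0)  # assumed protection factor of 10 when worn

# The 8-h TWA combines concentration, time on task, and respirator effectiveness.
twa = conc * frac_time * protection
p50, p95 = np.percentile(twa, [50, 95])
```

Reporting the 50th and 95th percentiles of the simulated TWA, as the paper does per craft and era, conveys both the typical exposure and a plausible upper bound in one pass of the model.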
22
Linkov I, Carini F, Collins C, Eged K, Mitchell NG, Mourlon C, Ould-Dada Z, Robles B, Sweeck L, Venter A. Radionuclides in fruit systems: model-model intercomparison study. Sci Total Environ 2006; 364:124-137. [PMID: 16157363] [DOI: 10.1016/j.scitotenv.2005.08.002]
Abstract
Modeling is widely used to predict radionuclide distribution following accidental radionuclide releases. Modeling is crucial in emergency response planning and risk communication, and understanding model uncertainty is important not only in conducting analysis consistent with current regulatory guidance, but also in gaining stakeholder and decision-maker trust in the process and confidence in the results. However, while methods for dealing with parameter uncertainty are fairly well developed, an adequate representation of uncertainties associated with models remains rare. This paper addresses uncertainty about a model's structure (i.e., the relevance of simplifying assumptions and mathematical equations) that is seldom addressed in practical applications of environmental modeling. The use of several alternative models to derive a range of model outputs or risks is probably the only available technique to assess consistency in model prediction. Since each independent model requires significant resources for development and calibration, multiple models are not generally applied to the same problem. This study uses results from one such model intercomparison conducted by the Fruits Working Group, which was created under the International Atomic Energy Agency (IAEA) BIOMASS (BIOsphere Modelling and ASSessment) Program. Model-model intercomparisons presented in this study were conducted by the working group for two different scenarios (acute or continuous deposition), one radionuclide (¹³⁷Cs), and three fruit-bearing crops (strawberries, apples, and blackcurrants). The differences between models were as great as five orders of magnitude for short-term predictions following acute radionuclide deposition. For long-term predictions and for the continuous deposition scenario, the differences between models were about two orders of magnitude. The difference between strawberry, apple, and blackcurrant contamination predicted by one model is far less than the difference in prediction of contamination for a single plant species given by different models. This study illustrates the importance of problem formulation and implementation of an analytic-deliberative process in risk characterization.
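The "orders of magnitude" comparisons quoted above reduce to a log10 spread across model predictions. A minimal sketch, with made-up numbers rather than the working group's results:

```python
import math

# Hypothetical predictions (Bq/kg) from five models for the same deposition
# scenario; the numbers are invented, not the working group's results.
predictions = {"model_A": 2.0e3, "model_B": 7.5e1, "model_C": 4.0e-1,
               "model_D": 1.1e2, "model_E": 3.0e-2}

values = list(predictions.values())
# Spread between the highest and lowest prediction, in orders of magnitude.
spread_orders = math.log10(max(values)) - math.log10(min(values))   # ≈ 4.8
```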
Affiliation(s)
- I Linkov
- Cambridge Environmental, 58 Charles Street, Cambridge, MA 02141, USA.
23
Paustenbach DJ, Fehling K, Scott P, Harris M, Kerger BD. Identifying soil cleanup criteria for dioxins in urban residential soils: how have 20 years of research and risk assessment experience affected the analysis? J Toxicol Environ Health B Crit Rev 2006; 9:87-145. [PMID: 16613806] [DOI: 10.1080/10937400500538482]
Abstract
This article reviews the scientific evidence and methodologies that have been used to assess the risks posed by 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD) and presents a probabilistic analysis for identifying virtually safe concentrations of TCDD toxicity equivalents (TEQ) in residential soils. Updated data distributions that consider state-of-the-science cancer and noncancer toxicity criteria, child soil ingestion and dermal uptake, bioavailability in soil, and residential exposure duration are incorporated. The probabilistic analysis shows that the most sensitive determinants of dose and risk are childhood soil ingestion, exposure duration, and the selected TCDD cancer potency factor. It also shows that the cancer risk at 1 per 100,000 predicted more conservative (lower) soil criteria values than did the noncancer hazard (e.g., developmental and reproductive effects). In this analysis, acceptable or tolerable soil dioxin concentrations (TCDD TEQ) ranged from 0.4 to 5.5 ppb at the 95th percentile for cancer potency factors from 9600 to 156,000 (mg/kg/d)⁻¹ with site-specific adjustments not included. Various possible soil guidelines based on cancer and noncancer risks are presented and discussed. In the main, the current toxicology, epidemiology, and exposure assessment data indicate that the historical 1 ppb TEQ soil guidance value remains a reasonable screening value for most residential sites. This analysis provides risk managers with a thorough and transparent methodology, as well as a comprehensive information base, for making informed decisions about selecting soil cleanup values for PCDD/Fs in urban residential settings.
24
Lewandowski TA, Rhomberg LR. A proposed methodology for selecting a trichloroethylene inhalation unit risk value for use in risk assessment. Regul Toxicol Pharmacol 2005; 41:39-54. [PMID: 15649826] [DOI: 10.1016/j.yrtph.2004.09.003]
Abstract
U.S. EPA's 2001 draft assessment of trichloroethylene (TCE) toxicity reviews the existing human and animal data on TCE carcinogenicity and proposes a 20-fold range of cancer potency values for use in risk assessment. Each value in the range is derived from a different source of data, either animal bioassays or epidemiology studies, and thus the range does not represent a distribution which can be characterized by statistical parameters such as a mean or 95% confidence interval. The U.S. EPA suggests users choose a single slope factor from among those it describes as appropriate for the population of interest and mode of exposure, but little guidance is given for making this choice. We propose an approach for determining the most scientifically defensible carcinogenic inhalation unit risk estimate from the range of slope factors developed by U.S. EPA, one that relies on accepted principles for evaluating scientific studies. Based on these considerations, we identify the most appropriate interim unit risk for low-level inhalation exposure as 9 × 10⁻⁷ per µg/m³. This approach may have fairly broad utility if U.S. EPA elects to use a similar approach in future assessments of other chemicals.
Affiliation(s)
- T A Lewandowski
- Gradient Corporation, 600 Stewart Street, Suite 803, Seattle, WA 98101, USA.
25
Linkov I, Burmistrov D. Sources of uncertainty in model predictions: lessons learned from the IAEA Forest and Fruit Working Group model intercomparisons. J Environ Radioact 2005; 84:297-314. [PMID: 15978707] [DOI: 10.1016/j.jenvrad.2003.10.009]
Abstract
The International Atomic Energy Agency (IAEA), through the BIOMASS program, has provided a unique international forum for assessing the relative contribution of different sources of uncertainty associated with environmental modeling. The methodology and guidance for dealing with parameter uncertainty have been fairly well developed and quantitative tools such as Monte-Carlo modeling are often recommended. The issue of model uncertainty is still rarely addressed in practical applications and the use of several alternative models to derive a range of model outputs (similar to what was done in IAEA model intercomparisons) is one of a few available techniques. This paper addresses the often overlooked issue of what we call 'modeler uncertainty,' i.e., differences in problem formulation, model implementation and parameter selection originating from subjective interpretation of the problem at hand. This study uses results from the Fruit and Forest Working Groups created under the BIOMASS program (BIOsphere Modeling and ASSessment). The greatest uncertainty was found to result from modelers' interpretation of scenarios and approximations made by modelers. In scenarios that were unclear for modelers, the initial differences in model predictions were as high as seven orders of magnitude. Only after several meetings and discussions about specific assumptions did the differences in predictions by various models merge. Our study shows that the parameter uncertainty (as evaluated by a probabilistic Monte-Carlo assessment) may have contributed over one order of magnitude to the overall modeling uncertainty. The final model predictions ranged between one and three orders of magnitude, depending on the specific scenario. This study illustrates the importance of problem formulation and implementation of an analytic-deliberative process in fate and transport modeling and risk characterization.
Affiliation(s)
- Igor Linkov
- Cambridge Environmental Inc., 58 Charles Street, Cambridge, MA 02141, USA.
26
Ha-Duong M, Casman EA, Morgan MG. Bounding poorly characterized risks: a lung cancer example. Risk Anal 2004; 24:1071-1083. [PMID: 15563277] [DOI: 10.1111/j.0272-4332.2004.00508.x]
Abstract
For diseases with more than one risk factor, the sum of probabilistic estimates of the number of cases caused by each individual factor may exceed the total number of cases observed, especially when uncertainties about exposure and dose response for some risk factors are high. In this study, we outline a method of bounding the fraction of lung cancer fatalities not due to specific well-studied causes. Such information serves as a "reality check" for estimates of the impacts of the minor risk factors, and, as such, complements the traditional risk analysis. With lung cancer as our example, we allocate portions of the observed lung cancer mortality to known causes (such as smoking, residential radon, and asbestos fibers) and describe the uncertainty surrounding those estimates. The interactions among the risk factors are also quantified, to the extent possible. We then infer an upper bound on the residual mortality due to "other" causes, using a consistency constraint on the total number of deaths, the maximum uncertainty principle, and the mathematics originally developed for imprecise probabilities.
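The consistency-constraint bound described above can be illustrated with a small numeric sketch. The attributable-fraction intervals and death count below are invented, and the sketch ignores interactions among risk factors, which the paper quantifies where possible.

```python
# Hypothetical (lower, upper) attributable fractions for well-studied causes,
# plus an observed total; all numbers invented for illustration.
total_deaths = 150_000
known_causes = {
    "smoking":  (0.80, 0.90),
    "radon":    (0.05, 0.15),
    "asbestos": (0.01, 0.05),
}

# Consistency constraint: attributable fractions cannot sum above 1, so the
# residual "other" fraction is bounded above by 1 minus the sum of the
# lower bounds for the known causes.
lower_known = sum(lo for lo, _ in known_causes.values())
other_upper_fraction = max(0.0, 1.0 - lower_known)
other_upper_deaths = other_upper_fraction * total_deaths
```

Even without any direct data on the "other" causes, the observed total plus the lower bounds for well-studied causes cap how many deaths the residual category could plausibly explain.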
Affiliation(s)
- Minh Ha-Duong
- Centre International de Recherche sur l'Environnement et le Développement, Centre National de la Recherche Scientifique, France
27
Linkov I, Burmistrov D. Model uncertainty and choices made by modelers: lessons learned from the International Atomic Energy Agency model intercomparisons. Risk Anal 2003; 23:1297-1308. [PMID: 14641902] [DOI: 10.1111/j.0272-4332.2003.00402.x]
Abstract
The treatment of uncertainties associated with modeling and risk assessment has recently attracted significant attention. The methodology and guidance for dealing with parameter uncertainty have been fairly well developed and quantitative tools such as Monte Carlo modeling are often recommended. However, the issue of model uncertainty is still rarely addressed in practical applications of risk assessment. The use of several alternative models to derive a range of model outputs or risks is one of a few available techniques. This article addresses the often-overlooked issue of what we call "modeler uncertainty," i.e., differences in problem formulation, model implementation, and parameter selection originating from subjective interpretation of the problem at hand. This study uses results from the Fruit Working Group, which was created under the International Atomic Energy Agency (IAEA) BIOMASS program (BIOsphere Modeling and ASSessment). Model-model and model-data intercomparisons reviewed in this study were conducted by the working group for a total of three different scenarios. The greatest uncertainty was found to result from modelers' interpretation of scenarios and approximations made by modelers. In scenarios that were unclear for modelers, the initial differences in model predictions were as high as seven orders of magnitude. Only after several meetings and discussions about specific assumptions did the differences in predictions by various models merge. Our study shows that parameter uncertainty (as evaluated by a probabilistic Monte Carlo assessment) may have contributed over one order of magnitude to the overall modeling uncertainty. The final model predictions ranged between one and three orders of magnitude, depending on the specific scenario. This study illustrates the importance of problem formulation and implementation of an analytic-deliberative process in risk characterization.
28
Williams PRD, Paustenbach DJ. Reconstruction of benzene exposure for the Pliofilm cohort (1936-1976) using Monte Carlo techniques. J Toxicol Environ Health A 2003; 66:677-781. [PMID: 12746133] [DOI: 10.1080/15287390306379]
Abstract
The current cancer slope factor and occupational standards for benzene are based primarily on studies of the rubber hydrochloride (Pliofilm) workers. Previous assessments of this cohort by Rinsky et al. (1981, 1987), Crump and Allen (1984), and Paustenbach et al. (1992) relied on different assumptions about the available industrial hygiene data and workplace practices and processes over time, thereby yielding significantly different estimates of annual benzene exposures for many jobs. Given the inherent limitations and uncertainties involved in estimating historical exposures for this cohort, a probabilistic approach was used to better characterize their likely degree of benzene exposure. Ambient air exposures to benzene were based, in part, on the distribution of air sampling data collected at the Pliofilm facilities and assumptions about how workplace concentrations probably decreased over time as the threshold limit value (TLV) was lowered. The likely uptake of benzene from dermal exposures was estimated based on probability distributions for several exposure factors, including surface area, contact rate and duration, and skin absorption. The assessment also quantitatively accounts for improved engineering controls, extended work hours, incomplete Pliofilm production, and the use and effectiveness of respirators over time. All original data and assumptions are presented in this assessment, as is all new information obtained through additional interviews of former workers. Estimated benzene exposures at the 50th and 95th percentiles are reported as equivalent 8-h time-weighted average (TWA) airborne concentrations for 13 job categories from 1936 to 1965 (Akron I and II facilities) and 1939 to 1976 (St. Mary's facility). Data indicate that estimated equivalent airborne benzene concentrations for St. Mary's workers were highest for four job categories (Neutralizer, Quencher, Knifeman, Spreader), typically ranging from about 50 to 90 ppm during 1939-1946 (lower during 1942-1945), and 10 to 40 ppm during 1947-1976 at the 50th percentile. These estimates are 2-3 times greater than for other jobs in the Pliofilm process, and about 1.5 times less than those estimated at the 95th percentile. Estimates of equivalent airborne benzene concentrations for Akron I and II were about 1.5 times higher than for St. Mary's, but there is less confidence in these estimates, given the lack of industrial hygiene monitoring data for these facilities. Study results suggest that Paustenbach et al. (1992) generally overestimated exposures for those job categories that had the highest exposure by about a factor of two to four. On the other hand, it was concluded that Rinsky et al. (1981, 1987) underpredicted benzene exposures for most jobs, and Crump and Allen (1984) both under- and overpredicted benzene exposures, depending on the specific job category and time period. The new estimates presented in this analysis incorporate what is considered to be the most likely range of plausible exposure values, and, accordingly, provide a better characterization of the potential workplace exposures for this cohort. These data could be combined with current or future mortality information to calculate a new cancer potency factor or occupational health standard for benzene.
29
Williams PRD, Paustenbach DJ. Risk characterization: principles and practice. J Toxicol Environ Health B Crit Rev 2002; 5:337-406. [PMID: 12396672] [DOI: 10.1080/10937400290070161]
Abstract
In the field of risk assessment, characterizing the nature and magnitude of human health or environmental risks is arguably the most important step in the analytical process. In this step, data on the dose-response relationship of an agent are integrated with estimates of the degree of exposure in a population to characterize the likelihood and severity of risk. Although the purpose of risk characterizations is to make sense of the available data and describe what they mean to a broad audience, this step is often given insufficient attention in health risk evaluations. Too often, characterizations fail to interpret or summarize risk information in a meaningful way, or they present single numerical estimates of risk without an adequate discussion of the uncertainties inherent in key exposure parameters or the dose-response assessment, model assumptions, or analytical limitations. Consequently, many users of risk information have misinterpreted the findings of a risk assessment or have false impressions about the degree of accuracy (or the confidence of the scientist) in reported risk estimates. In this article we collected and integrated the published literature on conducting and reporting risk characterizations to provide a broad, yet comprehensive, analysis of the risk characterization process as practiced in the United States and some other countries. Specifically, the following eight topics are addressed: (1) objective of risk characterization, (2) guidance documents on risk characterization, (3) key components of risk characterizations, (4) toxicity criteria for evaluating health risks, (5) descriptors used to characterize health risks, (6) methods for quantifying human health risks, (7) key uncertainties in risk characterizations, and (8) the risk decision-making process. A brief discussion is also provided on international aspects of risk characterization. 
A number of examples are presented that illustrate key concepts, and citations are provided for approximately 100 of the most relevant papers.
30
Moschandreas DJ, Karuchit S. Scenario-model-parameter: a new method of cumulative risk uncertainty analysis. Environ Int 2002; 28:247-261. [PMID: 12220111] [DOI: 10.1016/s0160-4120(02)00025-9]
Abstract
The recently developed concepts of aggregate risk and cumulative risk rectify two limitations associated with the classical risk assessment paradigm established in the early 1980s. Aggregate exposure denotes the amount of one pollutant available at the biological exchange boundaries from multiple routes of exposure. Cumulative risk assessment is defined as an assessment of risk from the accumulation of a common toxic effect from all routes of exposure to multiple chemicals sharing a common mechanism of toxicity. Thus, cumulative risk constitutes an improvement over the classical risk paradigm, which treats exposures from multiple routes as independent events associated with each specific route. Risk assessors formulate complex models and identify many realistic scenarios of exposure that enable them to estimate risks from exposures to multiple pollutants and multiple routes. The increase in complexity of the risk assessment process is likely to increase risk uncertainty. Despite evidence that scenario and model uncertainty contribute to the overall uncertainty of cumulative risk estimates, present uncertainty analysis of risk estimates accounts only for parameter uncertainty and excludes model and scenario uncertainties. This paper provides a synopsis of the risk assessment evolution and associated uncertainty analysis methods. This evolution leads to the concept of the scenario-model-parameter (SMP) cumulative risk uncertainty analysis method. The SMP uncertainty analysis is a multiple-step procedure that assesses uncertainty associated with the use of judiciously selected scenarios and models of exposure and risk. Ultimately, the SMP uncertainty analysis method compares risk uncertainty estimates determined using all three sources of uncertainty with conventional risk uncertainty estimates obtained using only the parameter source. An example of applying the SMP uncertainty analysis to cumulative risk estimates from exposures to two pesticides indicates that inclusion of scenario and model sources of uncertainty yields larger overall uncertainty estimates than the parameter source alone.
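A nested scenario-model-parameter simulation of the kind the abstract describes can be sketched as follows. The scenario weights, model alternatives, and multipliers are invented for illustration; the comparison simply contrasts the spread of a parameter-only analysis with the full three-source spread.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 50_000

# Hypothetical scenario and model alternatives with subjective weights;
# the multipliers and weights are invented for illustration.
scenario_mult = [1.0, 3.0]      # e.g. residential vs. occupational exposure
scenario_wt = [0.6, 0.4]
model_mult = [1.0, 0.4]         # e.g. linear vs. threshold dose-response
model_wt = [0.5, 0.5]

def draw_risk(include_scenario_model):
    """Monte Carlo risk draws; optionally include scenario/model uncertainty."""
    dose = rng.lognormal(mean=0.0, sigma=0.3, size=n)   # parameter uncertainty
    if include_scenario_model:
        s = rng.choice(scenario_mult, size=n, p=scenario_wt)
        m = rng.choice(model_mult, size=n, p=model_wt)
    else:
        s = m = 1.0                        # conventional parameter-only analysis
    return dose * s * m * 1e-5             # hypothetical unit risk

param_only = draw_risk(False)
full_smp = draw_risk(True)

# The three-source (scenario-model-parameter) distribution is noticeably
# wider than the parameter-only distribution on a log scale.
ratio = np.std(np.log(full_smp)) / np.std(np.log(param_only))
```

In this toy setup the scenario and model draws roughly double the log-scale spread relative to parameter uncertainty alone, which mirrors the paper's argument that parameter-only uncertainty analysis understates total uncertainty.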
Affiliation(s)
- D J Moschandreas
- Illinois Institute of Technology, Department of Chemical and Environmental Engineering, Chicago 60616-3783, USA.
31
Walker KD, Evans JS, MacIntosh D. Use of expert judgment in exposure assessment. Part I. Characterization of personal exposure to benzene. J Expo Anal Environ Epidemiol 2001; 11:308-322. [PMID: 11571610] [DOI: 10.1038/sj.jea.7500171]
Abstract
This paper presents the results of the first phase of a study, conducted as an element of the National Human Exposure Assessment Survey (NHEXAS), to demonstrate the use of expert subjective judgment elicitation techniques to characterize the magnitude of and uncertainty in environmental exposure to benzene. In decisions about the value of exposure research or of regulatory controls, the characterization of uncertainty can play an influential role. Classical methods for characterizing uncertainty may be sufficient when adequate amounts of relevant data are available. Frequently, however, data are neither abundant nor directly relevant, making it necessary to rely to varying degrees on subjective judgment. Since the 1950s, methods to elicit and quantify subjective judgments have been explored but have rarely been applied to the field of environmental exposure assessment. In this phase of the project, seven experts in benzene exposure assessment were selected through a peer nomination process, participated in a 2-day workshop, and were interviewed individually to elicit their judgments about the distributions of residential ambient, residential indoor, and personal air benzene concentrations (6-day integrated average) experienced by both the non-smoking, non-occupationally exposed target and study populations of the US EPA Region V pilot study. Specifically, each expert was asked to characterize, in probabilistic form, the arithmetic means and the 90th percentiles of these distributions. This paper presents the experts' judgments about the concentrations of benzene encountered by the target population. The experts' judgments about levels of benzene in personal air were demonstrative of patterns observed in the judgments about the other distributions. 
They were in closest agreement about their predictions of the mean: with one exception, their best estimates of the mean fell between 7 and 11 microg/m(3), although they exhibited striking differences in the degree of uncertainty expressed. Their estimates of the 90th percentile were more varied, with best estimates ranging from 12 to 26 microg/m(3) for all but one expert, and their predictions of the 90th percentile were accompanied by far greater uncertainty. The paper demonstrates that coherent subjective judgments can be elicited from exposure assessment scientists and critically examines the challenges and potential benefits of a subjective judgment approach. The results of the second phase of the project, in which measurements from the NHEXAS field study in Region V are used to calibrate the experts' judgments about benzene exposures in the study population, will be presented in a second paper.
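The kind of pooling of elicited quantiles described above can be sketched in a few lines. This is a minimal illustration only: the expert estimates below are hypothetical placeholders in the reported ranges, not the study's actual elicited values, and the median-across-experts pooling rule is one simple robust choice, not necessarily the method the authors used.

```python
import statistics

# Hypothetical best estimates (microg/m^3) from seven experts for the mean
# and the 90th percentile of the personal-air benzene distribution.
# Values are illustrative placeholders, not the elicited data.
mean_estimates = [7, 8, 9, 9, 10, 11, 25]   # one outlying expert
p90_estimates = [12, 15, 18, 20, 22, 26, 60]

# Pool across experts with the median, which down-weights outliers.
pooled_mean = statistics.median(mean_estimates)
pooled_p90 = statistics.median(p90_estimates)

# Range across experts as a crude index of between-expert disagreement,
# showing the greater spread in the 90th-percentile judgments.
disagreement_p90 = max(p90_estimates) - min(p90_estimates)
```

A fuller treatment would pool each expert's full probability distribution (for example, as a linear opinion pool) rather than point estimates alone.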
Affiliation(s)
- K D Walker
- Harvard School of Public Health, Boston, Massachusetts, USA.
|
32
|
Stevenson DE, Walborg EF, North DW, Sielken RL, Ross CE, Wright AS, Xu Y, Kamendulis LM, Klaunig JE. Monograph: reassessment of human cancer risk of aldrin/dieldrin. Toxicol Lett 1999; 109:123-86. [PMID: 10555138 DOI: 10.1016/s0378-4274(99)00132-0] [Citation(s) in RCA: 36] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/26/2022]
Abstract
In 1987, the US Environmental Protection Agency (EPA) classified aldrin and dieldrin as category B2 carcinogens, i.e. probable human carcinogens, based largely on the increase in liver tumors in mice fed either organochlorine insecticide. At that date, the relevant epidemiology was deemed inadequate to influence the cancer risk assessment. More time has now elapsed since early exposures of manufacturing workers to aldrin/dieldrin; therefore, updated epidemiological data possess more power to detect exposure-related differences in cancer risk and mortality. Also, recent experimental studies provide a plausible mode of action to explain the mouse specificity of dieldrin-induced hepatocarcinogenesis and call into question the relevance of this activity to human cancer risk. This monograph places this new information within the historic and current perspectives of human cancer risk assessment, including EPA's 1996 Proposed Guidelines for Carcinogen Risk Assessment. Updated epidemiological studies of manufacturing workers in which lifetime exposures to aldrin/dieldrin have been quantified do not indicate increased mortality or cancer risk. In fact, at the middle range of exposures, there is evidence of a decrease in both mortality from all causes and cancer. Recent experimental studies indicate that dieldrin-induced hepatocarcinogenesis in mice occurs through a nongenotoxic mode of action, in which the slow oxidative metabolism of dieldrin is accompanied by an increased production of reactive oxygen species, depletion of hepatic antioxidant defenses (particularly alpha-tocopherol), and peroxidation of liver lipids. Dieldrin-induced oxidative stress or its sequelae apparently result in modulation of gene expression that favors expansion of initiated mouse, but not rat, liver cells; thus, dieldrin acts as a nongenotoxic promoter/accelerator of background liver tumorigenesis in the mouse. 
Within the framework of EPA's Proposed Guidelines for Carcinogen Risk Assessment, it is proposed that the most appropriate cancer risk descriptor for aldrin/dieldrin, relating to the mouse liver tumor response, is 'not likely a human carcinogen', a descriptor consistent with the example of phenobarbital cited by EPA.
|
33
|
Jackson MA, Stack HF, Waters MD. Activity profiles of carcinogenicity data: application in hazard identification and risk assessment. Mutat Res 1997; 394:113-24. [PMID: 9434850 DOI: 10.1016/s1383-5718(97)00123-x] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 02/05/2023]
Abstract
Animal cancer data play a primary role in human risk assessment because epidemiological data are limited. The current database of test results from the NCI/NTP rodent bioassays provides valuable information concerning the carcinogenic potential of hundreds of environmental agents. An approach is presented to reduce and graphically display these data as activity profiles to allow visualization and assessment of tumor response trends across multiple parameters, e.g. species, sex, target site, and route of exposure. Spreadsheet graphics are used to construct the profiles organized on the multiple parameters of carcinogenicity in a format that enables comparative analysis among chemicals. Several example applications are described to illustrate the value of activity profiles in hazard identification and risk assessment. The NCI/NTP data used in developing this concept are from the Carcinogenic Potency Database (CPDB) compiled by Gold et al. (Environ. Health Perspect. 103 (Suppl. 8) (1995) 3-122). Computer links to the underlying details in the CPDB are maintained such that specific histopathologies at individual tumor sites, duration of the study, dose-response data, and notes related to diet, survival, treatments, and the authors' evaluation are available to aid in the assessment process. The profiles display carcinogen potency based on the tumorigenic dose rate 50 (TD50), i.e. the chronic dose rate that would induce tumors in half of the test animals at the end of their standard lifespan, adjusting for spontaneous tumors. The TD50 values provide an index for establishing a relative potency ranking of the chemicals for any specific parameter, such as species or target site. An example ranking of hepatocarcinogens is presented to illustrate relative potencies for chemical analogs. The rank order indicates that the degree and type of halogenation of alkanes have a direct bearing on the carcinogenic potency of these compounds.
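The TD50-based relative potency ranking described above can be illustrated with a short sketch. The TD50 values and chemical names below are hypothetical, not taken from the CPDB; the point is only the mechanics: a lower TD50 (less dose needed to induce tumors in half the animals) means higher potency.

```python
# Hypothetical TD50 values (mg/kg/day); lower TD50 = higher potency.
# Illustrative numbers only, not CPDB entries.
td50 = {
    "chemical_A": 1.5,
    "chemical_B": 30.0,
    "chemical_C": 0.2,
}

# Rank chemicals from most to least potent (ascending TD50).
ranking = sorted(td50, key=td50.get)

# Express relative potency against the least potent chemical in the set,
# so the weakest carcinogen has relative potency 1.0.
reference = max(td50.values())
relative_potency = {name: reference / value for name, value in td50.items()}
```

In the CPDB itself such a ranking would be restricted to a single parameter (one species or one target site) so that the TD50 values being compared are commensurable.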
Affiliation(s)
- M A Jackson
- Integrated Laboratory Systems, Research Triangle Park, NC 27709, USA
|
34
|
Paustenbach DJ. Methods for Setting Limits for Acute and Chronic Toxic Ambient Air Contaminants. ACTA ACUST UNITED AC 1997. [DOI: 10.1080/1047322x.1997.10389531] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/17/2022]
|
35
|
Sielken RL, Reitz RH, Hays SM. Using PBPK modeling and comprehensive realism methodology for the quantitative cancer risk assessment of butadiene. Toxicology 1996; 113:231-7. [PMID: 8901903 DOI: 10.1016/0300-483x(96)03450-6] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 02/02/2023]
Abstract
The National Academy of Sciences and many others have noted the need for quantitative health risk assessment methodology that goes beyond a simple screening analysis based on upper bounds on risk. The Academy recommended adoption of methodologies that provide a higher-tier analysis based on realistic estimates of risk reflecting more of the available biological information. In recent years, scientists have challenged the assumption of low-dose linearity and other default assumptions in cancer risk assessment. These challenges have stimulated the continued evolution of quantitative risk assessment methodologies, because effective risk management requires accurate characterizations of uncertainty and greater use of cost-benefit analyses for decision making. "Comprehensive Realism" is an emerging quantitative, weight-of-evidence-based risk assessment methodology for both cancer and noncancer health effects that uses probability distributions and decision analysis techniques to reflect more of the available human and animal dose-response data. The current state of knowledge about the relative plausibility of alternative dose-response analyses is also addressed in this approach. The framework discussed here should lead to a higher-tier assessment of butadiene.
|
36
|
Conducting a state-of-the-art risk assessment of 1,3-butadiene. Toxicology 1996. [DOI: 10.1016/0300-483x(96)03446-4] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/23/2022]
|
37
|
Rhomberg L. Risk assessment and the use of information on underlying biologic mechanisms: a perspective. Mutat Res 1996; 365:175-89. [PMID: 8898997 DOI: 10.1016/s0165-1110(96)90020-2] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 02/02/2023]
Abstract
Recent years have seen the rapid expansion of scientific understanding of the underlying biologic bases of toxic reactions to chemicals. Use of this information in health risk assessment is expanding, but it has yet to reach its full potential. This article considers what has successfully been done, what approaches are now being developed, and what impediments and difficulties have been encountered in attempts to bring case-specific, mechanistic toxicological information to bear on risk estimation. In hazard identification, mechanistic information can help explain the bearing of various empirical experimental results on inferring human hazard, can increase the sensitivity of detection, and can inform attempts to replace 2-year animal bioassays with hazard identification methods that rest on identifying key biological properties underlying carcinogenicity rather than relying only on the experimental observation of tumors. In carcinogen potency estimation, mechanistic information can potentially extend relevant observation to lower dose levels, provide a basis for choosing among empirically based dose-response models, lead to potency estimates through relationships with quantitative measures of short-term test outcomes, and provide a basis for direct observation of the biological parameters in biologically based dose-response modeling.
Affiliation(s)
- L Rhomberg
- Harvard Center for Risk Analysis, Harvard School of Public Health, Boston, MA 02115, USA
|