1. Björnham O, Brännström N, Persson L. Absolutely continuous copulas with prescribed support constructed by differential equations, with an application in toxicology. Commun Stat Theor M 2022. DOI: 10.1080/03610926.2020.1864825
Affiliation(s)
- Oscar Björnham
- Division of CBRN Defence and Security, Swedish Defence Research Agency FOI, Umeå, Sweden
- Niklas Brännström
- Division of CBRN Defence and Security, Swedish Defence Research Agency FOI, Umeå, Sweden
- Leif Persson
- Department of Mathematics and Mathematical Statistics, Umeå University, Umeå, Sweden
2. Dilger M, Schneider K, Drossard C, Ott H, Kaiser E. Distributions for time, inter‐ and intraspecies extrapolation for deriving occupational exposure limits. J Appl Toxicol 2022; 42:898-912. PMID: 35187686; PMCID: PMC9314728; DOI: 10.1002/jat.4305
Abstract
This work aimed at improving the empirical database of time (i.e., exposure duration), interspecies and intraspecies extrapolation when deriving occupational exposure limits (OELs). For each extrapolation step, a distribution was derived, which can be used to model the associated uncertainties. For time and interspecies extrapolation, distributions of ratios of dose descriptors were derived from studies of different length or species. National Toxicology Program (NTP) study data were manually assessed, and data from REACH (Registration, Evaluation and Authorisation of Chemicals) registration dossiers were evaluated semi‐automatically. Intraspecies extrapolation was investigated by compiling published studies on human toxicokinetic and toxicodynamic variability. A new database was established for toxicokinetic differences in interindividual susceptibility, including many inhalation studies. Using NTP data produced more reliable results than using REACH data. The geometric mean (GM) for time extrapolation subacute/chronic agreed with previous evaluations (GM = 4.11), whereas the GM for subchronic/chronic extrapolation was slightly higher (GM = 2.93) than the GMs found by others. No significant differences were observed between systemically and locally acting substances. Observed interspecies differences confirmed the suitability of allometric scaling, with the derived distribution describing remaining uncertainty. Distributions of intraspecies variability at the 1% and 5% incidence level had medians of 7.25 and 3.56, respectively. When compared with assessment factors (AFs) currently used in the EU, probabilities that these AFs are protective enough span a wide range from 10% to 95%, depending on the extrapolation step. These results help to select AFs in a transparent and informed way and, by allowing the protection levels achieved to be compared, to harmonise methods for deriving OELs.
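The extrapolation-factor distributions described above lend themselves to a quick numerical check: if a ratio of dose descriptors is lognormal with a given geometric mean (GM) and geometric standard deviation (GSD), the probability that a fixed assessment factor covers the true ratio is just a lognormal CDF. A minimal sketch; the GM of 4.11 is quoted from the abstract, while the GSD of 3 and the AF of 6 are purely illustrative assumptions:

```python
import math

def prob_af_protective(af, gm, gsd):
    """P(ratio <= af) when the extrapolation ratio is lognormal with
    geometric mean gm and geometric standard deviation gsd."""
    z = (math.log(af) - math.log(gm)) / math.log(gsd)
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# GM = 4.11 (subacute -> chronic, from the abstract); GSD = 3 and AF = 6
# are illustrative assumptions, not values from the paper.
p = prob_af_protective(af=6.0, gm=4.11, gsd=3.0)
print(f"P(AF of 6 is protective) = {p:.2f}")  # ~0.63
```

An AF equal to the GM itself is, by construction, protective for exactly half the cases, which is why the paper's reported protection probabilities straddle such a wide range.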
Affiliation(s)
- Marco Dilger
- Forschungs‐ und Beratungsinstitut Gefahrstoffe GmbH (FoBiG), Freiburg, Germany
- Klaus Schneider
- Forschungs‐ und Beratungsinstitut Gefahrstoffe GmbH (FoBiG), Freiburg, Germany
- Claudia Drossard
- Federal Institute for Occupational Safety and Health, Dortmund, Germany
- Heidi Ott
- Federal Institute for Occupational Safety and Health, Dortmund, Germany
- Eva Kaiser
- Forschungs‐ und Beratungsinstitut Gefahrstoffe GmbH (FoBiG), Freiburg, Germany
3. McHale CM, Osborne G, Morello-Frosch R, Salmon AG, Sandy MS, Solomon G, Zhang L, Smith MT, Zeise L. Assessing health risks from multiple environmental stressors: Moving from G×E to I×E. Mutat Res Rev Mutat Res 2017; 775:11-20. PMID: 29555026; DOI: 10.1016/j.mrrev.2017.11.003
Abstract
Research on disease causation often attempts to isolate the effects of individual factors, including individual genes or environmental factors. This reductionist approach has generated many discoveries, but misses important interactive and cumulative effects that may help explain the broad range of variability in disease occurrence observed across studies and individuals. A disease rarely results from a single factor, and instead results from a broader combination of factors, characterized here as intrinsic (I) and extrinsic (E) factors. Intrinsic vulnerability or resilience emanates from a variety of both fixed and shifting biological factors including genetic traits, while extrinsic factors comprise all biologically-relevant external stressors encountered across the lifespan. The I×E concept incorporates the multi-factorial and dynamic nature of health and disease and provides a unified, conceptual basis for integrating results from multiple areas of research, including genomics, G×E, developmental origins of health and disease, and the exposome. We describe the utility of the I×E concept to better understand and characterize the cumulative impact of multiple extrinsic and intrinsic factors on individual and population health. New research methods increasingly facilitate the measurement of multifactorial and interactive effects in epidemiological and toxicological studies. Tiered or indicator-based approaches can guide the selection of potentially relevant I and E factors for study and quantification, and exposomics methods may eventually produce results that can be used to generate a response function over the life course. Quantitative data on I×E interactive effects should generate a better understanding of the variability in human response to environmental factors. 
The proposed I×E concept highlights the need for broader study designs that identify extrinsic and intrinsic factors amenable to intervention at the individual and population levels, in order to enhance resilience, reduce vulnerability, and improve health.
Affiliation(s)
- Cliona M McHale
- Superfund Research Center, School of Public Health, University of California, Berkeley, CA 94720, USA
- Gwendolyn Osborne
- Office of Environmental Health Hazard Assessment, California Environmental Protection Agency, Oakland, CA 94612, USA
- Rachel Morello-Frosch
- Superfund Research Center, School of Public Health, University of California, Berkeley, CA 94720, USA; Department of Environmental Science, Policy and Management, University of California, Berkeley, CA 94720, USA
- Andrew G Salmon
- Office of Environmental Health Hazard Assessment, California Environmental Protection Agency, Oakland, CA 94612, USA
- Martha S Sandy
- Office of Environmental Health Hazard Assessment, California Environmental Protection Agency, Oakland, CA 94612, USA
- Gina Solomon
- California Environmental Protection Agency, Sacramento, CA 95814, USA
- Luoping Zhang
- Superfund Research Center, School of Public Health, University of California, Berkeley, CA 94720, USA
- Martyn T Smith
- Superfund Research Center, School of Public Health, University of California, Berkeley, CA 94720, USA
- Lauren Zeise
- Office of Environmental Health Hazard Assessment, California Environmental Protection Agency, Oakland, CA 94612, USA
4. Oldenkamp R, Huijbregts MAJ, Ragas AMJ. Uncertainty and variability in human exposure limits - a chemical-specific approach for ciprofloxacin and methotrexate. Crit Rev Toxicol 2015; 46:261-78. PMID: 26648512; DOI: 10.3109/10408444.2015.1112768
Abstract
Human exposure limits (HELs) for chemicals with a toxicological threshold are traditionally derived using default assessment factors that account for variations in exposure duration, species sensitivity and individual sensitivity. The present paper elaborates a probabilistic approach for human hazard characterization and the derivation of HELs. It extends the framework for evaluating and expressing uncertainty in hazard characterization recently proposed by WHO-IPCS, i.e. by the incorporation of chemical-specific data on human variability in toxicokinetics. The incorporation of human variability in toxicodynamics was based on the variation between adverse outcome pathways (AOPs). Furthermore, sources of interindividual variability and uncertainty are propagated separately throughout the derivation process. The outcome is a two-dimensional human dose distribution that quantifies the population fraction exceeding a pre-selected critical effect level with an estimate of the associated uncertainty. This enables policy makers to set separate standards for the fraction of the population to be protected and the confidence level of the assessment. The main sources of uncertainty in the human dose distribution can be identified in order to plan new research for reducing uncertainty. Additionally, the approach enables quantification of the relative risk for specific subpopulations. The approach is demonstrated for two pharmaceuticals, i.e. the antibiotic ciprofloxacin and the antineoplastic methotrexate. For both substances, the probabilistic HEL is mainly influenced by uncertainty originating from: (1) the point of departure (PoD), (2) extrapolation from sub-acute to chronic toxicity and (3) interspecies extrapolation. However, when assessing the tails of the two-dimensional human dose distributions, i.e. the section relevant for the derivation of human exposure limits, interindividual variability in toxicodynamics also becomes important.
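The two-dimensional dose distribution described here keeps interindividual variability (inner dimension) separate from uncertainty (outer dimension). A hedged sketch of that nested Monte Carlo structure, with all distribution shapes and parameter values chosen only for illustration, not taken from the paper:

```python
import math
import random

random.seed(1)

def hed_uncertainty(n_unc=300, n_var=1000, protected=0.95):
    """Outer loop: uncertain point of departure and interspecies factor.
    Inner loop: interindividual variability in sensitivity.
    Returns uncertainty samples of the dose that protects `protected`
    of the population. All parameter values are illustrative."""
    samples = []
    for _ in range(n_unc):
        pod = random.lognormvariate(math.log(100.0), 0.3)    # mg/kg-day
        inter = random.lognormvariate(math.log(4.0), 0.4)
        sens = sorted(random.lognormvariate(0.0, math.log(2.0))
                      for _ in range(n_var))
        s_hi = sens[int(protected * n_var)]   # ~95th pct sensitivity
        samples.append(pod / (inter * s_hi))
    samples.sort()
    return samples

hed = hed_uncertainty()
median = hed[len(hed) // 2]
lower5 = hed[len(hed) // 20]
print(f"median HED ~ {median:.1f}; lower 5% confidence bound ~ {lower5:.1f}")
```

Reporting both a population percentile (inner loop) and a confidence level on it (outer loop) is what lets policy makers choose the protected fraction and the assessment confidence independently, as the abstract notes.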
Affiliation(s)
- Rik Oldenkamp
- Department of Environmental Science, Institute for Wetland and Water Research, Radboud University Nijmegen, Nijmegen, The Netherlands
- Mark A J Huijbregts
- Department of Environmental Science, Institute for Wetland and Water Research, Radboud University Nijmegen, Nijmegen, The Netherlands
- Ad M J Ragas
- Department of Environmental Science, Institute for Wetland and Water Research, Radboud University Nijmegen, Nijmegen, The Netherlands
5. DeBord DG, Burgoon L, Edwards SW, Haber LT, Kanitz MH, Kuempel E, Thomas RS, Yucesoy B. Systems Biology and Biomarkers of Early Effects for Occupational Exposure Limit Setting. J Occup Environ Hyg 2015; 12 Suppl 1:S41-54. PMID: 26132979; PMCID: PMC4654673; DOI: 10.1080/15459624.2015.1060324
Abstract
In a recent National Research Council document, new strategies for risk assessment were described to enable more accurate and quicker assessments. This report suggested that evaluating individual responses through increased use of bio-monitoring could improve dose-response estimations. Identification of specific biomarkers may be useful for diagnostics or risk prediction as they have the potential to improve exposure assessments. This paper discusses systems biology, biomarkers of effect, and computational toxicology approaches and their relevance to the occupational exposure limit setting process. The systems biology approach evaluates the integration of biological processes and how disruption of these processes by chemicals or other hazards affects disease outcomes. This type of approach could provide information used in delineating the mode of action of the response or toxicity, and may be useful to define the low adverse and no adverse effect levels. Biomarkers of effect are changes measured in biological systems and are considered to be preclinical in nature. Advances in computational methods and experimental -omics methods that allow the simultaneous measurement of families of macromolecules such as DNA, RNA, and proteins in a single analysis have made these systems approaches feasible for broad application. The utility of the information for risk assessments from -omics approaches has shown promise and can provide information on mode of action and dose-response relationships. As these techniques evolve, estimation of internal dose and response biomarkers will be a critical test of these new technologies for application in risk assessment strategies. While proof of concept studies have been conducted that provide evidence of their value, challenges with standardization and harmonization still need to be overcome before these methods are used routinely.
Affiliation(s)
- D. Gayle DeBord
- National Institute for Occupational Safety and Health, Division of Applied Research and Technology, Cincinnati, Ohio
- Lyle Burgoon
- U.S. Environmental Protection Agency, Office of Research and Development, Research Triangle Park, North Carolina
- Stephen W. Edwards
- U.S. Environmental Protection Agency, Office of Research and Development, Research Triangle Park, North Carolina
- Lynne T. Haber
- Toxicology Excellence for Risk Assessment (TERA), Cincinnati, Ohio
- M. Helen Kanitz
- National Institute for Occupational Safety and Health, Division of Applied Research and Technology, Cincinnati, Ohio
- Eileen Kuempel
- National Institute for Occupational Safety and Health, Education and Information Division, Cincinnati, Ohio
- Russell S. Thomas
- U.S. Environmental Protection Agency, Office of Research and Development, Research Triangle Park, North Carolina
- The Hamner Institute for Health Sciences, Research Triangle Park, North Carolina
- Berran Yucesoy
- National Institute for Occupational Safety and Health, Health Effects Laboratory Division, Morgantown, West Virginia
6. Kuempel ED, Sweeney LM, Morris JB, Jarabek AM. Advances in Inhalation Dosimetry Models and Methods for Occupational Risk Assessment and Exposure Limit Derivation. J Occup Environ Hyg 2015; 12 Suppl 1:S18-40. PMID: 26551218; PMCID: PMC4685615; DOI: 10.1080/15459624.2015.1060328
Abstract
The purpose of this article is to provide an overview and practical guide to occupational health professionals concerning the derivation and use of dose estimates in risk assessment for development of occupational exposure limits (OELs) for inhaled substances. Dosimetry is the study and practice of measuring or estimating the internal dose of a substance in individuals or a population. Dosimetry thus provides an essential link to understanding the relationship between an external exposure and a biological response. Use of dosimetry principles and tools can improve the accuracy of risk assessment, and reduce the uncertainty, by providing reliable estimates of the internal dose at the target tissue. This is accomplished through specific measurement data or predictive models, when available, or the use of basic dosimetry principles for broad classes of materials. Accurate dose estimation is essential not only for dose-response assessment, but also for interspecies extrapolation and for risk characterization at given exposures. Inhalation dosimetry is the focus of this paper since it is a major route of exposure in the workplace. Practical examples of dose estimation and OEL derivation are provided for inhaled gases and particulates.
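For inhaled particles, the basic dosimetry estimate linking external exposure to internal dose is deposited dose = concentration × inhaled air volume × regional deposition fraction. A small sketch of that arithmetic; the input values are generic illustrations, not worked examples from the article:

```python
def deposited_dose_mg(conc_mg_m3, minute_vent_l, hours, dep_fraction):
    """Deposited particle dose (mg) over a work shift: airborne
    concentration times inhaled air volume times the fraction
    depositing in the target respiratory-tract region."""
    inhaled_m3 = minute_vent_l / 1000.0 * 60.0 * hours  # L/min -> m3 per shift
    return conc_mg_m3 * inhaled_m3 * dep_fraction

# 8-h shift at 0.1 mg/m3, 20 L/min ventilation, 30% alveolar deposition
# (all inputs are illustrative assumptions)
dose = deposited_dose_mg(0.1, 20.0, 8.0, 0.3)
print(f"deposited dose = {dose:.3f} mg")  # 0.288 mg
```

The deposition fraction is the quantity that particle dosimetry models estimate as a function of particle size and breathing pattern; everything else here is bookkeeping.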
Affiliation(s)
- Eileen D. Kuempel
- National Institute for Occupational Safety and Health, Education and Information Division, Cincinnati, Ohio
- Lisa M. Sweeney
- Henry M. Jackson Foundation for the Advancement of Military Medicine, Naval Medical Research Unit Dayton, Wright-Patterson Air Force Base, Ohio
- John B. Morris
- School of Pharmacy, University of Connecticut, Storrs, Connecticut
- Annie M. Jarabek
- U.S. Environmental Protection Agency, National Center for Environmental Assessment, Research Triangle Park, North Carolina
7. Finkel AM. EPA underestimates, oversimplifies, miscommunicates, and mismanages cancer risks by ignoring human susceptibility. Risk Anal 2014; 34:1785-1794. PMID: 25335498; DOI: 10.1111/risa.12288
Abstract
If exposed to an identical concentration of a carcinogen, every human being would face a different level of risk, determined by his or her genetic, environmental, medical, and other uniquely individual characteristics. Various lines of evidence indicate that this susceptibility variable is distributed rather broadly in the human population, with perhaps a factor of 25- to 50-fold between the center of this distribution and either of its tails, but cancer risk assessment at the EPA and elsewhere has always treated every (adult) human as identically susceptible. The National Academy of Sciences "Silver Book" concluded that EPA and the other agencies should fundamentally correct their mis-computation of carcinogenic risk in two ways: (1) adjust individual risk estimates upward to provide information about the upper tail; and (2) adjust population risk estimates upward (by about sevenfold) to correct an underestimation due to a mathematical property of the interindividual distribution of human susceptibility, in which the susceptibility averaged over the entire (right-skewed) population exceeds the median value for the typical human. In this issue of Risk Analysis, Kenneth Bogen disputes the second adjustment and endorses the first, though he also relegates the problem of underestimated individual risks to the realm of "equity concerns" that he says should have little if any bearing on risk management policy. In this article, I show why the basis for the population risk adjustment that the NAS recommended is correct-that current population cancer risk estimates, whether they are derived from animal bioassays or from human epidemiologic studies, likely provide estimates of the median with respect to human variation, which in turn must be an underestimate of the mean. 
Even if cancer risk estimates have large "conservative" biases embedded in them, a premise I have disputed in many previous writings, such a defect would not excuse ignoring this additional bias in the direction of underestimation. I also demonstrate that sensible, legally appropriate, and ethical risk policy must not only inform the public when the tail of the individual risk distribution extends into the "high-risk" range, but must also alter benefit-cost balancing to account for the need to reduce these tail risks preferentially.
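The population-risk point turns on a general property of right-skewed distributions: for a lognormal susceptibility distribution, the mean exceeds the median by a factor of exp(σ²/2), so the roughly sevenfold adjustment the NAS recommended corresponds to σ ≈ 2. A quick simulation check; the σ used below is illustrative, not a value from the article:

```python
import math
import random

random.seed(0)

sigma = 1.4  # illustrative log-sd of susceptibility (not from the article)
draws = sorted(random.lognormvariate(0.0, sigma) for _ in range(200_000))

median = draws[len(draws) // 2]
mean = sum(draws) / len(draws)
print(f"median ~ {median:.2f}, mean ~ {mean:.2f}, "
      f"theoretical mean/median = {math.exp(sigma ** 2 / 2):.2f}")
```

An estimate anchored at the typical (median) human therefore understates the population mean risk whenever susceptibility is right-skewed, which is exactly the argument made here.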
8. Sheehan MC, Burke TA, Breysse PN, Navas-Acien A, McGready J, Fox MA. Association of markers of chronic viral hepatitis and blood mercury levels in US reproductive-age women from NHANES 2001-2008: a cross-sectional study. Environ Health 2012; 11:62. PMID: 22970929; PMCID: PMC3511886; DOI: 10.1186/1476-069x-11-62
Abstract
BACKGROUND Methylmercury (MeHg) is a neurotoxin primarily found in seafood; exposures in reproductive-age women are of concern due to vulnerability of the developing fetus. MeHg is mainly eliminated via an enterohepatic cycle involving the liver and gallbladder. Dysfunction in these organs has been associated with slower MeHg elimination in laboratory animals. We hypothesized that women testing positive for chronic hepatitis B (HBV) or C (HCV), both associated with risk of longer-term liver and gallbladder impairment, would have higher total blood mercury (TBHg) concentrations than those negative for the viruses, reflecting slower MeHg elimination. METHODS Geometric mean (GM) TBHg levels from a representative sample of over 5,000 seafood-consuming, reproductive-age women from eight years (2001-2008) of the US NHANES survey were compared by viral hepatitis status (as determined by serological assay) using multiple linear regression. Adjustment was made for estimated MeHg intake from seafood consumption, social and demographic variables and other predictors. RESULTS Women with chronic HBV had 1.52 (95% CI 1.13, 2.05, p < 0.01) times the GM TBHg of women who had not come into contact with the virus. The positive association was strongest in those with most severe disease. A modest negative association was found with HCV markers. CONCLUSIONS While study design prevents inferences on causality, the finding that MeHg biomarkers differ by hepatitis status in this population suggests viral hepatitis may alter the pace of MeHg elimination. Offspring of HBV-infected seafood-consuming women may be at higher risk of MeHg-induced developmental delays than offspring of those uninfected. Possible reasons for the unanticipated negative association with HCV are explored.
Affiliation(s)
- Mary C Sheehan
- Department of Health Policy and Management, Johns Hopkins Bloomberg School of Public Health, Baltimore, MD, USA
- Thomas A Burke
- Department of Health Policy and Management, Johns Hopkins Bloomberg School of Public Health, Baltimore, MD, USA
- Patrick N Breysse
- Department of Environmental Health Sciences, Johns Hopkins Bloomberg School of Public Health, Baltimore, MD, USA
- Ana Navas-Acien
- Department of Environmental Health Sciences, Johns Hopkins Bloomberg School of Public Health, Baltimore, MD, USA
- John McGready
- Department of Statistics, Johns Hopkins Bloomberg School of Public Health, Baltimore, MD, USA
- Mary A Fox
- Department of Health Policy and Management, Johns Hopkins Bloomberg School of Public Health, Baltimore, MD, USA
9. Integrating susceptibility into environmental policy: an analysis of the national ambient air quality standard for lead. Int J Environ Res Public Health 2012; 9:1077-96. PMID: 22690184; PMCID: PMC3366601; DOI: 10.3390/ijerph9041077
Abstract
Susceptibility to chemical toxins has not been adequately addressed in risk assessment methodologies. As a result, environmental policies may fail to meet their fundamental goal of protecting the public from harm. This study examines how characterization of risk may change when susceptibility is explicitly considered in policy development; in particular we examine the process used by the U.S. Environmental Protection Agency (EPA) to set a National Ambient Air Quality Standard (NAAQS) for lead. To determine a NAAQS, EPA estimated air lead-related decreases in child neurocognitive function through a combination of multiple data elements including concentration-response (CR) functions. In this article, we present alternative scenarios for determining a lead NAAQS using CR functions developed in populations more susceptible to lead toxicity due to socioeconomic disadvantage. The use of CR functions developed in susceptible groups resulted in cognitive decrements greater than original EPA estimates. EPA’s analysis suggested that a standard level of 0.15 µg/m³ would fulfill decision criteria, but by incorporating susceptibility we found that options for the standard could reasonably be extended to lower levels. The use of data developed in susceptible populations would result in the selection of a more protective NAAQS under the same decision framework applied by EPA. Results are used to frame discussion regarding why cumulative risk assessment methodologies are needed to help inform policy development.
10.
Abstract
Under current guidelines, exposure guidelines for toxicants are determined by following one of two different tracks depending on whether the toxicant's mode of action (MOA) is believed to involve an exposure threshold. Although not denying the existence of thresholds, this paper points out problems with how the threshold concept and MOA is used in risk assessment. Thresholds are frequently described using imprecise terms that imply some unspecified increase in risk, which robs them of any meaning (any reasonable dose response will satisfy such a definition) and tacitly implies a value judgment about how large a risk is acceptable. MOA is generally used only to inform a threshold's existence and not its value. Often MOA is used only to conclude that the adverse effect requires an upstream cellular or biochemical response for which a threshold is simply assumed. Data to inform MOA often come from animals, which complicates evaluation of the role of human variation in genetic and environmental conditions, and the possible interaction of the toxicant with processes already producing background toxicity in humans. In response to these and other problems with the current two-track approach, this paper proposes a modified point of departure/safety factor approach to setting exposure guidelines for all toxicants. MOA and the severity of the toxic effect would be addressed using safety factors calculated from guidelines established by consensus and based on scientific judgment. The method normally would not involve quantifying low-dose risk, and would not require a threshold determination, although MOA information regarding the likelihood of a threshold could be used in setting safety factors.
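Computationally, the point-of-departure/safety-factor approach proposed here reduces to dividing a PoD by a product of factors. A minimal sketch; the factor names and values are illustrative assumptions, not the consensus-based guidelines the article calls for:

```python
def exposure_guideline(pod, factors):
    """PoD/SF approach: guideline = PoD divided by the product of
    safety factors. Factor names and values below are illustrative."""
    composite = 1.0
    for value in factors.values():
        composite *= value
    return pod / composite, composite

limit, composite = exposure_guideline(
    pod=10.0,  # e.g. a benchmark-dose lower bound, mg/kg-day
    factors={"interspecies": 10.0, "intraspecies": 10.0, "severity": 3.0},
)
print(f"composite SF = {composite:g}; guideline = {limit:.4f} mg/kg-day")
```

The article's proposal amounts to choosing the entries of `factors` from MOA and effect-severity judgments rather than from a threshold determination, so the mechanics stay this simple for all toxicants.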
Affiliation(s)
- Kenny S Crump
- Department of Mathematics and Statistics, Louisiana Tech University, Ruston, Louisiana 71272-0046, USA
11. Crump KS, Chiu WA, Subramaniam RP. Issues in using human variability distributions to estimate low-dose risk. Environ Health Perspect 2010; 118:387-93. PMID: 20064772; PMCID: PMC2854768; DOI: 10.1289/ehp.0901250
Abstract
BACKGROUND The National Research Council (NRC) Committee on Improving Risk Analysis Approaches Used by the U.S. EPA (Environmental Protection Agency) recommended that low-dose risks be estimated in some situations using human variability distributions (HVDs). HVD modeling estimates log-normal distributions from data on pharmacokinetic and pharmacodynamic variables that affect individual sensitivities to the toxic response. These distributions are combined into an overall log-normal distribution for the threshold dose (dose below which there is no contribution to a toxic response) by assuming the variables act independently and multiplicatively. This distribution is centered at a point-of-departure dose that is usually estimated from animal data. The resulting log-normal distribution is used to quantify low-dose risk. OBJECTIVE We examined the implications of various assumptions in HVD modeling for estimating low-dose risk. METHODS The assumptions and data used in HVD modeling were subjected to rigorous scrutiny. RESULTS We found that the assumption that the variables affecting human sensitivity vary log normally is not scientifically defensible. Other distributions that are equally consistent with the data provide very different estimates of low-dose risk. HVD modeling can also involve an assumption that a threshold dose defined by dichotomizing a continuous apical response has a log-normal distribution. This assumption is shown to be incompatible (except under highly specialized conditions) with assuming that the continuous apical response itself is log normal. However, the two assumptions can lead to very different estimates of low-dose risk. The assumption in HVD modeling that the threshold dose can be expressed as a function of a product of independent variables lacks phenomenological support. We provide an example that shows that this assumption is generally invalid. 
CONCLUSION In view of these problems, we recommend caution in the use of HVD modeling as a general approach to estimating low-dose risks from human exposures to toxic chemicals.
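The multiplicative-independence assumption examined here implies that the threshold dose is itself lognormal, with log-variance equal to the sum of the factors' log-variances, and this closed form is what low-dose risk estimates lean on. A Monte Carlo sketch checking the tail probability against that closed form; the PoD and GSD values are illustrative assumptions:

```python
import math
import random

random.seed(2)

pod = 100.0                 # illustrative point of departure
gsds = [1.8, 2.0, 1.5]      # illustrative GSDs of three sensitivity factors
sigmas = [math.log(g) for g in gsds]

n = 100_000
thresholds = []
for _ in range(n):
    factor = 1.0
    for s in sigmas:
        factor *= random.lognormvariate(0.0, s)
    thresholds.append(pod / factor)   # individual threshold dose

# Product of independent lognormals is lognormal with summed log-variances:
sigma_total = math.sqrt(sum(s * s for s in sigmas))
dose = 10.0
risk_sim = sum(t < dose for t in thresholds) / n
z = (math.log(dose) - math.log(pod)) / sigma_total
risk_theory = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
print(f"risk at dose {dose:g}: simulated {risk_sim:.4f}, "
      f"lognormal theory {risk_theory:.4f}")
```

Because the estimated risk at low doses lives entirely in this far tail, swapping the lognormal for another distribution equally consistent with the bulk of the data can change the answer by orders of magnitude, which is the crux of the authors' caution.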
Affiliation(s)
- Kenny S Crump
- Louisiana Tech University, Ruston, Louisiana 71272-0046, USA
12. Ginsberg G, Smolenski S, Neafsey P, Hattis D, Walker K, Guyton KZ, Johns DO, Sonawane B. The influence of genetic polymorphisms on population variability in six xenobiotic-metabolizing enzymes. J Toxicol Environ Health B Crit Rev 2009; 12:307-333. PMID: 20183525; DOI: 10.1080/10937400903158318
Abstract
This review provides variability statistics for polymorphic enzymes that are involved in the metabolism of xenobiotics. Six enzymes were evaluated: cytochrome P-450 (CYP) 2D6, CYP2E1, aldehyde dehydrogenase-2 (ALDH2), paraoxonase (PON1), glutathione transferases (GSTM1, GSTT1, and GSTP1), and N-acetyltransferases (NAT1 and NAT2). The polymorphisms were characterized with respect to (1) number and type of variants, (2) effects of polymorphisms on enzyme function, and (3) frequency of genotypes within specified human populations. This information was incorporated into Monte Carlo simulations to predict the population distribution and describe interindividual variability in enzyme activity. The results were assessed in terms of (1) role of these enzymes in toxicant activation and clearance, (2) molecular epidemiology evidence of health risk, and (3) comparing enzyme variability to that commonly assumed for pharmacokinetics. Overall, the Monte Carlo simulations indicated a large degree of interindividual variability in enzyme function, in some cases characterized by multimodal distributions. This study illustrates that polymorphic metabolizing systems are potentially important sources of pharmacokinetic variability, but there are a number of other factors including blood flow to liver and compensating pathways for clearance that affect how a specific polymorphism will alter internal dose and toxicity. This is best evaluated with the aid of physiologically based pharmacokinetic (PBPK) modeling. The population distribution of enzyme activity presented in this series of articles serves as inputs to such PBPK modeling analyses.
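A multimodal activity distribution arises naturally when genotype classes with distinct mean activities are mixed in genotype-frequency proportions, which is the essence of the Monte Carlo approach described. A sketch with hypothetical genotype frequencies and activity levels, not values from the review:

```python
import random

random.seed(3)

# Illustrative three-genotype polymorphism: (label, frequency, mean, sd).
# All numbers are hypothetical, not values from the review.
genotypes = [
    ("*1/*1 extensive", 0.70, 1.00, 0.25),
    ("*1/*x intermediate", 0.25, 0.50, 0.15),
    ("*x/*x poor", 0.05, 0.05, 0.02),
]

def sample_activity():
    """Draw one individual's relative enzyme activity from the mixture."""
    r = random.random()
    cum = 0.0
    for _, freq, mu, sd in genotypes:
        cum += freq
        if r < cum:
            return max(0.0, random.gauss(mu, sd))
    return max(0.0, random.gauss(*genotypes[-1][2:]))  # rounding fallback

pop = [sample_activity() for _ in range(50_000)]
poor = sum(a < 0.15 for a in pop) / len(pop)
print(f"fraction with <15% of normal activity: {poor:.3f}")
```

The low-activity mode here is dominated by the rare homozygous genotype, illustrating why genotype frequencies, rather than a single lognormal spread, drive the tails of such distributions.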
Affiliation(s)
- Gary Ginsberg
- Connecticut Department of Public Health, Hartford, 06134, USA.
13
van der Voet H, van der Heijden GWAM, Bos PMJ, Bosgra S, Boon PE, Muri SD, Brüschweiler BJ. A model for probabilistic health impact assessment of exposure to food chemicals. Food Chem Toxicol 2008; 47:2926-40. [PMID: 19150381 DOI: 10.1016/j.fct.2008.12.027] [Citation(s) in RCA: 30] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/08/2008] [Revised: 11/26/2008] [Accepted: 12/17/2008] [Indexed: 01/08/2023]
Abstract
A statistical model is presented extending the integrated probabilistic risk assessment (IPRA) model of van der Voet and Slob [van der Voet, H., Slob, W., 2007. Integration of probabilistic exposure assessment and probabilistic hazard characterisation. Risk Analysis, 27, 351-371]. The aim is to characterise the health impact due to one or more chemicals present in food causing one or more health effects. For chemicals with hardly any measurable safety problems we propose health impact characterisation by margins of exposure. In this probabilistic model not one margin of exposure is calculated, but rather a distribution of individual margins of exposure (IMoE) which allows quantifying the health impact for small parts of the population. A simple bar chart is proposed to represent the IMoE distribution and a lower bound (IMoEL) quantifies uncertainties in this distribution. It is described how IMoE distributions can be combined for dose-additive compounds and for different health effects. Health impact assessment critically depends on a subjective valuation of the health impact of a given health effect, and possibilities to implement this health impact valuation step are discussed. Examples show the possibilities of health impact characterisation and of integrating IMoE distributions. The paper also includes new proposals for modelling variable and uncertain factors describing food processing effects and intraspecies variation in sensitivity.
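One common way to combine individual margins of exposure under dose addition (the abstract does not give the authors' exact formula; the reciprocal-sum rule below is the form used, e.g., for a combined margin of exposure) can be sketched as:

```python
# Under dose addition, individual margins of exposure combine like
# parallel resistors (the reciprocal of the summed reciprocals), so the
# combined IMoE is always smaller than the smallest single-compound
# margin. All values are invented.
def combined_imoe(margins):
    return 1.0 / sum(1.0 / m for m in margins)

single = [120.0, 45.0, 300.0]   # per-compound IMoE for one individual
total = combined_imoe(single)   # ~29.5, below min(single) = 45
```

Applied per simulated individual, this turns per-compound IMoE distributions into a single mixture IMoE distribution.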
Affiliation(s)
- Hilko van der Voet
- Biometris, Wageningen University and Research Centre, P.O. Box 100, 6700 AC Wageningen, Netherlands.
14
Rhomberg LR, Baetcke K, Blancato J, Bus J, Cohen S, Conolly R, Dixit R, Doe J, Ekelman K, Fenner-Crisp P, Harvey P, Hattis D, Jacobs A, Jacobson-Kram D, Lewandowski T, Liteplo R, Pelkonen O, Rice J, Somers D, Turturro A, West W, Olin S. Issues in the Design and Interpretation of Chronic Toxicity and Carcinogenicity Studies in Rodents: Approaches to Dose Selection. Crit Rev Toxicol 2008; 37:729-837. [DOI: 10.1080/10408440701524949] [Citation(s) in RCA: 43] [Impact Index Per Article: 2.5] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/12/2022]
15
Ryker SJ, Small MJ. Combining occurrence and toxicity information to identify priorities for drinking-water mixture research. RISK ANALYSIS : AN OFFICIAL PUBLICATION OF THE SOCIETY FOR RISK ANALYSIS 2008; 28:653-666. [PMID: 18643823 DOI: 10.1111/j.1539-6924.2008.00985.x] [Citation(s) in RCA: 8] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/26/2023]
Abstract
Characterizing all possible chemical mixtures in drinking water is a potentially overwhelming project, and the task of assessing each mixture's net toxicity even more daunting. We propose that analyzing occurrence information on mixtures in drinking water may help to narrow the priorities and inform the approaches taken by researchers in mixture toxicology. To illustrate the utility of environmental data for refining the mixtures problem, we use a recent compilation of national ground-water-quality data to examine proposed U.S. Environmental Protection Agency (EPA) and Agency for Toxic Substances and Disease Registry (ATSDR) models of noncancer mixture toxicity. We use data on the occurrence of binary and ternary mixtures of arsenic, cadmium, and manganese to parameterize an additive model and compute hazard index scores for each drinking-water source in the data set. We also use partially parameterized interaction models to perform a bounding analysis estimating the interaction potential of several binary and ternary mixtures for which the toxicological literature is limited. From these results, we estimate a relative value of additional toxicological information for each mixture. For example, we find that according to the U.S. EPA's interaction model, the levels of arsenic and cadmium found in U.S. drinking water are unlikely to have synergistic cardiovascular effects, but the same mixture's potential for synergistic neurological effects merits further study. Similar analysis could in future be used to prioritize toxicological studies based on their potential to reduce scientific and regulatory uncertainty. Environmental data may also provide a means to explore the implications of alternative risk models for the toxicity and interaction of complex mixtures.
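The additive screening model described above reduces to a hazard index per water source; a minimal sketch follows, with hypothetical reference levels rather than EPA or ATSDR values:

```python
# Additive (hazard-index) model for one drinking-water source:
# HI = sum over chemicals of concentration / reference level.
# Reference levels below are hypothetical, not EPA or ATSDR values.
REFERENCE_UG_PER_L = {"arsenic": 10.0, "cadmium": 5.0, "manganese": 300.0}

def hazard_index(sample_ug_per_l):
    """Sum of hazard quotients; HI >= 1 flags the mixture for attention."""
    return sum(conc / REFERENCE_UG_PER_L[chem]
               for chem, conc in sample_ug_per_l.items())

well = {"arsenic": 6.0, "cadmium": 1.0, "manganese": 120.0}
hi = hazard_index(well)   # 0.6 + 0.2 + 0.4 = 1.2 -> flagged
```

Interaction-based variants replace the simple sum with weighted terms, which is where the bounding analysis in the paper comes in.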
Affiliation(s)
- Sarah J Ryker
- Department of Engineering and Public Policy, Carnegie Mellon University, Pittsburgh, PA 15213, USA.
16
Falk-Filipsson A, Hanberg A, Victorin K, Warholm M, Wallén M. Assessment factors--applications in health risk assessment of chemicals. ENVIRONMENTAL RESEARCH 2007; 104:108-27. [PMID: 17166493 DOI: 10.1016/j.envres.2006.10.004] [Citation(s) in RCA: 44] [Impact Index Per Article: 2.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 10/28/2005] [Revised: 10/03/2006] [Accepted: 10/17/2006] [Indexed: 05/13/2023]
Abstract
We review the scientific basis for default assessment factors used in risk assessment of nongenotoxic chemicals, including the use of chemical- and pathway-specific assessment factors, and extrapolation approaches relevant to species differences, age and gender. One main conclusion is that the conventionally used default factor of 100 does not cover all inter-species and inter-individual differences. We suggest that a species-specific default factor based on allometric scaling (basal metabolic rate) be used for inter-species extrapolation. Regarding toxicodynamic and remaining toxicokinetic differences, we suggest that a percentile from a probabilistic distribution be chosen to derive the assessment factor. Based on the scarce information concerning human-to-human variability, it is more difficult to suggest a specific assessment factor. However, extra emphasis should be put on sensitive populations such as neonates and genetically sensitive subgroups, and also on fetuses and children, who may be particularly vulnerable during development and maturation. Factors that also need to be allowed for are possible gender differences in sensitivity, deficiencies in the databases, the nature of the effect, duration of exposure, and route-to-route extrapolation. Since assessment factors are used to compensate for lack of knowledge, we feel it is prudent to adopt a "conservative" approach, erring on the side of protectiveness.
17
van der Voet H, Slob W. Integration of probabilistic exposure assessment and probabilistic hazard characterization. RISK ANALYSIS : AN OFFICIAL PUBLICATION OF THE SOCIETY FOR RISK ANALYSIS 2007; 27:351-71. [PMID: 17511703 DOI: 10.1111/j.1539-6924.2007.00887.x] [Citation(s) in RCA: 57] [Impact Index Per Article: 3.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/15/2023]
Abstract
A method is proposed for integrated probabilistic risk assessment where exposure assessment and hazard characterization are both included in a probabilistic way. The aim is to specify the probability that a random individual from a defined (sub)population will have an exposure high enough to cause a particular health effect of a predefined magnitude, the critical effect size (CES). The exposure level that results in exactly that CES in a particular person is that person's individual critical effect dose (ICED). Individuals in a population typically show variation, both in their individual exposure (IEXP) and in their ICED. Both the variation in IEXP and the variation in ICED are quantified in the form of probability distributions. Assuming independence between both distributions, they are combined (by Monte Carlo) into a distribution of the individual margin of exposure (IMoE). The proportion of the IMoE distribution below unity is the probability of critical exposure (PoCE) in the particular (sub)population. Uncertainties involved in the overall risk assessment (i.e., both regarding exposure and effect assessment) are quantified using Monte Carlo and bootstrap methods. This results in an uncertainty distribution for any statistic of interest, such as the probability of critical exposure (PoCE). The method is illustrated based on data for the case of dietary exposure to the organophosphate acephate. We present plots that concisely summarize the probabilistic results, retaining the distinction between variability and uncertainty. We show how the relative contributions from the various sources of uncertainty involved may be quantified.
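The core IMoE construction defined in this abstract is direct to sketch: sample IEXP and ICED independently, form IMoE = ICED/IEXP, and take PoCE as the fraction below one. The distribution parameters below are invented, not the acephate case-study inputs.

```python
import random

random.seed(7)

# Variability in individual exposure (IEXP) and individual critical
# effect dose (ICED) as independent lognormals; parameters are invented.
N = 100_000
iexp = [random.lognormvariate(0.0, 1.0) for _ in range(N)]  # exposure (arbitrary units)
iced = [random.lognormvariate(4.0, 0.5) for _ in range(N)]  # dose giving exactly the CES

# Individual margin of exposure and probability of critical exposure
imoe = [c / e for c, e in zip(iced, iexp)]
poce = sum(m < 1.0 for m in imoe) / N
```

Wrapping this whole simulation in an outer bootstrap loop over the input-distribution parameters yields the uncertainty distribution of PoCE that the paper describes.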
Affiliation(s)
- Hilko van der Voet
- Wageningen UR, Biometris and RIKILT Institute of Food Safety, Wageningen, The Netherlands.
18
Schneider K, Schuhmacher-Wolz U, Hassauer M, Darschnik S, Elmshäuser E, Mosbach-Schulz O. A probabilistic effect assessment model for hazardous substances at the workplace. Regul Toxicol Pharmacol 2006; 44:172-81. [PMID: 16356615 DOI: 10.1016/j.yrtph.2005.11.001] [Citation(s) in RCA: 8] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/16/2005] [Indexed: 11/23/2022]
Abstract
A major problem in risk assessment is the quantification of uncertainties. A probabilistic model was developed to consider uncertainties in the effect assessment of hazardous substances at the workplace. Distributions for extrapolation factors (time extrapolation, inter- and intraspecies extrapolation) were determined on the basis of appropriate empirical data. Together with the distribution for the benchmark dose obtained from substance-specific dose-response modelling for the exemplary substances 2,4,4-trimethylpentene (TMP) and aniline, they represent the input distributions for probabilistic modelling. These distributions were combined by Monte Carlo simulation. The resulting target distribution describes the probability that an aspired protection level for workers is achieved at a certain dose and the uncertainty associated with the assessment. In the case of aniline, substance-specific data on differences in susceptibility (between species; among humans due to genetic polymorphisms of N-acetyltransferase) were integrated in the model. Medians of the obtained target distributions of the basic models for TMP and aniline, but not of the specific aniline model are similar to deterministically derived reference values. Differences of more than one order of magnitude between the medians and the 5th percentile of the target distributions indicate substantial uncertainty associated with the effect assessment of these substances. The probabilistic effect assessment model proves to be a practical tool to integrate quantitative information on uncertainty and variability in hazard characterisation.
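The chain of the probabilistic model -- a benchmark-dose uncertainty distribution divided by sampled extrapolation-factor distributions -- can be sketched as below; every distribution parameter is an invented placeholder, not a TMP or aniline input.

```python
import math
import random

random.seed(3)

N = 50_000

def lognormal(gm, gsd):
    """Lognormal draw parameterised by geometric mean and geometric SD."""
    return random.lognormvariate(math.log(gm), math.log(gsd))

# Monte Carlo combination of the input distributions; all numbers invented.
target = sorted(
    lognormal(100.0, 1.5)                      # benchmark dose (mg/m3)
    / (lognormal(2.0, 2.0)                     # subchronic-to-chronic time factor
       * lognormal(4.0, 2.0)                   # animal-to-human factor
       * lognormal(3.0, 1.8))                  # average-to-sensitive-worker factor
    for _ in range(N)
)
median = target[N // 2]   # comparable to a deterministically derived value
p5 = target[N // 20]      # 5th percentile: the conservative end
```

A gap of an order of magnitude or more between `median` and `p5`, as reported for the example substances, signals substantial uncertainty in the effect assessment.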
Affiliation(s)
- K Schneider
- Forschungs- und Beratungsinstitut Gefahrstoffe GmbH (FoBiG), D-79098 Freiburg, Germany.
19
LaKind JS, Brent RL, Dourson ML, Kacew S, Koren G, Sonawane B, Tarzian AJ, Uhl K. Human milk biomonitoring data: interpretation and risk assessment issues. JOURNAL OF TOXICOLOGY AND ENVIRONMENTAL HEALTH. PART A 2005; 68:1713-69. [PMID: 16176917 DOI: 10.1080/15287390500225724] [Citation(s) in RCA: 9] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/04/2023]
Abstract
Biomonitoring data can, under certain conditions, be used to describe potential risks to human health (for example, blood lead levels used to determine children's neurodevelopmental risk). At present, there are very few chemical exposures at low levels for which sufficient data exist to state with confidence the link between levels of environmental chemicals in a person's body and his or her risk of adverse health effects. Human milk biomonitoring presents additional complications. Human milk can be used to obtain information on both the levels of environmental chemicals in the mother and her infant's exposure to an environmental chemical. However, in terms of the health of the mother, there are little to no extant data that can be used to link levels of most environmental chemicals in human milk to a particular health outcome in the mother. This is because, traditionally, risks are estimated based on dose, rather than on levels of environmental chemicals in the body, and the relationship between dose and human tissue levels is complex. On the other hand, for the infant, some information on dose is available because the infant is exposed to environmental chemicals in milk as a "dose" from which risk estimates can be derived. However, the traditional risk assessment approach is not designed to consider the benefits to the infant associated with breastfeeding and is complicated by the relatively short-term exposures to the infant from breastfeeding. A further complexity derives from the addition of in utero exposures, which complicates interpretation of epidemiological research on health outcomes of breastfeeding infants. Thus, the concept of "risk assessment" as it applies to human milk biomonitoring is not straightforward, and methodologies for undertaking this type of assessment have not yet been fully developed. 
This article describes the deliberations of the panel convened for the Technical Workshop on Human Milk Surveillance and Biomonitoring for Environmental Chemicals in the United States, held at the Hershey Medical Center, Pennsylvania State College of Medicine, on several issues related to risk assessment and human milk biomonitoring. Discussion of these topics and the thoughts and conclusions of the panel are described in this article.
Affiliation(s)
- Judy S LaKind
- Department of Pediatrics, Milton S. Hershey Medical Center, Pennsylvania State University, College of Medicine, Hershey, PA 17033, USA.
20
Holsapple MP, Paustenbach DJ, Charnley G, West LJ, Luster MI, Dietert RR, Burns-Naas LA. Symposium summary: children's health risk--what's so special about the developing immune system? Toxicol Appl Pharmacol 2004; 199:61-70. [PMID: 15289091 DOI: 10.1016/j.taap.2004.03.003] [Citation(s) in RCA: 45] [Impact Index Per Article: 2.1] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/01/2004] [Accepted: 03/02/2004] [Indexed: 10/26/2022]
Abstract
In recent years, there has been increasing regulatory pressure to protect the health of children, with the basic tenet being that children differ significantly from adults in their biological or physiological responses to chemical exposures. In a regulatory context, this has been translated to mean a requirement for an additional 10-fold safety factor for environmental contaminants, specialized tests, or both. Much of the initial focus has been on the developing endocrine and nervous systems; but increasingly, the developing immune system has been identified as a potential target organ for chemically mediated toxicity. More recently, the question has been raised regarding whether the current state of science supports the creation of developmental immunotoxicology (DIT) test guidelines. What is needed is a risk-based evaluation of the biology associated with the proposed differential sensitivity between children and adults and the impact of that assessment on additional regulatory measures to protect children in risk assessment analyses. Additionally, an understanding of whether the developing immune system shows greater susceptibility, either qualitatively or quantitatively, to chemical perturbation is critical. To address the question "What's so special about the developing immune system?" a symposium was organized for the 2003 Society of Toxicology annual meeting that brought together risk assessors, clinicians, immunologists, and toxicologists.
Affiliation(s)
- Michael P Holsapple
- ILSI Health and Environmental Sciences Institute, Washington, DC 20005-5802, USA.
21
Krewski D, Bakshi K, Garrett R, Falke E, Rusch G, Gaylor D. Development of acute exposure guideline levels for airborne exposures to hazardous substances. Regul Toxicol Pharmacol 2004; 39:184-201. [PMID: 15041148 DOI: 10.1016/j.yrtph.2003.11.009] [Citation(s) in RCA: 17] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/28/2003] [Accepted: 11/11/2003] [Indexed: 10/26/2022]
Abstract
Hazardous substances can be released into the atmosphere by industrial and transportation accidents, fires, tornadoes, earthquakes, and terrorist attacks, exposing workers and the nearby public to potential adverse health effects. Various enforceable guidelines have been set by regulatory agencies for worker and ambient air quality. However, these exposure levels generally are not applicable to rare acute exposures, which may occur at high concentrations. Acute exposure guideline levels (AEGLs) estimate the airborne concentrations over an array of short exposure durations that could cause mild (AEGL-1), severe, irreversible, potentially disabling (AEGL-2), or life-threatening (AEGL-3) adverse health effects. These levels can help emergency responders and planners reduce or eliminate potential risks to the public. This paper reviews the procedures and methodologies for deriving AEGLs developed in the United States by the National Advisory Committee for Acute Exposure Guidelines for Hazardous Substances, with direct input from international representatives of OECD member countries, and reviewed by the National Research Council. Techniques for extrapolating effects across different exposure durations are discussed. AEGLs provide a viable approach for assisting in the prevention of, planning for, and response to acute airborne exposures to toxic agents.
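Extrapolation across exposure durations in AEGL work commonly uses the ten Berge relation C^n · t = k, with a chemical-specific exponent n; a minimal sketch with illustrative numbers:

```python
# Ten Berge relation C**n * t = k: concentrations causing equal severity
# at different durations. The exponent n is chemical-specific (typically
# fitted between about 1 and 3); all numbers here are illustrative.
def scale_duration(c1, t1, t2, n):
    """Concentration at duration t2 matching the severity of c1 at t1."""
    return c1 * (t1 / t2) ** (1.0 / n)

# 240 ppm for 30 min -> equivalent 4-h (240 min) concentration with n = 2
c_4h = scale_duration(240.0, 30.0, 240.0, 2.0)   # ~84.9 ppm
```

Because n is fitted per chemical and endpoint, extrapolating with a default exponent adds uncertainty that AEGL derivations address with additional safety margins.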
22
Kalberlah F, Schneider K, Schuhmacher-Wolz U. Uncertainty in toxicological risk assessment for non-carcinogenic health effects. Regul Toxicol Pharmacol 2003; 37:92-104. [PMID: 12662913 DOI: 10.1016/s0273-2300(02)00032-6] [Citation(s) in RCA: 34] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/22/2022]
Abstract
Uncertainty in risk assessment results from the lack of knowledge on toxicity to the target population for a substance. Currently used deterministic risk assessment methods yield human limit values or margins of safety (MOS) without quantitative measurements of uncertainty. Qualitative and quantitative uncertainty analysis would enable risk managers to better judge the consequences of different management options. This article discusses sources of uncertainty and possibilities for quantification of uncertainty associated with different steps in the risk assessment of non-carcinogenic health effects. Knowledge gaps causing uncertainty in risk assessment are overcome by extrapolation. Distribution functions for extrapolation factors are based on empirical data and provide information about the extent of uncertainty introduced by these factors. Whereas deterministic methods can account only qualitatively for uncertainty of the resulting human limit value, probabilistic risk assessment methods are able to quantify several aspects of uncertainty. However, there is only limited experience with these methods in practice. Their acceptance and future application will depend on the establishment of evidence based distribution functions, flexibility and practicability of the methods, and the unambiguity of the results.
Affiliation(s)
- Fritz Kalberlah
- Research and Advisory Institute on Hazardous Substances (FoBiG), Werderring 16, D-79098 Freiburg, Germany.
23
Ginsberg G, Smolenski S, Hattis D, Sonawane B. Population distribution of aldehyde dehydrogenase-2 genetic polymorphism: implications for risk assessment. Regul Toxicol Pharmacol 2002; 36:297-309. [PMID: 12473414 DOI: 10.1006/rtph.2002.1591] [Citation(s) in RCA: 23] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/02/2023]
Abstract
The role of genetic polymorphisms in modulating xenobiotic metabolism and susceptibility to cancer and other health effects has been suggested in numerous studies. However, risk assessments have generally not used this information to characterize population variability or adjust risks for susceptible subgroups. This paper focuses upon the aldehyde dehydrogenase-2 (ALDH2) system because it exemplifies the pivotal role genetic polymorphisms can play in determining enzyme function and susceptibility. Allelic variants in ALDH2 cause decreased ability to clear acetaldehyde and other aldehyde substrates, with homozygous variants (ALDH2*2/2) having no activity and heterozygotes (ALDH2*1/2) having intermediate activity relative to the predominant wild type (ALDH2*1/1). These polymorphisms are associated with increased buildup of acetaldehyde following ethanol ingestion and increased immediate symptoms (flushing syndrome) and long-term cancer risks. We have used Monte Carlo simulation to characterize the population distribution of ALDH2 allelic variants and inter-individual variability in aldehyde internal dose. The nonfunctional allele is rare in most populations, but is common in Asians such that 40% are heterozygotes and 5% are homozygote variants. The ratio of the 95th or 99th percentiles of the Asian population compared to the median of the U.S. population is 14- to 26-fold, a variability factor that is larger than the default pharmacokinetic uncertainty factor (3.2-fold) commonly used in risk assessment. Approaches are described for using ALDH2 population distributions in physiologically based pharmacokinetic-Monte Carlo refinements of risk assessments for xenobiotics which are metabolized to aldehyde intermediates (e.g., ethanol, toluene, ethylene glycol monomethyl ether).
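The percentile-ratio comparison can be sketched as follows. The genotype frequencies (40% heterozygote, 5% homozygous variant in the Asian subpopulation) are from the abstract; the relative activities, lognormal spread, and the simple inverse mapping from activity to internal dose are simplifying assumptions, not the paper's model.

```python
import math
import random

random.seed(11)

# Frequencies from the abstract; activities and spread are illustrative.
FREQS = [
    (0.55, 1.00),   # ALDH2*1/1 wild type
    (0.40, 0.40),   # ALDH2*1/2, intermediate activity (assumed)
    (0.05, 0.02),   # ALDH2*2/2, little residual activity (assumed)
]
GSD = 1.3  # within-genotype geometric standard deviation (assumed)

def sample_internal_dose():
    """Internal acetaldehyde dose, taken as inversely proportional to
    the individual's sampled ALDH2 activity."""
    r, cum = random.random(), 0.0
    for freq, act in FREQS:
        cum += freq
        if r <= cum:
            return 1.0 / random.lognormvariate(math.log(act), math.log(GSD))
    return 1.0 / random.lognormvariate(math.log(FREQS[-1][1]), math.log(GSD))

doses = sorted(sample_internal_dose() for _ in range(100_000))
p99_over_median = doses[99_000] / doses[50_000]
# Compare against the default pharmacokinetic uncertainty factor of 3.2
exceeds_default = p99_over_median > 3.2
```

Even this simplified mixture reproduces the paper's qualitative point: the upper-percentile-to-median ratio comfortably exceeds the 3.2-fold default pharmacokinetic factor.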
Affiliation(s)
- Gary Ginsberg
- Connecticut Department of Public Health, Hartford, 06134, USA.
24
Abstract
Dozens of chemicals, both natural and manmade, are often found in drinking water. Some, such as the natural contaminants uranium and arsenic, are well-known toxicants with a large toxicology database. Other chemicals, such as methyl tertiary-butyl ether (MTBE) from leaking fuel tanks, we learn about as we go along. For still others, such as the alkyl benzenes, there are very little available data, and few prospects of obtaining more. In some cases, chemicals are purposely added to drinking water for beneficial purposes (e.g., chlorine, fluoride, alum), which may cause a countervailing hazard. Removing all potentially toxic chemicals from the water is virtually impossible and is precluded for beneficial uses and for economic reasons. Determination of safe levels of chemicals in drinking water merges the available toxicity data with exposure and human effect assumptions into detailed hazard assessments. This process should incorporate as much conservatism as is needed to allow for uncertainty in the toxicity and exposure estimates. Possible sensitive subpopulations such as unborn children, infants, the elderly, and those with common diseases such as impaired kidney function must also be considered. However, the range of sensitivity and the variability of toxicity and exposure parameters can never be fully documented. In addition, the validity of the low-dose extrapolations, and whether the toxic effect found in animals occurs at all in humans, is never clear. This publication discusses how these competing needs and uncertainties intersect in the development of Public Health Goals for uranium, fluoride, arsenic, perchlorate, and other highly debated chemicals.
Affiliation(s)
- Robert A Howd
- Office of Environmental Health Hazard Assessment, California Environmental Protection Agency, Oakland, California 94612, USA.
25
Dourson M, Charnley G, Scheuplein R. Differential sensitivity of children and adults to chemical toxicity. II. Risk and regulation. Regul Toxicol Pharmacol 2002; 35:448-67. [PMID: 12202058 DOI: 10.1006/rtph.2002.1559] [Citation(s) in RCA: 60] [Impact Index Per Article: 2.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/22/2022]
Abstract
Animals can be useful predictors of chemical hazards to humans. Growth and development are compressed into a shorter period in animals, which makes interpretation of animal testing inherently more difficult. However, similar events occur in both humans and laboratory animals and testing that covers the full period of animal development can reasonably be considered an appropriate surrogate for human development. Some have proposed an additional 10-fold factor for the extra protection of children when estimating safe exposures. Use of such an additional factor, as required by the Food Quality Protection Act (FQPA), is meant to address the same issues covered by the EPA's database uncertainty factor, UF(D), and additional issues related to exposure uncertainty. Thus, when UF(D) has already been deployed, the EPA modifies its use of the FQPA factor. Based on our analysis, we agree with the EPA. Drawing conclusions about the adequacy of UF(H), the uncertainty factor used to account for intrahuman variability, in terms of its ability to protect children on the basis of the modest data available is challenging. However, virtually all studies available suggest that a high percentage of the population, including children, is protected by using a 10-fold uncertainty factor for human variability or by using a 3.16-fold factor each for toxicokinetic and toxicodynamic variability. Based on specific comparisons for newborns, infants, children, adults, and those with severe disease, the population protected is between 60 and 100%, with the studies in larger populations that include sensitive individuals suggesting that the value is closer to 100%.
Affiliation(s)
- Michael Dourson
- Toxicology Excellence for Risk Assessment, Cincinnati, Ohio 45223, USA
26
Haber LT, Maier A, Gentry PR, Clewell HJ, Dourson ML. Genetic polymorphisms in assessing interindividual variability in delivered dose. Regul Toxicol Pharmacol 2002; 35:177-97. [PMID: 12052003 DOI: 10.1006/rtph.2001.1517] [Citation(s) in RCA: 43] [Impact Index Per Article: 1.9] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/22/2022]
Abstract
Increasing sophistication in methods used to account for human variability in susceptibility to toxicants has been one of the success stories in the continuing evolution of risk assessment science. Genetic polymorphisms have been suggested as an important contributor to overall human variability. Recently, data on polymorphisms in metabolic enzymes have been integrated with physiologically based pharmacokinetic (PBPK) modeling as an approach to determining the resulting overall variability. We present an analysis of the potential contribution of polymorphisms in enzymes modulating the disposition of four diverse compounds: methylene chloride, warfarin, parathion, and dichloroacetic acid. Through these case studies, we identify key uncertainties likely to be encountered in the use of polymorphism data and highlight potential simplifying assumptions that might be required to test the hypothesis that genetic factors are a substantive source of human variability in susceptibility to environmental toxicants. These uncertainties include (1) the relative contribution of multiple enzyme systems, (2) the extent of induction/inhibition through coexposure, (3) allelic frequencies of major ethnic groups, (4) the absence of chemical-specific data on the kinetic parameters for the different allelic forms of key enzymes, (5) large numbers of low-frequency alleles, and (6) uncertainty regarding differences between in vitro and in vivo kinetic data. Our effort sets the stage for the acquisition of critical data and further integration of polymorphism data with PBPK modeling as a means to quantitate population variability.
Affiliation(s)
- L T Haber
- Toxicology Excellence for Risk Assessment, 1757 Chase Avenue, Cincinnati, OH 45223, USA.
27
Edler L, Poirier K, Dourson M, Kleiner J, Mileson B, Nordmann H, Renwick A, Slob W, Walton K, Würtzen G. Mathematical modelling and quantitative methods. Food Chem Toxicol 2002; 40:283-326. [PMID: 11893400 DOI: 10.1016/s0278-6915(01)00116-8] [Citation(s) in RCA: 91] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/29/2022]
Abstract
The present review reports on the mathematical methods and statistical techniques presently available for hazard characterisation. The state of the art of mathematical modelling and quantitative methods used currently for regulatory decision-making in Europe and additional potential methods for risk assessment of chemicals in food and diet are described. Existing practices of JECFA, FDA, EPA, etc., are examined for their similarities and differences. A framework is established for the development of new and improved quantitative methodologies. Areas for refinement, improvement and increase of efficiency of each method are identified in a gap analysis. Based on this critical evaluation, needs for future research are defined. It is concluded from our work that mathematical modelling of the dose-response relationship would improve the risk assessment process. An adequate characterisation of the dose-response relationship by mathematical modelling clearly requires the use of a sufficient number of dose groups to achieve a range of different response levels. This need not necessarily lead to an increase in the total number of animals in the study if an appropriate design is used. Chemical-specific data relating to the mode or mechanism of action and/or the toxicokinetics of the chemical should be used for dose-response characterisation whenever possible. It is concluded that a single method of hazard characterisation would not be suitable for all kinds of risk assessments, and that a range of different approaches is necessary so that the method used is the most appropriate for the data available and for the risk characterisation issue. Future refinements to dose-response characterisation should incorporate more clearly the extent of uncertainty and variability in the resulting output.
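As one concrete instance of the dose-response modelling advocated above, the benchmark dose for a given critical effect size has a closed form under a simple exponential model (one member of the model family used in this context); the slope and CES values below are hypothetical:

```python
import math

# For an exponential dose-response model y(d) = a * exp(b * d), the
# benchmark dose at critical effect size CES (a relative change of
# (1 + CES) from background) has a closed form. Parameters hypothetical.
def bmd_exponential(b, ces):
    """Dose at which the response differs from background by (1 + CES)."""
    return math.log(1.0 + ces) / b

bmd05 = bmd_exponential(b=0.012, ces=0.05)   # ~4.07 (dose in units of 1/b)
```

In practice the slope b carries fitting uncertainty, so the BMD enters probabilistic assessments as a distribution (e.g., via its confidence limits) rather than a point value.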
Affiliation(s)
- L Edler
- Deutsches Krebsforschungszentrum, German Cancer Research Center, Abteilung Biostatistik R 0700, Postfach 10 19 49, D-69009, Heidelberg, Germany
28
Charnley G, Putzrath RM. Children's health, susceptibility, and regulatory approaches to reducing risks from chemical carcinogens. ENVIRONMENTAL HEALTH PERSPECTIVES 2001; 109:187-92. [PMID: 11266331 PMCID: PMC1240641 DOI: 10.1289/ehp.01109187] [Citation(s) in RCA: 28] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/03/2023]
Abstract
Risk-based regulation of chemical exposures from the environment generally relies on assumptions about the extent of people's susceptibility to chemically induced diseases. Those assumptions are intended to be health-protective; that is, they err on the side of overstating susceptibility. Recent concern about children's special susceptibilities has led to proposals that would make risk-based regulations one-tenth more stringent, unless data are available to refute the assumption that children are more susceptible than adults. In this paper we highlight some of the questions that should be addressed in the context of risk assessment to determine whether such increased stringency would accomplish the desired result of improving children's health. In particular, characterizing benefits of greater stringency requires more information about dose-response relationships than is currently available. Lowering regulatory levels has attendant costs but may not achieve benefits, for example, if the previous level were already below an actual or practical threshold. Without an ability to understand the potential benefit (or lack thereof) of the additional stringency, an appropriate consideration of benefits and costs is not possible.
Affiliation(s)
- G Charnley
- HealthRisk Strategies, Washington, DC 20003, USA.