26. Stang JL, DeVries PA, Klein LR, Cole JB, Martel M, Reing ML, Raiter AM, Driver BE. Medical needs of emergency department patients presenting with acute alcohol and drug intoxication. Am J Emerg Med 2021;42:38-42. PMID: 33440329. DOI: 10.1016/j.ajem.2020.12.079.
Abstract
STUDY OBJECTIVE Some contend that patients with acute alcohol or illicit substance intoxication should be treated in outpatient detoxification centers rather than in the ED. However, these patients often have underlying acute medical needs. We sought to determine the frequency of medical interventions required by ED patients with alcohol or illicit substance intoxication. METHODS This was a prospective observational study of consecutive patients presenting to an urban tertiary care ED with altered mental status due to alcohol or illicit substance use. We collected data for patients deemed low-risk for complications, defined as those receiving care in an intoxication observation unit. Trained staff observed and recorded all medical interventions, including medications administered, diagnostic testing, procedures performed, and airway interventions. The incidence of agitation was recorded using the Altered Mental Status Scale (AMSS, an ordinal scale from -4 to +4, where +4 is most agitated). The data analysis is descriptive. RESULTS This analysis included 2685 encounters (1645 unique patients; median age 39; 73% male) from January to May 2019. Average breath alcohol concentration was 0.20 g/dL (range 0.00-0.47). Alcohol intoxication was present in 89% of encounters, and in 17% of encounters the patient was suspected or known to have drug intoxication (either alone or in conjunction with alcohol use). On arrival to the ED, 372 (14%) had agitation (AMSS +1 or higher) and 32 (1%) were profoundly agitated (AMSS +4). In total, 1526 (56%) received at least one intervention that could not be provided by a local detoxification or sobering facility. Of the study population, 955 (36%) received a sedating medication, 903 (34%) required physical restraints for patient or staff safety, 575 (21%) underwent imaging studies, 318 (12%) underwent laboratory testing, and 367 (13%) received another intervention (IV access, ECG, splinting, wound care, etc.). Additionally, 111 (4%) patients received an airway intervention (19 intubation, 23 nasal airway, 85 supplemental oxygen) and 275 (10%) required repositioning to protect the airway. There were 168 (6%) patients admitted to the hospital. CONCLUSION In this population of relatively low-risk ED patients with drug and alcohol intoxication, a substantial proportion received medical interventions.
27. Cole JB, Lee SC, Martel ML, Smith SW, Biros MH, Miner JR. Response to: "Limitations of Retrospective Chart Reviews to Determine Rare Events, and the Unknown Relative Risk of Droperidol". West J Emerg Med 2020;22:396-397. PMID: 33856329. PMCID: PMC7972375. DOI: 10.5811/westjem.2020.9.49870.
28. 
Li B, VanRaden PM, Null DJ, O'Connell JR, Cole JB. Major quantitative trait loci influencing milk production and conformation traits in Guernsey dairy cattle detected on Bos taurus autosome 19. J Dairy Sci 2020;104:550-560. PMID: 33189290. DOI: 10.3168/jds.2020-18766.
Abstract
The goal of this study was to identify potential quantitative trait loci (QTL) for 27 production, fitness, and conformation traits of Guernsey cattle through genome-wide association (GWA) analyses, with extra emphasis on BTA19, where major QTL were observed for several traits. Animals' de-regressed predicted transmitting abilities (PTA) from the December 2018 traditional US evaluation were used as phenotypes. All of the Guernsey cattle included in the QTL analyses were predictor animals in the reference population, ranging from 1,077 to 1,685 animals for different traits. Single-trait GWA analyses were carried out by a mixed-model approach for all 27 traits using imputed high-density genotypes. A major QTL was detected on BTA19, influencing several milk production traits, conformation traits, and livability of Guernsey cattle, and the most significant SNP lay in the region of 26.2 to 28.3 Mb. The myosin heavy chain 10 (MYH10) gene residing within this region was found to be highly associated with milk production and body conformation traits of dairy cattle. After the initial GWA analyses, which suggested that many significant SNP were in linkage with one another, conditional analyses were used for fine mapping. The top significant SNP on BTA19 were fixed as covariables in the model, one at a time, until no more significant SNP were detected on BTA19. After this fine-mapping approach was applied, only 1 significant SNP was detected on BTA19 for most traits, but multiple, independent significant SNP were found for protein yield, dairy form, and stature. In addition, the haplotype that hosts the major QTL on BTA19 was traced to a US Guernsey born in 1954. The haplotype is common in the breed, indicating a long-term influence of this QTL on the US Guernsey population.
29. Olives TD, Westgard B, Steinberg LW, Cole JB. Characterization of Regional Poison Center Utilization Through Geospatial Mapping. West J Emerg Med 2020;21:249-256. PMID: 33207173. PMCID: PMC7673882. DOI: 10.5811/westjem.2020.7.46385.
Abstract
Introduction Penetrance, the annual rate of human exposure calls per 1000 persons, is a measure that historically describes poison center (PC) utilization. Penetrance varies by sociodemographic characteristics and by geography. Our goal in this study was to characterize the geospatial distribution of PC calls and describe the contribution of geospatial mapping to the understanding of PC utilization. Methods This was a single-center, retrospective study of closed, human, non-healthcare facility exposure calls to a regional PC over a five-year period. Exposure substance, gender, age, and Zone Improvement Plan (ZIP) Code were geocoded to 2010 US Census data (household income, educational attainment, age, primary language), spatially apportioned to US census tracts, and then analyzed with linear regression. Penetrance was geospatially mapped and qualitatively analyzed. Results From a total of 304,458 exposure calls during the study period, we identified 168,630 non-healthcare exposure calls. Of those records, 159,794 included ZIP Codes. After exclusions, we analyzed 156,805 records. Penetrance ranged from 0.081 to 38.47 calls/1000 persons/year (median 5.74 calls/1000 persons/year). Regression revealed positive associations between penetrance and >eighth-grade educational attainment (β = 5.05, p = 0.008), non-Hispanic Black (β = 1.18, p = 0.032), and American Indian (β = 3.10, p < 0.001) populations, suggesting that regions with higher proportions of these groups would display greater PC penetrance. Variability explained by regression modeling was low (R2 = 0.054), as anticipated. Geospatial mapping identified previously undocumented penetrance variability that was not evident in regression modeling. Conclusion PC calls vary substantially across sociodemographic strata. Higher proportions of non-Hispanic Black or American Indian residents and >eighth-grade educational attainment were associated with higher PC call penetrance. Geospatial mapping identified novel variations in penetrance that were not identified by regression modeling. Coupled with sociodemographic correlates, geospatial mapping may reveal disparities in PC access, identifying communities toward which PC resources may be appropriately directed. Although the use of penetrance to describe PC utilization has fallen away, it may yet provide an important measure of disparity in healthcare access when coupled with geospatial mapping.
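The penetrance measure used in this study is a simple per-capita rate. A minimal sketch of the calculation (the tract names, call counts, and populations below are hypothetical examples, not the study's data):

```python
# Penetrance as defined in the abstract:
# annual exposure calls per 1000 persons in a geographic unit.

def penetrance(calls_per_year: float, population: int) -> float:
    """Annual poison-center exposure calls per 1000 persons."""
    return calls_per_year / population * 1000

# Hypothetical census tracts: (calls in one year, tract population)
tracts = {"tract_A": (287, 50_000), "tract_B": (12, 1_500)}
rates = {name: penetrance(calls, pop) for name, (calls, pop) in tracts.items()}
# tract_A: 287 / 50000 * 1000 = 5.74 calls/1000 persons/year
```

Mapping such rates tract by tract, rather than regressing them on sociodemographic covariates alone, is what exposed the local variability the authors describe.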
30. Farmer BM, Cole JB, Olives TD, Farrell NM, Rao R, Nelson LS, Mazer-Amirshahi M, Stolbach AI. ACMT Position Statement: Medication Administration and Safety During the Response to COVID-19 Pandemic. J Med Toxicol 2020;16:481-483. PMID: 32617893. PMCID: PMC7332309. DOI: 10.1007/s13181-020-00794-4.
31. McWhorter TM, Hutchison JL, Norman HD, Cole JB, Fok GC, Lourenco DAL, VanRaden PM. Investigating conception rate for beef service sires bred to dairy cows and heifers. J Dairy Sci 2020;103:10374-10382. PMID: 32896403. DOI: 10.3168/jds.2020-18399.
Abstract
The widespread use of sexed semen on US dairy cows and heifers has led to an excess of replacement heifer calves, and the sale prices for those calves are much lower than in the past. Females not selected to produce the next generation of replacement heifers are increasingly being bred to beef bulls to produce crossbred calves for beef production. The purpose of this study was to investigate the use of beef service sires bred to dairy cows and heifers and to provide a tool for dairy producers to evaluate beef service sire conception. Sire conception rate (SCR) is a phenotypic evaluation of service sire fertility that is routinely calculated for US dairy bulls. A total of 268,174 breedings were available, which included 36 recognized beef breeds and 7 dairy breeds. Most of the beef-on-dairy inseminations (95.4%) were to Angus (AN) bulls. Because of the limited number of records among other breeds, we restricted our final evaluations to AN service sires bred to Holstein (HO) cows. Service-sire inbreeding and expected inbreeding of the resulting embryo were set to zero because pedigree data for AN bulls were unavailable. There were 233,379 breedings from 1,344 AN service sires to 163,919 HO cows. A mean (SD) conception rate of 33.8% (47.3%) was observed, compared with 34.3% (47.5%) for breedings with HO sires mated to HO cows. Publishable AN bulls were required to have ≥100 total matings, ≥10 matings in the most recent 12 mo, and breedings in at least 5 herds. Mean SCR reliability was 64.5% for 116 publishable bulls, with a maximum reliability of 99% based on 25,217 breedings. Average SCR was near zero (on an AN base) with a range of -5.1 to 4.4. Breedings to HO heifers were also examined, comprising 19,437 breedings (443 AN service sires and 15,971 HO heifers). A mean (SD) conception rate of 53.0% (49.9%) was observed, compared with 55.3% (49.7%) for breedings with a HO sire mated to a HO heifer. Beef sires were used more frequently in cows known to be problem breeders, which explains some of the difference in conception rate. Mean service number was 1.92 and 2.87 for HO heifers and 2.13 and 3.04 for HO cows mated to HO and AN sires, respectively. Mating dairy cows and heifers to beef bulls may be profitable if calf prices are higher, fertility is improved, or if practices such as sexed semen, genomic testing, and improved cow productive life allow herd owners to produce both higher-quality dairy replacements and increased income from market calves.
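The mean (SD) pairs reported above, such as 33.8% (47.3%), look surprising until one notes that conception is a binary (0/100%) outcome, so the SD is determined by the mean: for success probability p, the SD is sqrt(p(1-p)). A small sketch verifying this relationship (the counts are illustrative, not the study's records):

```python
import math

def binary_mean_sd(successes: int, trials: int) -> tuple[float, float]:
    """Mean and SD (both in %) of a 0/1 conception outcome,
    matching how mean (SD) conception rates are reported."""
    p = successes / trials
    return 100 * p, 100 * math.sqrt(p * (1 - p))

# With a 33.8% conception rate, the SD works out to about 47.3%,
# matching the abstract's mean (SD) of 33.8% (47.3%).
mean_pct, sd_pct = binary_mean_sd(338, 1000)  # hypothetical counts
```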
32. Corcoran J, Gray T, Bangh SA, Singh V, Cole JB. Fatal Yellow Oleander Poisoning Masquerading as Benign Candlenut Ingestion Taken for Weight Loss. J Emerg Med 2020;59:e209-e212. PMID: 32917446. DOI: 10.1016/j.jemermed.2020.07.026.
Abstract
BACKGROUND Candlenuts (Aleurites moluccana) and yellow oleander seeds (Thevetia peruviana) bear a physical resemblance to one another. Candlenuts are benign and marketed as weight loss supplements. Yellow oleander seeds, however, contain toxic cardioactive steroids; as few as 2 seeds may cause fatal poisoning. Because of their physical similarities, the potential for a lethal substitution exists. CASE REPORT A 63-year-old woman presented to the emergency department with vomiting after ingesting 5 of what she believed to be candlenuts, ordered online under the colloquial name "Nuez de la India" for the purpose of weight loss. She was bradycardic (nadir pulse of 30 beats/min) and hyperkalemic (serum potassium 7.3 mEq/L). Within hours of presentation she suffered a ventricular fibrillation arrest, followed by a terminal asystolic arrest. Postmortem analyses of liver tissue and the seeds were consistent with fatal T. peruviana poisoning. WHY SHOULD AN EMERGENCY PHYSICIAN BE AWARE OF THIS?: T. peruviana seeds contain toxic cardioactive steroids; their physical resemblance to candlenuts poses a risk of potentially fatal substitution. Therapy with high-dose digoxin-specific immune fragments (20-30 vials) may be helpful.
33. Stellpflug SJ, Cole JB, Greller HA. Urine Drug Screens in the Emergency Department: The Best Test May Be No Test at All. J Emerg Nurs 2020;46:923-931. PMID: 32843202. DOI: 10.1016/j.jen.2020.06.003.
Abstract
The purpose of this manuscript is to provide a resource for clinicians on the functionality and pitfalls of the rapid urine drug screen in clinical decision making. Many providers remain under-informed about its inherent inaccuracies. The rapid urine drug screen is the first, and often only, step of drug testing. In the majority of emergency departments the urine drug screen is a collection of immunoassays reliant on an interaction between the structure of a particular drug or metabolite and an antibody. Drugs in separate pharmacologic classes often have enough structural similarity to cause false positives. Conversely, drugs within the same pharmacologic class often have structures different enough to result in false negatives. This lack of sensitivity and specificity significantly reduces the test's utility and may cause confusion in decision making. The timing of the drug screen relative to the drug exposure also limits accuracy, as does the detection threshold. Confirmatory steps following the initial immunoassay include chromatography and/or mass spectrometry. These are unavailable at many institutions, and results rarely return while the patient is in the emergency department. In addition, institutional capabilities vary, even with confirmatory testing. Confirmation accuracy depends on a number of factors, including the extent of the catalog of drugs/metabolites that the facility is calibrated to detect and report. In summary, the standard emergency department urine drug screen is a test with extremely limited clinical utility, with multiple properties contributing to poor sensitivity, specificity, and accuracy. The test should be used rarely, if ever, for clinical decision making.
34. Cole JB, Olives TD, Ulici A, Litell JM, Bangh SA, Arens AM, Puskarich MA, Prekker ME. Extracorporeal Membrane Oxygenation for Poisonings Reported to U.S. Poison Centers from 2000 to 2018: An Analysis of the National Poison Data System. Crit Care Med 2020;48:1111-1119. PMID: 32697480. DOI: 10.1097/ccm.0000000000004401.
Abstract
OBJECTIVES To assess trends in the use of extracorporeal membrane oxygenation for poisoning in the United States. DESIGN Retrospective cohort study. SETTING The National Poison Data System, the database owned and managed by the American Association of Poison Control Centers, the organization that supports and accredits all 55 U.S. Poison Centers, 2000-2018. PATIENTS All patients reported to the National Poison Data System treated with extracorporeal membrane oxygenation. INTERVENTIONS None. MEASUREMENTS AND MAIN RESULTS In total, 407 patients met final inclusion criteria (332 adults, 75 children). Median age was 27 years (interquartile range, 15-39 yr); 52.5% were male. Median number of ingested substances was three (interquartile range, 2-4); 51.5% were single-substance exposures. Extracorporeal membrane oxygenation use in poisoned patients in the United States has increased significantly over time (z = 3.18; p = 0.001) in both adults (age > 12 yr) and children (age ≤ 12 yr), increasing by 9-100% per year since 2008. The increase in use occurred more commonly in adults. We found substantial geographical variation in extracorporeal membrane oxygenation use by geospatially mapping the ZIP code associated with the initial call, with large, primarily rural areas of the United States reporting no cases. Overall survival was 70% and did not vary significantly over the study period for children or adults. Patients with metabolic and hematologic poisonings were less likely to survive following extracorporeal membrane oxygenation than those with other poisonings (49% vs 72%; p = 0.004). CONCLUSIONS The use of extracorporeal membrane oxygenation to support critically ill, poisoned patients in the United States is increasing, driven primarily by increased use in patients greater than 12 years old. We observed no trends in survival over time. Mortality was higher when extracorporeal membrane oxygenation was used for metabolic or hematologic poisonings.
Large, predominantly rural regions of the United States reported no cases of extracorporeal membrane oxygenation for poisoning. Further research should focus on refining criteria for the use of extracorporeal membrane oxygenation in poisoning.
35. Corcoran JN, Jacoby KJ, Olives TD, Bangh SA, Cole JB. Persistent Hyperinsulinemia Following High-Dose Insulin Therapy: A Case Report. J Med Toxicol 2020;16:465-469. PMID: 32656624. DOI: 10.1007/s13181-020-00796-2.
Abstract
INTRODUCTION Overdoses of beta-adrenergic antagonists and calcium channel antagonists represent an uncommonly encountered but highly morbid clinical presentation. Potential therapies include fluids, calcium salts, vasopressors, intravenous lipid emulsion, methylene blue, and high-dose insulin. Although high-dose insulin is commonly used, the kinetics of insulin under these conditions are unknown. CASE REPORT We present a case of a 51-year-old male who sustained a life-threatening overdose after ingesting approximately 40 tablets of a mixture of amlodipine 5 mg and metoprolol tartrate 25 mg. Due to severe bradycardia and hypotension, he was started on high-dose insulin (HDI) therapy; this was augmented with epinephrine. Despite the degree of his initial shock state, he ultimately recovered, and HDI was discontinued. Insulin was infused for a total of approximately 37 hours, most of which was dosed at 10 U/kg/hour; following discontinuation, serial serum insulin levels were drawn and remained at supraphysiologic levels for at least 24 hours and well above reference range for multiple days thereafter. CONCLUSION The kinetics of insulin following discontinuation of high-dose insulin therapy are largely unknown, but supraphysiologic insulin levels persist for some time following therapy; this may allow for simple discontinuation rather than titration of insulin at the end of therapy. Dextrose replacement is frequently needed; although the duration is often difficult to predict, prolonged infusions may not be necessary.
36. Cole JB, Lee SC, Martel ML, Smith SW, Biros MH, Miner JR. The Incidence of QT Prolongation and Torsades des Pointes in Patients Receiving Droperidol in an Urban Emergency Department. West J Emerg Med 2020;21:728-736. PMID: 32726229. PMCID: PMC7390553. DOI: 10.5811/westjem.2020.4.47036.
Abstract
INTRODUCTION Droperidol carries a boxed warning from the United States Food and Drug Administration for QT prolongation and torsades de pointes (TdP). After a six-year hiatus, droperidol again became widely available in the US in early 2019. With its return, clinicians must again make decisions regarding the boxed warning. Thus, the objective of this study was to report the incidence of QT prolongation and TdP in patients receiving droperidol in the ED. METHODS Patients receiving droperidol at an urban Level I trauma center from 1997-2001 were identified via electronic health record query. All patients were reviewed for cardiac arrest. We reviewed electrocardiogram (ECG) data for both critically ill and noncritical patients and recorded Bazett's corrected QT intervals (QTc). ECGs from critically ill patients undergoing resuscitation were further risk-stratified using the QT nomogram. RESULTS Of noncritical patients, 15,374 received 18,020 doses of droperidol; 2,431 had an ECG. In patients with ECGs before and after droperidol, the mean QTc was 424.3 milliseconds (ms) (95% confidence interval [CI], 419.7-428.9) before and 427.6 ms (95% CI, 424.3-430.9) after droperidol (n = 170). Among critically ill patients, 1,172 received droperidol and 396 had an ECG. In the critically ill group with ECGs before and after droperidol, mean QTc was 435.7 ms (95% CI, 426.7-444.7) before and 435.8 ms (95% CI, 427.5-444.1) after droperidol (n = 114). Of 337 ECGs suitable for plotting on the QT nomogram, 13 (3.8%) were above the "at-risk" line: 3/136 (2.2%; 95% CI, 0.05-6.3%) in the before group and 10/202 (4.9%; 95% CI, 2.4-8.9%) in the after group. A single case of TdP occurred, in a patient with multiple risk factors; it did not recur after a droperidol rechallenge. Thus, the incidence of TdP was 1/16,546 (0.006%; 95% CI, 0.00015-0.03367%). CONCLUSION We found the incidence of QTc prolongation and TdP in ED patients receiving droperidol to be extremely rare. Our data suggest the FDA "black box warning" is overstated and that close ECG monitoring is useful only in high-risk patients.
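Bazett's correction, used for the QTc values above, divides the measured QT interval by the square root of the RR interval expressed in seconds. A minimal sketch (the QT and heart-rate values are illustrative examples, not data from this study):

```python
import math

def bazett_qtc(qt_ms: float, heart_rate_bpm: float) -> float:
    """Bazett's corrected QT: QTc = QT / sqrt(RR), with RR in seconds."""
    rr_s = 60.0 / heart_rate_bpm  # RR interval in seconds
    return qt_ms / math.sqrt(rr_s)

# Illustrative: a measured QT of 400 ms at 75 beats/min
qtc = bazett_qtc(400, 75)  # RR = 0.8 s, so QTc ≈ 447 ms

# At 60 beats/min RR = 1 s, so QTc equals the measured QT.
assert bazett_qtc(420, 60) == 420.0
```

Note that Bazett's formula is known to over-correct at high heart rates, which is one reason the authors additionally applied the QT nomogram to ECGs from patients undergoing resuscitation.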
37. Parker Gaddis KL, VanRaden PM, Cole JB, Norman HD, Nicolazzi E, Dürr JW. Symposium review: Development, implementation, and perspectives of health evaluations in the United States. J Dairy Sci 2020;103:5354-5365. PMID: 32331897. DOI: 10.3168/jds.2019-17687.
Abstract
The rate at which new traits are being developed is increasing, leading to an expanding number of evaluations provided to dairy producers, especially for functional traits. This review will discuss the development and implementation of genetic evaluations for direct health traits in the United States, as well as potential future developments. Beginning in April 2018, routine official genomic evaluations for 6 direct health traits in Holsteins were made available to US producers from the Council on Dairy Cattle Breeding (Bowie, MD). Traits include resistance to milk fever, displaced abomasum, ketosis, clinical mastitis, metritis, and retained placenta. These health traits were included in net merit indices beginning in August 2018, with a total weight of approximately 2%. Previously, improvement of cow health was primarily made through changes to management practices or genetic selection on indicator traits, such as somatic cell score, productive life, or livability. Widespread genomic testing now allows for accelerated improvement of traits with low heritabilities such as health; however, phenotypes remain essential to the success of genomic evaluations. Establishment and maintenance of data pipelines are critical components of health trait evaluations, as are appropriate data quality control standards. Data standardization is a necessary process when multiple data sources are involved. Model refinement continues, including implementation of variance adjustments beginning with the April 2019 evaluation. Mastitis evaluations are submitted to Interbull along with somatic cell score for international validation and evaluation of udder health. Additional areas of research include evaluation of other breeds for direct health traits, use of multiple-trait models, and evaluations for additional functional traits such as calf health and feed efficiency. Future developments will require new and continued cooperation among numerous industry stakeholders.
There is more information available than ever before with which to make better selection decisions; however, this also makes it increasingly important to provide accurate and unbiased information.
38. Maltecca C, Tiezzi F, Cole JB, Baes C. Symposium review: Exploiting homozygosity in the era of genomics-Selection, inbreeding, and mating programs. J Dairy Sci 2020;103:5302-5313. PMID: 32331889. DOI: 10.3168/jds.2019-17846.
Abstract
The advent of genomic selection paved the way for an unprecedented acceleration in genetic progress. The increased ability to select superior individuals has been coupled with a drastic reduction in the generation interval for most dairy populations, representing both an opportunity and a challenge. Homozygosity is now rapidly accumulating in dairy populations. Currently, inbreeding depression is managed mostly by culling at the farm level and by controlling the overall accumulation of homozygosity at the population level. A better understanding of how homozygosity and recessive load are related will guarantee continued genetic improvement while curtailing the accumulation of harmful recessives and maintaining enough genetic variability to ensure the possibility of selection in the face of changing environmental conditions. In this review, we present a snapshot of the current dairy selection structure as it relates to response to selection and accumulation of homozygosity, briefly outline the main approaches currently used to manage inbreeding and overall variability, and present some approaches that can be used in the short term to control accumulation of harmful recessives while maintaining sustained selection pressure.
39. Driver BE, Reardon RF, Cole JB, Klein LR, Miner JR, Prekker ME. In Reply. Acad Emerg Med 2020;27:347-348. PMID: 31648400. DOI: 10.1111/acem.13875.
40. Monte AA, Hopkinson A, Saben J, Shelton S, Thornton S, Schneir A, Pomerleau A, Hendrickson R, Arens AM, Cole JB, Chenoweth J, Martin S, Adams A, Banister SD, Gerona RR. The Psychoactive Surveillance Consortium and Analysis Network (PSCAN): the first year. Addiction 2020;115:270-278. PMID: 31769125. PMCID: PMC6982594. DOI: 10.1111/add.14808.
Abstract
BACKGROUND AND AIMS The Psychoactive Surveillance Consortium and Analysis Network (PSCAN) is a national network of academic emergency departments (ED), analytical toxicologists, and pharmacologists that collects clinical data paired with biological samples to identify and improve treatments of medical conditions arising from use of new psychoactive substances (NPS). The aim of this study was to gather clinical data with paired drug identification from NPS users who presented to EDs within PSCAN during its first year (2016-17). DESIGN Observational study involving patient records and biological samples. SETTING Seven academic emergency medical centers across the United States. PARTICIPANTS ED patients (n = 127) > 8 years of age with possible NPS use who were identified and enrolled in PSCAN by clinical providers or research personnel. MEASUREMENTS Clinical signs, symptoms, and treatments were abstracted from the patients' health records. Biological samples were collected from leftover urine, serum, and whole blood. Biological and drug samples, when available, were tested for drugs and drug metabolites via liquid chromatography-quadrupole time-of-flight mass spectrometry (LC-QTOF/MS). FINDINGS Patients in whom synthetic opioids were detected (n = 9) showed higher rates of intubation (four of nine), impaired mental status (four of nine), and respiratory acidosis (five of nine) compared with the rest of the cohort (nine of 118; P < 0.05). Patients in whom synthetic cannabinoids (SC) were found (n = 27) had lower median diastolic blood pressures (70.5 versus 77 mmHg, P = 0.046) compared with the rest of the cohort. In 64 cases of single drug ingestion, benzodiazepines were administered in 25 cases and considered effective by the treating physician in 21 (84%) cases.
CONCLUSIONS During its first year of operation, the Psychoactive Surveillance Consortium and Analysis Network captured clinical data on new classes of drugs paired with biological samples over a large geographical area in the United States. Synthetic cannabinoids were the most common new psychoactive drug identified. Synthetic opioids were associated with a high rate of intubation and respiratory acidosis.
41. Li B, Fang L, Null DJ, Hutchison JL, Connor EE, VanRaden PM, VandeHaar MJ, Tempelman RJ, Weigel KA, Cole JB. High-density genome-wide association study for residual feed intake in Holstein dairy cattle. J Dairy Sci 2019;102:11067-11080. PMID: 31563317. DOI: 10.3168/jds.2019-16645.
Abstract
Improving feed efficiency (FE) of dairy cattle may boost farm profitability and reduce the environmental footprint of the dairy industry. Residual feed intake (RFI), a candidate FE trait in dairy cattle, can be defined to be genetically uncorrelated with major energy sink traits (e.g., milk production, body weight) by including genomic predicted transmitting ability of such traits in genetic analyses for RFI. We examined the genetic basis of RFI through genome-wide association (GWA) analyses and post-GWA enrichment analyses and identified candidate genes and biological pathways associated with RFI in dairy cattle. Data were collected from 4,823 lactations of 3,947 Holstein cows in 9 research herds in the United States. Of these cows, 3,555 were genotyped and were imputed to a high-density list of 312,614 SNP. We used a single-step GWA method to combine information from genotyped and nongenotyped animals with phenotypes as well as their ancestors' information. The estimated genomic breeding values from a single-step genomic BLUP were back-solved to obtain the individual SNP effects for RFI. The proportion of genetic variance explained by each 5-SNP sliding window was also calculated for RFI. Our GWA analyses suggested that RFI is a highly polygenic trait regulated by many genes with small effects. The closest genes to the top SNP and sliding windows were associated with dry matter intake (DMI), RFI, energy homeostasis and energy balance regulation, digestion and metabolism of carbohydrates and proteins, immune regulation, leptin signaling, mitochondrial ATP activities, rumen development, skeletal muscle development, and spermatogenesis. The region of 40.7 to 41.5 Mb on BTA25 (UMD3.1 reference genome) was the top associated region for RFI. The closest genes to this region, CARD11 and EIF3B, were previously shown to be related to RFI of dairy cattle and FE of broilers, respectively. 
Another candidate region, 57.7 to 58.2 Mb on BTA18, which is associated with DMI and leptin signaling, was also associated with RFI in this study. Post-GWA enrichment analyses used a sum-based marker-set test based on 4 public annotation databases: Gene Ontology, Kyoto Encyclopedia of Genes and Genomes (KEGG) pathways, Reactome pathways, and medical subject heading (MeSH) terms. Results of these analyses were consistent with those from the top GWA signals. Across the 4 databases, GWA signals for RFI were highly enriched in the biosynthesis and metabolism of amino acids and proteins, digestion and metabolism of carbohydrates, skeletal development, mitochondrial electron transport, immunity, rumen bacteria activities, and sperm motility. Our findings offer novel insight into the genetic basis of RFI and identify candidate regions and biological pathways associated with RFI in dairy cattle.
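The 5-SNP sliding-window statistic described above can be sketched as follows: given back-solved SNP effects and a genotype matrix, the genetic variance attributable to each window is the variance, across animals, of the window's summed allelic effects. This is a simplified illustration on simulated genotypes and effects, not the single-step GBLUP pipeline the authors used.

```python
import numpy as np

rng = np.random.default_rng(0)
n_animals, n_snps, win = 200, 50, 5

Z = rng.integers(0, 3, size=(n_animals, n_snps)).astype(float)  # 0/1/2 genotype dosages
u = rng.normal(0.0, 0.1, size=n_snps)                           # back-solved SNP effects

total_var = np.var(Z @ u)  # variance of total genomic values across animals

# Proportion of genetic variance explained by each 5-SNP sliding window
window_props = np.array([
    np.var(Z[:, i:i + win] @ u[i:i + win]) / total_var
    for i in range(n_snps - win + 1)
])
```

Because adjacent windows overlap by four SNPs, the proportions are not expected to sum to one; the statistic is used to rank regions, as in the BTA25 and BTA18 peaks reported above.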
|
42
|
Driver BE, Klein LR, Cole JB, Miner JR, Reardon RF, Prekker ME. In Reply. Acad Emerg Med 2019; 26:1108. [PMID: 31121080 DOI: 10.1111/acem.13806] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/28/2022]
|
43
|
Driver BE, Klein LR, Prekker ME, Cole JB, Satpathy R, Kartha G, Robinson A, Miner JR, Reardon RF. Drug Order in Rapid Sequence Intubation. Acad Emerg Med 2019; 26:1014-1021. [PMID: 30834639 DOI: 10.1111/acem.13723] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/18/2018] [Revised: 02/04/2019] [Accepted: 02/06/2019] [Indexed: 11/28/2022]
Abstract
BACKGROUND The optimal order of drug administration (sedative first vs. neuromuscular blocking agent first) in rapid sequence intubation (RSI) is debated. OBJECTIVE We sought to determine if RSI drug order was associated with the time elapsed from administration of the first RSI drug to the end of a successful first intubation attempt. METHODS We conducted a planned secondary analysis of a randomized trial of adult ED patients undergoing emergency orotracheal intubation that demonstrated higher first-attempt success with bougie use compared to a tracheal tube + stylet. Drug choice, dose, and the order of sedative and neuromuscular blocking agent were not stipulated. We analyzed trial patients who received both a sedative and a neuromuscular blocking agent within 30 seconds of each other and who were intubated successfully on the first attempt. The primary outcome was the time elapsed from complete administration of the first RSI drug to the end of the first intubation attempt, a surrogate outcome for apnea time. We performed a multivariable analysis using a mixed-effects generalized linear model. RESULTS Of 757 original trial patients, 562 patients (74%) met criteria for analysis; 153 received the sedative agent first, and 409 received the neuromuscular blocking agent first. Administration of the neuromuscular blocking agent before the sedative agent was associated with a reduction in time from RSI administration to the end of intubation attempt of 6 seconds (95% confidence interval = 0 to 11 sec). CONCLUSION Administering either the neuromuscular blocking agent or the sedative agent first is acceptable. Administering the neuromuscular blocking agent first may result in a modestly faster time to intubation. For now, it is reasonable for physicians to continue performing RSI in the manner with which they are most comfortable.
If future research determines that the order of medication administration is not associated with awareness of neuromuscular blockade, administration of the neuromuscular blocking agent first may be a logical default administration method to attempt to minimize apnea time during intubation.
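For intuition, the reported point estimate with its 95% confidence interval follows the familiar difference-in-means construction. The sketch below applies it to simulated data with roughly the trial's group sizes; the actual analysis was a mixed-effects generalized linear model, so this is only a rough illustration of how such an interval is formed.

```python
import math
import random
import statistics

random.seed(1)

# Simulated apnea-time surrogate (seconds). Group sizes mirror the
# abstract (409 NMBA-first, 153 sedative-first); the means and spread
# are made-up values for illustration only.
nmba_first = [random.gauss(60, 20) for _ in range(409)]
sedative_first = [random.gauss(66, 20) for _ in range(153)]

diff = statistics.mean(sedative_first) - statistics.mean(nmba_first)
se = math.sqrt(statistics.variance(nmba_first) / len(nmba_first)
               + statistics.variance(sedative_first) / len(sedative_first))
ci = (diff - 1.96 * se, diff + 1.96 * se)  # normal-approximation 95% CI
```

A confidence interval whose lower bound touches zero, as in the trial's 0-to-11-second result, is why the abstract frames the time saving as modest rather than definitive.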
|
44
|
Chittineni C, Driver BE, Halverson M, Cole JB, Prekker ME, Pandey V, Lai T, Harrington J, Zhao S, Klein LR. Incidence and Causes of Iatrogenic Hypoglycemia in the Emergency Department. West J Emerg Med 2019; 20:833-837. [PMID: 31539342 PMCID: PMC6754198 DOI: 10.5811/westjem.2019.7.42996] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/06/2019] [Accepted: 07/14/2019] [Indexed: 11/14/2022] Open
Abstract
Introduction Hypoglycemia is frequently encountered in the emergency department (ED) and has potential for serious morbidity. The incidence and causes of iatrogenic hypoglycemia are not known. We aim to describe how often the cause of ED hypoglycemia is iatrogenic and to identify its specific causes. Methods We included adult patients with a chief complaint or ED diagnosis of hypoglycemia, or an ED glucose value of ≤70 milligrams per deciliter (mg/dL), between 2009 and 2014. Two independent abstractors each reviewed charts of patients with an initial glucose ≤50 mg/dL, or an initial glucose ≥70 mg/dL with a subsequent glucose ≤50 mg/dL, to determine whether the hypoglycemia was caused by iatrogenesis. The data analysis was descriptive. Results We reviewed the charts of 591 patients meeting inclusion criteria. Of these 591 patients, 99 (17%; 95% confidence interval, 14-20%) were classified as iatrogenic. Of these 99 patients, 61 (61%) had hypoglycemia caused by insulin administration and 38 (38%) had hypoglycemia caused by unrecognized malnutrition. Of the 61 patients with iatrogenic hypoglycemia after ED insulin administration, 45 and 15 patients received insulin for hyperkalemia and uncomplicated hyperglycemia, respectively. One patient received insulin for diabetic ketoacidosis. Conclusion In ED patients with hypoglycemia, iatrogenic causes are relatively common. The most frequent cause was insulin administration for hyperkalemia and uncomplicated hyperglycemia. Additionally, patients at risk of hypoglycemia in the absence of insulin, including those with alcohol intoxication or poor nutritional status, should be monitored closely in the ED.
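The reported 17% (95% CI 14-20%) is reproducible with a normal-approximation (Wald) interval for a proportion. The abstract does not state which interval method was used, so that choice is an assumption in the sketch below.

```python
import math

# Wald 95% CI for the iatrogenic proportion: 99 of 591 patients
n, k = 591, 99
p = k / n                              # point estimate
se = math.sqrt(p * (1 - p) / n)        # standard error of the proportion
lo, hi = p - 1.96 * se, p + 1.96 * se  # normal-approximation interval
```

Rounded to whole percentages, this gives 17% (14-20%), matching the abstract.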
|
45
|
Cole JB, Knack SK, Karl ER, Horton GB, Satpathy R, Driver BE. Human Errors and Adverse Hemodynamic Events Related to "Push Dose Pressors" in the Emergency Department. J Med Toxicol 2019; 15:276-286. [PMID: 31270748 DOI: 10.1007/s13181-019-00716-z] [Citation(s) in RCA: 11] [Impact Index Per Article: 2.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/01/2019] [Revised: 06/03/2019] [Accepted: 06/10/2019] [Indexed: 11/29/2022] Open
Abstract
BACKGROUND Though the use of small bolus doses of vasopressors, termed "push dose pressors," has become common in emergency medicine, data examining this practice are scant. Push dose pressors frequently involve bedside dilution, which may result in errors and adverse events. The objective of this study was to assess for instances of human error and adverse hemodynamic events during push dose pressor use in the emergency department. METHODS This was a structured chart and video review of all patients age ≥ 16 years undergoing resuscitation and receiving push dose pressors at a single center from January 2010 to November 2017. Push dose pressors were defined as intended intravenous boluses of phenylephrine (any dose) or epinephrine (≤ 100 mcg). RESULTS A total of 249 patients were analyzed. Median age was 60 years (range, 16-97); 58% were male, and 49% survived to discharge. Median initial epinephrine dose was 20 mcg (n = 139, IQR 10-100, range 1-100); median phenylephrine dose was 100 mcg (n = 110, IQR 100-100, range 25-10,000). Adverse hemodynamic events occurred in 98 patients (39%): 30 in the phenylephrine group (27%; 95% CI, 19-36%) and 68 in the epinephrine group (50%; 95% CI, 41-58%). Human errors were observed in 47 patients (19%), including 7 patients (3%) experiencing dosing errors (all overdoses; range, 2.5- to 100-fold) and 43 patients (17%) with a documentation error. Only one dosing error occurred when a pharmacist was present. CONCLUSIONS Human errors and adverse hemodynamic events were common with the use of push dose pressors in the emergency department. Adverse hemodynamic events were more common than in previous studies. Future research should determine if push dose pressors improve outcomes and, if so, how to safely implement them into practice.
|
46
|
Cole JB, Null DJ. Short communication: Phenotypic and genetic effects of the polled haplotype on yield, longevity, and fertility in US Brown Swiss, Holstein, and Jersey cattle. J Dairy Sci 2019; 102:8247-8250. [PMID: 31255269 DOI: 10.3168/jds.2019-16530] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/22/2019] [Accepted: 05/03/2019] [Indexed: 12/20/2022]
Abstract
Phenotypes from the December 2018 US national genetic evaluations were used to compute effects of the polled haplotype in US Brown Swiss (BS), Holstein (HO), and Jersey (JE) cattle on milk, fat, and protein yields, somatic cell score, single-trait productive life, daughter pregnancy rate, heifer conception rate, and cow conception rate. Lactation records pre-adjusted for nongenetic factors and direct genomic values were used to estimate phenotypic and genetic effects of the polled haplotype, respectively. Neither phenotypic nor direct genomic value effects differed from zero for any trait in any breed. Genomic PTA (gPTA) for the lifetime net merit (NM$) selection index of bulls born since January 1, 2012, that received a marketing code from the National Association of Animal Breeders (Madison, WI), and cows born on or after January 1, 2015, were compared to determine whether there was a systematic benefit to polled or horned genetics. Horned bulls had the highest average gPTA for NM$ in all 3 breeds, but that difference was significant only in HO and JE (HO: 615.4 ± 1.9, JE: 402.3 ± 3.4). Homozygous polled BS cows had significantly higher average gPTA for NM$ than their heterozygous polled or horned contemporaries (PP = 261.4 ± 43.5, Pp = 166.1 ± 13.7, pp = 174.1 ± 1.8), but the sample size was very small (n = 9). In HO and JE, horned cows had higher gPTA for NM$ (HO = 378.3 ± 0.2, JE = 283.3 ± 0.3). Selection for polled cattle should not have a detrimental effect on yield, fertility, or longevity, but these differences show that, in the short term, selection for polled over horned cattle will result in lower rates of genetic gain.
|
47
|
Klein LR, Driver BE, Martel ML, Miner JR, Cole JB. In reply:. Ann Emerg Med 2019; 73:693-694. [DOI: 10.1016/j.annemergmed.2019.01.012] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/02/2019] [Indexed: 11/26/2022]
|
48
|
Perez-Lauterbach D, Nahum R, Ahmad H, Topeff JM, Dossick D, Cole JB, Arens AM. Dose-Dependent Pulmonary Injury Following Nitrogen Dioxide Inhalation From Kinepak™ Detonation. J Emerg Med 2019; 57:177-180. [PMID: 31060842 DOI: 10.1016/j.jemermed.2019.03.028] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/31/2018] [Revised: 03/06/2019] [Accepted: 03/16/2019] [Indexed: 11/17/2022]
Abstract
BACKGROUND Nitrogen dioxide (NO2) is a pulmonary irritant produced as a byproduct of bacterial anaerobic metabolism of organic materials, and is also produced as a byproduct of explosive detonations. Significant NO2 exposure results in free-radical-induced pulmonary injury that may be delayed 3-30 h after exposure and can progress to acute respiratory distress syndrome (ARDS) and death. Here we present a case series of 3 patients with dose-dependent pulmonary injury consistent with NO2 inhalation following exposure to fumes from detonation of an ammonium nitrate/nitromethane (ANNM) explosive device. CASE REPORTS Three individuals presented to the emergency department over the course of 16 h, beginning approximately 16 h after exposure to fumes from an ANNM explosive device. Patient 1, with the most significant exposure, developed ARDS necessitating intubation and mechanical ventilation. Patient 2 exhibited hypoxia and findings concerning for diffuse airway inflammation, but ultimately required only supplemental oxygen. Patient 3, with the least exposure, had imaging abnormalities but required no intervention. WHY SHOULD AN EMERGENCY PHYSICIAN BE AWARE OF THIS?: Respiratory distress is a common presenting complaint to the emergency department. Because of the delayed presentation and the potential for progressive worsening of symptoms associated with NO2 exposure, it is important that emergency physicians be aware of the multiple potential means of exposure and consider this diagnosis in the proper clinical context. Patients with suspected NO2-related lung injury should undergo more extended observation than their initial clinical presentation may suggest.
|
49
|
Cole JB, Klein LR, Mullinax SZ, Nordstrom KD, Driver BE, Wilson MP. Study Enrollment When "Preconsent" Is Utilized for a Randomized Clinical Trial of Two Treatments for Acute Agitation in the Emergency Department. Acad Emerg Med 2019; 26:559-566. [PMID: 30548977 DOI: 10.1111/acem.13673] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/24/2018] [Revised: 11/27/2018] [Accepted: 12/05/2018] [Indexed: 11/30/2022]
Abstract
BACKGROUND Acute agitation in the emergency department (ED) represents a danger to both patients and their caregivers. Medication is often needed, and few high-quality randomized trials have evaluated the optimal drugs for this vulnerable population. In the United States, as of 2017, randomized trials of drugs typically cannot be conducted under Waiver of Consent (45 CFR 46.116), and Exception From Informed Consent trials (21 CFR 50.24) are limited to life-threatening conditions, are onerous, and require filing an investigational new drug application with the FDA. We sought to conduct a randomized double-dummy trial of inhaled loxapine versus intramuscular haloperidol + lorazepam for acute agitation in the ED by obtaining consent in advance ("preconsent") in patients at risk of future agitation, allowing study drug administration up to 3 years later if the patient presented with acute agitation. OBJECTIVE We sought to report the successful enrollment rate of patients preconsented at an earlier ED visit for this trial. METHODS This was an analysis of patients aged 18 to 64 with bipolar I disorder or schizophrenia preconsented for enrollment in the trial (clinicaltrials.gov, NCT02877108) conducted at a single urban academic center seeing approximately 60,000 patients per year. Eligible patients were assessed for capacity to consent by trained research associates, and informed consent was obtained at an ED visit for the possibility of administering drugs for agitation within the next 3 years. In the event the patient later presented to the ED and the attending physician deemed the patient required treatment for acute agitation, preconsent was confirmed and study drug would be administered. RESULTS Over 67 days, 1,461 patients were screened in the ED; 269 had bipolar I disorder or schizophrenia, 194 of whom had a contraindication to inhaled loxapine, leaving 75 eligible patients; preconsent was obtained in 43 patients.
Four additional patients who had not preconsented were consented for the trial in real time (three by surrogate, one patient had capacity while agitated) resulting in a total of 47 consented patients. Of these 47, a total of 12 were later removed from the study: 10 patients had unrecognized exclusion criteria for inhaled loxapine, one preconsented patient contacted the investigators at a later date and asked to be removed, and one surrogate revoked consent immediately after providing it. Only two patients were successfully enrolled, neither by preconsent: one was enrolled via a surrogate the day of enrollment, and the other was mildly agitated and had capacity to consent. The remaining patient with a valid surrogate consent did not receive study medication. CONCLUSIONS Utilization of preconsent to enroll patients in a randomized trial of treatments for acute agitation in the ED requires substantial resources and may not be feasible.
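The steep enrollment funnel above reduces to simple arithmetic; laying it out explicitly makes the final yield of 2 enrollments from 1,461 screenings stark.

```python
# Enrollment funnel, taken directly from the abstract's counts
screened = 1461
with_qualifying_diagnosis = 269   # bipolar I disorder or schizophrenia
loxapine_contraindication = 194

eligible = with_qualifying_diagnosis - loxapine_contraindication  # 75
preconsented = 43
consented_real_time = 4
total_consented = preconsented + consented_real_time              # 47

removed = 12                       # later exclusions and withdrawals
remaining = total_consented - removed                             # 35
enrolled = 2

yield_per_screened = enrolled / screened  # roughly 1.4 per 1,000 screened
```

A yield on the order of one enrollment per several hundred screenings is the quantitative core of the authors' feasibility conclusion.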
|
50
|
Santos DJA, Cole JB, Lawlor TJ, VanRaden PM, Tonhati H, Ma L. Variance of gametic diversity and its application in selection programs. J Dairy Sci 2019; 102:5279-5294. [PMID: 30981488 DOI: 10.3168/jds.2018-15971] [Citation(s) in RCA: 14] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/10/2018] [Accepted: 02/27/2019] [Indexed: 11/19/2022]
Abstract
The variance of gametic diversity (σ²gamete) can be used to find individuals that are more likely to produce progeny with extreme breeding values. The aim of this study was to obtain this variance for individuals from routine genomic evaluations, and to apply gametic variance in a selection criterion in conjunction with breeding values to improve genetic progress. An analytical approach was developed to estimate σ²gamete as the sum of binomial variances of all individual quantitative trait loci across the genome. Simulation was used to verify the predictability of this variance in a range of scenarios. The accuracy of prediction ranged from 0.49 to 0.85, depending on the scenario and model used. Compared with sequence data, SNP data are sufficient for estimating σ²gamete. Results also suggested that markers with low minor allele frequency and the covariance between markers should be included in the estimation. To incorporate σ²gamete into selective breeding programs, we proposed a new index, relative predicted transmitting ability, which better utilizes the genetic potential of individuals than traditional predicted transmitting ability. Simulation with a small genome showed an additional genetic gain of up to 16% in 10 generations, depending on the number of quantitative trait loci and selection intensity. Finally, we applied σ²gamete to the US genomic evaluations for Holstein and Jersey cattle. As expected, the DGAT1 gene had a strong effect on the estimation of σ²gamete for several production traits. However, inbreeding had a small impact on gametic variability, with greater effect for more polygenic traits. In conclusion, gametic variance, a potentially important parameter for selection programs, can be easily computed and is useful for improving genetic progress and controlling genetic diversity.
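The "sum of binomial variances" idea above can be sketched for the simplest case of unlinked loci: a heterozygous parent transmits one of its two alleles with probability 1/2, a Bernoulli draw with variance 1/4, so each heterozygous locus with substitution effect a contributes a²/4 to the variance of the gamete's breeding value. The full method also accounts for covariance between linked markers, which this minimal sketch ignores.

```python
def gametic_variance(genotypes, effects):
    """Variance of gametic diversity for one parent, ignoring linkage.

    genotypes: list of 0/1/2 allele dosages for the parent at each locus
    effects:   allele substitution effects per locus

    Homozygous loci transmit deterministically and contribute nothing;
    each heterozygous locus contributes (1/4) * a_i^2.
    """
    return sum(0.25 * a * a
               for g, a in zip(genotypes, effects)
               if g == 1)  # only heterozygous loci contribute

# A parent heterozygous at two loci and homozygous at a third:
# 0.25 * 2.0^2 + 0.25 * 1.0^2 = 1.25
var_g = gametic_variance([1, 1, 2], [2.0, 1.0, 5.0])
```

Note that the large-effect third locus contributes nothing because the parent is homozygous there; this is why two parents with identical breeding values can differ widely in σ²gamete.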
|