1
Policy in clinical practice: Impact of restoring the 1999 public charge rule on healthcare access for noncitizen immigrants. J Hosp Med 2024; 19:215-218. [PMID: 38358059] [DOI: 10.1002/jhm.13296]
2
Policy in clinical practice: Elimination of the buprenorphine "X-waiver". J Hosp Med 2023; 18:931-933. [PMID: 37545111] [DOI: 10.1002/jhm.13176]
3
Visit and Between-Visit Interaction Frequency Before and After COVID-19 Telehealth Implementation. JAMA Netw Open 2023; 6:e2333944. [PMID: 37713198] [PMCID: PMC10504619] [DOI: 10.1001/jamanetworkopen.2023.33944]
Abstract
Importance Telehealth implementation associated with the COVID-19 public health emergency (PHE) affected patient-clinical team interactions in numerous ways. Yet, studies have narrowly examined billed patient-clinician visits rather than including visits with other team members (eg, pharmacists) or between-visit interactions. Objective To evaluate rates of change over time in visits (in-person, telehealth) and between-visit interactions (telephone calls, patient portal messages) overall and by key patient characteristics. Design, Setting, and Participants This retrospective cohort study included adults with diabetes receiving primary care at urban academic (University of California San Francisco [UCSF]) and safety-net (San Francisco Health Network [SFHN]) health care systems. Encounters from April 2019 to March 2021 were analyzed. Exposure Telehealth implementation over 3 periods: pre-PHE (April 2019 to March 2020), strict shelter-in-place (April to June 2020), and hybrid-PHE (July 2020 to March 2021). Main Outcomes and Measures The main outcomes were rates of change in monthly mean number of total encounters, visits with any health care team member, visits with billing clinicians, and between-visit interactions. Key patient-level characteristics were age, race and ethnicity, language, and neighborhood socioeconomic status (nSES). Results Of 15 148 patients (4976 UCSF; 8975 SFHN) included, 2464 (16%) were 75 years or older, 7734 (51%) were female patients, 9823 (65%) self-identified as racially or ethnically minoritized, 6223 (41%) had a non-English language preference, and 4618 (31%) lived in the lowest nSES quintile. After accounting for changes to care delivery through an interrupted time-series analysis, total encounters increased in the hybrid-PHE period (UCSF: 2.3% per patient/mo; 95% CI, 1.6%-2.9% per patient/mo; SFHN: 1.8% per patient/mo, 95% CI, 1.3%-2.2% per patient/mo), associated primarily with growth in between-visit interactions (UCSF: 3.1% per patient/mo, 95% CI, 2.3%-3.8% per patient/mo; SFHN: 2.9% per patient/mo, 95% CI, 2.3%-3.4% per patient/mo). In contrast, rates of visits were stable during the hybrid-PHE period. Although there were fewer differences in visit use by key patient-level characteristics during the hybrid-PHE period, pre-PHE differences in between-visit interactions persisted during the hybrid-PHE period at SFHN. Asian and Chinese-speaking patients at SFHN had fewer monthly mean between-visit interactions compared with White patients (0.46 [95% CI, 0.42-0.50] vs 0.59 [95% CI, 0.53-0.66] between-visit interactions/patient/mo; P < .001) and English-speaking patients (0.52 [95% CI, 0.47-0.58] vs 0.61 [95% CI, 0.56-0.66] between-visit interactions/patient/mo; P = .03). Conclusions and Relevance In this study, pre-PHE growth in overall patient-clinician encounters persisted after PHE-related telehealth implementation, driven in both periods by between-visit interactions. Differential utilization based on patient characteristics was observed, which may indicate disparities. The implications for health care team workload and patient outcomes are unknown, particularly regarding between-visit interactions. Therefore, to comprehensively understand care utilization for patients with chronic diseases, research should expand beyond billed visits.
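The interrupted time-series analysis described above can be sketched as a segmented regression: a pre-period level and trend, plus a level change and a trend change at the interruption. The data below are synthetic and the variable names are illustrative assumptions, not study data or the authors' actual model.

```python
import numpy as np

# 24 months of a synthetic monthly encounter rate: 12 pre-period months,
# then an interruption (telehealth implementation) at month 12.
months = np.arange(24)
post = (months >= 12).astype(float)            # 1 after the interruption
since = np.where(months >= 12, months - 12, 0.0)  # months since interruption

rng = np.random.default_rng(0)
# True parameters used to simulate the series: baseline 1.0 encounters/mo,
# pre-trend +0.01/mo, level jump +0.05, extra +0.02/mo post-trend.
y = 1.0 + 0.01 * months + 0.05 * post + 0.02 * since + rng.normal(0, 0.01, 24)

# Ordinary least squares on the segmented design matrix.
X = np.column_stack([np.ones(24), months, post, since])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
intercept, pre_trend, level_change, trend_change = coef
print(f"pre-trend {pre_trend:.3f}/mo, post-trend change {trend_change:.3f}/mo")
```

The fitted `trend_change` coefficient is the quantity analogous to the "% per patient/mo" rate changes reported in the abstract.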
4
Characterizing patients hospitalized without an acute care indication: A retrospective cohort study. J Hosp Med 2023; 18:294-301. [PMID: 36757173] [DOI: 10.1002/jhm.13061]
Abstract
BACKGROUND Hospitalizations by patients who do not meet acute inpatient criteria are common and overburden healthcare systems. Studies have characterized these alternate levels of care (ALC) but have not delineated prolonged (pALC) versus short ALC (sALC) stays. OBJECTIVE To descriptively compare pALC and sALC hospitalizations-groups we hypothesize have unique needs. DESIGNS, SETTINGS, AND PARTICIPANTS A retrospective study of hospitalizations from March-April 2018 at an academic safety-net hospital. MAIN OUTCOME AND MEASURES Levels of care for pALC (>3 days) and sALC (1-3 days) were determined using InterQual©, an industry standard utilization review tool for determining the clinical appropriateness of hospitalization. We examined sociodemographic and clinical characteristics. RESULTS Of 2365 hospitalizations, 215 (9.1%) were pALC, 277 (11.7%) were sALC, and 1873 (79.2%) had no ALC days. There were 17,683 hospital days included, and 28.3% (n = 5006) were considered ALC. Compared to patients with sALC, those with pALC were older and more likely to be publicly insured, experience homelessness, and have substance use or psychiatric comorbidities. Patients with pALC were more likely to be admitted for care meeting inpatient criteria (89.3% vs. 66.8%, p < .001), had significantly more ALC days (median 8 vs. 1 day, p < .001), and were less likely to be discharged to the community (p < .001). CONCLUSIONS Patients with prolonged ALC stays were more likely to be admitted for acute care, had greater psychosocial complexity, significantly longer lengths of stay, and unique discharge needs. Given the complexity and needs for hospitalizations with pALC days, intensive interdisciplinary coordination and resource mobilization are necessary.
5
Photocatalytic Activity of Ti-SBA-15/C3N4 for Degradation of 2,4-Dichlorophenoxyacetic Acid in Water under Visible Light. J Anal Methods Chem 2022; 2022:5531219. [PMID: 35360448] [PMCID: PMC8964217] [DOI: 10.1155/2022/5531219]
Abstract
In the present study, the photocatalytic activity of Ti-SBA-15/C3N4 catalysts was investigated to degrade 2,4-Dichlorophenoxyacetic acid (2,4-D) herbicide in water under visible light irradiation. The catalysts were synthesized via a simple hydrothermal method and characterized by various analytical techniques, including SAXS, N2 adsorption-desorption isotherms, zeta potential, PL, FT-IR, XRF, TGA, and UV-DRS. Our study indicated that 2.5Ti-SBA-15/C3N4 had higher efficiency in the degradation of 2,4-D than Ti-SBA-15 and C3N4. The decomposition of 2,4-D reached 60% after 180 minutes of visible light irradiation at room temperature on 2.5Ti-SBA-15/C3N4. Moreover, the degradation of 2,4-D on Ti-SBA-15/C3N4 followed pseudo-first-order kinetics with the highest rate constant (0.00484 min-1), much higher than those obtained for other photocatalysts reported recently. Furthermore, the catalyst can be reused at least two times for photodegradation of 2,4-D solution under visible light irradiation with only a slight decrease in catalytic activity.
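As a quick consistency check, the reported rate constant reproduces the reported conversion under the pseudo-first-order model C(t) = C0·exp(−kt). This is only a back-of-the-envelope sketch using the figures quoted above:

```python
import math

k = 0.00484   # reported pseudo-first-order rate constant for 2.5Ti-SBA-15/C3N4, min^-1
t = 180       # visible-light irradiation time, min

# Degraded fraction under C(t) = C0 * exp(-k * t)
degraded = 1 - math.exp(-k * t)
print(f"Predicted 2,4-D degradation after {t} min: {degraded:.0%}")
```

The model predicts roughly 58% degradation at 180 min, in reasonable agreement with the ~60% reported.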
6
Social Determinants of Health and 30-Day Readmissions Among Adults Hospitalized for Heart Failure in the REGARDS Study. Circ Heart Fail 2022; 15:e008409. [PMID: 34865525] [PMCID: PMC8849604] [DOI: 10.1161/circheartfailure.121.008409]
Abstract
BACKGROUND It is not known which social determinants of health (SDOH) impact 30-day readmission after a heart failure (HF) hospitalization among older adults. We examined the association of 9 individual SDOH with 30-day readmission after an HF hospitalization. METHODS AND RESULTS Using the REGARDS study (Reasons for Geographic and Racial Differences in Stroke), we included Medicare beneficiaries who were discharged alive after an HF hospitalization between 2003 and 2014. We assessed 9 SDOH based on the Healthy People 2030 Framework: race, education, income, social isolation, social network, residential poverty, Health Professional Shortage Area, rural residence, and state public health infrastructure. The primary outcome was 30-day all-cause readmission. For each SDOH, we calculated incidence per 1000 person-years and multivariable-adjusted hazard ratios of readmission. Among 690 participants, the median age was 76 years at hospitalization (interquartile range, 71-82), 44.3% were women, 35.5% were Black, 23.5% had low educational attainment, 63.0% had low income, 21.0% had zip code-level poverty, 43.5% resided in Health Professional Shortage Areas, 39.3% lived in states with poor public health infrastructure, 13.1% were socially isolated, 13.3% had poor social networks, and 10.2% lived in rural areas. The 30-day readmission rate was 22.4%. In an unadjusted analysis, only Health Professional Shortage Area was significantly associated with 30-day readmission; in a fully adjusted analysis, none of the 9 SDOH were individually associated with 30-day readmission. CONCLUSIONS In this modestly sized national cohort, although prevalent, none of the SDOH were associated with 30-day readmission after an HF hospitalization. Policies or interventions that only target individual SDOH to reduce readmissions after HF hospitalizations may not be sufficient to prevent readmission among older adults.
7
Time to ACT: launching an Addiction Care Team (ACT) in an urban safety-net health system. BMJ Open Qual 2021; 10:bmjoq-2020-001111. [PMID: 33500326] [PMCID: PMC7843300] [DOI: 10.1136/bmjoq-2020-001111]
Abstract
Across the USA, morbidity and mortality from substance use are rising, as reflected by increases in acute care hospitalisations for substance use complications and substance-related deaths. Patients with substance use disorders (SUD) have long and costly hospitalisations and higher readmission rates compared to those without SUD. Hospitalisation presents an opportunity to diagnose and treat individuals with SUD and connect them to ongoing care. However, SUD care often goes unaddressed by hospital providers because of the lack of a systems approach and of addiction medicine knowledge, a gap compounded by stigma. We present a blueprint for launching an interprofessional inpatient addiction care team embedded in the hospital medicine division of an urban, safety-net integrated health system. We describe key factors for successful implementation, including: (1) demonstrating the scope and impact of SUD in our health system via a needs assessment; (2) aligning improvement areas with health system leadership priorities; (3) involving executive leadership to create goal and initiative alignment; and (4) obtaining seed funding for a pilot programme from our Medicaid health plan partner. We also present challenges and lessons learnt.
8
Abstract
BACKGROUND Over the past decade, nearly half of internal medicine residencies have implemented block clinic scheduling; however, the effects on residency-related outcomes are unknown. The authors systematically reviewed the impact of block versus traditional ambulatory scheduling on residency-related outcomes, including (1) resident satisfaction, (2) resident-perceived conflict between inpatient and outpatient responsibilities, (3) ambulatory training time, (4) continuity of care, (5) patient satisfaction, and (6) patient health outcomes. METHODS The authors searched the following databases: Ovid MEDLINE, Ovid MEDLINE In-Process, EBSCO CINAHL, EBSCO ERIC, and the Cochrane Library from inception through March 2017 and included studies of residency programs comparing block to traditional scheduling with at least one outcome of interest. Two authors independently extracted data on setting, participants, schedule design, and the outcomes of interest. RESULTS Of 8139 studies, 11 studies of fair to moderate methodologic quality were included in the final analysis. Overall, block scheduling was associated with marked improvements in resident satisfaction (n = 7 studies, effect size range −0.3 to +0.9), resident-perceived conflict between inpatient and outpatient responsibilities (n = 5, effect size range +0.3 to +2.6), and available ambulatory training time (n = 5). Larger improvements occurred in programs implementing short (1 week) ambulatory blocks. However, block scheduling may result in worse physician continuity (n = 4). Block scheduling had inconsistent effects on patient continuity (n = 4), satisfaction (n = 3), and health outcomes (n = 3). DISCUSSION Although block scheduling improves resident satisfaction, conflict between inpatient and outpatient responsibilities, and ambulatory training time, there may be important tradeoffs with worse care continuity.
9
Abstract
Overtreatment is pervasive in medicine and leads to potential patient harms and excessive costs in health care. Although evidence-based medicine is often derided as rote, algorithmic practice, the appropriate application of key evidence-based medicine principles in clinical decision making is fundamental to preventing overtreatment and promoting high-value, individualized patient-centered care. Specifically, this article discusses the importance of (1) using absolute rather than relative estimates of benefits to inform treatment decisions; (2) considering the time horizon to benefit of treatments; (3) balancing potential harms and benefits; and (4) using shared decision making by physicians to incorporate the patient's values and preferences into treatment decisions. We illustrate the application of these principles to the decision of whether to recommend intensive glycemic control to minimize microvascular and cardiovascular complications in type 2 diabetes mellitus. Through this lens, the example shows how an evidence-based medicine approach can individualize glycemic goals and prevent overtreatment, and it can serve as a template for applying evidence-based medicine to treatment decisions in other conditions to optimize health and individualize patient care.
10
Abstract
BACKGROUND The use of rapid response systems (RRS), which were designed to bring clinicians with critical care expertise to the bedside to prevent unnecessary deaths, has increased. RRS rely on accurate detection of acute deterioration events. Early warning scores (EWS) have been used for this purpose but were developed using heterogeneous populations. Predictive performance may differ in medical vs surgical patients. OBJECTIVE To evaluate the performance of published EWS in medical vs surgical patient populations. DESIGN Retrospective cohort study. SETTING Two tertiary care academic medical center hospitals in the Midwest totaling more than 1500 beds. PATIENTS All patients discharged from January to December 2011. INTERVENTION None. MEASUREMENTS Time-stamped longitudinal database of patient variables and outcomes, categorized as surgical or medical. Outcomes included unscheduled transfers to the intensive care unit, activation of the RRS, and calls for cardiorespiratory resuscitation ("resuscitation call"). The EWS were calculated and updated with every new patient variable entry over time. Scores were considered accurate if they predicted an outcome in the following 24 hours. RESULTS All EWS demonstrated higher performance within the medical population as compared to surgical: higher positive predictive value (P < .0001 for all scores) and sensitivity (P < .0001 for all scores). All EWS had positive predictive values below 25%. CONCLUSIONS The overall poor performance of the evaluated EWS was marginally better in medical patients when compared to surgical patients. Journal of Hospital Medicine 2017;12:217-223.
11
Abstract 032: Is Bigger Data Better? Predicting Readmissions in Acute Myocardial Infarction on Admission versus Discharge With Electronic Health Record Data. Circ Cardiovasc Qual Outcomes 2017. [DOI: 10.1161/circoutcomes.10.suppl_3.032]
Abstract
Background:
Readmissions after hospitalization for acute myocardial infarction (AMI) are common, but the few available risk prediction models have poor predictive ability. Including more data from hospitalization may improve risk prediction.
Objectives:
To assess if an AMI-specific electronic health record (EHR) readmission risk prediction model derived and validated from data through the entire hospital course (‘full stay’ model) outperforms a model using data available only from the first day of hospitalization (‘first day’ model).
Methods:
EHR data from AMI hospitalizations from 6 diverse hospitals in north Texas from 2009-2010 were used to derive a model predicting all-cause non-elective 30-day readmissions which was then validated using five-fold cross-validation.
Results:
Of 826 consecutive index AMI admissions, 13% were followed by a 30-day readmission. History of diabetes (AOR 2.41, 95% CI 1.37-4.24), SBP <100 mmHg on admission (AOR 2.18, 95% CI 1.68-2.82), elevated Cr (≥2 mg/dL) on admission (AOR 2.56, 95% CI 2.52-6.08), elevated BNP on admission (AOR 6.36, 95% CI 1.65-24.47) and lack of PCI within 24 hours of admission (AOR 1.31, 95% CI 1.02-1.69) were significant predictors of readmission. Our ‘first-day’ AMI readmissions model based on these predictors had good discrimination (Table). Adding three other variables from the hospital course - use of IV diuretics (AOR 1.58, 95% CI 1.07-2.31), anemia (hematocrit ≤33%) on discharge (AOR 2.04, 95% CI 1.20-3.46), and discharge to post-acute care (AOR 1.50, 95% CI 0.90-2.50) - improved discrimination of the ‘full stay’ AMI model but only modestly improved net reclassification and calibration.
Conclusions:
A ‘full-stay’ AMI-specific EHR readmission model modestly outperformed a ‘first-day’ EHR model, a multi-condition EHR model, and the CMS AMI model. Surprisingly, incorporating more hospitalization data improved discrimination of the full-stay AMI model but did not meaningfully improve reclassification compared to the first-day model. Readmissions in AMI may be accurately predicted on the first day of hospitalization; waiting until later in hospitalization does not markedly improve risk prediction.
12
Abstract 026: Development and Validation of an Electronic Health Record Model for Predicting 30-Day Readmissions in Acute Myocardial Infarction: the AMI READMITS Score. Circ Cardiovasc Qual Outcomes 2017. [DOI: 10.1161/circoutcomes.10.suppl_3.026]
Abstract
Background:
Readmissions after hospitalization for acute myocardial infarction (AMI) are common, but the few available risk prediction models have poor predictive ability and are not readily usable in real-time.
Objectives:
To develop and validate an AMI readmission risk prediction model from electronic health record (EHR) data available on the first day of hospitalization, and to compare model performance to the Centers for Medicare and Medicaid Services (CMS) AMI model and a validated multi-condition EHR model.
Methods:
EHR data from AMI hospitalizations from 6 diverse hospitals in north Texas from 2009-2010 were used to derive a model predicting all-cause non-elective 30-day readmissions to any of 75 hospitals in the region, which was then validated using five-fold cross-validation.
Results:
Of 826 consecutive index AMI admissions, 13% were followed by a 30-day readmission. The AMI READMITS score included seven predictors, all ascertainable within the first 24 hours of hospitalization (Table 1A). The AMI READMITS score was strongly associated with 30-day readmission in our cross-validation cohort: ≤13 points = extremely low risk (bottom quintile, mean predicted risk 3%); 14-15 points = low risk (4th quintile, predicted risk 7%); 16-17 points = moderate risk (3rd quintile, predicted risk 11%); 18-19 points = high risk (2nd quintile, predicted risk 16%); and ≥20 points = extremely high risk (top quintile, predicted risk 35%). The READMITS score had good discrimination with comparable performance to the CMS model in our cohort; it had improved discrimination, reclassification, and calibration compared to a multi-condition EHR model (Table 1B).
Conclusions:
The AMI READMITS score accurately stratifies patients hospitalized with AMI into groups at varying risk of 30-day readmission. Unlike claims-based models which require data not available until after discharge, READMITS is parsimonious, easy to implement, and leverages actionable real-time data available from the EHR within the first 24 hours of hospitalization to enable early prospective identification of high-risk AMI patients for targeted readmissions reduction interventions.
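The risk stratification above reduces to a simple lookup. The cut-points and mean predicted risks below come directly from the abstract; the function name and interface are illustrative, not the authors' implementation:

```python
def readmits_risk_band(score: int) -> tuple[str, float]:
    """Map an AMI READMITS point total to its risk band and the mean
    predicted 30-day readmission risk reported for that quintile."""
    if score <= 13:
        return ("extremely low", 0.03)   # bottom quintile
    if score <= 15:
        return ("low", 0.07)             # 4th quintile
    if score <= 17:
        return ("moderate", 0.11)        # 3rd quintile
    if score <= 19:
        return ("high", 0.16)            # 2nd quintile
    return ("extremely high", 0.35)      # top quintile

print(readmits_risk_band(18))  # ('high', 0.16)
```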
13
Trends in Long-Term Acute Care Hospital Use in Texas from 2002-2011. Ann Gerontol Geriatr Res 2015; 2:1031. [PMID: 26702452] [PMCID: PMC4686275]
Abstract
OBJECTIVE To assess regional trends in long-term acute care hospital (LTAC) use over time. DESIGN, SETTING, AND PARTICIPANTS Retrospective study using 100% Texas Medicare data. Separate cohorts were created for each year from 2002-2011, which included all beneficiaries residing in 23 hospital referral regions (HRRs) with continuous enrollment in Parts A and B in the previous and current year, or until death. MEASUREMENTS LTAC utilization rate was defined as the number of individuals with an LTAC stay per 100,000 Medicare beneficiaries residing in the HRR. Baseline LTAC use at the HRR level was categorized by tertiles of use in 2002. RESULTS Overall, LTAC use increased 35% from 2002-2011, coinciding with major Medicare policy changes. However, there were marked regional differences in LTAC utilization trends. From 2002-2011, HRRs in the lowest tertile of baseline LTAC use, which included regions with 0 to 1 LTAC facilities in 2002, had an increase in utilization of 211%, from 190 to 591 individuals per 100,000 persons. In contrast, HRRs in the highest tertile of baseline LTAC use, which included some of the most densely LTAC-bedded regions in the country, experienced a 21% decline (915 to 719 individuals per 100,000 persons; p<0.001 for interaction of LTAC utilization and tertile of baseline use). CONCLUSION These findings suggest substantial regional variation in trends in LTAC use over time. Further research is needed to estimate how much of this variation is due to differences in clinical need driven by an increasing number of severely ill older adults versus regional market supply.
14
Diagnostic accuracy and effectiveness of automated electronic sepsis alert systems: A systematic review. J Hosp Med 2015; 10:396-402. [PMID: 25758641] [PMCID: PMC4477829] [DOI: 10.1002/jhm.2347]
Abstract
BACKGROUND Although timely treatment of sepsis improves outcomes, delays in administering evidence-based therapies are common. PURPOSE To determine whether automated real-time electronic sepsis alerts can: (1) accurately identify sepsis and (2) improve process measures and outcomes. DATA SOURCES We systematically searched MEDLINE, Embase, The Cochrane Library, and Cumulative Index to Nursing and Allied Health Literature from database inception through June 27, 2014. STUDY SELECTION Included studies that empirically evaluated 1 or both of the prespecified objectives. DATA EXTRACTION Two independent reviewers extracted data and assessed the risk of bias. Diagnostic accuracy of sepsis identification was measured by sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV), and likelihood ratio (LR). Effectiveness was assessed by changes in sepsis care process measures and outcomes. DATA SYNTHESIS Of 1293 citations, 8 studies met inclusion criteria, 5 for the identification of sepsis (n = 35,423) and 5 for the effectiveness of sepsis alerts (n = 6894). Though definition of sepsis alert thresholds varied, most included systemic inflammatory response syndrome criteria ± evidence of shock. Diagnostic accuracy varied greatly, with PPV ranging from 20.5% to 53.8%, NPV 76.5% to 99.7%, LR+ 1.2 to 145.8, and LR- 0.06 to 0.86. There was modest evidence for improvement in process measures (ie, antibiotic escalation), but only among patients in non-critical care settings; there were no corresponding improvements in mortality or length of stay. Minimal data were reported on potential harms due to false positive alerts. CONCLUSIONS Automated sepsis alerts derived from electronic health data may improve care processes but tend to have poor PPV and do not improve mortality or length of stay.
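The accuracy measures reported above (sensitivity, specificity, PPV, NPV, LR+, LR-) all derive from a standard 2×2 table of alerts against a reference standard. A minimal sketch, with illustrative function and cell names and made-up example counts (not data from the review):

```python
def alert_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Diagnostic accuracy of an alert from true/false positive and
    negative counts against a reference standard."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return {
        "sensitivity": sensitivity,
        "specificity": specificity,
        "PPV": tp / (tp + fp),               # precision among fired alerts
        "NPV": tn / (tn + fn),
        "LR+": sensitivity / (1 - specificity),
        "LR-": (1 - sensitivity) / specificity,
    }

# Hypothetical counts showing how an alert can be sensitive yet have
# poor PPV when false positives are frequent, as the review observed.
m = alert_metrics(tp=80, fp=240, fn=20, tn=660)
print(f"PPV {m['PPV']:.1%}, LR+ {m['LR+']:.1f}")
```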
15
Abstract
IMPORTANCE Cardiac biomarker testing is not routinely indicated in the emergency department (ED) because of low utility and potential downstream harms from false-positive results. However, current rates of testing are unknown. OBJECTIVE To determine the use of cardiac biomarker testing overall, as well as stratified by disposition status and selected characteristics. DESIGN, SETTING, AND PARTICIPANTS Retrospective study of ED visits by adults (≥18 years old) selected from the 2009 and 2010 National Hospital Ambulatory Medical Care Survey, a probability sample of ED visits in the United States. EXPOSURES Selected patient, visit, and ED characteristics. MAIN OUTCOMES AND MEASURES Receipt of cardiac biomarker testing during the ED visit. RESULTS Of 44,448 ED visits, cardiac biomarkers were tested in 16.9% of visits, representing 28.6 million visits. Biomarker testing occurred in 8.2% of visits in the absence of acute coronary syndrome (ACS)-related symptoms, representing 8.5 million visits, almost one-third of all visits with biomarker testing. Among individuals subsequently hospitalized, cardiac biomarkers were tested in 47.0% of all visits. In this group, biomarkers were tested in 35.4% of visits despite the absence of ACS-related symptoms. Among all ED visits, the number of other tests or services performed was the strongest predictor of biomarker testing independent of symptoms of ACS. Compared with 0 to 5 other tests or services performed, more than 10 other tests or services performed was associated with 59.55 (95% CI, 39.23-90.40) times the odds of biomarker testing. The adjusted probabilities of biomarker testing if 0 to 5, 6 to 10, or more than 10 other tests or services were performed were 6.3%, 34.3%, and 62.3%, respectively. CONCLUSIONS AND RELEVANCE Cardiac biomarker testing in the ED is common even among those without symptoms suggestive of ACS. Cardiac biomarker testing is also frequently used during visits with a high volume of other tests or services independent of the clinical presentation. More attention is needed to develop strategies for appropriate use of cardiac biomarkers.
16
Identifying patients with diabetes and the earliest date of diagnosis in real time: an electronic health record case-finding algorithm. BMC Med Inform Decis Mak 2013; 13:81. [PMID: 23915139] [PMCID: PMC3733983] [DOI: 10.1186/1472-6947-13-81]
Abstract
BACKGROUND Effective population management of patients with diabetes requires timely recognition. Current case-finding algorithms can accurately detect patients with diabetes, but lack real-time identification. We sought to develop and validate an automated, real-time diabetes case-finding algorithm to identify patients with diabetes at the earliest possible date. METHODS The source population included 160,872 unique patients from a large public hospital system between January 2009 and April 2011. A diabetes case-finding algorithm was iteratively derived using chart review and subsequently validated (n = 343) in a stratified random sample of patients, using data extracted from the electronic health records (EHR). A point-based algorithm using encounter diagnoses, clinical history, pharmacy data, and laboratory results was used to identify diabetes cases. The date when accumulated points reached a specified threshold equated to the diagnosis date. Physician chart review served as the gold standard. RESULTS The electronic model had a sensitivity of 97%, specificity of 90%, positive predictive value of 90%, and negative predictive value of 96% for the identification of patients with diabetes. The kappa score for agreement between the model and physician for the diagnosis date allowing for a 3-month delay was 0.97, where 78.4% of cases had exact agreement on the precise date. CONCLUSIONS A diabetes case-finding algorithm using data exclusively extracted from a comprehensive EHR can accurately identify patients with diabetes at the earliest possible date within a healthcare system. The real-time capability may enable proactive disease management.
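The point-based, earliest-date mechanics described above can be sketched as follows. The evidence categories, weights, and threshold here are invented placeholders (the abstract does not publish them); only the accumulate-points-until-threshold logic matches the description:

```python
from datetime import date

# Hypothetical point weights -- the algorithm is point-based, but the
# actual weights and threshold are not given in the abstract.
POINTS = {
    "encounter_dx_diabetes": 2,
    "diabetes_on_problem_list": 2,
    "diabetes_medication": 1,
    "hba1c_ge_6_5": 2,
}
THRESHOLD = 3

def earliest_diagnosis_date(events):
    """events: iterable of (date, evidence_type) from the EHR.
    Returns the first date on which accumulated points reach
    THRESHOLD -- the inferred diagnosis date -- or None."""
    total = 0
    for when, evidence in sorted(events):
        total += POINTS.get(evidence, 0)
        if total >= THRESHOLD:
            return when
    return None

events = [
    (date(2009, 3, 1), "diabetes_medication"),
    (date(2009, 5, 12), "hba1c_ge_6_5"),
    (date(2010, 1, 4), "encounter_dx_diabetes"),
]
print(earliest_diagnosis_date(events))  # 2009-05-12: 1 + 2 points >= 3
```

Because points accrue as data arrive, the threshold can be crossed in real time, which is the property the study highlights for proactive disease management.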
17
Abstract
Hospitals now have the responsibility to implement strategies to prevent adverse outcomes after discharge. This systematic review addressed the effectiveness of hospital-initiated care transition strategies aimed at preventing clinical adverse events (AEs), emergency department (ED) visits, and readmissions after discharge in general medical patients. MEDLINE, CINAHL, EMBASE, and Cochrane Database of Clinical Trials (January 1990 to September 2012) were searched, and 47 controlled studies of fair methodological quality were identified. Forty-six studies reported readmission rates, 26 reported ED visit rates, and 9 reported AE rates. A "bridging" strategy (incorporating both predischarge and postdischarge interventions) with a dedicated transition provider reduced readmission or ED visit rates in 10 studies, but the overall strength of evidence for this strategy was low. Because of scant evidence, no conclusions could be reached on methods to prevent postdischarge AEs. Most studies did not report intervention context, implementation, or cost. The strategies hospitals should implement to improve patient safety at hospital discharge remain unclear.
|
18
|
Limits to relying on expert information: the Delphi technique in a study of ethnic Vietnamese injection drug users in Melbourne, Australia. SOCIAL WORK IN PUBLIC HEALTH 2009; 24:371-379. [PMID: 19731183 DOI: 10.1080/19371910802672197] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/28/2023]
Abstract
Information from experts, or "key informants," is often used when estimating the prevalence of a disease, the numbers in particular risk groups, or the frequency of particular risk behaviors. This study aimed to better understand and describe the usefulness of key informants in informing an area such as injection drug use, where the populations are often marginalized and difficult to identify and the illnesses (HIV and hepatitis C virus) associated with the risk behavior can lead to discrimination by the general community. Our study results highlight the limitations of relying upon key informant information alone to provide specific information or accurate data about ethnic Vietnamese injection drug users. While exercises such as the Delphi technique can be used to generate the broad views and opinions of experts around a particular issue, we argue that care must be taken when using such information as evidence on which to base the direction and design of social and public health policy and resources, particularly in relation to marginalized populations.
|
19
|
The prevalence and risk behaviours associated with the transmission of blood-borne viruses among ethnic-Vietnamese injecting drug users. Aust N Z J Public Health 2007; 30:519-25. [PMID: 17209266 DOI: 10.1111/j.1467-842x.2006.tb00779.x] [Citation(s) in RCA: 16] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/28/2022] Open
Abstract
OBJECTIVE To measure the prevalence and determinants of blood-borne virus (BBV) transmission in ethnic Vietnamese injecting drug users (IDUs). METHODS The study was conducted in Melbourne, Australia, in 2003. It was a cross-sectional design with participants recruited from street-based illicit drug markets, predominantly using a snowball technique. One hundred and twenty-seven participants completed a questionnaire that asked about illicit drug use, and participants' blood samples were tested for HIV, HCV and HBV. RESULTS One hundred and three (81.1%) ethnic Vietnamese IDU study participants were HCV positive and three (2.4%) were HIV positive. More than 60% had evidence of being infected with HBV (either in the past, acute infection or chronic infection). Almost 60% had injected daily over the past 12 months. Fifty-nine participants had recently travelled to Vietnam; 24 (41%) had injected drugs in Vietnam; and three (12.5%) reported sharing injecting equipment in Vietnam. CONCLUSION The prevalence of BBVs was higher in this study's IDU population compared with IDUs in Australia generally, despite the fact that the injecting risk behaviours were similar to IDUs more generally. IMPLICATIONS Culturally sensitive drug treatment and education programs need to be developed in Australia for both ethnic Vietnamese IDUs and their families to reduce this group's risk of contracting a BBV.
|
20
|
Recruitment and follow-up of injecting drug users in the setting of early hepatitis C treatment: insights from the ATAHC study. THE INTERNATIONAL JOURNAL OF DRUG POLICY 2007; 18:447-51. [PMID: 17854736 DOI: 10.1016/j.drugpo.2007.01.007] [Citation(s) in RCA: 17] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/29/2006] [Revised: 12/07/2006] [Accepted: 01/06/2007] [Indexed: 01/26/2023]
Abstract
Despite current injecting drug users (IDUs) being the major risk group for new hepatitis C virus (HCV) infections in most countries, they constitute a small minority of study populations in almost all studies of acute HCV infection treatment. The Australian Trial in Acute Hepatitis C (ATAHC) is examining natural history and treatment efficacy among predominantly IDU-acquired acute HCV. Recruitment is through an Australian network of primary and tertiary care sites. Eligible participants are offered treatment with pegylated-interferon alpha-2a (PEG-IFN) for 24 weeks, with both treated and untreated participants followed for up to three years. Quantitative and qualitative data on injecting behaviour are collected on study participants. Participants are regularly reviewed by a multidisciplinary team that includes the treating clinician, HCV clinic nurse and outreach worker, and when necessary are referred to a drug and alcohol worker, social worker, psychiatrist or other appropriate services. A contact log records all interactions between participants and the study team. In September 2006, 121 subjects had been screened, 107 were enrolled and 75 had chosen to commence a 24-week course of PEG-IFN (HIV/HCV coinfected participants are treated with PEG-IFN/ribavirin combination therapy). Eighty per cent of ATAHC participants reported IDU within the previous six months. Recruitment is planned to continue through mid-2007. Through a series of case reports, this paper describes factors that are potential barriers to recruitment, follow-up, and treatment of IDUs in the context of acute HCV infection. PEG-IFN adherence and toxicity, current substance use and mental health issues are not the only barriers to HCV treatment. Financial and transport difficulties, isolation and social support, and legal issues have been prominent and had the potential to impact on clinic attendance and treatment success.
Our work suggests that by using a multidisciplinary approach, potential barriers to recruitment and follow-up of current IDUs to HCV treatment can be effectively addressed, and this highly marginalised population can be successfully engaged and treated.
|
21
|
Pediatric Cortical Dysplasia: Correlations between Neuroimaging, Electrophysiology and Location of Cytomegalic Neurons and Balloon Cells and Glutamate/GABA Synaptic Circuits. Dev Neurosci 2005; 27:59-76. [PMID: 15886485 DOI: 10.1159/000084533] [Citation(s) in RCA: 73] [Impact Index Per Article: 3.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/29/2004] [Accepted: 11/08/2004] [Indexed: 11/19/2022] Open
Abstract
Seizures in cortical dysplasia (CD) could be from cytomegalic neurons and balloon cells acting as epileptic 'pacemakers', or abnormal neurotransmission. This study examined these hypotheses using in vitro electrophysiological techniques to determine intrinsic membrane properties and spontaneous glutamatergic and GABAergic synaptic activity for normal-pyramidal neurons, cytomegalic neurons and balloon cells from 67 neocortical sites originating from 43 CD patients (ages 0.2-14 years). Magnetic resonance imaging (MRI), (18)fluoro-2-deoxyglucose positron emission tomography (FDG-PET) and electrocorticography graded cortical sample sites from least to worst CD abnormality. Results found that cytomegalic neurons and balloon cells were observed more frequently in areas of severe CD compared with mild or normal CD regions as assessed by FDG-PET/MRI. Cytomegalic neurons (but not balloon cells) correlated with the worst electrocorticography scores. Electrophysiological recordings demonstrated that cytomegalic and normal-pyramidal neurons displayed similar firing properties without intrinsic bursting. By contrast, balloon cells were electrically silent. Normal-pyramidal and cytomegalic neurons displayed decreased spontaneous glutamatergic synaptic activity in areas of severe FDG-PET/MRI abnormalities compared with normal regions, while GABAergic activity was unaltered. In CD, these findings indicate that cytomegalic neurons (but not balloon cells) might contribute to epileptogenesis, but are not likely to be 'pacemaker' cells capable of spontaneous paroxysmal depolarizations. Furthermore, there was more GABA relative to glutamate synaptic neurotransmission in areas of severe CD. Thus, in CD tissue alternate mechanisms of epileptogenesis should be considered, and we suggest that GABAergic synaptic circuits interacting with cytomegalic and normal-pyramidal neurons with immature receptor properties might contribute to seizure generation.
|
22
|
Abstract
Huntington's disease (HD) is characterized by loss of striatal gamma-aminobutyric acid (GABA)ergic medium-sized spiny projection neurons (MSSNs), whereas some classes of striatal interneurons are relatively spared. Striatal interneurons provide most of the inhibitory synaptic input to MSSNs and use GABA as their neurotransmitter. We reported previously alterations in glutamatergic synaptic activity in the R6/2 and R6/1 mouse models of HD. In the present study, we used whole-cell voltage clamp recordings to examine GABAergic synaptic currents in MSSNs from striatal slices in these two mouse models compared to those in age-matched control littermates. The frequency of spontaneous GABAergic synaptic currents was increased significantly in MSSNs from R6/2 transgenics starting around 5-7 weeks (when the overt behavioral phenotype begins) and continuing in 9-14-week-old mice. A similar increase was observed in 12-15-month-old R6/1 transgenics. Bath application of brain-derived neurotrophic factor, which is downregulated in HD, significantly reduced the frequency of spontaneous GABAergic synaptic currents in MSSNs from R6/2 but not control mice at 9-14 weeks. Increased GABA current densities also occurred in acutely isolated MSSNs from R6/2 animals. Immunofluorescence demonstrated increased expression of the ubiquitous alpha1 subunit of GABA(A) receptors in MSSNs from R6/2 animals. These results indicate that increases in spontaneous GABAergic synaptic currents and postsynaptic receptor function occur in parallel to progressive decreases in glutamatergic inputs to MSSNs. In conjunction, both changes will severely alter striatal outputs to target areas involved in the control of movement.
|
23
|
Morphological and electrophysiological characterization of abnormal cell types in pediatric cortical dysplasia. J Neurosci Res 2003; 72:472-86. [PMID: 12704809 DOI: 10.1002/jnr.10604] [Citation(s) in RCA: 145] [Impact Index Per Article: 6.9] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/09/2022]
Abstract
The mechanisms responsible for seizure generation in cortical dysplasia (CD) are unknown, but morphologically abnormal cells could contribute. We examined the passive and active membrane properties of cells from pediatric CD in vitro. Normal- and abnormal-appearing cells were identified morphologically by using infrared videomicroscopy and biocytin in slices from children with mild to severe CD. Electrophysiological properties were assessed with patch clamp recordings. Four groups of abnormal-appearing cells were observed. The first consisted of large, pyramidal cells probably corresponding to cytomegalic neurons. Under conditions that reduced the contribution of K(+) conductances, these cells generated large Ca(2+) currents and influx when depolarized. When these cells were acutely dissociated, peak Ca(2+) currents and densities were greater in cytomegalic compared with normal-appearing pyramidal neurons. The second group included large, nonpyramidal cells with atypical somatodendritic morphology that could correspond to "balloon" cells. These cells did not display active voltage- or ligand-gated currents and did not appear to receive synaptic inputs. The third group included misoriented and dysmorphic pyramidal neurons, and the fourth group consisted of immature-looking pyramidal neurons. Electrophysiologically, neurons in these latter two groups did not display significant abnormalities when compared with normal-appearing pyramidal neurons. We conclude that there are cells with abnormal intrinsic membrane properties in pediatric CD. Among the four groups of cells, the most abnormal electrophysiological properties were displayed by cytomegalic neurons and large cells with atypical morphology. Cytomegalic neurons could play an important role in the generation of epileptic activity.
|
24
|
Transient and progressive electrophysiological alterations in the corticostriatal pathway in a mouse model of Huntington's disease. J Neurosci 2003; 23:961-9. [PMID: 12574425 PMCID: PMC6741903] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 02/28/2023] Open
Abstract
Alterations in the corticostriatal pathway may precede symptomatology and striatal cell death in Huntington's disease (HD) patients. Here we examined spontaneous EPSCs in striatal medium-sized spiny neurons in slices from a mouse model of HD (R6/2). Spontaneous EPSC frequency was similar in young (3-4 weeks) transgenics and controls but decreased significantly in transgenics when overt behavioral symptoms began (5-7 weeks) and was most pronounced in severely impaired transgenics (11-15 weeks). These differences were maintained after bicuculline or tetrodotoxin, indicating they were specific to glutamatergic input and likely presynaptic in origin. Decreases in presynaptic and postsynaptic protein markers, synaptophysin and postsynaptic density-95, occurred in 11-15 week R6/2 mice, supporting the electrophysiological results. Furthermore, isolated, large-amplitude synaptic events (>100 pA) occurred more frequently in transgenic animals, particularly at 5-7 weeks, suggesting additional dysregulation of cortical inputs. Large events were blocked by tetrodotoxin, indicating a possible cortical origin. Addition of bicuculline and 4-aminopyridine facilitated the occurrence of large events. Riluzole, a compound that decreases glutamate release, reduced these events. Together, these observations indicate that both progressive and transient alterations occur along the corticostriatal pathway in experimental HD. These alterations are likely to contribute to the selective vulnerability of striatal medium-sized spiny neurons.
|