1. Development of a standards-based city-wide health information exchange for public health in response to COVID-19. JMIR Public Health Surveill 2022; 8:e35973. PMID: 35544440; PMCID: PMC9518711; DOI: 10.2196/35973.
Abstract
Background Disease surveillance is a critical function of public health; it provides essential information about the disease burden and the clinical and epidemiologic parameters of disease and is an important element of effective and timely case and contact tracing. The COVID-19 pandemic demonstrates the essential role of disease surveillance in preserving public health. In theory, the standard data formats and exchange methods provided by electronic health record (EHR) meaningful use should enable rapid health care data exchange in the setting of disruptive health care events, such as a pandemic. In reality, access to data remains challenging and, even if available, often lacks conformity to regulated standards. Objective We sought to use regulated interoperability standards already in production to generate awareness of regional bed capacity and enhance the capture of epidemiological risk factors and clinical variables among patients tested for SARS-CoV-2. We described the technical and operational components, governance model, and timelines required to implement the public health order that mandated electronic reporting of data from EHRs among hospitals in the Chicago jurisdiction. We also evaluated the data sources, infrastructure requirements, and the completeness of data supplied to the platform and the capacity to link these sources. Methods Following a public health order mandating data submission by all acute care hospitals in Chicago, we developed the technical infrastructure to combine multiple data feeds from those EHR systems into a regional data hub to enhance public health surveillance. A cloud-based environment was created that received electronic laboratory reporting (ELR), Consolidated Clinical Document Architecture (CCDA), and bed capacity data feeds from sites. Data governance was planned from project initiation to aid in consensus and principles for data use. We measured the completeness of each feed and the match rate between feeds.
Results Data from 88,906 persons from CCDA records among 14 facilities and 408,741 persons from ELR records among 88 facilities were submitted. Most (n=448,380, 90.1%) records could be matched between CCDA and ELR feeds. Data fields absent from ELR feeds included travel histories, clinical symptoms, and comorbidities. Less than 5% of CCDA data fields were empty. Merging CCDA with ELR data improved the availability of race, ethnicity, comorbidity, and hospitalization information. Conclusions We described the development of a citywide public health data hub for the surveillance of SARS-CoV-2 infection. We were able to assess the completeness of existing ELR feeds, augment those feeds with CCDA documents, establish secure transfer methods for data exchange, develop a cloud-based architecture to enable secure data storage and analytics, and produce dashboards for monitoring capacity and disease burden. We consider this public health and clinical data registry an informative example of the power of common standards across EHRs and a potential template for future use of standards to improve public health surveillance.
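The reported 90.1% match rate can be reproduced from the abstract's counts. Below is a minimal sketch of record-level matching between two feeds; the function name and toy identifiers are my own illustration, not the study's implementation.

```python
def match_rate(ccda_ids, elr_ids):
    """Record-level match rate between two person-ID feeds.

    A matched person is counted once per feed they appear in, so the
    numerator is twice the number of matched persons and the denominator
    is the total records across both feeds."""
    ccda, elr = set(ccda_ids), set(elr_ids)
    matched = ccda & elr  # persons present in both feeds
    total = len(ccda) + len(elr)
    return (2 * len(matched)) / total if total else 0.0

# Checking the abstract's arithmetic: 88,906 CCDA + 408,741 ELR persons
# and 448,380 matched records gives 448,380 / 497,647, about 90.1%.
reported_rate = 448_380 / (88_906 + 408_741)
```

On toy data, `match_rate(["a", "b", "c"], ["b", "c", "d", "e"])` returns 4/7: two persons appear in both feeds, contributing four of the seven submitted records.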
2.
Abstract
Proper timing of critical care nutrition has long been a matter of controversy. Critical illness waxes and wanes in stages, creating a dynamic flux in energy needs that we have only begun to examine. Furthermore, response to nutrition support likely differs greatly at the level of the individual patient in regard to genetic status, disease stage, comorbidities, and more. We review the observational and randomized literature concerning timing in nutrition support, discuss mechanisms of harm in feeding critically ill patients, and highlight the role of precision nutrition for moving the literature beyond the realm of blunt population averages into one that accounts for the patient-specific complexities of critical illness and host genetics. Expected final online publication date for the Annual Review of Nutrition, Volume 41 is September 2021. Please see http://www.annualreviews.org/page/journal/pubdates for revised estimates.
3. Impact of MnSOD and GPx1 Genotype at Different Levels of Enteral Nutrition Exposure on Oxidative Stress and Mortality: A Post hoc Analysis From the FeDOx Trial. JPEN J Parenter Enteral Nutr 2020; 45:287-294. PMID: 32885455; DOI: 10.1002/jpen.2012.
Abstract
BACKGROUND Converting nutrition support to energy results in mitochondrial free radical production, possibly increasing oxidative stress. Highly prevalent single nucleotide variants (SNVs) exist for the genes encoding antioxidant enzymes responsible for the detoxification of reactive oxygen species. Our objective was to explore the interaction between nutrition support and genetic SNVs for two antioxidant proteins (the rs4880 SNV for manganese superoxide dismutase and the rs1050450 SNV for glutathione peroxidase 1) on oxidative stress and, secondarily, on intensive care unit (ICU) mortality. METHODS We performed a post hoc analysis of 34 mechanically ventilated sepsis patients from a randomized controlled feeding trial. Participants were dichotomized into those who carried both the rs4880 and rs1050450 at-risk alleles (Risk Group) versus all others (Nonrisk Group). We explored the interaction between genotype and percent time spent in the upper median of energy exposure on oxidative stress and ICU mortality. RESULTS Adjusting for confounders, the slope of log F2-isoprostane levels across percentage of days spent in the upper median of daily kilocalories per kilogram (kcal/kg) was 0.01 higher in the Risk Group than in the Nonrisk Group (P=0.01). Every 1% increase in days spent in the upper median of daily kcal/kg was associated with an adjusted 10.3% increase in the odds of ICU mortality among participants in the Risk Group (odds ratio [OR]=1.103, P=0.06) but showed no association in the Nonrisk Group (OR=0.991, P=0.79). CONCLUSION Nutrition support may lead to increased oxidative stress and worse clinical outcomes in a large percentage of ICU patients with an at-risk genotype.
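Under the logistic model behind the reported odds ratio, per-unit effects compound multiplicatively. A quick sketch of what OR=1.103 per percentage point implies over larger exposure differences (my illustration, not the study's analysis code):

```python
def compounded_odds_ratio(or_per_unit, units):
    """Odds multiplier implied by a per-unit odds ratio over `units`
    one-unit increases in the exposure (logistic-model assumption)."""
    return or_per_unit ** units

# A 10-percentage-point increase in days spent in the upper median of
# daily kcal/kg would multiply the odds of ICU mortality by roughly 2.7
# in the Risk Group (1.103 ** 10), versus roughly 0.91 in the Nonrisk
# Group (0.991 ** 10).
risk_or_10 = compounded_odds_ratio(1.103, 10)
nonrisk_or_10 = compounded_odds_ratio(0.991, 10)
```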
4. Higher Caloric Exposure in Critically Ill Patients Transiently Accelerates Thyroid Hormone Activation. J Clin Endocrinol Metab 2020; 105:5580691. PMID: 31581295; PMCID: PMC9633328; DOI: 10.1210/clinem/dgz077.
Abstract
INTRODUCTION The inflammatory response of critical illness is accompanied by nonthyroidal illness syndrome (NTIS). Feeding has been shown to attenuate this process, but this has not been explored prospectively over time in critically ill patients. OBJECTIVE To explore the impact of calorie exposure on NTIS over time in critically ill patients. METHODS Mechanically ventilated patients with systemic inflammatory response syndrome (SIRS) were randomized to receive either 100% or 40% of their estimated caloric needs (ECN). Thyroid hormones were measured daily for 7 days or until intensive care unit discharge or death. Mixed-level regression modeling was used to explore the effect of randomization group on plasma triiodothyronine (T3), reverse triiodothyronine (rT3), thyroxine (T4), and thyroid-stimulating hormone (TSH), as well as the T3/rT3 ratio. RESULTS Thirty-five participants (n=19 in the 100% ECN group; n=16 in the 40% ECN group) were recruited. Adjusting for group differences in baseline T3/rT3 ratio, the parameters defining the fitted curves (intercept, linear effect of study day, and quadratic effect of study day) differed by randomization group (P = 0.001, P = 0.01, and P = 0.02, respectively). Plots of the fitted curves revealed that participants in the 100% ECN group had a 54% higher T3/rT3 ratio on postintervention day 1 than the 40% ECN group, a difference that attenuated over time. This was driven by 23% higher plasma T3 and 10% lower plasma rT3 levels on postintervention day 1. CONCLUSIONS Higher caloric exposure in NTIS patients transiently attenuates the drop in the plasma T3/rT3 ratio, an effect that is minimized and finally lost over the following 3 days of continued higher caloric exposure.
5. Real-Time Energy Exposure Is Associated With Increased Oxidative Stress Among Feeding-Tolerant Critically Ill Patients: Results From the FEDOX Trial. JPEN J Parenter Enteral Nutr 2020; 44:1484-1491. PMID: 31995239; PMCID: PMC7754354; DOI: 10.1002/jpen.1776.
Abstract
Background Prospective randomized controlled trials (PRCTs) that found harm in patients receiving higher levels of energy exposure have been largely ignored, in part because of the lack of a known mechanism of harm. Objective The current 7-day pilot study is a PRCT and post hoc analysis designed to explore the relationship between energy exposure and oxidative stress (as plasma total F2-isoprostanes) in mechanically ventilated intensive care unit patients with systemic inflammatory response syndrome. Methods Thirty-five participants were randomized to receive either 100% or 40% of their estimated energy needs. Our intent-to-treat model found no differences in F2-isoprostanes between groups. A post hoc analysis revealed that on days when participants were in the highest tertile of daily kcal/kg, the real-time energy flow rate within 2 hours of the blood draw was predictive of increased oxidative stress. On these days, participants in the second or third vs the first tertile of real-time energy flow rate experienced a 41.8% (P = .006) or 26.5% (P = .001) increase in F2-isoprostane levels, respectively. This was confirmed through a within-group subanalysis restricted to participants with measurements on both sides of the median of real-time energy flow rate that found a 28.2% F2-isoprostane increase on days in the upper vs lower median of flow rate (P = .002). Conclusion The benefits of feeding may be more nuanced than previously suspected. Our findings imply a potential mechanism of harm in meeting the current recommendations for nutrition support in the critically ill that warrants further investigation.
6. Disagreement Between Hospital Rating Systems: Measuring the Correlation of Multiple Benchmarks and Developing a Quality Composite Rank. Am J Med Qual 2019; 35:222-230. PMID: 31253048; DOI: 10.1177/1062860619860250.
Abstract
In the United States, the usefulness of hospital rating systems is limited by their heterogeneity and conflicting results. The US News Best Hospitals, Vizient Quality and Accountability Study, Centers for Medicare & Medicaid Services (CMS) Star Rating, Leapfrog Hospital Safety Grade, and Truven Top 100 Hospitals ratings were compared using Spearman correlations. Rank aggregation was used to combine the scores, generating a Quality Composite Rank (QCR). The highest correlation between rating systems was between the Leapfrog Safety Grade and the CMS Star Rating. In a proportional odds logistic regression, greater discordance among the CMS Star Rating, Vizient rank, US News, and Leapfrog was associated with a lower overall rank in the QCR. Lack of transparency and understanding about the differences and similarities of these hospital ranking systems complicates use of the measures. By combining the results of these ranking systems into a composite, the measurement of hospital quality can be simplified.
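The two techniques the study names, Spearman correlation between rating systems and rank aggregation into a composite, can be sketched as follows. The helper names are mine, ties are ignored for simplicity, and this is not the authors' code:

```python
def ranks(scores):
    """Rank positions (1 = best); assumes no ties for simplicity."""
    order = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
    r = [0] * len(scores)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

def spearman(x, y):
    """Spearman rho via the classic no-ties formula."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

def composite_rank(*rating_lists):
    """QCR-style composite: rank hospitals by mean rank across systems."""
    all_ranks = [ranks(r) for r in rating_lists]
    mean = [sum(col) / len(all_ranks) for col in zip(*all_ranks)]
    return ranks([-m for m in mean])  # lower mean rank = better
```

Two systems that score hospitals identically yield rho = 1 and an unchanged composite order; fully reversed systems yield rho = -1, which is the kind of disagreement the composite is meant to smooth over.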
7. A comparison of low and standard anti-coagulation regimens in extracorporeal membrane oxygenation. J Heart Lung Transplant 2019; 38:433-439. DOI: 10.1016/j.healun.2019.01.1313.
8. Extracorporeal Membrane Oxygenation Bridges Inoperable Patients to Definitive Cardiac Operation. ASAIO J 2019; 65:43-48. DOI: 10.1097/mat.0000000000000741.
9. Role of timing and dose of energy received in patients with acute lung injury on mortality in the Intensive Nutrition in Acute Lung Injury Trial (INTACT): a post hoc analysis. Am J Clin Nutr 2017; 105:411-416. PMID: 27974311; PMCID: PMC5267300; DOI: 10.3945/ajcn.116.140764.
Abstract
BACKGROUND Our trial INTACT (Intensive Nutrition in Acute Lung Injury Trial) was designed to compare the impact of feeding from acute lung injury (ALI) diagnosis to hospital discharge, an interval that, to our knowledge, has not yet been explored. It was stopped early because participants who were randomly assigned to energy intakes at nationally recommended amounts via intensive medical nutrition therapy experienced significantly higher mortality hazards than did those assigned to standard nutrition support care that provided energy at 55% of recommended amounts. OBJECTIVE We assessed the influence of dose and timing of feeding on hospital mortality. DESIGN Participants (n = 78) were dichotomized as died or discharged alive. Associations between the energy and protein received overall, early (days 1-7), and late (days ≥8) and the hazards of hospital mortality were evaluated between groups with multivariable analysis methods. RESULTS Higher overall energy intake predicted significantly higher mortality (OR: 1.14; 95% CI: 1.02, 1.27). Among participants enrolled for ≥8 d (n = 66), higher early energy intake significantly increased the hazard of mortality (HR: 1.17; 95% CI: 1.07, 1.28), whereas higher late energy intake was significantly protective (HR: 0.91; 95% CI: 0.83, 1.0). Results were similar for early but not late protein (grams per kilogram) exposure (early-exposure HR: 8.9; 95% CI: 2.3, 34.3; late-exposure HR: 0.15; 95% CI: 0.02, 1.1). Threshold analyses indicated that early mean intakes ≥18 kcal/kg significantly increased subsequent mortality. CONCLUSIONS Providing kilocalories per kilogram or grams of protein per kilogram at recommended levels early after ALI diagnosis was associated with significantly higher hazards of mortality, whereas higher late energy intakes reduced mortality hazards. This time-varying effect violated the Cox proportionality assumption, indicating that feeding trials in similar populations should extend beyond 7 d and use time-varying statistical methods. Future trials are required for corroboration. INTACT was registered at clinicaltrials.gov as NCT01921101.
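Assuming the hazard ratios are per kcal/kg (the abstract does not state the unit explicitly), the Cox model's multiplicative structure makes the early/late reversal concrete; a hypothetical sketch:

```python
def compounded_hr(hr_per_unit, units):
    """Hazard multiplier implied by a per-unit hazard ratio over `units`
    one-unit increases (proportional-hazards assumption within a period)."""
    return hr_per_unit ** units

# An 8 kcal/kg difference (e.g. 10 vs 18 kcal/kg, near the early-intake
# threshold the analysis flagged) implies roughly a 3.5-fold higher hazard
# when received early (1.17 ** 8) but roughly a halved hazard when the
# same difference occurs late (0.91 ** 8).
early_hr_8 = compounded_hr(1.17, 8)
late_hr_8 = compounded_hr(0.91, 8)
```

The sign flip between periods is exactly why a single time-fixed Cox coefficient misfits these data, motivating the time-varying methods the conclusion calls for.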
10.
Abstract
Background The optimal timing of surgical treatment for infective endocarditis complicated by cerebrovascular events is controversial, largely because of the perceived risk of perioperative intracranial bleeding. Current guidelines suggest waiting 2 weeks between the diagnosis of stroke and surgery. The aim of this study was to investigate the clinical and neurological outcomes of early surgery following a stroke. Methods This was a single-center retrospective analysis of 12 consecutive patients requiring surgery for infective endocarditis between 2011 and 2014 at Rush University Medical Center, with ischemic (n = 6) and/or hemorrhagic (n = 6) cerebrovascular complications. All underwent computed tomographic angiography prior to early valve reconstructive surgery to identify potentially actionable neurological findings. Early valve surgery was performed for ongoing sepsis or persistent emboli. Neurologic risk and outcome were assessed pre- and postoperatively using the National Institutes of Health Stroke Scale and the Glasgow Outcome Scale, respectively. Results All 12 patients underwent surgical treatment within 10 days of the diagnosis of stroke. Mortality in the immediate postoperative period was 8%. Eleven of the 12 patients exhibited good neurological recovery in the immediate postoperative period, with a Glasgow Outcome Scale score ≥3. There was no correlation between duration of cardiopulmonary bypass and neurological outcomes. Conclusion Early cardiac surgery in patients with infective endocarditis and stroke may be lifesaving, with a low neurological risk. Comprehensive neurovascular imaging may help in identifying patient-related risk factors.
11.
12. ECMO for Shock Due to Drug Overdose: Timely Transfer to ECMO Capable Center Can Save Lives. Chest 2015. DOI: 10.1378/chest.2270082.
13.
14. Chicago Ebola Response Network (CERN): A Citywide Cross-hospital Collaborative for Infectious Disease Preparedness. Clin Infect Dis 2015; 61:1554-7. DOI: 10.1093/cid/civ510.
15.
Abstract
BACKGROUND Despite extensive use of enteral (EN) and parenteral nutrition (PN) in intensive care unit (ICU) populations for 4 decades, evidence to support their efficacy is extremely limited. METHODS A prospective randomized trial was conducted to evaluate the impact on outcomes of intensive medical nutrition therapy (IMNT; provision of >75% of estimated energy and protein needs per day via EN and adequate oral diet) from diagnosis of acute lung injury (ALI) to hospital discharge compared with standard nutrition support care (SNSC; standard EN and ad lib feeding). The primary outcome was infections; secondary outcomes included number of days on mechanical ventilation, in the ICU, and in the hospital, as well as mortality. RESULTS Overall, 78 patients (40 IMNT and 38 SNSC) were recruited. There were no significant differences between groups in age, body mass index, disease severity, white blood cell count, glucose, C-reactive protein, or energy or protein needs. The IMNT group received a significantly higher percentage of estimated energy (84.7% vs 55.4%, P < .0001) and protein needs (76.1% vs 54.4%, P < .0001) per day compared with SNSC. No differences occurred in length of mechanical ventilation, hospital or ICU stay, or infections. The trial was stopped early because of significantly greater hospital mortality in IMNT vs SNSC (40% vs 16%, P = .02). Cox proportional hazards models indicated that the hazard of death in the IMNT group was 5.67 times higher (P = .001) than in the SNSC group. CONCLUSIONS Provision of IMNT from ALI diagnosis to hospital discharge increases mortality.
16. 1348. Crit Care Med 2013. DOI: 10.1097/01.ccm.0000440578.59344.a3.
17. Assessing Indicators of Acute and Chronic Malnutrition. J Acad Nutr Diet 2012. DOI: 10.1016/j.jand.2012.06.241.
18. Impact of the implementation of a sepsis protocol for the management of fluid-refractory septic shock: A single-center, before-and-after study. Clin Ther 2010; 32:1285-93. PMID: 20678676; DOI: 10.1016/j.clinthera.2010.07.003.
Abstract
BACKGROUND Evidence-based guidelines have been published for the acute management of severe sepsis and septic shock. Key goals of institution-driven protocols include timely fluid resuscitation and antibiotic selection, as well as source control. OBJECTIVE This study assessed the impact of a sepsis protocol on the timeliness of antibiotic administration, the adequacy of fluid resuscitation, and 28-day mortality in patients with fluid-refractory septic shock. METHODS This was a single-center, before-and-after study (18 months before July 2007 and 18 months after) with prospective data collection evaluating the outcomes of a sepsis protocol in adult patients with fluid-refractory septic shock. All included patients received a fluid challenge and antibiotics; those who did not were excluded from this analysis. Preprotocol findings led to the development of the sepsis protocol, which emphasized fluid resuscitation, timely administration of antibiotic therapy, and collection of specimens for culture at the onset of septic shock. In the pre- and postprotocol phases of the study, data were collected prospectively and analyzed for demographic characteristics; Acute Physiology and Chronic Health Evaluation (APACHE) II score; appropriateness of fluid resuscitation; antibiotic use; number of vasopressor, ventilator, and intensive care unit (ICU) days; and 28-day mortality. Outcomes were measured prospectively at any time during the patient's hospital admission. The primary end points were the time to administration of antimicrobial therapy and the appropriateness of fluid resuscitation before and after implementation of the sepsis protocol. RESULTS A total of 118 patients were included in the analysis: 64 and 54 in the pre- and postprotocol groups, respectively. Patients in the preprotocol group were primarily women (53% [34/64]) and had a mean (SD) age of 61 (15.5) years and a mean APACHE II score of 28 (6.0). Patients in the postprotocol group were primarily men (54% [29/54]) and had a mean age of 52 (18.0) years and a mean APACHE II score of 27 (6.4). Implementation of the sepsis protocol resulted in a greater percentage of patients receiving timely antibiotic therapy (ie, within 4.5 hours of refractory shock; 85% [46/54] vs 56% [36/64]; P = 0.001) and adequate fluid resuscitation (72% [39/54] vs 31% [20/64]; P < 0.001) compared with the preprotocol group. Post hoc analysis found significant decreases in the number of vasopressor days (mean [SD], 3.8 [2.7] to 1.4 [1.5]; P < 0.001), ventilator days (9.1 [12.2] to 2.7 [4.0]; P < 0.001), and ICU days (12.3 [12.6] to 4.9 [3.9]; P < 0.001) in the postprotocol group. In-hospital mortality was not significantly different between the groups (survival 46% [28/61] before vs 54% [33/61] after the protocol). Multivariate analysis identified the following significant predictors of in-hospital mortality: an interval between shock and empiric antibiotic administration of >4.5 hours (odds ratio [OR] = 5.54; 95% CI, 1.91-16.07; P < 0.002), vasopressor duration in days (OR = 1.27; 95% CI, 1.01-1.59; P = 0.037), APACHE II score (OR = 1.14; 95% CI, 1.05-1.24; P = 0.003), and type of infection (community vs nosocomial; OR = 0.18; 95% CI, 0.05-0.61; P = 0.006). The 28-day mortality decreased from 61% (39/64) to 33% (18/54) after implementation of the protocol (P = 0.004). CONCLUSION Implementation of a sepsis protocol emphasizing early administration of antibiotic therapy and adequate fluid resuscitation was associated with improved clinical outcomes and lower 28-day mortality in patients with fluid-refractory septic shock at this institution.
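The unadjusted 28-day mortality contrast can be expressed as an absolute risk reduction and odds ratio; a quick check of the reported figures (my arithmetic, not the paper's multivariate model):

```python
def odds_ratio(events_a, n_a, events_b, n_b):
    """Unadjusted odds ratio for group A versus group B."""
    odds_a = events_a / (n_a - events_a)
    odds_b = events_b / (n_b - events_b)
    return odds_a / odds_b

pre_risk = 39 / 64    # 28-day mortality before the protocol, about 61%
post_risk = 18 / 54   # 28-day mortality after the protocol, about 33%
arr = pre_risk - post_risk                  # absolute risk reduction
unadjusted_or = odds_ratio(39, 64, 18, 54)  # pre vs post odds of death
```

The implied number needed to treat is `1 / arr`, roughly 3.6, i.e. about 4 patients managed under the protocol per 28-day death averted (unadjusted, so this ignores the baseline differences between the two cohorts noted above).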
19. Reducing use of indwelling urinary catheters and associated urinary tract infections. Am J Crit Care 2009; 18:535-41; quiz 542. PMID: 19880955; DOI: 10.4037/ajcc2009938.
Abstract
BACKGROUND Use of indwelling urinary catheters can lead to complications, most commonly catheter-associated urinary tract infections. Duration of catheterization is the major risk factor. These infections can result in sepsis, prolonged hospitalization, additional hospital costs, and mortality. OBJECTIVES To implement and evaluate the efficacy of an intervention to reduce catheter-associated urinary tract infections in a medical intensive care unit by decreasing use of urinary catheters. METHODS Indications for continuing urinary catheterization with indwelling devices were developed by unit clinicians. For a 6-month intervention period, patients in a medical intensive care unit who had indwelling urinary catheters were evaluated daily by using criteria for appropriate catheter continuance. Recommendations were made to discontinue indwelling urinary catheters in patients who did not meet the criteria. Days of use of a urinary catheter and rates of catheter-associated urinary tract infections during the intervention were compared with those of the preceding 11 months. RESULTS During the study period, 337 patients had a total of 1432 days of urinary catheterization. With use of guidelines, duration of use was significantly reduced to a mean of 238.6 d/mo from the previous rate of 311.7 d/mo. The number of catheter-associated urinary tract infections per 1000 days of use was a mean of 4.7/mo before the intervention and zero during the 6-month intervention period. CONCLUSIONS Implementation of an intervention to judge appropriateness of indwelling urinary catheters may result in significant reductions in duration of catheterization and occurrences of catheter-associated urinary tract infections.
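The outcome unit here, infections per 1000 catheter-days, is a simple device-utilization rate; a small sketch (the function name and the example figures are illustrative, not taken from the study's data):

```python
def cauti_rate_per_1000(infections, catheter_days):
    """Catheter-associated UTIs per 1000 catheter-days of device use."""
    if catheter_days <= 0:
        raise ValueError("catheter_days must be positive")
    return 1000 * infections / catheter_days

# e.g. 1 infection over 213 catheter-days is about 4.7 per 1000
# catheter-days, the scale of the pre-intervention rate in the abstract;
# zero infections over any number of catheter-days gives a rate of 0,
# as observed during the 6-month intervention.
example_rate = cauti_rate_per_1000(1, 213)
```

Normalizing by catheter-days rather than by patients is what lets the before and after periods be compared despite different catheter utilization.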
20.
Abstract
BACKGROUND Current clinical practice guidelines delineate optimal nutrition management in the intensive care unit (ICU) patient. In light of these existing data, the authors identify current physician perceptions of nutrition in critical illness, preferences relating to initiation of feeding, and management practices specific to nutrition after initiation of feeding in the ICU patient. METHODS The authors electronically distributed a 12-question survey to attending physicians, fellows, and residents who routinely admit patients to medical and surgical ICUs. RESULTS On a scale ranging from 1 to 5 (1 = low, 5 = high), the attending physician's mean rating for importance of nutrition in the ICU was 4.60, the rating for comfort level with the nutrition support at the authors' institution was 3.70, and the rating for the physician's own understanding of nutrition support in critically ill patients was 3.33. Attending physicians, fellows, and residents reported waiting an average of 2.43, 1.79, and 2.63 days, respectively, before addressing nutrition status in an ICU patient. Fifty-two percent of attending physicians chose parenteral nutrition as the preferred route of nutrition support in a patient with necrotizing pancreatitis. If a patient experiences enteral feeding intolerance, physicians most commonly would stop tube feeds. There was no significant difference in responses to any of the survey questions between attending physicians, fellows, and residents. CONCLUSIONS This study demonstrates a substantial discordance in physician perceptions and practice patterns regarding initiation and management of nutrition in ICU patients, indicating an urgent need for nutrition-related education at all levels of training.
21.
22.
Abstract
Methotrexate is a commonly prescribed antineoplastic and immune-modulating compound that has gained wide acceptance in the management of rheumatoid arthritis, psoriasis, sarcoidosis, and a number of neoplastic disorders. Although generally considered safe and easy to use, methotrexate has been associated with a number of adverse reactions. Pulmonary toxicity is well described and may take a variety of forms. Pulmonary infiltrates are the most commonly encountered form of methotrexate pulmonary toxicity, and these infiltrates resemble hypersensitivity lung disease. This discussion focuses primarily on low-dose methotrexate pulmonary toxicity and reviews its diagnosis using clinical, pulmonary function, radiographic, and pathological manifestations. Suggestions for clinical monitoring to detect adverse effects are given. In addition, management of pulmonary toxicity through discontinuation of methotrexate, supportive care, and possibly the administration of corticosteroids is discussed.