1. Adrenaline and vasopressin for cardiac arrest. Emergencias 2021;32:133-134. PMID: 32125114.
2. Mitochondrial DNAs provide insight into trypanosome phylogeny and molecular evolution. BMC Evol Biol 2020;20:161. PMID: 33297939; PMCID: PMC7724854; DOI: 10.1186/s12862-020-01701-9.
Abstract
Background Trypanosomes are single-celled eukaryotic parasites characterised by the unique biology of their mitochondrial DNA. African livestock trypanosomes impose a major burden on agriculture across sub-Saharan Africa, but are poorly understood compared to those that cause sleeping sickness and Chagas disease in humans. Here we explore the potential of the maxicircle, a component of trypanosome mitochondrial DNA, to study the evolutionary history of trypanosomes. Results We used long-read sequencing to completely assemble maxicircle mitochondrial DNA from four previously uncharacterised African trypanosomes, and leveraged these assemblies to scaffold and assemble a further 103 trypanosome maxicircle gene coding regions from published short-read data. While synteny was largely conserved, there were repeated, independent losses of Complex I genes. Comparison of pre-edited and non-edited genes revealed the impact of RNA editing on nucleotide composition, with non-edited genes approaching the limits of GC loss. African tsetse-transmitted trypanosomes showed high levels of RNA editing compared to other trypanosomes. The gene coding regions of maxicircle mitochondrial DNAs were used to construct time-resolved phylogenetic trees, revealing deep divergence events among isolates of the pathogens Trypanosoma brucei and T. congolense. Conclusions Our data represent a new resource for experimental and evolutionary analyses of trypanosome phylogeny, molecular evolution and function. Molecular clock analyses yielded a timescale for trypanosome evolution congruent with major biogeographical events in Africa and revealed the recent emergence of Trypanosoma brucei gambiense and T. equiperdum, major human and animal pathogens.
3.
Abstract
BACKGROUND Adrenaline and vasopressin are widely used to treat people with cardiac arrest, but there is uncertainty about their safety, effectiveness and the optimal dose. OBJECTIVES To determine whether adrenaline or vasopressin, or both, administered during cardiac arrest, afford any survival benefit. SEARCH METHODS We searched the Cochrane Central Register of Controlled Trials, MEDLINE, Embase and DARE from their inception to 8 May 2018, and the International Liaison Committee on Resuscitation 2015 Advanced Life Support Consensus on Science and Treatment Recommendations. We also searched four trial registers on 5 September 2018 and checked the reference lists of the included studies and review papers to identify potential papers for review. SELECTION CRITERIA Any randomised controlled trial comparing: standard-dose adrenaline versus placebo; standard-dose adrenaline versus high-dose adrenaline; and adrenaline versus vasopressin, in any setting, due to any cause of cardiac arrest, in adults and children. There were no language restrictions. DATA COLLECTION AND ANALYSIS Two review authors independently identified trials for review, assessed risks of bias and extracted data, resolving disagreements through re-examination of the trial reports and by discussion. We used risk ratios (RRs) with 95% confidence intervals (CIs) to compare dichotomous outcomes for clinical events. There were no continuous outcomes reported. We examined groups of trials for heterogeneity. We report the quality of evidence for each outcome, using the GRADE approach. MAIN RESULTS We included 26 studies (21,704 participants). Moderate-quality evidence found that adrenaline increased survival to hospital discharge compared to placebo (RR 1.44, 95% CI 1.11 to 1.86; 2 studies, 8538 participants; an increase from 23 to 32 per 1000, 95% CI 25 to 42).
We are uncertain about survival to hospital discharge for high-dose compared to standard-dose adrenaline (RR 1.10, 95% CI 0.75 to 1.62; 10 studies, 6274 participants; an increase from 33 to 36 per 1000, 95% CI 24 to 53); standard-dose adrenaline versus vasopressin (RR 1.25, 95% CI 0.84 to 1.85; 6 studies, 2511 participants; an increase from 72 to 90 per 1000, 95% CI 60 to 133); and standard-dose adrenaline versus vasopressin plus adrenaline (RR 0.76, 95% CI 0.47 to 1.22; 3 studies, 3242 participants; a possible decrease from 24 to 18 per 1000, 95% CI 11 to 29), due to very low-quality evidence. Moderate-quality evidence found that adrenaline compared with placebo increased survival to hospital admission (RR 2.51, 95% CI 1.67 to 3.76; 2 studies, 8489 participants; an increase from 83 to 209 per 1000, 95% CI 139 to 313). We are uncertain about survival to hospital admission when comparing standard-dose with high-dose adrenaline, due to very low-quality evidence. Vasopressin may improve survival to hospital admission when compared with standard-dose adrenaline (RR 1.27, 95% CI 1.04 to 1.54; 3 studies, 1953 participants; low-quality evidence; an increase from 260 to 330 per 1000, 95% CI 270 to 400), and may make little or no difference when compared to standard-dose adrenaline plus vasopressin (RR 0.95, 95% CI 0.83 to 1.08; 3 studies, 3249 participants; low-quality evidence; a decrease from 218 to 207 per 1000, 95% CI 181 to 236). There was no evidence that adrenaline (any dose) or vasopressin improved neurological outcomes. The rate of return of spontaneous circulation (ROSC) was higher for standard-dose adrenaline versus placebo (RR 2.86, 95% CI 2.21 to 3.71; 3 studies, 8663 participants; moderate-quality evidence; an increase from 115 to 329 per 1000, 95% CI 254 to 427).
We are uncertain about the effect on ROSC for the comparisons of standard-dose versus high-dose adrenaline and standard-dose adrenaline versus vasopressin, due to very low-quality evidence. Standard-dose adrenaline may make little or no difference to ROSC when compared to standard-dose adrenaline plus vasopressin (RR 0.97, 95% CI 0.87 to 1.08; 3 studies, 3249 participants; low-quality evidence; a possible decrease from 299 to 290 per 1000, 95% CI 260 to 323). The source of funding was not stated in 11 of the 26 studies. The study drugs were provided by the manufacturer in four of the 26 studies, but neither drug represents a profitable commercial option. The other 11 studies were funded by organisations such as research foundations and government funding bodies. AUTHORS' CONCLUSIONS This review provides moderate-quality evidence that standard-dose adrenaline compared to placebo improves return of spontaneous circulation, survival to hospital admission and survival to hospital discharge, but low-quality evidence that it does not affect survival with a favourable neurological outcome. Very low-quality evidence found that high-dose adrenaline compared to standard-dose adrenaline improved return of spontaneous circulation and survival to admission. Vasopressin compared to standard-dose adrenaline improved survival to admission but not return of spontaneous circulation, whilst the combination of adrenaline and vasopressin compared with adrenaline alone had no effect on these outcomes. Neither standard-dose adrenaline, high-dose adrenaline, vasopressin, nor a combination of adrenaline and vasopressin improved survival with a favourable neurological outcome. Many of these studies were conducted more than 20 years ago. Treatment has changed in recent years, so the findings from older studies may not reflect current practice.
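The review reports each risk ratio alongside an absolute effect "per 1000". That conversion is simple arithmetic: scale the control-group (baseline) risk by the RR and its confidence limits. A minimal sketch, with a hypothetical helper name; small differences from the review's published per-1000 figures arise because reviews apply the RR to an unrounded baseline risk.

```python
def absolute_effect_per_1000(baseline_per_1000, rr, rr_low, rr_high):
    """Scale a control-group risk (per 1000) by a risk ratio and its 95% CI
    to obtain the corresponding absolute risks per 1000."""
    return (baseline_per_1000 * rr,
            baseline_per_1000 * rr_low,
            baseline_per_1000 * rr_high)

# Survival to hospital discharge, adrenaline vs placebo: RR 1.44
# (95% CI 1.11 to 1.86) applied to a baseline of 23 per 1000.
point, low, high = absolute_effect_per_1000(23, 1.44, 1.11, 1.86)
print(round(point), round(low), round(high))
```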
4. Epidemiology of trauma patients attended by ambulance paramedics in Perth, Western Australia. Emerg Med Australas 2018;30:827-833. PMID: 30044053; DOI: 10.1111/1742-6723.13148.
Abstract
OBJECTIVE The aim of the study was to describe the epidemiology of trauma in adult patients attended by ambulance paramedics in Perth, Western Australia. METHODS A retrospective cohort study of trauma patients aged ≥16 years attended by St John Ambulance Western Australia (SJA-WA) paramedics in greater metropolitan Perth between 2013 and 2016 using the SJA-WA database and WA death data. Incidence and 30 day mortality rates were calculated. Patients who died prehospital (immediate deaths), on the day of injury (early deaths), within 30 days (late deaths) and those who survived longer than 30 days (survivors) were compared for age, sex, mechanism of injury and acuity level. Prehospital interventions were also reported. RESULTS Overall, 97 724 cases were included. A statistically significant increase in the incidence rate occurred over the study period (from 1466 to 1623 per 100 000 population per year; P ≤ 0.001). There were 2183 deaths within 30 days (n = 2183/97 724, 2.2%). Motor vehicle accidents were responsible for most immediate and early deaths (n = 98/203, 48.3% and n = 72/156, 46.2%, respectively). The majority of transported patients were low acuity (acuity levels 3 to 5, n = 60 594/79 887, 75.8%) and high-acuity patients accounted for 2.7% (n = 2176/79 997). Analgesia administration was the most frequently performed intervention (n = 32 333/80 643, 40.1%), followed by insertion of intravenous catheters (n = 25 060/80 643, 31.1%). Advanced life support interventions such as endotracheal intubation were performed in <1% of patients. CONCLUSION The trauma incidence rate increased over time and the majority of patients had low-acuity injuries. Focusing research, training and resources solely on high-acuity patients will not cater for the needs of the majority of patients.
5. Reply to: 'Cardiac arrest and breathing, why bother?' Because it's too late if we wait for a definitive diagnosis. Resuscitation 2018;126:e10-e11. PMID: 29476892; DOI: 10.1016/j.resuscitation.2018.02.018.
6. 'She's sort of breathing': What linguistic factors determine call-taker recognition of agonal breathing in emergency calls for cardiac arrest? Resuscitation 2017;122:92-98. PMID: 29183831; DOI: 10.1016/j.resuscitation.2017.11.058.
Abstract
BACKGROUND In emergency ambulance calls, agonal breathing remains a barrier to the recognition of out-of-hospital cardiac arrest (OHCA), initiation of cardiopulmonary resuscitation, and rapid dispatch. We aimed to explore whether the language used by callers to describe breathing had an impact on call-taker recognition of agonal breathing and hence cardiac arrest. METHODS We analysed 176 calls of paramedic-confirmed OHCA, stratified by recognition of OHCA (89 cases recognised, 87 cases not recognised). We investigated the linguistic features of callers' response to the question "is s/he breathing?" and examined the impact on subsequent coding by call-takers. RESULTS Among all cases (recognised and non-recognised), 64% (113/176) of callers said that the patients were breathing (yes-answers). We identified two categories of yes-answers: 56% (63/113) were plain answers, confirming that the patient was breathing ("he's breathing"); and 44% (50/113) were qualified answers, containing additional information ("yes but gasping"). Qualified yes-answers were suggestive of agonal breathing. Yet these answers were often not pursued and most (32/50) of these calls were not recognised as OHCA at dispatch. CONCLUSION There is potential for improved recognition of agonal breathing if call-takers are trained to be alert to any qualification following a confirmation that the patient is breathing.
7. The linguistic and interactional factors impacting recognition and dispatch in emergency calls for out-of-hospital cardiac arrest: a mixed-method linguistic analysis study protocol. BMJ Open 2017;7:e016510. PMID: 28694349; PMCID: PMC5541602; DOI: 10.1136/bmjopen-2017-016510.
Abstract
INTRODUCTION Emergency telephone calls placed by bystanders are crucial to the recognition of out-of-hospital cardiac arrest (OHCA), fast ambulance dispatch and initiation of early basic life support. Clear and efficient communication between caller and call-taker is essential to this time-critical emergency, yet few studies have investigated the impact that linguistic factors may have on the nature of the interaction and the resulting trajectory of the call. This research aims to provide a better understanding of communication factors impacting on the accuracy and timeliness of ambulance dispatch. METHODS AND ANALYSIS A dataset of OHCA calls and their corresponding metadata will be analysed from an interdisciplinary perspective, combining linguistic analysis and health services research. The calls will be transcribed and coded for linguistic and interactional variables and then used to answer a series of research questions about the recognition of OHCA and the delivery of basic life-support instructions to bystanders. Linguistic analysis of calls will provide a deeper understanding of the interactional dynamics between caller and call-taker which may affect recognition and dispatch for OHCA. Findings from this research will translate into recommendations for modifications of the protocols for ambulance dispatch and provide directions for further research. ETHICS AND DISSEMINATION The study has been approved by the Curtin University Human Research Ethics Committee (HR128/2013) and the St John Ambulance Western Australia Research Advisory Group. Findings will be published in peer-reviewed journals and communicated to key audiences, including ambulance dispatch professionals.
8. 'Tell me exactly what's happened': When linguistic choices affect the efficiency of emergency calls for cardiac arrest. Resuscitation 2017;117:58-65. PMID: 28599999; DOI: 10.1016/j.resuscitation.2017.06.002.
Abstract
BACKGROUND Clear and efficient communication between emergency caller and call-taker is crucial to timely ambulance dispatch. We aimed to explore the impact of linguistic variation in the delivery of the prompt "okay, tell me exactly what happened" on the way callers describe the emergency in the Medical Priority Dispatch System®. METHODS We analysed 188 emergency calls for cases of paramedic-confirmed out-of-hospital cardiac arrest. We investigated the linguistic features of the prompt "okay, tell me exactly what happened" in relation to the format (report vs. narrative) of the caller's response. In addition, we compared calls with report vs. narrative responses in the length of response and time to dispatch. RESULTS Callers were more likely to respond with a report format when call-takers used the present perfect ("what's happened") rather than the simple past ("what happened") (Adjusted Odds Ratio [AOR] 4.07; 95% Confidence Interval [95%CI] 2.05-8.28, p<0.001). Reports were significantly shorter than narrative responses (9s vs. 18s, p<0.001), and were associated with less time to dispatch (50s vs. 58s, p=0.002). CONCLUSION These results suggest that linguistic variations in the way the scripted sentences of a protocol are delivered can have an impact on the efficiency with which call-takers process emergency calls. A better understanding of interactional dynamics between caller and call-taker may translate into improvements of dispatch performance.
9. Meeting abstracts from the first European Emergency Medical Services congress (EMS2016). Scand J Trauma Resusc Emerg Med 2017. PMCID: PMC5356044; DOI: 10.1186/s13049-017-0358-0.
10. A systematic review and meta-analysis of the association between arterial carbon dioxide tension and outcomes after cardiac arrest. Resuscitation 2016;111:116-126. PMID: 27697606; DOI: 10.1016/j.resuscitation.2016.09.019.
Abstract
INTRODUCTION Arterial carbon dioxide tension (PaCO2) abnormalities are common after cardiac arrest (CA). Maintaining a normal PaCO2 makes physiological sense and is recommended as a therapeutic target after CA, but few studies have examined the association between PaCO2 and patient outcomes. This systematic review and meta-analysis aimed to assess the effect of a low or high PaCO2 on patient outcomes after CA. METHODS We searched MEDLINE, EMBASE, CINAHL and Cochrane CENTRAL for studies that evaluated the association between PaCO2 and outcomes after CA. The primary outcome was hospital survival. Secondary outcomes included neurological status at the end of each study's follow-up period, hospital discharge destination and 30-day survival. Meta-analysis was conducted if statistical heterogeneity was low. RESULTS The systematic review included nine studies; eight provided sufficient quantitative data for meta-analysis. Using PaCO2 cut-points of <35 mmHg and >45 mmHg to define hypo- and hypercarbia, normocarbia was associated with increased hospital survival (odds ratio [OR] 1.30, 95% confidence interval [CI] 1.23, 1.38). Normocarbia was also associated with a good neurological outcome (cerebral performance category score 1 or 2) compared to hypercarbia (OR 1.69, 95% CI 1.13, 2.51) when the analysis also included an additional study with a slightly different definition for normocarbia (PaCO2 30-50 mmHg). CONCLUSIONS From the limited data it appears PaCO2 has an important U-shaped association with survival and outcomes after CA, consistent with international resuscitation guidelines' recommendation that normocarbia be targeted during post-resuscitation care.
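Pooling odds ratios across studies, as this meta-analysis does, is conventionally done by inverse-variance weighting on the log-OR scale, with each study's standard error recovered from its confidence interval. A minimal fixed-effect sketch; the per-study numbers below are hypothetical and for illustration only, not the review's data.

```python
import math

def pool_fixed_effect(or_cis):
    """Fixed-effect inverse-variance pooling of odds ratios.
    Each entry is (OR, lower 95% CI, upper 95% CI); the CI width on the
    log scale gives the standard error of log(OR)."""
    num = den = 0.0
    for or_, lo, hi in or_cis:
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
        w = 1.0 / se**2          # weight = inverse variance
        num += w * math.log(or_)
        den += w
    log_pooled = num / den
    se_pooled = math.sqrt(1.0 / den)
    return (math.exp(log_pooled),
            math.exp(log_pooled - 1.96 * se_pooled),
            math.exp(log_pooled + 1.96 * se_pooled))

# Hypothetical study-level ORs (OR, 95% CI low, 95% CI high):
studies = [(1.25, 1.10, 1.42), (1.40, 1.15, 1.70), (1.20, 0.95, 1.52)]
result = pool_fixed_effect(studies)
print(result)
```

The pooled estimate always lands between the smallest and largest study OR, weighted toward the most precise studies.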
11. Which patients should be transported to the emergency department? A perpetual prehospital dilemma. Emerg Med Australas 2016;28:647-653. PMID: 27592495; DOI: 10.1111/1742-6723.12662.
Abstract
OBJECTIVE To examine the ability of paramedics to identify patients who could be managed in the community and to identify predictors that could be used to accurately identify patients who should be transported to EDs. METHODS Lower acuity patients who were assessed by paramedics in the Perth metropolitan area in 2013 were studied. Paramedics prospectively indicated on the patient care record if they considered that the patient could be treated in the community. The paramedic decisions were compared with actual disposition from the ED (discharge and admission), and the occurrence of subsequent events (ambulance request, ED visit, admission and death) for discharged patients at the scene was investigated. Decision tree analysis was used to identify predictors that were associated with hospital admission. RESULTS In total, 57 183 patients were transported to the ED, and 10 204 patients were discharged at the scene by paramedics. Paramedics identified 2717 patients who could potentially be treated in the community among those who were transported to the ED. Of these, 1455 patients (53.6%) were admitted to hospital. For patients discharged at the scene, those who were indicated as suitable for community care were more likely to experience subsequent events than those who were not. The decision tree found that two predictors (age and aetiology) were associated with hospital admission. Overall discriminative power of the decision tree was poor; the area under the receiver operating characteristic curve was 0.686. CONCLUSION Lower acuity patients who could be treated in the community were not accurately identified by paramedics. This process requires further evaluation.
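The discriminative power quoted above (AUROC 0.686) has a useful probabilistic reading: it is the probability that a randomly chosen admitted patient receives a higher predicted score than a randomly chosen discharged patient, which can be computed directly from the ranks (the Mann-Whitney identity). A small self-contained sketch with toy scores, not the study's data:

```python
def auroc(pos_scores, neg_scores):
    """AUROC via the rank-sum (Mann-Whitney) identity: the fraction of
    (positive, negative) pairs in which the positive scores higher,
    counting ties as half a win."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

# Toy predicted admission probabilities (hypothetical):
admitted = [0.9, 0.7, 0.6, 0.4]
discharged = [0.8, 0.5, 0.3, 0.2]
print(auroc(admitted, discharged))  # 0.75
```

A value of 0.5 is chance-level ranking and 1.0 is perfect separation, which is why 0.686 is described as poor discrimination.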
12. Association between ambulance dispatch priority and patient condition. Emerg Med Australas 2016;28:716-724. PMID: 27592247; DOI: 10.1111/1742-6723.12656.
Abstract
OBJECTIVE To compare chief complaints of the Medical Priority Dispatch System in terms of the match between dispatch priority and patient condition. METHODS This was a retrospective whole-of-population study of emergency ambulance dispatch in Perth, Western Australia, 1 January 2014 to 30 June 2015. Dispatch priority was categorised as either Priority 1 (high priority), or Priority 2 or 3. Patient condition was categorised as time-critical for patient(s) transported as Priority 1 to hospital or who died (and resuscitation was attempted by paramedics); else, patient condition was categorised as less time-critical. The χ2 statistic was used to compare chief complaints by false omission rate (percentage of Priority 2 or 3 dispatches that were time-critical) and positive predictive value (percentage of Priority 1 dispatches that were time-critical). We also reported sensitivity and specificity. RESULTS There were 211 473 cases of dispatch. Of 99 988 cases with Priority 2 or 3 dispatch, 467 (0.5%) were time-critical. Convulsions/seizures and breathing problems were highlighted as having more false negatives (time-critical despite Priority 2 or 3 dispatch) than expected from the overall false omission rate. Of 111 485 cases with Priority 1 dispatch, 6520 (5.8%) were time-critical. Our analysis highlighted chest pain, heart problems/automatic implanted cardiac defibrillator, unknown problem/collapse, and headache as having fewer true positives (time-critical and Priority 1 dispatch) than expected from the overall positive predictive value. CONCLUSION Scope for reducing under-triage and over-triage of ambulance dispatch varies between chief complaints of the Medical Priority Dispatch System. The highlighted chief complaints should be considered for future research into improving ambulance dispatch system performance.
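The false omission rate and positive predictive value quoted above follow directly from the study's aggregate 2×2 dispatch table, and sensitivity and specificity can be derived from the same four cells:

```python
# Aggregate counts from the abstract:
# Priority 1 dispatches: 111 485, of which 6520 were time-critical (TP).
# Priority 2/3 dispatches: 99 988, of which 467 were time-critical (FN).
tp, p1_total = 6520, 111_485
fn, p23_total = 467, 99_988
fp = p1_total - tp    # Priority 1 but not time-critical (over-triage)
tn = p23_total - fn   # Priority 2/3 and not time-critical

ppv = tp / p1_total                    # 5.8% as reported
false_omission_rate = fn / p23_total   # 0.5% as reported
sensitivity = tp / (tp + fn)           # time-critical cases given Priority 1
specificity = tn / (tn + fp)           # less time-critical cases not over-triaged

print(f"PPV {ppv:.1%}, FOR {false_omission_rate:.1%}, "
      f"sensitivity {sensitivity:.1%}, specificity {specificity:.1%}")
```

The derived sensitivity (≈93%) and specificity (≈49%) illustrate the trade-off the conclusion describes: under-triage is rare at the cost of substantial over-triage.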
13. Induction of Therapeutic Hypothermia During Out-of-Hospital Cardiac Arrest Using a Rapid Infusion of Cold Saline: The RINSE Trial (Rapid Infusion of Cold Normal Saline). Circulation 2016;134:797-805. PMID: 27562972; DOI: 10.1161/circulationaha.116.021989.
Abstract
BACKGROUND Patients successfully resuscitated by paramedics from out-of-hospital cardiac arrest often have severe neurologic injury. Laboratory and observational clinical reports have suggested that induction of therapeutic hypothermia during cardiopulmonary resuscitation (CPR) may improve neurologic outcomes. One technique for induction of mild therapeutic hypothermia during CPR is a rapid infusion of large-volume cold crystalloid fluid. METHODS In this multicenter, randomized, controlled trial we assigned adults with out-of-hospital cardiac arrest undergoing CPR to either a rapid intravenous infusion of up to 2 L of cold saline or standard care. The primary outcome measure was survival at hospital discharge; secondary end points included return of a spontaneous circulation. The trial was closed early (at 48% of its recruitment target) due to changes in temperature management at major receiving hospitals. RESULTS A total of 1198 patients were assigned to either therapeutic hypothermia during CPR (618 patients) or standard prehospital care (580 patients). Patients allocated to therapeutic hypothermia received a mean (SD) of 1193 (647) mL cold saline. For patients with an initial shockable cardiac rhythm, there was a decrease in the rate of return of a spontaneous circulation in patients who received cold saline compared with standard care (41.2% compared with 50.6%, P=0.03). Overall, 10.2% of patients allocated to therapeutic hypothermia during CPR were alive at hospital discharge compared with 11.4% who received standard care (P=0.71). CONCLUSIONS In adults with out-of-hospital cardiac arrest, induction of mild therapeutic hypothermia using a rapid infusion of large-volume, intravenous cold saline during CPR may decrease the rate of return of a spontaneous circulation in patients with an initial shockable rhythm and produced no trend toward improved outcomes at hospital discharge. CLINICAL TRIAL REGISTRATION URL: http://www.clinicaltrials.gov. Unique identifier: NCT01173393.
14. A comparison of prognostic significance of strong ion gap (SIG) with other acid-base markers in the critically ill: a cohort study. J Intensive Care 2016;4:43. PMID: 27366324; PMCID: PMC4928272; DOI: 10.1186/s40560-016-0166-z.
Abstract
Background This cohort study compared the prognostic significance of strong ion gap (SIG) with other acid-base markers in the critically ill. Methods The relationships between SIG, lactate, anion gap (AG), albumin-corrected anion gap (AG-corrected), base excess or strong ion difference-effective (SIDe), all obtained within the first hour of intensive care unit (ICU) admission, and the hospital mortality of 6878 patients were analysed. The prognostic significance of each acid-base marker, both alone and in combination with the Admission Mortality Prediction Model (MPM0 III) predicted mortality, was assessed by the area under the receiver operating characteristic curve (AUROC). Results Of the 6878 patients included in the study, 924 patients (13.4 %) died after ICU admission. Except for plasma chloride concentrations, all acid-base markers were significantly different between the survivors and non-survivors. SIG (with lactate: AUROC 0.631, 95 % confidence interval [CI] 0.611–0.652; without lactate: AUROC 0.521, 95 % CI 0.500–0.542) only had a modest ability to predict hospital mortality, and this was no better than using lactate concentration alone (AUROC 0.701, 95 % CI 0.682–0.721). Adding AG-corrected or SIG to a combination of lactate and MPM0 III predicted risks also did not substantially improve the latter's ability to differentiate between survivors and non-survivors. Arterial lactate concentrations explained about 11 % of the variability in the observed mortality, and they were more important than SIG (0.6 %) and SIDe (0.9 %) in predicting hospital mortality after adjusting for MPM0 III predicted risks. Lactate remained the strongest predictor for mortality in a sensitivity multivariate analysis allowing for non-linearity of all acid-base markers. Conclusions The prognostic significance of SIG was modest and inferior to arterial lactate concentration for the critically ill. Lactate concentration should always be considered regardless of whether a physiological, base-excess or physical-chemical approach is used to interpret acid-base disturbances in critically ill patients.
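Of the markers compared here, the albumin-corrected anion gap is the simplest to compute at the bedside. A sketch using the widely used correction of roughly 2.5 mEq/L per 1 g/dL of albumin below normal; the patient values are hypothetical, and this is an illustration of the general formula, not the study's exact method:

```python
def anion_gap(na, cl, hco3):
    """Anion gap in mEq/L (potassium omitted, as is common practice)."""
    return na - (cl + hco3)

def anion_gap_albumin_corrected(na, cl, hco3, albumin_g_dl, normal_albumin=4.0):
    """Albumin-corrected anion gap: add ~2.5 mEq/L for every 1 g/dL of
    albumin below the assumed normal of 4.0 g/dL."""
    return anion_gap(na, cl, hco3) + 2.5 * (normal_albumin - albumin_g_dl)

# Hypothetical ICU patient: Na 140, Cl 105, HCO3 18 mEq/L, albumin 2.0 g/dL.
ag = anion_gap(140, 105, 18)
ag_corr = anion_gap_albumin_corrected(140, 105, 18, 2.0)
print(ag, ag_corr)  # hypoalbuminaemia hides 5 mEq/L of unmeasured anions
```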
15.
Abstract
Primary aldosteronism (PA) is the most frequent endocrine cause of secondary arterial hypertension. Sporadic forms of PA, caused mainly by an aldosterone-producing adenoma (APA) or idiopathic adrenal hyperplasia (IAH), predominate; in contrast, familial forms (familial hyperaldosteronism types I, II and III) affect only a minor proportion of PA patients. Patient-based registries and biobanks, international networks and next-generation sequencing technologies have emerged over recent years. Somatic hot-spot mutations in the potassium channel GIRK4 (encoded by KCNJ5), in ATPases and in an L-type voltage-gated calcium channel correlate with the autonomous aldosterone production in approximately half of all APAs. The recently discovered form FH III is caused by different germline KCNJ5 mutations with variable clinical presentations and severity. Autoantibodies to the angiotensin II type 1 receptor have been identified in patients with PA and possibly play a pathophysiological role in the development of PA. Adrenal vein sampling (AVS) represents the gold standard in differentiating unilateral and bilateral forms of PA. Recent consensus papers have tried to implement current guidelines in order to standardise the technique of AVS. New techniques like segmental AVS might allow a finer mapping of the aldosterone production within the adrenal gland. The measurement of the steroids 18-hydroxycortisol and 18-oxocortisol by liquid chromatography tandem mass spectrometry has been shown to be useful to distinguish between unilateral and bilateral forms of PA.
16. Use of serum lactate levels to predict survival for patients with out-of-hospital cardiac arrest: A cohort study. Emerg Med Australas 2016;28:171-178. PMID: 26929190; DOI: 10.1111/1742-6723.12560.
Abstract
OBJECTIVES We examined the association of serum lactate levels and early lactate clearance with survival to hospital discharge for patients suffering an out-of-hospital cardiac arrest (OHCA). METHODS A retrospective cohort analysis was performed of patients with OHCA transported by ambulance to two adult tertiary hospitals in Perth, Western Australia. Exclusion criteria were traumatic cardiac arrest, return of spontaneous circulation prior to the arrival of the ambulance, age less than 18 years and no serum lactate levels recorded. Serum lactate levels recorded for up to 48 h post-arrest were obtained from the hospital clinical information system, and lactate clearance over 48 h was calculated. Descriptive and logistic regression analyses were conducted. RESULTS There were 518 patients with lactate values, of whom 126 (24.3%) survived to hospital discharge. Survivors and non-survivors had different mean initial lactate levels (mean ± SD 6.9 ± 4.7 and 12.2 ± 5.5 mmol/L, respectively; P < 0.001). Lactate clearance was higher in survivors. Lactate levels for non-survivors did not decrease below 2 mmol/L until at least 30 h after the ambulance call. CONCLUSION In OHCA patients who had serum lactate levels measured, both lower initial serum lactate and early lactate clearance in the first 48 h following OHCA were associated with increased likelihood of survival. However, the use of lactate in isolation as a predictor of survival or neurological outcome is not recommended. Prospective studies that minimise selection bias are required to determine the clinical utility of serum lactate levels in OHCA patients.
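Lactate clearance, as used in studies like this one, is conventionally the percentage fall from the initial value between two measurements. A minimal sketch; the follow-up values below are hypothetical, chosen only to illustrate the cohort's mean initial lactates for survivors (6.9 mmol/L) and non-survivors (12.2 mmol/L):

```python
def lactate_clearance(initial, later):
    """Percentage clearance between two serum lactate values (mmol/L)."""
    return (initial - later) / initial * 100

# Hypothetical follow-up values applied to the cohort's mean initial lactates:
survivor_clearance = round(lactate_clearance(6.9, 2.0), 1)      # fell to 2.0
non_survivor_clearance = round(lactate_clearance(12.2, 8.0), 1)  # fell to 8.0
print(survivor_clearance, non_survivor_clearance)
```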
Collapse
|
17
|
The ability of early warning scores (EWS) to detect critical illness in the prehospital setting: A systematic review. Resuscitation 2016; 102:35-43. [PMID: 26905389 DOI: 10.1016/j.resuscitation.2016.02.011] [Citation(s) in RCA: 56] [Impact Index Per Article: 7.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/05/2015] [Revised: 01/22/2016] [Accepted: 02/09/2016] [Indexed: 01/01/2023]
Abstract
AIM To examine whether early warning scores (EWS) can accurately predict critical illness in the prehospital setting and affect patient outcomes. METHODS We searched bibliographic databases for comparative studies that examined prehospital EWS for patients transported by ambulance in the prehospital setting. The ability of the different EWS, including pre-alert protocols and physiological-based EWS, to predict critical illness (sensitivity, odds ratio [OR], area under receiver operating characteristic [AUROC] curves) and hospital mortality was summarised. Study quality was assessed using the Newcastle-Ottawa Scale. RESULTS Eight studies were identified. Two studies compared the use of EWS to standard practice using clinical judgement alone to identify critical illness: the pooled diagnostic OR and summary AUROC for EWS were 10.9 (95%CI 4.2-27.9) and 0.78 (95%CI 0.74-0.82), respectively. A study of 144,913 patients reported age and physiological variables predictive of critical illness: the AUROC in the independent validation sample was 0.77 (95% CI 0.76-0.78). High-risk patients stratified by the national early warning score (NEWS) had a significantly higher risk of both mortality and intensive care admission. Data comparing different EWS were limited; the Prehospital Early Sepsis Detection (PRESEP) score predicted the occurrence of sepsis better than the Modified EWS (AUROC 0.93 versus 0.77, respectively). CONCLUSION EWS in the prehospital setting appeared useful in predicting clinically important outcomes, but the significant heterogeneity between different EWS suggests that these promising findings may not be generalisable. Adequately powered prospective studies are needed to identify the EWS best suited to the prehospital setting.
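The diagnostic odds ratio quoted above summarises a 2×2 screening table. As a reminder of how such a figure is derived from counts, a minimal sketch (the cell counts below are hypothetical, not taken from the review):

```python
def diagnostic_odds_ratio(tp: int, fp: int, fn: int, tn: int) -> float:
    """Diagnostic odds ratio for a 2x2 table:
    (TP/FN) / (FP/TN) = (TP * TN) / (FP * FN).
    Higher values mean the test better separates cases from non-cases.
    """
    if fp == 0 or fn == 0:
        raise ValueError("apply a continuity correction when a cell is zero")
    return (tp * tn) / (fp * fn)


# Hypothetical counts for an EWS flagging critical illness:
print(diagnostic_odds_ratio(tp=80, fp=40, fn=20, tn=160))  # 16.0
```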
Collapse
|
18
|
|
19
|
Challenges during long-term follow-up of ICU patients with and without chronic disease. Aust Crit Care 2016; 29:27-34. [DOI: 10.1016/j.aucc.2015.04.002] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/14/2014] [Revised: 04/08/2015] [Accepted: 04/15/2015] [Indexed: 11/30/2022] Open
|
20
|
Abstract
Somatic mutations have been identified in the KCNJ5 gene (encoding the potassium channel GIRK4) in aldosterone-producing adenomas (APA). Most of these mutations are located in or near the selectivity filter of the GIRK4 channel pore, and several have been shown to lead to the constitutive overproduction of aldosterone. KCNJ5 mutations in APA are more frequent in women; however, this gender dimorphism has been reported in Western but not East Asian populations. In this review we discuss some of the issues that could potentially underlie this observation.
Collapse
|
21
|
Abstract
Primary aldosteronism encompasses 2 major underlying causes: (1) aldosterone producing adenoma and (2) bilateral adrenal hyperplasia. In addition to the aldosterone excess, increased production of other compounds of the steroidogenic pathways may be involved. Until recently, most studies examined the production of steroids other than aldosterone in tumor tissue, urine, or peripheral plasma samples, but several new studies have also addressed steroid levels in adrenal venous blood samples using liquid chromatography tandem mass spectrometry. Plasma and tissue levels of several precursors of aldosterone with mineralocorticoid activity are higher in patients with aldosterone producing adenomas than in those with bilateral hyperplasia. These include corticosterone, deoxycorticosterone, and their 18-hydroxylated metabolites. Similarly, urinary, peripheral, and adrenal venous concentrations of the hybrid steroids 18-oxocortisol and 18-hydroxycortisol are higher in patients with aldosterone producing adenomas than in bilateral hyperplasia. Differences in the pathophysiology and in clinical and biochemical phenotypes caused by aldosterone producing adenomas and bilateral adrenal hyperplasia may be related to the differential expression of steroidogenic enzymes and associated with specific underlying somatic mutations. Correct appreciation of differences in steroid profiling between aldosterone producing adenomas and bilateral adrenal hyperplasia may not only contribute to a better understanding of the pathogenesis of primary aldosteronism but may also be helpful for future subtyping of primary aldosteronism.
Collapse
|
22
|
Abstract
Identification and management of patients with primary aldosteronism are of utmost importance because it is a frequent cause of endocrine hypertension, and affected patients display an increase in cardio- and cerebro-vascular events compared to patients with essential hypertension. Distinguishing primary aldosteronism subtypes is of particular relevance to allocate patients to the appropriate treatment: mineralocorticoid receptor antagonists for bilateral forms and unilateral adrenalectomy for patients with unilateral aldosterone secretion. Subtype differentiation of confirmed hyperaldosteronism comprises adrenal CT scanning and adrenal venous sampling. In this review, we discuss clinical scenarios in which the execution and interpretation of adrenal venous sampling and subsequent patient management might be challenging, providing the clinician with useful information to help interpret controversial procedures.
Collapse
|
23
|
Trends in traumatic out-of-hospital cardiac arrest in Perth, Western Australia from 1997 to 2014. Resuscitation 2015; 98:79-84. [PMID: 26620392 DOI: 10.1016/j.resuscitation.2015.10.015] [Citation(s) in RCA: 37] [Impact Index Per Article: 4.1] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/26/2015] [Revised: 09/25/2015] [Accepted: 10/25/2015] [Indexed: 11/19/2022]
Abstract
AIM This study aims to describe and compare traumatic and medical out-of-hospital cardiac arrest (OHCA) occurring in Perth, Western Australia, between 1997 and 2014. METHODS The St John Ambulance Western Australia (SJA-WA) OHCA Database was used to identify all adult (≥ 16 years) cases. We calculated annual crude and age-sex standardised incidence rates (ASIRs) for traumatic and medical OHCA and investigated trends over time. RESULTS Over the study period, SJA-WA attended 1,354 traumatic OHCA and 16,076 medical OHCA cases. The mean annual crude incidence rate of traumatic OHCA in adults attended by SJA-WA was 6.0 per 100,000 (73.9 per 100,000 for medical cases), with the majority resulting from motor vehicle collisions (56.7%). Neither the incidence nor the mechanism of injury changed over the study period (p>0.05). Compared to medical OHCA, traumatic OHCA cases were less likely to receive bystander cardiopulmonary resuscitation (CPR) (20.4% vs. 24.5%, p=0.001) or have resuscitation commenced by paramedics (38.9% vs. 44.8%, p<0.001). However, rates of bystander CPR and resuscitation commenced by paramedics increased significantly over time in traumatic OHCA (p<0.001). In cases where resuscitation was commenced by paramedics, there was no difference in the proportion who died at the scene (37.2% traumatic vs. 34.3% medical, p=0.17); however, fewer traumatic OHCAs survived to hospital discharge (1.7% vs. 8.7%, p<0.001). CONCLUSIONS Despite temporal increases in rates of bystander CPR and paramedic resuscitation, traumatic OHCA survival remains poor, with only nine patients surviving traumatic OHCA over the 18-year period.
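The crude rates above are events per 100,000 population per year; the age-sex standardised rates (ASIRs) additionally weight stratum-specific rates by a reference population. A minimal direct-standardisation sketch (the two strata and all counts below are made up for illustration; the study's strata and reference population are not given in the abstract):

```python
def direct_standardised_rate(stratum_events, stratum_pop, reference_pop):
    """Directly standardised rate per 100,000: each stratum's crude
    rate is weighted by the reference population's share of that
    stratum, removing differences in population structure."""
    total_ref = sum(reference_pop)
    rate = 0.0
    for events, pop, ref in zip(stratum_events, stratum_pop, reference_pop):
        rate += (events / pop) * (ref / total_ref)
    return rate * 100_000


# Hypothetical two-stratum example (e.g. <65 and >=65 years):
asr = direct_standardised_rate(
    stratum_events=[30, 60],
    stratum_pop=[900_000, 100_000],
    reference_pop=[800_000, 200_000],
)
print(round(asr, 1))  # 14.7 (per 100,000)
```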
Collapse
|
24
|
ARMC5 mutation analysis in patients with primary aldosteronism and bilateral adrenal lesions. J Hum Hypertens 2015; 30:374-8. [PMID: 26446392 DOI: 10.1038/jhh.2015.98] [Citation(s) in RCA: 33] [Impact Index Per Article: 3.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/23/2015] [Revised: 08/11/2015] [Accepted: 08/17/2015] [Indexed: 11/09/2022]
Abstract
Idiopathic hyperaldosteronism (IHA) due to bilateral adrenal hyperplasia is the most common subtype of primary aldosteronism (PA). The pathogenesis of IHA is still unknown, but the bilateral disease suggests a potential predisposing genetic alteration. Heterozygous germline mutations of armadillo repeat containing 5 (ARMC5) have been shown to be associated with hypercortisolism due to sporadic primary bilateral macronodular adrenal hyperplasia and are also observed in African-American PA patients. We investigated the presence of germline ARMC5 mutations in a group of PA patients who had bilateral computed tomography-detectable adrenal alterations. We sequenced the entire coding region of ARMC5 and all intron/exon boundaries in 39 patients (37 Caucasians and 2 black Africans) with confirmed PA (8 unilateral, 27 bilateral and 4 undetermined subtype) and bilateral adrenal lesions. We identified 11 common variants, 5 rare variants with a minor allele frequency <1% and 2 new variants not previously reported in public databases. We did not detect by in silico analysis any ARMC5 sequence variations that were predicted to alter protein function. In conclusion, ARMC5 mutations are not present in a fairly large series of Caucasian patients with PA associated with bilateral adrenal disease. Further studies are required to definitively clarify the role of ARMC5 in the pathogenesis of adrenal nodules and aldosterone excess in patients with PA.
Collapse
|
25
|
Abstract
The renin-angiotensin-aldosterone system (RAAS) is recognized as the main regulatory system of hemodynamics in humans, and its derangements have a key role in the development and maintenance of arterial hypertension. Classification of the hypertensive states according to different patterns of renin and aldosterone levels ("RAAS profiling") allows the diagnosis of specific forms of secondary hypertension and may identify distinct hemodynamic subsets in essential hypertension. In this review, we summarize the application of RAAS profiling for the diagnostic assessment of hypertensive patients and discuss how the pathophysiological framework provided by RAAS profiling may guide therapeutic decision-making, especially in the context of uncontrolled hypertension that does not respond to multiple-drug therapy.
Collapse
|
26
|
The effect of presenting symptoms and patient characteristics on prehospital delay in MI patients presenting to emergency department by ambulance: a cohort study. Heart Lung Circ 2015; 24:943-50. [PMID: 25922230 DOI: 10.1016/j.hlc.2015.02.026] [Citation(s) in RCA: 9] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/12/2014] [Revised: 02/09/2015] [Accepted: 02/22/2015] [Indexed: 10/23/2022]
Abstract
INTRODUCTION There is little recent information about prehospital delay time for Australian patients with myocardial infarction (MI). OBJECTIVES This study: (1) describes prehospital delay time for patients with MI; and (2) identifies variables and presenting symptoms which contribute to the delay. METHODS This retrospective cohort study identified patients with an Emergency Department (ED) discharge diagnosis of MI, transported by ambulance to one of the seven Perth metropolitan EDs, between January 2008 and October 2009. Prehospital delay times were analysed using linear regression models. Non-numeric onset-times (word descriptions) were categorised. RESULTS Of 1,633 patients, symptom onset-time was available for 1,003. For 829 patients with a numeric onset-time, median delay was 2.2 hours; decreased delay was associated with age <70 years, presenting with chest pain, and diaphoresis. Increased delay was associated with being with a primary health care provider, being at home, and having the ambulance called by someone other than the patient's spouse. Of 174 patients with non-numeric onset-times, 37% delayed one to three days and 110 (64.0%) described their symptoms as intermittent and/or of gradual onset. CONCLUSION Given that prehospital delay times remain longer than is optimal, public awareness of MI symptoms should be enhanced in order to decrease prehospital delay.
Collapse
|
27
|
Prolactinoma and primary aldosteronism: is there a causal link? Exp Clin Endocrinol Diabetes 2015. [DOI: 10.1055/s-0035-1549073] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Indexed: 10/23/2022]
|
28
|
Abstract
INTRODUCTION Separate clinical practice guidelines (CPG) for asthma and chronic obstructive pulmonary disease (COPD) often guide prehospital care. However, having distinct CPGs implies that paramedics can accurately differentiate these conditions. We compared the accuracy of paramedic identification of these two conditions against the emergency department (ED) discharge diagnosis. METHODS A retrospective cohort of all patients transported to an ED by ambulance in Perth, Western Australia between July 2012 and June 2013 and identified as "asthma" or "COPD" by paramedics. We linked ambulance data to the ED discharge diagnosis. RESULTS Of 1,067 patients identified by paramedics as having asthma, 41% had an ED discharge diagnosis of asthma, i.e., a positive predictive value (PPV) of 41% (95% CI 38-44%). Of 1,048 patients recorded as COPD, 57% had an ED discharge diagnosis of COPD (PPV 57%; 95% CI 54-60%). Sensitivity of paramedic identification of patients diagnosed with asthma or COPD in the ED was 66% for asthma (95% CI 63-70%) and 39% for COPD (95% CI 36-41%). Paramedics reported wheezing in 86% of asthma and 55% of COPD patients. CONCLUSION Differentiating between asthma and COPD in the prehospital setting is difficult. A single CPG for respiratory distress would be more useful for the clinical management of these patients by paramedics.
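The PPV and sensitivity figures above come from cross-tabulating the paramedic code against the ED diagnosis. A minimal sketch of the two quantities; the cell counts below are hypothetical, chosen only to reproduce the reported 41% PPV and 66% sensitivity for asthma (the study's exact 2×2 table is not given in the abstract):

```python
def ppv_and_sensitivity(tp: int, fp: int, fn: int):
    """PPV = TP/(TP+FP): of those the paramedic flagged, the fraction
    the ED confirmed. Sensitivity = TP/(TP+FN): of those the ED
    diagnosed, the fraction the paramedic caught."""
    return tp / (tp + fp), tp / (tp + fn)


# Hypothetical counts consistent with the reported asthma percentages:
ppv, sens = ppv_and_sensitivity(tp=410, fp=590, fn=210)
print(round(ppv, 2), round(sens, 2))  # 0.41 0.66
```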
Collapse
|
29
|
Prehospital factors associated with an ICU admission from the emergency department. Crit Care 2015. [PMCID: PMC4471043 DOI: 10.1186/cc14485] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/28/2022] Open
|
30
|
Symptoms of Myocardial Infarction: Concordance between Paramedic and Hospital Records. PREHOSP EMERG CARE 2014; 18:393-401. [DOI: 10.3109/10903127.2014.891064] [Citation(s) in RCA: 9] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/13/2022]
|
31
|
The impact of new prehospital practitioners on ambulance transportation to the emergency department: a systematic review and meta-analysis. Emerg Med J 2013; 31:e88-94. [PMID: 24243486 DOI: 10.1136/emermed-2013-202976] [Citation(s) in RCA: 21] [Impact Index Per Article: 1.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/04/2022]
Abstract
OBJECTIVE To conduct a systematic review and meta-analysis to examine the impact of new prehospital practitioners (NPPs), including emergency care practitioners (EmCPs), paramedic practitioners and extended care paramedics (ECPs), on ambulance transportation to the emergency department (ED). METHODS We searched MEDLINE, Embase, CINAHL and AUSTHealth databases, and hand searched emergency medicine journals and journal reference lists for relevant papers. To be included, studies were required to target one type of NPP and compare outcomes such as the frequencies of conveyance to the ED, discharge at scene, subsequent ED attendance and/or appropriateness of care between NPPs and conventional ambulance crews. Three investigators independently selected relevant studies. The risk of bias in individual studies was assessed using a validated checklist. We conducted meta-analyses for comparisons which had acceptable heterogeneity (I²<75%) and reported pooled estimates of ORs with 95% CIs. RESULTS 13 studies were identified from 16,584 citation reports. EmCPs were most frequently studied. The majority of studies (77%) did not fully report important potential confounders. NPPs were less likely to convey patients to the ED and more likely to discharge patients at the scene than conventional ambulance crews. Pooled ORs for conveyance to the ED and discharge at the scene by ECPs were 0.09 (95% CI 0.04 to 0.18) and 10.5 (95% CI 5.8 to 19), respectively. The evidence for subsequent ED attendance and appropriateness of care was equivocal. CONCLUSIONS The NPP schemes reduced transport to the ED; however, the appropriateness of the decision of the NPPs and the safety of patients were not well supported by the reported studies.
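The pooled ORs above come from combining study-level estimates. One common approach is inverse-variance weighting on the log-odds scale; a minimal fixed-effect sketch, assuming each study reports an OR with a 95% CI (the two "studies" below are hypothetical, and the review itself may have used a different pooling model):

```python
import math


def pooled_or_fixed_effect(ors, cis):
    """Fixed-effect inverse-variance pooling on the log-OR scale.
    `cis` holds (lower, upper) 95% CI bounds; the SE of log(OR) is
    recovered as (log(upper) - log(lower)) / (2 * 1.96).
    Returns (pooled OR, lower 95% bound, upper 95% bound)."""
    log_ors = [math.log(o) for o in ors]
    ses = [(math.log(hi) - math.log(lo)) / (2 * 1.96) for lo, hi in cis]
    weights = [1 / se**2 for se in ses]
    pooled_log = sum(w * lg for w, lg in zip(weights, log_ors)) / sum(weights)
    pooled_se = math.sqrt(1 / sum(weights))
    return (math.exp(pooled_log),
            math.exp(pooled_log - 1.96 * pooled_se),
            math.exp(pooled_log + 1.96 * pooled_se))


# Two hypothetical studies of conveyance to the ED:
print(pooled_or_fixed_effect([0.08, 0.12], [(0.03, 0.21), (0.05, 0.29)]))
# pooled OR ≈ 0.10, 95% CI ≈ (0.05, 0.19)
```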
Collapse
|
32
|
A systematic review of air pollution and incidence of out-of-hospital cardiac arrest. J Epidemiol Community Health 2013; 68:37-43. [DOI: 10.1136/jech-2013-203116] [Citation(s) in RCA: 44] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/04/2022]
|
33
|
Frequency of Aspirating Gastric Tubes for Patients Receiving Enteral Nutrition in the ICU. JPEN J Parenter Enteral Nutr 2013; 38:809-16. [DOI: 10.1177/0148607113497223] [Citation(s) in RCA: 13] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/15/2023]
|
34
|
Accuracy of the ‘mode of transportation’ variable in the Emergency Department Information System data. Emerg Med Australas 2013; 25:382-3. [DOI: 10.1111/1742-6723.12106] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/30/2022]
|
35
|
Evidence-based paramedic models of care to reduce unnecessary emergency department attendance--feasibility and safety. BMC Emerg Med 2013; 13:13. [PMID: 23855265 PMCID: PMC3724748 DOI: 10.1186/1471-227x-13-13] [Citation(s) in RCA: 16] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/02/2013] [Accepted: 07/03/2013] [Indexed: 11/10/2022] Open
Abstract
BACKGROUND As demand for Emergency Department (ED) services continues to exceed increases explained by population growth, strategies to reduce ED presentations are being explored. The concept of ambulance paramedics providing an alternative model of care to the current default 'see and transport to ED' has intuitive appeal and has been implemented in several locations around the world. The premise is that for certain non-critically ill patients, the Extended Care Paramedic (ECP) can either 'see and treat' or 'see and refer' to another primary or community care practitioner, rather than transport to hospital. However, there has been little rigorous investigation of which types of patients can be safely identified and managed in the community, or the impact of ECPs on ED attendance. METHODS/DESIGN St John Ambulance Western Australia paramedics will indicate on the electronic patient care record (e-PCR) of patients attended in the Perth metropolitan area if they consider them to be suitable to be managed in the community. 'Follow-up' will examine these patients using ED data to determine the patient's disposition from the ED. A clinical panel will then develop a protocol to identify those patients who can be safely managed in the community. Paramedics will then assess patients against the derived ECP protocols and identify those deemed suitable to 'see and treat' or 'see and refer'. The ED disposition (and other clinical outcomes) of these 'ECP protocol identified' patients will enable us to assess whether it would have been appropriate to manage these patients in the community. We will also 'track' re-presentations to EDs within seven days of the initial presentation. This is a 'virtual experiment' with no direct involvement of patients or changes in clinical practice. A systems modelling approach will be used to assess the likely impact on ED crowding. DISCUSSION To date the efficacy, cost-effectiveness and safety of alternative community-based models of emergency care have not been rigorously investigated. This study will inform the development of ECP protocols through the identification of types of patient presentation that can be considered both safe and appropriate for paramedics to manage in the community.
Collapse
|
36
|
Paramedic identification of acute pulmonary edema in a metropolitan ambulance service. PREHOSP EMERG CARE 2013; 17:339-47. [PMID: 23484502 DOI: 10.3109/10903127.2013.773114] [Citation(s) in RCA: 15] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/13/2022]
Abstract
INTRODUCTION Acute pulmonary edema (APE) is a common cause of acute dyspnea. In the prehospital setting, it is often difficult to differentiate APE from other causes of shortness of breath (SOB). Radiography and echocardiography aid in the identification of APE but are often not available. There is little information on how accurately ambulance paramedics identify patients with APE. OBJECTIVES This study aimed to 1) describe the prehospital clinical presentation and management of patients with a clinical diagnosis of APE and 2) compare the accuracy of coding of APE by paramedics against the emergency department (ED) medical discharge diagnosis. METHODS This study included a retrospective cohort of all patients who had episodes identified as APE by ambulance paramedics and were transported to a metropolitan hospital ED in 2011. Two databases were used: an ambulance database and the Emergency Department Information System. The ED medical discharge diagnosis (using International Statistical Classification of Diseases and Related Problems, 10th Revision, Australian Modification [ICD-10-AM] codes) was used as the comparator with paramedic-assigned problem codes for APE. The outcomes for the study were the positive predictive value, i.e., the proportion of patients identified as having APE in the ambulance database who also had an ED discharge diagnosis of APE, and the sensitivity of paramedic identification of APE, i.e., the proportion of patients with an ED discharge diagnosis of APE that were correctly identified as APE by the ambulance paramedics. RESULTS Four hundred ninety-five patients were transported to an ED with APE identified by the paramedics as the primary problem code. Shortness of breath, crepitations, high systolic blood pressure, and chest pain were the most common presenting signs and symptoms. Pink frothy sputum was rare (3% of patient episodes of APE). One hundred eighty-six patients received an ED discharge diagnosis of APE, i.e., a positive predictive value of 41%. Of 631 ED presentations with APE, paramedics identified 186, i.e., a sensitivity of 29%. CONCLUSION Acute pulmonary edema is difficult to identify in the prehospital setting because of the variability in the signs and symptoms associated with this condition. Improved identification of APE is essential in the initiation of appropriate and timely care. Ambulance paramedics need to be aware of such variability when considering patients who may be suffering from APE. Key words: pulmonary edema; acute pulmonary edema; emergency medical services; ambulance; paramedics.
Collapse
|
37
|
Prehospital continuous positive airway pressure for acute respiratory failure: a systematic review and meta-analysis. PREHOSP EMERG CARE 2013; 17:261-73. [PMID: 23373591 DOI: 10.3109/10903127.2012.749967] [Citation(s) in RCA: 34] [Impact Index Per Article: 3.1] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/13/2022]
Abstract
INTRODUCTION Acute respiratory failure (ARF) is a common problem encountered by emergency medical services and is associated with significant morbidity, mortality, and health care costs. Continuous positive airway pressure (CPAP) is an integral part of the hospital treatment of ARF, predominantly that due to congestive heart failure. Intuitively, better patient outcomes may be achieved when CPAP is applied early in the prehospital setting, but there are few outcome studies to validate its use in this setting. OBJECTIVE This systematic review and meta-analysis aimed to examine the effectiveness of CPAP in the prehospital setting for patients with ARF. METHODS A literature review of bibliographic databases and secondary sources was conducted and potential papers were assessed by two independent reviewers. Included studies were those that compared CPAP therapy (and usual care) with no CPAP for ARF in the prehospital setting. Studies of other methods of noninvasive ventilation were not included. Methodologic quality was assessed using guidelines from the Cochrane Collaboration. Outcomes included the number of intubations, mortality, physiologic parameters, and dyspnea score. Forest plots were constructed to estimate the pooled effect of CPAP on outcomes. RESULTS Five studies (1,002 patients) met the selection criteria: three randomized controlled trials (RCTs), a nonrandomized comparative study, and a retrospective comparative study using chart review. Forty-seven percent of the patients were allocated to the CPAP group. Baseline characteristics were similar between groups. The pooled estimates demonstrated significantly fewer intubations (odds ratio [OR] 0.31; 95% confidence interval [CI] 0.19-0.51) and lower mortality (OR 0.41; 95% CI 0.19-0.87) in the CPAP group. CONCLUSION The studies included in this review showed a reduction in the number of intubations and mortality in patients with ARF who received CPAP in the prehospital setting. The results may not be applicable to other health care contexts because of inherent differences in the organization and staffing of EMS systems. Information from large RCTs on the efficacy of CPAP initiated early in the prehospital setting is critical to establishing the evidence base underpinning this therapy before ambulance services incorporate CPAP as routine clinical practice.
Collapse
|
38
|
Reducing interruptions to continuous enteral nutrition in the intensive care unit: a comparative study. J Clin Nurs 2013; 22:2838-48. [DOI: 10.1111/jocn.12068] [Citation(s) in RCA: 14] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 09/06/2012] [Indexed: 12/26/2022]
|
39
|
Is the evidence for the use of subglottic drainage to prevent ventilator-associated pneumonia sufficient to change practice? Aust Crit Care 2012; 25:200-4. [PMID: 22484207 DOI: 10.1016/j.aucc.2012.03.001] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/09/2011] [Accepted: 03/06/2012] [Indexed: 12/29/2022] Open
Abstract
This paper critiques a systematic review and meta-analysis of the effect of subglottic secretion drainage among patients receiving mechanical ventilation. Subglottic secretion drainage can prevent bacterial pathogens from entering the lower respiratory tract and thereby reduce the occurrence of ventilator-associated pneumonia. A summary of the systematic review and meta-analysis is provided, the critique examines the study's strengths and weaknesses, and implications for practice are discussed. It is a well-conducted systematic review and meta-analysis with few suggestions for improvement. Subglottic secretion drainage reduced the incidence of ventilator-associated pneumonia. Several studies have shown positive effects of subglottic drainage, but despite this evidence the practice is not yet widespread in ICUs.
Collapse
|
40
|
Abstract
The toxic effects of aldosterone on the vasculature, and in particular on the endothelial layer, have been proposed as having an important role in the cardiovascular pathology observed in mineralocorticoid-excess states. In order to characterize the genomic molecular mechanisms driving the aldosterone-induced endothelial dysfunction, we performed an expression microarray on transcripts obtained from both human umbilical vein endothelial cells and human coronary artery endothelial cells stimulated with 10⁻⁷ M aldosterone for 18 h. The results were then subjected to qRT-PCR confirmation, also including a group of genes known to be involved in the control of the endothelial function or previously described as regulated by aldosterone. The state of activation of the mineralocorticoid receptor was investigated by means of a luciferase-reporter assay using a plasmid encoding a mineralocorticoid and glucocorticoid-sensitive promoter. Aldosterone did not determine any significant change in gene expression in either cell type both in the microarray and in the qRT-PCR analysis. The luciferase-reporter assay showed no activation of the mineralocorticoid receptor following aldosterone stimulation. The status of nonfunctionality of the mineralocorticoid receptor expressed in cultured human umbilical and coronary artery endothelial cells does not allow aldosterone to modify gene expression and provides evidence against either a beneficial or harmful genomic effect of aldosterone on healthy endothelial cells.
Collapse
|
41
|
Outcome of ranibizumab treatment in neovascular age related macula degeneration in eyes with baseline visual acuity better than 6/12. Eye (Lond) 2011; 25:1617-21. [PMID: 21921947 DOI: 10.1038/eye.2011.224] [Citation(s) in RCA: 40] [Impact Index Per Article: 3.1] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/09/2022] Open
Abstract
BACKGROUND The beneficial effect of intravitreal ranibizumab in the treatment of neovascular age-related macular degeneration (nAMD) is well known. Outcome data for eyes presenting with visual acuity better than 6/12 are limited. AIMS To assess the effect of baseline vision on outcome in ranibizumab-treated nAMD eyes, including a subgroup with baseline vision ≥6/12 (<0.30 logMAR). DESIGN Prospective, consecutive and interventional case series. METHODS A consecutive cohort of patients treated with intravitreal ranibizumab for nAMD with 52-week follow-up was studied. Patients who had received previous treatment for nAMD were excluded. Eyes were stratified according to baseline logMAR visual acuity into four groups: <0.30 (better than 6/12), 0.30-0.59 (6/12-6/24), 0.60-0.99 (6/24-6/60) and 1.00-1.20 (6/60-6/96). Intravitreal ranibizumab (0.5 mg in 0.05 ml) was administered in three monthly loading doses followed by PRN dosing according to optical coherence tomography (OCT) findings. RESULTS A total of 615 eyes were studied, including 88 eyes with baseline vision <0.30. The mean change in logMAR letters at 52 weeks was +5.5 (entire study group), -0.5 (<0.30 subgroup), +2.2 (0.30-0.59 subgroup), +6.5 (0.60-0.99 subgroup) and +15.3 (1.00-1.20 subgroup). In the <0.30 subgroup, 60 of 88 eyes (68%) had best-corrected visual acuity (BCVA) equal to or better than baseline and 82 of 88 eyes (93%) lost <15 letters at 52 weeks. Within this subgroup, 56 of 67 eyes (84%) maintained BCVA at the UK driving standard over the study period. CONCLUSIONS This study provides evidence that intravitreal ranibizumab treatment stabilises good vision in nAMD presenting with vision better than 6/12 over 52 weeks of follow-up.
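The stratification above uses logMAR thresholds: for a Snellen fraction such as 6/x, logMAR = log10(x/6), so 6/12 corresponds to log10(2) ≈ 0.30, which is why the <0.30 group is the "better than 6/12" subgroup. A small sketch of the conversion:

```python
import math


def snellen_to_logmar(numerator: float, denominator: float) -> float:
    """logMAR = log10(denominator / numerator) for a Snellen fraction
    such as 6/12 (metres) or 20/40 (feet). Lower logMAR = better vision."""
    return math.log10(denominator / numerator)


for snellen in [(6, 6), (6, 12), (6, 24), (6, 60)]:
    print(snellen, round(snellen_to_logmar(*snellen), 2))
# (6, 6) 0.0; (6, 12) 0.3; (6, 24) 0.6; (6, 60) 1.0
```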
Collapse
|
42
|
Decrease in proven ventriculitis by reducing the frequency of cerebrospinal fluid sampling from extraventricular drains. J Neurosurg 2011; 115:1040-6. [PMID: 21800964 DOI: 10.3171/2011.6.jns11167] [Citation(s) in RCA: 37] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/06/2022]
Abstract
OBJECT Ventriculitis associated with extraventricular drains (EVDs) increases rates of morbidity and mortality as well as costs. Surveillance samples of CSF are taken routinely from EVDs, but there is no consensus on the optimum frequency of sampling. The goal of this study was to assess whether the incidence of ventriculitis changed when CSF sampling frequency was reduced to once every 3 days. METHODS After receiving institutional ethics committee approval for their project, the authors compared a prospective sample of EVD-treated patients (admitted 2008-2009) and a historical comparison group (admitted 2005-2007) at two tertiary hospital ICUs. A broad definition of ventriculitis included suspected ventriculitis (that is, treated with antibiotics for ventriculitis) and proven ventriculitis (positive CSF culture). Adult ICU patients with no preexisting neurological infection were enrolled in the study. After staff were provided with an education package, sampling of CSF was changed from daily to once every 3 days. All other management of the EVD remained unchanged. More frequent sampling was permitted if clinically indicated during the every-3rd-day sampling phase. RESULTS Two hundred seven patients were recruited during the daily sampling phase and 176 patients when sampling was reduced to once every 3 days. The Acute Physiology and Chronic Health Evaluation (APACHE) II score was lower for the daily sampling group than for the every-3rd-day group (18.6 vs 20.3, respectively; p < 0.01), but there was no difference in mean age (47 and 45 years, respectively; p = 0.14), male or female sex (61% and 59%, respectively; p = 0.68), or median EVD duration in the ICU (4.9 and 5.8 days, respectively; p = 0.14). Most patients were admitted with subarachnoid hemorrhage (42% in the daily group and 33% in the every-3rd-day group) or traumatic head injuries (29% and 36%, respectively).
Once sampling frequency was reduced, the incidence of ventriculitis decreased from 17% to 11% overall, and from 10% to 3% for proven ventriculitis. Sampling of CSF once every 3 days was independently associated with a reduced risk of ventriculitis (OR 0.44, 95% CI 0.22-0.88, p = 0.02). CONCLUSIONS Reducing the frequency of CSF sampling to once every 3 days was associated with a significant decrease in the incidence of ventriculitis. The authors suggest that CSF sampling should therefore be performed once every 3 days in the absence of clinical indicators of ventriculitis.
Collapse
|
43
|
Challenges and possible solutions for long-term follow-up of patients surviving critical illness. Aust Crit Care 2011; 24:175-85. [PMID: 21514838 DOI: 10.1016/j.aucc.2011.03.001] [Citation(s) in RCA: 21] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/06/2011] [Revised: 03/17/2011] [Accepted: 03/23/2011] [Indexed: 11/18/2022] Open
Abstract
INTRODUCTION Surviving critical illness can be life-changing and present new healthcare challenges for patients after discharge from hospital. Optimisation of recovery, rather than mere survival, is an important goal of intensive care. Observational studies have identified decreased quality of life and increased healthcare needs for survivors, but loss to follow-up can be high, with possible selection bias. Patients in need of support may therefore not be included in study results or allocated appropriate follow-up support. AIM To examine how frequently, and for what reasons, patients admitted to general ICUs who survive critical illness are excluded from study participation or lost to follow-up, and to consider the possible implications and solutions. METHOD The literature review included searches of the MEDLINE, EMBASE, and CINAHL databases. Studies (2006-2010) were included if they described follow-up of survivors from general ICUs. RESULTS Ten studies were reviewed. Of the 3269 eligible patients, 14% died after hospital discharge, 27% declined, and 22% were lost to follow-up. Reasons for loss to follow-up included non-response, inability to contact the patient, the patient being too ill, or admission to another facility. CONCLUSION The most appropriate method of care follow-up has yet to be established but is likely to involve an eclectic model that tailors service provision to support individual patient needs. Identifying methods to minimise loss to follow-up may enhance interpretation of patients' recovery, lead to improvements in clinical practice and inform healthcare service decisions and policy.
Collapse
|
44
|
Early experience with influenza A H1N109 in an Australian intensive care unit. Intensive Crit Care Nurs 2010; 26:207-14. [PMID: 20599382 PMCID: PMC7125814 DOI: 10.1016/j.iccn.2010.05.005] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/13/2010] [Revised: 05/19/2010] [Accepted: 05/19/2010] [Indexed: 11/12/2022]
Abstract
Influenza is a common seasonal viral infection that affects large numbers of people. In early 2009, many people were admitted to hospitals in Mexico with severe respiratory failure following an influenza-like illness, subtyped as H1N1, and an increased mortality rate was observed. By June 2009, H1N1 had been upgraded to pandemic status. In June-July, Australian ICUs were experiencing increased activity due to the influenza pandemic. As hospitals implemented their pandemic plans, the heavy demand for critical care beds to accommodate an influx of patients with severe respiratory failure became evident and placed a great burden on the provision of these services. This paper describes the initial experience (June to mid September) of the pandemic from the nursing perspective in a single Australian ICU. Patients were noted to be younger, with a higher proportion of women, two of whom were pregnant. Two patients had APACHE III comorbidities. Of the 31 patients admitted during this period, three died in ICU and one died in hospital. Aerosol precautions were initiated for all patients. The requirement for single-room accommodation placed enormous demands on bed management in the ICU. Specific infection control procedures were developed to deal with this new pandemic influenza.
Collapse
|
45
|
Should gastric aspirate be discarded or retained when gastric residual volume is removed from gastric tubes? Aust Crit Care 2010; 23:215-7. [PMID: 20558081 DOI: 10.1016/j.aucc.2010.05.001] [Citation(s) in RCA: 12] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Abstract] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/27/2010] [Revised: 05/07/2010] [Accepted: 05/10/2010] [Indexed: 11/25/2022] Open
Abstract
Nursing care of patients with enteral feeding tubes is common in the intensive care unit but the evidence that surrounds the practice is limited. Recent research by Juve-Udina and colleagues (2010) "To return or to discard? Randomised trial on gastric residual volume management" compares two methods of managing gastric residual volume. This critique provides a brief summary of their research and critically appraises the paper. The implications for nursing practice are discussed.
Collapse
|
46
|
Discharge delay, room for improvement? Aust Crit Care 2010; 23:141-9. [PMID: 20347328 DOI: 10.1016/j.aucc.2010.02.003] [Citation(s) in RCA: 13] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/22/2009] [Revised: 02/19/2010] [Accepted: 02/22/2010] [Indexed: 11/16/2022] Open
Abstract
AIM Patients treated in the intensive care unit (ICU) and identified as suitable for discharge to the ward should have their discharge planned and expedited to improve patient outcomes and manage resources efficiently. We examined the hypothesis that the introduction of a critical care outreach role would decrease the frequency of discharge delay from ICU. METHODS Discharge delay was compared for two 6-month periods: (1) after introduction of the outreach role in 2008 and (2) in 2000/2001 (from an earlier study). Patients were included if discharged to a ward in the study hospital. Discharge times and reasons for delay were collected by Critical Care Outreach Nurses and Critical Care Nurse Specialists. RESULTS Of the 516 discharges in 2008 (488 patients, compared to 607 in 2000/2001), 31% were delayed from ICU by more than 8 h, an increase of 6% from 2000/2001 (p<0.001). Patients in 2008 whose discharge was delayed spent more time in hospital from the time of their ICU admission (p<0.001). The most common reasons for delay in 2008 were no bed or delayed bed availability (53%) and medical concern (24%). This is in contrast to 2000/2001, when 80% of delays were due to no bed or delayed bed availability and 9% to medical concern. CONCLUSION Many factors impact on patient flow, and reducing ICU discharge delays requires a collaborative, multi-factorial approach that adapts to changing organisational policy on patient flow through ICU and the hospital, not just the discharge process in ICU.
Collapse
|
47
|
Effect of length of stay in intensive care unit on hospital and long-term mortality of critically ill adult patients. Br J Anaesth 2010; 104:459-64. [PMID: 20185517 DOI: 10.1093/bja/aeq025] [Citation(s) in RCA: 80] [Impact Index Per Article: 5.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/12/2022] Open
Abstract
BACKGROUND Critical illness leading to a prolonged length of stay (LOS) in an intensive care unit (ICU) is associated with significant mortality and resource utilization. This study assessed the independent effect of ICU LOS on in-hospital and long-term mortality after hospital discharge. METHODS Clinical and mortality data of 22 298 patients, aged 16 yr and older, admitted to ICU between 1987 and 2002 were included in this linked-data cohort study. Cox's regression with a restricted cubic spline function was used to model the effect of LOS on in-hospital and long-term mortality after adjusting for age, gender, acute physiology score (APS), maximum number of organ failures, era of admission, elective admission, Charlson's co-morbidity index, and diagnosis. The variability explained by each predictor was calculated as the percentage contribution of its χ² statistic to the total χ² statistic. RESULTS Most hospital deaths occurred within the first few days of ICU admission. Increasing LOS in ICU was not associated with an increased risk of in-hospital mortality after adjusting for other covariates, but was associated with an increased risk of long-term mortality after hospital discharge. The variability in long-term mortality associated with ICU LOS (2.3%) appeared to reach a plateau after the first 10 days in ICU and was not as important as age (35.8%), co-morbidities (18.6%), diagnosis (10.9%), and APS (3.6%). CONCLUSIONS LOS in ICU was not an independent risk factor for in-hospital mortality, but it had a small effect on long-term mortality after hospital discharge after adjustment for other risk factors.
Collapse
|
48
|
Abstract
Healthcare utilisation can affect quality of life and is important in assessing the cost-effectiveness of medical interventions. A clinical database was linked to two Australian state administrative databases to assess the difference in incidence of healthcare utilisation of 19,921 patients who survived their first episode of critical illness. The number of hospital admissions and days of hospitalisation per patient-year was respectively 150% and 220% greater after than before an episode of critical illness (assessed over the same time period). This was the case regardless of age or type of surgery (i.e. cardiac vs non-cardiac). After adjusting for the ageing effect of the cohort as a whole, there was still an unexplained two- to four-fold increase in hospital admissions per patient-year after an episode of critical illness. We conclude that an episode of critical illness is a robust predictor of subsequent healthcare utilisation.
Collapse
|
49
|
Abstract
To clarify the role of gene polymorphisms in the effect of losartan and losartan plus hydrochlorothiazide on blood pressure (primary end point) and on cardiac, vascular and metabolic phenotypes (secondary end points) after 4, 8, 12, 16 and 48 weeks of treatment, an Italian collaborative study - the Study of the Pharmacogenomics in Italian hypertensive patients treated with the Angiotensin receptor blocker losartan (SOPHIA) - was planned in never-treated essential hypertensives (n = 800). After an 8-week run-in, losartan 50 mg once daily will be given and doubled to 100 mg at week +4 if blood pressure is more than 140/90 mmHg. Hydrochlorothiazide 25 mg once daily will be added at week +8, and amlodipine 5 mg at week +16, if blood pressure remains more than 140/90 mmHg. Cardiac mass (echocardiography), carotid intima-media thickness, 24 h ambulatory blood pressure, homeostatic model assessment (HOMA) index, microalbuminuria, plasma renin activity and aldosterone, endogenous lithium clearance, brain natriuretic peptide and losartan metabolites will be evaluated. Genes of the renin-angiotensin-aldosterone system, salt sensitivity, the beta-adrenergic system and losartan metabolism will be studied (Illumina custom arrays). A whole-genome scan will also be performed in half of the study cohort (1M array, Illumina 500 GX BeadStation).
Collapse
|
50
|
Beyond the walls: A review of ICU clinics and their impact on patient outcomes after leaving hospital. Aust Crit Care 2008; 21:6-17. [DOI: 10.1016/j.aucc.2007.11.001] [Citation(s) in RCA: 44] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/13/2007] [Revised: 10/30/2007] [Accepted: 11/21/2007] [Indexed: 01/21/2023] Open
|