1
A population-based comparison of joint survival of hemiarthroplasty versus total shoulder arthroplasty in osteoarthritis and rheumatoid arthritis. Bone Joint J 2019; 101-B:454-460. [DOI: 10.1302/0301-620x.101b4.bjr-2018-0620.r1]
Abstract
Aims Few studies have compared survivorship of total shoulder arthroplasty (TSA) with hemiarthroplasty (HA). This observational study compared survivorship of TSA with HA while controlling for important covariables and accounting for death as a competing risk. Patients and Methods All patients who underwent shoulder arthroplasty in Ontario, Canada between April 2002 and March 2012 were identified using population-based health administrative data. We used the Fine–Gray sub-distribution hazard model to measure the association of arthroplasty type with time to revision surgery (accounting for death as a competing risk), controlling for age, gender, Charlson Comorbidity Index, income quintile, diagnosis, and surgeon factors. Results During the study period, 5777 patients underwent shoulder arthroplasty (4079 TSA, 70.6%; 1698 HA, 29.4%), 321 (5.6%) underwent revision, and 1090 (18.9%) died. TSA patients were older (TSA mean age 68.4 years (sd 10.2) vs HA mean age 66.5 years (sd 12.7); p = 0.001). The proportion of female patients was slightly lower in the TSA group (58.0% vs 58.4%). The adjusted association between surgery type and time to shoulder revision interacted significantly with patient age. Compared with TSA patients, revision was more common in the HA group (adjusted hazard ratio (HR) 1.214, 95% confidence interval (CI) 0.96 to 1.53) but this did not reach statistical significance. Conclusion Although there was a trend towards higher revision risk in patients undergoing HA, we found no statistically significant difference in survivorship between patients undergoing TSA or HA. Cite this article: Bone Joint J 2019;101-B:454–460.
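The competing-risks idea used above (revision as the event of interest, death as a competing event) can be illustrated with a small Aalen–Johansen-style cumulative incidence estimator. This is a generic sketch on made-up data, not the Fine–Gray model or this study's analysis:

```python
def cumulative_incidence(times, events, event_of_interest):
    """Aalen-Johansen cumulative incidence with competing risks.

    times  : follow-up time for each subject
    events : 0 = censored, 1 = event of interest (e.g. revision),
             2 = competing event (e.g. death)
    Returns a list of (time, CIF) points for the event of interest.
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0   # overall event-free survival just before t
    cif = 0.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        evs = [e for (tt, e) in data if tt == t]   # all subjects tied at t
        d_any = sum(1 for e in evs if e != 0)       # any event at t
        d_int = sum(1 for e in evs if e == event_of_interest)
        if d_int:
            cif += surv * d_int / n_at_risk
            curve.append((t, cif))
        if d_any:
            surv *= 1 - d_any / n_at_risk
        n_at_risk -= len(evs)
        i += len(evs)
    return curve

# toy data: 4 patients; 1 = revision, 2 = death, 0 = censored
print(cumulative_incidence([1, 2, 3, 4], [1, 2, 1, 0], 1))
# -> [(1, 0.25), (3, 0.5)]
```

Unlike a naive Kaplan–Meier on revision alone, this estimator does not treat deaths as censored, so the revision and death incidences never sum to more than 1.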
2
The Ottawa SAH search algorithms: protocol for a multi-centre validation study of primary subarachnoid hemorrhage prediction models using health administrative data (the SAHepi prediction study protocol). BMC Med Res Methodol 2018; 18:94. [PMID: 30219029] [PMCID: PMC6139177] [DOI: 10.1186/s12874-018-0553-3]
Abstract
Background Conducting prospective epidemiological studies of hospitalized patients with rare diseases like primary subarachnoid hemorrhage (pSAH) is difficult due to time and budgetary constraints. Routinely collected administrative data could remove these barriers. We derived and validated 3 algorithms to identify hospitalized patients with a high probability of pSAH using administrative data. We aim to externally validate their performance in four hospitals across Canada. Methods Eligible patients include those ≥18 years of age admitted to these centres from January 1, 2012 to December 31, 2013. We will include patients whose discharge abstracts contain predictive variables identified in the models (ICD-10-CA diagnostic codes I60** (subarachnoid hemorrhage), I61** (intracranial hemorrhage), I62** (other nontraumatic intracranial hemorrhage), I67** (other cerebrovascular disease), S06** (intracranial injury), and G97 (other postprocedural nervous system disorder), and CCI procedural codes 1JW51 (occlusion of intracranial vessels), 1JE51 (occlusion of carotid artery), 3JW10 (intracranial vessel imaging), 3FY20 (CT scan, soft tissue of neck), and 3OT20 (CT scan, abdominal cavity)). The algorithms will be applied to each patient and the diagnosis confirmed via chart review. We will assess each model's sensitivity, specificity, and negative and positive predictive value across the sites. Discussion Validating the Ottawa SAH Prediction Algorithms will provide a way to accurately identify large SAH cohorts, thereby furthering research and altering care.
3
Association of Frailty and 1-Year Postoperative Mortality Following Major Elective Noncardiac Surgery: A Population-Based Cohort Study. J Vasc Surg 2016. [DOI: 10.1016/j.jvs.2016.08.073]
4
Real-Time Polymerase Chain Reaction Detection of Methicillin-Resistant Staphylococcus aureus: Impact on Nosocomial Transmission and Costs. Infect Control Hosp Epidemiol 2007; 28:1134-41. [PMID: 17828689] [DOI: 10.1086/520099]
Abstract
OBJECTIVES To assess the impact of real-time polymerase chain reaction (PCR) detection of methicillin-resistant Staphylococcus aureus (MRSA) on nosocomial transmission and costs. DESIGN Monthly MRSA detection rates were measured from April 1, 2000, through December 31, 2005. Time series analysis was used to identify changes in MRSA detection rates, and decision analysis was used to compare the costs of detection by PCR and by culture. SETTING A 1,200-bed, tertiary care hospital in Canada. PATIENTS Admitted patients at high risk for MRSA colonization. MRSA detection using culture-based screening was compared with a commercial PCR assay. RESULTS The mean monthly incidence of nosocomial MRSA colonization or infection was 0.37 cases per 1,000 patient-days. The time-series model indicated a nonsignificant decrease of 0.14 cases per 1,000 patient-days per month (95% confidence interval, -0.18 to 0.46) after the introduction of PCR detection (P = .39). The mean interval from a reported positive result until contact precautions were initiated decreased from 3.8 to 1.6 days (P < .001). However, the cost of MRSA control increased from Can$605,034 to Can$771,609. Of 290 PCR-positive patients, 120 (41.4%) were placed under contact precautions unnecessarily because of the low specificity of the PCR assay used in the study; these patients contributed 37% of the increased cost. The modeling study predicted that the cost per patient would be higher with detection by PCR (Can$96) than by culture (Can$67). CONCLUSION Detection of MRSA by the PCR assay evaluated in this study was more costly than detection by culture for reducing MRSA transmission in our hospital. The cost benefit of screening by PCR varies according to incidences of MRSA colonization and infection, the predictive values of the assay used, and rates of compliance with infection control measures.
5
Impact of immunosuppressive medication on the risk of renal allograft failure due to recurrent glomerulonephritis. Am J Transplant 2009; 9:804-11. [PMID: 19353768] [DOI: 10.1111/j.1600-6143.2009.02554.x]
Abstract
Recurrent glomerulonephritis is a major problem in kidney transplantation but the role of immunosuppression in preventing this complication is not known. We used data from the United States Renal Data System to examine the effect of immunosuppressive medication on allograft failure due to recurrent glomerulonephritis for 41,272 patients undergoing kidney transplantation from 1990 to 2003. Ten-year incidence of graft loss due to recurrent glomerulonephritis was 2.6% (95% confidence interval [CI]: 2.3-2.8%). After adjusting for important covariates, the use of cyclosporine, tacrolimus, azathioprine, mycophenolate mofetil, sirolimus or prednisone was not associated with graft failure due to recurrent glomerulonephritis. There was no difference between cyclosporine and tacrolimus or between azathioprine and mycophenolate mofetil in the risk of graft failure due to recurrent glomerulonephritis. However, any change in immunosuppression during follow-up was independently associated with graft loss due to recurrence (adjusted hazard ratio 1.30, 95% CI: 1.06-1.58, p = 0.01). In patients with a pretransplant diagnosis of glomerulonephritis, the risk of graft loss due to recurrence was not associated with any specific immunosuppressive medication. The selection of immunosuppression for kidney transplant recipients should not be made with the goal of reducing graft failure due to recurrent glomerulonephritis.
6
Abstract
BACKGROUND Discharge from the hospital is a critical transition point in a patient's care. Incomplete handoffs at discharge can lead to adverse events for patients and result in avoidable rehospitalization. Care transitions are especially important for elderly patients and other high-risk patients who have multiple comorbidities. Standardizing the elements of the discharge process may help to address the gaps in quality and safety that occur when patients transition from the hospital to an outpatient setting. METHODS The Society of Hospital Medicine's Hospital Quality and Patient Safety committee assembled a panel of care transition researchers, process improvement experts, and hospitalists to review the literature and develop a checklist of processes and elements required for ideal discharge of adult patients. The discharge checklist was presented at the Society of Hospital Medicine's Annual Meeting in April 2005, where it was reviewed and revised by more than 120 practicing hospitalists and hospital-based nurses, case managers, and pharmacists. The final checklist was endorsed by the Society of Hospital Medicine. RESULTS The finalized checklist is a comprehensive list of the processes and elements considered necessary for optimal patient handoff at hospital discharge. This checklist focused on medication safety, patient education, and follow-up plans. CONCLUSIONS The development of content and process standards for discharge is the first step in improving the handoff of care from the inpatient to the posthospital setting. Refining this checklist for patients with specific diagnoses, in specific age categories, and with specific discharge destinations may further improve information transfer and ultimately affect patient outcomes.
7
Accuracy of coding for possible warfarin complications in hospital discharge abstracts. Thromb Res 2005; 118:253-62. [PMID: 16081144] [DOI: 10.1016/j.thromres.2005.06.015]
Abstract
BACKGROUND Hospital discharge abstracts could be used to identify complications of warfarin if coding for bleeding and thromboembolic events is accurate. OBJECTIVES To measure the accuracy of International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) codes for bleeding and thromboembolic diagnoses. SETTING University-affiliated, tertiary care hospital in Ottawa, Canada. PATIENTS A random sample of patients discharged between September 1999 and September 2000 with an ICD-9-CM code indicating a bleeding or thromboembolic diagnosis. METHODS Gold-standard coding was determined by a trained chart abstractor using explicit standard diagnostic criteria for bleeding, major bleeding, and acute thromboembolism. The abstractor was blinded to the original coding. We calculated the sensitivity, specificity, and positive and negative predictive values of the original ICD-9-CM codes for bleeding or thromboembolism diagnoses. RESULTS We reviewed 616 medical records. 361 patients (59%) had a code indicating a bleeding diagnosis, 291 patients (47%) had a code indicating a thromboembolic diagnosis, and 36 patients (6%) had a code indicating both. According to the gold-standard criteria, 352 patients experienced bleeding, 333 experienced major bleeding, and 188 experienced an acute thromboembolism. For bleeding, the ICD-9-CM codes had the following sensitivity, specificity, positive, and negative predictive values [95% CI]: 93% [90-96], 88% [83-91], 91% [88-94], and 91% [87-94], respectively. For major bleeding: 94% [91-96], 83% [78-87], 87% [83-90], and 92% [88-95], respectively. For thromboembolism: 97% [94-99], 74% [70-79], 62% [57-68], and 98% [96-99], respectively. By selecting a subgroup of ICD-9-CM codes for thromboembolism, the positive predictive value increased to 87%. CONCLUSION In our centre, the discharge abstract could be used to identify and exclude patients hospitalized with a major bleed or thromboembolism. If coding quality for bleeding is similar in other hospitals, these ICD-9-CM diagnostic codes could be used to study population-based warfarin-associated hemorrhagic complications using administrative databases.
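The sensitivity, specificity, and predictive values reported above come from a standard 2×2 comparison of the codes against the gold-standard chart review. A minimal sketch, with made-up counts rather than the study's data:

```python
def diagnostic_accuracy(tp, fp, fn, tn):
    """Standard 2x2 test-accuracy measures versus a gold standard.

    tp/fp/fn/tn: counts of true/false positives and negatives.
    """
    return {
        "sensitivity": tp / (tp + fn),   # P(code+ | condition present)
        "specificity": tn / (tn + fp),   # P(code- | condition absent)
        "ppv": tp / (tp + fp),           # P(condition present | code+)
        "npv": tn / (tn + fn),           # P(condition absent | code-)
    }

# hypothetical counts, for illustration only
m = diagnostic_accuracy(tp=90, fp=10, fn=7, tn=80)
print({k: round(v, 2) for k, v in m.items()})
```

Note that sensitivity and specificity are properties of the coding itself, while PPV and NPV also depend on how common the diagnosis is in the sampled records.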
8
Risk of Intracerebral Hemorrhage in Patients With Arterial Versus Cardiac Origin of Cerebral Ischemia on Aspirin or Placebo. Stroke 2004; 35:710-4. [PMID: 14764931] [DOI: 10.1161/01.str.0000116868.45282.67]
Abstract
BACKGROUND AND PURPOSE Patients who are anticoagulated after cerebral ischemia have a 19-fold-higher risk of intracerebral hemorrhage (ICH) if they had an arterial rather than a cardiac source. To determine whether this excess risk of ICH was due to the underlying disease (cerebral ischemia of arterial versus cardiac origin) or whether it depended on the antithrombotic regimen, we studied the risk of ICH in arterial versus cardiac origin of cerebral ischemia in patients who received aspirin or no antithrombotic drugs. METHODS Individual patient data from patients who received aspirin or placebo after cerebral ischemia were obtained from 9 clinical trials. Presence of atrial fibrillation was considered evidence of a cardiac source; otherwise, events were considered of arterial origin. Cox proportional-hazards modeling was used for univariate and multivariate analyses. RESULTS Fifty-four ICHs occurred in 16,625 patient-years in the aspirin-treated patients, and 7 ICHs occurred in 4,317 patient-years in those on placebo. After multivariate adjustment for age, sex, current smoking, history of hypertension and diabetes, and aspirin dose (aspirin-treated patients only), the hazard ratio for ICH in patients with an arterial versus a cardiac source was 0.74 (95% confidence interval, 0.30 to 1.82) for aspirin-treated patients and 4.34 (95% confidence interval, 0.35 to 54) for placebo-randomized patients. CONCLUSIONS Our findings do not confirm the previous finding of an excess risk of ICH in patients with cerebral ischemia of arterial origin. It therefore seems that cerebral ischemia of arterial origin is not by itself associated with an increased risk of ICH, but only in combination with high-intensity anticoagulation.
9
Risk of death or readmission is highest for Friday discharges from hospital. Hospital Quarterly 2002; 5:25-6. [PMID: 12357568]
10
Surveillance mammography after treatment of primary breast cancer: a systematic review. Breast 2002; 11:228-35. [PMID: 14965672] [DOI: 10.1054/brst.2001.0404]
Abstract
As the prevalence of diagnosed breast cancer increases, it is important to define how best to provide long-term follow-up. Whereas many aspects of follow-up remain controversial, guidelines recommend surveillance mammograms as the only investigation to be performed routinely. We conducted a systematic review of the literature to elucidate the effect of routine surveillance mammograms on detecting ipsilateral recurrence (IR) and contralateral breast cancers (CBC). The systematic review yielded 15 articles. All were observational studies and ranked as level II-2 or III evidence. There were no randomized controlled trials identified. Most of the ten studies on detection of IR did not report on outcomes after detection. When reported, most studies found that the method of detection of IR did not influence overall survival or disease-free survival. Two of the nine studies on detection of CBC found that the CBC was detected at an earlier stage than the initial breast cancer, but did not report on long-term outcomes. This systematic review highlights the need for further research to help better define the optimum surveillance mammography regimen.
11
Inhibition of serotonin reuptake by antidepressants and upper gastrointestinal bleeding in elderly patients: retrospective cohort study. BMJ 2001; 323:655-8. [PMID: 11566827] [PMCID: PMC55923] [DOI: 10.1136/bmj.323.7314.655]
Abstract
OBJECTIVES To determine the association between inhibition of serotonin reuptake by antidepressants and upper gastrointestinal bleeding. DESIGN Retrospective cohort study from population based databases. SETTING Ontario, Canada. PARTICIPANTS 317 824 elderly people observed for more than 130 000 person years. The patients started taking an antidepressant between 1992 and 1998 and were grouped by how much the drug inhibited serotonin reuptake. Patients were observed until they stopped the drug, had an upper gastrointestinal bleed, or died or the study ended. MAIN OUTCOME MEASURE Admission to hospital for acute upper gastrointestinal bleeding. RESULTS Overall, 974 bleeds were observed, with an overall bleeding rate of 7.3 per 1000 person years. After controlling for age or previous gastrointestinal bleeding, the risk of bleeding significantly increased by 10.7% and 9.8%, respectively, with increasing inhibition of serotonin reuptake. Absolute differences in bleeding between antidepressant groups were greatest for octogenarians (low inhibition of serotonin reuptake, 10.6 bleeds/1000 person years v high inhibition of serotonin reuptake, 14.7 bleeds/1000 person years; number needed to harm 244) and those with previous upper gastrointestinal bleeding (low, 28.6 bleeds/1000 person years v high, 40.3 bleeds/1000 person years; number needed to harm 85). CONCLUSIONS After age or previous upper gastrointestinal bleeding were controlled for, antidepressants with high inhibition of serotonin reuptake increased the risk of upper gastrointestinal bleeding. These increases are clinically important for elderly patients and those with previous gastrointestinal bleeding.
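The numbers needed to harm quoted above follow directly from the absolute rate difference (NNH = 1 / absolute risk increase). A quick check against the abstract's rates:

```python
def nnh(rate_low, rate_high, per=1000):
    """Number needed to harm from event rates expressed per
    `per` person-years: NNH = 1 / absolute rate difference."""
    return per / (rate_high - rate_low)

# octogenarians: 10.6 vs 14.7 bleeds per 1000 person-years
print(round(nnh(10.6, 14.7)))   # -> 244
# prior upper GI bleeding: 28.6 vs 40.3 per 1000 person-years
print(round(nnh(28.6, 40.3)))   # -> 85
```

The result is in person-years: roughly one extra bleed per 244 (or 85) person-years of treatment with a high-inhibition rather than a low-inhibition antidepressant.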
12
Abstract
BACKGROUND A discharge abstract must be completed for each hospitalization. The most time-consuming component of this task is a complete review of the doctors' progress notes to identify and code all diagnoses and procedures. We have developed a clinical database that creates hospital discharge summaries. OBJECTIVE To compare diagnostic and procedural coding from a clinical database with standard chart review by health records analysts (HRAs). PATIENTS All patients admitted to and discharged from general medical and surgical services at a teaching hospital in Ontario, Canada. METHODS Diagnostic and procedural codes were identified by reviewing discharge summaries generated from a clinical database. Independently, codes were identified by hospital health records analysts using chart review alone. Codes were compared with a gold-standard case review conducted by a health records analyst and a doctor. MAIN OUTCOME MEASURES Coding accuracy (percentage of codes in the gold-standard review) and completeness (percentage of gold-standard codes identified). RESULTS The study included 124 patients (mean length of stay 5.5 days; 66.4% medical patients). The accuracy of the most responsible diagnosis was 68.5% and 62.9% for the database (D) and chart review (C), respectively (P = 0.18). Overall, the database significantly improved the accuracy (D = 78.9% vs. C = 74.5%; P = 0.02) and completeness (D = 63.9% vs. C = 36.7%; P < 0.0001) of diagnostic coding. Although completeness of procedural coding was similar (D = 5.4% vs. C = 64.2%; P = NS), accuracy decreased with the database (D = 70.3% vs. C = 92.2%; P < 0.0001). Mean resource intensity weightings calculated from the codes (D = 1.3 vs. C = 1.4; P = NS) were similar. CONCLUSION Coding from a clinical database may circumvent the need for HRAs to review doctors' progress notes, while maintaining the quality of coding in the discharge abstract.
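Accuracy and completeness as defined above are precision/recall-style set comparisons between the generated codes and the gold-standard codes. A generic sketch with hypothetical code sets (not the study's data):

```python
def coding_quality(candidate_codes, gold_codes):
    """Accuracy = share of candidate codes confirmed by the gold
    standard; completeness = share of gold-standard codes found."""
    cand, gold = set(candidate_codes), set(gold_codes)
    hits = cand & gold
    return {
        "accuracy": len(hits) / len(cand),
        "completeness": len(hits) / len(gold),
    }

# hypothetical code lists for one chart
db_codes = {"428.0", "250.00", "401.9", "599.0"}
gold = {"428.0", "250.00", "486", "401.9", "584.9"}
print(coding_quality(db_codes, gold))
# -> {'accuracy': 0.75, 'completeness': 0.6}
```

The two measures trade off: a source that emits many codes tends to gain completeness at the cost of accuracy, which matches the database/chart-review contrast reported above.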
13
A time series would have been better. CMAJ 2001; 164:1835. [PMID: 11450278] [PMCID: PMC81190]
14
Abstract
Data in health research are frequently structured hierarchically. For example, data may consist of patients nested within physicians, who in turn may be nested in hospitals or geographic regions. Fitting regression models that ignore the hierarchical structure of the data can lead to false inferences being drawn from the data. Implementing a statistical analysis that takes into account the hierarchical structure of the data requires special methodologies. In this paper, we introduce the concept of hierarchically structured data, and present an introduction to hierarchical regression models. We then compare the performance of a traditional regression model with that of a hierarchical regression model on a dataset relating test utilization at the annual health exam with patient and physician characteristics. In comparing the resultant models, we see that false inferences can be drawn by ignoring the structure of the data.
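The cost of ignoring the hierarchy can be sketched numerically: with patients nested in physicians, the effective sample size shrinks by the design effect 1 + (m − 1)·ICC, so a naive regression understates standard errors and invites the false inferences described above. A small stdlib-only simulation on synthetic data (not the paper's dataset):

```python
import random
from statistics import mean

random.seed(42)

# simulate 20 "physicians" with 10 "patients" each, and a strong
# physician-level random intercept (between-cluster sd = 2, within sd = 1)
n_clusters, m = 20, 10
data = []
for _ in range(n_clusters):
    physician_effect = random.gauss(0, 2)
    data.append([physician_effect + random.gauss(0, 1) for _ in range(m)])

# one-way ANOVA estimate of the intraclass correlation (ICC)
grand = mean(v for c in data for v in c)
msb = m * sum((mean(c) - grand) ** 2 for c in data) / (n_clusters - 1)
msw = sum((v - mean(c)) ** 2 for c in data for v in c) / (n_clusters * (m - 1))
icc = (msb - msw) / (msb + (m - 1) * msw)

deff = 1 + (m - 1) * icc   # design effect: variance inflation factor
print(round(icc, 2), round(deff, 2))
```

With a true ICC of 0.8 here, the design effect is large: 200 clustered observations carry far less information than 200 independent ones, which is exactly why a hierarchical (mixed-effects) model is needed.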
15
Abstract
CONTEXT Most patients undergoing in-hospital cardiac resuscitation do not survive to hospital discharge. In a previous study, we developed a clinical decision aid for identifying all patients undergoing resuscitation who survived to hospital discharge. OBJECTIVE To validate our previously derived clinical decision aid. DESIGN, SETTING, AND PARTICIPANTS Data from a large registry of in-hospital resuscitations at a community teaching hospital in Georgia were analyzed to determine whether patients would be predicted to survive to hospital discharge (ie, whether their arrest was witnessed or their initial cardiac rhythm was either ventricular tachycardia or ventricular fibrillation or they regained a pulse during the first 10 minutes of chest compressions). Data from 2181 in-hospital cardiac resuscitation attempts in 1987-1996 involving 1884 pulseless patients were analyzed. MAIN OUTCOME MEASURE Comparison of predictions based on the decision aid with whether patients were actually discharged alive from the hospital. RESULTS For 327 resuscitations (15.0%), the patient survived to hospital discharge. For 324 of these resuscitations, the patients were predicted to survive to hospital discharge (sensitivity = 99.1%, 95% confidence interval, 97.1%-99.8%). In 269 resuscitations, patients did not satisfy the decision aid and were predicted to have no chance of being discharged from the hospital. Only 3 of these patients (1.1%) were discharged from the hospital (negative predictive value = 98.9%), none of whom were able to live independently following discharge from the hospital. CONCLUSION This decision aid can be used to help physicians identify patients who are extremely unlikely to benefit from continued resuscitative efforts.
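The decision aid validated above is a simple disjunctive rule (witnessed arrest, or initial rhythm VT/VF, or return of pulse within 10 minutes of compressions). As a predicate it looks roughly like this; the field names are my own, not the registry's:

```python
def may_survive_to_discharge(witnessed, initial_rhythm, minutes_to_pulse):
    """Return True if the patient satisfies any criterion of the
    decision aid, i.e. survival to discharge cannot be ruled out.

    witnessed        : was the arrest witnessed?
    initial_rhythm   : e.g. "VT", "VF", "asystole", "PEA"
    minutes_to_pulse : minutes of compressions until pulse, or None
    """
    return (
        witnessed
        or initial_rhythm in ("VT", "VF")
        or (minutes_to_pulse is not None and minutes_to_pulse <= 10)
    )

print(may_survive_to_discharge(False, "asystole", None))  # -> False
print(may_survive_to_discharge(False, "VF", None))        # -> True
```

The rule is tuned for a very high negative predictive value: patients failing all three criteria (returning False) were almost never discharged alive in the validation registry.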
16
Vitamin B12 injections versus oral supplements. How much money could be saved by switching from injections to pills? Can Fam Physician 2001; 47:79-86. [PMID: 11212437] [PMCID: PMC2014701]
Abstract
OBJECTIVE To estimate savings, using a third-party payer perspective, if all elderly patients currently receiving vitamin B12 (cobalamin) injections were switched to high-dose oral therapy. DESIGN We modeled high-dose oral B12 supplement costs to include drugs, pharmacists' fees, and one-time conversion costs consisting of two physician visits and laboratory monitoring. The number of vitamin-injection visits avoided by switching to oral therapy was predicted using a multivariate model that considered covariates for overall patient illness. SETTING Ontario family physicians' and internists' practices. PARTICIPANTS Population-based administrative databases for Ontario were used to identify all people between 65 and 100 years who received parenteral vitamin B12 during 1995 and 1996. MAIN OUTCOME MEASURES The cost of parenteral vitamin B12 for each patient, including drugs, injections, pharmacists' fees, and injection-associated physician visits, was measured directly from the databases. RESULTS The annual cost of parenteral vitamin B12 therapy averaged $145.88 per person and totaled a maximum $25 million over 5 years. Converting all patients to high-dose oral B12 and treating them for 5 years would cost $7.4 million. Depending on how many vitamin-injection visits are avoided by switching to oral therapy, between $2.9 million and $17.6 million would be saved. Switching to oral B12 administration saved costs as long as 16.3% of injection-associated visits were avoided. CONCLUSION Switching all patients from B12 injections to oral cobalamin therapy could result in substantial savings.
17
An hypothesis paper on practice environment and the provision of health care: could hospital occupancy rates effect quality? J Qual Clin Pract 2000; 20:69-74. [PMID: 11057987] [DOI: 10.1046/j.1440-1762.2000.00371.x]
Abstract
This paper explores whether hospital occupancy influences quality of care. It discusses the 'systems' theory of error causation in the context of adverse medical outcomes and relates how high occupancy rates may cause problems in hospital systems. The evidence relating quality of care to occupancy is reviewed. Finally, a new method of studying this relationship using time series analysis is proposed. We conclude that the relationship requires further exploration, since revealing 'system' problems may make clinicians more willing to expose medical errors.
18
Postmenopausal estrogen replacement therapy and increased rates of cholecystectomy and appendectomy. CMAJ 2000; 162:1421-4. [PMID: 10834045] [PMCID: PMC1232454]
Abstract
BACKGROUND Several studies have indicated that estrogen may prime inflammatory and nociceptive pathways, leading to symptoms that mimic cholecystitis. We set out to confirm the relation between recent estrogen use and cholecystectomy in postmenopausal women and to test the novel hypothesis that a similar relation exists for appendectomy. METHODS We developed a retrospective cohort using prescribing and surgical procedure information from health administrative databases for approximately 800,000 female residents of Ontario who were over 65 years of age between July 1, 1993, and Mar. 31, 1998. We compared the incidence of cholecystectomy and appendectomy among women recently prescribed estrogen replacement therapy, levothyroxine and dihydropyridine calcium-channel antagonists (DCCA) using age-adjusted Cox proportional hazards models. Patients were followed for a mean of 540 (standard deviation [SD] 449) days. RESULTS Compared with women taking DCCA, those who had recently begun taking estrogen were significantly more likely to undergo cholecystectomy (age-adjusted risk ratio [aRR] 1.9, 95% confidence interval [CI] 1.6-2.2) and appendectomy (aRR 1.8, 95% CI 1.1-3.0). No significant difference in either outcome measure was found between the levothyroxine users and the DCCA users. INTERPRETATION This study identifies an increased risk of cholecystectomy and appendectomy among postmenopausal women who have recently begun estrogen replacement therapy.
19
Abstract
Evidence-based guidelines recommend few routine investigations for healthy adults at the periodic health examination (PHE). However, small studies indicate that laboratory tests are commonly ordered at the PHE. This study examined PHE laboratory testing that is not recommended by recognized guidelines ('discretionary'). Using administrative data from the universal health care system in Ontario, Canada, we studied 792,844 adults having a PHE in 1996 and the 3,727 physicians who administered them. We measured the number of discretionary laboratory tests per PHE along with the patient and physician factors potentially influencing laboratory testing. A multilevel, multivariate model was used to examine the association between the number of discretionary laboratory tests at the PHE with patient and physician characteristics. A mean of 7.1 discretionary tests (SD 7.1) was ordered per PHE. Renal, haematological, glucose and lipid tests each were conducted in more than a third of PHEs. Testing varied extensively between physicians and was more common in healthy patients. With the exception of age, patient factors had little effect on discretionary testing. However, each physician factor we examined was independently associated with the number of discretionary tests. Physician specialty, practice volume and previous testing patterns had the strongest influence on discretionary testing. Discretionary investigations are common at the PHE. Testing varies extensively between physicians and seems to be driven more by physician than by patient factors. Interventions to modify discretionary test utilization at the PHE should consider these physician factors.
20

21
Abstract
BACKGROUND The validity of a review depends on its methodologic quality. OBJECTIVE To determine the methodologic quality of recently published review articles. DESIGN Critical appraisal. SETTING All reviews of clinical topics published in six general medical journals in 1996. MEASUREMENTS Explicit criteria that have been published and validated were used. RESULTS Of 158 review articles, only 2 satisfied all 10 methodologic criteria (median number of criteria satisfied, 1). Less than a quarter of the articles described how evidence was identified, evaluated, or integrated; 34% addressed a focused clinical question; and 39% identified gaps in existing knowledge. Of the 111 reviews that made treatment recommendations, 48% provided an estimate of the magnitude of potential benefits (and 34%, the potential adverse effects) of the treatment options, 45% cited randomized clinical trials to support their recommendations, and only 6% made any reference to costs. CONCLUSIONS The methodologic quality of clinical review articles is highly variable, and many of these articles do not specify systematic methods.
22
Surveying physicians to determine the minimal important difference: implications for sample-size calculation. J Clin Epidemiol 1999; 52:717-23. [PMID: 10465315 DOI: 10.1016/s0895-4356(99)00050-5] [Citation(s) in RCA: 58] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/17/2022]
Abstract
The minimal important difference (MID) is the smallest benefit of treatment that would result in clinicians recommending it to their patients. The MID is necessary to calculate sample size for randomized clinical trials, but its chosen value is often arbitrary. This study set out to determine the practicability of surveying physicians to elicit the MID for clinical trial sample-size calculation. Using a mail survey, we elicited the MID of different physician specialties (family medicine, internal medicine, vascular surgery) for using propranolol to slow abdominal aortic aneurysm (AAA) growth assuming that propranolol was efficacious in this condition. We used different outcome measures (growth rate or proportion of patients requiring surgery) and different methods of data presentation for the proportion of patients requiring surgery (absolute risk reduction or number needed to treat). The MID varied significantly by physician specialty, experience with AAA and propranolol, and the method used to elicit the MID. Consequently, sample-size calculations using these various MIDs varied from 116 to 3015. Future attempts to elicit the MID need to consider carefully who is surveyed, how data are presented, and how opinions are elicited.
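The spread in required sample sizes (116 to 3015) follows directly from how the MID enters the standard two-proportion sample-size formula. The sketch below is illustrative only: the 30% baseline event rate, 80% power, and the MID values are assumptions for demonstration, not figures from the study, and the normal-approximation formula may differ from the method the authors used.

```python
import math
from statistics import NormalDist

def n_per_group(p_control, mid, alpha=0.05, power=0.80):
    """Approximate per-group sample size for a two-proportion comparison,
    where `mid` is the absolute risk reduction judged minimally important."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided type I error
    z_b = NormalDist().inv_cdf(power)          # type II error / power
    p_treat = p_control - mid                  # treated-group proportion under the MID
    var = p_control * (1 - p_control) + p_treat * (1 - p_treat)
    return math.ceil((z_a + z_b) ** 2 * var / mid ** 2)

# Shrinking the MID inflates the required trial size roughly quadratically:
for mid in (0.15, 0.10, 0.05):
    print(f"MID {mid:.2f}: n = {n_per_group(0.30, mid)} per group")
```

Because the MID appears squared in the denominator, halving the elicited MID roughly quadruples the trial size, which is why the choice of whom to survey and how to elicit opinions matters so much.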
23
Abstract
The objective of this study was to determine what physicians perceive to be necessary for high-quality discharge summaries. One-on-one surveys of 100 hospital-based physicians-in-training and community family physicians were conducted. Participants indicated the amount that 56 items contributed to discharge summary quality on a 15-category ordinal scale. Results were transformed to a continuous scale, extending from -6.6 ("item makes summary useless") through 0 ("item has no effect on discharge summary quality") to 10 ("item is so essential that summary is useless without it"). Quality decreased significantly when summary length exceeded 2 pages and when the delay from patient discharge to summary delivery increased. Summary content that increased quality most included admission diagnosis (mean 8.2; 95% confidence interval [7.7, 8.6]), pertinent physical examination findings (6.6 [6.0, 7.2]) and laboratory results (6.8 [6.3, 7.4]), procedures (7.1 [6.7, 7.6]) and complications in hospital (7.1 [6.6, 7.5]), discharge diagnosis (8.8 [8.4, 9.1]), discharge medications (7.9 [7.4, 8.4]), active medical problems at discharge (7.8 [7.4, 8.2]), and follow up (6.6 [6.0, 7.1]). With minor exceptions, hospital and family physicians agreed on contributors to summary quality. For this sample of physicians, summaries were of high quality when they were short, delivered quickly, and contained pertinent data that concentrated upon discharge information.
24
Economic evaluations of technologies to minimize perioperative transfusion: a systematic review of published studies. International Study of Peri-operative Transfusion (ISPOT) investigators. Transfus Med Rev 1999; 13:106-17. [PMID: 10218234 DOI: 10.1016/s0887-7963(99)80005-4] [Citation(s) in RCA: 40] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/24/2022]
25
Dictated versus database-generated discharge summaries: a randomized clinical trial. CMAJ 1999; 160:319-26. [PMID: 10065073 PMCID: PMC1230033] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 02/11/2023] Open
Abstract
BACKGROUND Hospital discharge summaries communicate information necessary for continuing patient care. They are most commonly generated by voice dictation and are often of poor quality. The objective of this study was to compare discharge summaries created by voice dictation with those generated from a clinical database. METHODS A randomized clinical trial was performed in which discharge summaries for patients discharged from a general internal medicine service at a tertiary care teaching hospital in Ottawa were created by voice dictation (151 patients) or from a database (142 patients). Patients had been admitted between September 1996 and June 1997. The trial was preceded by a baseline cohort study in which all summaries were created by dictation. For the database group, information on forms completed by housestaff was entered into a database and collated into a discharge summary. For the dictation group, housestaff dictated narrative letters. The proportion of patients for whom a summary was generated within 4 weeks of discharge was recorded. Physicians receiving the summary rated its quality, completeness, organization and timeliness on a 100-mm visual analogue scale. Housestaff preference was also determined. RESULTS Patients in the database group and the dictation group were similar. A summary was much more likely to be generated within 4 weeks of discharge for patients in the database group than for those in the dictation group (113 [79.6%] v. 86 [57.0%]; p < 0.001). Summary quality was similar (mean rating 72.7 [standard deviation (SD) 19.3] v. 74.9 [SD 16.6]), as were assessments of completeness (73.4 [SD 19.8] v. 78.2 [SD 14.9]), organization (77.4 [SD 16.3] v. 79.3 [SD 17.2]) and timeliness (70.3 [SD 21.9] v. 66.2 [SD 25.6]). Many information items of interest were more likely to be included in the database-generated summaries. The database system created summaries faster and was preferred by housestaff. Dictated summaries in the baseline and randomized studies were similar, which indicated that the control group was not substantially different from the baseline cohort. INTERPRETATION The database system significantly increased the likelihood that a discharge summary was created. Housestaff preferred the database system for summary generation. Physicians thought that the quality of summaries generated by the 2 methods was similar. The use of computer databases to create hospital discharge summaries is promising and merits further study and refinement.
26
Derivation of a clinical decision rule for the discontinuation of in-hospital cardiac arrest resuscitations. ARCHIVES OF INTERNAL MEDICINE 1999; 159:129-34. [PMID: 9927094 DOI: 10.1001/archinte.159.2.129] [Citation(s) in RCA: 68] [Impact Index Per Article: 2.7] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/14/2022]
Abstract
BACKGROUND Most patients undergoing in-hospital cardiac resuscitation will not survive to hospital discharge. OBJECTIVE To derive a decision rule permitting the discontinuation of futile resuscitation attempts by identifying patients with no chance of surviving to hospital discharge. PATIENTS AND METHODS Patient, arrest, and outcome data for 1077 adult patients undergoing in-hospital cardiac resuscitation were retrieved from 2 randomized clinical trials involving 5 teaching hospitals at 2 university centers. Recursive partitioning was used to identify a decision rule using variables significantly associated with death in hospital. RESULTS One hundred three patients (9.6%) survived to hospital discharge. Death in hospital was significantly more likely if patients were older than 75 years (P<.001), the arrest was unwitnessed (P = .003), the resuscitation lasted longer than 10 minutes (P<.001), and the initial cardiac rhythm was not ventricular tachycardia or fibrillation (P<.001). All patients died if there was no pulse 10 minutes after the start of cardiopulmonary resuscitation, the initial cardiac rhythm was not ventricular tachycardia or fibrillation, and the arrest was not witnessed. As a resuscitation rule, these parameters identified all patients who survived to hospital discharge (sensitivity, 100%; 95% confidence interval, 97.1%-100%). Resuscitation could have been discontinued for 119 (12.1%) of 974 patients who did not survive, thereby avoiding 47 days of postresuscitative care. CONCLUSIONS A practical and highly sensitive decision rule has been derived that identifies patients with no chance of surviving in-hospital cardiac arrest. Prospective validation of the rule is necessary before it can be used clinically.
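The reported interval (sensitivity 100%, lower limit 97.1%) is consistent with the exact one-sided binomial lower bound for 103 of 103 correct classifications. The check below assumes that method; the abstract does not name the interval it used.

```python
# When all n trials succeed, Pr(all n correct | sensitivity = p) = p**n.
# The one-sided 95% lower confidence bound therefore solves p**n = 0.05,
# giving p_lower = 0.05 ** (1 / n).
n = 103        # survivors to discharge, all flagged as salvageable by the rule
p_lower = 0.05 ** (1 / n)
print(f"one-sided 95% lower bound for sensitivity: {p_lower:.3f}")  # ≈ 0.971
```

This reproduces the 97.1% lower limit, illustrating why a rule can show 100% observed sensitivity yet still carry non-trivial uncertainty until validated prospectively in a larger cohort.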
27
Abstract
CONTEXT Previous studies have identified methods of decreasing laboratory utilization. However, most were hospital-based, relatively small, single-centered, or of limited duration. OBJECTIVE To determine the effect of 3 population-based interventions (physician guidelines, laboratory requisition form modification, and changes to funding policy) on laboratory utilization in Ontario. DESIGN Interventional time-series analysis in which data analysis was based on all claims made to the Ontario Health Insurance Program between July 1, 1991, and April 1997 for laboratory tests affected by the interventions. SETTING All clinical laboratories (not based in hospitals) in Ontario. INTERVENTIONS Physician guidelines, modification of laboratory requisition form, and changes in funding policy for the use of the erythrocyte sedimentation rate test (ESR), microscopic urinalysis, tests for renal function, iron stores, serum urea, and serum iron determinations, and tests for thyroid dysfunction (total thyroxine and thyroid-stimulating hormone [TSH]). MAIN OUTCOME MEASURES Change from 1991 to 1997 in utilization rates of ESR, microscopic urinalysis, serum urea and iron determinations, and tests for total thyroxine and TSH. RESULTS Age- and sex-standardized rates for laboratory tests unaffected by the interventions were stable during the study period. Utilization of ESR and urea determination decreased by 58% (P<.001) and 57% (P<.001), respectively, after they were removed from the requisition form and guidelines discouraging their use were disseminated. Rates for urinalyses without microscopy increased by 1700% (P<.001), while microscopic urinalysis decreased by 14% (P<.001), after a policy change eliminated microscopic urinalysis from routine urinalysis. Rates of iron determination declined by 80% (P<.001) and ferritin rates increased by 34% (P = .05) when policy changes eliminated iron testing when ordered with ferritin and guidelines advocating ferritin alone for investigating iron deficiency were disseminated. Utilization of total thyroxine testing declined by 96% (P = .02) when the provincial health plan stopped its funding. When TSH was removed from the laboratory requisition form, a 12% decline (P = .03) in its use was observed. Through April 1997, these interventions saved more than 625,000 tests or $210,400. CONCLUSIONS The combination of guideline dissemination, laboratory requisition form modification, and changes to funding policy was associated with significant reductions in laboratory utilization.
28
Do advanced cardiac life support drugs increase resuscitation rates from in-hospital cardiac arrest? The OTAC Study Group. Ann Emerg Med 1998; 32:544-53. [PMID: 9795316 DOI: 10.1016/s0196-0644(98)70031-9] [Citation(s) in RCA: 118] [Impact Index Per Article: 4.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/28/2022]
Abstract
STUDY OBJECTIVE The benefit of Advanced Cardiac Life Support (ACLS) medications during cardiac resuscitation is uncertain. The objective of this study was to determine whether the use of these medications increased resuscitation from in-hospital cardiac arrest. METHODS A prospective cohort of patients undergoing cardiac arrest in 1 of 5 academic hospitals was studied. Patient and arrest factors related to resuscitation outcome were recorded. We determined the association of the administration of ACLS drugs (epinephrine, atropine, bicarbonate, calcium, lidocaine, and bretylium) with survival at 1 hour after resuscitation. RESULTS Seven hundred seventy-three patients underwent cardiac resuscitation, with 269 (34.8%) surviving for 1 hour. Use of epinephrine, atropine, bicarbonate, calcium, and lidocaine was associated with a decreased chance of successful resuscitation (P <.001 for all except lidocaine, P <.01). While controlling for significant patient factors (age, gender, and previous cardiac or respiratory disease) and arrest factors (initial cardiac rhythm, and cause of arrest), multivariate logistic regression demonstrated a significant association between unsuccessful resuscitation and the use of epinephrine (odds ratio .08 [95% confidence interval .04-.14]), atropine (.24 [.17-.35]), bicarbonate (.31 [.21-.44]), calcium (.32 [.18-.55]), and lidocaine (.48 [.33-.71]). Drug effects did not improve when patients were grouped by their initial cardiac rhythm. Cox proportional hazards models that controlled for significant confounders demonstrated that survivors were significantly less likely to receive epinephrine (P < .001) or atropine (P < .001) throughout the arrest. CONCLUSION We found no association between standard ACLS medications and improved resuscitation from in-hospital cardiac arrest. Randomized clinical trials are needed to determine whether other therapies can improve resuscitation from cardiac arrest when compared with the presently used ACLS drugs.
29
Abstract
OBJECTIVE Laboratory utilization has steadily increased, and some studies suggest inappropriate utilization. Therefore, we wished to assess studies that measure inappropriate laboratory use in light of methodological criteria. DESIGN Systematic review of published studies. DATA SOURCES MEDLINE, HEALTHSTAR, and EMBASE databases were searched from 1966 to September 1997 using a broad and inclusive strategy with no language restriction. In addition, the references of all retrieved studies and 3 textbooks on diagnostic testing were hand-searched. STUDY SELECTION All studies that provided and applied criteria for inappropriate laboratory use. DATA EXTRACTION Studies were categorized based on whether the criteria were implicit (objective criteria for inappropriate utilization not provided or very broad) or explicit. Guidelines for evaluation were applied to each study by a single reviewer. DATA SYNTHESIS Forty-four eligible studies were identified. Eleven studies used implicit criteria for inappropriate laboratory utilization and contained small numbers of patients or physicians. Most did not adequately assess the reliability of the implicit criteria. Thirty-three studies used explicit criteria based on the appropriateness of test choice, frequency, and timing, as well as the probability of a positive result. There were large variations in the estimates of inappropriate laboratory use (4.5%-95%). Evidence supporting the explicit criteria was frequently weak by the standards suggested for therapeutic maneuvers, but was nonetheless compelling based on principles of physiology, pharmacology, and probability. CONCLUSIONS Many studies identify inappropriate laboratory use based on implicit or explicit criteria that do not meet methodological standards suggested for audits of therapeutic maneuvers. Researchers should develop alternative evidentiary standards for measuring inappropriateness of laboratory test use.
30
Standardized or narrative discharge summaries. Which do family physicians prefer? CANADIAN FAMILY PHYSICIAN MEDECIN DE FAMILLE CANADIEN 1998; 44:62-9. [PMID: 9481464 PMCID: PMC2277571] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Subscribe] [Scholar Register] [Indexed: 02/06/2023]
Abstract
OBJECTIVES To determine whether family physicians prefer discharge summaries in narrative or standardized format and to determine factors affecting this preference. DESIGN Mailed survey. SETTING Internal medicine ward at a teaching hospital. PARTICIPANTS Random sample of 180 family physicians practising in the Ottawa-Carleton area. Of the original sample, 20 were not family physicians and were excluded. Of the 160 physicians remaining, 126 responded for a response rate of 78.8%. INTERVENTION For a stratified random sample of patients, medical records and narrative discharge summaries were abstracted using a data acquisition form to capture essential information. Information on completed forms was transformed into standardized summaries. Physicians were sent both narrative and standardized summaries. MAIN OUTCOME MEASURE Physicians' format preference as indicated on an ordinal 7-point scale. RESULTS The standardized format was preferred with a score of 4.28 versus 3.84 for the narrative (P < .05). Responses indicated the standardized format provided information most relevant to ongoing care, with a mean score of 4.82 (95% confidence interval [CI] 4.48 to 5.15), and easier access to summary information (5.60, CI 5.30 to 5.89). The narrative summary better described patients' admission (3.54, CI 3.18 to 3.90). Preference for standardized summaries correlated with lengthier narrative summary (P < .05), shorter length of stay (P < .05), and physicians' dissatisfaction with previous summaries (P < .001). Standardized discharge summaries were significantly shorter (302 versus 619 words, P = .004) than narrative summaries. CONCLUSIONS Physicians preferred a standardized format for discharge summaries. Format preference is influenced by physician, patient, and discharge summary characteristics.
31
Abstract
In this article, we review guidelines which may be used to evaluate studies documenting prognosis. We describe a clinical problem involving the prognosis of a patient in an intensive care unit. An approach to the literature search is then outlined. The results of the literature search are described and criteria for the appraisal of articles describing prognosis and prognostic factors are discussed using one article as an example.
32
Quality assessment of a discharge summary system. CMAJ 1995; 152:1437-42. [PMID: 7728692 PMCID: PMC1337907] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/26/2023] Open
Abstract
OBJECTIVE To assess the completeness of hospital discharge summaries and the efficiency of the discharge summary system in two urban teaching hospitals. DESIGN Descriptive study, with follow-up telephone survey. SETTING General internal medicine services at two urban tertiary care hospitals affiliated with the University of Ottawa. PATIENTS A total of 135 patient charts, representing 10% of the patients discharged from the services between Aug. 1 and Dec. 31, 1993. Three charts were unavailable for review, and 26 were excluded because of patient death, early patient discharge (within 48 hours after admission) or lack of discharge summary; this left 106 summaries for analysis of completeness and 114 (including the charts without a summary) for analysis of efficiency. OUTCOME MEASURES Completeness: proportion of summaries in which the following information was reported: admission diagnosis, drug allergies, physical examination, significant laboratory tests and results, discharge diagnosis, discharge medications and medical follow-up. Efficiency: time taken to generate the discharge summary and whether the patient's family physician received it. RESULTS Of the 106 charts with a discharge summary, information was available from the dictation system database for all but one (99.1% complete). Information was missing on the admission diagnosis in 34.0% (36/106) of the summaries, the discharge diagnosis in 25.5% (27/106) and the discharge medications in 22.8% (23/101). Of the 268 significant laboratory tests and results noted in the charts, 115 (42.9%) were not reported in the discharge summary. Of the 94 discharge summaries in charts with the patient's family physician listed on the facesheet, 38 (40.4%) were not received by the family physician. CONCLUSIONS Considerable deficiencies in the completeness of the discharge summaries and the efficiency of the discharge summary system were found in the participating hospitals. Replication of this study in other settings is indicated, and strategies to improve the process should be pursued.
33
False-positive coding for acute myocardial infarction on hospital discharge records: chart audit results from a tertiary centre. Can J Cardiol 1990; 6:383-6. [PMID: 2276072] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/31/2022] Open
Abstract
Hospital medical records staff enter diagnostic codes on charts using the International Classification of Diseases (Clinically Modified), Ninth Revision (ICD-9-CM). In a downtown Toronto tertiary hospital, 209 consecutive charts coded for acute myocardial infarction as the primary diagnosis in 1987-88 were reviewed. Criteria for documentation of acute myocardial infarction included symptomatic, electrocardiographic and enzymatic elements. Forty-three (21%) false-positives, ie, charts coded acute myocardial infarction where criteria were not fulfilled, were found (95% confidence interval 15 to 26%). Physician diagnosis of acute myocardial infarction appeared on the face sheet of 30 of the false-positive cases. Common reasons for false-positive face sheet entries and chart coding were acute myocardial infarction within the previous eight weeks with transfer or readmission for coronary angiography and other procedures; and presumed acute myocardial infarction on admission subsequently unproven or disproved. The false-positive proportion was similar to a Canadian study drawing on charts from hospitals of various sizes in 1977, lower than in recent reports from various American tertiary teaching hospitals (P less than 0.0001), and higher than in five Boston area community hospitals (P = 0.0005) where procedure-related transfers or readmissions of previous acute myocardial infarction patients were less likely. This audit lends credence to arguments that changes are needed in ICD-9-CM codes for acute myocardial infarction and in the assignation of reasons for hospitalization.