1. Early Post-Operative Events after Urethroplasty in Obese Patients: A Multi-Institutional Retrospective Series. Urology 2024:S0090-4295(24)00355-8. PMID: 38754790. DOI: 10.1016/j.urology.2024.05.003.
Abstract
OBJECTIVE To compare early urethroplasty outcomes in non-obese, obese, and morbidly obese patients undergoing urethroplasty for urethral stricture disease. The impact of obesity on outcomes is poorly understood but will become increasingly important as obesity rates continue to rise. METHODS Patients underwent urethroplasty at one of five institutions between January 2016 and December 2020. Obese (BMI 30-39.9, n = 72) and morbidly obese (BMI > 40, n = 49) patients were compared to normal-weight (BMI < 25, n = 29) and overweight (BMI 25-29.9, n = 51) patients. Demographics, comorbidities, and stricture characteristics were collected. Outcomes including complications, recurrence, and secondary interventions were compared using univariate and multivariate analysis. RESULTS 201 patients (mean BMI 34.1, range 18.4-65.2) with mean age 52.2 years (SD = 17.2) were analyzed. Median follow-up was 3.71 months. Obese patients were younger (p = 0.008) and had more anterior (p < 0.001) and more iatrogenic and lichen sclerosus (LS)-associated strictures (p = 0.036). The 60-day complication rate was 26.3%, with no differences between cohorts (p = 0.788). 9.5% of patients had extravasation at catheter removal, 18.9% reported stricture recurrence, and 7.4% required additional interventions. Obese patients had greater estimated blood loss (p = 0.001) and longer length of stay (p = 0.001). On multivariate analysis, smoking was associated with contrast leak (OR 7.176, 95% CI 1.13-45.5) but not with recurrence or need for intervention (p = 0.155 and 0.927). CONCLUSIONS Obese patients in our cohort had more anterior, iatrogenic, and LS-related strictures. However, obesity was not associated with complications, contrast leak, secondary interventions, or recurrence. Obese patients did have higher blood loss and longer length of stay. Urethroplasty is safe and effective in obese patients.
2. Radiation therapy for retroperitoneal sarcoma: practice patterns in North America. Radiat Oncol 2024; 19:38. PMID: 38491404. PMCID: PMC10943830. DOI: 10.1186/s13014-024-02407-8.
Abstract
BACKGROUND The addition of radiation therapy (RT) to surgery for retroperitoneal sarcoma (RPS) remains controversial. We examined practice patterns in the use of RT for patients with RPS over time in a large national cohort. METHODS Patients in the National Cancer Database (2004-2017) who underwent resection of RPS were included. Trends over time for proportions were calculated using contingency tables with the Cochran-Armitage trend test. RESULTS Of 7,485 patients who underwent resection, 1,821 (24.3%) received RT (adjuvant: 59.9%, neoadjuvant: 40.1%). Overall use of RT decreased by less than 1% annually (p = 0.0178). Use of neoadjuvant RT increased by an average of 13% annually, while use of adjuvant RT decreased by an average of 6% annually (p < 0.0001). Treatment at high-volume centers (OR 14.795, p < 0.0001) and tumor size > 10 cm (OR 2.009, p = 0.001) were associated with neoadjuvant RT. In contrast, liposarcomas (OR 0.574, p = 0.001) were associated with adjuvant RT. There was no statistically significant difference in overall survival between patients treated with surgery alone versus surgery and RT (p = 0.07). CONCLUSION In the United States, the use of RT for RPS has decreased over time, with a shift towards neoadjuvant RT. However, a large percentage of patients still receive adjuvant RT, mostly at low-volume hospitals.
3. Impact of vision impairment on discharge destination for patients with hip fracture. J Clin Orthop Trauma 2024; 50:102377. PMID: 38495681. PMCID: PMC10937224. DOI: 10.1016/j.jcot.2024.102377.
Abstract
Introduction Vision impairment (VI) due to low vision or blindness is a major sensory health problem that affects quality of life and contributes to an increased risk of falls and hip fractures (HF). Up to 60% of patients with hip fracture have VI, and VI further increases susceptibility to falls given the mobility challenges after HF. We sought to determine whether VI affects discharge destination for patients with HF. Materials and methods A cross-sectional analysis of 2015 Inpatient Medicare claims was performed; VI (blindness/low vision), HF, and HF surgery were identified using ICD-9 and ICD-10 codes. Patients who sustained a HF with a diagnosis of VI were categorized as HF + VI. The outcome measure was discharge destination: home, skilled nursing facility (SNF), long-term care facility (LTCF), or other. Results During the one-year ascertainment of inpatient claims, there were 10,336 total HF patients: 66.82% female and 91.21% non-Hispanic white, with mean (standard deviation) age 82.3 (8.2) years. The prevalence of VI increased with age, representing 1.49% (29/1941) of patients aged 65-74, 1.76% (63/3574) of patients aged 75-84, and 2.07% (100/4821) of patients aged 85 and older, although this age-related increase was not significant (P = 0.235). Patients with HF were most commonly discharged to a SNF (64.46%), followed by 'Other' (25.70%), home (7.15%), and LTCF (2.67%). VI was not associated with discharge destination. Male gender, Black race, systemic complications, and late postoperative discharge significantly predicted discharge to LTCF, with odds ratios (95% CI) of 1.42 (1.07-1.89), 1.90 (1.13-3.18), 2.27 (1.66-3.10), and 1.73 (1.25-2.39), respectively.
Conclusions The co-morbid presence of VI was not associated with altered discharge destinations to home, skilled nursing facility, LTCF or other setting.
4. The effect of frailty and age on outcomes in elective paraesophageal hernia repair. Surg Endosc 2023; 37:9514-9522. PMID: 37704792. DOI: 10.1007/s00464-023-10363-9.
Abstract
INTRODUCTION Paraesophageal hernia repair (PEHR) is a safe and effective operation. Previous studies have described risk factors for poor peri-operative outcomes, such as emergent operations or advanced patient age, and pre-operative frailty is a known risk factor in other major surgery. The goal of this retrospective cohort study was to determine whether markers of frailty were predictive of poor peri-operative outcomes in elective paraesophageal hernia repair. METHODS Patients who underwent elective PEHR between 1/2011 and 6/2022 at a single university-based institution were identified. Patient demographics, modified frailty index (mFI), and post-operative outcomes were recorded. A composite peri-operative morbidity outcome indicating the incidence of any of the following was used for statistical analysis: prolonged length of stay (≥ 3 days), increased discharge level of care, and 30-day complications or readmissions. Descriptive statistics and logistic regression were used to analyze the data. RESULTS Of 547 patients who underwent elective PEHR, the mean age was 66.0 ± 12.3 years, and 77.1% (n = 422) were female. Median length of stay was 1 day [IQR 1, 2]. ASA class was 3-4 in 65.8% (n = 360) of patients. The composite outcome occurred in 32.4% (n = 177) of patients. On multivariate analysis, increasing age (OR 1.021, p = 0.02), high frailty (OR 2.02, p < 0.01), ASA 3-4 (OR 1.544, p = 0.05), and redo-PEHR (OR 1.72, p = 0.02) were each independently associated with the composite outcome. On a regression of age for the composite outcome, a cutoff point of increased risk was identified at age 72 years (OR 2.25, p < 0.01). CONCLUSION High frailty and age over 72 years each independently confer double the odds of a composite morbidity outcome that includes prolonged post-operative stay, peri-operative complications, the need for a higher level of care after elective paraesophageal hernia repair, and 30-day readmission.
This provides additional information to counsel patients pre-operatively, as well as a potential opportunity for targeted pre-habilitation.
5. Thoracic retransplantation: Does time to retransplantation matter? J Thorac Cardiovasc Surg 2023; 166:1529-1541.e4. PMID: 36049964. DOI: 10.1016/j.jtcvs.2022.05.003.
Abstract
OBJECTIVE For some individuals, chronic allograft failure is best treated with retransplantation. We sought to determine whether time to retransplantation affects short- and long-term outcomes for heart or lung retransplant recipients whose time to retransplantation was more than 1 year. METHODS The United Network for Organ Sharing/Organ Procurement and Transplantation Network STAR file was queried for all adult, first-time heart (June 1, 2006, to September 30, 2020) and lung (May 1, 2005, to September 30, 2020) retransplantations with a time to retransplantation of at least 1 year. Patients were grouped by tertile of time to retransplantation (heart: tertile 1, 1-7.7 years; tertile 2, 7.7-14.7 years; tertile 3, more than 14.7 years; lung: tertile 1, 1-2.8 years; tertile 2, 2.8-5.6 years; tertile 3, more than 5.6 years). The primary outcome was survival after retransplantation. Comparative statistics identified differences between groups, and Kaplan-Meier methods and a Cox proportional hazards model were used for survival analysis. RESULTS After selection, 908 heart and 871 lung retransplants were identified. Among heart retransplant recipients, tertile 1 was associated with male sex, smoking history, higher listing status, and increased mechanical support pretransplant. Tertile 3 had the highest rate of concomitant kidney transplant; however, the incidence of morbidity and in-hospital mortality was similar among the groups. Unadjusted and adjusted analyses revealed no survival difference among the groups. Among lung retransplant recipients, tertile 1 was associated with increased lung allocation score, pretransplant hospitalization, and mechanical support. Unadjusted and adjusted survival analyses revealed decreased survival in tertile 1. CONCLUSIONS Time to retransplantation does not appear to affect survival for heart recipients retransplanted more than 1 year after their first transplant; however, shorter time to retransplantation for prior lung recipients is associated with decreased survival.
Potential lung retransplant candidates with a time to retransplantation of less than 2.8 years should be carefully evaluated before retransplantation.
6. A systematic review and meta-analysis of pancreatectomy rates following neoadjuvant therapy in patients with pancreatic ductal adenocarcinoma. Eur J Surg Oncol 2023. DOI: 10.1016/j.ejso.2022.11.188.
7. Donor and Recipient Hepatitis C Status Does Not Affect Rejection in Thoracic Transplantation. Ann Thorac Surg 2023; 115:221-230. PMID: 35940315. DOI: 10.1016/j.athoracsur.2022.05.072.
Abstract
BACKGROUND Donors with hepatitis C virus (HCV) have expanded the donor pool for heart and lung transplantation, but concerns have arisen about rejection. We examined the incidence of rejection after heart and lung transplantation in recipients of HCV-positive donor organs as well as in HCV-positive recipients. METHODS Adults undergoing heart and lung transplantation from March 31, 2015 to December 31, 2019 were identified in the United Network for Organ Sharing/Organ Procurement and Transplantation Network Standard Transplant Analysis and Research file. Patients were stratified as donor-recipient HCV negative, donor positive, or recipient positive. Comparative statistics and a multilevel logistic regression model were used. RESULTS A total of 10,624 heart transplant recipients met the criteria. Donor-positive recipients were significantly associated with older age, blood group O, and shorter waitlist time. No significant differences existed with regard to treatment for rejection in the first year (negative, 19.5%; donor positive, 22.3%; recipient positive, 19.5%; P = .45) or other outcomes. On regression analysis, HCV status was not associated with treated rejection; however, center variability was significantly associated with treated rejection (median odds ratio, 2.18). Similarly, 9,917 lung transplant recipients were identified. Donor-positive recipients were more commonly White and had obstructive disease and lower lung allocation scores. Both unadjusted (negative, 22.1%; donor positive, 23.0%; recipient positive, 18.6%; P = .43) and adjusted analyses failed to demonstrate a significant association between HCV status and treatment for rejection, whereas center variability remained significantly associated with treatment for rejection (median odds ratio, 2.41). CONCLUSIONS Use of HCV-positive donors has expanded the donor pool for heart and lung transplantation.
HCV donor status was not associated with treatment for rejection in the first year, but center variability played a role in the incidence and treatment of rejection.
8. Surgical resection rates after neoadjuvant therapy for localized pancreatic ductal adenocarcinoma: meta-analysis. Br J Surg 2022; 110:34-42. PMID: 36346716. DOI: 10.1093/bjs/znac354.
Abstract
BACKGROUND Neoadjuvant therapy is increasingly being used before surgery for localized pancreatic cancer. Given the importance of completing multimodal therapy, the aim of this study was to characterize surgical resection rates after neoadjuvant therapy as well as the reasons for, and long-term prognostic impact of, not undergoing resection. METHODS A systematic review and meta-analysis of prospective trials and high-quality retrospective studies since 2010 was performed to calculate pooled resection rates using a generalized random-effects model for potentially resectable, borderline resectable, and locally advanced pancreatic cancer. Median survival times were calculated using random-effects models for patients who did and did not undergo resection. RESULTS In 125 studies that met the inclusion criteria, neoadjuvant therapy consisted of chemotherapy (36.8 per cent), chemoradiation (15.2 per cent), or chemotherapy and radiation (48.0 per cent). Among 11 713 patients, the pooled resection rates were 77.4 (95 per cent c.i. 71.3 to 82.5), 60.6 (54.8 to 66.1), and 22.2 (16.7 to 29.0) per cent for potentially resectable, borderline resectable, and locally advanced pancreatic cancer respectively. The most common reasons for not undergoing resection were distant progression for resectable and borderline resectable cancers, and local unresectability for locally advanced disease. Among 42 studies with survival data available, achieving surgical resection after neoadjuvant therapy was associated with improved survival for patients with potentially resectable (median 38.5 versus 13.3 months), borderline resectable (32.3 versus 13.9 months), and locally advanced (30.0 versus 14.6 months) pancreatic cancer (P < 0.001 for all). CONCLUSION Although rates of surgical resection after neoadjuvant therapy vary based on anatomical stage, surgery is associated with improved survival for all patients with localized pancreatic cancer. 
These pooled resection and survival rates may inform patient-provider decision-making and serve as important benchmarks for future prospective trials.
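The abstract's pooled rates come from a random-effects model over study-level proportions. A minimal sketch of one common variant (DerSimonian-Laird pooling on the logit scale) illustrates the mechanics; the study counts below are invented for illustration and this is not necessarily the paper's exact model:

```python
import math

# Hypothetical study data: (number resected, number of patients) per study.
studies = [(45, 60), (70, 95), (120, 150)]

def pooled_proportion(studies):
    # Logit-transform each proportion; approximate variance = 1/x + 1/(n - x).
    y = [math.log(x / (n - x)) for x, n in studies]
    v = [1.0 / x + 1.0 / (n - x) for x, n in studies]
    w = [1.0 / vi for vi in v]
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    # DerSimonian-Laird between-study variance tau^2 (floored at zero).
    q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, y))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(studies) - 1)) / c)
    # Re-weight with tau^2 added to each within-study variance.
    wr = [1.0 / (vi + tau2) for vi in v]
    mu = sum(wi * yi for wi, yi in zip(wr, y)) / sum(wr)
    return 1.0 / (1.0 + math.exp(-mu))  # back-transform to a proportion
```

With homogeneous studies like these, tau² collapses to zero and the pooled estimate reduces to the fixed-effect (inverse-variance) average, here roughly 0.77.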
9. Early results after initiation of a rib fixation programme: A propensity score matched analysis. Injury 2022; 53:137-144. PMID: 34565619. DOI: 10.1016/j.injury.2021.09.009.
Abstract
INTRODUCTION Chest wall injuries are very common in blunt trauma, and development of treatment protocols can significantly improve outcomes. Surgical stabilisation of rib fractures (SSRF) is an adjunct for the most severe chest injuries and can be used as part of a comprehensive approach to chest injury care. We hypothesized that implementation of an SSRF programme would result in improved short-term outcomes. MATERIALS AND METHODS The characteristics of the initial group of SSRF patients (Early-SSRF) were used to identify matching factors. Patients treated prior to the SSRF protocol underwent propensity score matching, followed by screening for operative indications and contraindications. After exclusions, a non-operative (Non-Op) cohort was defined (n=36), resulting in an approximately 1:1 match. An overall operative cohort, inclusive of Early-SSRF and all subsequent operative patients, was defined (All-SSRF). A before-and-after analysis using chi-squared tests, Student's t-tests, and Mann-Whitney U-tests was used to assess significance at the 0.05 level. RESULTS Early-SSRF (n=22) and All-SSRF (n=45) were compared to Non-Op (n=36). The selection process resulted in well-matched groups with equally well-balanced operative indications. The Early-SSRF group demonstrated a shortened duration of mechanical ventilation and a decreased frequency of discharge to a long-term acute care hospital. The All-SSRF group likewise demonstrated markedly shorter duration of mechanical ventilation compared to Non-Op (median 6 days vs 16 days, p < 0.01), less frequent discharge to a long-term acute care hospital (9% vs. 36%, p=0.01), and reduced risk of tracheostomy (8.9% vs. 33.3%, p<0.01). CONCLUSION The introduction of operative rib fixation within a comprehensive chest wall injury protocol can produce improvements in clinical outcomes: decreased time on the ventilator, lower tracheostomy rates, and more patients discharged to home.
Creation and implementation of a chest wall injury protocol to include SSRF requires a multidisciplinary approach and thoughtful patient selection.
10. The Effect of Video Education on Skin-to-Skin Contact at the Time of Delivery: A Randomized Controlled Trial. AJP Rep 2022; 12:e10-e16. PMID: 35141030. PMCID: PMC8816630. DOI: 10.1055/s-0041-1741540.
Abstract
Objective The objective of this study was to measure the impact of video education at the time of admission for delivery on intent and participation in skin-to-skin contact (SSC) immediately after birth. Methods This study was a randomized controlled trial of an educational intervention in women (N = 240) aged 18 years or older admitted in anticipation of normal spontaneous term delivery. Alternate patients were randomized into video (N = 120) and no-video (N = 120) groups. Both groups received a survey about SSC. The video group watched an educational DVD and completed a post-survey about SSC. Results On the preintervention survey, 89.2% of the video group compared with 83.3% of the no-video group indicated that they planned to use SSC (p = 0.396). After the video, 98.3% planned to do SSC after delivery (p < 0.001). However, only 59.8% started SSC within 5 minutes of delivery in the video group and only 49.4% in the no-video group (p = 0.17). Conclusion Video education alters intention and trends toward greater participation in SSC within 5 minutes of delivery. Despite the plans for SSC, however, there was no significant difference in rates between the two groups. These findings suggest that obstacles other than prenatal education may affect early SSC. Key Points: Significant obstacles impact skin-to-skin rates. Video education alters skin-to-skin intent. Video education can improve skin-to-skin rates. Education can happen at the time of delivery. Video education can impact mothers and infants.
11. Hitting the Vasopressor Ceiling: Finding Norepinephrine Associated Mortality in the Critically Ill. J Surg Res 2021; 265:139-146. PMID: 33940236. DOI: 10.1016/j.jss.2021.03.042.
Abstract
BACKGROUND There is no consensus on what dose of norepinephrine corresponds with futility. The purpose of this study was to investigate the maximum infusion and cumulative doses of norepinephrine associated with survival for patients in medical and surgical intensive care units (MICU and SICU). MATERIALS AND METHODS A retrospective review was conducted of 661 critically ill patients admitted to a large academic medical center who received norepinephrine. Univariate, multivariate, and area-under-the-curve analyses were performed, with optimal cutoffs for maximum infusion rate and cumulative dosage determined by the Youden index. RESULTS The population was 54.9% male and 75.8% white, with a mean age of 58.7 ± 16.1 years; 384 patients (69.8%) were admitted to the MICU and 166 (30.2%) to the SICU, including 38 trauma patients. Inflection points in mortality were seen at 18 mcg/min and 17.6 mg. The inflection point was higher in MICU patients at 21 mcg/min and lower in SICU patients at 11 mcg/min. MICU patients also had a higher maximum cumulative dosage of 30.7 mg, compared to 2.7 mg in SICU patients. In trauma patients, norepinephrine infusions up to 5 mcg/min were associated with a 41.7% mortality rate. CONCLUSION A maximum rate of 18 mcg/min and a cumulative dose of 17.6 mg were the inflection points for mortality risk in ICU patients, with SICU patients tolerating lower doses. In trauma patients, even low doses of norepinephrine were associated with higher mortality. These data suggest that MICU, SICU, and trauma patients differ in need for, response to, and outcome from escalating norepinephrine doses.
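The "optimal cutoff by Youden index" step can be sketched as follows. This is an illustration with made-up dose/outcome pairs, not the study's data; it treats each observed dose as a candidate threshold, with values at or above the cutoff counted as test-positive:

```python
# Hypothetical data: maximum norepinephrine rate (mcg/min) and mortality (1 = died).
doses = [4, 6, 10, 12, 15, 18, 20, 25, 30, 40]
died  = [0, 0,  0,  0,  1,  0,  1,  1,  1,  1]

def youden_cutoff(values, outcomes):
    """Return the cutoff maximizing J = sensitivity + specificity - 1."""
    best_j, best_cut = -1.0, None
    for cut in sorted(set(values)):
        tp = sum(1 for v, o in zip(values, outcomes) if v >= cut and o == 1)
        fn = sum(1 for v, o in zip(values, outcomes) if v < cut and o == 1)
        tn = sum(1 for v, o in zip(values, outcomes) if v < cut and o == 0)
        fp = sum(1 for v, o in zip(values, outcomes) if v >= cut and o == 0)
        sens = tp / (tp + fn) if tp + fn else 0.0
        spec = tn / (tn + fp) if tn + fp else 0.0
        j = sens + spec - 1.0
        if j > best_j:
            best_j, best_cut = j, cut
    return best_cut, best_j
```

In practice this is usually done on a full ROC curve (e.g., with a statistics library) rather than by brute force, but the selection rule is the same.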
12. Intrathoracic Pressure Regulator Performance in the Setting of Hemorrhage and Acute Lung Injury. Mil Med 2021; 185:e1083-e1090. PMID: 32350538. DOI: 10.1093/milmed/usz485.
Abstract
INTRODUCTION Intrathoracic pressure regulation (ITPR) can be utilized to enhance venous return and cardiac preload by inducing negative end expiratory pressure in mechanically ventilated patients. Previous preclinical studies have shown increased mean arterial pressure (MAP) and decreased intracranial pressure (ICP) with use of an ITPR device. The aim of this study was to evaluate the hemodynamic and respiratory effects of ITPR in a porcine polytrauma model of hemorrhagic shock and acute lung injury (ALI). METHODS Swine were anesthetized and underwent a combination of sham, hemorrhage, and/or lung injury. The experimental groups included: no injury with and without ITPR (ITPR, Sham), hemorrhage with and without ITPR (ITPR/Hem, Hem), and hemorrhage and ALI with and without ITPR (ITPR/Hem/ALI, Hem/ALI). The ITPR device was initiated at a setting of -3 cmH2O and incrementally decreased by 3 cmH2O after 30 minutes on each setting, with 15 minutes allowed for recovery between settings, to a nadir of -12 cmH2O. Histopathological analysis of the lungs was scored by blinded, independent reviewers. Of note, all animals were chemically paralyzed for the experiments to suppress gasping at ITPR pressures below -6 cmH2O. RESULTS Adequate shock was induced in the hemorrhage model, with the MAP being decreased in the Hem and ITPR/Hem group compared with Sham and ITPR/Sham, respectively, at all time points (Hem 54.2 ± 6.5 mmHg vs. 88.0 ± 13.9 mmHg, p < 0.01, -12 cmH2O; ITPR/Hem 59.5 ± 14.4 mmHg vs. 86.7 ± 12.1 mmHg, p < 0.01, -12 cmH2O). In addition, the PaO2/FIO2 ratio was appropriately decreased in Hem/ALI compared with Sham and Hem groups (231.6 ± 152.5 vs. 502.0 ± 24.6 (Sham) p < 0.05 vs. 463.6 ± 10.2, (Hem) p < 0.01, -12 cmH2O). Heart rate was consistently higher in the ITPR/Hem/ALI group compared with the Hem/ALI group (255 ± 26 bpm vs. 150.6 ± 62.3 bpm, -12 cmH2O) and higher in the ITPR/Hem group compared with Hem. 
Respiratory rate (adjusted to maintain pH) was also higher in the ITPR/Hem/ALI group compared with Hem/ALI at -9 and -12 cmH2O (32.8 ± 3.0 breaths per minute (bpm) vs. 26.8 ± 3.6 bpm, -12 cmH2O) and higher in the ITPR/Hem group compared with Hem at -6, -9, and -12 cmH2O. Lung compliance and end expiratory lung volume (EELV) were both consistently decreased in all three ITPR groups compared with their controls. Histopathologic severity of lung injury was worse in the ITPR and ALI groups compared with their respective injured controls or Sham. CONCLUSION In this swine polytrauma model, we demonstrated successful establishment of hemorrhage and combined hemorrhage/ALI models. While ITPR did not demonstrate a benefit for MAP or ICP, our data demonstrate that the ITPR device induced tachycardia with an associated increase in cardiac output, as well as tachypnea with decreased lung compliance, EELV, and PaO2/FIO2 ratio and worse histopathologic lung injury. Therefore, implementation of the ITPR device in the setting of polytrauma may compromise pulmonary function without significant hemodynamic improvement.
13. Rib Season: Temporal Variation in Chest Wall Injuries. J Surg Res 2020; 260:129-133. PMID: 33338889. DOI: 10.1016/j.jss.2020.11.074.
Abstract
INTRODUCTION Trauma to the chest wall is one of the most common injuries suffered. Knowing whether there are regular and reproducible changes in the frequency or severity of certain injury types may help resource allocation and improve prevention efforts or outcomes; however, no prior studies have evaluated seasonal variation in chest wall injuries (CWIs). We aimed to determine whether CWIs follow a consistent, distinct temporal pattern each year. METHODS Using an established traumatic blunt CWI database at a single urban level 1 trauma center, patients with a moderate-to-severe CWI (chest wall Abbreviated Injury Scale (AIS) score ≥2) were reviewed. A subpopulation with predominant chest wall injury (pCWI) was defined as those with a chest wall AIS ≥3 and no other anatomic region having a higher AIS. Demographics, injury patterns, mechanisms of injury, and AIS were collected, in addition to date of injury, over a 4-year period. Data were analyzed using descriptive statistics as well as Poisson time-series regression for periodicity. Seasonal comparison of populations was performed using Student's t-tests and analysis of variance (ANOVA), with significance assessed at a level of P < 0.05. RESULTS Over the 4-year period, nearly 16,000 patients presented with injury, of whom 3042 were found to have a blunt CWI. Total CWI patients per year from 2014 to 2017 ranged from 571 to 947. Over this period, August had the highest incidence of patients with any CWI, moderate-to-severe injuries, and pCWI. February had the lowest overall injury incidence as well as the lowest moderate-to-severe injury incidence; January had the lowest pCWI incidence. Yearly changes followed a sinusoidal model that predicted a high-incidence season between June and October and a low season from December through April.
Comparing monthly means between the low and high seasons revealed differences in total injuries (69.94 versus 85.56, P = 0.04), moderate-to-severe injuries (62.25 versus 78.19, P = 0.06), and pCWI (25.25 versus 34.44, P = 0.01). Analysis of injuries by mechanism revealed a concomitant increase in motorcycle collisions during the high season. CONCLUSIONS There appears to be a significant seasonal variation in the overall incidence of CWI as well as severe pCWI, with a high-volume injury season in the summer months (June-October) and a low-volume season in winter (December-April). Motorcycle collisions were the major blunt injury mechanism that changed with this seasonality. These findings may help guide resource utilization and injury prevention.
14. Does chest wall Organ Injury Scale (OIS) or Abbreviated Injury Scale (AIS) predict outcomes? An analysis of 16,000 consecutive rib fractures. Surgery 2020; 168:198-204. DOI: 10.1016/j.surg.2020.04.032.
15. Mortality From Burns Sustained on Home Oxygen Therapy Exceeds Predicted Mortality. J Burn Care Res 2020; 41:976-980. DOI: 10.1093/jbcr/iraa097.
Abstract
The Boston Criteria and the Abbreviated Burn Severity Index (ABSI) are two widely accepted models for predicting mortality in burn patients. We aimed to elucidate whether these models can predict the risk of mortality in patients who sustain burns while smoking on home oxygen, given their clinical fragility. We conducted a retrospective chart review of 48 patients admitted to our burn center from November 2013 to September 2017 who sustained a burn while smoking on home oxygen. Year-long mortality was the primary outcome of the investigation; secondary outcomes included discharge to a facility, length of stay, and need for tracheostomy. We calculated the expected mortality rate for each patient based on the Boston Criteria and ABSI and compared these with the mortality rate observed in our cohort. Patients in our cohort suffered a 54% mortality rate within a year of injury, compared to a 23.5% mortality predicted by the Boston Criteria, a difference found to be statistically significant by chi-square analysis (P < .05). ABSI-predicted mortality was 19.7%; while the absolute difference in mortality was greater, this was not significant on chi-square analysis owing to sample size. Our secondary outcomes revealed 42% discharge to a facility, an average length of stay of 6.2 days, and 6.25% requiring tracheostomy. Patients whose burns are attributable to smoking on home oxygen may have a higher risk of mortality than prognostication models suggest. This bears significant clinical impact, particularly regarding family and provider decision-making when pursuing aggressive management.
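The observed-versus-predicted comparison can be reproduced approximately from the percentages the abstract reports. A sketch, assuming the observed death count is round(0.54 × 48) since the abstract gives only percentages, and using a 1-df chi-square goodness-of-fit test (the abstract does not specify its exact chi-square variant):

```python
def chi2_vs_predicted(n, observed_rate, predicted_rate):
    """Chi-square goodness-of-fit of observed deaths/survivors vs expected counts."""
    obs_dead = round(observed_rate * n)   # assumed reconstruction of the count
    obs_alive = n - obs_dead
    exp_dead = predicted_rate * n
    exp_alive = n - exp_dead
    return ((obs_dead - exp_dead) ** 2 / exp_dead
            + (obs_alive - exp_alive) ** 2 / exp_alive)

# Boston Criteria comparison: 54% observed vs 23.5% predicted in n = 48.
boston = chi2_vs_predicted(48, 0.54, 0.235)
# The statistic far exceeds 3.84, the df = 1 critical value at p = .05,
# consistent with the abstract's report of a significant difference.
```

A library routine such as a standard chi-square test would give the same statistic; the point here is only how few numbers the comparison needs.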
|
16
|
Neoadjuvant Therapy for Resectable and Borderline Resectable Pancreatic Cancer: A Meta-Analysis of Randomized Controlled Trials. J Clin Med 2020; 9:jcm9041129. [PMID: 32326559 PMCID: PMC7231310 DOI: 10.3390/jcm9041129] [Citation(s) in RCA: 63] [Impact Index Per Article: 15.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/10/2020] [Revised: 04/07/2020] [Accepted: 04/13/2020] [Indexed: 12/15/2022] Open
Abstract
The efficacy of neoadjuvant therapy (NT) versus surgery first (SF) for pancreatic ductal adenocarcinoma (PDAC) remains controversial. A random-effects meta-analysis of only prospective randomized controlled trials (RCTs) comparing NT versus SF for potentially resectable (PR) or borderline resectable (BR) PDAC was performed. Among six RCTs including 850 patients, 411 (48.3%) received NT and 439 (51.6%) SF. In all included trials, NT was gemcitabine-based: four using chemoradiation and two chemotherapy alone. Based on an intention-to-treat analysis, NT resulted in improved overall survival (OS) compared to SF (HR 0.73, 95% CI 0.61–0.86). This effect was independent of anatomic classification (PR: hazard ratio (HR) 0.73, 95% CI 0.59–0.91; BR: HR 0.51 95% CI 0.28–0.93) or NT type (chemoradiation: HR 0.77, 95% CI 0.61–0.98; chemotherapy alone: HR 0.68, 95% CI 0.54–0.87). Overall resection rate was similar (risk ratio (RR) 0.93, 95% CI 0.82–1.04, I2 = 39.0%) but NT increased the likelihood of a margin-negative (R0) resection (RR 1.51, 95% CI 1.18–1.93, I2 = 0%) and having negative lymph nodes (RR 2.07, 95% CI 1.47–2.91, I2 = 12.3%). In this meta-analysis of prospective RCTs, NT significantly improved OS in an intention-to-treat fashion, compared with SF for localized PDAC. Randomized controlled trials using contemporary multi-agent chemotherapy will be needed to confirm these findings and to define the optimal NT regimen.
|
17
|
A Retrospective Multisite Case-Control Series of Concomitant Use of Daptomycin and Statins and the Effect on Creatine Phosphokinase. Open Forum Infect Dis 2019; 6:ofz444. [PMID: 31723571 PMCID: PMC6837837 DOI: 10.1093/ofid/ofz444] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/10/2019] [Accepted: 10/28/2019] [Indexed: 02/02/2023] Open
Abstract
Objective Daptomycin has been associated with increased creatine phosphokinase (CPK) due to muscle injury leading to myalgias and muscle weakness. Statins have been shown to cause the same effects, and it is recommended to discontinue statins while on daptomycin. Evidence regarding this drug interaction is mixed. This study evaluated the risk of CPK elevation with concomitant use of daptomycin and statins compared to daptomycin alone. Methods This is a multisite retrospective case-control study of patients who received daptomycin therapy with monitoring of CPK. Rates of CPK elevation were compared in patients receiving daptomycin with a statin versus daptomycin alone. To estimate the association between CPK elevation and daptomycin therapy while controlling for other risk factors, logistic regression was used to analyze the data. Statistical significance was determined at α = 0.05. Results A total of 3658 patients were included in the study, with 2787 on daptomycin therapy alone and 871 with concurrent statin use. The incidence of CPK elevation was 90 events (3.2%) in the daptomycin group and 26 events (3.0%) in the concurrent statin group. Patients who received daptomycin in addition to statins showed no statistically significant difference from patients on daptomycin alone (hazard ratio, 1.05; P = .85; 95% confidence interval, 0.61-1.84). After adjusting for potential risk factors, the hazard ratio remained almost the same. Conclusions Concomitant use of daptomycin and statins did not show an increased risk of CPK elevation. Clinicians may consider concomitant use of daptomycin and statin therapy with weekly CPK monitoring.
|
18
|
IPC11. Contemporary Trend of Vascular Trauma at a Tertiary Level Referral Center. J Vasc Surg 2019. [DOI: 10.1016/j.jvs.2019.04.086] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/26/2022]
|
19
|
A Prospective, Randomized Trial Comparing Liposomal Bupivacaine vs Fascia Iliaca Compartment Block for Postoperative Pain Control in Total Hip Arthroplasty. J Arthroplasty 2017; 32:2181-2185. [PMID: 28318860 DOI: 10.1016/j.arth.2017.02.019] [Citation(s) in RCA: 29] [Impact Index Per Article: 4.1] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 06/15/2016] [Revised: 01/31/2017] [Accepted: 02/06/2017] [Indexed: 02/01/2023] Open
Abstract
BACKGROUND Increasing demand for total hip arthroplasty (THA) in a climate of increasing focus on clinical outcomes, patient satisfaction, and cost has created a need for better acute postoperative pain control for patients. An ideal pain control method would have few side effects, decreased opioid consumption, improved pain control, early ambulation, and decreased hospital length of stay (LOS). METHODS We performed a prospective, randomized, controlled study involving 79 patients undergoing elective THA between June 2015 and February 2016. Forty patients received liposomal bupivacaine and 39 patients received a fascia iliaca compartment block (FICB). In addition, the medical records of 28 patients who underwent elective THA between May 2015 and December 2015 were retrospectively examined. The primary outcome was visual analog scale pain scores and the secondary outcomes were LOS and total opioid consumption. SPSS, version 22, was used to run 1-way analysis of variance with contrast and Mood's median test on the data. RESULTS There were statistically significant decreases in pain intensity (P = .019) and LOS (P = .041) in both the liposomal bupivacaine group and the FICB group compared with those in the retrospective control group. In addition, only the FICB group showed statistically significant decreased total opioid consumption compared with that in the retrospective group (P = .028). CONCLUSION Patients undergoing elective THA have decreased overall pain intensity and a shorter LOS with a multimodal pain management regimen that includes either liposomal bupivacaine or FICB. Patients who received FICB required less overall total opioids than the control group.
|
20
|
Family-centered rounds and medical student performance on the NBME pediatrics subject (shelf) examination: a retrospective cohort study. Med Educ Online 2016; 21:30919. [PMID: 27087016 PMCID: PMC4834362 DOI: 10.3402/meo.v21.30919] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 01/06/2016] [Accepted: 02/22/2016] [Indexed: 06/02/2023]
Abstract
OBJECTIVE To determine the association between family-centered rounds (FCR) and medical student knowledge acquisition as assessed by the National Board of Medical Examiners (NBME) pediatric subject (shelf) exam. METHODS A retrospective cohort study was conducted of third-year medical students who graduated from Virginia Commonwealth University School of Medicine between 2009 and 2014. This timeframe represented the transition from 'traditional' rounds to FCR on the pediatric inpatient unit. Data collected included demographics, United States Medical Licensing Examination (USMLE) Step 1 and 2 scores, and NBME subject examinations in pediatrics (PSE), medicine (MSE), and surgery (SSE). RESULTS Eight hundred and sixteen participants were included in the analysis. Student performance on the PSE could not be statistically differentiated from performance on the MSE for any year except 2011 (z-score=-0.17, p=0.02). Average scores on PSE for years 2009, 2010, 2013, and 2014 were significantly higher than for SSE, but not significantly different for all other years. The PSE was highly correlated with USMLE Step 1 and Step 2 examinations (correlation range 0.56-0.77) for all years. CONCLUSIONS Our results showed no difference in PSE performance during a time in which our institution transitioned to FCR. These findings should be reassuring for students, attending physicians, and medical educators.
|
21
|
Partner violence victimization and unintended pregnancy in Latina and Asian American women: Analysis using structural equation modeling. Women Health 2016; 57:430-445. [DOI: 10.1080/03630242.2016.1170094] [Citation(s) in RCA: 7] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/22/2022]
|
22
|
Antioxidant diet and sex interact to regulate NOS isoform expression and glomerular mesangium proliferation in Zucker diabetic rat kidney. Acta Histochem 2016; 118:183-93. [PMID: 26797190 DOI: 10.1016/j.acthis.2015.12.011] [Citation(s) in RCA: 14] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/29/2015] [Revised: 12/30/2015] [Accepted: 12/31/2015] [Indexed: 12/21/2022]
Abstract
Oxidative stress contributes substantially to the pathophysiology of diabetic nephropathy (DN). Consumption of an antioxidant-fortified (AO) diet from an early age prevents or delays later development of DN in the Zucker rat female with type 2 diabetes. We hypothesize this is due to effects on mesangial matrix and renal nitric oxide synthase (NOS) distribution and to sex-specific differences in NOS responses in the diabetic kidney. Total glomerular tuft area (GTA) and PAS-positive tuft area (PTA), endothelial (e), neuronal (n) and inducible (i) NOS were quantified in males and females on AO or regular (REG) diet at 6 and 20 weeks of age. eNOS was observed in glomeruli and tubules. nNOS predominantly localized to tubular epithelium in both cortex and medulla. iNOS was expressed in proximal and distal tubules and collecting ducts. Sex, diabetes duration and AO diet affected the distribution of the three isoforms. GTA and PTA increased with duration of hyperglycemia and showed a negative correlation with renal levels of all NOS isoforms. AO diet in both genders was associated with less PAS-positive staining and less mesangial expansion than the REG diet, an early increase in cortical iNOS in males, and sex-specific changes in cortical eNOS at 20 weeks. These effects of AO diet may contribute to sex-specific preservation of renal function in females.
|
23
|
Caregiving, gender, and nutritional status in Nyanza Province, Kenya: Grandmothers gain, grandfathers lose. Am J Hum Biol 2011; 23:498-508. [DOI: 10.1002/ajhb.21172] [Citation(s) in RCA: 14] [Impact Index Per Article: 1.1] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/19/2010] [Revised: 01/10/2011] [Accepted: 02/12/2011] [Indexed: 11/06/2022] Open
|
24
|
Comanagement of elderly patients with type 2 diabetes: better adherence to ADA guidelines? ACTA ACUST UNITED AC 2010. [DOI: 10.1016/j.osfp.2010.04.001] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/19/2023]
|
25
|
A randomized clinical trial of a coping improvement group intervention for HIV-infected older adults. J Behav Med 2010; 34:102-11. [PMID: 20857188 DOI: 10.1007/s10865-010-9292-6] [Citation(s) in RCA: 35] [Impact Index Per Article: 2.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/19/2010] [Accepted: 08/25/2010] [Indexed: 11/30/2022]
Abstract
This research tested if a 12-session coping improvement group intervention (n = 104) reduced depressive symptoms in HIV-infected older adults compared to an interpersonal support group intervention (n = 105) and an individual therapy upon request (ITUR) control condition (n = 86). Participants were 295 HIV-infected men and women 50-plus years of age living in New York City, Cincinnati, OH, and Columbus, OH. Using A-CASI assessment methodology, participants provided data on their depressive symptoms using the Geriatric Depression Screening Scale (GDS) at pre-intervention, post-intervention, and 4- and 8-month follow-up. Whether conducted with all participants (N = 295) or only a subset of participants diagnosed with mild, moderate, or severe depressive symptoms (N = 171), mixed models analyses of repeated measures found that both coping improvement and interpersonal support group intervention participants reported fewer depressive symptoms than ITUR controls at post-intervention, 4-month follow-up, and 8-month follow-up. The effect sizes of the differences between the two active interventions and the control group were greater when outcome analyses were limited to those participants with mild, moderate, or severe depressive symptoms. At no assessment period did coping improvement and interpersonal support group intervention participants differ in depressive symptoms.
|
26
|
Pharmacotherapy and the risk for community-acquired pneumonia. BMC Geriatr 2010; 10:45. [PMID: 20604960 PMCID: PMC2909244 DOI: 10.1186/1471-2318-10-45] [Citation(s) in RCA: 58] [Impact Index Per Article: 4.1] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/05/2009] [Accepted: 07/06/2010] [Indexed: 12/14/2022] Open
Abstract
Background Some forms of pharmacotherapy have been shown to increase the risk of community-acquired pneumonia (CAP). The purpose of this study was to investigate whether pharmacotherapy with proton pump inhibitors (PPI), inhaled corticosteroids, and atypical antipsychotics was associated with an increased risk for CAP in hospitalized older adults after adjustment for known risk factors (such as smoking status and serum albumin levels). Methods A retrospective case-control study of adults aged 65 years or older at a rural community hospital during 2004 and 2006 was conducted. Cases (N = 194) were those with radiographic evidence of pneumonia on admission. The controls were patients without a discharge diagnosis of pneumonia or acute exacerbation of chronic obstructive pulmonary disease (COPD) (N = 952). Patients with gastric tube feeding, ventilator support, requiring hemodialysis, metastatic disease, or active lung cancer were excluded. Results Multiple logistic regression analysis revealed that the current use of inhaled corticosteroids (adjusted odds ratio [AOR] = 2.89, 95% confidence interval [CI] = 1.56-5.35) and of atypical antipsychotics (AOR = 2.26, 95% CI = 1.23-4.15) were independent risk factors for CAP after adjusting for confounders, including age, serum albumin levels, sex, smoking status, a history of congestive heart failure, coronary artery disease, and COPD, and the current use of PPI, β2 agonist and anticholinergic bronchodilators, antibiotics, iron supplements, narcotics, and non-steroidal anti-inflammatory drugs. The crude OR and the AOR of PPI use for CAP were 1.41 [95% CI = 1.03-1.93] and 1.18 [95% CI = 0.80-1.74] after adjusting for the above confounders, respectively. Lower serum albumin levels independently increased the risk of CAP 1.89-fold for each gram-per-deciliter decrease (AOR = 2.89, 95% CI = 2.01-4.16).
Conclusion Our study reaffirmed that the use of inhaled corticosteroids and of atypical antipsychotics was associated with an increased risk for CAP in hospitalized older adults in a rural community. No association was found between current PPI use and the risk for CAP in this patient population.
|
27
|
Risk factors associated with stool retention assessed by abdominal radiography for constipation. J Am Med Dir Assoc 2010; 11:572-8. [PMID: 20889093 DOI: 10.1016/j.jamda.2009.11.015] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/30/2009] [Revised: 11/24/2009] [Accepted: 11/24/2009] [Indexed: 10/19/2022]
Abstract
OBJECTIVES To assess the reliability of applying a radiographic scoring system in estimating the severity of stool retention (SR) in hospitalized older adults with constipation, and to identify risk factors associated with clinical constipation and SR scores. DESIGN Retrospective, case series study. SETTING Southeast Ohio community hospital. PARTICIPANTS Adults 65 years or older with constipation or fecal impaction and abdominal radiographs available (N=122). Bowel obstruction was excluded. MEASUREMENT Radiographs were independently scored by four readers twice, "5" being the most severe, for each quadrant of an abdominal film; possible total score was 0 to 20. Clinical constipation was defined as an average SR score of 13 or higher. Intra-class correlation was used to measure inter-rater agreement. RESULTS The overall inter-rater agreement on abdominal radiograph readings was 0.91, 95% confidence interval (CI)=0.88-0.93. Clinical constipation was associated with the use of statins and antimuscarinics by univariate logistic regression analysis. After adjusting for age, sex, residency, smoking history, oral laxatives, and self-reported constipation, the use of statins remained significantly associated with clinical constipation (OR=3.86, 95% CI=1.08-13.77, P=.036). Univariate linear regression analysis revealed that higher SR scores were associated with community residency, self-reported constipation, and the use of statins and antimuscarinics. After adjusting for the above confounders by multiple linear regression analyses, the use of antimuscarinics was independently associated with higher SR score (β=1.769, 95% CI=0.008-3.531, P=.049). CONCLUSION Abdominal radiography was reliable in assessing the severity of SR in older adults with constipation. The use of statins and antimuscarinics was associated with clinical constipation and greater SR.
|
28
|
Abstract
As the HIV/AIDS pandemic progresses in Africa, elders are increasingly responsible for the care of orphans. Several reports suggest that elderly Africans do not have the resources to provide care and are at risk of poor health, but few studies have systematically measured health of caregivers. The Kenyan Grandparents Study is a longitudinal study designed to compare elder Luo caregivers to noncaregiving peers. Several measures of health were collected, including body mass index (BMI), blood pressure, glucose, and hemoglobin. In addition, self-perceived health and mental health were measured using the MOS Short-Form 36 (SF-36). It was hypothesized that caregivers would have poorer health than noncaregivers and that the difference in health would widen over the three waves of the study. Caregiving did not affect physical health but did act to decrease mental health and perceived health over time.
|
29
|
Prevalence and correlates of high body mass index in rural Appalachian children aged 6-11 years. Rural Remote Health 2009. [DOI: 10.22605/rrh1234] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/03/2022] Open
|
30
|
Prevalence and correlates of high body mass index in rural Appalachian children aged 6-11 years. Rural Remote Health 2009; 9:1234. [PMID: 19848443] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 05/28/2023] Open
Abstract
INTRODUCTION In rural regions of the United States of America, estimates of pediatric obesity often exceed national averages. This problem may be particularly pronounced in Appalachian regions, where significant health and economic disparities abound. This study presents the findings of a body mass index (BMI) screening program for 6-11 year old children living in a rural Appalachian community. County-wide estimates of high BMI (≥85th percentile) were obtained to understand the health status and needs of our pediatric community and to compare obesity prevalence rates with national averages. An additional aim was to identify subpopulations of children who may warrant clinical intervention due to demographic and behavioral risk factors for high BMI. METHODS A school-based BMI screening was conducted of 6-11 year old children in southeastern Ohio. Investigators collected 3 sets of height and weight measurements from approximately 2000 elementary school students between 2006 and 2007. Caregivers for a subset of this population also completed a health behaviors questionnaire. RESULTS Thirty-eight percent of children had high BMI, with 17% at risk for overweight and 20.9% overweight. Boys were 23% more likely than girls to be overweight (χ²(1); 95% CI = 1.08, 1.40) and 11% more likely to become overweight with each year of age (OR = 1.11, 95% CI = 1.07, 1.15). Overweight children were more likely to view television, eat meals at school, and live with a caregiver who smokes. CONCLUSIONS Consistent with expectations, the prevalence of high BMI in this sample of rural Appalachian children exceeds national averages. Prevalence of overweight varied by age and sex; boys are particularly vulnerable to developing obesity, especially as they age. Preliminary survey data suggest that eating breakfast at home and at school and increased hours of television viewing may be associated with higher BMI, especially in younger boys.
|
32
|
Interexpert agreement on diagnosis of bacteriuria and urinary tract infection in hospitalized older adults. J Am Osteopath Assoc 2009; 109:220-226. [PMID: 19369509] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Subscribe] [Scholar Register] [Indexed: 05/27/2023]
Abstract
CONTEXT Although bacteriuria with acute coexisting illness is common in hospitalized older adults, distinguishing it from urinary tract infection (UTI) can be challenging. OBJECTIVES To examine the rate of agreement between two geriatricians in distinguishing UTI from asymptomatic bacteriuria (ASB). To analyze the incidence of associated acute comorbidities and determine if an association exists between clinical manifestations and bacteriuria status on acute hospital admission. METHODS Two physicians conducted a retrospective analysis of 296 inpatient records, including 142 records from age- and condition-matched nonbacteriuria control subjects. Using consensus criteria to diagnose UTI vs ASB, these independent experts evaluated inpatient records, including admission and discharge diagnoses as well as urinalysis results. A kappa statistic was used to determine reviewer agreement. Risk assessment was measured by odds ratio with a 95% confidence interval. RESULTS Expert agreement for the diagnosis of UTI and ASB was 98% and 44%, respectively. Agreement was reached at a level greater than chance (z=6.74, P<.001, kappa=0.49). In the 30 cases where interexpert agreement was not reached, half of the subjects had acute pulmonary disease. Symptom crossover for this comorbid condition is the likely cause for lack of diagnostic agreement. Among other conditions observed, delirium was most common in UTI subjects. CONCLUSION Limited interexpert agreement seemed to result from difficulty in diagnosing patients who had no local symptoms but acute comorbid conditions with potential symptom crossover. Among the conditions observed in our sample population, delirium was most closely associated with UTI.
|
33
|
Professional satisfaction among new osteopathic family physicians: a survey-based investigation of residency-trained graduates. J Am Osteopath Assoc 2009; 109:92-96. [PMID: 19269940] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Subscribe] [Scholar Register] [Indexed: 05/27/2023]
Abstract
CONTEXT Progressively more osteopathic graduates are seeking training opportunities in programs accredited by the Accreditation Council on Graduate Medical Education (ACGME). OBJECTIVE To determine if family medicine residency training program choice (ie, allopathic [ACGME], osteopathic, or dually accredited) has an impact on professional satisfaction levels among recent osteopathic medical school graduates. METHODS The authors designed a survey instrument to gather data on professional satisfaction levels. Osteopathic family physicians who completed residency training from 1999-2003 were asked to participate in the study. RESULTS The survey was sent to 2284 individuals with an adjusted response rate of 37%. One hundred and one (15.8%) of the osteopathic family physicians who responded reported completing residency training programs approved by the American Osteopathic Association (AOA); 335 (52.3%), ACGME-accredited programs; 198 (30.9%), dually accredited programs. One hundred forty-three surveyed osteopathic physicians (22.3%) were less than happy with their career choice. In addition, 219 (34.2%) reported that they were "thinking of changing...specialty," and 30 (4.7%) reported that they were not "currently practicing family medicine." Individuals trained in ACGME programs reported slightly higher levels of professional satisfaction than individuals trained in AOA-approved or dually accredited programs--though these differences were deemed trivial (ie, low effect size, 0.01; P>.05). CONCLUSION The authors found no statistically significant differences in professional satisfaction levels among osteopathic family physicians who were recent medical school graduates regardless of residency training program choice.
|
34
|
Impaired functioning and quality of life in severe migraine: the role of catastrophizing and associated symptoms. Cephalalgia 2007; 27:1156-65. [PMID: 17784854 PMCID: PMC2128721 DOI: 10.1111/j.1468-2982.2007.01420.x] [Citation(s) in RCA: 113] [Impact Index Per Article: 6.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/28/2022]
Abstract
Migraine characteristics are associated with impaired functioning and quality of life (Fn/QoL), but the impact of other factors on Fn/QoL in headache patients is largely unexplored. We examined catastrophizing, comorbid anxiety/depression and migraine characteristics as related to Fn/QoL, and explored the consistency of these relationships across five Fn/QoL measures. We evaluated 232 frequent migraine sufferers for comorbid psychiatric diagnosis, and they completed anxiety, depression and catastrophizing measures, recorded migraine characteristics in a diary and completed five Fn/QoL measures (four self-report questionnaires, one diary disability measure). Backward regression revealed catastrophizing and severity of associated symptoms (photophobia, phonophobia, nausea) independently predicted Fn/QoL across all five measures (beta weights 0.16-0.50, all P < 0.01). This is the first demonstration that a psychological response to migraines (catastrophizing) is associated with impaired Fn/QoL independent of migraine characteristics and other demographic and psychological variables. Severity of associated symptoms also emerged as an important contributor to Fn/QoL.
|