1. Real world treatment patterns, healthcare resource use and costs in patients with advanced or metastatic non-small cell lung cancer by EGFR mutation type. J Med Econ 2024; 27:219-229. [PMID: 38269536] [DOI: 10.1080/13696998.2024.2309838]
Abstract
AIMS This study described treatment patterns, healthcare resource utilization (HRU) and costs among patients with advanced or metastatic non-small cell lung cancer (a/mNSCLC) with different epidermal growth factor receptor (EGFR) mutation types.
MATERIALS AND METHODS This retrospective study leveraged NeoGenomics NeoNucleus data linked with IQVIA PharMetrics Plus from 01 January 2016 to 30 April 2021 (study period). Patients with evidence of a/mNSCLC between 01 July 2016 and 31 March 2021 (selection window) with EGFR test results indicating exon 19 deletion (exon19del), exon 21 L858R (L858R), or exon 20 insertion (exon20i) mutations were included; the date of first observed evidence of a/mNSCLC was the index date. Treatment patterns and all-cause HRU and costs during ≥1 month of follow-up were reported for each cohort (exon19del, L858R, and exon20i).
RESULTS A total of 106 exon19del, 75 L858R, and 13 exon20i patients met the study criteria. The prevalence of hospitalization was highest in the exon20i cohort (76.9%), followed by the L858R (62.7%) and exon19del (55.7%) cohorts. A higher proportion of patients had evidence of hospice/end-of-life care in the exon20i (30.8%) and L858R (29.3%) cohorts than in the exon19del cohort (22.6%). The exon20i cohort had higher median total healthcare costs per patient per month ($27,069) than the exon19del ($17,482) and L858R ($17,763) cohorts. EGFR tyrosine kinase inhibitors (TKIs) were the most frequently observed treatment type in the exon19del and L858R cohorts, while chemotherapy was the most observed treatment in the exon20i cohort.
LIMITATIONS The sample size for the study cohorts was small, so no statistical comparisons were conducted.
CONCLUSIONS This is one of the first real-world studies to describe HRU and costs among a/mNSCLC patients by specific EGFR mutation type. HRU and costs varied between EGFR mutation types and were highest in the exon20i cohort, potentially reflecting higher disease burden and unmet need among patients with this mutation.
2. Real-world costs of obesity-related complications over eight years: a US retrospective cohort study in 28,500 individuals. Int J Obes (Lond) 2023; 47:1239-1246. [PMID: 37723273] [PMCID: PMC10663144] [DOI: 10.1038/s41366-023-01376-4]
Abstract
BACKGROUND Obesity-related complications (ORCs) are associated with high costs for healthcare systems. We assessed the relationship between comorbidity burden, represented by both the number and type of 14 specific ORCs, and total healthcare costs over time in people with obesity in the USA.
METHODS Adults (≥18 years old) identified from linked electronic medical records and administrative claims databases, with a body mass index measurement of 30 to <70 kg/m² between 1 January 2007 and 31 March 2012 (earliest measurement: index date), and with continuous enrolment for ≥1 year pre-index (baseline year) and ≥8 years post-index, were included. Individuals were grouped by type and number of ORCs during the pre-index baseline year. The primary outcome was annual total adjusted direct per-person healthcare costs.
RESULTS Of 28,583 included individuals, 12,686 had no ORCs, 7242 had one ORC, 4180 had two ORCs and 4475 had three or more ORCs in the baseline year. Annual adjusted direct healthcare costs increased with the number of ORCs and over the 8-year follow-up. Outpatient costs were the greatest contributor to baseline annual direct costs, irrespective of the number of ORCs. For specific ORCs, costs generally increased gradually over the follow-up; the largest percentage increases from year 1 to year 8 were observed for chronic kidney disease (+78.8%) and type 2 diabetes (+47.8%).
CONCLUSIONS In a US real-world setting, the number of ORCs appears to be a cost driver in people with obesity, from the time of initial obesity classification and for at least the following 8 years.
3. Healthcare resource utilization in patients with pulmonary hypertension associated with chronic obstructive pulmonary disease (PH-COPD): a real-world data analysis. BMC Pulm Med 2023; 23:455. [PMID: 37990203] [PMCID: PMC10664271] [DOI: 10.1186/s12890-023-02698-9]
Abstract
RATIONALE There is a lack of real-world characterization of healthcare costs and associated cost drivers in patients with pulmonary hypertension secondary to chronic obstructive pulmonary disease (PH-COPD).
OBJECTIVES To examine (1) excess healthcare resource utilization (HCRU) and associated costs in patients with PH-COPD compared to COPD patients without PH; and (2) patient characteristics associated with higher healthcare costs in patients with PH-COPD.
METHODS This study analyzed data from the IQVIA PharMetrics® Plus database (October 2014 to May 2020). Patients with PH-COPD were identified by a claims-based algorithm based on a PH diagnosis (ICD-10-CM: I27.0, I27.2, I27.20, I27.21, I27.23) after a COPD diagnosis. Patients aged ≥40 years with data available for ≥12 months before (baseline) and ≥6 months after (follow-up) the first observed PH diagnosis were included. Patients with other non-asthma chronic pulmonary diseases, PH associated with other causes, cancer, left-sided heart failure (HF), PH before the first observed COPD diagnosis, or right-sided/unspecified HF during baseline were excluded. Patients in the PH-COPD cohort were matched 1:1 to COPD patients without PH based on propensity scores derived from baseline patient characteristics. Annualized all-cause and COPD/PH-related (indicated by a primary diagnosis of COPD or PH) HCRU and costs during follow-up were compared between the matched cohorts. Baseline patient characteristics associated with higher total costs were examined in a generalized linear model in the PH-COPD cohort.
RESULTS A total of 2,224 patients with PH-COPD were identified and matched to COPD patients without PH. Patients with PH-COPD had higher all-cause HCRU and annual healthcare costs ($51,435 vs. $18,412, p<0.001) than matched COPD patients without PH. Among patients with PH-COPD, costs were primarily driven by hospitalizations (57%), while COPD/PH-related costs accounted for 13% of all-cause costs. A higher comorbidity burden and a prior history of COPD exacerbation were major risk factors for higher total all-cause costs among patients with PH-COPD.
CONCLUSIONS Treatment strategies focusing on preventing hospitalizations and managing comorbidities may help reduce the burden of PH-COPD.
4. Weight gain following switch to integrase inhibitors from non-nucleoside reverse transcriptase or protease inhibitors in people living with HIV in the United States: analyses of electronic medical records and prescription claims. Curr Med Res Opin 2023; 39:1237-1246. [PMID: 37480288] [DOI: 10.1080/03007995.2023.2239661]
Abstract
OBJECTIVES Real-world data evaluating weight changes in people living with HIV (PLWH) following a switch to an integrase strand transfer inhibitor (INSTI), specifically bictegravir (BIC), are limited. This retrospective cohort study analyzed weight changes upon switching to an INSTI from a non-nucleoside reverse transcriptase inhibitor (NNRTI) or protease inhibitor (PI) in treatment-experienced PLWH.
METHODS Adult PLWH (≥18 years) treated with an NNRTI or PI (non-switch cohorts) and those switching to an INSTI (switch cohorts) between January 1, 2014 and August 31, 2019 were identified using IQVIA's Ambulatory Electronic Medical Records linked to a prescription drug claims database. The associations of switching to an INSTI, and of individual INSTI agents, with ≥5% weight gain at 12 months of follow-up were evaluated, adjusting for demographics and baseline clinical characteristics.
RESULTS At 12 months of follow-up, PLWH in the NNRTI-INSTI switch cohort (n = 508) were more likely to have ≥5% weight gain than the NNRTI non-switch cohort (n = 614; odds ratio, OR [95% CI], 1.7 [1.2-2.4]). Switching from an NNRTI to dolutegravir (DTG: OR [95% CI], 2.1 [1.4-3.0]) or BIC (2.0 [1.0-4.2]) resulted in significantly higher odds of ≥5% weight gain. The PI-INSTI switch (n = 295) and non-switch (n = 228) cohorts had similar proportions of PLWH with ≥5% (21.1-23.4%) or ≥10% (7.8-7.9%) weight gain, and no significant association was found between switching from a PI to an INSTI and weight gain.
CONCLUSION Weight gain and the related metabolic health of PLWH switching from an NNRTI to DTG or BIC should be closely monitored by clinicians. Further research is needed to assess other metabolic outcomes in PLWH remaining on a PI and those who switch from a PI to an INSTI.
5. Weight gain after antiretroviral therapy initiation in people living with HIV in the United States: analyses of electronic medical records and prescription claims. Curr Med Res Opin 2023; 39:997-1006. [PMID: 37334707] [DOI: 10.1080/03007995.2023.2224165]
Abstract
BACKGROUND Treatment guidelines recommend integrase strand transfer inhibitor (INSTI)-based antiretroviral therapy (ART) regimens for treatment-naïve people living with HIV (PLWH) in the United States (US). This retrospective database study compared weight changes following initiation of INSTI-, non-nucleoside reverse transcriptase inhibitor (NNRTI)-, or protease inhibitor (PI)-based ART in treatment-naïve PLWH.
METHODS Adult (≥18 years) PLWH initiated on an INSTI, NNRTI, or PI plus ≥2 nucleoside reverse transcriptase inhibitors (NRTIs) between 1 January 2014 and 31 August 2019 were identified in IQVIA's Ambulatory Electronic Medical Records (AEMR) linked to prescription drug claims (LRx). Weight changes over up to 36 months (M) of follow-up were compared between PLWH on INSTI- vs. NNRTI- and PI-based ART separately using non-linear mixed-effect models, adjusting for demographics and baseline clinical characteristics.
RESULTS The INSTI, NNRTI, and PI cohorts included 931, 245, and 124 PLWH, respectively. In all three cohorts, the majority were male (78.2-81.2%) and overweight/obese (53.6-61.6%) at baseline; 40.8-45.2% were African American. The INSTI vs. NNRTI/PI cohorts were younger (median age: 38 years vs. 44 years/46 years), had lower weight at ART initiation (mean: 80.9 kg vs. 85.7 kg/85.0 kg), and had higher TAF usage during follow-up (55.6% vs. 24.1%/25.8%; all p < .05). Multivariate models showed higher weight gain among PLWH in the INSTI vs. NNRTI and PI cohorts during treated follow-up (estimated weight gain after 36 M: 7.1 kg vs. 3.8 kg and 3.8 kg, both p < .05).
CONCLUSION The study findings highlight the need to monitor weight gain and potential metabolic complications among PLWH starting ART with an INSTI.
6. Characteristics of initiators of budesonide/glycopyrrolate/formoterol for treatment of chronic obstructive pulmonary disease (COPD) in the United States: the AURA study. Ther Adv Respir Dis 2023; 17:17534666231164534. [PMID: 37013423] [PMCID: PMC10074608] [DOI: 10.1177/17534666231164534]
Abstract
INTRODUCTION A twice-daily single-inhaler triple therapy consisting of budesonide/glycopyrrolate/formoterol fumarate (BGF) was approved by the US Food and Drug Administration (FDA) in July 2020 as a maintenance treatment for patients with chronic obstructive pulmonary disease (COPD). The objective of this AURA study was to describe patient characteristics, exacerbation and treatment history, and healthcare resource utilization (HCRU) before BGF initiation to better inform treatment decisions for prescribers.
METHODS This retrospective cohort study leveraged data of all payer types from IQVIA's Longitudinal Prescription Data (LRx) linked to Medical Data (Dx). Patients with COPD who had ≥1 LRx claim for BGF between 1 October 2020 and 30 September 2021 were included. The date of the first BGF claim was the index date. Patient demographic and clinical characteristics, history of COPD exacerbations or related events, treatment history, and HCRU were assessed during the 12 months before index (baseline).
RESULTS We identified 30,339 patients with COPD initiating BGF (mean age: 68.2 years; 57.1% female; 67.6% Medicare). Unspecified COPD (J44.9; 74.0%) was the most commonly coded COPD phenotype. The most prevalent respiratory conditions/symptoms were dyspnea (50.8%), lower respiratory tract infection (25.3%), and sleep apnea (19.0%). Uncomplicated hypertension (58.8%), dyslipidemia (43.9%), cardiovascular disease (41.4%), and heart failure (19.9%) were the most prevalent nonrespiratory conditions. During the 12-month baseline, 57.9% of patients had evidence of a COPD exacerbation or related event, and 14.9% had ≥1 COPD-related emergency department (ED) visit; 21.0% of patients had evidence of prior triple therapy use, while 54.3% had ≥1 oral corticosteroid (OCS) fill. Among OCS users, 29.9% had cumulative exposures >1000 mg (median [Q1-Q3] exposure: 520 [260-1183] mg).
CONCLUSION This real-world data analysis indicates that BGF is being initiated in patients with COPD experiencing symptoms and exacerbations despite current therapy, and among patients who have various chronic comorbidities, most often cardiopulmonary-related.
7. Burden of influenza hospitalization among high-risk groups in the United States. BMC Health Serv Res 2022; 22:1209. [PMID: 36171601] [PMCID: PMC9520810] [DOI: 10.1186/s12913-022-08586-y]
Abstract
Background Seasonal influenza poses a substantial clinical and economic burden in the United States, and vulnerable populations, including the elderly and those with comorbidities, are at elevated risk for influenza-related medical complications.
Methods We conducted a retrospective cohort study using the IQVIA PharMetrics® Plus claims database in two stages. In Stage 1, we identified patients with evidence of medically attended influenza during influenza seasons from October 1, 2014 to May 31, 2018 (latest available data for Stage 1) and used a multivariable logistic regression model to identify patient characteristics that predicted 30-day influenza-related hospitalization. The findings from Stage 1 informed high-risk subgroups of interest for Stage 2, in which we selected cohorts of influenza patients during influenza seasons from October 1, 2014 to March 1, 2019 and used 1:1 propensity score matching to patients without influenza with similar high-risk characteristics to compare influenza-attributable rates of all-cause hospital and emergency department (ED) visits during follow-up (30 days and the index influenza season).
Results In Stage 1, more than 1.6 million influenza cases were identified, of which 18,509 (1.2%) had a hospitalization. Elderly age was associated with 9 times the odds of hospitalization (≥65 years vs. 5–17 years; OR = 9.4, 95% CI 8.8–10.1), and select comorbidities were associated with 2–3 times the odds of hospitalization. In Stage 2, elderly influenza patients with comorbidities had 3 to 7 times higher 30-day hospitalization rates compared to matched patients without influenza, including patients with congestive heart failure (41.0% vs. 7.9%), chronic obstructive pulmonary disease (34.6% vs. 6.1%), coronary artery disease (22.8% vs. 3.8%), and late-stage chronic kidney disease (44.1% vs. 13.1%; all p < 0.05).
Conclusions The risk of influenza-related complications is elevated in the elderly, especially those with certain underlying comorbidities, leading to excess healthcare resource utilization. Continued efforts, beyond currently available vaccines, are needed to reduce influenza burden in high-risk populations.
Supplementary Information The online version contains supplementary material available at 10.1186/s12913-022-08586-y.
8. Unemployment, Homelessness, and Other Societal Outcomes Among US Veterans With Schizophrenia Relapse. Prim Care Companion CNS Disord 2022; 24. [DOI: 10.4088/pcc.21m03173]
9. Characteristics and current standard of care among veterans with major depressive disorder in the United States: a real-world data analysis. J Affect Disord 2022; 307:184-190. [PMID: 35351492] [DOI: 10.1016/j.jad.2022.03.058]
Abstract
BACKGROUND This study examined major depressive disorder (MDD) treatment regimens received during the first observed and treated major depressive episode (MDE) among US veterans.
METHODS This retrospective study, conducted using the Veterans Health Administration (VHA) database supplemented with Medicare Part A/B/D data, included adults with ≥1 MDD diagnosis (index date) between 10/1/2015 and 2/28/2017 and ≥1 line of therapy (LOT) within the first observed complete MDE. Patient baseline (6-month pre-index) characteristics and up to six LOTs received during the first observed and treated MDE were assessed.
RESULTS Of 40,240 veterans with MDD identified (mean age: 50.9 years; 83.9% male; 63.4% White; 88.6% non-Hispanic), hypertension (27.5%), hyperlipidemia (20.8%), and post-traumatic stress disorder (17.5%) were the most common baseline comorbidities. During the first observed and treated MDE, patients received a mean of 1.6 ± 1.0 LOTs, with 14.6% of patients receiving ≥3 LOTs. SSRI monotherapy was the most commonly observed regimen in the first six LOTs, followed by SNRI monotherapy in LOT 1 and antidepressants augmented by anticonvulsants in the remaining five LOTs. The antidepressant class of the previous LOT was commonly used in the subsequent LOT. SSRI-SSRI-SSRI was the most common LOT1-to-LOT3 sequencing pattern among patients receiving ≥3 LOTs.
LIMITATIONS The study findings are limited to data in the VHA database and may not be generalizable to the non-veteran US population.
CONCLUSIONS During the first observed and treated MDE, SSRI monotherapy was the most common therapy in the first six LOTs. Cycling within the SSRI class was the leading sequencing pattern of the first three LOTs among veterans who received ≥3 LOTs.
10. HSR22-152: Healthcare Resource Utilization, Dosing and Time on Treatment in Patients on Pexidartinib for Tenosynovial Giant Cell Tumors: Real-World Evidence from a Claims-Based Dataset. J Natl Compr Canc Netw 2022. [DOI: 10.6004/jnccn.2021.7178]
11. Real-world HIV diagnostic testing patterns in the United States. Am J Manag Care 2022; 28:e42-e48. [PMID: 35139295] [DOI: 10.37765/ajmc.2022.88826]
Abstract
OBJECTIVES To understand real-world implementation of the updated CDC HIV diagnostic testing algorithm.
STUDY DESIGN Retrospective database analysis.
METHODS Using data from Quest Diagnostics, we identified patients with at least 1 HIV-1/HIV-2 antibody differentiation test (BioRad Geenius HIV 1/2 Supplemental Assay [Geenius]) between January 1 and December 31, 2017. Study measures included Health Insurance Portability and Accountability Act-compliant patient demographics, test results, test frequency, and test sequence relative to the CDC HIV diagnostic algorithm, including the HIV-1 RNA Qualitative Assay (Aptima) and HIV-2 nucleic acid test (NAT).
RESULTS A total of 26,319 patients were identified (mean [SD] age, 40.7 [14.3] years; 66.4% male), with 28,954 Geenius tests, 7234 Aptima tests, and 298 HIV-2 NATs. In 26.4% of test sequences, the Geenius results were indeterminate or negative and required subsequent confirmatory NATs. A total of 8.5% of patients had more than 1 Geenius test in 2017, and 11.2% of the time the results of the first and second tests differed. A total of 74.2% of test sequences matched the CDC-recommended algorithm.
CONCLUSIONS Our study findings suggest that the CDC HIV diagnostic algorithm is complex and may result in suboptimal testing efficiency. Efforts to improve diagnostic efficiency by reducing indeterminate results and repeat tests are warranted.
12. Frequency, severity and costs of flares increase with disease severity in newly diagnosed systemic lupus erythematosus: a real-world cohort study, United States, 2004-2015. Lupus Sci Med 2021; 8:e000504. [PMID: 34556546] [PMCID: PMC8461688] [DOI: 10.1136/lupus-2021-000504]
Abstract
Objective To evaluate the frequency, severity and costs of flares in US patients with newly diagnosed SLE.
Methods Adults diagnosed with SLE between January 2005 and December 2014 were identified from US commercial claims data linked to electronic medical records. Disease and flare severity during the 1 year after diagnosis were classified as mild, moderate or severe using a claims-based algorithm. Study outcomes included the frequency and severity of flares, stratified by disease severity, during the 1-year post-diagnosis period, and all-cause healthcare costs of flares by severity at 30, 60 and 90 days after a flare.
Results Among 2227 patients, 26.3%, 51.0% and 22.7% had mild, moderate and severe SLE, respectively. The overall annual flare rate was 3.5 and increased with disease severity: 2.2, 3.7 and 4.2, respectively, for mild, moderate and severe SLE (p<0.0001). Patients with severe SLE had a higher annual severe flare rate (0.6) compared with moderate (0.1) or mild SLE (0; p<0.0001). Mean total all-cause costs at 30, 60 and 90 days after a flare were $16,856, $22,252 and $27,468, respectively, for severe flares (mild flares: $1672, $2639 and $3312; moderate flares: $3831, $6225 and $8582; p<0.0001 at all time points). Inpatient costs were the primary driver of the increased cost of severe flares.
Conclusions Flare frequency and severity in newly diagnosed patients with SLE increase with disease severity. After a flare, healthcare costs over the following 90 days increase with disease severity. Preventing flares or reducing flare rates and duration may improve outcomes and reduce healthcare costs.
13. Disease and economic burden increase with systemic lupus erythematosus severity 1 year before and after diagnosis: a real-world cohort study, United States, 2004-2015. Lupus Sci Med 2021; 8:e000503. [PMID: 34521733] [PMCID: PMC8442098] [DOI: 10.1136/lupus-2021-000503]
Abstract
OBJECTIVE To assess the economic burden of patients with SLE by disease severity in the USA 1 year before and after diagnosis.
METHODS Patients aged ≥18 years with a first SLE diagnosis (index date) between January 2005 and December 2014 were identified from administrative commercial claims data linked to electronic medical records (EMRs). Disease severity during the year after diagnosis was classified as mild, moderate or severe using claims-based algorithms and EMR data. Healthcare resource utilisation (HCRU) and all-cause healthcare costs (2017 US$) were reported for the 1 year pre-diagnosis and post-diagnosis. Generalised linear modelling was used to examine all-cause costs over the 1 year post-index, adjusting for baseline demographics, clinical characteristics, Charlson Comorbidity Index and 1-year pre-diagnosis costs.
RESULTS Among 2227 patients, 26.3% had mild, 51.0% moderate and 22.7% severe SLE. Mean per-patient costs were higher for patients with moderate and severe SLE compared with mild SLE during the year before diagnosis (mild US$12,373, moderate US$22,559 and severe US$39,261; p<0.0001) and the 1-year post-diagnosis period (mild US$13,415, moderate US$29,512 and severe US$68,260; p<0.0001). Leading mean cost drivers were outpatient visits (US$13,566) and hospitalisations (US$10,252). Post-diagnosis inpatient utilisation (≥1 stay) was higher for patients with severe (51.2%) and moderate (22.4%) SLE compared with mild SLE (12.8%), with longer mean hospital stays: mild 0.47 days, moderate 1.31 days and severe 5.52 days (p<0.0001).
CONCLUSION HCRU and costs increase with disease severity in the year before and after diagnosis; the leading cost drivers post-diagnosis were outpatient visits and hospitalisations. Earlier diagnosis and treatment may improve health outcomes and reduce HCRU and costs.
14. Clinical characteristics and treatment patterns with histrelin acetate subcutaneous implants vs. leuprolide injections in children with precocious puberty: a real-world study using a US claims database. J Pediatr Endocrinol Metab 2021; 34:961-969. [PMID: 34147047] [DOI: 10.1515/jpem-2020-0721]
Abstract
OBJECTIVES Gonadotropin-releasing hormone analogs are the treatment of choice for central precocious puberty (CPP). This study characterizes patients treated with a histrelin implant or leuprolide injection.
METHODS A US claims database was used to identify patients aged ≤20 years with ≥1 histrelin or leuprolide claim (index treatment) between April 2010 and November 2017 and continuous enrollment ≥3 months before and ≥12 months after the index treatment date.
RESULTS Overall, 4,217 patients (histrelin, n=1,001; leuprolide, n=3,216) were identified. The percentage of patients with a CPP diagnosis was greater in the histrelin (96.5%) vs. leuprolide (68.8%; p<0.0001) cohort. In patients with CPP (histrelin, n=966; leuprolide, n=2,214), mean age at treatment initiation was similar for histrelin (9.0 ± 2.0 years) and leuprolide (9.1 ± 2.3 years), with >50% of patients aged 6-9 years. Mean treatment duration was significantly longer for histrelin (26.7 ± 14.8 months) vs. leuprolide (14.1 ± 12.1 months; p<0.0001), and was longer in younger patient groups. More patients switched from leuprolide to histrelin (12.3%) than vice versa (3.6%; p<0.0001). Median annual total treatment costs were slightly lower for the histrelin cohort ($23,071 [interquartile range, $16,833-$31,050]) than the leuprolide cohort ($27,021 [interquartile range, $18,314-$34,995]; p<0.0001).
CONCLUSIONS Patients with CPP treated with histrelin had a longer duration of treatment, lower rates of index treatment discontinuation, and lower annual treatment costs vs. those treated with leuprolide.
15. Leveraging unstructured data to identify hereditary angioedema patients in electronic medical records. Allergy Asthma Clin Immunol 2021; 17:41. [PMID: 33879228] [PMCID: PMC8058983] [DOI: 10.1186/s13223-021-00541-6]
Abstract
Background The epidemiologic impact of hereditary angioedema (HAE) is difficult to quantify due to misclassification in retrospective studies resulting from non-specific diagnostic coding. The aim of this study was to identify cohorts of patients with HAE-1/2 by evaluating structured and unstructured data in a US ambulatory electronic medical record (EMR) database.
Methods A retrospective feasibility study was performed using the GE Centricity EMR Database (2006–2017). Patients with ≥1 diagnosis code for HAE-1/2 (International Classification of Diseases, Ninth Revision, Clinical Modification 277.6 or International Classification of Diseases, Tenth Revision, Clinical Modification D84.1) and/or ≥1 physician note regarding HAE-1/2, and ≥6 months' data before and after the earliest code or note (index date), were included. Two mutually exclusive cohorts were created: probable HAE (≥2 codes or ≥2 notes on separate days) and suspected HAE (only 1 code or note). The impact of manually reviewing physician notes on cohort formation was assessed, and the demographic and clinical characteristics of the 2 final cohorts were described.
Results Initially, 1691 patients were identified: 190 and 1501 in the probable and suspected HAE cohorts, respectively. After physician note review, the confirmed HAE cohort comprised 254 patients and the suspected HAE cohort decreased to 1299 patients; 138 patients were determined not to have HAE and were excluded. The overall false-positive rate for the initial algorithms was 8.2%. Across the final cohorts, the median age was 50 years and >60% of patients were female. HAE-specific prescriptions were identified for 31% and 2% of the confirmed and suspected HAE cohorts, respectively.
Conclusions Unstructured EMR data can provide valuable information for identifying patients with HAE-1/2. Further research is needed to develop algorithms that yield more representative HAE cohorts in retrospective studies.
16. Clinical and Economic Outcomes in Patients with Persistent Asthma Who Attain Healthcare Effectiveness and Data Information Set Measures. J Allergy Clin Immunol Pract 2020; 8:3443-3454.e2. [PMID: 32562878] [DOI: 10.1016/j.jaip.2020.06.012]
Abstract
BACKGROUND Attainment of asthma-specific US Healthcare Effectiveness Data and Information Set (HEDIS) quality measures may be associated with improved clinical outcomes and reduced economic burden. OBJECTIVE We examined the effect of attaining the HEDIS measures asthma medication ratio (AMR) and medication management for people with asthma (MMA) on clinical and economic outcomes. METHODS This retrospective claims database analysis linked to ambulatory electronic medical records enrolled US patients aged ≥5 years with persistent asthma between May 2015 and April 2017. The attainment of AMR ≥0.5 and MMA ≥75% was determined over a 1-year premeasurement period. Asthma exacerbations and asthma-related health care costs were evaluated during the subsequent 12-month measurement period, comparing patients attaining 1 or both measures with those not attaining either. RESULTS In total, 32,748 patients were included, 75.2% of whom attained AMR (n = 24,388) and/or MMA (n = 12,042) during the premeasurement period. Fewer attainers of 1 or more HEDIS measures had ≥1 asthma-related hospitalization, emergency department visit, corticosteroid burst, or exacerbation (4.9% vs 7.3%; 9.6% vs 18.2%; 43.8% vs 51.6%; 14.3% vs 23.3%, respectively; all P < .001) compared with nonattainers. In adjusted analyses, HEDIS attainment was associated with a lower likelihood of exacerbations (odds ratio: 0.63 [95% confidence interval: 0.60-0.67]; P < .001). Attainment of ≥1 HEDIS measure was associated with lower total, asthma-related, and asthma exacerbation-related health care costs per patient relative to nonattainers (cost ratio: 0.87, P < .001; 0.96, P = .02; and 0.59, P < .001, respectively). Overall and asthma-specific costs were lower for patients attaining AMR, but not MMA. CONCLUSIONS HEDIS attainment was associated with significantly improved asthma outcomes and lower asthma-specific costs.
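The AMR attainment criterion lends itself to a one-line computation. A minimal sketch, assuming AMR is the share of controller-medication units among all dispensed asthma-medication units (the full HEDIS unit-counting rules are more detailed than this):

```python
# Illustrative asthma medication ratio (AMR): controller units over total
# (controller + reliever) units, with attainment at AMR >= 0.5 as in the
# abstract. Real HEDIS unit counting involves additional detail.

def asthma_medication_ratio(controller_units, reliever_units):
    total = controller_units + reliever_units
    if total == 0:
        return None  # no dispensed asthma medication; ratio undefined
    return controller_units / total

def attains_amr(controller_units, reliever_units, threshold=0.5):
    amr = asthma_medication_ratio(controller_units, reliever_units)
    return amr is not None and amr >= threshold

print(attains_amr(6, 4))  # True: AMR = 0.6
print(attains_amr(2, 8))  # False: AMR = 0.2
```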
|
17
|
Structure, Function, and Applications of the Georgetown-Einstein (GE) Breast Cancer Simulation Model. Med Decis Making 2019; 38:66S-77S. [PMID: 29554462 DOI: 10.1177/0272989x17698685] [Citation(s) in RCA: 21] [Impact Index Per Article: 4.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/17/2022]
Abstract
BACKGROUND The Georgetown University-Albert Einstein College of Medicine breast cancer simulation model (Model GE) has evolved over time in structure and function to reflect advances in knowledge about breast cancer, improvements in early detection and treatment technology, and progress in computing resources. This article describes the model and provides examples of model applications. METHODS The model is a discrete events microsimulation of single-life histories of women from multiple birth cohorts. Events are simulated in the absence of screening and treatment, and interventions are then applied to assess their impact on population breast cancer trends. The model accommodates differences in natural history associated with estrogen receptor (ER) and human epidermal growth factor receptor 2 (HER2) biomarkers, as well as conventional breast cancer risk factors. The approach for simulating breast cancer natural history is phenomenological, relying on dates, stage, and age of clinical and screen detection for a tumor molecular subtype without explicitly modeling tumor growth. The inputs to the model are regularly updated to reflect current practice. Numerous technical modifications, including the use of object-oriented programming (C++), and more efficient algorithms, along with hardware advances, have increased program efficiency permitting simulations of large samples. RESULTS The model results consistently match key temporal trends in US breast cancer incidence and mortality. CONCLUSION The model has been used in collaboration with other CISNET models to assess cancer control policies and will be applied to evaluate clinical trial design, recurrence risk, and polygenic risk-based screening.
|
18
|
A Real-World Evidence Study Assessing the Impact of Adding the Aerobika Oscillating Positive Expiratory Pressure Device to Standard of Care Upon Healthcare Resource Utilization and Costs in Post-Operative Patients. Pulm Ther 2018; 4:87-101. [PMID: 32026246 PMCID: PMC6966948 DOI: 10.1007/s41030-018-0055-9] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/02/2018] [Indexed: 11/30/2022] Open
Abstract
INTRODUCTION The aim of this real-world study was to measure the benefit of the Aerobika oscillating positive expiratory pressure (OPEP) device when added to standard of care (defined as incentive spirometry [IS]) for post-operative patients. METHODS Adults aged ≥ 18 years who were hospitalized for cardiac, thoracic or upper abdominal surgery between 1 September 2013 and 30 April 2017 were identified from IQVIA's Hospital Charge Detail Master (CDM) database; the index date was the date of the first hospitalization for surgery. The control cohort (IS) included patients who had ≥ 1 CDM record within 12 months prior to the index date and ≥ 1 record after discharge, evidence of IS use during index hospitalization and no evidence of use of a PEP or OPEP device at any time during the study period. The Aerobika OPEP cohort was selected in a similar manner, except that patients were required to have evidence of Aerobika OPEP use during the index hospitalization. Aerobika OPEP patients were 1:1 matched to IS patients using propensity score (PS) matching. Hospital readmissions and costs were measured at 30 days post-discharge from the index hospitalization. RESULTS After PS matching, 144 patients were included in each cohort. At 30 days post-discharge, compared with the control (IS) cohort, significantly fewer patients in the Aerobika OPEP cohort had ≥ 1 all-cause re-hospitalization (13.9% vs. 22.9%; p = 0.042). The patients in the Aerobika OPEP cohort also had a shorter mean length of stay (± standard deviation) (1.25 ± 4.04 vs. 2.60 ± 8.24 days; p = 0.047) and lower total unadjusted mean all-cause cost per patient ($3670 ± $13,894 vs. $13,775 ± $84,238; p = 0.057). Adjusted analyses suggested that hospitalization costs were 80% lower for the Aerobika OPEP cohort versus the IS cohort (p = 0.001). CONCLUSION Our results suggest that the addition of the Aerobika OPEP device to standard of care (IS) is beneficial in the post-operative setting.
FUNDING Trudell Medical International.
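The abstract does not describe the matching step beyond "1:1 propensity score matching". A common implementation is greedy nearest-neighbor matching without replacement on precomputed propensity scores, sketched here as a hypothetical illustration (the study's actual algorithm and caliper are not stated):

```python
# Illustrative 1:1 greedy nearest-neighbor matching without replacement on
# precomputed propensity scores. Treated units with no control within the
# caliper are left unmatched. The caliper value is an assumption.

def greedy_match(treated, controls, caliper=0.05):
    """treated/controls: dicts of id -> propensity score. Returns matched pairs."""
    pairs = []
    available = dict(controls)
    # iterate in score order for a deterministic result
    for t_id, t_score in sorted(treated.items(), key=lambda kv: kv[1]):
        if not available:
            break
        c_id = min(available, key=lambda c: abs(available[c] - t_score))
        if abs(available[c_id] - t_score) <= caliper:
            pairs.append((t_id, c_id))
            del available[c_id]  # matching without replacement
    return pairs

treated = {"a1": 0.30, "a2": 0.62}
controls = {"c1": 0.28, "c2": 0.61, "c3": 0.90}
print(greedy_match(treated, controls))  # [('a1', 'c1'), ('a2', 'c2')]
```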
|
19
|
Abstract
BACKGROUND Since their inception in 2000, the Cancer Intervention and Surveillance Network (CISNET) breast cancer models have collaborated to use a nationally representative core of common input parameters to represent key components of breast cancer control in each model. Employment of common inputs permits greater ability to compare model output than when each model begins with different input parameters. The use of common inputs also enhances inferences about the results, and provides a range of reasonable results based on variations in model structure, assumptions, and methods of use of the input values. The common input data are updated for each analysis to ensure that they reflect the most current practice and knowledge about breast cancer. The common core of parameters includes population rates of births and deaths; age- and cohort-specific temporal rates of breast cancer incidence in the absence of screening and treatment; effects of risk factors on incidence trends; dissemination of plain film and digital mammography; screening test performance characteristics; stage or size distribution of screen-, interval-, and clinically detected tumors by age; the joint distribution of ER/HER2 by age and stage; survival in the absence of screening and treatment by stage and molecular subtype; age-, stage-, and molecular subtype-specific therapy; dissemination and effectiveness of therapies over time; and competing non-breast cancer mortality. METHOD AND RESULTS In this paper, we summarize the methods and results for the common input values presently used in the CISNET breast cancer models, note assumptions made because of unobservable phenomena and/or unavailable data, and highlight plans for the development of future parameters. CONCLUSION These data are intended to enhance the transparency of the breast CISNET models.
|
20
|
Abstract
IMPORTANCE Given recent advances in screening mammography and adjuvant therapy (treatment), quantifying their separate and combined effects on US breast cancer mortality reductions by molecular subtype could guide future decisions to reduce disease burden. OBJECTIVE To evaluate the contributions associated with screening and treatment to breast cancer mortality reductions by molecular subtype based on estrogen-receptor (ER) and human epidermal growth factor receptor 2 (ERBB2, formerly HER2 or HER2/neu). DESIGN, SETTING, AND PARTICIPANTS Six Cancer Intervention and Surveillance Network (CISNET) models simulated US breast cancer mortality from 2000 to 2012 using national data on plain-film and digital mammography patterns and performance, dissemination and efficacy of ER/ERBB2-specific treatment, and competing mortality. Multiple US birth cohorts were simulated. EXPOSURES Screening mammography and treatment. MAIN OUTCOMES AND MEASURES The models compared age-adjusted, overall, and ER/ERBB2-specific breast cancer mortality rates from 2000 to 2012 for women aged 30 to 79 years relative to the estimated mortality rate in the absence of screening and treatment (baseline rate); mortality reductions were apportioned to screening and treatment. RESULTS In 2000, the estimated reduction in overall breast cancer mortality rate was 37% (model range, 27%-42%) relative to the estimated baseline rate in 2000 of 64 deaths (model range, 56-73) per 100 000 women: 44% (model range, 35%-60%) of this reduction was associated with screening and 56% (model range, 40%-65%) with treatment. In 2012, the estimated reduction in overall breast cancer mortality rate was 49% (model range, 39%-58%) relative to the estimated baseline rate in 2012 of 63 deaths (model range, 54-73) per 100 000 women: 37% (model range, 26%-51%) of this reduction was associated with screening and 63% (model range, 49%-74%) with treatment. 
Of the 63% associated with treatment, 31% (model range, 22%-37%) was associated with chemotherapy, 27% (model range, 18%-36%) with hormone therapy, and 4% (model range, 1%-6%) with trastuzumab. The estimated relative contributions associated with screening vs treatment varied by molecular subtype: for ER+/ERBB2-, 36% (model range, 24%-50%) vs 64% (model range, 50%-76%); for ER+/ERBB2+, 31% (model range, 23%-41%) vs 69% (model range, 59%-77%); for ER-/ERBB2+, 40% (model range, 34%-47%) vs 60% (model range, 53%-66%); and for ER-/ERBB2-, 48% (model range, 38%-57%) vs 52% (model range, 44%-62%). CONCLUSIONS AND RELEVANCE In this simulation modeling study that projected trends in breast cancer mortality rates among US women, decreases in overall breast cancer mortality from 2000 to 2012 were associated with advances in screening and in adjuvant therapy, although the associations varied by breast cancer molecular subtype.
|
21
|
Using Collaborative Simulation Modeling to Develop a Web-Based Tool to Support Policy-Level Decision Making About Breast Cancer Screening Initiation Age. MDM Policy Pract 2017; 2:2381468317717982. [PMID: 29376135 PMCID: PMC5785917 DOI: 10.1177/2381468317717982] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/08/2017] [Accepted: 04/25/2017] [Indexed: 11/15/2022] Open
Abstract
BACKGROUND There are no publicly available tools designed specifically to assist policy makers to make informed decisions about the optimal ages of breast cancer screening initiation for different populations of US women. OBJECTIVE To use three established simulation models to develop a web-based tool called Mammo OUTPuT. METHODS The simulation models use the 1970 US birth cohort and common parameters for incidence, digital screening performance, and treatment effects. Outcomes include breast cancers diagnosed, breast cancer deaths averted, breast cancer mortality reduction, false-positive mammograms, benign biopsies, and overdiagnosis. The Mammo OUTPuT tool displays these outcomes for combinations of age at screening initiation (every year from 40 to 49), annual versus biennial interval, lifetime versus 10-year horizon, and breast density, compared to waiting to start biennial screening at age 50 and continuing to 74. The tool was piloted by decision makers (n = 16) who completed surveys. RESULTS The tool demonstrates that benefits in the 40s increase linearly with earlier initiation age, without a specific threshold age. Likewise, the harms of screening increase monotonically with earlier ages of initiation in the 40s. The tool also shows users how the balance of benefits and harms varies with breast density. Surveys revealed that 100% of users (16/16) liked the appearance of the site; 94% (15/16) found the tool helpful; and 94% (15/16) would recommend the tool to a colleague. CONCLUSIONS This tool synthesizes a representative subset of the most current CISNET (Cancer Intervention and Surveillance Modeling Network) simulation model outcomes to provide policy makers with quantitative data on the benefits and harms of screening women in the 40s. Ultimate decisions will depend on program goals, the population served, and informed judgments about the weight of benefits and harms.
|
22
|
Tailoring Breast Cancer Screening Intervals by Breast Density and Risk for Women Aged 50 Years or Older: Collaborative Modeling of Screening Outcomes. Ann Intern Med 2016; 165:700-712. [PMID: 27548583 PMCID: PMC5125086 DOI: 10.7326/m16-0476] [Citation(s) in RCA: 77] [Impact Index Per Article: 9.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Indexed: 01/10/2023] Open
Abstract
BACKGROUND Biennial screening is generally recommended for average-risk women aged 50 to 74 years, but tailored screening may provide greater benefits. OBJECTIVE To estimate outcomes for various screening intervals after age 50 years based on breast density and risk for breast cancer. DESIGN Collaborative simulation modeling using national incidence, breast density, and screening performance data. SETTING United States. PATIENTS Women aged 50 years or older with various combinations of breast density and relative risk (RR) of 1.0, 1.3, 2.0, or 4.0. INTERVENTION Annual, biennial, or triennial digital mammography screening from ages 50 to 74 years (vs. no screening) and ages 65 to 74 years (vs. biennial digital mammography from ages 50 to 64 years). MEASUREMENTS Lifetime breast cancer deaths, life expectancy and quality-adjusted life-years (QALYs), false-positive mammograms, benign biopsy results, overdiagnosis, cost-effectiveness, and ratio of false-positive results to breast cancer deaths averted. RESULTS Screening benefits and overdiagnosis increase with breast density and RR. False-positive mammograms and benign results on biopsy decrease with increasing risk. Among women with fatty breasts or scattered fibroglandular density and an RR of 1.0 or 1.3, breast cancer deaths averted were similar for triennial versus biennial screening for both age groups (50 to 74 years, median of 3.4 to 5.1 vs. 4.1 to 6.5 deaths averted; 65 to 74 years, median of 1.5 to 2.1 vs. 1.8 to 2.6 deaths averted). Breast cancer deaths averted increased with annual versus biennial screening for women aged 50 to 74 years at all levels of breast density and an RR of 4.0, and those aged 65 to 74 years with heterogeneously or extremely dense breasts and an RR of 4.0. However, harms were almost 2-fold higher. Triennial screening for the average-risk subgroup and annual screening for the highest-risk subgroup cost less than $100 000 per QALY gained. 
LIMITATION Models did not consider women younger than 50 years, those with an RR less than 1, or other imaging methods. CONCLUSION Average-risk women with low breast density undergoing triennial screening and higher-risk women with high breast density receiving annual screening will maintain a similar or better balance of benefits and harms than average-risk women receiving biennial screening. PRIMARY FUNDING SOURCE National Cancer Institute.
|
23
|
Collaborative Modeling of the Benefits and Harms Associated With Different U.S. Breast Cancer Screening Strategies. Ann Intern Med 2016; 164:215-25. [PMID: 26756606 PMCID: PMC5079106 DOI: 10.7326/m15-1536] [Citation(s) in RCA: 185] [Impact Index Per Article: 23.1] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Indexed: 12/16/2022] Open
Abstract
BACKGROUND Controversy persists about optimal mammography screening strategies. OBJECTIVE To evaluate screening outcomes, taking into account advances in mammography and treatment of breast cancer. DESIGN Collaboration of 6 simulation models using national data on incidence, digital mammography performance, treatment effects, and other-cause mortality. SETTING United States. PATIENTS Average-risk U.S. female population and subgroups with varying risk, breast density, or comorbidity. INTERVENTION Eight strategies differing by age at which screening starts (40, 45, or 50 years) and screening interval (annual, biennial, and hybrid [annual for women in their 40s and biennial thereafter]). All strategies assumed 100% adherence and stopped at age 74 years. MEASUREMENTS Benefits (breast cancer-specific mortality reduction, breast cancer deaths averted, life-years, and quality-adjusted life-years); number of mammograms used; harms (false-positive results, benign biopsies, and overdiagnosis); and ratios of harms (or use) and benefits (efficiency) per 1000 screens. RESULTS Biennial strategies were consistently the most efficient for average-risk women. Biennial screening from age 50 to 74 years avoided a median of 7 breast cancer deaths versus no screening; annual screening from age 40 to 74 years avoided an additional 3 deaths, but yielded 1988 more false-positive results and 11 more overdiagnoses per 1000 women screened. Annual screening from age 50 to 74 years was inefficient (similar benefits, but more harms than other strategies). For groups with a 2- to 4-fold increased risk, annual screening from age 40 years had similar harms and benefits as screening average-risk women biennially from 50 to 74 years. For groups with moderate or severe comorbidity, screening could stop at age 66 to 68 years. LIMITATION Other imaging technologies, polygenic risk, and nonadherence were not considered. 
CONCLUSION Biennial screening for breast cancer is efficient for average-risk populations. Decisions about starting ages and intervals will depend on population characteristics and the decision makers' weight given to the harms and benefits of screening. PRIMARY FUNDING SOURCE National Institutes of Health.
|
24
|
Economic Evaluation Alongside a Clinical Trial of Telephone Versus In-Person Genetic Counseling for BRCA1/2 Mutations in Geographically Underserved Areas. J Oncol Pract 2016; 12:59, e1-13. [PMID: 26759468 PMCID: PMC4960460 DOI: 10.1200/jop.2015.004838] [Citation(s) in RCA: 15] [Impact Index Per Article: 1.9] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/04/2023] Open
Abstract
PURPOSE BRCA1/2 counseling and mutation testing is recommended for high-risk women, but geographic barriers exist, and no data on the costs and yields of diverse delivery approaches are available. METHODS We performed an economic evaluation with a randomized clinical trial comparing telephone versus in-person counseling at 14 locations (nine geographically remote). Costs included fixed overhead, variable staff, and patient time costs; research costs were excluded. Outcomes included average per-person costs for pretest counseling; mutations detected; and overall counseling, testing, and disclosure. Sensitivity analyses were performed to assess the impact of uncertainty. RESULTS In-person counseling was more costly per person counseled than was telephone counseling ($270 [range, $180 to $400] v $120 [range, $80 to $200], respectively). Counselors averaged 285 miles round-trip to deliver in-person counseling to the participants (three participants per session). There were no differences by arm in mutation detection rates (approximately 10%); therefore, telephone counseling was less costly per positive mutation detected than was in-person counseling ($37,160 [range, $36,080 to $38,920] v $40,330 [range, $38,010 to $43,870]). In-person counseling would only be less costly than telephone counseling if the most favorable assumptions were applied to in-person counseling and the least favorable assumptions were applied to telephone counseling. CONCLUSION In geographically underserved areas, telephone counseling is less costly than in-person counseling.
|
25
|
Transition from film to digital mammography: impact for breast cancer screening through the national breast and cervical cancer early detection program. Am J Prev Med 2015; 48:535-42. [PMID: 25891052 PMCID: PMC4405659 DOI: 10.1016/j.amepre.2014.11.010] [Citation(s) in RCA: 8] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 05/26/2014] [Revised: 10/28/2014] [Accepted: 11/17/2014] [Indexed: 10/23/2022]
Abstract
INTRODUCTION The National Breast and Cervical Cancer Early Detection Program (NBCCEDP) provides mammograms and diagnostic services for low-income, uninsured women aged 40-64 years. Mammography facilities within the NBCCEDP gradually shifted from plain-film to digital mammography. The purpose of this study is to assess the impact of replacing film with digital mammography on health effects (deaths averted, life-years gained [LYG]); costs (for screening and diagnostics); and number of women reached. METHODS NBCCEDP 2010 data and data representative of the program's target population were used in two established microsimulation models. Models simulated observed screening behavior including different screening intervals (annual, biennial, irregular) and starting ages (40, 50 years) for white, black, and Hispanic women. Model runs were performed in 2012. RESULTS The models predicted 8.0-8.3 LYG per 1,000 film screens for black women, 5.9-7.5 for white women, and 4.0-4.5 for Hispanic women. For all race/ethnicity groups, digital mammography had more LYG than film mammography (2%-4%), but had higher costs (34%-35%). Assuming a fixed budget, 25%-26% fewer women could be served, resulting in 22%-24% fewer LYG if all mammograms were converted to digital. The loss in LYG could be reversed to an 8%-13% increase by only including biennial screening. CONCLUSIONS Digital mammography could result in slightly more LYG than film mammography. However, with a fixed budget, fewer women may be served with fewer LYG. Changes in the program, such as only including biennial screening, will increase LYG/screen and could offset the potential decrease in LYG when shifting to digital mammography.
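The fixed-budget figures above follow from simple arithmetic: if each digital screen costs 34%-35% more than film, a fixed budget reaches 1/(1 + 0.34) to 1/(1 + 0.35) as many women, i.e. roughly 25%-26% fewer. A quick check:

```python
# Back-of-the-envelope check of the fixed-budget tradeoff reported above:
# a 34%-35% higher per-screen cost means a fixed budget serves ~25%-26%
# fewer women (holding per-woman screening use constant).
for cost_increase in (0.34, 0.35):
    fraction_served = 1 / (1 + cost_increase)
    print(f"+{cost_increase:.0%} cost -> {1 - fraction_served:.1%} fewer women served")
# +34% cost -> 25.4% fewer women served
# +35% cost -> 25.9% fewer women served
```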
|
26
|
Effects of screening and systemic adjuvant therapy on ER-specific US breast cancer mortality. J Natl Cancer Inst 2014; 106:dju289. [PMID: 25255803 DOI: 10.1093/jnci/dju289] [Citation(s) in RCA: 110] [Impact Index Per Article: 11.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/03/2023] Open
Abstract
BACKGROUND Molecular characterization of breast cancer allows subtype-directed interventions. Estrogen receptor (ER) is the longest-established molecular marker. METHODS We used six established population models with ER-specific input parameters on age-specific incidence, disease natural history, mammography characteristics, and treatment effects to quantify the impact of screening and adjuvant therapy on age-adjusted US breast cancer mortality by ER status from 1975 to 2000. Outcomes included stage-shifts and absolute and relative reductions in mortality; sensitivity analyses evaluated the impact of varying screening frequency or accuracy. RESULTS In the year 2000, actual screening and adjuvant treatment reduced breast cancer mortality by a median of 17 per 100000 women (model range = 13-21) and 5 per 100000 women (model range = 3-6) for ER-positive and ER-negative cases, respectively, relative to no screening and no adjuvant treatment. For ER-positive cases, adjuvant treatment made a higher relative contribution to breast cancer mortality reduction than screening, whereas for ER-negative cases the relative contributions were similar for screening and adjuvant treatment. ER-negative cases were less likely to be screen-detected than ER-positive cases (35.1% vs 51.2%), but when screen-detected yielded a greater survival gain (five-year breast cancer survival = 35.6% vs 30.7%). Screening biennially would have captured a lower proportion of mortality reduction than annual screening for ER-negative vs ER-positive cases (model range = 80.2%-87.8% vs 85.7%-96.5%). CONCLUSION As advances in risk assessment facilitate identification of women with increased risk of ER-negative breast cancer, additional mortality reductions could be realized through more frequent targeted screening, provided these benefits are balanced against screening harms.
|
27
|
Sweden SimSmoke: the effect of tobacco control policies on smoking and snus prevalence and attributable deaths. Eur J Public Health 2013; 24:451-8. [PMID: 24287030 DOI: 10.1093/eurpub/ckt178] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/27/2023] Open
Abstract
BACKGROUND This study examines the effect of past tobacco control policies and projects the effect of future policies on smoking and snus use prevalence and associated premature mortality in Sweden. METHODS The established SimSmoke model was adapted with population, smoking rates and tobacco control policy data from Sweden. SimSmoke evaluates the effect of taxes, smoke-free air, mass media, marketing bans, warning labels, cessation treatment and youth access policies on smoking and snus prevalence and the number of deaths attributable to smoking and snus use by gender from 2010 to 2040. RESULTS Sweden SimSmoke estimates that significant inroads to reducing smoking and snus prevalence and premature mortality can be achieved through tax increases, especially when combined with other policies. Smoking prevalence can be decreased by as much as 26% in the first few years, reaching a 37% reduction within 30 years. Without effective tobacco control policies, almost 54 500 lives will be lost in Sweden due to tobacco use by the year 2040. CONCLUSION Besides presenting the benefits of a comprehensive tobacco control strategy, the model identifies gaps in surveillance and evaluation that can help better focus tobacco control policy in Sweden.
|
45
|
Which strategies reduce breast cancer mortality most? Collaborative modeling of optimal screening, treatment, and obesity prevention. Cancer 2013. [PMID: 23625540 PMCID: PMC3700651 DOI: 10.1002/cncr.28087] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 04/09/2023]
Abstract
BACKGROUND US breast cancer mortality is declining, but thousands of women still die each year. METHODS Two established simulation models examine 6 strategies that include increased screening and/or treatment or elimination of obesity versus continuation of current patterns. The models use common national data on incidence and obesity prevalence, competing causes of death, mammography characteristics, treatment effects, and survival/cure. Parameters are modified based on obesity (defined as BMI ≥ 30 kg/m²). Outcomes are presented for the year 2025 among women aged 25+ and include numbers of cases, deaths, mammograms and false-positives; age-adjusted incidence and mortality; breast cancer mortality reduction and deaths averted; and probability of dying of breast cancer. RESULTS If current patterns continue, the models project that there would be about 50,100-57,400 (range across models) annual breast cancer deaths in 2025. If 90% of women were screened annually from ages 40 to 54 and biennially from ages 55 to 99 (or death), then 5100-6100 fewer deaths would occur versus current patterns, but incidence, mammograms, and false-positives would increase. If all women received the indicated systemic treatment (with no screening change), then 11,400-14,500 more deaths would be averted versus current patterns, but increased toxicity could occur. If 100% received screening plus indicated therapy, there would be 18,100-20,400 fewer deaths. Eliminating obesity yields 3300-5700 fewer breast cancer deaths versus continuation of current obesity levels. CONCLUSIONS Maximal reductions in breast cancer deaths could be achieved through optimizing treatment use, followed by increasing screening use and obesity prevention.