76. Mohile SG, Hurria A, Cohen HJ, Rowland JH, Leach CR, Arora NK, Canin B, Muss HB, Magnuson A, Flannery M, Lowenstein L, Allore HG, Mustian KM, Demark-Wahnefried W, Extermann M, Ferrell B, Inouye SK, Studenski SA, Dale W. Improving the quality of survivorship for older adults with cancer. Cancer 2016; 122:2459-68. [PMID: 27172129] [PMCID: PMC4974133] [DOI: 10.1002/cncr.30053]
Abstract
In May 2015, the Cancer and Aging Research Group, in collaboration with the National Cancer Institute and the National Institute on Aging through a U13 grant, convened a conference to identify research priorities to help design and implement intervention studies to improve the quality of life and survivorship of older, frailer adults with cancer. Conference attendees included researchers with multidisciplinary expertise and advocates. It was concluded that future intervention trials for older adults with cancer should: 1) rigorously test interventions to prevent the decline of or improve health status, especially interventions focused on optimizing physical performance, nutritional status, and cognition while undergoing cancer treatment; 2) use standardized care plans based on geriatric assessment findings to guide targeted interventions; and 3) incorporate the principles of geriatrics into survivorship care plans. Also highlighted was the need to integrate the expertise of interdisciplinary team members into geriatric oncology research, improve funding mechanisms to support geriatric oncology research, and disseminate high-impact results to the research and clinical community. In conjunction with the 2 prior U13 meetings, this conference provided the framework for future research to improve the evidence base for the clinical care of older adults with cancer. Cancer 2016;122:2459-68. © 2016 American Cancer Society.
77. MacNeil Vroomen J, Hoeben J, Peeters CF, Bosmans J, De Rooij S, Allore HG, Monin J, Hout HP. P1‐442: Is There a Difference in Health‐Related Quality of Life in Caregivers Based on The Living Situation of The Person with Dementia Over One Year? Alzheimers Dement 2016. [DOI: 10.1016/j.jalz.2016.06.1194]
78. Malawista A, Wang X, Trentalange M, Allore HG, Montgomery RR. Coordinated expression of tyro3, axl, and mer receptors in macrophage ontogeny. Macrophage 2016; 3. [PMID: 27695708] [PMCID: PMC5040214] [DOI: 10.14800/macrophage.1261]
Abstract
The TAM receptors (Tyro3, Axl, and Mer) are a family of homologous receptor-tyrosine kinases that inhibit Toll-like receptor signaling to regulate downstream pathways and restore homeostasis. TAM triple mutant mice (Tyro3−/−, Axl−/−, Mer−/−) have elevated levels of pro-inflammatory cytokines and are prone to developing lymphoproliferative disorders and autoimmunity. Understanding differential expression of TAM receptors among human subjects is critical to harnessing this pathway for therapeutic interventions. We have quantified changes in TAM expression during the ontogeny of human macrophages using paired samples of monocytes and macrophages to take advantage of characteristic expression within an individual. No significant differences in levels of Tyro3 were found between monocytes and macrophages (flow cytometry: p=0.652, immunoblot: p=0.231, qPCR: p=0.389). Protein levels of Axl were reduced (flow cytometry: p=0.049, immunoblot: p<0.001) when monocytes matured to macrophages. No significant differences in the levels of Axl mRNA transcripts were found (qPCR: p=0.082); however, Tyro3 and Axl were proportionate. The most striking difference was upregulation of expression of Mer, with both protein and mRNA being significantly increased when monocytes developed into macrophages (flow cytometry: p<0.001, immunoblot: p<0.001, qPCR: p=0.004). A fuller characterization of TAM receptor expression in macrophage ontogeny informs our understanding of their function and potential therapeutic interventions.
79. Botoseneanu A, Allore HG, Mendes de Leon CF, Gahbauer EA, Gill TM. Sex Differences in Concomitant Trajectories of Self-Reported Disability and Measured Physical Capacity in Older Adults. J Gerontol A Biol Sci Med Sci 2016; 71:1056-62. [PMID: 27071781] [DOI: 10.1093/gerona/glw038]
Abstract
BACKGROUND Despite documented age-related declines in self-reported functional status and measured physical capacity, it is unclear whether these functional indicators follow similar trajectories over time or whether the patterns of change differ by sex. METHODS We used longitudinal data from 687 initially nondisabled adults, aged 70 or older, from the Precipitating Events Project, who were evaluated every 18 months for nearly 14 years. Self-reported disability was assessed with a 12-item disability scale. Physical capacity was measured using grip strength and a modified version of the Short Physical Performance Battery. Hierarchical linear models estimated the intra-individual trajectory of each functional indicator and differences in trajectories' intercept and slope by sex. RESULTS Self-reported disability, grip strength, and Short Physical Performance Battery score declined over 13.5 years following nonlinear trajectories. Women experienced faster accumulation of self-reported disability, but slower declines in measured physical capacity, compared with men. Trajectory intercepts revealed that women had significantly weaker grip strength and reported higher levels of disability compared with men, with no differences in starting Short Physical Performance Battery scores. These findings were robust to adjustments for differences in sociodemographic characteristics, length of survival, health risk factors, and chronic-disease status. CONCLUSIONS Despite the female disadvantage in self-reported disability, older women preserve measured physical capacity better than men over time. Self-reported and measured indicators should be viewed as complementary rather than interchangeable assessments of functional status for both clinical and research purposes, especially for sex-specific comparisons.
80. Larocque SC, Kerstetter JE, Cauley JA, Insogna KL, Ensrud K, Lui LY, Allore HG. Dietary Protein and Vitamin D Intake and Risk of Falls: A Secondary Analysis of Postmenopausal Women from the Study of Osteoporotic Fractures. J Nutr Gerontol Geriatr 2016; 34:305-18. [PMID: 26267443] [DOI: 10.1080/21551197.2015.1054574]
Abstract
More than 90% of hip fractures in older Americans result from a fall. Inadequate intake of dietary protein and vitamin D is common in older adults, and diets low in these nutrients could contribute to loss of muscle mass and strength or coordination, in turn increasing the risk of falling. The objective of the study was to evaluate the relationship between protein and vitamin D intake and the occurrence of falls in older women in the Study of Osteoporotic Fractures, a prospective cohort of more than 4000 postmenopausal women participating from January 1997 to September 1998. Incident falls were ascertained for one year. Protein and vitamin D intake was assessed by a food frequency questionnaire; associations with a reported fall were estimated with logistic regression, adjusted for fall-related covariates and energy. Protein and vitamin D were modeled separately because of high correlation (rho = 0.55, P < 0.001). A total of 1429 women reported a fall within one year. In separate, unadjusted models, dietary protein (per 1 g/kg increase) and vitamin D (per 100 International Units (IU) increase) significantly increased the odds ratio (OR) of falling (OR 1.35, 95% CI 1.15-1.59; OR 1.11, 95% CI 1.03-1.19, respectively). Once fall-related covariates were added to each model, dietary protein and vitamin D were noncontributory to falls. While we could find no direct association between vitamin D and protein intake and fall prevention, adequate intake of these two nutrients is critical for musculoskeletal health in older adults.
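The unadjusted odds ratios above come from logistic regression on continuous intakes, but the underlying calculation is easiest to see for a dichotomized exposure in a 2x2 table. A minimal Python sketch using Woolf's method for the confidence interval; the faller total echoes the abstract (1429 of ~4000 women), but the exposure split is entirely hypothetical:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio and Woolf confidence interval from a 2x2 table:
    a = fallers exposed, b = non-fallers exposed,
    c = fallers unexposed, d = non-fallers unexposed."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log odds ratio
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical split of 1429 fallers / 2571 non-fallers by high vs. low protein intake
or_, lo, hi = odds_ratio_ci(800, 1400, 629, 1171)
```

With a continuous exposure such as grams per kilogram, the study would instead exponentiate a logistic regression coefficient; the 2x2 version here is only the dichotomized special case.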
81. Ng R, Allore HG, Monin JK, Levy BR. Retirement as Meaningful: Positive Retirement Stereotypes Associated with Longevity. The Journal of Social Issues 2016; 72:69-85. [PMID: 27346893] [PMCID: PMC4920366] [DOI: 10.1111/josi.12156]
Abstract
Studies examining the association between retirement and health have produced mixed results. This may be due to previous studies treating retirement as merely a change in job status rather than a transition associated with stereotypes or societal beliefs (e.g., retirement is a time of mental decline or retirement is a time of growth). To examine whether these stereotypes are associated with health, we studied retirement stereotypes and survival over a 23-year period among 1,011 older adults. As predicted by stereotype embodiment theory, it was found that positive stereotypes about physical health during retirement showed a survival advantage of 4.5 years (hazard ratio = 0.88, p = .022) and positive stereotypes about mental health during retirement tended to show a survival advantage of 2.5 years (hazard ratio = 0.87, p = .034). Models adjusted for relevant covariates such as age, gender, race, employment status, functional health, and self-rated health. These results suggest that retirement preparation could benefit from considering retirement stereotypes.
82. Buurman BM, Parlevliet JL, Allore HG, Blok W, van Deelen BAJ, Moll van Charante EP, de Haan RJ, de Rooij SE. Comprehensive Geriatric Assessment and Transitional Care in Acutely Hospitalized Patients: The Transitional Care Bridge Randomized Clinical Trial. JAMA Intern Med 2016; 176:302-9. [PMID: 26882111] [DOI: 10.1001/jamainternmed.2015.8042]
Abstract
IMPORTANCE Older adults acutely hospitalized are at risk of disability. Trials on comprehensive geriatric assessment (CGA) and transitional care present inconsistent results. OBJECTIVE To test whether an intervention of systematic CGA, followed by the transitional care bridge program, improved activities of daily living (ADLs) compared with systematic CGA alone. DESIGN, SETTING, AND PARTICIPANTS This study was a double-blind, multicenter, randomized clinical trial conducted at 3 hospitals with affiliated home care organizations in the Netherlands between September 1, 2010, and March 1, 2014. In total, 1070 consecutive patients were eligible, 674 (63.0%) of whom enrolled. They were 65 years or older, acutely hospitalized to a medical ward for at least 48 hours with an Identification of Seniors at Risk-Hospitalized Patients score of 2 or higher, and randomized using permuted blocks stratified by study site and Mini-Mental State Examination score (<24 vs ≥24). The dates of the analysis were June 1, 2014, to November 15, 2014. INTERVENTIONS The transitional care bridge program intervention was started during hospitalization by a visit from a community care registered nurse (CCRN) and continued after discharge with home visits at 2 days and at 2, 6, 12, and 24 weeks. The CCRNs applied the CGA care and treatment plan. MAIN OUTCOMES AND MEASURES The main outcome was the Katz Index of ADL at 6 months compared with 2 weeks before admission. Secondary outcomes were mortality, cognitive functioning, time to hospital readmission, and the time to discharge from a nursing home. RESULTS The study cohort comprised 674 participants. Their mean age was 80 years, 42.1% (n = 284) were male, and 39.2% (n = 264) were cognitively impaired at admission. Intent-to-treat analysis found no differences in the mean Katz Index of ADL at 6 months between the intervention arm (mean, 2.0; 95% CI, 1.8-2.2) and the CGA-only arm (mean, 1.9; 95% CI, 1.7-2.2). 
For secondary outcomes, there were 85 deaths (25.2%) in the intervention arm and 104 deaths (30.9%) in the CGA-only arm, resulting in a lower risk of death within 6 months of hospital admission (hazard ratio, 0.75; 95% CI, 0.56-0.99; P = .045; number needed to treat to prevent 1 death, 16). No other secondary outcome was significant. CONCLUSIONS AND RELEVANCE A systematic CGA, followed by the transitional care bridge program, showed no effect on ADL functioning in acutely hospitalized older patients. TRIAL REGISTRATION Netherlands Trial Registry: NTR2384.
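The trial's allocation scheme (permuted blocks stratified by study site and MMSE category) can be sketched as follows; the block size, stratum labels, and per-stratum counts are illustrative assumptions, not details reported by the authors:

```python
import random

def permuted_block_randomization(n, block_size=4, arms=("intervention", "CGA-only"), seed=0):
    """Assign n participants within one stratum using permuted blocks:
    each block holds equal numbers of every arm, shuffled into random order."""
    rng = random.Random(seed)
    per_arm = block_size // len(arms)
    assignments = []
    while len(assignments) < n:
        block = [arm for arm in arms for _ in range(per_arm)]
        rng.shuffle(block)  # permute within the block
        assignments.extend(block)
    return assignments[:n]

# One assignment list per stratum (site x MMSE category); 20 participants each, hypothetically
strata = {}
for i, stratum in enumerate([(site, mmse) for site in ("A", "B", "C") for mmse in ("<24", ">=24")]):
    strata[stratum] = permuted_block_randomization(20, seed=i)
```

Because each stratum's count here is a multiple of the block size, the arms stay exactly balanced within every site-by-MMSE stratum, which is the point of blocking.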
83. Murphy TE, Allore HG, Han L, Peduzzi PN, Gill TM, Xu X, Lin H. A longitudinal, observational study with many repeated measures demonstrated improved precision of individual survival curves using Bayesian joint modeling of disability and survival. Exp Aging Res 2016; 41:221-39. [PMID: 25978444] [DOI: 10.1080/0361073x.2015.1021640]
Abstract
BACKGROUND/STUDY CONTEXT: It has not been previously demonstrated whether Bayesian joint modeling (BJM) of disability and survival can, under certain conditions, improve precision of individual survival curves. METHODS A longitudinal, observational study wherein 754 initially nondisabled community-dwelling adults in greater New Haven, Connecticut, were observed on a monthly basis for over 10 years. RESULTS In this study, BJM exploited many monthly observations to demonstrate, relative to a separate survival model with adjustment, improved precision of individual survival curves, permitting detection of significant differences between survival curves of two similar individuals. The gain in precision was lost when using only those observations from intervals of 6, 9, or 12 months. CONCLUSION When there are many repeated measures, BJM of longitudinal functional disability and interval-censored survival can potentially increase the precision of individual survival curves relative to those from a separate survival model. This may facilitate the identification of significant differences between individual survival curves, a useful result usually precluded by the large variability inherent to individual-level estimates from stand-alone survival models.
84. Esserman D, Allore HG, Travison TG. The Method of Randomization for Cluster-Randomized Trials: Challenges of Including Patients with Multiple Chronic Conditions. Int J Stat Med Res 2016; 5:2-7. [PMID: 27478520] [PMCID: PMC4963011] [DOI: 10.6000/1929-6029.2016.05.01.1]
Abstract
Cluster-randomized clinical trials (CRT) are trials in which the unit of randomization is not a participant but a group (e.g. healthcare systems or community centers). They are suitable when the intervention applies naturally to the cluster (e.g. healthcare policy); when lack of independence among participants may occur (e.g. nursing home hygiene); or when it is most ethical to apply an intervention to all within a group (e.g. school-level immunization). Because participants in the same cluster receive the same intervention, CRT may approximate clinical practice, and may produce generalizable findings. However, when not properly designed or interpreted, CRT may yield biased results. CRT designs have features that add complexity to statistical estimation and inference. Chief among these is the cluster-level correlation in response measurements induced by the randomization. A critical consideration is the experimental unit of inference; often it is desirable to consider intervention effects at the level of the individual rather than the cluster. Finally, given that the number of clusters available may be limited, simple forms of randomization may not achieve balance between intervention and control arms at either the cluster- or participant-level. In non-clustered clinical trials, balance of key factors may be easier to achieve because the sample can be homogenous by exclusion of participants with multiple chronic conditions (MCC). CRTs, which are often pragmatic, may eschew such restrictions. Failure to account for imbalance may induce bias and reduce validity. This article focuses on the complexities of randomization in the design of CRTs, such as the inclusion of patients with MCC, and imbalances in covariate factors across clusters.
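One remedy the CRT literature offers for the balance problem described above is covariate-constrained randomization: enumerate candidate allocations of clusters to arms and randomize only among the well-balanced ones. A minimal sketch with hypothetical clinic names and sizes, balancing on a single covariate (total cluster size, standing in for, say, % of patients with MCC):

```python
from itertools import combinations

def constrained_allocation(cluster_sizes, n_arm):
    """Return the arm-A cluster set minimizing imbalance in the summed
    covariate (here cluster size) between the two arms, plus that imbalance."""
    clusters = list(cluster_sizes)
    total = sum(cluster_sizes.values())
    best, best_gap = None, float("inf")
    for arm_a in combinations(clusters, n_arm):
        size_a = sum(cluster_sizes[c] for c in arm_a)
        gap = abs(size_a - (total - size_a))  # arm A vs. arm B imbalance
        if gap < best_gap:
            best, best_gap = set(arm_a), gap
    return best, best_gap

# Hypothetical clinics and enrollment counts
sizes = {"clinic1": 120, "clinic2": 95, "clinic3": 110,
         "clinic4": 80, "clinic5": 130, "clinic6": 105}
arm_a, gap = constrained_allocation(sizes, 3)
```

A real implementation would collect all allocations below an imbalance threshold and draw one at random, so the design retains a valid randomization distribution rather than a deterministic assignment.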
85. Allore HG, Zhan Y, Cohen AB, Tinetti ME, Trentalange M, McAvay G. Methodology to Estimate the Longitudinal Average Attributable Fraction of Guideline-recommended Medications for Death in Older Adults With Multiple Chronic Conditions. J Gerontol A Biol Sci Med Sci 2016; 71:1113-6. [PMID: 26748093] [DOI: 10.1093/gerona/glv223]
Abstract
BACKGROUND Persons with multiple chronic conditions receive multiple guideline-recommended medications to improve outcomes such as mortality. Our objective was to estimate the longitudinal average attributable fraction for 3-year survival of medications for cardiovascular conditions in persons with multiple chronic conditions and to determine whether heterogeneity occurred by age. METHODS Medicare Current Beneficiary Survey participants (N = 8,578) with two or more chronic conditions, enrolled from 2005 to 2009 with follow-up through 2011, were analyzed. We calculated the longitudinal extension of the average attributable fraction for oral medications (beta blockers, renin-angiotensin system blockers, and thiazide diuretics) indicated for cardiovascular conditions (atrial fibrillation, coronary artery disease, heart failure, and hypertension) on survival, adjusted for 18 participant characteristics. Models stratified by age (≤80 and >80 years) were analyzed to determine heterogeneity of both cardiovascular conditions and medications. RESULTS Heart failure had the greatest average attributable fraction (39%) for mortality. The fractional contributions of beta blockers, renin-angiotensin system blockers, and thiazides to improved survival were 10.4%, 9.3%, and 7.2%, respectively. In age-stratified models, thiazides contributed significantly to survival only for those aged 80 years or younger. The effects of the remaining medications were similar in both age strata. CONCLUSIONS Most cardiovascular medications contributed independently to survival. The two cardiovascular conditions contributing independently to death were heart failure and atrial fibrillation. The medication effects were similar by age except for thiazides, which contributed significantly to survival only in persons aged 80 years or younger.
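For orientation, the classical cross-sectional population attributable fraction of Levin, which the longitudinal average attributable fraction generalizes, is a one-line formula; the prevalence and risk ratio below are hypothetical:

```python
def levin_paf(prevalence, relative_risk):
    """Levin's population attributable fraction: the share of outcome
    incidence attributable to an exposure with prevalence p and risk ratio RR."""
    p, rr = prevalence, relative_risk
    return p * (rr - 1) / (1 + p * (rr - 1))

# Hypothetical: a condition present in 30% of the cohort with RR 2.0 for death
paf = levin_paf(0.30, 2.0)  # -> 0.3/1.3, about 0.23
```

The method in the paper differs in that it averages attributable fractions over orderings of multiple conditions and medications and follows subjects longitudinally; Levin's formula is shown only as the single-factor, single-time baseline.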
86. Han L, Pisani MA, Araujo KLB, Allore HG. Use of Self-Matching to Control for Stable Patient Characteristics While Addressing Time-Varying Confounding on Treatment Effect: A Case Study of Older Intensive Care Patients. Int J Stat Med Res 2016; 5:8-16. [PMID: 27123153] [PMCID: PMC4844076] [DOI: 10.6000/1929-6029.2016.05.01.2]
Abstract
Exposure-crossover design offers a non-experimental option to control for stable baseline confounding through self-matching while examining the causal effect of an exposure on an acute outcome. This study extends this approach to longitudinal data with repeated measures of exposure and outcome using data from a cohort of 340 older medical patients in an intensive care unit (ICU). The analytic sample included 92 patients who received ≥1 dose of haloperidol, an antipsychotic medication often used for patients with delirium. Exposure-crossover design was implemented by sampling the 3-day time segments prior (Induction) and posterior (Subsequent) to each treatment episode of receiving haloperidol. In the full cohort, there was a trend of increasing delirium severity scores (Mean±SD: 4.4±1.7) over the course of the ICU stay. After exposure-crossover sampling, the delirium severity score decreased from the Induction (4.9) to the Subsequent (4.1) intervals, with the treatment episode falling in-between (4.5). Based on a GEE Poisson model accounting for self-matching and within-subject correlation, the unadjusted mean delirium severity score was 0.55 points lower (95% CI: 0.01-1.10) for the Subsequent than for the Induction intervals. The association diminished by 32% (−0.38, 95% CI: −0.99, 0.24) after adjusting only for ICU confounding, while being slightly increased by 7% (−0.60, 95% CI: −1.15, −0.04) when adjusting only for baseline characteristics. These results suggest that longitudinal exposure-crossover design is feasible and capable of partially removing stable baseline confounding through self-matching. Loss of power due to eliminating treatment-irrelevant person-time and uncertainty around allocating person-time to comparison intervals remain methodological challenges.
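The exposure-crossover sampling step, taking 3-day windows on either side of each treatment episode, can be sketched directly; the daily severity scores and treatment days below are invented for illustration:

```python
from statistics import mean

def crossover_windows(scores, treatment_days, width=3):
    """For each treatment day, sample the `width`-day windows immediately
    before (Induction) and after (Subsequent). `scores` is indexed by ICU day."""
    induction, subsequent = [], []
    for day in treatment_days:
        induction.extend(scores[max(0, day - width):day])
        subsequent.extend(scores[day + 1:day + 1 + width])
    return mean(induction), mean(subsequent)

# Hypothetical daily delirium severity for one patient; haloperidol on days 4 and 9
daily = [3, 4, 4, 5, 5, 4, 4, 3, 5, 5, 4, 3, 3]
ind, sub = crossover_windows(daily, [4, 9])
```

Overlapping episodes and the allocation of person-time between closely spaced treatment days, which the authors flag as a methodological challenge, are deliberately not handled in this sketch.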
87. Buurman BM, Han L, Murphy TE, Gahbauer EA, Leo-Summers L, Allore HG, Gill TM. Trajectories of Disability Among Older Persons Before and After a Hospitalization Leading to a Skilled Nursing Facility Admission. J Am Med Dir Assoc 2015; 17:225-31. [PMID: 26620073] [DOI: 10.1016/j.jamda.2015.10.010]
Abstract
OBJECTIVES To identify distinct sets of disability trajectories in the year before and after a Medicare qualifying skilled nursing facility (Q-SNF) admission, evaluate the associations between the pre- and post-Q-SNF disability trajectories, and determine short-term outcomes (readmission, mortality). DESIGN, SETTING, AND PARTICIPANTS Prospective cohort study including 754 community-dwelling older persons, 70+ years, and initially nondisabled in their basic activities of daily living. The analytic sample included 394 persons, with a first hospitalization followed by a Q-SNF admission between 1998 and 2012. MAIN OUTCOMES AND MEASURES Disability in the year before and after a Q-SNF admission using 13 basic, instrumental, and mobility activities. Secondary outcomes included 30-day readmission and 12-month mortality. RESULTS The mean (SD) age of the sample was 84.9 (5.5) years. We identified 3 disability trajectories in the year before a Q-SNF admission: minimal disability (37.3% of participants), mild disability (44.6%), and moderate disability (18.2%). In the year after a Q-SNF admission, all participants started with moderate to severe disability scores. Three disability trajectories were identified: substantial improvement (26.0% of participants), minimal improvement (36.5%), and no improvement (37.5%). Among participants with minimal disability pre-Q-SNF, 52% demonstrated substantial improvement; the other 48% demonstrated minimal improvement (32%) or no improvement (16%) and remained moderately to severely disabled in the year post-Q-SNF. Among participants with mild disability pre-Q-SNF, 5% showed substantial improvement, whereas 95% showed little to no improvement. Of participants with moderate disability pre-Q-SNF, 15% remained moderately disabled showing little improvement, whereas 85% showed no improvement.
Participants who transitioned from minimal disability pre-Q-SNF to no improvement post-Q-SNF had the highest rates of 30-day readmission and 12-month mortality (rate/100 person-days 1.3 [95% CI 0.6-2.8] and 0.3 [95% CI 0.15-0.45], respectively). CONCLUSIONS Among older persons, distinct disability trajectories were observed in the year before and after a Q-SNF admission. The likelihood of improvement in disability was greatly constrained by the pre-Q-SNF disability trajectory. Most older persons remained moderately to severely disabled in the year following a Q-SNF admission.
88. Han L, Gill TM, Jones BL, Allore HG. Cognitive Aging Trajectories and Burdens of Disability, Hospitalization and Nursing Home Admission Among Community-living Older Persons. J Gerontol A Biol Sci Med Sci 2015; 71:766-71. [PMID: 26511011] [DOI: 10.1093/gerona/glv159]
Abstract
BACKGROUND The course of cognitive aging has demonstrated substantial heterogeneity. This study attempted to identify distinctive cognitive trajectories and examine their relationship with burdens of disability, hospitalization, and nursing home admission. METHODS Seven hundred and fifty-four community-living persons aged 70 years or older in the Yale Precipitating Events Project were assessed with the Mini-Mental State Examination every 18 months for up to 108 months. A group-based trajectory model was used to determine cognitive aging trajectories while adjusting for age, sex, and education. Cumulative burden of disabilities, hospitalizations, and nursing home admissions over 141 months associated with the cognitive trajectories were evaluated using a generalized estimating equation Poisson model. RESULTS Five distinct cognitive trajectories were identified, with about a third of participants starting with high baseline cognitive function and demonstrating No decline during the follow-up period. The remaining participants diverged with Minimal (prevalence 41%), Moderate (16%), Progressive (8%), and Rapid (3%) cognitive decline. Participants with No decline incurred the lowest incidence rates (per 1,000 person-months) of disability in activities of daily living (ADL; 75, 95% confidence intervals: 60-95) and instrumental ADL (492, 453-535), hospitalization (29, 26-33) and nursing home admission (18, 12-27), whereas participants on the Rapid trajectory experienced the greatest burden of ADL disability (612, 595-758) and those on the Progressive trajectory had the highest nursing home admission (363, 292-451). CONCLUSIONS Community-living older persons follow distinct cognitive aging trajectories and experience increasing burdens of disability, hospitalization, and nursing home placement as they age, with greater burdens for those on a declining cognitive trajectory.
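The burden measures reported above are crude incidence rates per 1,000 person-months; the calculation itself is elementary. A sketch with invented event counts and follow-up times (the confidence intervals in the abstract come from the GEE Poisson model, not from this crude formula):

```python
def incidence_rate(events, person_months, per=1000):
    """Crude incidence rate per `per` person-months of follow-up."""
    return events * per / person_months

# Hypothetical: hospitalizations in two cognitive trajectory groups
no_decline = incidence_rate(45, 1500)    # 30.0 per 1,000 person-months
rapid_decline = incidence_rate(36, 400)  # 90.0 per 1,000 person-months
rate_ratio = rapid_decline / no_decline  # 3.0
```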
89. Tinetti ME, McAvay G, Trentalange M, Cohen AB, Allore HG. Association between guideline recommended drugs and death in older adults with multiple chronic conditions: population based cohort study. BMJ 2015; 351:h4984. [PMID: 26432468] [PMCID: PMC4591503] [DOI: 10.1136/bmj.h4984]
Abstract
OBJECTIVE To estimate the association between guideline recommended drugs and death in older adults with multiple chronic conditions. DESIGN Population based cohort study. SETTING Medicare Current Beneficiary Survey cohort, a nationally representative sample of Americans aged 65 years or more. PARTICIPANTS 8578 older adults with two or more study chronic conditions (atrial fibrillation, coronary artery disease, chronic kidney disease, depression, diabetes, heart failure, hyperlipidemia, hypertension, and thromboembolic disease), followed through 2011. EXPOSURES Drugs included β blockers, calcium channel blockers, clopidogrel, metformin, renin-angiotensin system (RAS) blockers; selective serotonin reuptake inhibitors (SSRIs) and serotonin norepinephrine reuptake inhibitors (SNRIs); statins; thiazides; and warfarin. MAIN OUTCOME MEASURE Adjusted hazard ratios for death among participants with a condition and taking a guideline recommended drug relative to participants with the condition not taking the drug and among participants with the most common combinations of four conditions. RESULTS Over 50% of participants with each condition received the recommended drugs regardless of coexisting conditions; 1287/8578 (15%) participants died during the three years of follow-up. Among cardiovascular drugs, β blockers, calcium channel blockers, RAS blockers, and statins were associated with reduced mortality for indicated conditions. For example, the adjusted hazard ratio for β blockers was 0.59 (95% confidence interval 0.48 to 0.72) for people with atrial fibrillation and 0.68 (0.57 to 0.81) for those with heart failure. The adjusted hazard ratios for cardiovascular drugs were similar to those with common combinations of four coexisting conditions, with trends toward variable effects for β blockers. None of clopidogrel, metformin, or SSRIs/SNRIs was associated with reduced mortality. 
Warfarin was associated with a reduced risk of death among those with atrial fibrillation (adjusted hazard ratio 0.69, 95% confidence interval 0.56 to 0.85) and thromboembolic disease (0.44, 0.30 to 0.62). Attenuation in the association with reduced risk of death was found with warfarin in participants with some combinations of coexisting conditions. CONCLUSIONS Average effects on survival, particularly for cardiovascular study drugs, were comparable to those reported in randomized controlled trials but varied for some drugs according to coexisting conditions. Determining treatment effects in combinations of conditions may guide prescribing in people with multiple chronic conditions.
90. Feder SL, Schulman-Green D, Geda M, Williams K, Dodson JA, Nanna MG, Allore HG, Murphy TE, Tinetti ME, Gill TM, Chaudhry SI. Physicians' perceptions of the Thrombolysis in Myocardial Infarction (TIMI) risk score in older adults with acute myocardial infarction. Heart Lung 2015; 44:376-81. [PMID: 26164651] [PMCID: PMC4567390] [DOI: 10.1016/j.hrtlng.2015.05.005]
Abstract
OBJECTIVES To evaluate physician-perceived strengths and limitations of the Thrombolysis in Myocardial Infarction (TIMI) risk scores for use in older adults with acute myocardial infarction (AMI). BACKGROUND The TIMI risk scores are risk stratification models developed to estimate mortality risk for patients hospitalized for AMI. However, these models were developed and validated in cohorts underrepresenting older adults (≥75 years). METHODS Qualitative study using semi-structured telephone interviews and the constant comparative method for analysis. RESULTS Twenty-two physicians completed interviews ranging from 10 to 30 minutes (mean, 18 minutes). Median sample age was 37 years, with a median of 11.5 years of clinical experience. Perceived strengths of the TIMI models included familiarity, ease of use, and validation; perceived limitations included the lack of risk factors relevant to older adults and concerns about model scope and influence. CONCLUSIONS Physicians report that the TIMI models, while widely used in clinical practice, have limitations when applied to older adults. New risk models are needed to guide AMI treatment in this population.
91
Gill TM, Allore HG, Gahbauer EA, Han L. Establishing a Hierarchy for the Two Components of Restricted Activity. J Gerontol A Biol Sci Med Sci 2015; 70:892-8. [PMID: 25391532 PMCID: PMC4481688 DOI: 10.1093/gerona/glu203] [Citation(s) in RCA: 10] [Impact Index Per Article: 1.1] [Received: 04/10/2014] [Accepted: 10/01/2014] [Indexed: 11/13/2022]
Abstract
BACKGROUND Increasing evidence suggests that illnesses and injuries leading to restricted activity have adverse functional consequences, but whether the two components of restricted activity have comparable effects is unknown. We evaluated whether an illness/injury leading to bed rest represents a more potent exposure than one leading to cutting down on one's usual activities without bed rest. METHODS We prospectively evaluated 754 community-living persons, 70+ years. Telephone interviews were completed monthly for >15 years to assess disability in four basic, five instrumental, and four mobility activities and to ascertain exposure to illnesses/injuries leading to cut down activities and bed rest, respectively. For each of the three functional domains, transitions between no disability, mild disability, and severe disability were evaluated each month. RESULTS For each domain, cut down activities and bed rest were significantly associated with at least one transition. The associations were consistently stronger, however, for bed rest than for cut down activities. Bed rest was a particularly potent exposure for transitions from no disability to severe disability, with hazard ratios as high as 8.94 (95% CI, 5.69-14.1) for the mobility activities, and for all transitions from severe disability (representing recovery), with hazard ratios as low as 0.25 (0.12-0.54) for the transition to no disability for the basic activities. CONCLUSIONS In the setting of an illness/injury, bed rest was more strongly associated with a set of clinically meaningful transitions in functional status than cut down activities. Prompt medical attention may be warranted when an older person takes to bed because of an illness/injury.
92
Feder SL, Schulman-Green D, Dodson JA, Geda M, Williams K, Nanna MG, Allore HG, Murphy TE, Tinetti ME, Gill TM, Chaudhry SI. Risk Stratification in Older Patients With Acute Myocardial Infarction: Physicians' Perspectives. J Aging Health 2015; 28:387-402. [PMID: 26100619 DOI: 10.1177/0898264315591005] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.3] [Indexed: 11/16/2022]
Abstract
OBJECTIVE Risk stratification models support clinical decision making in acute myocardial infarction (AMI) care. Existing models were developed using data from younger populations, potentially limiting their accuracy and relevance in older adults. We describe physician-perceived risk factors, views of existing models, and preferences for future model development in older adults. METHOD Qualitative study using semi-structured telephone interviews and the constant comparative method. RESULTS Twenty-two physicians from 14 institutions completed the interviews. Median age was 37 years, and median clinical experience was 11.5 years. Perceived predictors included cardiovascular, comorbid, functional, and social risk factors. Physicians viewed models as easy to use, yet neither inclusive of risk factors nor predictive of non-mortality outcomes germane to clinical decision making in older adults. Ideal models included multidimensional risk domains and operational requirements. DISCUSSION Physicians reported limitations of available risk models when applied to older adults with AMI. New models are needed to guide AMI treatment in this population.
93
Gill TM, Gahbauer EA, Han L, Allore HG. The role of intervening hospital admissions on trajectories of disability in the last year of life: prospective cohort study of older people. BMJ 2015; 350:h2361. [PMID: 25995357 PMCID: PMC4443433 DOI: 10.1136/bmj.h2361] [Citation(s) in RCA: 72] [Impact Index Per Article: 8.0] [Indexed: 11/08/2022]
Abstract
OBJECTIVE To evaluate the role of intervening hospital admissions on trajectories of disability in the last year of life. DESIGN Prospective cohort study. SETTING Greater New Haven, Connecticut, United States, from March 1998 to June 2013. PARTICIPANTS 552 decedents from a cohort of 754 community-living people, aged 70 years or older, who were initially non-disabled in four essential activities of daily living: bathing, dressing, walking, and transferring. MAIN OUTCOME MEASURE Occurrence of admissions to hospital and severity of disability (range 0-4), ascertained during monthly interviews for more than 15 years. RESULTS In the last year of life, six distinct trajectories of disability were identified, from least disabled to most disabled: 95 participants (17.2%) had no disability, 61 (11.1%) had catastrophic disability, 53 (9.6%) had accelerated disability, 61 (11.1%) had progressively mild disability, 127 (23.0%) had progressively severe disability, and 155 (28.1%) had persistently severe disability. 392 (71.0%) participants had at least one hospital admission and 248 (44.9%) had multiple hospital admissions. For each trajectory the course of disability closely tracked the monthly prevalence of hospital admission. In a set of multivariable models that included several potential confounders, hospital admission in a given month had a strong independent effect on the severity of disability, in both relative and absolute terms. The largest absolute effect was observed for catastrophic disability, with a mean increase in disability score of 1.9 (95% confidence interval 1.5 to 2.4) in the setting of a hospital admission, corresponding to a rate ratio (or relative effect) of 2.0 (95% confidence interval 1.5 to 2.7). CONCLUSIONS In the last year of life, acute hospital admissions play an important role in the disabling process. Knowledge about the course of disability before these intervening events may facilitate clinical decision making at the end of life.
For older patients admitted to hospital with progressive or persistent levels of severe disability, representing more than half of the decedents, clinicians might consider a palliative care approach to facilitate discussions about advance care planning and to better deal with personal care needs.
94
Allore HG, Zhan Y, Tinetti M, Trentalange M, McAvay G. Longitudinal average attributable fraction as a method for studying time-varying conditions and treatments on recurrent self-rated health: the case of medications in older adults with multiple chronic conditions. Ann Epidemiol 2015; 25:681-686.e4. [PMID: 26033374 DOI: 10.1016/j.annepidem.2015.03.022] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.2] [Received: 08/18/2014] [Revised: 02/06/2015] [Accepted: 03/04/2015] [Indexed: 12/21/2022]
Abstract
PURPOSE The objective is to modify the longitudinal extension of the average attributable fraction (LE-AAF) for recurrent outcomes with time-varying exposures and control for covariates. METHODS We included Medicare Current Beneficiary Survey participants with two or more chronic conditions enrolled from 2005 to 2009 with follow-up through 2011. Nine time-varying medications indicated for nine time-varying common chronic conditions and 14 of 18 forward-selected participant characteristics were used as control variables in the generalized estimating equations step of the LE-AAF to estimate associations with the recurrent universal health outcome self-rated health (SRH). Modifications of the LE-AAF were made to accommodate these indicated medication-condition interactions and covariates. Variability was empirically estimated by bias-corrected and accelerated bootstrapping. RESULTS In the adjusted LE-AAF, thiazide, warfarin, and clopidogrel had significant contributions of 1.2%, 0.4%, and 0.2%, respectively, to low (poor or fair) SRH, whereas the other medications made no significant contributions to SRH. Hyperlipidemia significantly contributed 4.6% to high SRH. All the other conditions except atrial fibrillation contributed significantly to low SRH. CONCLUSIONS Our modifications to the LE-AAF method apply to a recurrent binary outcome with time-varying factors while accounting for covariates.
95
Allore HG, McAvay G, Tinetti M. Health Outcome Effects of Common Medications in Elders With Multiple Conditions. J Patient Cent Res Rev 2015. [DOI: 10.17294/2330-0698.1118] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Indexed: 11/04/2022]
96
Hartaigh BÓ, Allore HG, Trentalange M, McAvay G, Pilz S, Dodson JA, Gill TM. Elevations in time-varying resting heart rate predict subsequent all-cause mortality in older adults. Eur J Prev Cardiol 2015; 22:527-34. [PMID: 24445263 PMCID: PMC4156557 DOI: 10.1177/2047487313519932] [Citation(s) in RCA: 18] [Impact Index Per Article: 2.0] [Indexed: 01/26/2023]
Abstract
BACKGROUND An increased resting heart rate (RHR) has long been associated with poor health. Nevertheless, it remains uncertain whether time-varying measurements of RHR are predictive of mortality in older persons. DESIGN The purpose of this study was to assess the relationship between repeated measurements of RHR and risk of death from all causes among older adults. METHODS We evaluated repeat measurements of resting heart rate among 5691 men and women (aged 65 years or older) enrolled in the Cardiovascular Health Study. RHR was measured annually for six consecutive years by validated electrocardiogram. All-cause mortality was confirmed by a study-wide Mortality Review Committee using reviews of obituaries, death certificates and hospital records, and interviews with attending physicians and next of kin. RESULTS Of the study cohort, 974 (17.1%) participants died. Each 10 beat/min increment in RHR increased the risk of death by 33% (adjusted hazard ratio, 95% confidence interval (CI) = 1.33, 1.26-1.40). Similar results were observed (adjusted hazard ratio, 95% CI = 2.21, 1.88-2.59) when comparing the upper-most quartile of RHR (mean = 81 beats/min) with the lowest (mean = 53 beats/min). Compared with participants whose RHR was consistently ≤65 beats/min during the study period, the risk of death increased monotonically for each 10 beat/min (consistent) increment in RHR, with adjusted hazard ratios (95% CI) ranging from 1.30 (1.23-1.37) for 75 beats/min to 4.78 (3.49-6.52) for 125 beats/min. CONCLUSIONS Elevations in the RHR over the course of six years are associated with an increased risk of all-cause mortality among older adults.
97
Gill TM, Allore HG, Gahbauer EA, Murphy TE. The role of intervening illnesses and injuries in prolonging the disabling process. J Am Geriatr Soc 2015; 63:447-52. [PMID: 25735396 DOI: 10.1111/jgs.13319] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.7] [Indexed: 12/01/2022]
Abstract
OBJECTIVES To evaluate the relationship between intervening illnesses and injuries leading to hospitalization and restricted activity, respectively, and prolongation of disability in four essential activities of daily living in newly disabled older persons. DESIGN Prospective cohort study. SETTING Greater New Haven, Connecticut. PARTICIPANTS Community-living persons aged 70 and older who had at least one episode of disability from March 1998 to June 2013 (N=632). MEASUREMENTS Disability and exposure to intervening illnesses and injuries leading to hospitalization and restricted activity, respectively, were assessed every month. Prolongation of disability was operationalized in two complementary ways: as a dichotomous outcome, based on the persistence of any disability, and as a count of the number of disabled activities. RESULTS During a median follow-up of 114 months, the 632 participants experienced 2,764 disability episodes. The mean exposure rates for hospitalization and restricted activity were 80.7 (95% confidence interval (CI)=73.7-88.4) and 173.6 (95% CI=162.5-185.5), respectively, per 1,000 person-months. After adjustment for multiple disability risk factors, the likelihood of disability prolongation was 2.5 times as great (odds ratio (OR) 2.54, 95% CI=2.05-3.15) for hospitalization and 1.2 times as great (1.21, 95% CI=1.06-1.40) for restricted activity as for no hospitalization or restricted activity, and the mean number of disabilities was 35% (risk ratio (RR)=1.35, 95% CI=1.30-1.39) greater in the setting of hospitalization and 7% (1.07, 95% CI=1.05-1.09) greater in the setting of restricted activity. CONCLUSION Intervening illnesses and injuries leading to hospitalization and restricted activity, respectively, are strongly associated with prolongation of disability in newly disabled older adults. Efforts to prevent and more aggressively manage these intervening events have the potential to break the cycle of disability in older persons.
98
Fodeh SJ, Trentalange M, Allore HG, Gill TM, Brandt CA, Murphy TE. Baseline cluster membership demonstrates positive associations with first occurrence of multiple gerontologic outcomes over 10 years. Exp Aging Res 2015; 41:177-92. [PMID: 25724015 DOI: 10.1080/0361073x.2015.1001655] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.2] [Indexed: 10/23/2022]
Abstract
BACKGROUND/STUDY CONTEXT: The potential of cluster analysis (CA) as a baseline predictor of multivariate gerontologic outcomes over a long period of time has not been previously demonstrated. METHODS Restricting candidate variables to a small group of established predictors of deleterious gerontologic outcomes, various CA methods were applied to baseline values from 754 nondisabled, community-living persons, aged 70 years or older. The best cluster solution yielded at baseline was subsequently used as a fixed explanatory variable in time-to-event models of the first occurrence of the following outcomes: any disability in four activities of daily living, any disability in four mobility measures, and death. Each outcome was recorded through a maximum of 129 months or death. Associations between baseline ordinal cluster level and first occurrence of all three outcomes were modeled over a 10-year period with proportional hazards regression and compared with the associations yielded by the analogous latent class analysis (LCA) solution. RESULTS The final cluster-defining variables were continuous measures of cognitive status and depressive symptoms, and dichotomous indicators of slow gait and exhaustion. The best solution yielded by baseline values of these variables was obtained with a K-means algorithm and cosine similarity and consisted of three clusters representing increasing levels of impairment. After adjustment for age, sex, ethnic group, and number of chronic conditions, baseline ordinal cluster level demonstrated significantly positive associations with all three outcomes over the 10-year period that were equivalent to those from the corresponding LCA solution. CONCLUSION These findings suggest that baseline clusters based on previously established explanatory variables have the potential to predict multivariate gerontologic outcomes over a long period of time.
99
Ng R, Allore HG, Trentalange M, Monin JK, Levy BR. Increasing negativity of age stereotypes across 200 years: evidence from a database of 400 million words. PLoS One 2015; 10:e0117086. [PMID: 25675438 PMCID: PMC4326131 DOI: 10.1371/journal.pone.0117086] [Citation(s) in RCA: 104] [Impact Index Per Article: 11.6] [Received: 06/26/2014] [Accepted: 12/17/2014] [Indexed: 11/18/2022]
Abstract
Scholars argue about whether age stereotypes (beliefs about old people) are becoming more negative or positive over time. No previous study has systematically tested the trend of age stereotypes over more than 20 years, owing to a lack of suitable data. Our aim was to fill this gap by investigating whether age stereotypes have changed over the last two centuries and, if so, what may be associated with this change. We hypothesized that age stereotypes have increased in negativity due, in part, to the increasing medicalization of aging. This study applied computational linguistics to the recently compiled Corpus of Historical American English (COHA), a database of 400 million words that includes a range of printed sources from 1810 to 2009. After generating a comprehensive list of synonyms for the term elderly for these years from two historical thesauri, we identified 100 collocates (words that co-occurred most frequently with these synonyms) for each of the 20 decades. Inclusion criteria for the collocates were: (1) appeared within four words of the elderly synonym, (2) referred to an old person, and (3) had a stronger association with the elderly synonym than other words appearing in the database for that decade. This yielded 13,100 collocates that were rated for negativity and medicalization. We found that age stereotypes have become more negative in a linear way over 200 years. In 1880, age stereotypes switched from being positive to being negative. In addition, support was found for two potential explanations: medicalization of aging and the growing proportion of the population over the age of 65 were both significantly associated with the increase in negative age stereotypes. The upward trajectory of age-stereotype negativity makes a case for remedial action on a societal level.
100
Stabenau HF, Morrison LJ, Gahbauer EA, Leo-Summers L, Allore HG, Gill TM. Functional trajectories in the year before hospice. Ann Fam Med 2015; 13:33-40. [PMID: 25583890 PMCID: PMC4291263 DOI: 10.1370/afm.1720] [Citation(s) in RCA: 15] [Impact Index Per Article: 1.7] [Indexed: 12/28/2022]
Abstract
PURPOSE We undertook a study to identify distinct functional trajectories in the year before hospice, to determine how patients with these trajectories differ according to demographic characteristics and hospice diagnosis, and to evaluate the association between these trajectories and subsequent outcomes. METHODS From an ongoing cohort study of 754 community-living persons aged 70 years or older, we evaluated data on 213 persons who were subsequently enrolled in hospice from March 1998 to December 2011. Disability in 13 basic, instrumental, and mobility activities was assessed during monthly telephone interviews through June 2012. RESULTS In the year before hospice, we identified 5 clinically distinct functional trajectories, representing worsening cumulative burden of disability: late decline (10.8%), accelerated (10.8%), moderate (21.1%), progressively severe (24.9%), and persistently severe (32.4%). Participants with a cancer diagnosis (34.7%) had the most favorable functional trajectories (ie, lowest burden of disability), whereas those with neurodegenerative disease (21.1%) had the worst. Median survival in hospice was only 14 days and did not differ significantly by functional trajectory. Compared with participants in the persistently severe trajectory, those in the moderate trajectory had the highest likelihood of surviving and being independent in at least 1 activity in the month after hospice admission (adjusted odds ratio = 5.5; 95% CI, 1.9-35.9). CONCLUSIONS The course of disability in the year before hospice differs greatly among older persons but is particularly poor among those with neurodegenerative disease. Late admission to hospice (as shown by the short survival), coupled with high levels of severe disability before hospice, highlight potential unmet palliative care needs for many older persons at the end of life.