26. Locke JE, Reed RD, Massie AB, MacLennan PA, Sawinski D, Kumar V, Snyder JJ, Carter AJ, Shelton BA, Mustian MN, Lewis CE, Segev DL. Obesity and long-term mortality risk among living kidney donors. Surgery 2019; 166:205-208. [PMID: 31072668] [DOI: 10.1016/j.surg.2019.03.016]
Abstract
BACKGROUND Body mass index of living kidney donors has increased substantially. Determining candidacy for live kidney donation among obese individuals is challenging because many donation-related risks among this subgroup remain unquantified, including even basic postdonation mortality. METHODS We used data from the Scientific Registry of Transplant Recipients linked to data from the Centers for Medicare and Medicaid Services to study long-term mortality risk associated with being obese at the time of kidney donation among 119,769 live kidney donors (1987-2013). Donors were followed for a maximum of 20 years (interquartile range 6.0-16.0). Cox proportional hazards models were used to estimate the risk of postdonation mortality by obesity status at donation. Multiple imputation accounted for missing obesity data. RESULTS Obese (body mass index ≥ 30) living kidney donors were more likely male, African American, and had higher blood pressure. The estimated risk of mortality 20 years after donation was 304.3/10,000 for obese and 208.9/10,000 for nonobese living kidney donors. Adjusting for age, sex, race/ethnicity, blood pressure, baseline estimated glomerular filtration rate, relationship to recipient, smoking, and year of donation, obese living kidney donors had a 30% increased risk of long-term mortality compared with their nonobese counterparts (adjusted hazard ratio: 1.32, 95% CI: 1.09-1.60, P = .006). The impact of obesity on mortality risk did not differ significantly by sex, race or ethnicity, biologic relationship, baseline estimated glomerular filtration rate, or among donors who did and did not develop postdonation kidney failure. CONCLUSION These findings may help to inform selection criteria and discussions with obese persons considering living kidney donation.

27. Reed RD, Sawinski D, Shelton BA, MacLennan PA, Hanaway M, Kumar V, Long D, Gaston RS, Kilgore ML, Julian BA, Lewis CE, Locke JE. Population Health, Ethnicity, and Rate of Living Donor Kidney Transplantation. Transplantation 2018; 102:2080-2087. [PMID: 29787519] [PMCID: PMC6249044] [DOI: 10.1097/tp.0000000000002286]
Abstract
BACKGROUND Living donor kidney transplantation has declined in the United States since 2004, but the relationship between population characteristics and rate of living donation is unknown. The goal of our study was to use data on general population health and socioeconomic status to investigate the association with living donation. METHODS This cross-sectional, ecological study used population health and socioeconomic status data from the CDC Behavioral Risk Factor Surveillance System to investigate the association with living donation. Transplant centers performing 10 or more kidney transplants reported to the Scientific Registry of Transplant Recipients in 2015 were included. Center rate of living donation was defined as the proportion of all kidney transplants performed at a center that were from living donors. RESULTS In a linear mixed-effects model, a composite index of health and socioeconomic status factors was negatively associated with living donation, with a rate of living donation that was on average 7.3 percentage points lower among centers in areas with more comorbid disease and poorer socioeconomic status (95% confidence interval, -12.2 to -2.3, P = 0.004). Transplant centers in areas with a higher prevalence of minorities had a rate of living donation that was 7.1 percentage points lower than centers with fewer minorities (95% confidence interval, -11.8 to -2.3, P = 0.004). CONCLUSIONS Center-level variation in living donation was associated with population characteristics and minority prevalence. Further examination of these factors in the context of patient- and center-level barriers to living donation is warranted.

28. Shelton BA, Reed RD, MacLennan PA, McWilliams D, Mustian MN, Sawinski D, Kumar V, Ong S, Locke JE. Increasing Obesity Prevalence in the United States End-Stage Renal Disease Population. Journal of Health Science & Education 2018; 2:151. [PMID: 37538870] [PMCID: PMC10398833]
Abstract
Background Among ESRD patients, obesity may improve dialysis survival but decreases likelihood of transplantation, and as such, obesity prevalence may directly affect growth of the dialysis population. Objective The objective of this study was to assess BMI trends in the ESRD population as compared to the general population. Materials and Methods Incident adult ESRD patients were identified from the United States Renal Data System from 01/01/1995-12/31/2010 (n=1,458,350). Data from the Behavioral Risk Factor Surveillance System (n=4,303,471) represented the US population. Trends in BMI and in obesity classes I (BMI of 30-34.9), II (BMI of 35-39.9), and III (BMI ≥ 40) were examined by year of dialysis initiation. Trends in BMI slope were compared between the ESRD and US populations using linear regression. Results Mean BMI of ESRD patients in 1995 was 25.2 as compared to 29.4 in 2010, a 16.7% increase, while the US population's mean BMI increased from 25.3 to 27.2, a 7.5% increase. BMI increase among the ESRD population was significantly more rapid than among the US population (β: 0.16, 95% CI: 0.14-0.18, p<0.001). Conclusions and Recommendations Mean BMI among the ESRD population is increasing more rapidly than that of the US population. Given decreased access to kidney transplantation among ESRD patients with obesity, future research should be directed at controlling healthcare expenditures by identifying strategies to address the obesity epidemic among the US ESRD population.

29. Shelton BA, Sawinski D, Linas BP, Reese PP, Mustian M, Hungerpiller M, Reed RD, MacLennan PA, Locke JE. Population level outcomes and cost-effectiveness of hepatitis C treatment pre- vs postkidney transplantation. Am J Transplant 2018; 18:2483-2495. [PMID: 30058218] [PMCID: PMC6206868] [DOI: 10.1111/ajt.15040]
Abstract
Direct-acting antivirals approved for use in patients with end-stage renal disease (ESRD) now exist. HCV-positive (HCV+) ESRD patients have the opportunity to decrease the waiting times for transplantation by accepting HCV-infected kidneys. The optimal timing for HCV treatment (pre- vs posttransplant) among kidney transplant candidates is unknown. Monte Carlo microsimulation of 100,000 candidates was used to examine the cost-effectiveness of HCV treatment pretransplant vs posttransplant by liver fibrosis stage and waiting time over a lifetime time horizon using 2 regimens approved for ESRD patients. Treatment pretransplant yielded higher quality-adjusted life years (QALYs) compared with posttransplant treatment in all subgroups except those with Meta-analysis of Histological Data in Viral Hepatitis (METAVIR) stage F0 (pretransplant: 5.7 QALYs vs posttransplant: 5.8 QALYs). However, treatment posttransplant was cost-saving due to decreased dialysis duration with the use of HCV-infected kidneys (pretransplant: $735,700 vs posttransplant: $682,400). Using a willingness-to-pay threshold of $100,000, treatment pretransplant was not cost-effective except for those with METAVIR stage F3, whose fibrosis progression was halted. If HCV+ candidates had access to HCV-infected donors and were transplanted ≥9 months sooner than HCV-negative candidates, treatment pretransplant was no longer cost-effective (incremental cost-effectiveness ratio [ICER]: $107,100). In conclusion, optimal timing of treatment depends on fibrosis stage and access to HCV+ kidneys but generally favors posttransplant HCV eradication.

30. Mustian MN, MacLennan PA, Reed RD, Shelton BA, Kumar V, Locke JE. Fasting Serum Glucose Predictive of Approval for Living Kidney Donation among Obese Donor Candidates. J Am Coll Surg 2018. [DOI: 10.1016/j.jamcollsurg.2018.07.310]

31. Martins PN, Mustian MN, MacLennan PA, Ortiz JA, Akoad M, Caicedo JC, Echeverri GJ, Gray SH, Lopez-Soler RI, Gunasekaran G, Kelly B, Mobley CM, Black SM, Esquivel C, Locke JE. Impact of the new kidney allocation system A2/A2B → B policy on access to transplantation among minority candidates. Am J Transplant 2018; 18:1947-1953. [PMID: 29509285] [PMCID: PMC6105461] [DOI: 10.1111/ajt.14719]
Abstract
Blood group B candidates, many of whom represent ethnic minorities, have historically had diminished access to deceased donor kidney transplantation (DDKT). The new national kidney allocation system (KAS) preferentially allocates blood group A2/A2B deceased donor kidneys to B recipients to address this ethnic and blood group disparity. No study has yet examined the impact of KAS on A2 incompatible (A2i) DDKT for blood group B recipients overall or among minorities. A case-control study of adult blood group B DDKT recipients from 2013 to 2017 was performed, as reported to the Scientific Registry of Transplant Recipients. Cases were defined as recipients of A2/A2B kidneys, whereas controls were all remaining recipients of non-A2/A2B kidneys. A2i DDKT trends were compared from the pre-KAS (1/1/2013-12/3/2014) to the post-KAS period (12/4/2014-2/28/2017) using multivariable logistic regression. Post-KAS, there was a 4.9-fold increase in the likelihood of A2i DDKT, compared to the pre-KAS period (odds ratio [OR] 4.92, 95% confidence interval [CI] 3.67-6.60). However, compared to whites, there was no difference in the likelihood of A2i DDKT among minorities post-KAS. Although KAS resulted in increasing A2/A2B→B DDKT, the likelihood of A2i DDKT among minorities, relative to whites, was not improved. Further discussion regarding A2/A2B→B policy revisions aiming to improve DDKT access for minorities is warranted.

32. Shelton BA, Sawinski D, Ray C, Reed RD, MacLennan PA, Blackburn J, Young CJ, Gray S, Yanik M, Massie A, Segev DL, Locke JE. Decreasing deceased donor transplant rates among children (≤6 years) under the new kidney allocation system. Am J Transplant 2018; 18:1690-1698. [PMID: 29333639] [DOI: 10.1111/ajt.14663]
Abstract
The Kidney Allocation System (KAS) was implemented in December 2014 with unknown impact on the pediatric waitlist. To understand the effect of KAS on pediatric registrants, deceased donor kidney transplant (DDKT) rate was assessed using interrupted time series analysis and time-to-event analysis. Two allocation eras were defined with an intermediary washout period: Era 1 (01/01/2013-09/01/2014), Era 2 (09/01/2014-03/01/2015), and Era 3 (03/01/2015-03/01/2017). When using Cox proportional hazards, there was no significant association between allocation era and DDKT likelihood as compared to Era 1 (Era 3: aHR: 1.07, 95% CI: 0.97-1.18, P = .17). However, this was not consistent across all subgroups. Specifically, while highly sensitized pediatric registrants were consistently less likely to be transplanted than their less sensitized counterparts, this disparity was attenuated in Era 3 (Era 1 aHR: 0.04, 95% CI: 0.01-0.14, P < .001; Era 3 aHR: 0.33, 95% CI: 0.21-0.53, P < .001), whereas the youngest registrants aged 0-6 experienced a 21% decrease in DDKT likelihood in Era 3 as compared to Era 1 (aHR: 0.79, 95% CI: 0.64-0.98, P = .03). Thus, while overall DDKT likelihood remained stable with the introduction of KAS, registrants ≤6 years of age were disadvantaged, warranting further study to ensure equitable access to transplantation.

33. Locke JE, Sawinski D, Reed RD, Shelton B, MacLennan PA, Kumar V, Mehta S, Mannon RB, Gaston R, Julian BA, Carr JJ, Terry JG, Kilgore M, Massie AB, Segev DL, Lewis CE. Apolipoprotein L1 and Chronic Kidney Disease Risk in Young Potential Living Kidney Donors. Ann Surg 2018; 267:1161-1168. [PMID: 28187045] [PMCID: PMC5805656] [DOI: 10.1097/sla.0000000000002174]
Abstract
OBJECTIVE The aim of this study was to develop a novel chronic kidney disease (CKD) risk prediction tool for young potential living kidney donors. SUMMARY OF BACKGROUND DATA Living kidney donor selection practices have evolved from examining individual risk factors to a risk calculator incorporating multiple characteristics. Owing to limited long-term data and lack of genetic information, current risk tools lack precision among young potential living kidney donors, particularly African Americans (AAs). METHODS We identified a cohort of young adults (18-30 years) with no absolute contraindication to kidney donation from the longitudinal cohort study Coronary Artery Risk Development in Young Adults. Risk associations for CKD (estimated glomerular filtration rate <60 mL/min/1.73 m²) were identified and assigned weighted points to calculate risk scores. RESULTS A total of 3438 healthy adults were identified [mean age 24.8 years; 48.3% AA; median follow-up 24.9 years (interquartile range: 24.5-25.2)]. For 18-year-olds, 25-year projected CKD risk varied by ethnicity and sex even without baseline clinical and genetic abnormalities; risk was 0.30% for European American (EA) women, 0.52% for EA men, 0.52% for AA women, and 0.90% for AA men. Among 18-year-old AAs with apolipoprotein L1 gene (APOL1) renal-risk variants without baseline abnormalities, 25-year risk significantly increased: 1.46% for women and 2.53% for men; among those with 2 APOL1 renal-risk variants and baseline abnormalities, 25-year risk was higher still: 2.53% to 6.23% for women and 4.35% to 10.58% for men. CONCLUSIONS Young AAs were at highest risk for CKD, and APOL1 renal-risk variants drove some of this risk. Understanding the genetic profile of young AA potential living kidney donors in the context of baseline health characteristics may help to inform candidate selection and counseling.

34. Mustian MN, Cannon RM, MacLennan PA, Reed RD, Shelton BA, McWilliams DM, Deierhoi MH, Locke JE. Landscape of ABO-Incompatible Live Donor Kidney Transplantation in the US. J Am Coll Surg 2018; 226:615-621. [PMID: 29309944] [PMCID: PMC5869103] [DOI: 10.1016/j.jamcollsurg.2017.12.026]
Abstract
BACKGROUND Widespread implementation of ABO-incompatible (ABOi) living donor kidney transplantation (LDKT) has been proposed as a means to partially ameliorate the national shortage of deceased donor kidneys. Acceptance of this practice has been encouraged by reports from experienced centers demonstrating acute rejection (AR) rates similar to those obtained with ABO-compatible (ABOc) LDKT. Acute rejection rate and graft survival after ABOi LDKT on a national level have yet to be fully determined. STUDY DESIGN We studied adult (>18 years) LDKT recipients, from 2000 to 2015, reported to the Scientific Registry of Transplant Recipients. Acute rejection rates in the first post-transplant year (modified Poisson regression) and graft survival (Cox proportional hazards) were assessed by ABO compatibility status (ABOi: 930; ABOc: 89,713). RESULTS Patients undergoing ABOi LDKT had an AR rate of 19.4% compared with 10.5% for ABOc recipients (p < 0.0001). After adjusting for recipient- and donor-related risk factors, patients undergoing ABOi LDKT were found to have a 1.76-fold greater risk for AR within 1 year of transplantation compared with ABOc LDKT recipients (adjusted relative risk [aRR] 1.76; 95% CI 1.54 to 2.01). Moreover, there was a 2.34-fold greater risk of death-censored graft loss at 1-year post-transplant among ABOi vs ABOc LDKT recipients (adjusted hazard ratio [aHR] 2.34; 95% CI 1.85 to 2.96). CONCLUSIONS Based on these findings, the low rates of AR and excellent short-term graft survival presented in single center series may not be sustainable on a national level. These findings highlight the potential utility for identification of centers of excellence and regionalization of ABOi LDKT.

35. Shelton BA, Sawinski D, Mehta S, Reed RD, MacLennan PA, Locke JE. Kidney transplantation and waitlist mortality rates among candidates registered as willing to accept a hepatitis C infected kidney. Transpl Infect Dis 2018; 20:e12829. [DOI: 10.1111/tid.12829]

36. Sawinski D, Shelton BA, Mehta S, Reed RD, MacLennan PA, Gustafson S, Segev DL, Locke JE. Impact of Protease Inhibitor-Based Anti-Retroviral Therapy on Outcomes for HIV+ Kidney Transplant Recipients. Am J Transplant 2017; 17:3114-3122. [PMID: 28696079] [DOI: 10.1111/ajt.14419]
Abstract
Excellent outcomes have been demonstrated among select HIV-positive kidney transplant (KT) recipients with well-controlled infection, but to date, no national study has explored outcomes among HIV+ KT recipients by antiretroviral therapy (ART) regimen. Intercontinental Marketing Services (IMS) pharmacy fills (1/1/01-10/1/12) were linked with Scientific Registry of Transplant Recipients (SRTR) data. A total of 332 recipients with pre- and posttransplantation fills were characterized by ART at the time of transplantation as protease inhibitor (PI) or non-PI-based ART (88 PI vs. 244 non-PI). Cox proportional hazards models were adjusted for recipient and donor characteristics. Comparing recipients by ART regimen, there were no significant differences in age, race, or HCV status. Recipients on PI-based regimens were significantly more likely to have an Estimated Post Transplant Survival (EPTS) score of >20% (70.9% vs. 56.3%, p = 0.02) than those on non-PI regimens. On adjusted analyses, PI-based regimens were associated with a 1.8-fold increased risk of allograft loss (adjusted hazard ratio [aHR] 1.84, 95% confidence interval [CI] 1.22-2.77, p = 0.003), with the greatest risk observed in the first posttransplantation year (aHR 4.48, 95% CI 1.75-11.48, p = 0.002), and a 1.9-fold increased risk of death as compared to non-PI regimens (aHR 1.91, 95% CI 1.02-3.59, p = 0.05). These results suggest that whenever possible, recipients should be converted to a non-PI regimen prior to kidney transplantation.

37. Crowson CN, Reed RD, Shelton BA, MacLennan PA, Locke JE. Lymphocyte-depleting induction therapy lowers the risk of acute rejection in African American pediatric kidney transplant recipients. Pediatr Transplant 2017; 21. [PMID: 27699934] [DOI: 10.1111/petr.12823]
Abstract
The use of lymphocyte-depleting induction immunosuppression has been associated with a reduction in risk of acute rejection (AR) after kidney transplantation (KT) among adult recipients, particularly among high-risk subgroups such as African Americans (AAs). However, data on induction regimen and AR risk are lacking among pediatric KT recipients. We examined outcomes among 7884 first-time pediatric KT recipients using SRTR data (2000-2014). Characteristics were compared across race using Wilcoxon rank-sum tests for continuous and chi-square tests for categorical variables. Risk of AR was estimated using modified Poisson regression, stratified by recipient race, adjusting for recipient age, gender, BMI, primary diagnosis, number of HLA mismatches, maintenance immunosuppression, and donor type. Risk of AR within 1 year was lower in AA recipients receiving lymphocyte-depleting induction (ATG or alemtuzumab; RR, 0.66; 95% CI, 0.52-0.83; P < .001) compared to AA recipients receiving anti-IL-2 receptor antibody induction. This difference was not seen in non-AA recipients receiving lymphocyte-depleting induction (RR, 0.93; 95% CI, 0.81-1.06; P = .26) compared to IL-2 receptor antibody induction. These findings support a role for lymphocyte-depleting induction agents in AA pediatric patients undergoing KT and continued use of IL-2 receptor antibody induction in non-AA pediatric KT recipients.

38. Shelton BA, Mehta S, Sawinski D, Reed RD, MacLennan PA, Gustafson S, Segev DL, Locke JE. Increased Mortality and Graft Loss With Kidney Retransplantation Among Human Immunodeficiency Virus (HIV)-Infected Recipients. Am J Transplant 2017; 17:173-179. [PMID: 27305590] [PMCID: PMC5159327] [DOI: 10.1111/ajt.13922]
Abstract
Excellent outcomes have been demonstrated in primary human immunodeficiency virus (HIV)-positive (HIV+) kidney transplant recipients, but a subset will lose their graft and seek retransplantation (re-KT). To date, no study has examined outcomes among HIV+ re-KT recipients. We studied risk for death and graft loss among 4149 (22 HIV+ vs. 4127 HIV-negative [HIV-]) adult re-KT recipients reported to the Scientific Registry of Transplant Recipients (SRTR) (2004-2013). Compared to HIV- re-KT recipients, HIV+ re-KT recipients were more commonly African American (63.6% vs. 26.7%, p < 0.001), infected with hepatitis C (31.8% vs. 5.0%, p < 0.001) and had longer median time on dialysis (4.8 years vs. 2.1 years, p = 0.02). There were no significant differences in length of time between the primary and re-KT events by HIV status (1.5 years vs. 1.4 years, p = 0.52). HIV+ re-KT recipients experienced a 3.11-fold increased risk of death (adjusted hazard ratio [aHR]: 3.11, 95% confidence interval [CI]: 1.82-5.34, p < 0.001) and a 1.96-fold increased risk of graft loss (aHR: 1.96, 95% CI: 1.14-3.36, p = 0.01) compared to HIV- re-KT recipients. Re-KT among HIV+ recipients was associated with increased risk for mortality and graft loss. Future research is needed to determine if a survival benefit is achieved with re-KT in this vulnerable population.

39. Locke JE, Reed RD, Massie A, MacLennan PA, Sawinski D, Kumar V, Mehta S, Mannon RB, Gaston R, Lewis CE, Segev DL. Obesity increases the risk of end-stage renal disease among living kidney donors. Kidney Int 2016; 91:699-703. [PMID: 28041626] [DOI: 10.1016/j.kint.2016.10.014]
Abstract
Determining candidacy for live kidney donation among obese individuals remains challenging. Among healthy non-donors, body mass index (BMI) above 30 is associated with a 16% increase in risk of end-stage renal disease (ESRD). However, the impact on the ESRD risk attributable to donation and living with only one kidney remains unknown. Here we studied the risk of ESRD associated with obesity at the time of donation among 119,769 live kidney donors in the United States. Maximum follow-up was 20 years. Obese (BMI above 30) live kidney donors were more likely male, African American, and had higher blood pressure. Estimated risk of ESRD 20 years after donation was 93.9 per 10,000 for obese; significantly greater than the 39.7 per 10,000 for non-obese live kidney donors. Adjusted for age, sex, ethnicity, blood pressure, baseline estimated glomerular filtration rate, and relationship to recipient, obese live kidney donors had a significant 86% increased risk of ESRD compared to their non-obese counterparts (adjusted hazard ratio 1.86; 95% confidence interval 1.05-3.30). For each unit increase in BMI above 27 kg/m² there was an associated significant 7% increase in ESRD risk (1.07, 1.02-1.12). The impact of obesity on ESRD risk was similar for male and female donors, African American and Caucasian donors, and across the baseline estimated glomerular filtration rate spectrum. These findings may help to inform selection criteria and discussions with persons considering living kidney donation.

40. Locke JE, Shelton B, Reed R, MacLennan PA, Mehta S, Sawinski D, Segev DL. Identification of Optimal Donor-Recipient Combinations Among Human Immunodeficiency Virus (HIV)-Positive Kidney Transplant Recipients. Am J Transplant 2016; 16:2377-2383. [PMID: 27140837] [PMCID: PMC4956609] [DOI: 10.1111/ajt.13847]
Abstract
For some patient subgroups, human immunodeficiency virus (HIV) infection has been associated with worse outcomes after kidney transplantation (KT); potentially modifiable factors may be responsible. The study goal was to identify factors that predict a higher risk of graft loss among HIV-positive KT recipients compared with a similar transplant among HIV-negative recipients. In this study, 82,762 deceased donor KT recipients (HIV positive: 526; HIV negative: 82,236) reported to the Scientific Registry of Transplant Recipients (SRTR) (2001-2013) were studied by interaction term analysis. Compared to HIV-negative recipients, hepatitis C virus (HCV) coinfection amplified risk 2.72-fold among HIV-positive KT recipients (adjusted hazard ratio [aHR]: 2.72, 95% confidence interval [CI]: 1.75-4.22, p < 0.001). Forty-three percent of the excess risk was attributable to the interaction between HIV and HCV (attributable proportion of risk due to the interaction [AP]: 0.43, 95% CI: 0.23-0.63, p = 0.02). Among HIV-positive recipients with more than three HLA mismatches (MMs), risk was amplified 1.80-fold compared to HIV-negative recipients (aHR: 1.80, 95% CI: 1.31-2.47, p < 0.001); 42% of the excess risk was attributable to the interaction between HIV and more than three HLA MMs (AP: 0.42, 95% CI: 0.24-0.60, p = 0.01). High-HIV-risk recipients (HIV-positive/HCV-positive with more than three HLA MMs) had a 3.86-fold increased risk compared to low-HIV-risk recipients (HIV-positive/HCV-negative with three or fewer HLA MMs) (aHR: 3.86, 95% CI: 2.37-6.30, p < 0.001). Avoidance of more than three HLA MMs in HIV-positive KT recipients, particularly among coinfected patients, may mitigate the increased risk of graft loss associated with HIV infection.

41. DuBay DA, MacLennan PA, Reed RD, Shelton BA, Redden DT, Fouad M, Martin MY, Gray SH, White JA, Eckhoff DE, Locke JE. Insurance Type and Solid Organ Transplantation Outcomes: A Historical Perspective on How Medicaid Expansion Might Impact Transplantation Outcomes. J Am Coll Surg 2016; 223:611-620.e4. [PMID: 27457252] [DOI: 10.1016/j.jamcollsurg.2016.07.004]
Abstract
BACKGROUND The number of Medicaid beneficiaries has increased under the Affordable Care Act, improving access to solid organ transplantation in this disadvantaged patient cohort. It is unclear what impact Medicaid expansion will have on transplantation outcomes. We performed a retrospective cohort analysis to measure the frequency and variation in Medicaid transplantation and post-transplantation survival in Medicaid patients. STUDY DESIGN Adult heart, lung, liver, and renal transplant recipients between 2002 and 2011 (n = 169,194) reported to the Scientific Registry of Transplant Recipients were identified. Transplant recipients were classified based on insurance status (private, Medicare, or Medicaid). Outcome measures included 5-year post-transplantation survival, summarized using Kaplan-Meier curves and compared with log-rank tests. Organ-specific Cox proportional hazards models were used to adjust for donor and recipient factors. RESULTS Medicaid patients comprised 8.6% of all organ transplant recipients. Fewer transplantations were performed than expected among Medicaid beneficiaries for all organs except liver (liver: observed to expected ratio = 1.21; 95% CI, 0.68-1.90; heart: observed to expected ratio = 0.89; 95% CI, 0.44-1.49; lung: observed to expected ratio = 0.57; 95% CI, 0.22-1.06; renal: observed to expected ratio = 0.32; 95% CI, 0.08-0.72). Medicaid transplant recipients were listed with more severe organ failure and experienced shorter transplant wait times. Post-transplantation survival was lower in Medicaid patients compared with private insurance for all organs. Post-transplantation survival in Medicaid patients was similar to Medicare patients for heart, liver, and renal transplants but lower for lung. CONCLUSIONS Medicaid organ transplant beneficiaries had significantly lower survival compared with privately insured beneficiaries. The more severe organ failure among Medicaid beneficiaries at the time of listing suggests a pattern of late referral, which might account for worse outcomes. Implementation of the Affordable Care Act provides an opportunity to develop the necessary infrastructure to ensure timely transplantation referrals and improve long-term outcomes in this vulnerable population.

42. Brown CJ, Foley KT, Lowman JD, MacLennan PA, Razjouyan J, Najafi B, Locher J, Allman RM. Comparison of Posthospitalization Function and Community Mobility in Hospital Mobility Program and Usual Care Patients: A Randomized Clinical Trial. JAMA Intern Med 2016; 176:921-927. [PMID: 27243899] [DOI: 10.1001/jamainternmed.2016.1870]
Abstract
IMPORTANCE Low mobility is common during hospitalization and associated with loss of or declines in the ability to perform activities of daily living (ADL) and limitations in community mobility. OBJECTIVE To examine the effect of an in-hospital mobility program (MP) on posthospitalization function and community mobility. DESIGN, SETTING, AND PARTICIPANTS This single-blind randomized clinical trial used masked assessors to compare an MP with usual care (UC). Patients admitted to the medical wards of the Birmingham Veterans Affairs Medical Center from January 12, 2010, through June 29, 2011, were followed up throughout hospitalization with 1-month posthospitalization telephone follow-up. One hundred hospitalized patients 65 years or older were randomly assigned to the MP or UC groups. Patients were cognitively intact and able to walk 2 weeks before hospitalization. Data analysis was performed from November 21, 2012, to March 14, 2016. INTERVENTIONS Patients in the MP group were assisted with ambulation up to twice daily, and a behavioral strategy was used to encourage mobility. Patients in the UC group received twice-daily visits. MAIN OUTCOMES AND MEASURES Changes in self-reported ADL and community mobility were assessed using the Katz ADL scale and the University of Alabama at Birmingham Study of Aging Life-Space Assessment (LSA), respectively. The LSA measures community mobility based on the distance through which a person reports moving during the preceding 4 weeks. RESULTS Of 100 patients, 8 did not complete the study (6 in the MP group and 2 in the UC group). Patients (mean age, 73.9 years; 97 male [97.0%]; and 19 black [19.0%]) had a median length of stay of 3 days. No significant differences were found between groups at baseline. For all periods, groups were similar in ability to perform ADL; however, at 1 month after hospitalization, the LSA score was significantly higher in the MP group (LSA score, 52.5) than in the UC group (LSA score, 41.6) (P = .02). For the MP group, the 1-month posthospitalization LSA score was similar to the LSA score measured at admission. For the UC group, the LSA score decreased by approximately 10 points. CONCLUSIONS AND RELEVANCE A simple MP intervention had no effect on ADL function. However, the MP intervention enabled patients to maintain their prehospitalization community mobility, whereas those in the UC group experienced clinically significant declines. Lower life-space mobility is associated with increased risk of death, nursing home admission, and functional decline, suggesting that declines such as those observed in the UC group would be of great clinical importance. TRIAL REGISTRATION clinicaltrials.gov Identifier: NCT00715962.

43. Donnelly JP, Locke JE, MacLennan PA, McGwin G, Mannon RB, Safford MM, Baddley JW, Muntner P, Wang HE. Inpatient Mortality Among Solid Organ Transplant Recipients Hospitalized for Sepsis and Severe Sepsis. Clin Infect Dis 2016; 63:186-194. [PMID: 27217215] [DOI: 10.1093/cid/ciw295]
Abstract
BACKGROUND Solid organ transplant (SOT) recipients are at elevated risk of sepsis. The impact of SOT on outcomes following sepsis is unclear. METHODS We performed a retrospective cohort study using data from University HealthSystem Consortium, a consortium of academic medical center affiliates. We examined the association between SOT and mortality among patients hospitalized with severe sepsis or explicitly coded sepsis in 2012-2014. We used International Classification of Diseases, Ninth Revision (ICD-9) codes to identify severe sepsis, explicitly coded sepsis, and SOT (kidney, liver, heart, lung, pancreas, or intestine transplants). We fit random-intercept logistic regression models to account for clustering by hospital. RESULTS There were 903,816 severe sepsis hospitalizations (39,618 [4.4%] with SOT) and 410,623 sepsis hospitalizations (14,526 [3.9%] with SOT) in 250 hospitals. SOT recipients were younger and more likely to be insured by Medicare than those without SOT. Among hospitalizations for severe sepsis and sepsis, in-hospital mortality was lower among those with vs those without SOT (5.5% vs 9.4% for severe sepsis; 8.7% vs 12.7% for sepsis). After adjustment, the odds ratio for mortality comparing SOT patients vs non-SOT was 0.83 (95% confidence interval [CI], .79-.87) for severe sepsis and 0.78 (95% CI, .73-.84) for sepsis. Compared to non-SOT patients, kidney, liver, and co-transplant (kidney-pancreas/kidney-liver) recipients demonstrated lower mortality. No association was present for heart transplant, and lung transplant was associated with higher mortality. CONCLUSIONS Among patients hospitalized for severe sepsis or sepsis, those with SOT had lower inpatient mortality than those without SOT. Identifying the specific strategies employed for populations with improved mortality could inform best practices for sepsis among SOT and non-SOT populations.

44. Tapley JL, McGwin G, Ashraf AP, MacLennan PA, Callahan K, Searcey K, Witherspoon CD, Saaddine J, Owsley C. Feasibility and efficacy of diabetic retinopathy screening among youth with diabetes in a pediatric endocrinology clinic: a cross-sectional study. Diabetol Metab Syndr 2015; 7:56. [PMID: 26136849] [PMCID: PMC4487844] [DOI: 10.1186/s13098-015-0054-z]
Abstract
BACKGROUND We examined the feasibility and efficacy of using a non-mydriatic camera to screen for diabetic retinopathy (DR) among youth with type 1 or type 2 diabetes seen in a pediatric endocrinology clinic serving Alabama, the state that has the highest diabetes rate in the United States. METHODS 236 youths with type 1 or type 2 diabetes were screened for DR using a non-mydriatic camera. Visual acuity was also assessed. A questionnaire asked parents about diabetes and eye care history. RESULTS Mean duration since diabetes diagnosis was 5.5 years. 66% reported receiving an eye examination within the previous year. 97.5% had images that were gradable. DR was detected in 3.8% of participants. 9.1% were visually impaired. CONCLUSIONS Use of a non-mydriatic fundus camera is feasible and efficacious for DR screening in youth with diabetes. DR screening at routine endocrinology visits may be beneficial in managing youth with diabetes and preventing irreversible vision loss, particularly for those in regions where diabetes rates are high.

45. MacLennan PA, McGwin G, Searcey K, Owsley C. A survey of Alabama eye care providers in 2010-2011. BMC Ophthalmol 2014; 14:44. [PMID: 24708636] [PMCID: PMC4233655] [DOI: 10.1186/1471-2415-14-44]
Abstract
BACKGROUND State-level information regarding eye care resources can provide policy makers with valuable information about availability of eye care services. The current study surveyed ophthalmologists, optometrists, and vision rehabilitation providers practicing in Alabama. METHODS Three mutually exclusive provider groups were identified, i.e., all ophthalmologists, optometrists, and vision rehabilitation providers working in Alabama in 2010. Eligible providers were contacted in 2010 and 2011, and information was requested regarding provider demographics and training, practice type and service characteristics, and patient characteristics. Descriptive statistics (e.g., means, proportions) were used to characterize provider groups by their demographic and training characteristics, practice characteristics, services provided, and patients or clients served. In addition, county-level figures show the numbers and per capita rates of ophthalmologists and optometrists. RESULTS Ophthalmologists were located in 24 of Alabama's 67 counties, optometrists in 56, and 10 counties had neither an ophthalmologist nor an optometrist. Overall, 1,033 vision care professionals were identified as eligible to participate in the survey: 217 ophthalmologists, 638 optometrists, and 178 visual rehabilitation providers. Of those, 111 (51.2%) ophthalmologists, 246 (38.6%) optometrists, and 81 (45.5%) rehabilitation providers participated. Most participating ophthalmologists, optometrists, and vision rehabilitation providers identified themselves as non-Hispanic White. Ophthalmologists and optometrists estimated that 27% and 22%, respectively, of their patients had diabetes, but that the proportion that adhered to eye care guidelines was 61% among ophthalmology patients and 53% among optometry patients. CONCLUSIONS A large number of Alabama communities are isolated from eye care services. Increased future demand for eye care is anticipated nationally given the aging of the population and decreasing numbers of providers; moreover, Alabama has a high and growing prevalence of diabetes, which will result in greater numbers at risk for diabetic retinopathy, glaucoma, and cataracts.

46. Griffin RL, Davis GG, Levitan EB, MacLennan PA, Redden DT, McGwin G. The effect of previous traumatic injury on homicide risk. J Forensic Sci 2014; 59:986-990. [PMID: 24673555] [DOI: 10.1111/1556-4029.12416]
Abstract
Research has reported that a strong risk factor for traumatic injury is having a previous injury (i.e., recidivism). To date, the only study examining the relationship between recidivism and homicide reported strong associations, but was limited by possible selection bias. The current matched case-control study utilized coroner's data from 2004 to 2008. Subjects were linked to trauma registry data to determine whether the person had a previous traumatic injury. Conditional logistic regression was used to estimate odds ratios (ORs) and 95% confidence intervals (95% CIs) for the association between homicide and recidivism. Homicide risk was increased for those having a previous traumatic injury (OR 1.81, 95% CI 1.09-2.99) or a previous intentional injury (OR 2.53, 95% CI 1.24-5.17). These results suggest an association between homicide and injury recidivism, and that trauma centers may be an effective setting for screening individuals for secondary prevention efforts of homicide through violence prevention programs.

47. Blunck H, Owsley C, MacLennan PA, McGwin G. Driving with pets as a risk factor for motor vehicle collisions among older drivers. Accid Anal Prev 2013; 58:70-74. [PMID: 23708755] [PMCID: PMC4492539] [DOI: 10.1016/j.aap.2013.04.019]
Abstract
Increasing rates of distraction-related motor vehicle collisions (MVCs) continue to raise concerns regarding driving safety. This study sought to evaluate a novel driving-related distraction, driving with a pet, as a risk factor for MVCs among older, community-dwelling adults. Two thousand licensed drivers aged 70 and older were identified, of whom 691 reported pet ownership. Comparing pet owners who did and did not drive with their pets, neither overall MVC rates (rate ratio [RR] 0.97, 95% confidence interval [CI] 0.75-1.26) nor at-fault MVC rates (RR 0.84, 95% CI 0.57-1.24) were elevated. However, those who reported always driving with a pet in the vehicle had an elevated MVC rate (RR 1.89, 95% CI 1.10-3.25) compared to those who did not drive with a pet. The MVC rate was not increased for those reporting only sometimes or rarely driving with a pet in the vehicle. The current study demonstrates an increased risk of MVC involvement among older drivers who always take a pet with them when they drive. When confronted with an increased cognitive or physical workload while driving, elderly drivers in prior studies have exhibited slower cognitive performance and delayed response times in comparison to younger age groups. Further study of pet-related distracted driving behaviors among older drivers, as well as younger populations, with respect to driver safety and performance is warranted to appropriately inform the need for policy regulation on this issue.

48. MacLennan PA, McGwin G, Searcey K, Owsley C. Medical record validation of self-reported eye diseases and eye care utilization among older adults. Curr Eye Res 2012; 38:1-8. [PMID: 23078191] [DOI: 10.3109/02713683.2012.733054]
Abstract
PURPOSE Vision impairment is an important public health concern. Accurate information regarding visual health and eye care utilization is essential to monitor trends and inform health policy interventions aimed at addressing at-need populations. National surveys provide annual prevalence estimates but rely on self-report. The validity of self-reported information regarding eye disease has not been adequately explored. METHODS This cross-sectional study compared self-report of eye care utilization and eye disease with information obtained from medical records. The study population was 2001 adults aged 70 years and older who completed the Behavioral Risk Factor Surveillance System's Visual Impairment and Access to Eye Care Module. Cohen's kappa (κ) was used to assess agreement. RESULTS Agreement between self-report and medical records was substantial for eye care utilization (κ = 0.64) and glaucoma (κ = 0.73), moderate for macular degeneration (κ = 0.40) and diabetic retinopathy (κ = 0.47) and slight for cataracts (κ = 0.18). Self-report tended to overestimate the number of subjects who visited an eye care provider in the previous year, and underestimated the prevalence in all but one (glaucoma) of the four eye diseases evaluated. CONCLUSIONS Though agreement was substantial for self-report of eye care utilization, results of the current study suggest that national estimates based on self-report overestimate eye care utilization.

50. Blackburn JL, Levitan EB, MacLennan PA, Owsley C, McGwin G. Changes in eye protection behavior following an occupational eye injury. Workplace Health Saf 2012; 60:393-400. [PMID: 22909223] [DOI: 10.1177/216507991206000904]
Abstract
This study investigated whether workers modify eye protection behavior following an occupational eye injury. Workers treated for work-related eye injuries were questioned regarding the use of protective eyewear for the work-month prior to their eye injuries and again 6 to 12 months later. Workers reported an increase in the proportion of work-time they used eye protection (from a median of 20% to 100%; p < .0001). The effect appeared to be driven by whether eye protection was used at the time of the injury. Most respondents (66%) indicated they were more likely to use eye protection since their injuries. Workers not using eye protection at the time of injury were more likely to use eye protection in the future. A variety of employer and employee factors may influence this change. Although many workers' behaviors changed, health care providers should embrace the teachable moment when treating occupational eye injuries to encourage continued use or more appropriate forms of eye protection.