1. High-Functioning Deceased Donor Kidney Transplant System Characteristics: The British Columbia Experience With an Opt-In System. Kidney Med 2024; 6:100812. [PMID: 38665993] [PMCID: PMC11044131] [DOI: 10.1016/j.xkme.2024.100812]
Abstract
Rationale & Objective: A high level of cooperation between organ procurement organizations and transplant programs may help maximize use of deceased donor kidneys, but the practices essential to a high-functioning organ donation and transplant system remain uncertain. We sought to report metrics of organ donation and transplant performance in British Columbia, Canada, and to assess the specific policies and practices that contribute to the system's performance. Study Design: A retrospective observational study. Setting & Participants: The study included deceased organ donors referred in British Columbia from January 1, 2016, to December 31, 2019. Exposures: Provincial, organ procurement organization, and center-level policies implemented to improve donor referral and organ utilization. Outcomes: Donor and kidney utilization along the steps of the critical pathway for organ donation. Analytical Approach: Deceased donors were classified according to the critical pathway for organ donation, and key donation and transplant metrics were identified. Results: There were 1,948 possible donors referred. Of these 1,948, 754 (39%) were potential donors. Of 754 potential donors, 587 (78%) were consented donors. Of 587 consented donors, 480 (82%) were eligible kidney donors. Of 480 eligible kidney donors, 438 (91%) were actual kidney donors. And of 438 actual kidney donors, 432 (99%) were utilized kidney donors. One-year all-cause allograft survival was 95%. Practices implemented to improve the system's performance included hospital donor coordinators, early communication between the organ procurement organization and transplant nephrologists, dedicated organ recovery and implant surgeons, age-based kidney allocation, and hospital admission of recipients before kidney recovery. Limitations: Assignment of causality between individual policies and practices and organ donation and utilization is limited in this observational study.
Conclusions: In British Columbia, consent for donation, utilization of donated kidneys, and transplant survival are exceptionally high, suggesting the importance of an integrated deceased donor and kidney transplant service.
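As a quick arithmetic check, the stage-to-stage percentages reported above follow directly from the donor counts along the critical pathway. The sketch below simply recomputes them; it is illustrative only and not code from the study:

```python
# Stage counts along the critical pathway for organ donation,
# as reported in the abstract (British Columbia, 2016-2019).
stages = [
    ("possible donors referred", 1948),
    ("potential donors", 754),
    ("consented donors", 587),
    ("eligible kidney donors", 480),
    ("actual kidney donors", 438),
    ("utilized kidney donors", 432),
]

# Each stage's percentage is computed relative to the stage before it.
for (prev_name, prev_n), (name, n) in zip(stages, stages[1:]):
    pct = round(100 * n / prev_n)
    print(f"{name}: {n}/{prev_n} = {pct}% of {prev_name}")
```

Running this reproduces the abstract's figures (39%, 78%, 82%, 91%, and 99%), confirming that each percentage is a conversion rate from the immediately preceding stage rather than from the initial referral pool.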
2. Efficacy and Safety of Bleselumab in Preventing the Recurrence of Primary Focal Segmental Glomerulosclerosis in Kidney Transplant Recipients: A Phase 2a, Randomized, Multicenter Study. Transplantation 2024:00007890-990000000-00714. [PMID: 38564451] [DOI: 10.1097/tp.0000000000004985]
Abstract
BACKGROUND Focal segmental glomerulosclerosis (FSGS) is a common cause of end-stage kidney disease and frequently recurs after kidney transplantation. Recurrent FSGS (rFSGS) is associated with poor allograft and patient outcomes. Bleselumab, a fully human immunoglobulin G4 anti-CD40 antagonistic monoclonal antibody, disrupts CD40-related processes in FSGS, potentially preventing rFSGS. METHODS This was a phase 2a, randomized, multicenter, open-label study of adult recipients (aged ≥18 y) of a living or deceased donor kidney transplant with a history of biopsy-proven primary FSGS. The study assessed the efficacy of bleselumab combined with tacrolimus and corticosteroids as maintenance immunosuppression in the prevention of rFSGS through 12 mo posttransplantation, versus standard of care (SOC) comprising tacrolimus, mycophenolate mofetil, and corticosteroids. All patients received basiliximab induction. The primary endpoint was rFSGS through 3 mo posttransplant, defined as proteinuria (protein-creatinine ratio ≥3.0 g/g), with death, graft loss, or loss to follow-up imputed as rFSGS. RESULTS Sixty-three patients were followed for 12 mo posttransplantation. The relative decrease in rFSGS occurrence through 3 mo with bleselumab versus SOC was 40.7% (95% confidence interval, -89.8 to 26.8; P = 0.37; absolute decrease 12.7% [95% confidence interval, -34.5 to 9.0]). Central blinded biopsy review found relative (absolute) decreases in rFSGS of 10.9% (3.9%), 17.0% (6.2%), and 20.5% (7.5%) at 3, 6, and 12 mo posttransplant, respectively; these differences were not statistically significant. Adverse events were similar for both treatments. No deaths occurred during the study. CONCLUSIONS In at-risk kidney transplant recipients, bleselumab numerically reduced proteinuria occurrence versus SOC, but no notable difference in the occurrence of biopsy-proven rFSGS was observed.
3. Erythrocytosis and thrombotic events in kidney transplant recipients prescribed a sodium-glucose cotransporter-2 inhibitor. Clin Transplant 2023; 37:e15013. [PMID: 37170711] [DOI: 10.1111/ctr.15013]
Abstract
INTRODUCTION The safety and efficacy of sodium-glucose cotransporter-2 inhibitors (SGLT2i) in kidney transplant recipients remain uncertain. Transplant recipients may be at risk of thrombosis because of post-transplant erythrocytosis, and SGLT2i are associated with an increase in hematocrit. METHODS We determined SGLT2i use, the change in hematocrit, and the incidence of thrombotic events among the 1,700 prevalent kidney transplant recipients in our center. RESULTS Among the 42 patients treated with an SGLT2i, the mean pre-transplant hematocrit was 31%, and none of the patients had a hematocrit ≥50%. The mean percent change in hematocrit, measured at an average of 53 days after initiation of an SGLT2i, was 11%, and four patients (10%) had a hematocrit ≥50%. The mean hematocrit measured 3 months after treatment was 42%, and two patients (5%) had a hematocrit ≥50%. One patient had a cerebellar stroke 14 months after SGLT2i initiation, when the hemoglobin was 173 g/L and the hematocrit was 52%. CONCLUSIONS All patients had a sustained increase in hematocrit 3 months after SGLT2i treatment. Hematocrit ≥50% occurred in 10%, and one patient had a thrombotic event that may or may not have been related to an increase in hematocrit. Clinicians may consider monitoring for erythrocytosis after starting an SGLT2i in kidney transplant recipients.
4. Differences in medication adherence by sex and organ type among adolescent and young adult solid organ transplant recipients. Pediatr Transplant 2023; 27:e14446. [PMID: 36478059] [DOI: 10.1111/petr.14446]
Abstract
BACKGROUND Identification of differences in medication adherence by sex or organ type may help in planning interventions to optimize outcomes. We compared immunosuppressive medication adherence between males and females, and between kidney, liver, and heart transplant recipients. METHODS This multicenter study of prevalent kidney, liver, and heart transplant recipients aged 14-25 years assessed adherence 3 times (0, 3, and 6 months post-enrollment) with the BAASIS self-report tool. At each visit, participants were classified as adherent if they missed no doses in the prior 4 weeks and non-adherent otherwise. Adherence was also assessed using the coefficient of variation (CV) of tacrolimus trough levels; CV <30% was classified as adherent. We used multivariable mixed-effects logistic regression models adjusted for potential confounders to compare adherence by sex and by organ. RESULTS Across all visits, males (n = 150, median age 20.4 years, IQR 17.2-23.3) had lower odds of self-reported adherence than females (n = 120, median age 19.8 years, IQR 17.1-22.7) (OR 0.41, 95% CI 0.21-0.80) but higher odds of adherence by tacrolimus CV (OR 2.50, 95% CI 1.30-4.82). No significant differences in adherence (by self-report or tacrolimus CV) were noted between the 184 kidney, 58 liver, and 28 heart recipients. CONCLUSION Females showed better self-reported adherence than males but greater variability in tacrolimus levels. Social desirability bias, more common in females than males, may contribute to better self-reported adherence among females. Higher tacrolimus variability among females may reflect biologic differences in tacrolimus metabolism between males and females rather than sex differences in adherence. There were no significant differences in adherence by organ type.
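The tacrolimus coefficient of variation used as the objective adherence measure above is simply the standard deviation of a patient's trough levels divided by their mean. A minimal sketch of the metric, using hypothetical trough values rather than study data:

```python
from statistics import mean, stdev

def tacrolimus_cv(troughs):
    """Coefficient of variation (%) of tacrolimus trough levels:
    sample standard deviation divided by the mean, times 100."""
    return 100 * stdev(troughs) / mean(troughs)

# Hypothetical trough levels (ng/mL) across several clinic visits.
stable = [6.1, 6.4, 5.9, 6.3, 6.0]    # low visit-to-visit variability
erratic = [3.2, 8.5, 5.1, 9.8, 4.0]   # high visit-to-visit variability

# The study classified CV < 30% as adherent.
print(tacrolimus_cv(stable) < 30)   # stable pattern  -> adherent
print(tacrolimus_cv(erratic) < 30)  # erratic pattern -> non-adherent
```

Note that, as the abstract's conclusion cautions, a high CV can reflect metabolic differences as well as missed doses, so the threshold is a proxy rather than a direct measure of adherence.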
5. Immunosuppressant Medication Use in Patients with Kidney Allograft Failure: A Prospective Multi-Center Canadian Cohort Study. J Am Soc Nephrol 2022; 33:1182-1192. [PMID: 35321940] [PMCID: PMC9161795] [DOI: 10.1681/asn.2021121642]
Abstract
Background: Patients with kidney transplant failure have a high risk of hospitalization and death due to infection. The optimal use of immunosuppressants after transplant failure remains uncertain, and clinical practice varies widely. Methods: This prospective cohort study enrolled patients within 21 days of starting dialysis after transplant failure in 16 Canadian centers. Immunosuppressant medication use, death, hospitalized infection, rejection of the failed allograft, and panel-reactive anti-HLA antibodies (PRA) were determined at 1, 3, 6, and 12 months and twice yearly thereafter until death, repeat transplantation, or loss to follow-up. Results: The 269 study patients were followed for a median of 558 days. There were 33 deaths, 143 patients hospitalized for infection, and 21 rejections. Most patients (65%) continued immunosuppressants, 20% continued prednisone only, and 15% discontinued all immunosuppressants. In multivariable models, patients who continued immunosuppressants had a lower risk of death (HR, 0.40; 95% CI, 0.17 to 0.93) and were not at increased risk of hospitalized infection (HR, 1.81; 95% CI, 0.82 to 4.0) compared with patients who discontinued all immunosuppressants or continued prednisone only. The mean class I and class II PRA increased from 11% to 27% and from 25% to 47%, respectively, but did not differ by immunosuppressant use. Continuation of immunosuppressants did not protect against rejection of the failed allograft (HR, 0.81; 95% CI, 0.22 to 2.94). Conclusions: Prolonged use of immunosuppressants beyond one year after transplant failure was not associated with a higher risk of death or hospitalized infection but was insufficient to prevent a rise in anti-HLA antibodies or rejection of the failed allograft.
6. Care processes and structures associated with higher medication adherence in adolescent and young adult transplant recipients. Pediatr Transplant 2021; 25:e14106. [PMID: 34339090] [DOI: 10.1111/petr.14106]
Abstract
BACKGROUND We aimed to identify care processes and structures independently associated with higher medication adherence among young transplant recipients. METHODS We conducted a prospective, observational cohort study of 270 prevalent kidney, liver, and heart transplant recipients 14-25 years old. Patients were ≥3 months post-transplant, ≥2 months post-discharge, and followed in one of 14 pediatric or 14 adult transplant programs in Canada. Patients were enrolled between June 2015 and March 2018 and followed for 6 months. Adherence was assessed at baseline, 3, and 6 months using the BAASIS self-report tool. Patients were classified as adherent if no doses were missed in the prior 4 weeks. Transplant program directors and nurses completed questionnaires regarding care organization and processes. RESULTS Of the 270 participants, 99 were followed in pediatric programs and 171 in adult programs. Median age was 20.3 years, and median time since transplant was 5 years. At baseline, 71.5% were adherent. Multivariable mixed-effects logistic regression models with program as a random effect identified two program-level factors independently associated with better adherence: the minimum number of prescribed blood draws per year for those >3 years post-transplant (OR 1.12 per 1 additional draw [95% CI 1.00, 1.26]; p = .047), and the average time nurses spend with patients in clinic (OR 1.15 per 5 additional minutes [95% CI 1.03, 1.29]; p = .017). CONCLUSION Program-level factors, including protocols with a greater frequency of routine blood testing and more nurse time with patients, were associated with better medication adherence. This suggests that interventions at the program level may support better adherence.
7. A Peculiar Case of Paraproteinemia and Elevated Creatinine. Am J Kidney Dis 2018; 71:A14-A15. [PMID: 29477177] [DOI: 10.1053/j.ajkd.2017.09.027]
8. Posttransplant Lymphoproliferative Disorder in Adults Receiving Kidney Transplantation in British Columbia: A Retrospective Cohort Analysis. Can J Kidney Health Dis 2018; 5:2054358118760831. [PMID: 29636980] [PMCID: PMC5888818] [DOI: 10.1177/2054358118760831]
Abstract
Background: Posttransplant lymphoproliferative disorder (PTLD) is a major complication following kidney transplantation. Objective: We undertook this study to characterize PTLD in kidney transplant patients in British Columbia with regard to incidence, patient and graft survival, histological subtypes, treatment modalities, and management of immunosuppression. Design: Retrospective cohort analysis. Setting: British Columbia. Patients: All adult patients who underwent kidney transplantation in British Columbia between January 1, 1996, and December 31, 2012, were included. Patients less than 18 years of age at the time of first transplant and multiple organ transplant recipients were excluded from analysis. Measurements: Patients with lymphoproliferative disorders that occurred subsequent to kidney transplantation were considered to have developed PTLD. Methods: Cases of PTLD were identified by cross-referencing data abstracted from the provincial transplant agency’s clinical database with the provincial cancer agency’s lymphoma registry. Patients were followed up for the development of PTLD until December 31, 2012, and for outcomes of death and graft failure until December 31, 2014. Data collection was completed via an electronic chart review. Results: Of 2217 kidney transplant recipients, 37 (1.7%) developed PTLD. Nine cases were early-onset PTLD, occurring within 1 year of transplant; of these cases, 6 were known/presumed Epstein-Barr virus mismatch, compared with only 2 of 28 late-onset cases. Patient survival for early-onset PTLD was 100% at 2 years post diagnosis. Late-onset PTLD had survival rates of 71.4% and 67.9% at 1 and 2 years, respectively. PTLD was associated with significantly decreased patient survival (P = .031) and graft survival (uncensored for death, P = .017), with median graft survival of PTLD and non-PTLD patients being 9.5 and 16 years, respectively. 
Immunosuppressant therapy was reduced in the majority of patients; additional therapies included rituximab monotherapy, CHOP-R, radiation, and surgery. Limitations: Limitations to this study include its retrospective nature and the unknown adherence of patients to prescribed immunosuppressant regimens. In addition, cumulative doses of immunosuppression received and the degree of immunosuppression reduction for PTLD management were not effectively captured. Conclusions: The incidence of PTLD in British Columbia following kidney transplantation was low and consistent with rates reported in the literature. The incidence of late-onset PTLD and its association with reduced patient and graft survival warrant further analysis of patients’ long-term immunosuppression.
9. Timing of Pregnancy After Kidney Transplantation and Risk of Allograft Failure. Am J Transplant 2016; 16:2360-7. [PMID: 26946063] [DOI: 10.1111/ajt.13773]
Abstract
The optimal timing of pregnancy after kidney transplantation remains uncertain. We determined the risk of allograft failure among women who became pregnant within the first 3 posttransplant years. Among 21,814 women aged 15-45 years who received a first kidney-only transplant between 1990 and 2010 captured in the United States Renal Data System, 729 pregnancies were identified using Medicare claims. The probability of allograft failure from any cause including death (ACGL) at 1, 3, and 5 years after pregnancy was 9.6%, 25.9%, and 36.6%, respectively. In multivariate analyses, pregnancy in the first posttransplant year was associated with an increased risk of ACGL (hazard ratio [HR], 1.18; 95% confidence interval [CI], 1.00, 1.40) and of death-censored graft loss (DCGL) (HR, 1.25; 95% CI, 1.04, 1.50), while pregnancy in the second posttransplant year was associated with an increased risk of DCGL (HR, 1.26; 95% CI, 1.06, 1.50). Pregnancy in the third posttransplant year was not associated with an increased risk of ACGL or DCGL. These findings demonstrate a higher incidence of allograft failure after pregnancy than previously reported and show that the increased risk of allograft failure extends to pregnancies in the second posttransplant year.
10. Women’s Experiences of Appearance Concern and Body Control across the Lifespan: Challenging accepted wisdom. J Health Psychol 2004; 9:397-410. [PMID: 15117539] [DOI: 10.1177/1359105304042349]
Abstract
This study adopted a lifespan approach to women’s experiences of appearance concern and body control. Thirty-two women (aged 16 to 77) were interviewed about their exercise and food regulation. Results of the grounded theory analysis challenge social constructions of appearance concern as associated principally with the reproductive years, and of the body as malleable, and highlight the complexity of the relationship between appearance concern and body control. Despite frequent persistence of (or increase in) appearance concern beyond young adulthood, ‘healthier’ responses to appearance concern occurred due to changing priorities and increasing awareness. Findings highlight the utility of an inclusive and qualitative approach, and the absence of simple and sovereign factors determining an individual’s levels of appearance concern or body control.
11. Urinary biomarkers of chronic allograft nephropathy. Proteomics Clin Appl 2015; 9:574-85. [PMID: 25951805] [DOI: 10.1002/prca.201400200]
Abstract
PURPOSE Chronic allograft nephropathy (CAN) is widely accepted as the leading cause of renal allograft loss after the first year post transplantation. This study aimed to identify urinary biomarkers that could predict CAN in transplant patients. EXPERIMENTAL DESIGN The study included 34 renal transplant patients with histologically proven CAN and 36 renal transplant patients with normal renal function. Orbitrap MS was used to analyze a urinary fraction in order to identify further members of a previously identified biomarker tree. This novel biomarker pattern offers the potential to distinguish between transplant recipients with CAN and those with normal renal function. RESULTS The primary node of the biomarker pattern was reconfirmed as β2-microglobulin. Three other members of this biomarker pattern were identified: neutrophil gelatinase-associated lipocalin, clusterin, and kidney injury molecule-1. Significantly higher urinary concentrations of these proteins were found in patients with CAN than in those with normal kidney function. CONCLUSIONS AND CLINICAL RELEVANCE While further validation in a larger, more diverse patient population is required to determine whether this biomarker pattern provides a means of diagnosing CAN noninvasively in a clinical setting, this study clearly demonstrates the biomarkers' ability to stratify patients based on transplant function.
12.
Abstract
IMPORTANCE BK virus infection is a significant complication of modern immunosuppression used in kidney transplantation. Viral reactivation occurs first in the urine (BK viruria) and is associated with a high risk of transplant failure. There are currently no therapies to prevent or treat BK virus infection. Quinolone antibiotics have antiviral properties against BK virus but efficacy at preventing this infection has not been shown in prospective controlled studies. OBJECTIVE To determine if levofloxacin can prevent BK viruria in kidney transplant recipients. DESIGN, SETTING, AND PARTICIPANTS Double-blind, placebo-controlled randomized trial involving 154 patients who received a living or deceased donor kidney-only transplant in 7 Canadian transplant centers between December 2011 and June 2013. INTERVENTIONS Participants were randomly assigned to receive a 3-month course of levofloxacin (500 mg/d; n = 76) or placebo (n = 78) starting within 5 days after transplantation. MAIN OUTCOMES AND MEASURES The primary outcome was time to occurrence of BK viruria (detected using quantitative real-time polymerase chain reaction) within the first year after transplantation. Secondary outcomes included BK viremia, peak viral load, rejection, and patient and allograft survival. RESULTS The mean follow-up time was 46.5 weeks in the levofloxacin group and 46.3 weeks in the placebo group (27 patients had follow-up terminated before the end of the planned follow-up period or development of viruria because the trial was stopped early owing to lack of funding). BK viruria occurred in 22 patients (29%) in the levofloxacin group and in 26 patients (33.3%) in the placebo group (hazard ratio, 0.91; 95% CI, 0.51-1.63; P = .58). There was no significant difference between the 2 groups in regard to any of the secondary end points. 
There was an increased risk of resistant infection among isolates usually sensitive to quinolones in the levofloxacin group vs placebo (14/24 [58.3%] vs 15/45 [33.3%], respectively; risk ratio, 1.75; 95% CI, 1.01-2.98) as well as a nonsignificant increased risk of suspected tendinitis (6/76 [7.9%] vs 1/78 [1.3%]; risk ratio, 6.16; 95% CI, 0.76-49.95). CONCLUSIONS AND RELEVANCE Among kidney transplant recipients, a 3-month course of levofloxacin initiated early following transplantation did not prevent BK viruria. Levofloxacin was associated with an increased risk of adverse events such as bacterial resistance. These findings do not support the use of levofloxacin to prevent posttransplant BK virus infection. TRIAL REGISTRATION clinicaltrials.gov Identifier: NCT01353339.
13. Differential association of body mass index with access to kidney transplantation in men and women. Clin J Am Soc Nephrol 2014; 9:951-9. [PMID: 24742478] [DOI: 10.2215/cjn.08310813]
Abstract
BACKGROUND AND OBJECTIVES Obese patients encounter barriers to medical care not encountered by lean patients, and inequities in access to care among obese patients may vary by sex. This study aimed to determine the association of body mass index (BMI) with access to kidney transplantation in men and women. DESIGN, SETTING, PARTICIPANTS, & MEASUREMENTS In this retrospective analysis of 702,456 incident ESRD patients aged 18-70 years (captured in the US Renal Data System between 1995 and 2007), multivariate time-to-event analyses were used to determine the association of BMI with the likelihood of transplantation from any donor source, transplantation from a living donor, and transplantation from a deceased donor, as well as the individual steps in obtaining a deceased donor transplant (activation to the waiting list and transplantation after wait-listing). RESULTS Among women, a BMI ≥25.0 kg/m² was associated with a lower likelihood of transplantation from any donor source (hazard ratio [HR], 0.75; 95% confidence interval [95% CI], 0.73 to 0.77), transplantation from a living donor (HR, 0.75; 95% CI, 0.72 to 0.77), and transplantation from a deceased donor (HR, 0.74; 95% CI, 0.72 to 0.77). By contrast, among men, a BMI of 25.0-34.9 kg/m² was associated with a higher likelihood of transplantation from any donor source (HR, 1.08; 95% CI, 1.06 to 1.11), transplantation from a living donor (HR, 1.18; 95% CI, 1.13 to 1.22), and transplantation from a deceased donor (HR, 1.05; 95% CI, 1.02 to 1.07). Among men, the level beyond which BMI was associated with a lower likelihood of transplantation from any donor source or a living donor was ≥40.0 kg/m², and ≥35.0 kg/m² in the case of deceased donor transplantation. CONCLUSIONS The association of BMI with access to transplantation varies between men and women. The reasons for this difference should be further studied.
14. The effect of race and income on living kidney donation in the United States. J Am Soc Nephrol 2013; 24:1872-9. [PMID: 23990679] [DOI: 10.1681/asn.2013010049]
Abstract
Studies of racial disparities in access to living donor kidney transplantation focus mainly on patient factors, whereas donor factors remain largely unexamined. Here, data from the US Census Bureau were combined with data on all African-American and white living kidney donors in the United States who were registered in the United Network for Organ Sharing (UNOS) between 1998 and 2010 (N=57,896) to examine the associations between living kidney donation (LKD) and donor median household income and race. The relative incidence of LKD was determined in zip code quintiles ranked by median household income after adjustment for age, sex, ESRD rate, and geography. The incidence of LKD was greater in higher-income quintiles in both African-American and white populations. Notably, the total incidence of LKD was higher in the African-American population than in the white population (incidence rate ratio [IRR], 1.20; 95% confidence interval [95% CI], 1.17 to 1.24), but ratios varied by income. The incidence of LKD was lower in the African-American population than in the white population in the lowest income quintile (IRR, 0.84; 95% CI, 0.78 to 0.90), but higher in the African-American population in the three highest income quintiles, with IRRs of 1.31 (95% CI, 1.22 to 1.41) in Q3, 1.50 (95% CI, 1.39 to 1.62) in Q4, and 1.87 (95% CI, 1.73 to 2.02) in Q5. Thus, these data suggest that racial disparities in access to living donor transplantation are likely due to socioeconomic factors rather than cultural differences in the acceptance of LKD.
15. The survival benefit of kidney transplantation in obese patients. Am J Transplant 2013; 13:2083-90. [PMID: 23890325] [DOI: 10.1111/ajt.12331]
Abstract
Obese patients have a decreased risk of death on dialysis but an increased risk of death after transplantation, and may derive a lower survival benefit from transplantation. Using data from the United States between 1995 and 2007 and multivariate nonproportional hazards analyses, we determined the relative risk of death in transplant recipients grouped by body mass index (BMI) compared with wait-listed candidates with the same BMI (n = 208,498). One year after transplantation, the survival benefit of transplantation varied by BMI: standard criteria donor transplantation was associated with a 48% reduction in the risk of death in patients with BMI ≥40 kg/m² but a ≥66% reduction in patients with BMI <40 kg/m². Living donor transplantation was associated with a ≥66% reduction in the risk of death in all BMI groups. In subgroup analyses, transplantation from any donor source was associated with a survival benefit in obese patients ≥50 years and in diabetic patients, but a survival benefit was not demonstrated in Black patients with BMI ≥40 kg/m². Although most obese patients selected for transplantation derive a survival benefit, the benefit is lower when BMI is ≥40 kg/m² and uncertain in Black patients with BMI ≥40 kg/m².
16. Quinolone prophylaxis for the prevention of BK virus infection in kidney transplantation: study protocol for a randomized controlled trial. Trials 2013; 14:185. [PMID: 23800312] [PMCID: PMC3691619] [DOI: 10.1186/1745-6215-14-185]
Abstract
BACKGROUND BK virus infection has emerged as a major complication in kidney transplantation leading to a significant reduction in graft survival. There are currently no proven strategies to prevent or treat BK virus infection. Quinolone antibiotics, such as levofloxacin, have demonstrated activity against BK virus. We hypothesize that administration of a quinolone antibiotic, when given early post-transplantation, will prevent the establishment of BK viral replication in the urine and thus prevent systemic BK virus infection. METHODS/DESIGN The aim of this pilot trial is to assess the efficacy, safety and feasibility of a 3-month course of levofloxacin in the kidney transplant population. This is a multicenter, randomized, double-blind, placebo-controlled trial with two parallel arms conducted in 11 Canadian kidney transplant centers. A total of 154 patients with end-stage renal disease undergoing kidney transplantation will be randomized to receive a 3-month course of levofloxacin or placebo starting in the early post-transplant period. Levofloxacin will be administered at 500 mg po daily with dose adjustments based on kidney function. The primary outcome will be the time to occurrence of BK viruria within the first year post-transplantation. Secondary outcomes include BK viremia, measures of safety (adverse events, resistant infections,Clostridium difficile-associated diarrhea), measures of feasibility (proportion of transplanted patients recruited into the trial), proportion of patients adherent to the protocol, patient drop-out and loss to follow-up,and use of quinolone antibiotics outside of the trial protocol. DISCUSSION Results from this pilot study will provide vital information to design and conduct a large, multicenter trial to determine if quinolone therapy decreases clinically meaningful outcomes in kidney transplantation. 
If levofloxacin significantly reduces BK viruria and urine viral loads in kidney transplantation, it will provide important justification to progress to the larger trial. If the full trial shows that levofloxacin significantly reduces BK infection and improves outcomes, its use in kidney transplantation will be strongly endorsed given the lack of proven therapies for this condition. TRIAL REGISTRATION This trial was funded by the Canadian Institutes of Health Research (grant number: 222493) and is registered at ClinicalTrials.gov (NCT01353339).
|
17
|
Quantification of the early risk of death in elderly kidney transplant recipients. Am J Transplant 2013; 13:427-32. [PMID: 23167257 DOI: 10.1111/j.1600-6143.2012.04323.x] [Citation(s) in RCA: 76] [Impact Index Per Article: 6.9] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/05/2012] [Revised: 09/04/2012] [Accepted: 09/26/2012] [Indexed: 01/25/2023]
Abstract
To inform decision making regarding transplantation in patients ≥ 65 years, we quantified the early posttransplant risk of death by determining the time to equal risk and equal survival between transplant recipients and wait-listed dialysis patients in the United States between 1995 and 2007 (total n = 25 468). Survival was determined using separate multivariate nonproportional hazards analyses in patients at low, intermediate, and high cardiovascular risk. Compared to wait-listed patients with similar cardiovascular risk, standard criteria donor (SCD) and expanded criteria donor (ECD) recipients had a higher risk of death in the perioperative and early posttransplant period. In contrast, low- and intermediate-risk living donor (LD) recipients had an immediate survival advantage compared to similar-risk wait-listed patients. In all risk groups, transplantation was associated with a long-term survival advantage compared to dialysis, but there were marked differences in time to equal risk of death and time to equal survival by donor type. For example, survival in high-risk recipients of an LD, SCD and ECD transplant became equal to that in similar-risk wait-listed patients 130, 368 and 521 days after transplantation, respectively. Early posttransplant mortality risk is eliminated in low- and intermediate-risk patients, and markedly reduced in high-risk patients, with LD transplantation.
|
18
|
Income of living kidney donors and the income difference between living kidney donors and their recipients in the United States. Am J Transplant 2012; 12:3111-8. [PMID: 22882723 DOI: 10.1111/j.1600-6143.2012.04211.x] [Citation(s) in RCA: 31] [Impact Index Per Article: 2.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/25/2023]
Abstract
Disincentives for living kidney donation are common but are poorly understood. We studied 54 483 living donor kidney transplants in the United States between 2000 and 2009, limiting to those with valid zip code data to allow determination of median household income by linkage to the 2000 U.S. Census. We then determined the income and income difference of donors and recipients. The median household income in donors and recipients was $46 334 ± $17 350 and $46 439 ± $17 743, respectively. Donation-related expenses consume ≥ 1 month's income in 76% of donors. The mean ± standard deviation income difference between recipients and donors in transplants involving a wealthier recipient was $22 760 ± $14 792, and in 90% of transplants the difference was <$40 000. The findings suggest that the capacity for donors to absorb the financial consequences of donation, or of recipients to reimburse allowable expenses, is limited. There were few transplants with a large difference in recipient and donor income, suggesting that the scope and value of any payment between donors and recipients is likely to be small. We conclude that most donors and recipients have similar modest incomes, suggesting that the costs of donation are a significant burden in the majority of living donor transplants.
|
19
|
Kidney Disease: A Guide for Living, by Walter A. Hunt. Am J Transplant 2011. [DOI: 10.1111/j.1600-6143.2011.03752.x] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/28/2022]
|
20
|
Abstract
Whether pancreas after kidney transplantation (PAK) compromises kidney allograft survival, and what pre-PAK glomerular filtration rate (GFR) should be used to select patients for PAK, is unclear. We analyzed all (n = 2776) PAK recipients in the United States between 1989 and 2007 and compared their risk of kidney failure to a comparator group of n = 13 635 young adult diabetic kidney-only transplant recipients during the same time, after accounting for selection bias by the use of a propensity score for PAK in a multivariate time-to-event analysis. In a secondary analysis, we determined the association of pre-PAK GFR with subsequent kidney allograft survival. Despite an increased risk of death early after pancreas transplantation, PAK recipients had a decreased long-term risk of kidney allograft failure compared to diabetic kidney-only transplant recipients (HR 0.89; 95% CI 0.78-1.00; p = 0.05). An association of pre-PAK GFR with kidney survival was not evident until 3 years after pancreas transplantation, and patients with a pre-PAK GFR of 30-39 mL/min still attained 10-year post-PAK kidney survival of 69%. We conclude that PAK is associated with improved kidney allograft survival, and pre-PAK GFR 30-39 mL/min should not preclude PAK. Expanded use of PAK is warranted.
|
21
|
Identification of β2-microglobulin as a urinary biomarker for chronic allograft nephropathy using proteomic methods. Proteomics Clin Appl 2011; 5:422-31. [PMID: 21751411 DOI: 10.1002/prca.201000160] [Citation(s) in RCA: 22] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/22/2010] [Revised: 03/30/2011] [Accepted: 04/20/2011] [Indexed: 12/13/2022]
Abstract
PURPOSE Chronic allograft nephropathy (CAN) remains the leading cause of renal graft loss after the first year following renal transplantation. This study aimed to identify novel urinary proteomic profiles, which could distinguish and predict CAN in susceptible individuals. EXPERIMENTAL DESIGN The study included 34 renal transplant patients with histologically proven CAN and 36 patients with normal renal transplant function. High-throughput proteomic profiles were generated from urine samples with three different ProteinChip arrays by surface-enhanced laser-desorption/ionization time-of-flight mass spectrometry (SELDI-TOF-MS). Following SELDI, a biomarker pattern software analysis was performed which led to the identification of a novel biomarker pattern that could distinguish patients with CAN from those with normal renal function. RESULTS An 11.7 kDa protein identified as β2 microglobulin was the primary protein of this biomarker pattern, distinguishing CAN from control patients (area under the receiver operating characteristic curve [ROC] = 0.996). SELDI-TOF-MS comparison of purified β2 microglobulin protein and CAN urine demonstrated identical 11.7 kDa protein peaks. Significantly higher concentrations of β2 microglobulin were found in the urine of patients with CAN compared with the urine of normal renal function transplant recipients (p<0.001). CONCLUSIONS AND CLINICAL RELEVANCE Although further validation in a larger, more diverse patient population is required to determine if this β2 microglobulin biomarker will provide a potential means of diagnosing CAN by noninvasive methods in a clinical setting, this study clearly shows a capability to stratify control and disease patients.
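As an aside on the ROC statistic reported in this abstract: for a single continuous marker such as urinary β2 microglobulin, the area under the ROC curve equals the Mann-Whitney probability that a randomly chosen case has a higher marker value than a randomly chosen control, so it can be computed directly without curve fitting. A minimal sketch, using made-up marker values rather than the study's data:

```python
def roc_auc(cases, controls):
    """Area under the ROC curve for a single continuous marker,
    computed as the Mann-Whitney probability that a case value
    exceeds a control value (ties count one half)."""
    wins = 0.0
    for x in cases:
        for y in controls:
            if x > y:
                wins += 1.0
            elif x == y:
                wins += 0.5
    return wins / (len(cases) * len(controls))

# Hypothetical urinary marker concentrations (arbitrary units).
auc = roc_auc(cases=[3.1, 4.2, 5.0], controls=[1.0, 2.4, 3.1])
```

An AUC of 1.0 corresponds to perfect separation of cases and controls, as was nearly the case (0.996) in this study.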
|
22
|
Excessive exercise: From quantitative categorisation to a qualitative continuum approach. EUROPEAN EATING DISORDERS REVIEW 2011; 19:237-48. [DOI: 10.1002/erv.970] [Citation(s) in RCA: 25] [Impact Index Per Article: 1.9] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/11/2022]
|
23
|
Opportunities to deter transplant tourism exist before referral for transplantation and during the workup and management of transplant candidates. Kidney Int 2011; 79:1026-31. [DOI: 10.1038/ki.2010.540] [Citation(s) in RCA: 24] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/15/2022]
|
24
|
Therapeutic writing as an intervention for symptoms of bulimia nervosa: effects and mechanism of change. Int J Eat Disord 2010; 43:405-19. [PMID: 19544556 DOI: 10.1002/eat.20714] [Citation(s) in RCA: 7] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Indexed: 11/12/2022]
Abstract
OBJECTIVE This study explored the effects on bulimic symptomatology of a writing task intended to reduce emotional avoidance. METHOD Eighty individuals reporting symptoms of bulimia completed, by e-mail, a therapeutic or control writing task. Participants completed questionnaires on bulimic symptoms, mood, and potential moderating and mediating factors, and were followed up after 4 and 8 weeks. Writing content was explored using a word count package and qualitative framework analysis. RESULTS Bulimic symptoms decreased in both groups, although in both groups the number of participants who improved was approximately equal to the number who did not improve. Symptom decreases were associated with increases in perceived mood regulation abilities, and decreases in negative beliefs about emotions. Participants preferred internet delivery to face-to-face discussion. DISCUSSION For individuals experiencing symptoms of bulimia, the effects of therapeutic writing did not differ significantly from effects of a control writing task.
|
25
|
Favorable Graft Survival in Renal Transplant Recipients with Polycystic Kidney Disease. Ren Fail 2009. [DOI: 10.1081/jdi-56606] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/03/2022] Open
|
26
|
General practitioner attitudes towards referral of eating-disordered patients: a vignette study based on the theory of planned behaviour. MENTAL HEALTH IN FAMILY MEDICINE 2008; 5:213-218. [PMID: 22477872 PMCID: PMC2777584] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Subscribe] [Scholar Register] [Accepted: 03/29/2009] [Indexed: 05/31/2023]
Abstract
Objective The study examined individual differences between general practitioners (GPs) to determine their impact on variations in intention to refer a hypothetical patient with disordered eating to specialist eating disorder services. The study also examined the impact of patient weight on intention to refer. Method GPs within three primary care trusts (PCTs) were posted a vignette depicting a patient with disordered eating, described as either normal weight or underweight. A questionnaire was developed from the theory of planned behaviour to assess the GPs' attitudes, perception of subjective norms, perceived behavioural control, and intention to refer the patient. Demographic details were also collected. Results Responses were received from 88 GPs (33%). Intention to refer the patient was significantly related to subjective norms and cognitive attitudes. Together these predictors explained 86% of the variance in the intention to refer. GP or practice characteristics did not have a significant effect on the GPs' intention to refer, nor did the patient's weight. Conclusion Despite current National Institute for Health and Clinical Excellence guidance, patient weight did not influence GPs' decisions to refer. Much of the variance in actual referral behaviour may be explained by cognitive attitudes and subjective norms. Interventions to reduce this variation should be focused on informing GPs about actual norms and best practice guidelines.
|
27
|
Identification of Apolipoprotein AI as a serum biomarker of chronic kidney disease in liver transplant recipients, using proteomic techniques. Proteomics Clin Appl 2008; 2:1338-48. [DOI: 10.1002/prca.200780167] [Citation(s) in RCA: 8] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/08/2008] [Indexed: 11/06/2022]
|
28
|
Sirolimus is associated with new-onset diabetes in kidney transplant recipients. J Am Soc Nephrol 2008; 19:1411-8. [PMID: 18385422 DOI: 10.1681/asn.2007111202] [Citation(s) in RCA: 286] [Impact Index Per Article: 17.9] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/15/2023] Open
Abstract
New-onset diabetes (NOD) is associated with transplant failure. A few single-center studies have suggested that sirolimus is associated with NOD, but this is not well established. With the use of data from the United States Renal Data System, this study evaluated the association between sirolimus use at the time of transplantation and NOD among 20,124 adult recipients of a first kidney transplant without diabetes. Compared with patients treated with cyclosporine and either mycophenolate mofetil or azathioprine, sirolimus-treated patients were at increased risk for NOD, whether it was used in combination with cyclosporine (adjusted hazard ratio [HR] 1.61; 95% confidence interval [CI] 1.36 to 1.90), tacrolimus (adjusted HR 1.66; 95% CI 1.42 to 1.93), or an antimetabolite (mycophenolate mofetil or azathioprine; adjusted HR 1.36; 95% CI 1.09 to 1.69). Similar results were obtained in a subgroup analysis that included the 16,861 patients who did not have their immunosuppressive regimen changed throughout the first posttransplantation year. In conclusion, sirolimus is independently associated with NOD. Given the negative impact of NOD on posttransplantation outcomes, these findings should be confirmed in prospective studies or in meta-analyses of existing trials that involved sirolimus.
|
29
|
Impact of acute rejection and new-onset diabetes on long-term transplant graft and patient survival. Clin J Am Soc Nephrol 2008; 3:814-21. [PMID: 18322046 DOI: 10.2215/cjn.04681107] [Citation(s) in RCA: 183] [Impact Index Per Article: 11.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/23/2022]
Abstract
BACKGROUND AND OBJECTIVES Development of new therapeutic strategies to improve long-term transplant outcomes requires improved understanding of the mechanisms by which these complications limit long-term transplant survival. DESIGN, SETTING, PARTICIPANTS, & MEASUREMENTS The association of acute rejection and new-onset diabetes was determined in the first posttransplantation year with the outcomes of transplant failure from any cause, death-censored graft loss, and death with a functioning graft in 27,707 adult recipients of first kidney-only transplants, with graft survival of at least 1 yr, performed between 1995 and 2002 in the United States. RESULTS In multivariate analyses, patients who developed acute rejection or new-onset diabetes had a similar risk for transplant failure from any cause, but the mechanisms of transplant failure were different: Acute rejection was associated with death-censored graft loss but only weakly associated with death with a functioning graft. In contrast new-onset diabetes was not associated with death-censored graft loss but was associated with an increased risk for death with a functioning graft. CONCLUSIONS Acute rejection and new-onset diabetes have a similar impact on long-term transplant survival but lead to transplant failure through different mechanisms. The mechanisms by which new-onset diabetes leads to transplant failure should be prospectively studied. Targeted therapeutic strategies to minimize the impact of various early posttransplantation complications may lead to improved long-term outcomes.
|
30
|
Qualitative study of depression management in primary care: GP and patient goals, and the value of listening. Br J Gen Pract 2007; 57:872-9. [PMID: 17976282 PMCID: PMC2169333 DOI: 10.3399/096016407782318026] [Citation(s) in RCA: 61] [Impact Index Per Article: 3.6] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/29/2007] [Revised: 05/17/2007] [Accepted: 08/14/2007] [Indexed: 10/31/2022] Open
Abstract
BACKGROUND Guidelines for depression management have been developed but little is known about GP and patient goals, which are likely to influence treatment offers, uptake, and adherence. AIM To identify issues of importance to GPs, patients, and patients' supporters regarding depression management. GP and patient goals for depression management became a focus of the study. DESIGN OF STUDY Grounded theory-based qualitative study. SETTING GPs were drawn from 28 practices. The majority of patients and supporters were recruited from 10 of these practices. METHOD Sixty-one patients (28 depressed, 18 previously depressed, 15 never depressed), 18 supporters, and 32 GPs were interviewed. RESULTS GPs described encouraging patients to view depression as separate from the self and 'normal' sadness. Patients and supporters often questioned such boundaries, rejecting the notion of a medical cure and emphasising self-management. The majority of participants who were considering depression-management strategies wanted to 'get out' of their depression. However, a quarter did not see this as immediately relevant or achievable. They focused on getting by from day to day, which had the potential to clash with GP priorities. GP frustration and uncertainty could occur when depression was resistant to cure. Participants identified the importance of GPs listening to patients, but often felt that this did not happen. CONCLUSION Physicians need greater awareness of the extent to which their goals for the management of depression are perceived as relevant or achievable by patients. Future research should explore methods of negotiating agreed strategies for management.
|
31
|
Abstract
BACKGROUND Earlier diagnosis of disordered eating is linked to improved prognosis, but detection in primary care is poor. OBJECTIVES To assess the feasibility of screening for disordered eating within primary care, in terms of the proportion of patients accepting screening, yield of cases, action taken by staff and staff views on screening. METHODS Data were collected in open GP surgeries, midwife (MW) antenatal clinics and health visitor (HV) child health surveillance clinics in two GP practices, using face-to-face surveys and semi-structured interviews. Female patients aged 16-35 were asked to complete the SCOFF questionnaire, which was scored by researchers and taken by the patient into their consultation. If the result indicated possible disturbed eating, the health professional (HP) running the surgery/clinic was asked to complete a questionnaire and interview. One hundred and eleven women were screened and 11 HPs (GPs, MWs, HVs) were interviewed. RESULTS Forty-six percent of patients agreed to be screened. Of these, 16% produced a positive result. The staff survey suggested that HPs found screening acceptable. However, concerns arose in the interviews, principally over what action to take in response to positive results. Positive results were rarely recorded in medical notes, and treatment was rarely offered. CONCLUSION In order for a screening programme for eating disorders to be implemented in primary care, HP concerns about options for dealing with positive results would need to be addressed. Feasibility of screening would be enhanced by production of a protocol to be followed in the case of positive results.
|
32
|
Access to kidney transplantation: the limitations of our current understanding. J Nephrol 2007; 20:501-506. [PMID: 17918132] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 05/25/2023]
Abstract
Since kidney transplantation (KTX) is the preferred means of treating kidney failure, ensuring that all patients who may benefit from KTX have equal access to this scarce resource is an important objective. Studies focusing on this issue will become increasingly important as the gap between the demand and supply of organs continues to increase, and changes to the United Network of Organ Sharing organ allocation policy are actively debated. However, it is clear that current methods used to study access to KTX have serious limitations. This review highlights the shortcomings of the methods currently used to assess access to KTX, and the limitations of registry data and national wait-list data as information sources to study patient access to KTX. The review also provides suggestions for research and analytical approaches that might be utilized to improve our future understanding of patient access to KTX. The information provided will aid the reader to critically assess issues related to patient access to KTX.
|
33
|
Abstract
The role of transplant nephrectomy after transplant failure is uncertain. We report the use and consequences of transplant nephrectomy among 19 107 transplant failure patients between 1995 and 2003 in the United States. Among 3707 patients with early transplant failure (graft survival <12 m), nephrectomy was performed in 56%, and was associated with an increased risk of death (HR 1.13, 95% CI 1.01-1.26). In contrast, among 15,400 patients with late transplant failure (graft survival ≥12 m), nephrectomy was performed in 27%, and was associated with a decreased risk of death (HR 0.89, 95% CI 0.83-0.95). In early transplant failure patients, nephrectomy was associated with a lower risk of repeat transplant failure (HR 0.72, 95% CI 0.56-0.94), while among late transplant failure patients, nephrectomy was associated with a higher risk of repeat transplant failure (HR 1.20, 95% CI 1.02-1.41). Definitive conclusions are not possible from this observational study. The role of nephrectomy in the management of dialysis-treated transplant failure patients, and the implications of nephrectomy for repeat transplantation, should be further studied in prospective studies.
|
34
|
Prevention of sepsis during the transition to dialysis may improve the survival of transplant failure patients. J Am Soc Nephrol 2007; 18:1331-7. [PMID: 17314323 DOI: 10.1681/asn.2006091017] [Citation(s) in RCA: 42] [Impact Index Per Article: 2.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/17/2022] Open
Abstract
Dialysis patients are at risk for sepsis, and the risk may be even higher among transplant failure patients because of previous or ongoing immunosuppression. The incidence and the consequences of sepsis as defined by International Classification of Diseases, Ninth Revision, Clinical Modification hospital discharge diagnoses codes were determined among 5117 patients who initiated dialysis after transplant failure between 1995 and 2004 in the United States. The overall sepsis rate was 11.8 per 100 patient-years (95% confidence interval [CI] 11.5 to 12.1). Sepsis was highest in the first 6 mo after transplant failure (35.6 per 100 patient-years [95% CI 29.4 to 43.0] between 0 to 3 mo after transplant failure; 19.7 per 100 patient-years [95% CI 17.2 to 22.5] between 3 to 6 mo after transplant failure). In comparison, the sepsis rate among incident dialysis patients between 3 and 6 mo after dialysis initiation was 7.8 per 100 patient-years (95% CI 7.3 to 8.3), whereas the sepsis rate among transplant recipients between 3 and 6 mo after transplantation was 5.4 per 100 patient-years (95% CI 4.9 to 5.9). Patients who were ≥60 yr, obese patients, patients with diabetes, and patients with a history of peripheral vascular disease or congestive heart failure were at risk for sepsis. Transplant nephrectomy was not associated with septicemia. The role of continued immunosuppression and vascular access creation was not assessed and should be addressed in future studies. In a multivariate analysis, patients who were hospitalized for sepsis had an increased risk for death (hazard ratio 2.93; 95% CI 2.64 to 3.24; P < 0.001). Strategies to prevent sepsis during the transition from transplantation to dialysis may improve the survival of patients with allograft failure.
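The sepsis figures in this abstract are incidence rates expressed per 100 patient-years, with confidence intervals. As an illustration of the arithmetic only (the event counts below are hypothetical, not the study's), a rate and a rough log-normal 95% confidence interval can be sketched as:

```python
import math

def rate_per_100_py(events, patient_years):
    """Incidence rate expressed per 100 patient-years of follow-up."""
    return events / patient_years * 100.0

def approx_95ci(events, patient_years):
    """Rough 95% CI via a normal approximation on the log rate
    (SE of the log rate ~ 1/sqrt(events)); assumes events > 0."""
    rate = rate_per_100_py(events, patient_years)
    se = 1.0 / math.sqrt(events)
    return rate * math.exp(-1.96 * se), rate * math.exp(1.96 * se)

# Hypothetical: 59 sepsis admissions over 500 patient-years of follow-up.
rate = rate_per_100_py(59, 500.0)   # 11.8 per 100 patient-years
lo, hi = approx_95ci(59, 500.0)
```

Registry studies such as this one typically use exact Poisson intervals, which the normal approximation above only approximates.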
|
35
|
Reduced graft function (with or without dialysis) vs immediate graft function--a comparison of long-term renal allograft survival. Nephrol Dial Transplant 2006; 21:2270-4. [PMID: 16720598 DOI: 10.1093/ndt/gfl103] [Citation(s) in RCA: 94] [Impact Index Per Article: 5.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/13/2022] Open
Abstract
BACKGROUND Delayed graft function (DGF) is a common complication in cadaveric kidney transplants affecting graft outcome. However, the reported incidence of DGF differs widely between centres because its definition is very variable. The purpose of this study was to define a parameter for DGF and immediate graft function (IGF) and to compare the graft outcome between these groups at our centre. METHODS The renal allograft function of 972 first cadaveric transplants performed between 1990 and 2001 in the Republic of Ireland was examined. DGF and IGF were defined by a creatinine reduction ratio (CRR) between the time of transplantation and day 7 post-transplantation of <70% and >70%, respectively. Recipients with reduced graft function who did not require dialysis were defined as slow graft function (SGF) patients. The serum creatinine at 3 months, 6 months, 1, 2 and 5 years after transplantation was compared between these groups of recipients. The graft survival rates at 1, 3 and 5 years and the graft half-life for DGF, SGF and IGF recipients were also assessed. RESULTS Of the 972 renal transplant recipients, DGF was seen in 102 (10.5%) patients, SGF in 202 (20.8%) recipients and IGF in 668 (68.7%) patients. Serum creatinine levels were significantly different between the three groups at 3 and 6 months, 1, 2 and 5 years. Graft survival at 5 years was 48.5% for DGF patients, 60.5% for SGF recipients and 75% for IGF patients, with graft half-lives of 4.9, 8.7 and 10.5 years, respectively. CONCLUSION This study has shown that the CRR at day 7 correlates with renal function up to 5 years post-transplantation and with long-term graft survival. We have also demonstrated that amongst patients with reduced graft function after transplantation, two groups with significantly different outcomes exist.
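The CRR definition in this abstract is simple arithmetic, and the DGF/SGF/IGF classification can be sketched as follows. This is an illustrative sketch, not the study's code; in particular, the handling of a CRR of exactly 70% is an assumption, since the abstract specifies only <70% and >70%:

```python
def creatinine_reduction_ratio(cr_day0, cr_day7):
    """Percent fall in serum creatinine between transplantation (day 0)
    and day 7 post-transplantation."""
    return (cr_day0 - cr_day7) / cr_day0 * 100.0

def classify_graft_function(cr_day0, cr_day7, dialysis_required):
    """Classify early graft function using the study's 70% CRR cut-off.

    IGF: CRR above 70%.
    Reduced function (CRR below 70%) splits into DGF (dialysis required)
    and SGF (no dialysis required).
    Exactly 70% is grouped with IGF here by assumption.
    """
    crr = creatinine_reduction_ratio(cr_day0, cr_day7)
    if crr >= 70.0:
        return "IGF"
    return "DGF" if dialysis_required else "SGF"

# Hypothetical recipient: creatinine 800 -> 400 umol/L by day 7, no dialysis.
label = classify_graft_function(800.0, 400.0, dialysis_required=False)  # "SGF"
```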
|
36
|
The impact of donor spontaneous intracranial haemorrhage vs. other donors on long-term renal graft and patient survival. Clin Transplant 2006; 20:91-5. [PMID: 16556161 DOI: 10.1111/j.1399-0012.2005.00446.x] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/28/2022]
Abstract
BACKGROUND Donor cause of death has a significant impact on transplant survival in heart transplant recipients. The objective of this study was to determine if long-term renal allograft and patient survival differed between grafts donated by donors who died of spontaneous intracranial haemorrhage (SIH) compared with those with other causes of death (OCOD). METHODS Between 1990 and 2001, 1526 renal transplants were performed (711 SIH donors and 815 OCOD donors) at our unit. Serum creatinine levels at 1 yr, graft half-life and annual graft failure rate were measured for both groups. Renal graft and patient survivals between the groups were compared. Relative risk for SIH donors and other confounding variables was measured using Cox proportional hazards models. RESULTS Graft half-life results were obtained for SIH (8 yr) and OCOD (10.13 yr) recipients. Graft and patient survival at 5 and 10 yr was 68.5% and 39.3% respectively for the SIH group vs. 76.8% and 51.9% respectively for the OCOD group (p < 0.001). However, SIH graft recipients were significantly older and included more females. After adjustment for differences in baseline variables between the groups, donor cause of death did not have an independent effect on long-term graft or patient survival. CONCLUSION Spontaneous intracranial haemorrhage as a cause of donor death failed to have a significant independent effect on long-term allograft and patient survival.
|
37
|
Favorable Graft Survival in Renal Transplant Recipients with Polycystic Kidney Disease. Ren Fail 2005. [DOI: 10.1081/jdi-200056606] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/03/2022] Open
|
38
|
Favorable graft survival in renal transplant recipients with polycystic kidney disease. Ren Fail 2005; 27:309-14. [PMID: 15957548] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 05/03/2023] Open
Abstract
Graft survival in the autosomal dominant polycystic kidney disease (ADPKD) transplant population at our center was compared to other end stage renal disease (ESRD) transplant recipients (excluding diabetics). There were 1512 adult cadaveric renal transplants carried out at our center between 1989 and 2002. After exclusions, 1372 renal grafts were included in the study. Using Kaplan-Meier methods, patient and graft survival were determined and compared between the two groups. Mean age at transplant was significantly older for the ADPKD group of patients. The age-adjusted graft survival at 5 years was 79% for ADPKD patients compared to 68% in the controls. Patient survival for ADPKD patients improved from 89% at 5 years to 95% when age-adjusted. Using Cox proportional hazards models to compare ADPKD with other causes of ESRD (including recipient age and other variables) in a multifactorial model, ADPKD was significant at the 5% level (p=0.036). This study demonstrates a graft and patient survival advantage in ADPKD patients compared with age-matched ESRD patients.
|
39
|
|
40
|
Abstract
This article presents preliminary findings from the first participant to complete an experiment assessing the efficacy of the personal stereo in treating auditory hallucinations. O.C., a 50-year-old woman, took part in a controlled treatment trial in which 1-week baseline, personal stereo, and control treatment (nonfunctioning hearing aid) stages were alternated for 7 weeks. The Positive and Negative Syndrome Scale, Clinical Global Impression Scales, Beliefs About Voices Questionnaire, Rosenberg Self-Esteem Scale, and Topography of Voices Rating Scale were used. The personal stereo led to a decrease in the severity of O.C.'s auditory hallucinations. For example, she rated her voices as being fairly distressing during baseline and control treatment stages but neutral during personal stereo stages. A slight decrease in other psychopathology also occurred during personal stereo stages. Use of the personal stereo did not lead to a decrease in self-esteem, contradicting suggestions that counterstimulation treatments for auditory hallucinations may be disempowering.
|
41
|
New territory. Interview by Derek Hand. Nurs Stand 2001; 15:18-9. [PMID: 12214398] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [MESH Headings] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 02/26/2023]
|
42
|
|
43
|
|
44
|
Abstract
Screening of 717 5-year-olds and 757 7-year-olds identified 55 of the former and 77 of the latter as possibly poorly coordinated. Further diagnostic testing with the McCarthy Motor Scales confirmed the problem in a total of 95 children, a prevalence of 6.4%. Neurological examination showed 43% of the 5-year-olds and 21% of the 7-year-olds to have choreiform movements. Of the total 95, proprioception was abnormal in 40%, but abnormal muscle tone was present in only 4%. An increased prevalence of hearing loss and obesity and a history of developmental delays were found. Low birth weight, prematurity, post-maturity and perinatal problems were significantly associated with poor coordination. Socioeconomic status was not a significant factor. The difficulties of testing and measuring poor coordination and the need for more precise measures are discussed. Follow-up of at-risk children at age 5 with tests of motor coordination is recommended.
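As a quick arithmetic check of the 6.4% prevalence reported in the abstract above (95 confirmed cases among the 717 + 757 children screened):

```python
# Verify the prevalence figure quoted in the abstract.
screened = 717 + 757   # 5-year-olds plus 7-year-olds
confirmed = 95         # children with confirmed poor coordination
prevalence = confirmed / screened
print(f"{prevalence:.1%}")  # → 6.4%
```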
|
45
|
Poor co-ordination in 5 year olds: a screening test for use in schools. AUSTRALIAN PAEDIATRIC JOURNAL 1987; 23:157-61. [PMID: 2444202 DOI: 10.1111/j.1440-1754.1987.tb00235.x] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 01/01/2023]
Abstract
A simple standardized screening test (South Australian Motor Co-ordination Screening Test, SAM Test) was developed to screen for poor co-ordination in 5 year olds. This SAM Test, which can be used by teachers, nurses and doctors, has explicit pass/fail criteria and correctly classified 90% of children. The McCarthy Motor Scales, which are time-consuming and limited to use by psychologists, were used to categorize 60 poorly co-ordinated and 60 normal children. The 120 children thus selected were tested on 19 items covering gross and fine motor skills. Statistical analysis to determine which items best discriminated between the two groups found the following five gross motor items to be most effective: one-leg balancing, hopping, heel-toe walking on a line, jumping over a ribbon, and dropping and catching a ball.
|
46
|
Pre-school developmental screening. AUSTRALIAN PAEDIATRIC JOURNAL 1983; 19:272-3. [PMID: 6201158] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [MESH Headings] [Subscribe] [Scholar Register] [Indexed: 01/18/2023]
|
47
|
School entry health examination. Med J Aust 1981; 1:198-9. [PMID: 7231301 DOI: 10.5694/j.1326-5377.1981.tb135477.x] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/24/2023]
|
48
|
Ill health and developmental delays in Adelaide four-year-old children. AUSTRALIAN PAEDIATRIC JOURNAL 1980; 16:248-54. [PMID: 6165347 DOI: 10.1111/j.1440-1754.1980.tb01308.x] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.0] [Reference Citation Analysis] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 01/18/2023]
|
49
|
A coronary risk-factor profile of 4 year olds. II. Inter-relationships, clustering, and tracking of blood pressure, serum lipoproteins, and skinfold thickness. AUSTRALIAN PAEDIATRIC JOURNAL 1978; 14:278-82. [PMID: 747548] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [MESH Headings] [Subscribe] [Scholar Register] [Indexed: 12/24/2022]
|
50
|
Abstract
A retrospective study, with adequate follow-up of 5 years or longer, of 114 patients who underwent transurethral resection for cure or control of bladder tumors and in whom there were 1 or more recurrences is reported. There was recurrence of the tumor in a higher grade in 19 per cent of the cases and in a higher stage in 22 per cent. The interval between the treatment of the original tumor and a recurrence in a higher grade or stage varied from 0.25 years (3 months) to 27 years. When a low grade or low stage tumor recurs in a higher category in less than 3 years, more aggressive treatment should be considered.
|