1. Kasiske BL, Snyder JJ, Gilbertson D, Matas AJ. Diabetes mellitus after kidney transplantation in the United States. Am J Transplant 2003; 3:178-85. [PMID: 12603213] [DOI: 10.1034/j.1600-6143.2003.00010.x] [Citation(s) in RCA: 944] [Impact Index Per Article: 42.9]
Abstract
New onset diabetes is a major complication after kidney transplantation. However, the incidence, risk factors and clinical relevance of post-transplant diabetes mellitus (PTDM) vary among reports from single-center observational studies and clinical trials. Using data from the United States Renal Data System we identified 11 659 Medicare beneficiaries who received their first kidney transplant in 1996-2000. The cumulative incidence of PTDM was 9.1% (95% confidence interval = 8.6-9.7%), 16.0% (15.3-16.7%), and 24.0% (23.1-24.9%) at 3, 12, and 36 months post-transplant, respectively. Using Cox's proportional hazards analysis, risk factors for PTDM included age, African American race (relative risk = 1.68, range: 1.52-1.85, p < 0.0001), Hispanic ethnicity (1.35, range: 1.19-1.54, p < 0.0001), male donor (1.12, range: 1.03-1.21, p = 0.0090), increasing HLA mismatches, hepatitis C infection (1.33, range: 1.15-1.55, p < 0.0001), body mass index >or=30 kg/m2 (1.73, range: 1.57-1.90, p < 0.0001), and the use of tacrolimus as the initial maintenance immunosuppressive medication (1.53, range: 1.29-1.81, p < 0.0001). Factors that reduced the risk for PTDM included the use of mycophenolate mofetil, azathioprine, younger recipient age, glomerulonephritis as a cause of kidney failure, and a college education. As a time-dependent covariate in Cox analyses that also included multiple other risk factors, PTDM was associated with increased graft failure (1.63, 1.46-1.84, p < 0.0001), death-censored graft failure (1.46, 1.25-1.70, p < 0.0001), and mortality (1.87, 1.60-2.18, p < 0.0001). We conclude that high incidences of PTDM are associated with the type of initial maintenance immunosuppression, race, ethnicity, obesity and hepatitis C infection. It is a strong, independent predictor of graft failure and mortality. Efforts should be made to minimize the risk of this important complication.
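The relative risks quoted above come from Cox proportional hazards models, with PTDM itself later entered as a time-dependent covariate. As a rough illustration of how such hazard-ratio estimates are produced (not a reanalysis of the study), the sketch below fits a Cox model with the lifelines library on synthetic data; the column names and effect sizes are invented and do not come from the USRDS cohort.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# Synthetic stand-in cohort; values are illustrative only, not the paper's data.
rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "age": rng.normal(48, 13, n),
    "bmi_30_or_more": rng.integers(0, 2, n),      # 1 if body mass index 30 kg/m2 or higher
    "tacrolimus_initial": rng.integers(0, 2, n),  # 1 if tacrolimus was the initial maintenance drug
    "hepatitis_c": rng.binomial(1, 0.1, n),
})

# Simulate months to PTDM with a hazard that rises with the covariates; censor at 36 months.
linear_predictor = (0.02 * (df["age"] - 48) + 0.5 * df["bmi_30_or_more"]
                    + 0.4 * df["tacrolimus_initial"] + 0.3 * df["hepatitis_c"])
time_to_ptdm = rng.exponential(scale=1.0 / (0.02 * np.exp(linear_predictor)))
df["months"] = np.minimum(time_to_ptdm, 36.0)
df["ptdm"] = (time_to_ptdm <= 36.0).astype(int)

cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="ptdm")
cph.print_summary()  # exp(coef) column gives hazard ratios, analogous to the relative risks above
```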
2. Ibrahim HN, Foley R, Tan L, Rogers T, Bailey RF, Guo H, Gross CR, Matas AJ. Long-term consequences of kidney donation. N Engl J Med 2009; 360:459-69. [PMID: 19179315] [PMCID: PMC3559132] [DOI: 10.1056/nejmoa0804883] [Citation(s) in RCA: 742] [Impact Index Per Article: 46.4]
Abstract
BACKGROUND The long-term renal consequences of kidney donation by a living donor are attracting increased appropriate interest. The overall evidence suggests that living kidney donors have survival similar to that of nondonors and that their risk of end-stage renal disease (ESRD) is not increased. Previous studies have included relatively small numbers of donors and a brief follow-up period. METHODS We ascertained the vital status and lifetime risk of ESRD in 3698 kidney donors who donated kidneys during the period from 1963 through 2007; from 2003 through 2007, we also measured the glomerular filtration rate (GFR) and urinary albumin excretion and assessed the prevalence of hypertension, general health status, and quality of life in 255 donors. RESULTS The survival of kidney donors was similar to that of controls who were matched for age, sex, and race or ethnic group. ESRD developed in 11 donors, a rate of 180 cases per million persons per year, as compared with a rate of 268 per million per year in the general population. At a mean (+/-SD) of 12.2+/-9.2 years after donation, 85.5% of the subgroup of 255 donors had a GFR of 60 ml per minute per 1.73 m(2) of body-surface area or higher, 32.1% had hypertension, and 12.7% had albuminuria. Older age and higher body-mass index, but not a longer time since donation, were associated with both a GFR that was lower than 60 ml per minute per 1.73 m(2) and hypertension. A longer time since donation, however, was independently associated with albuminuria. Most donors had quality-of-life scores that were better than population norms, and the prevalence of coexisting conditions was similar to that among controls from the National Health and Nutrition Examination Survey (NHANES) who were matched for age, sex, race or ethnic group, and body-mass index. CONCLUSIONS Survival and the risk of ESRD in carefully screened kidney donors appear to be similar to those in the general population. Most donors who were studied had a preserved GFR, normal albumin excretion, and an excellent quality of life.
3.
Abstract
The perioperative and long-term risks for living kidney donors are of concern. We have studied donors at the University of Minnesota 20 years or more (mean 23.7) after donation by comparing renal function, blood pressure, and proteinuria in donors with siblings. In 57 donors (mean age 61 [SE 1]), mean serum creatinine is 1.1 (0.01) mg/dl, blood urea nitrogen 17 (0.5) mg/dl, creatinine clearance 82 (2) ml/min, and blood pressure 134 (2)/80 (1) mm Hg. 32% of the donors are taking antihypertensive drugs and 23% have proteinuria. The 65 siblings (mean age 58 [1.3]) do not significantly differ from the donors in any of these variables: 1.1 (0.03) mg/dl, 17 (1.2) mg/dl, 89 (3.3) ml/min, and 130 (3)/80 (1.5) mm Hg, respectively. 44% of the siblings are taking antihypertensives and 22% have proteinuria. To assess perioperative mortality, we surveyed all members of the American Society of Transplant Surgeons about donor mortality at their institutions. We documented 17 perioperative deaths in the USA and Canada after living donation, and estimate mortality to be 0.03%. We conclude that perioperative mortality in the USA and Canada after living-donor nephrectomy is low. In long-term follow-up of our living donors, we found no evidence of progressive renal deterioration or other serious disorders.
4. Platt JL, Vercellotti GM, Dalmasso AP, Matas AJ, Bolman RM, Najarian JS, Bach FH. Transplantation of discordant xenografts: a review of progress. Immunology Today 1990; 11:450-6; discussion 456-7. [PMID: 2073317] [DOI: 10.1016/0167-5699(90)90174-8] [Citation(s) in RCA: 456] [Impact Index Per Article: 13.0]
Abstract
Hyperacute rejection, apparently initiated by natural antibodies and complement, has been viewed as an absolute barrier to the xenotransplantation of vascularized grafts between different species. Until recently, little was known about the molecular and physiological basis for this barrier nor was there evidence that the barrier might be more than transiently breached. In this paper Jeffrey Platt, Fritz Bach and colleagues describe a model of hyperacute rejection and propose that, if hyperacute rejection can be averted for a period after transplantation, prolonged xenograft survival will be possible.
5. Sutherland DE, Gruessner RW, Dunn DL, Matas AJ, Humar A, Kandaswamy R, Mauer SM, Kennedy WR, Goetz FC, Robertson RP, Gruessner AC, Najarian JS. Lessons learned from more than 1,000 pancreas transplants at a single institution. Ann Surg 2001; 233:463-501. [PMID: 11303130] [PMCID: PMC1421277] [DOI: 10.1097/00000658-200104000-00003] [Citation(s) in RCA: 421] [Impact Index Per Article: 17.5]
Abstract
OBJECTIVE To determine outcome in diabetic pancreas transplant recipients according to risk factors and the surgical techniques and immunosuppressive protocols that evolved during a 33-year period at a single institution. SUMMARY BACKGROUND DATA Insulin-dependent diabetes mellitus is associated with a high incidence of management problems and secondary complications. Clinical pancreas transplantation began at the University of Minnesota in 1966, initially with a high failure rate, but outcome improved in parallel with other organ transplants. The authors retrospectively analyzed the factors associated with the increased success rate of pancreas transplants. METHODS From December 16, 1966, to March 31, 2000, the authors performed 1,194 pancreas transplants (111 from living donors; 191 retransplants): 498 simultaneous pancreas-kidney (SPK) and 1 simultaneous pancreas-liver transplant; 404 pancreas after kidney (PAK) transplants; and 291 pancreas transplants alone (PTA). The analyses were divided into five eras: era 0, 1966 to 1973 (n = 14), historical; era 1, 1978 to 1986 (n = 148), transition to cyclosporine for immunosuppression, multiple duct management techniques, and only solitary (PAK and PTA) transplants; era 2, 1986 to 1994 (n = 461), all categories (SPK, PAK, and PTA), predominantly bladder drainage for graft duct management, and primarily triple therapy (cyclosporine, azathioprine, and prednisone) for maintenance immunosuppression; era 3, 1994 to 1998 (n = 286), tacrolimus and mycophenolate mofetil used; and era 4, 1998 to 2000 (n = 275), use of daclizumab for induction immunosuppression, primarily enteric drainage for SPK transplants, pretransplant immunosuppression in candidates awaiting PTA. RESULTS Patient and primary cadaver pancreas graft functional (insulin-independence) survival rates at 1 year by category and era were as follows: SPK, era 2 (n = 214) versus eras 3 and 4 combined (n = 212), 85% and 64% versus 92% and 79%, respectively; PAK, era 1 (n = 36) versus 2 (n = 61) versus 3 (n = 84) versus 4 (n = 92), 86% and 17%, 98% and 59%, 98% and 76%, and 98% and 81%, respectively; in PTA, era 1 (n = 36) versus 2 (n = 72) versus 3 (n = 30) versus 4 (n = 40), 77% and 31%, 99% and 50%, 90% and 67%, and 100% and 88%, respectively. In eras 3 and 4 combined for primary cadaver SPK transplants, pancreas graft survival rates were significantly higher with bladder drainage (n = 136) than enteric drainage (n = 70), 82% versus 74% at 1 year (P =.03). Increasing recipient age had an adverse effect on outcome only in SPK recipients. Vascular disease was common (in eras 3 and 4, 27% of SPK recipients had a pretransplant myocardial infarction and 40% had a coronary artery bypass); those with no vascular disease had significantly higher patient and graft survival rates in the SPK and PAK categories. Living donor segmental pancreas transplants were associated with higher technically successful graft survival rates in each era, predominately solitary (PAK and PTA) in eras 1 and 2 and SPK in eras 3 and 4. Diabetic secondary complications were ameliorated in some recipients, and quality of life studies showed significant gains after the transplant in all recipient categories. CONCLUSIONS Patient and graft survival rates have significantly improved over time as surgical techniques and immunosuppressive protocols have evolved. 
Eventually, islet transplants will replace pancreas transplants for suitable candidates, but currently pancreas transplants can be applied and should be an option at all stages of diabetes. Early transplants are preferable for labile diabetes, but even patients with advanced complications can benefit.
6. Troppmann C, Gillingham KJ, Benedetti E, Almond PS, Gruessner RW, Najarian JS, Matas AJ. Delayed graft function, acute rejection, and outcome after cadaver renal transplantation. The multivariate analysis. Transplantation 1995; 59:962-8. [PMID: 7709456] [DOI: 10.1097/00007890-199504150-00007] [Citation(s) in RCA: 390] [Impact Index Per Article: 13.0]
Abstract
The impact of delayed graft function on outcome after cadaver renal transplantation has been controversial, but most authors fail to control their analyses for the presence or absence of rejection. We studied 457 adult recipients of primary cadaver allografts at a single institution during the cyclosporine era. All patients received sequential immunosuppression. The incidence of delayed graft function (defined as dialysis being required during the first week after transplant) was 23%. There was a significant association between delayed graft function and cold ischemia time > 24 hr (P = 0.0001) and between delayed graft function and the occurrence of at least one biopsy-proven rejection episode (P = 0.004). Actuarial graft survival was not significantly different when comparing delayed graft function versus no delayed graft function for patients without rejection (P = 0.02). However, it was significantly worse for patients with both delayed graft function and rejection versus those with delayed graft function but no rejection (P = 0.005), as well as for grafts preserved > 24 hr versus < or = 24 hr (P = 0.007). By multivariate analysis, delayed graft function per se was not a significant risk factor for decreased graft survival for patients without rejection (P = 0.42). In contrast, rejection significantly decreased graft survival for grafts with immediate function (relative risk = 2.3, P = 0.0002), particularly in combination with delayed graft function (relative risk = 4.2, P < 0.0001). While cold ischemia time > 24 hr was also a significant risk factor (relative risk = 1.9, P = 0.02), other variables (preservation mode, 0 HLA Ag mismatch, age at transplantation, gender, diabetic status, and panel-reactive antibody at transplantation) had no impact on graft survival. Patient survival was significantly affected by the combination of delayed graft function and rejection (relative risk = 3.1, P < 0.0001), age at transplantation > 50 years (relative risk = 2.6, P < 0.0001), and diabetes (relative risk = 1.8, P = 0.006). Further studies are necessary to elucidate the mechanisms linking delayed graft function and rejection, which, in combination, lead to poor allograft outcome.
7. Bloom RD, Bromberg JS, Poggio ED, Bunnapradist S, Langone AJ, Sood P, Matas AJ, Mehta S, Mannon RB, Sharfuddin A, Fischbach B, Narayanan M, Jordan SC, Cohen D, Weir MR, Hiller D, Prasad P, Woodward RN, Grskovic M, Sninsky JJ, Yee JP, Brennan DC. Cell-Free DNA and Active Rejection in Kidney Allografts. J Am Soc Nephrol 2017; 28:2221-2232. [PMID: 28280140] [DOI: 10.1681/asn.2016091034] [Citation(s) in RCA: 380] [Impact Index Per Article: 47.5]
Abstract
Histologic analysis of the allograft biopsy specimen is the standard method used to differentiate rejection from other injury in kidney transplants. Donor-derived cell-free DNA (dd-cfDNA) is a noninvasive test of allograft injury that may enable more frequent, quantitative, and safer assessment of allograft rejection and injury status. To investigate this possibility, we prospectively collected blood specimens at scheduled intervals and at the time of clinically indicated biopsies. In 102 kidney recipients, we measured plasma levels of dd-cfDNA and correlated the levels with allograft rejection status ascertained by histology in 107 biopsy specimens. The dd-cfDNA level discriminated between biopsy specimens showing any rejection (T cell-mediated rejection or antibody-mediated rejection [ABMR]) and controls (no rejection histologically), P<0.001 (receiver operating characteristic area under the curve [AUC], 0.74; 95% confidence interval [95% CI], 0.61 to 0.86). Positive and negative predictive values for active rejection at a cutoff of 1.0% dd-cfDNA were 61% and 84%, respectively. The AUC for discriminating ABMR from samples without ABMR was 0.87 (95% CI, 0.75 to 0.97). Positive and negative predictive values for ABMR at a cutoff of 1.0% dd-cfDNA were 44% and 96%, respectively. Median dd-cfDNA was 2.9% (ABMR), 1.2% (T cell-mediated types ≥IB), 0.2% (T cell-mediated type IA), and 0.3% in controls (P=0.05 for T cell-mediated rejection types ≥IB versus controls). Thus, dd-cfDNA may be used to assess allograft rejection and injury; dd-cfDNA levels <1% reflect the absence of active rejection (T cell-mediated type ≥IB or ABMR) and levels >1% indicate a probability of active rejection.
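The positive and negative predictive values reported for the 1.0% dd-cfDNA cutoff depend on how common rejection is in the biopsied population. As a reminder of that relationship only (not a reanalysis of the study data), the sketch below applies the standard Bayes formulas for PPV and NPV; the sensitivity, specificity, and prevalence values are assumed for illustration.

```python
def predictive_values(sensitivity: float, specificity: float, prevalence: float):
    """Return (PPV, NPV) for a binary test via Bayes' rule."""
    true_pos = sensitivity * prevalence
    false_pos = (1.0 - specificity) * (1.0 - prevalence)
    true_neg = specificity * (1.0 - prevalence)
    false_neg = (1.0 - sensitivity) * prevalence
    ppv = true_pos / (true_pos + false_pos)
    npv = true_neg / (true_neg + false_neg)
    return ppv, npv

# Assumed operating characteristics for a dd-cfDNA threshold; illustrative only,
# not the values reported in the paper.
ppv, npv = predictive_values(sensitivity=0.6, specificity=0.85, prevalence=0.25)
print(f"PPV={ppv:.2f}, NPV={npv:.2f}")
```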
8. Platt JL, Fischel RJ, Matas AJ, Reif SA, Bolman RM, Bach FH. Immunopathology of hyperacute xenograft rejection in a swine-to-primate model. Transplantation 1991; 52:214-20. [PMID: 1871792] [DOI: 10.1097/00007890-199108000-00006] [Citation(s) in RCA: 359] [Impact Index Per Article: 10.6]
Abstract
Hyperacute rejection is the inevitable consequence of the transplantation of vascularized organs between phylogenetically distant species. The nature of the incompatibility and the pathogenetic mechanisms that lead to hyperacute xenograft rejection are incompletely understood. We investigated these issues by the immunopathological analysis of tissues from swine renal and cardiac xenografts placed in rhesus monkeys. Hyperacute rejection was associated with deposition of recipient IgM and classic but not alternative complement pathway components along endothelial surfaces, the formation of platelet and fibrin thrombi, and the infiltration of neutrophils. In animals from which natural antibody was temporarily depleted by organ perfusion, rejection was observed at 3 days to 5 days posttransplant. The immunopathology of rejection in these tissues revealed focal vascular changes similar to those observed in hyperacute rejection. A xenograft functioning for a prolonged period in a recipient temporarily depleted of circulating natural antibody contained recipient IgM along endothelial surfaces but no evidence for significant deposition of complement, formation of platelet and fibrin thrombi, or infiltration of neutrophils. These results suggest that rhesus IgM contributes significantly to the development of hyperacute rejection in the swine to Rhesus model and that the fixation of complement is a critical factor in the recruitment of the coagulation cascade and platelet aggregation--and possibly in the adherence and infiltration of PMN.
9. Matas AJ, Smith JM, Skeans MA, Thompson B, Gustafson SK, Stewart DE, Cherikh WS, Wainright JL, Boyle G, Snyder JJ, Israni AK, Kasiske BL. OPTN/SRTR 2013 Annual Data Report: kidney. Am J Transplant 2015; 15 Suppl 2:1-34. [PMID: 25626344] [DOI: 10.1111/ajt.13195] [Citation(s) in RCA: 346] [Impact Index Per Article: 34.6]
Abstract
A new kidney allocation system, expected to be implemented in late 2014, will characterize donors on a percent scale (0%-100%) using the kidney donor profile index (KDPI). The 20% of deceased donor kidneys with the greatest expected posttransplant longevity will be allocated first to the 20% of candidates with the best expected posttransplant survival; kidneys that are not accepted will then be offered to the remaining 80% of candidates. Waiting time will start at the time of maintenance dialysis initiation (even if before listing) or at the time of listing with an estimated glomerular filtration rate of 20 mL/min/1.73 m(2) or less. Under the current system, the number of candidates on the waiting list continues to increase, as each year more candidates are added than are removed. Median waiting times for adults increased from 3 years in 2003 to more than 4.5 years in 2009. Donation rates have not increased. Short-term outcomes continue to improve; death-censored graft survival at 90 days posttransplant was 97% or higher for deceased donor transplants and over 99% for living donor transplants. In 2013, 883 pediatric candidates were added to the waiting list; 65.8% of pediatric candidates on the list in 2013 underwent deceased donor transplant. Five-year graft survival was highest for living donor recipients aged younger than 11 years (89%) and lowest for deceased donor recipients aged 11 to 17 years (68%).
10. Matas AJ, Gillingham KJ, Payne WD, Najarian JS. The impact of an acute rejection episode on long-term renal allograft survival (t1/2). Transplantation 1994; 57:857-9. [PMID: 8154032] [DOI: 10.1097/00007890-199403270-00015] [Citation(s) in RCA: 338] [Impact Index Per Article: 10.9]
Abstract
An acute renal transplant rejection episode has been shown to be associated with decreased 1-year graft survival. The impact on long-term outcome is undefined. We studied the impact of an acute rejection episode on t1/2, the time it takes for 1/2 of the grafts functioning at 1 year to fail. Use of t1/2 avoids inclusion of early graft loss to acute rejection or complications of treatment. Since 1/1/86, a total of 653 patients have received a primary kidney transplant and had at least 1 year of function. Recipients were divided by the incidence and timing of rejection: no rejection; 1 rejection within the first year; > 1 rejection, the first episode in the first year; and > or = 1 rejection, the first episode after the first year. A single rejection episode in the first year reduced t1/2 (45 +/- 11 years in those with no rejection vs. 25 +/- 8 years in those with 1 in the first year). Multiple rejections (t1/2 = 5 +/- 11 years) and a first rejection after the first year (t1/2 = 3 +/- 1 years) have a significant effect (P < .05). Both living and cadaver donor recipients with rejection had shortened t1/2. For those with > 1 rejection, the first episode in the first year, and those with > or = 1 rejection, the first episode after the first year, chronic rejection was the predominant cause of graft loss; noncompliance also played a role. We conclude that a single rejection episode shortens t1/2. Those with > 1 rejection, the first episode within the first year, and those with > or = 1 rejection, the first episode after the first year, are at high risk for late graft loss.
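The t1/2 statistic used here is the projected time for half of the grafts still functioning at 1 year to fail. One simple way such a half-life can be projected, assuming a roughly constant hazard beyond the first year (the paper's actuarial approach may differ), is sketched below with hypothetical numbers.

```python
import math

def projected_half_life(surviving_fraction: float, years_elapsed: float) -> float:
    """Project graft half-life (years) from the fraction of 1-year survivors still
    functioning after `years_elapsed` more years, assuming a constant hazard."""
    hazard = -math.log(surviving_fraction) / years_elapsed  # lambda of an exponential model
    return math.log(2) / hazard                             # t1/2 = ln(2) / lambda

# Hypothetical example: if 90% of grafts functioning at 1 year are still functioning
# 5 years later, the implied half-life is about 33 years.
print(round(projected_half_life(0.90, 5.0), 1))
```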
11. Kasiske BL, Snyder JJ, Matas AJ, Ellison MD, Gill JS, Kausz AT. Preemptive kidney transplantation: the advantage and the advantaged. J Am Soc Nephrol 2002; 13:1358-64. [PMID: 11961024] [DOI: 10.1097/01.asn.0000013295.11876.c9] [Citation(s) in RCA: 334] [Impact Index Per Article: 14.5]
Abstract
It remains unclear whether preemptive transplantation is beneficial, and if so, who benefits. A total of 38,836 first, kidney-only transplants between 1995 and 1998 were retrospectively studied. A surprising 39% of preemptive transplants were from cadaver donors, and the proportions of cadaver donor transplants that were preemptive changed little, from 7.3% in 1995 to 7.7% in 1998. Preemptive transplants using cadaver donors were more likely among recipients aged 0 to 17 yr versus 18 to 29 yr (odds ratio [OR], 2.48; 95% confidence interval [CI], 1.94 to 3.17), white versus black (OR, 2.33; 95% CI, 2.03 to 2.68), able to work versus unable to work (OR, 1.42; 95% CI, 1.26 to 1.61), covered by private insurance versus Medicare (OR, 4.77; 95% CI, 4.26 to 5.32), or recipients with a college degree versus no college degree (OR, 1.34; 95% CI, 1.17 to 1.54). Preemptive transplants were less likely for Hispanics versus non-Hispanics (OR, 0.57; 95% CI, 0.50 to 0.67), patients with type 2 versus type 1 diabetes (OR, 0.76; 95% CI, 0.61 to 0.96), and for 2 to 5 HLA mismatches compared with 0 HLA mismatches (OR range, 0.77 to 0.82). In adjusted Cox proportional hazards analysis, the relative risk of graft failure for preemptive transplantation was 0.75 (0.67 to 0.84) among 25,758 cadaver donor transplants and 0.73 (0.64 to 0.83) among 13,078 living donor transplants, compared with patients who received a transplant after already being on dialysis. Preemptive transplantation was associated with a reduced risk of death: 0.84 (0.72 to 0.99) for cadaver donor transplants and 0.69 (0.56 to 0.85) for living donor transplants. Thus, preemptive transplantation, which is associated with improved patient and graft survival, is less common among racial minorities, those who have less education, and those who must rely on Medicare for primary payment. Alterations in the payment system, emphasis on early referral, and changes in cadaver kidney allocation could increase the number of patients who benefit from preemptive transplantation.
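The "more likely" and "less likely" comparisons above are odds ratios from logistic regression (the graft-failure and death estimates come from adjusted Cox models). As a hedged illustration of how such odds ratios are obtained, the sketch below fits a logistic model with statsmodels on invented data; the variable names are hypothetical and the numbers have no relation to the registry cohort.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Invented cohort: outcome 1 = preemptive transplant, 0 = transplant after starting dialysis.
rng = np.random.default_rng(1)
n = 2000
df = pd.DataFrame({
    "private_insurance": rng.integers(0, 2, n),
    "college_degree": rng.integers(0, 2, n),
    "age": rng.normal(45, 15, n),
})
logit_p = -2.0 + 1.2 * df["private_insurance"] + 0.3 * df["college_degree"] - 0.01 * df["age"]
df["preemptive"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

model = smf.logit("preemptive ~ private_insurance + college_degree + age", data=df).fit()
odds_ratios = np.exp(model.params)        # exponentiated coefficients are odds ratios
odds_ratio_ci = np.exp(model.conf_int())  # 95% confidence intervals on the odds-ratio scale
print(odds_ratios)
print(odds_ratio_ci)
```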
12. McKay DB, Josephson MA, Armenti VT, August P, Coscia LA, Davis CL, Davison JM, Easterling T, Friedman JE, Hou S, Karlix J, Lake KD, Lindheimer M, Matas AJ, Moritz MJ, Riely CA, Ross LF, Scott JR, Wagoner LE, Wrenshall L, Adams PL, Bumgardner GL, Fine RN, Goral S, Krams SM, Martinez OM, Tolkoff-Rubin N, Pavlakis M, Scantlebury V. Reproduction and transplantation: report on the AST Consensus Conference on Reproductive Issues and Transplantation. Am J Transplant 2005; 5:1592-9. [PMID: 15943616] [DOI: 10.1111/j.1600-6143.2005.00969.x] [Citation(s) in RCA: 295] [Impact Index Per Article: 14.8]
Abstract
It has been almost 50 years since the first child was born to a female transplant recipient. Since that time pregnancy has become common after transplantation, but physicians have been left to rely on case reports, small series and data from voluntary registries to guide the care of their patients. Many uncertainties exist including the risks that pregnancy presents to the graft, the patient herself, and the long-term risks to the fetus. It is also unclear how to best modify immunosuppressive agents or treat rejection during pregnancy, especially in light of newer agents available where pregnancy safety has not been established. To begin to address uncertainties and define clinical practice guidelines for the transplant physician and obstetrical caregivers, a consensus conference was held in Bethesda, Md. The conferees summarized both what is known and important gaps in our knowledge. They also identified key areas of agreement, and posed a number of critical questions, the resolution of which is necessary in order to establish evidence-based guidelines. The manuscript summarizes the deliberations and conclusions of the conference as well as specific recommendations based on current knowledge in the field.
13. Basadonna GP, Matas AJ, Gillingham KJ, Payne WD, Dunn DL, Sutherland DE, Gores PF, Gruessner RW, Najarian JS. Early versus late acute renal allograft rejection: impact on chronic rejection. Transplantation 1993; 55:993-5. [PMID: 8497913] [DOI: 10.1097/00007890-199305000-00007] [Citation(s) in RCA: 282] [Impact Index Per Article: 8.8]
Abstract
We studied the effect of acute renal allograft rejection and its timing on the development of chronic rejection and subsequent graft loss. Between January 1, 1987 and April 30, 1991, 424 patients at the University of Minnesota received a primary kidney transplant (minimum follow-up, 1 year). Patients were subdivided by donor source, presence or absence of acute rejection, and the timing of acute rejection onset (early, < or = 60 days vs. late, > 60 days post-transplant). For living donor (LD) transplant recipients (n = 219), the incidence of chronic rejection is 0.8% in those who had no acute rejection (n = 130), 20% in those with acute rejection < or = 60 days (n = 59) (P < 0.001 vs. no acute rejection), and 43% in those with acute rejection > 60 days (n = 30) (P < 0.001 vs. no acute rejection, P = 0.04 vs. early acute rejection). For cadaver (CAD) transplant recipients (n = 205), the incidence of chronic rejection is 0% in those who had no acute rejection (n = 109), 36% in those with acute rejection < or = 60 days (n = 69) (P < 0.001 vs. no acute rejection), and 63% in those with acute rejection > 60 days (n = 27) (P < 0.001 vs. no acute rejection, P = 0.03 vs. early acute rejection). For both LD and CAD recipients, no grafts have been lost to chronic rejection among those who did not first have at least 1 acute rejection episode. In contrast, 23 patients with acute rejection have had graft loss to chronic rejection. For both LD and CAD recipients, those with > 1 acute rejection episode had significantly more chronic rejection than those with only 1 rejection (P < 0.05). There was no significant difference in the incidence of chronic rejection based on whether the first acute rejection episode was steroid resistant or steroid responsive. We conclude that acute rejection is strongly related to the development of biopsy-proven chronic rejection and subsequent graft loss. Patients undergoing their first acute rejection episode > 60 days (vs. < or = 60 days) have an increased incidence of chronic rejection.
14. Matas AJ, Smith JM, Skeans MA, Thompson B, Gustafson SK, Schnitzler MA, Stewart DE, Cherikh WS, Wainright JL, Snyder JJ, Israni AK, Kasiske BL. OPTN/SRTR 2012 Annual Data Report: kidney. Am J Transplant 2014; 14 Suppl 1:11-44. [PMID: 24373166] [DOI: 10.1111/ajt.12579] [Citation(s) in RCA: 278] [Impact Index Per Article: 25.3]
Abstract
For most end-stage renal disease patients, successful kidney transplant provides substantially longer survival and better quality of life than dialysis, and preemptive transplant is associated with better outcomes than transplants occurring after dialysis initiation. However, kidney transplant numbers in the US have not changed for a decade. Since 2004, the total number of candidates on the waiting list has increased annually. Median time to transplant for wait-listed adult patients increased from 2.7 years in 1998 to 4.2 years in 2008. The discard rate of deceased donor kidneys has also increased, and the annual number of living donor transplants has decreased. The number of pediatric transplants peaked at 899 in 2005, and has remained steady at approximately 750 over the past 3 years; 40.9% of pediatric candidates undergo transplant within 1 year of wait-listing. Graft survival continues to improve for both adult and pediatric recipients. Kidney transplant is one of the most cost-effective surgical interventions; however, average reimbursement for recipients with primary Medicare coverage from transplant through 1 year posttransplant was comparable to the 1-year cost of care for a dialysis patient. Rates of rehospitalization are high in the first year posttransplant; annual costs after the first year are lower.
15. Reich DJ, Mulligan DC, Abt PL, Pruett TL, Abecassis MMI, D'Alessandro A, Pomfret EA, Freeman RB, Markmann JF, Hanto DW, Matas AJ, Roberts JP, Merion RM, Klintmalm GBG. ASTS recommended practice guidelines for controlled donation after cardiac death organ procurement and transplantation. Am J Transplant 2009; 9:2004-11. [PMID: 19624569] [DOI: 10.1111/j.1600-6143.2009.02739.x] [Citation(s) in RCA: 265] [Impact Index Per Article: 16.6]
Abstract
The American Society of Transplant Surgeons (ASTS) champions efforts to increase organ donation. Controlled donation after cardiac death (DCD) offers the family and the patient with a hopeless prognosis the option to donate when brain death criteria will not be met. Although DCD is increasing, this endeavor is still in the midst of development. DCD protocols, recovery techniques and organ acceptance criteria vary among organ procurement organizations and transplant centers. Growing enthusiasm for DCD has been tempered by the decreased yield of transplantable organs and less favorable posttransplant outcomes compared with donation after brain death. Logistics and ethics relevant to DCD engender discussion and debate among lay and medical communities. Regulatory oversight of the mandate to increase DCD and a recent lawsuit involving professional behavior during an attempted DCD have fueled scrutiny of this activity. Within this setting, the ASTS Council sought best-practice guidelines for controlled DCD organ donation and transplantation. The proposed guidelines are evidence based when possible. They cover many aspects of DCD kidney, liver and pancreas transplantation, including donor characteristics, consent, withdrawal of ventilatory support, operative technique, ischemia times, machine perfusion, recipient considerations and biliary issues. DCD organ transplantation involves unique challenges that these recommendations seek to address.
16. Leventhal JR, Dalmasso AP, Cromwell JW, Platt JL, Manivel CJ, Bolman RM, Matas AJ. Prolongation of cardiac xenograft survival by depletion of complement. Transplantation 1993; 55:857-65; discussion 865-6. [PMID: 8475561] [DOI: 10.1097/00007890-199304000-00033] [Citation(s) in RCA: 257] [Impact Index Per Article: 8.0]
Abstract
Complement (C) activation is thought to be critical for the hyperacute rejection of xenografts. We investigated the role of C in the rejection of discordant cardiac xenografts by studying outcome in recipients depleted of C, using a highly purified form of cobra venom factor (CVF) in both a small (guinea pig [GP]-to-rat) and large (pig-to-baboon) animal model. A single dose of 30 or 60 units CVF given i.v. to rats completely abrogated hemolytic C activity for up to 72 hr. The lack of hemolytic C activity correlated with nearly undetectable serum levels of C3. Doses of 30 U/kg daily or 60 U/kg every other day over a 7-day period sustained C depletion without morbidity or mortality. Rats receiving GP cardiac xenografts during CVF therapy had significantly prolonged xenograft survival (88 +/- 10 hr in CVF-treated rats vs. 18.6 +/- 7.2 min in control rats, P < 0.001). Rats that rejected GP xenografts at 4 days posttransplant had higher levels of anti-GP antibodies than control rats, without hemolytic C activity at rejection. This rise in xenoreactive Ig reflected an increase in circulating IgG and IgM against GP antigens recognized before transplantation. Histologic analysis of GP cardiac xenografts taken from CVF-treated rats revealed leukocyte and monocyte margination along blood vessels, beginning at 12 hr posttransplant. Progressive cell infiltration, interstitial hemorrhage, and necrosis were observed over the next 72 hr. Rejected GP xenografts showed diffuse deposition of IgM and fibrin within blood vessels but no evidence of C3 deposition. A nonspecific pattern of IgG deposition was noted. CVF was tested in baboons. Complete C depletion was achieved with a dose of 60 U/kg, and was not associated with any morbidity or mortality. Xenotransplantation of a pig heart was performed in one baboon receiving CVF, 60 U/kg/day, for 2 consecutive days. Xenograft survival was prolonged to 68 hr, compared with 90 +/- 30 min in control baboons. Lack of hemolytic activity was noted during engraftment and at rejection. Histology showed evidence of vascular rejection. Immunopathology showed diffuse deposition of IgM, fibrin, and C4, and absence of C3 or membrane attack complex. We conclude that highly purified CVF can achieve marked C depletion with minimal morbidity and no associated fatalities. CVF alone can significantly prolong discordant cardiac xenograft survival. In the GP-to-rat model, the improvement in graft survival achieved with CVF was better than with conventional immunosuppression or isolated acute antibody depletion.(ABSTRACT TRUNCATED AT 400 WORDS)
17. Johnson EM, Anderson JK, Jacobs C, Suh G, Humar A, Suhr BD, Kerr SR, Matas AJ. Long-term follow-up of living kidney donors: quality of life after donation. Transplantation 1999; 67:717-21. [PMID: 10096528] [DOI: 10.1097/00007890-199903150-00013] [Citation(s) in RCA: 241] [Impact Index Per Article: 9.3]
Abstract
The University of Minnesota has been a strong advocate of living donor kidney transplants. The benefits for living donor recipients have been well documented. The relative low risk of physical complications during donation has also been well documented. Less well understood is the psychosocial risk to donors. Most published reports have indicated an improved sense of well-being and a boost in self-esteem for living kidney donors. However, there have been some reports of depression and disrupted family relationships after donation, even suicide after a recipient's death. To determine the quality of life of our donors, we sent a questionnaire to 979 who had donated a kidney between August 1, 1984, and December 31, 1996. Of the 60% who responded, the vast majority had an excellent quality of life. As a group, they scored higher than the national norm on the SF-36, a standardized quality of life health questionnaire. However, 4% were dissatisfied and regretted the decision to donate. Further, 4% found the experience extremely stressful and 8% very stressful. We used multivariate analysis to identify risk factors for this poor psychosocial outcome and found that relatives other than first degree (odds ratio=3.5, P=0.06) and donors whose recipient died within 1 year of transplant (odds ratio=3.3, P=0.014) were more likely to say they would not donate again if it were possible. Further, donors who had perioperative complications (odds ratio=3.5, P=0.007) and female donors (odds ratio=1.8, P=0.1) were more likely to find the overall experience more stressful. Overall, the results of this study are overwhelmingly positive and have encouraged us to continue living donor kidney transplants.
18. Matas AJ, Sutherland DE, Steffes MW, Mauer SM, Sowe A, Simmons RL, Najarian JS. Hepatocellular transplantation for metabolic deficiencies: decrease of plasma bilirubin in Gunn rats. Science 1976; 192:892-4. [PMID: 818706] [DOI: 10.1126/science.818706] [Citation(s) in RCA: 236] [Impact Index Per Article: 4.8]
Abstract
A sustained decrease of plasma bilirubin concentrations occurred in homozygous recessive Gunn rats lacking the enzyme uridine diphosphate glucuronyltransferase following infusion into the portal vein of hepatocytes from heterozygous nonjaundiced Gunn rats possessing the enzyme. Transplantation of cells capable of continuous enzyme production could be an effective mode of therapy for congenital enzyme deficiency diseases.
19. Matas AJ, Bartlett ST, Leichtman AB, Delmonico FL. Morbidity and Mortality After Living Kidney Donation, 1999-2001: Survey of United States Transplant Centers. Am J Transplant 2003. [DOI: 10.1046/j.1038-5282.2001.00400.x-i1] [Citation(s) in RCA: 235] [Impact Index Per Article: 10.7]
20. Matas AJ, Simmons RL, Kjellstrand CM, Buselmeier TJ, Najarian JS. Increased incidence of malignancy during chronic renal failure. Lancet 1975; 1:883-6. [PMID: 47534] [DOI: 10.1016/s0140-6736(75)91684-0] [Citation(s) in RCA: 222] [Impact Index Per Article: 4.4]
Abstract
The incidence of cancer in 646 dialysis/transplant patients before uraemia developed, during the period of progressive uraemia, and post-transplantation was compared. 10 tumours (3 breast, 2 kidney, 1 leukaemia, 1 lung, 1 insulinoma, 1 thyroid, 1 cervix in situ) developed in 9 patients during the period of progressive uraemia, a significant increase over the expected number in the age-matched general population. 6 of these patients have received transplants and have no evidence of recurrent disease 6 months to 4 years post-transplantation. 11 de-novo tumours have developed in 530 transplant recipients (4 cervix in situ, 2 skin, 2 reticulum-cell sarcoma, 1 lip, 1 dysgerminoma, 1 colon)--a significant increase over the age-matched general population. The cancers in the uraemic patients are relatively common types of mesenchymal tumours while the cancers in the transplant recipients are epithelial and lymphoproliferative. This difference may reflect the presence of the graft in the transplant patient or may be due to different patterns of immunosuppression in these two populations.
21. Johnson EM, Remucal MJ, Gillingham KJ, Dahms RA, Najarian JS, Matas AJ. Complications and risks of living donor nephrectomy. Transplantation 1997; 64:1124-8. [PMID: 9355827] [DOI: 10.1097/00007890-199710270-00007] [Citation(s) in RCA: 198] [Impact Index Per Article: 7.1]
Abstract
BACKGROUND Short- and long-term patient and graft survival rates are better for living donor (vs. cadaver) kidney transplant recipients. However, donor nephrectomy is associated with at least some morbidity and mortality. We have previously estimated the mortality of living donor nephrectomy to be 0.03%. In our present study, to determine associated perioperative morbidity, we reviewed donor nephrectomies performed at our institution from January 1, 1985, to December 31, 1995. METHODS The records of 871 donors were complete and available for review. Of these donors, 380 (44%) were male and 491 (56%) were female. The mean age at the time of donation was 38 years (range: 17-74 years), and mean postoperative stay was 4.9 days (range: 2-14 days). RESULTS We noted two (0.2%) major complications: femoral nerve compression with resulting weakness, and a retained sponge that required reexploration. We noted 86 minor complications in 69 (8%) donors: 22 (2.4%) suspected wound infections (only 1 wound was opened), 13 (1.5%) pneumothoraces (6 required intervention, 7 resolved spontaneously), 11 (1.3%) unexplained fevers, 8 (0.9%) instances of operative blood loss > or = 750 ml (not associated with other complications), 8 (0.9%) pneumonias (all of which resolved quickly with antibiotics alone), 5 (0.6%) wound hematomas or seromas (none were opened), 4 (0.5%) phlebitic intravenous sites, 3 (0.3%) urinary tract infections, 3 (0.3%) readmissions (2 for pain control and 1 for mild confusion that resolved with discontinuation of narcotics), 3 (0.3%) cases of atelectasis, 2 (0.2%) corneal abrasions, 1 (0.1%) subacute epididymitis, 1 (0.1%) Clostridium difficile colitis, 1 (0.1%) urethral trauma from catheter placement, and 1 (0.1%) enterotomy. At our institution, no donor died or required ventilation or intensive care. We noted no myocardial infarctions, deep wound infections, or reexplorations for bleeding. Analysis, by logistic regression, identified these significant risk factors for perioperative complications: male gender (vs. female, P<0.001), pleural entry (vs. no pleural entry, P<0.004), and weight > or = 100 kg (vs. < 100 kg, P<0.02). Similar analysis identified these significant risk factors for postoperative stay > 5 days: operative duration > or = 4 hr (vs. < 4 hr, P<0.001) and age > or = 50 years (vs. < 50 years, P<0.001). CONCLUSIONS Living donor nephrectomy can be done with little major morbidity. The risks of nephrectomy must be balanced against the better outcome for recipients of living donor transplants.
22. Wiebe C, Rush DN, Nevins TE, Birk PE, Blydt-Hansen T, Gibson IW, Goldberg A, Ho J, Karpinski M, Pochinco D, Sharma A, Storsley L, Matas AJ, Nickerson PW. Class II Eplet Mismatch Modulates Tacrolimus Trough Levels Required to Prevent Donor-Specific Antibody Development. J Am Soc Nephrol 2017; 28:3353-3362. [PMID: 28729289] [DOI: 10.1681/asn.2017030287] [Citation(s) in RCA: 189] [Impact Index Per Article: 23.6]
Abstract
Despite more than two decades of use, the optimal maintenance dose of tacrolimus for kidney transplant recipients is unknown. We hypothesized that HLA class II de novo donor-specific antibody (dnDSA) development correlates with tacrolimus trough levels and the recipient's individualized alloimmune risk determined by HLA-DR/DQ epitope mismatch. A cohort of 596 renal transplant recipients with 50,011 serial tacrolimus trough levels had HLA-DR/DQ eplet mismatch determined using HLAMatchmaker software. We analyzed the frequency of tacrolimus trough levels below a series of thresholds <6 ng/ml and the mean tacrolimus levels before dnDSA development in the context of HLA-DR/DQ eplet mismatch. HLA-DR/DQ eplet mismatch was a significant multivariate predictor of dnDSA development. Recipients treated with a cyclosporin regimen had a 2.7-fold higher incidence of dnDSA development than recipients on a tacrolimus regimen. Recipients treated with tacrolimus who developed HLA-DR/DQ dnDSA had a higher proportion of tacrolimus trough levels <5 ng/ml, which continued to be significant after adjustment for HLA-DR/DQ eplet mismatch. Mean tacrolimus trough levels in the 6 months before dnDSA development were significantly lower than the levels >6 months before dnDSA development in the same patients. Recipients with a high-risk HLA eplet mismatch score were less likely to tolerate low tacrolimus levels without developing dnDSA. We conclude that HLA-DR/DQ eplet mismatch and tacrolimus trough levels are independent predictors of dnDSA development. Recipients with high HLA alloimmune risk should not target tacrolimus levels <5 ng/ml unless essential, and monitoring for dnDSA may be advisable in this setting.
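The exposure metric in this analysis is, for each recipient, the fraction of serial tacrolimus troughs falling below a given threshold. A minimal pandas sketch of that per-patient calculation follows; the data frame columns and values are hypothetical, not the study data set.

```python
import pandas as pd

# Hypothetical long-format trough data: one row per measured tacrolimus level.
troughs = pd.DataFrame({
    "patient_id": [1, 1, 1, 2, 2, 3, 3, 3, 3],
    "tac_level_ng_ml": [6.2, 4.8, 5.5, 7.1, 6.9, 4.2, 3.9, 5.1, 6.0],
})

thresholds = [4, 5, 6]
summary = pd.DataFrame({
    f"frac_below_{t}": troughs.groupby("patient_id")["tac_level_ng_ml"]
                              .apply(lambda levels, t=t: (levels < t).mean())
    for t in thresholds
})
print(summary)  # one row per patient, one column per threshold
```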
23. Mannon RB, Matas AJ, Grande J, Leduc R, Connett J, Kasiske B, Cecka JM, Gaston RS, Cosio F, Gourishankar S, Halloran P, Hunsicker L, Rush D. Inflammation in areas of tubular atrophy in kidney allograft biopsies: a potent predictor of allograft failure. Am J Transplant 2010; 10:2066-73. [PMID: 20883541] [PMCID: PMC2951299] [DOI: 10.1111/j.1600-6143.2010.03240.x] [Citation(s) in RCA: 174] [Impact Index Per Article: 11.6]
Abstract
The Banff scoring schema provides a common ground to analyze kidney transplant biopsies. Interstitial inflammation (i) and tubulitis (t) in areas of viable tissue are features in scoring acute rejection, but are excluded in areas of tubular atrophy (TA). We studied inflammation and tubulitis in a cohort of kidney transplant recipients undergoing allograft biopsy for new-onset late graft dysfunction (N = 337). We found inflammation ('iatr') and tubulitis ('tatr') in regions of fibrosis and atrophy to be strongly correlated with each other (p < 0.0001). Moreover, iatr was strongly associated with death-censored graft failure when compared to recipients whose biopsies had no inflammation, even after adjusting for the presence of interstitial fibrosis (Hazard Ratio = 2.31, [1.10-4.83]; p = 0.0262) or TA (hazard ratio = 2.42, [1.16-5.08]; p = 0.191), serum creatinine at the time of biopsy, time to biopsy and i score. Further, these results did not qualitatively change after additional adjustments for C4d staining or donor specific antibody. Stepwise regression identified the most significant markers of graft failure which include iatr score. We propose that a more global assessment of inflammation in kidney allograft biopsies to include inflammation in atrophic areas may provide better prognostic information. Phenotypic characterization of these inflammatory cells and appropriate treatment may ameliorate late allograft failure.
24. Matas AJ, Smith JM, Skeans MA, Lamb KE, Gustafson SK, Samana CJ, Stewart DE, Snyder JJ, Israni AK, Kasiske BL. OPTN/SRTR 2011 Annual Data Report: kidney. Am J Transplant 2013; 13 Suppl 1:11-46. [PMID: 23237695] [DOI: 10.1111/ajt.12019] [Citation(s) in RCA: 173] [Impact Index Per Article: 14.4]
Abstract
A shortage of kidneys for transplant remains a major problem for patients with end-stage renal disease. The number of candidates on the waiting list continues to increase each year, while organ donation numbers remain flat. Thus, transplant rates for adult wait-listed candidates continue to decrease. However, pretransplant mortality rates also show a decreasing trend. Many kidneys recovered for transplant are discarded, and discard rates are increasing. Living donation rates have been essentially unchanged for the past decade, despite introduction of desensitization, non-directed donations, and kidney paired donation programs. For both living and deceased donor recipients, early posttransplant results have shown ongoing improvement, driven by decreases in rates of graft failure and return to dialysis. Immunosuppressive drug use has changed little, except for the Food and Drug Administration approval of belatacept in 2011, the first approval of a maintenance immunosuppressive drug in more than a decade. Pediatric kidney transplant candidates receive priority under the Share 35 policy. The number of pediatric transplants peaked in 2005, and decreased to a low of 760 in 2011. Graft survival and short-term renal function continue to improve for pediatric recipients. Posttransplant lymphoproliferative disorder is an important concern, occurring in about one-third of pediatric recipients.
25. Humar A, Ramcharan T, Denny R, Gillingham KJ, Payne WD, Matas AJ. Are wound complications after a kidney transplant more common with modern immunosuppression? Transplantation 2001; 72:1920-3. [PMID: 11773889] [DOI: 10.1097/00007890-200112270-00009] [Citation(s) in RCA: 171] [Impact Index Per Article: 7.1]
Abstract
BACKGROUND The most common surgical complication after a kidney transplant is likely related to the wound. The purpose of this analysis was to determine the incidence of, and risk factors for, wound complications (e.g., infections, hernias) in kidney recipients and to assess whether newer immunosuppressive drugs increase the risk for such complications. METHODS Between January 1, 1984 and September 30, 1998, we performed 2013 adult kidney transplants. Of these 2013 recipients, 97 (4.8%) developed either a superficial or a deep wound infection. Additionally, 73 (3.6%) recipients developed either a fascial dehiscence or a hernia of the wound. We used univariate and multivariate techniques to determine significant risk factors and outcomes. RESULTS Mean time to development of a superficial infection (defined as located above the fascia) was 11.9 days posttransplant; to development of a deep infection (defined as located below the fascia), 39.2 days; and to development of a hernia or fascial dehiscence, 12.8 months. By multivariate analysis, the most significant risk factor for a superficial or deep wound infection was obesity (defined as body mass index>30 kg/m2) (RR=4.4, P=0.0001). Other significant risk factors were a urine leak posttransplant, any reoperation through the transplant incision, diabetes, and the use of mycophenolate mofetil (MMF) (vs. azathioprine) for maintenance immunosuppression (RR=2.43, P=0.0001). Significant risk factors for a hernia or fascial dehiscence were any reoperation through the transplant incision, increased recipient age, obesity, and the use of MMF (vs. azathioprine) for maintenance immunosuppression (RR=3.54, P=0.0004). Use of antibody induction and treatment for acute rejection were not significant risk factors for either infections or hernias. Death-censored graft survival was lower in recipients who developed a wound infection (vs. those who did not); it was not lower in recipients who developed an incisional hernia or facial dehiscence (vs. those who did not). CONCLUSIONS Despite immunosuppression including chronic steroids, the incidence of wound infections, incisional hernias, and fascial dehiscence is low in kidney recipients. As with other types of surgery, the main risk factors for postoperative complications are obesity, reoperation, and increased age. However, in kidney recipients, use of MMF (vs. azathioprine) is an additional risk factor -one that potentially could be altered, especially in high-risk recipients.