101
102
Abstract
Status 1 is the listing category reserved for patients awaiting liver transplantation who are at risk of imminent death. This high allocation priority was intended to benefit patients with acute liver failure and children with severe chronic liver failure. However, the status 1 criteria were not well defined. The aims of this study, which used the Organ Procurement and Transplantation Network/Scientific Registry of Transplant Recipients database for patients wait-listed between February 27, 2002, and September 30, 2003, were to determine the indication and numbers of children and adults at status 1 (including regional variations); examine death rates on the waiting list for children at vs. not at status 1; and examine time to death, transplant, or removal from the waiting list for both pediatric and adult status 1 candidates. During the study period, 40.3% of children and 6.1% of adults were transplanted at status 1. The indication was acute liver failure in 52.1% of adults and 31% of children. Among status 1 transplants, Regional Review Board exceptions were granted for 16.7% of children and 10.1% of adults. Death rates for children listed at status 1 by exception per patient-year at risk were substantially lower (0.51) than those of children with acute liver failure (4.06) or with chronic liver disease and Pediatric End-Stage Liver Disease score ≥25 (4.63). The percentage of adults who died while on the waiting list within 90 days of listing was more than twice that of children, whereas the percentages transplanted were similar. Patients listed and transplanted at status 1 were a heterogeneous population with an overrepresentation of children with varying degrees of chronic liver disease and other exceptions, and an associated wide variation in waiting list mortality.
Recent changes in status 1 criteria provide stricter definitions, particularly for children, including the removal of the "by exception" category, with the intent that all candidates listed at status 1 share a similar mortality risk.
103
Abstract
The number of liver transplants performed yearly has slowly and steadily increased over the last 10 years, reaching 6441 procedures in 2005. The number of living donor liver transplants performed rose steadily from 1996 to 2001, when it peaked at 519; since 2003 there have been approximately 320 such procedures performed each year. The continual increase in the size of the waiting list for a liver transplant, which peaked in 2001 at 14 897 patients, was interrupted in 2002 by the implementation of the allocation system based on the model for end-stage liver disease and pediatric end-stage liver disease (MELD/PELD). Activity in all areas of intestinal transplantation continues to increase. One-year patient and graft survival following intestine-alone transplantation now seem to be superior to outcomes following liver-intestine transplantation. Other topics covered here include the recent 'Share 15' component of the MELD allocation system; liver transplantation following donation after cardiac death; simultaneous liver-kidney transplantation and waiting list and post-transplant outcomes for both liver and intestine transplantation, broken out by a variety of clinical and demographic factors.
104

105
Use of a pediatric end-stage liver disease score for deceased donor allocation: the United States experience. Indian J Pediatr 2007; 74:387-92. [PMID: 17476086] [DOI: 10.1007/s12098-007-0066-2]
Abstract
The Pediatric End-Stage Liver Disease (PELD) score was developed as a measure of the severity of chronic liver disease that would predict mortality for children awaiting liver transplant. From multivariate analyses, a model was derived that includes five objective factors which together comprise the PELD score: growth failure, age less than 1 year, international normalized ratio (INR), serum albumin, and total bilirubin.
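The five factors combine linearly on a log scale before scaling. A minimal sketch of the calculation follows, using the coefficients commonly cited for the original PELD model; treat these coefficients as assumptions and verify against current OPTN policy, which also applies lower bounds to the laboratory values (omitted here).

```python
import math

def peld_score(bilirubin_mg_dl, inr, albumin_g_dl,
               age_under_1yr, growth_failure):
    """Sketch of the PELD calculation (coefficients as commonly
    cited for the original model; assumptions, not OPTN policy)."""
    score = (0.480 * math.log(bilirubin_mg_dl)    # total bilirubin
             + 1.857 * math.log(inr)              # INR
             - 0.687 * math.log(albumin_g_dl)     # serum albumin
             + (0.436 if age_under_1yr else 0.0)  # age < 1 year
             + (0.667 if growth_failure else 0.0))
    return round(score * 10)

# Illustrative inputs: bilirubin 3.0 mg/dL, INR 1.5, albumin 2.5 g/dL,
# infant with growth failure.
score = peld_score(3.0, 1.5, 2.5, age_under_1yr=True, growth_failure=True)  # 18
```

Note how the INR coefficient dominates: coagulopathy moves the score faster than an equivalent proportional change in bilirubin or albumin.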
106
Abstract
OBJECTIVE This study examines donation after cardiac death (DCD) practices and outcomes in liver transplantation. SUMMARY BACKGROUND DATA Livers procured from DCD donors have recently been used to increase the number of deceased donors and bridge the gap between limited organ supply and the pool of waiting list candidates. Comprehensive evaluation of this practice and its outcomes has not been previously reported. METHODS A national cohort of all DCD and donation after brain-death (DBD) liver transplants between January 1, 2000 and December 31, 2004 was identified in the Scientific Registry of Transplant Recipients. Time to graft failure (including death) was modeled by Cox regression, adjusted for relevant donor and recipient characteristics. RESULTS DCD livers were used for 472 (2%) of 24,070 transplants. Annual DCD liver activity increased from 39 in 2000 to 176 in 2004. The adjusted relative risk of DCD graft failure was 85% higher than for DBD grafts (relative risk, 1.85; 95% confidence interval, 1.51-2.26; P < 0.001), corresponding to 3-month, 1-year, and 3-year graft survival rates of 83.0%, 70.1%, and 60.5%, respectively (vs. 89.2%, 83.0%, and 75.0% for DBD recipients). There was no significant association between transplant program DCD liver transplant volume and graft outcome. CONCLUSIONS The annual number of DCD livers used for transplant has increased rapidly. However, DCD livers are associated with a significantly increased risk of graft failure unrelated to modifiable donor or recipient factors. Appropriate recipients for DCD livers have not been fully characterized and recipient informed consent should be obtained before use of these organs.
107
Abstract
The ability of the model for end-stage liver disease (MELD) score to accurately predict death among liver transplant candidates allows for evaluation of geographic differences in transplant access for patients with similar death risk. Adjusted models of time to transplant and death for adult liver transplant candidates listed between 2002 and 2003 were developed to test for differences in MELD score among Organ Procurement and Transplantation Network (OPTN) regions and Donation Service Areas (DSA). The average MELD and relative risk (RR) of death varied somewhat by region (from 0.82 to 1.28), with only two regions having significant differences in RRs. Greater variability existed in adjusted transplant rates by region; 7 of 11 regions differed significantly from the national average. Simulation results indicate that an allocation system providing regional priority to candidates at MELD scores ≥15 would increase the median MELD score at transplant and reduce the total number of deaths across DSA quintiles. Simulation results also indicate that increasing priority to higher MELD candidates would reduce the percentage variation among DSAs of transplants to patients with MELD scores ≥15. The variation decrease was due to increasing the MELD score at time of transplantation in the DSAs with the lowest MELD scores at transplant.
108
Educational web-based intervention for high school students to increase knowledge and promote positive attitudes toward organ donation. Health Educ Behav 2006; 33:773-86. [PMID: 16923836] [DOI: 10.1177/1090198106288596]
Abstract
A sample of 490 high school students from 81 schools in Michigan participated in an experiment in which they were randomly assigned to either a control or an experimental Web site. The experimental Web site provided exposure to educational material about the process of organ donation and organ transplantation. The control Web site provided educational material on methods to avoid the common cold. The pre- and posttests of knowledge of issues related to organ donation and of attitude toward donation demonstrated statistically significant increases for the experimental group compared with the control group. A structural equation path model suggested that these increases in knowledge and prodonation attitude mediated the effects of the experiment on contacting the Michigan donor registry. The increase in knowledge and in prodonation attitude increased the likelihood of contacting the registry. The potential for this and other similar Web interventions to enhance students' health education is discussed.
109
Abstract
The optimal use of kidneys from small pediatric deceased donors remains undetermined. Using data from the Scientific Registry of Transplant Recipients, 2886 small (< 21 kg) pediatric donors between 1993 and 2002 were identified. Donor factors predictive of kidney recovery and transplantation (1343 en bloc; 1600 single) were identified by logistic regression. Multivariable Cox regression was used to assess the risk of graft loss. The rate of kidney recovery from small pediatric donors was significantly higher with increasing age, weight and height. The odds of transplant of recovered small donor kidneys were significantly higher with increasing age, weight, height and en bloc recovery (adjusted odds ratio = 65.8 vs. single; p < 0.0001), and significantly lower with increasing creatinine. Compared to en bloc, solitary transplants had a 78% higher risk of graft loss (p < 0.0001). En bloc transplants had a similar graft survival to ideal donors (p = 0.45) while solitary transplants had an increased risk of graft loss (p < 0.0001). En bloc recovery of kidneys from small pediatric donors may result in the highest probability of transplantation. Although limited by the retrospective nature of the study, kidneys transplanted en bloc had a similar graft survival to ideal donors but may not maximize the number of successfully transplanted recipients.
110

111

112
Abstract
Transplant physicians and candidates have become increasingly aware that donor characteristics significantly impact liver transplantation outcomes. Although the qualitative effects of individual donor variables are understood, the quantitative risk associated with combinations of characteristics is unclear. Using national data from 1998 to 2002, we developed a quantitative donor risk index. Cox regression models identified seven donor characteristics that independently predicted significantly increased risk of graft failure. Donor age over 40 years (and particularly over 60 years), donation after cardiac death (DCD), and split/partial grafts were strongly associated with graft failure, while African-American race, shorter height, cerebrovascular accident, and 'other' causes of brain death were more modestly but still significantly associated with graft failure. Grafts with an increased donor risk index have been preferentially transplanted into older candidates (>50 years of age) with moderate disease severity (non-status 1, with lower model for end-stage liver disease (MELD) scores) and without hepatitis C. Quantitative assessment of the risk of donor liver graft failure using a donor risk index is useful to inform the process of organ acceptance.
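An index built from Cox regression is multiplicative: each risk factor contributes a log-hazard coefficient, and the index is the exponential of their sum. The structure can be sketched as follows; the coefficient values below are placeholders chosen only to illustrate the mechanics, not the paper's fitted estimates.

```python
import math

# Placeholder log-hazard coefficients (hypothetical values chosen
# only to illustrate the structure; the published estimates differ).
COEFS = {
    "age_40_to_60": 0.15,
    "age_over_60": 0.42,
    "dcd": 0.41,          # donation after cardiac death
    "split_graft": 0.42,  # split/partial graft
    "cva_death": 0.15,    # cerebrovascular cause of death
}

def donor_risk_index(factors):
    """Relative risk of graft failure vs. a reference donor:
    exp(sum of log-hazard coefficients for the factors present)."""
    return math.exp(sum(COEFS[f] for f in factors))

# A DCD donor over 60 carries exp(0.42 + 0.41) ≈ 2.3 times the
# reference graft-failure hazard under these placeholder values.
dri = donor_risk_index(["age_over_60", "dcd"])
```

The multiplicative form is why combinations matter: two moderate factors together can push a graft past a risk threshold that neither would reach alone.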
113

114
Abstract
The shortage of deceased donor kidneys for transplantation continues to restrict the full application of this lifesaving procedure to all who might benefit. Increasing reliance on donors with characteristics previously thought to be unsuitable for use in transplantation has led to questions about graft outcomes for recipients of such organs. Careful definition of the expanded criteria donor (ECD) for kidney has facilitated modifications of national organ allocation policy that are designed to increase procurement, improve use, decrease cold ischemia time, and lead to improved outcome. The effects of these policy changes in the United States have been studied recently and are reviewed here. In addition, the impact of ECD kidney transplantation on mortality risk among candidates awaiting deceased donor renal transplantation is examined. Further studies of ECD organs and their recipients are needed to optimize the use of these scarce resources.
115
Invasive fungal infections in low-risk liver transplant recipients: a multi-center prospective observational study. Am J Transplant 2006; 6:386-91. [PMID: 16426325] [DOI: 10.1111/j.1600-6143.2005.01176.x]
Abstract
Prevention of invasive fungal infections (IFIs) in orthotopic liver transplant (OLT) recipients utilizing postoperative systemic antifungal prophylaxis, typically with fluconazole, is justified among those at high risk for IFI. Use of postoperative antifungal prophylaxis for low-risk OLT recipients is widely practiced but not universally accepted nor supported by data. We conducted a prospective observational study among 200 OLT recipients who were at low risk for IFI and did not receive postoperative antifungal prophylaxis. Patients were considered low risk if they had no more than one of the following conditions: choledochojejunostomy anastomosis; retransplantation; intraoperative administration of ≥40 units of blood products or return to the operating room for intra-abdominal bleeding; return to the operating room for anastomotic leak or vascular insufficiency; preoperative serum creatinine ≥2 mg/dL; and perioperative Candida colonization. Patients were followed for 100 days post-transplantation for evidence of IFI. Of 193 eligible patients, 7 (4%) developed an IFI. Three (2%) IFIs were due to Candida spp. and were potentially preventable by standard fluconazole prophylaxis. Three patients developed invasive aspergillosis; one developed late-onset disseminated cryptococcosis. Liver transplant recipients at low risk for IFI can be identified utilizing pre-determined criteria, and post-transplantation antifungal prophylaxis can be routinely withheld in these patients.
116
Abstract
Access to timely, risk-adjusted measures of transplant center outcomes is crucial for program quality improvement. The cumulative summation technique (CUSUM) has been proposed as a sensitive tool to detect persistent, clinically relevant changes in transplant center performance over time. Scientific Registry of Transplant Recipients data for adult kidney and liver transplants (1/97 to 12/01) were examined using logistic regression models to predict risk of graft failure (kidney) and death (liver) at 1 year. Risk-adjusted CUSUM charts were constructed for each center and compared with results from the semi-annual method of the Organ Procurement and Transplantation Network (OPTN). Transplant centers (N = 258) performed 59 650 kidney transplants, with a 9.2% 1-year graft failure rate. The CUSUM method identified centers with a period of significantly improving (N = 92) or declining (N = 52) performance. Transplant centers (N = 114) performed 18 277 liver transplants, with a 13.9% 1-year mortality rate. The CUSUM method demonstrated improving performance at 48 centers and declining performance at 24 centers. The CUSUM technique also identified the majority of centers flagged by the current OPTN method (20/22 kidney and 8/11 liver). CUSUM monitoring may be a useful technique for quality improvement, allowing center directors to identify clinically important, risk-adjusted changes in transplant center outcome.
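The monitoring mechanics can be illustrated with a simplified observed-minus-expected CUSUM (the paper's exact likelihood-based, risk-adjusted increments are not reproduced here): each case adds its 0/1 outcome minus its model-predicted risk, the running sum is floored at zero, and a chart that climbs past a threshold flags a center for review.

```python
def cusum_chart(outcomes, expected, threshold=2.0):
    """Simplified observed-minus-expected CUSUM.

    outcomes: 0/1 event (graft failure or death) per case, in date order.
    expected: model-predicted event probability per case.
    Returns the chart values and the index of the first signal, if any
    (the threshold here is illustrative, not a calibrated control limit).
    """
    chart, s, signal = [], 0.0, None
    for i, (obs, p) in enumerate(zip(outcomes, expected)):
        s = max(0.0, s + (obs - p))   # accumulate excess events, floor at 0
        chart.append(s)
        if signal is None and s >= threshold:
            signal = i
    return chart, signal

# Repeated failures at 10% expected risk drift the chart upward and
# trigger a signal; the occasional success only nudges it back down.
chart, signal = cusum_chart([1, 1, 0, 1, 1], [0.1] * 5)
```

This is the property the abstract highlights: unlike a fixed semi-annual snapshot, the chart accumulates evidence case by case, so a persistent shift in risk-adjusted performance is detected as soon as it has built up, not at the next reporting period.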
117
Abstract
A national conference on organ donation after cardiac death (DCD) was convened to expand the practice of DCD in the continuum of quality end-of-life care. This national conference affirmed the ethical propriety of DCD as not violating the dead donor rule. Further, through new developments not previously reported, the conference resolved controversy regarding the period of circulatory cessation that determines death and allows administration of pre-recovery pharmacologic agents; established conditions of DCD eligibility; presented current data regarding the successful transplantation of organs from DCD; proposed a new framework of data reporting regarding ischemic events; made specific recommendations to agencies and organizations to remove barriers to DCD; offered guidance regarding organ allocation and the process of informed consent; and set an action plan to address media issues. When a consensual decision is made to withdraw life support by the attending physician and patient or by the attending physician and a family member or surrogate (particularly in an intensive care unit), a routine opportunity for DCD should be available to honor the deceased donor's wishes in every donor service area (DSA) of the United States.
118

119
Abstract
CONTEXT Transplantation using kidneys from deceased donors who meet the expanded criteria donor (ECD) definition (age ≥60 years, or 50 to 59 years with at least 2 of the following: history of hypertension, serum creatinine level >1.5 mg/dL [132.6 micromol/L], and cerebrovascular cause of death) is associated with 70% higher risk of graft failure compared with non-ECD transplants. However, if ECD transplants offer improved overall patient survival, inferior graft outcome may represent an acceptable trade-off. OBJECTIVE To compare mortality after ECD kidney transplantation vs that in a combined standard-therapy group of non-ECD recipients and those still receiving dialysis. DESIGN, SETTING, AND PATIENTS Retrospective cohort study using data from a US national registry of mortality and graft outcomes among kidney transplant candidates and recipients. The cohort included 109,127 patients receiving dialysis and added to the kidney waiting list between January 1, 1995, and December 31, 2002, and followed up through July 31, 2004. MAIN OUTCOME MEASURE Long-term (3-year) relative risk of mortality for ECD kidney recipients vs those receiving standard therapy, estimated using time-dependent Cox regression models. RESULTS By end of follow-up, 7790 ECD kidney transplants were performed. Because of excess ECD recipient mortality in the perioperative period, cumulative survival did not equal that of standard-therapy patients until 3.5 years posttransplantation. Long-term relative mortality risk was 17% lower for ECD recipients (relative risk, 0.83; 95% confidence interval, 0.77-0.90; P<.001). Subgroups with significant ECD survival benefit included patients older than 40 years, both sexes, non-Hispanics, all races, unsensitized patients, and those with diabetes or hypertension. In organ procurement organizations (OPOs) with long median waiting times (>1350 days), ECD recipients had a 27% lower risk of death (relative risk, 0.73; 95% confidence interval, 0.64-0.83; P<.001).
In areas with shorter waiting times, only recipients with diabetes demonstrated an ECD survival benefit. CONCLUSIONS ECD kidney transplants should be offered principally to candidates older than 40 years in OPOs with long waiting times. In OPOs with shorter waiting times, in which non-ECD kidney transplant availability is higher, candidates should be counseled that ECD survival benefit is observed only for patients with diabetes.
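The ECD definition quoted in this abstract is a simple decision rule, which can be written down directly (parameter names are my own):

```python
def is_ecd(age, hypertension, creatinine_mg_dl, cva_death):
    """Classify a deceased kidney donor under the ECD definition
    quoted in the abstract: age >= 60, or age 50-59 with at least
    two of: history of hypertension, serum creatinine > 1.5 mg/dL,
    and cerebrovascular cause of death."""
    if age >= 60:
        return True
    if 50 <= age <= 59:
        risk_factors = sum([hypertension,
                            creatinine_mg_dl > 1.5,
                            cva_death])
        return risk_factors >= 2
    return False

print(is_ecd(63, False, 1.0, False))  # True: age alone qualifies
print(is_ecd(55, True, 1.8, False))   # True: 50-59 with two risk factors
print(is_ecd(55, True, 1.2, False))   # False: only one risk factor
print(is_ecd(45, True, 2.0, True))    # False: donors under 50 are never ECD
```

Making the rule explicit shows why the category is convenient for allocation policy: every donor is classified from four routinely collected fields, with no model scoring required at the time of offer.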
120
Outcomes of 385 adult-to-adult living donor liver transplant recipients: a report from the A2ALL Consortium. Ann Surg 2005; 242:314-23, discussion 323-5. [PMID: 16135918] [PMCID: PMC1357740] [DOI: 10.1097/01.sla.0000179646.37145.ef]
Abstract
OBJECTIVE The objective of this study was to characterize the patient population with respect to patient selection, assess surgical morbidity and graft failures, and analyze the contribution of perioperative clinical factors to recipient outcome in adult living donor liver transplantation (ALDLT). SUMMARY BACKGROUND DATA Previous reports have been center-specific or from large databases lacking detailed variables. The Adult-to-Adult Living Donor Liver Transplantation Cohort Study (A2ALL) represents the first detailed North American multicenter report of recipient risk and outcome aiming to characterize variables predictive of graft failure. METHODS Three hundred eighty-five ALDLT recipients transplanted at 9 centers were studied with analysis of over 35 donor, recipient, intraoperative, and postoperative variables. Cox regression models were used to examine the relationship of variables to the risk of graft failure. RESULTS Ninety-day and 1-year graft survival were 87% and 81%, respectively. Fifty-one (13.2%) grafts failed in the first 90 days. The most common causes of graft failure were vascular thrombosis, primary nonfunction, and sepsis. Biliary complications were common (30% early, 11% late). Older recipient age and length of cold ischemia were significant predictors of graft failure. Center experience greater than 20 ALDLT was associated with a significantly lower risk of graft failure. Recipient Model for End-stage Liver Disease score and graft size were not significant predictors. CONCLUSIONS This multicenter A2ALL experience provides evidence that ALDLT is a viable option for liver replacement. Older recipient age and prolonged cold ischemia time increase the risk of graft failure. Outcomes improve with increasing center experience.
121
Extracorporeal support for organ donation after cardiac death effectively expands the donor pool. ACTA ACUST UNITED AC 2005; 58:1095-101; discussion 1101-2. [PMID: 15995454] [DOI: 10.1097/01.ta.0000169949.82778.df]
Abstract
BACKGROUND We sought to evaluate the effect on short-term outcomes of normothermic, extracorporeal perfusion (ECMO) for donation of abdominal organs for transplantation after cardiac death (DCD). Study parameters included increase in number of donors and organs, types of organs procured, and viability of kidneys transplanted. METHODS We retrospectively reviewed medical record data for all patients enrolled in our ECMO-supported DCD donor protocol between 10/1/2000 and 2/01/2004. We also reviewed the records for all patients undergoing organ donation after brain death (DBD) during the study period at our institution. Recipient data were obtained and analyzed for all kidneys procured from both groups. RESULTS Twenty patients were enrolled in our DCD protocol and underwent attempted organ donation. Fifteen patients completed the protocol; of the remaining five, three maintained cardiac function throughout the prescribed 60 minutes after withdrawal of life support, and two patients' organs were deemed unsuitable for transplantation. Fourteen (70%) of the DCD donor patients originated on the trauma service and six (30%) were from other clinical services. The DCD program increased the potential donor pool by 33% (61 versus 81 patients) and the number of kidneys transplanted by 24% (100 versus 124). A total of 24 kidney, 5 liver, and 1 pancreas transplants were performed with these organs. Two of 24 (8.3%) DCD kidneys had delayed graft function. There were no perioperative rejection episodes or deaths. CONCLUSION The implementation of a DCD protocol using extracorporeal perfusion increased the potential organ donor pool at our institution by 33%. This was accomplished without short-term adverse effect on organ function compared with kidneys transplanted from DBD donors.
122
Kidney transplantation and wait-listing rates from the international Dialysis Outcomes and Practice Patterns Study (DOPPS). Kidney Int 2005; 68:330-7. [PMID: 15954924] [DOI: 10.1111/j.1523-1755.2005.00412.x]
Abstract
BACKGROUND The international Dialysis Outcomes and Practice Patterns Study (DOPPS I and II) allows description of variations in kidney transplantation and wait-listing from nationally representative samples of 18- to 65-year-old hemodialysis patients. The present study examines the health status and socioeconomic characteristics of United States patients, the role of for-profit versus not-for-profit status of dialysis facilities, and the likelihood of transplant wait-listing and transplantation rates. METHODS Analyses of transplantation rates were based on 5267 randomly selected DOPPS I patients in dialysis units in the United States, Europe, and Japan who received chronic hemodialysis therapy for at least 90 days in 2000. Left-truncated Cox regression was used to assess time to kidney transplantation. Logistic regression determined the odds of being transplant wait-listed for a cross-section of 1323 hemodialysis patients in the United States in 2000. Furthermore, kidney transplant wait-listing was determined in 12 countries from cross-sectional samples of DOPPS II hemodialysis patients in 2002 to 2003 (N= 4274). RESULTS Transplantation rates varied widely, from very low in Japan to 25-fold higher in the United States and 75-fold higher in Spain (both P values <0.0001). Factors associated with higher rates of transplantation included younger age, nonblack race, less comorbidity, fewer years on dialysis, higher income, and higher education levels. The likelihood of being wait-listed showed wide variation internationally and by United States region but not by for-profit dialysis unit status within the United States. CONCLUSION DOPPS I and II confirmed large variations in kidney transplantation rates by country, even after adjusting for differences in case mix. Facility size and, in the United States, profit status, were not associated with varying transplantation rates. 
International results consistently showed higher transplantation rates for younger, healthier, better-educated, and higher income patients.
123
Impact of the Expanded Criteria Donor Allocation System on the Use of Expanded Criteria Donor Kidneys. Transplantation 2005; 79:1257-61. [PMID: 15880081] [DOI: 10.1097/01.tp.0000161225.89368.81]
Abstract
BACKGROUND The U.S. Organ Procurement and Transplantation Network recently implemented a policy allocating expanded criteria donor (ECD) kidneys by waiting time alone. ECD kidneys were defined as having a risk of graft failure ≥1.7 times that of ideal donors. ECDs include any donor ≥60 years old and donors 50 to 59 years old with at least two of the following: terminal creatinine >1.5 mg/dL, history of hypertension, or death by cerebrovascular accident. The impact of this policy on use of ECD kidneys is assessed. METHODS The authors compared use of ECD kidneys recovered in the 18 months immediately before and after policy implementation. Differences were tested using t-tests and chi-square analyses. RESULTS There was an 18.3% increase in ECD kidney recoveries and a 15.0% increase in ECD kidney transplants in the first 18 months after policy implementation. ECD kidneys made up 22.1% of all recovered kidneys and 16.8% of all transplants, compared with 18.8% (P<0.001) and 14.5% (P<0.001), respectively, in the prior period. The discard rate was unchanged. The median relative risk (RR) for graft failure for transplanted ECD kidneys was 2.07 versus 1.99 in the prepolicy period (P=not significant); the median RR for procured ECD kidneys was unchanged at 2.16. The percentage of transplanted ECD kidneys with cold ischemia times (CIT) <12 hr increased significantly; the corresponding percentage for CIT ≥24 hr decreased significantly. CONCLUSIONS The recent increase in ECD kidney recoveries and transplants appears to be related to implementation of the ECD allocation system.
124
Abstract
Using OPTN/SRTR data, this article reviews the state of thoracic organ transplantation in 2003 and the previous decade. Time spent on the heart waiting list has increased significantly over the last decade. The percentage of patients awaiting heart transplantation for >2 years increased from 23% in 1994 to 49% by 2003. However, there has been a general decline in heart waiting list death rates over the decade. In 2003, the lung transplant waiting list reached a record high of 3,836 registrants, up slightly from 2002 and more than threefold since 1994. One-year patient survival for those receiving lungs in 2002 was 82%, a statistically significant improvement from 2001 (78%). The number of patients awaiting a heart-lung transplant, declining since 1998, reached 189 in 2003. Adjusted patient survival for heart-lung recipients is consistently worse than the corresponding rate for isolated lung recipients, primarily due to worse outcomes for heart-lung recipients with congenital heart disease. A new lung allocation system, approved in June 2004, derives from the survival benefit of transplantation with consideration of urgency based on waiting list survival, instead of being based solely on waiting time. A goal of the policy is to minimize deaths on the waiting list.
125
Abstract
Retransplantation for liver allograft failure associated with hepatitis C virus (HCV) has been increasing due to nearly universal posttransplant HCV recurrence and has been demonstrated to be associated with poor outcomes. We report on the risk factors for death after retransplantation among liver recipients with HCV. A retrospective cohort of liver transplant recipients who underwent retransplantation between January 1997 and December 2002 was identified in the Scientific Registry of Transplant Recipients database. Cox regression was used to assess the relative effect of HCV diagnosis on mortality risk after retransplantation and was adjusted for multiple covariates. Of 1,718 liver retransplantations during the study period, 464 (27%) were associated with a diagnosis of HCV infection. Based on Cox regression, retransplant recipients with HCV had a 30% higher covariate-adjusted mortality risk than those without HCV diagnosis (hazard ratio [HR], 1.30; 95% confidence interval [CI], 1.10-1.54; P = 0.002). Other covariates associated with significant relative risk of death after retransplantation included older recipient age, presence in an intensive care unit (ICU), serum creatinine, and donor age. Additional regression analysis revealed that the increase in mortality risk associated with HCV was concentrated between 3 and 24 months postretransplantation, among patients age 18 to 39 at retransplant, and in patients retransplanted during the years 2000 to 2002. In conclusion, HCV liver recipients account for a considerable proportion of all retransplantations performed. Surprisingly, younger age predicted a higher mortality for recipients with HCV undergoing liver retransplantation. This may reflect a willingness to retransplant younger patients with an increased severity of illness or a more virulent HCV infection in this population. 
Although HCV was predictive of an increased risk of death, consideration of other characteristics of HCV patients, including donor and recipient age and need for preoperative ICU care may identify those at significantly higher risk.
126

127
Abstract
The demand for donated organs greatly exceeds supply and many candidates die awaiting transplantation. Policies for allocating deceased donor organs may address equity of access and medical efficacy, but typically must be implemented with incomplete information. Simulation-based analysis can inform the policy process by predicting the likely effects of alternative policies on a wide variety of outcomes of interest. This paper describes a family of simulations developed by the US Scientific Registry of Transplant Recipients and initial experience in the application of one member of this family, the Liver Simulated Allocation Model (LSAM).
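To give a flavor of what such a simulation does, here is a deliberately tiny, entirely synthetic sketch (all parameters are invented; LSAM itself models far richer candidate histories, organ-offer logic, and outcomes): candidates carry a severity score, each arriving organ goes to whichever candidate the policy ranks first, and untransplanted candidates face a severity-dependent death risk between offers.

```python
import random

def simulate(policy, n_candidates=500, n_organs=120, seed=1):
    """Toy waiting-list simulation; returns (transplants, deaths).
    All parameters are synthetic placeholders, not calibrated data."""
    rng = random.Random(seed)
    # MELD-like severity scores for the initial waiting list
    waitlist = [rng.uniform(6, 40) for _ in range(n_candidates)]
    transplants = deaths = 0
    for _ in range(n_organs):
        if not waitlist:
            break
        chosen = policy(waitlist)      # the allocation rule under study
        waitlist.remove(chosen)
        transplants += 1
        survivors = []
        for s in waitlist:
            if rng.random() < s / 2000.0:   # sicker -> likelier to die waiting
                deaths += 1
            else:
                survivors.append(s)
        waitlist = survivors
    return transplants, deaths

def sickest_first(waitlist):
    return max(waitlist)      # allocate by severity

def first_listed(waitlist):
    return waitlist[0]        # ignore severity (waiting-time proxy)
```

Running both policies on identical synthetic cohorts shows how a proposed rule change shifts waiting-list deaths and transplant counts before any real-world rollout, which is exactly the role the abstract describes for LSAM.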
128
129
Abstract
Demand for liver transplantation continues to exceed donor organ supply. Comparing recipient survival to that of comparable candidates without a transplant can improve understanding of transplant survival benefit. Waiting list and post-transplant mortality was studied among a cohort of 12 996 adult patients placed on the waiting list between 2001 and 2003. Time-dependent Cox regression models were fitted to determine relative mortality rates for candidates and recipients. Overall, deceased donor transplant recipients had a 79% lower mortality risk than candidates (HR = 0.21; p < 0.001). At Model for End-stage Liver Disease (MELD) 18-20, mortality risk was 38% lower (p < 0.01) among recipients compared to candidates. Survival benefit increased with increasing MELD score; at the maximum score of 40, recipient mortality risk was 96% lower than that for candidates (p < 0.001). In contrast, at lower MELD scores, recipient mortality risk during the first post-transplant year was much higher than for candidates (HR = 3.64 at MELD 6-11, HR = 2.35 at MELD 12-14; both p < 0.001). Liver transplant survival benefit at 1 year is concentrated among patients at higher risk of pre-transplant death. Futile transplants among severely ill patients are not identified under current practice. With 1 year post-transplant follow-up, patients at lower risk of pre-transplant death do not have a demonstrable survival benefit from liver transplant.
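The "X% lower mortality risk" phrasing used throughout this abstract maps directly onto the hazard ratio: percent reduction = (1 − HR) × 100. A trivial illustration against the figures quoted above (the MELD 18-20 hazard ratio of ~0.62 is inferred from the stated 38% reduction, not reported directly):

```python
def pct_lower(hr: float) -> float:
    """Percent reduction in hazard implied by a hazard ratio < 1."""
    return (1 - hr) * 100

print(pct_lower(0.21))  # overall deceased donor benefit: ~79% lower risk
print(pct_lower(0.62))  # a 38% lower risk at MELD 18-20 corresponds to HR ~0.62
```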
130
Development and current status of ECD kidney transplantation. CLINICAL TRANSPLANTS 2005:37-55. [PMID: 17424724]
Abstract
The worsening shortage of donor kidneys for transplant and the aging of both the donor and candidate populations have contributed to the increasing importance of expanded criteria donor (ECD) kidney transplantation. While ECD transplants have an increased risk of graft failure, for most candidates patient survival is still improved over remaining on dialysis. Because of this risk, however, ECD kidneys have a high likelihood of discard; significant geographic variation in discard and transplant rates impedes maximum utilization of these kidneys. The ECD allocation system was implemented to help facilitate expeditious placement of ECD kidneys to pre-consented candidates by a simplified allocation algorithm. Under this system, recovery and transplantation of ECD kidneys have increased at rates not seen with non-ECD kidneys and not predicted by preexisting trends. More disappointing has been the lack of effect on the percentage of discards and on delayed graft function (DGF), despite significant reductions in cold ischemia time (CIT). The disadvantage in graft survival for ECD kidneys extends equally across the spectrum of recipient characteristics, such that no one group of candidates has a proportionately smaller increase in risk. However, benefit analyses comparing the risk of accepting an ECD kidney versus waiting for a non-ECD kidney demonstrate a significant ECD benefit for older and diabetic candidates in regions with prolonged waiting times. The potential value of an ECD kidney to an individual candidate hinges upon the ability to receive it substantially earlier than a non-ECD kidney. Thus, future allocation efforts may focus on ensuring that this is the case. In allocation driven by net benefit, ECD kidneys may become an alternative for those who might not otherwise receive a kidney transplant.
131
132
Predicted lifetimes for adult and pediatric split liver versus adult whole liver transplant recipients. Am J Transplant 2004; 4:1792-7. [PMID: 15476478 DOI: 10.1111/j.1600-6143.2004.00594.x]
Abstract
Split liver transplantation allows two recipients to receive transplants from one organ. Comparisons of predicted lifetimes for two alternatives (split liver for an adult and a pediatric recipient vs. whole liver for an adult recipient) can help guide the use of donor livers. We analyzed mortality risk for 48,888 wait-listed candidates and 907 split and 21,913 whole deceased donor liver transplant recipients between January 1, 1995 and February 26, 2002. Cox regression models for pediatric and adult patients assessed average relative waiting list and post-transplant death risks for split liver recipients. Life years gained over a 2-year period, compared with remaining on the waiting list, were calculated. Seventy-six splits (152 recipients) and 24 re-transplants resulted from every 100 livers (13.1% [adult] and 18.0% [pediatric] 2-year re-transplant rates, respectively). Whole livers used for 93 adults also utilized 100 livers (re-transplant rate 7.0%). Eleven extra life years and 59 incremental recipients accrued from each 100 livers used for split compared with whole organ transplants. Split liver transplantation could provide enough organs to satisfy the entire current demand for pediatric donor livers in the United States, provide more aggregate years of life than whole organ transplants, and result in larger numbers of recipients.
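The liver accounting in this abstract can be reproduced with simple arithmetic. The sketch below, using only the figures quoted above, checks that both strategies consume 100 livers and yields the stated 59 incremental recipients:

```python
# Split strategy: 76 livers split into two grafts each, plus re-transplants.
split_livers = 76
split_recipients = 2 * split_livers                 # 152 recipients
retx = split_livers * 0.131 + split_livers * 0.180  # adult + pediatric 2-year re-transplants
livers_split_strategy = split_livers + round(retx)  # ~24 re-transplant livers -> 100 total

# Whole-organ strategy: 93 adult recipients at a 7.0% re-transplant rate.
whole_recipients = 93
livers_whole_strategy = whole_recipients + round(whole_recipients * 0.07)  # 100 total

incremental_recipients = split_recipients - whole_recipients  # 59 extra recipients
print(livers_split_strategy, livers_whole_strategy, incremental_recipients)
```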
133
Abstract
1. Liver transplantation is currently offered as a therapeutic option for patients with a wide range of end-stage liver diseases. 2. Conventional wisdom suggests that patients who receive a liver transplant have a greater expected lifetime when compared to comparable candidates on the waiting list. 3. The model for end-stage liver disease (MELD) scoring system is an excellent predictor of mortality on the waiting list and also predicts mortality after liver transplantation. 4. The combination of waiting list mortality risk and posttransplant mortality risk assessed by MELD and other factors can be used to estimate whether candidates are likely to derive a survival benefit from a liver transplant.
134
Summary report of a national conference: Evolving concepts in liver allocation in the MELD and PELD era. December 8, 2003, Washington, DC, USA. Liver Transpl 2004; 10:A6-22. [PMID: 15382225 DOI: 10.1002/lt.20247]
Abstract
A national conference was held to review and assess data gathered since implementation of MELD and PELD and determine future directions. The objectives of the conference were to review the current system of liver allocation with a critical analysis of its strengths and weaknesses. Conference participants used an evidence-based approach to consider whether predicted outcome after transplantation should influence allocation, to discuss the concept of minimal listing score, to revisit current and potential expansion of exception criteria, and to determine whether specific scores should be used for automatic removal of patients on the waiting list. After review of data from the first 18 months since implementation, association and society leaders, and surgeons and hepatologists with wide regional representation were invited to participate in small group discussions focusing on each of the main objectives. At the completion of the meeting, there was agreement that MELD has had a successful initial implementation, meeting the goal of providing a system of allocation that emphasizes the urgency of the candidate while diminishing the reliance on waiting time, and that it has proven to be a powerful tool for auditing the liver allocation system. It was also agreed that the data regarding the accuracy of PELD as a predictor of pretransplant mortality were less conclusive and that PELD should be considered in isolation. Recommendations for the transplant community, based on the analysis of the MELD data, were discussed and are presented in the summary document.
135
Abstract
1. The PELD score accurately predicts the 3-month probability of waiting list death for children with chronic liver disease. 2. Comparing the periods before and after PELD/MELD implementation, the percentage of children receiving deceased donor livers increased and the percentage of children dying on the list decreased. 3. Excluding children transplanted at status 1, the largest percentage of children are transplanted at a PELD score < 10. 4. Before MELD/PELD, 48% of all children receiving deceased donor organs were transplanted at status 1, compared with 41% in the PELD/MELD era. Wide regional variation occurs.
136
Abstract
Outcomes for certain surgical procedures have been linked with volume: hospitals performing a high number of procedures demonstrate better outcomes than do low-volume centers. This study examines the effect of volume on hepatic and renal transplant outcomes. Data from the Scientific Registry of Transplant Recipients were analyzed for transplants performed from 1996-2000. Transplant centers were assigned to volume quartiles (kidney) or terciles (liver). Logistic regression models, adjusted for clinical characteristics and transplant center clustering, demonstrate the effect of transplant center volume quantile on 1-year post-transplant patient mortality (liver) and graft loss (kidney). The unadjusted rate of renal graft loss within 1 year was significantly lower at high volume centers (8.6%) compared with very low (9.6%), low (9.9%) and medium (9.7%) volume centers (p = 0.0014). After adjustment, kidney transplant at very low [adjusted odds ratio (AOR) 1.22; p = 0.043] and low volume (AOR 1.22; p = 0.041) centers was associated with a higher incidence of graft loss when compared with high volume centers. Unadjusted 1-year mortality rates for liver transplant were significantly different at high (15.9%) vs. low (16.9%) or medium (14.7%) volume centers. After adjustment, low volume centers were associated with a significantly higher risk of death (AOR 1.30; p = 0.0036). There is considerable variability in the range of failure between quantiles after kidney and liver transplant. Transplant outcomes are better at high volume centers; however, there is no clear minimal threshold volume.
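The adjusted odds ratios above come from logistic regression, but the unadjusted contrast implied by the reported 1-year graft-loss rates can be computed directly. A sketch, assuming only the percentages quoted in the abstract:

```python
def odds_ratio(p_exposed: float, p_reference: float) -> float:
    """Unadjusted odds ratio for two event proportions."""
    return (p_exposed / (1 - p_exposed)) / (p_reference / (1 - p_reference))

# Very-low-volume vs. high-volume centers: 9.6% vs. 8.6% 1-year graft loss.
or_unadj = odds_ratio(0.096, 0.086)
print(round(or_unadj, 2))  # smaller than the covariate-adjusted AOR of 1.22
```

The gap between this crude value and the adjusted AOR of 1.22 reflects the covariate and clustering adjustments described in the abstract.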
137
138
Abstract
On February 27, 2002, the liver allocation system changed from a status-based algorithm to one using a continuous MELD/PELD severity score to prioritize patients on the waiting list. Using data from the Scientific Registry of Transplant Recipients, we examine and discuss several aspects of the new allocation, including the development and evolution of MELD and PELD, the relationship between the two scoring systems, and the resulting effect on access to transplantation and waiting list mortality. Additional considerations, such as regional differences in MELD/PELD at transplantation and the predictive effects of rapidly changing MELD/PELD, are also addressed. Death or removal from the waiting list for being too sick for a transplant has decreased in the MELD/PELD era for both children and adults. Children younger than 2 years, however, still have a considerably higher rate of death on the waiting list than adults. A limited definition of ECD livers suggests that they are used more frequently for patients with lower MELD scores.
139
140
Abstract
We sought to determine which type of donor graft provides children and young adults with the best outcomes following liver transplantation. Using the US Scientific Registry of Transplant Recipients database, we identified 6467 recipients of first liver transplants during 1989-2000 aged < 30 years. We used Cox models to examine adjusted patient and graft outcomes by age (< 2, 2-10, 11-16, 17-29) and donor graft type [deceased donor full size (DD-F), split (DD-S), living donor (LD)]. For patients aged < 2, LD grafts had a significantly lower risk of graft failure than DD-S (RR = 0.49, p < 0.0001) and DD-F (RR = 0.70, p = 0.02) and a lower mortality risk than DD-S (RR = 0.71, p = 0.08) during the first year post-transplant. In contrast, older children exhibited a higher risk of graft loss and a trend toward higher mortality associated with LD transplants. In young adults, DD-S transplants were associated with poor outcomes. Three-year follow-up yielded similar graft survival results but no significant differences in mortality risk by graft type within age group. For recipients aged < 2, LD transplants provide better graft survival than DD-F or DD-S and trend toward better patient survival than DD-S. Living donor is the preferred donor source in the most common pediatric age group (< 2 years) undergoing liver transplantation.
141
142
143
Immunosuppression and the risk of post-transplant malignancy among cadaveric first kidney transplant recipients. Am J Transplant 2004; 4:87-93. [PMID: 14678038 DOI: 10.1046/j.1600-6135.2003.00274.x]
Abstract
The success of renal transplantation may be counterbalanced by serious adverse medical events. The effect of immunosuppression on the incidence of de novo neoplasms among kidney recipients should be monitored continuously. Using data from the Scientific Registry of Transplant Recipients, we studied the association of induction therapy by immunosuppression with antilymphocyte antibodies, with the development of de novo neoplasms. The study population included more than 41 000 recipients who received a cadaveric first kidney transplant after December 31, 1995, and were followed through February 28, 2002. Using Cox regression models, we estimated time to development of two types of malignancy: de novo solid tumors and post-transplant lymphoproliferative disorder (PTLD). We made adjustments for several patient demographic factors and comorbidities. Induction therapy was significantly associated with a higher relative risk (RR) of PTLD (RR = 1.78, p < 0.001), but not with a greater likelihood of de novo tumors (RR = 1.07, p = 0.42). Treatment with maintenance tacrolimus vs. cyclosporine showed a significantly different RR of developing de novo tumors for recipients with induction than for those not receiving induction (p = 0.024). These new estimates of the magnitude of malignancy risk associated with induction therapy may be useful for clinical practice.
144
Review of transplantation in HIV patients during the HAART era. CLINICAL TRANSPLANTS 2004:63-82. [PMID: 16704139]
Abstract
Human immunodeficiency virus (HIV) infection was once thought to be a relative or even absolute contraindication to transplantation. With the recent advent of highly active antiretroviral therapy (HAART), those infected with HIV are now living longer and dying from illnesses other than acquired immunodeficiency syndrome (AIDS). Although studies prior to the HAART era suggested poor outcomes might occur with transplantation in those infected with HIV, more recent studies have demonstrated results comparable to those of recipients without HIV infection. A number of issues persist regarding ethics, patient selection, post-operative management, and drug interactions between antiretroviral and immunosuppression agents. In this review, kidney, liver, and heart transplantation in the HIV-positive population were analyzed using data from the Organ Procurement and Transplantation Network/Scientific Registry of Transplant Recipients.
145
Abstract
Transplant candidates are permitted to register on multiple waiting lists. We examined multiple-listing practices and outcomes, using data on 81 481 kidney and 26 260 liver candidates registered between 7/1/95 and 6/30/00. Regression models identified factors associated with multiple-listing and its effect on relative rates of transplantation, waiting list mortality, kidney graft failure, and liver transplant mortality. Overall, 5.8% (kidney) and 3.3% (liver) of candidates multiple-listed. Non-white race, older age, non-private insurance, and lower educational level were associated with significantly lower odds of multiple-listing. While multiple-listed, transplantation rates were significantly higher for nearly all kidney and liver candidate subgroups (relative rate [RR]= 1.42-2.29 and 1.82-7.41, respectively). Waiting list mortality rates were significantly lower while multiple-listed for 11 kidney subgroups (RR = 0.22-0.72) but significantly higher for 7 liver subgroups (RR = 1.44-5.93), suggesting multiple-listing by healthier kidney candidates and sicker liver candidates. Graft failure was 10% less likely among multiple-listed kidney recipients. Multiple- and single-listed liver recipients had similar post-transplant mortality rates. Although specific factors characterize those transplant candidates likely to multiple-list, transplant access is significantly enhanced for almost all multiple-listed kidney and liver candidates.
146
Abstract
The shortage of liver donors and the increasing number of patients on the waiting list for liver transplantation have led to a widening of the definition of suitable liver donors. In this case report, we describe transplantation of a liver from a 20-year-old brain-dead donor with a past history of schistosomiasis. Careful evaluation for schistosomiasis-related hepatic complications using hepatic function tests, clinical assessment for manifestations of portal hypertension, as well as abdominal ultrasound, and liver biopsy were performed. At 7 months follow-up, the recipient is doing well with normal liver function. Liver transplantation from a donor with a history of schistosomiasis is acceptable in carefully screened cases.
147
Trends in cadaveric organ donation in the United States: 1990-1999. CLINICAL TRANSPLANTS 2003:105-9. [PMID: 12971439]
Abstract
Despite declining in-hospital deaths, deceased donor organ donation increased in all demographic segments of the US population from 1990 to 1999. Although the rate of growth in older donors was more dramatic, there was substantial growth in the most physiologically suitable donor age groups, which contribute most to the donor pool. Nationwide, the average annual growth in donation rate for in-hospital deaths ages 1-70 was 5.0%, with wide regional variation. While the national trend is patently inadequate to meet the need for organ transplantation, the wide range of programs aimed at improving organ donation at all levels of society point to an encouraging pattern that may have resulted, in part, from these initiatives.
148
Abstract
BACKGROUND Transplantation of nonrenal organs is often complicated by chronic renal disease with multifactorial causes. We conducted a population-based cohort analysis to evaluate the incidence of chronic renal failure, risk factors for it, and the associated hazard of death in recipients of nonrenal transplants. METHODS Pretransplantation and post-transplantation clinical variables and data from a registry of patients with end-stage renal disease (ESRD) were linked in order to estimate the cumulative incidence of chronic renal failure (defined as a glomerular filtration rate of 29 ml per minute per 1.73 m2 of body-surface area or less or the development of ESRD) and the associated risk of death among 69,321 persons who received nonrenal transplants in the United States between 1990 and 2000. RESULTS During a median follow-up of 36 months, chronic renal failure developed in 11,426 patients (16.5 percent). Of these patients, 3297 (28.9 percent) required maintenance dialysis or renal transplantation. The five-year risk of chronic renal failure varied according to the type of organ transplanted - from 6.9 percent among recipients of heart-lung transplants to 21.3 percent among recipients of intestine transplants. Multivariate analysis indicated that an increased risk of chronic renal failure was associated with increasing age (relative risk per 10-year increment, 1.36; P<0.001), female sex (relative risk among male patients as compared with female patients, 0.74; P<0.001), pretransplantation hepatitis C infection (relative risk, 1.15; P<0.001), hypertension (relative risk, 1.18; P<0.001), diabetes mellitus (relative risk, 1.42; P<0.001), and postoperative acute renal failure (relative risk, 2.13; P<0.001). The occurrence of chronic renal failure significantly increased the risk of death (relative risk, 4.55; P<0.001). 
Treatment of ESRD with kidney transplantation was associated with a five-year risk of death that was significantly lower than that associated with dialysis (relative risk, 0.56; P=0.02). CONCLUSIONS The five-year risk of chronic renal failure after transplantation of a nonrenal organ ranges from 7 to 21 percent, depending on the type of organ transplanted. The occurrence of chronic renal failure among patients with a nonrenal transplant is associated with an increase by a factor of more than four in the risk of death.
149
150
Prospective, randomized, multi-center trial of antibody induction therapy in simultaneous pancreas-kidney transplantation. Am J Transplant 2003; 3:855-64. [PMID: 12814477 DOI: 10.1034/j.1600-6143.2003.00160.x]
Abstract
A randomized, multicenter, prospective study was conducted at 18 pancreas transplant centers in the United States to determine the role of induction therapy in simultaneous pancreas-kidney (SPK) transplantation. One hundred seventy-four recipients were enrolled: 87 in each of the induction and noninduction treatment arms. Maintenance immunosuppression consisted of tacrolimus, mycophenolate mofetil, and corticosteroids. There were no statistically significant differences between treatment groups in patient, kidney, or pancreas graft survival at 1 year. The 1-year cumulative incidence of any treated biopsy-confirmed or presumptive rejection episode (kidney or pancreas) was 24.6% in the induction arm and 31.2% in the noninduction arm (p = 0.28). The 1-year cumulative incidence of biopsy-confirmed, treated, acute kidney allograft rejection was 13.1% and 23.0%, respectively (p = 0.08). Biopsy-confirmed kidney allograft rejection occurred later post-transplant and appeared to be less severe among recipients who received induction therapy. The highest rate of cytomegalovirus (CMV) viremia/syndrome was observed in the subgroup of recipients who received T-cell-depleting antibody induction and organs from CMV-seropositive donors. Decisions regarding the routine use of induction therapy in SPK transplantation must take into consideration its differential effects on risk of rejection and infection.