1
Portable hypothermic oxygenated machine perfusion for organ preservation in liver transplantation: A randomized, open-label, clinical trial. Hepatology 2024; 79:1033-1047. [PMID: 38090880 PMCID: PMC11019979 DOI: 10.1097/hep.0000000000000715]
Abstract
BACKGROUND AND AIMS In liver transplantation, cold preservation induces ischemia, resulting in significant reperfusion injury. Hypothermic oxygenated machine perfusion (HMP-O2) has shown benefits compared with static cold storage (SCS) by limiting ischemia-reperfusion injury. This study reports outcomes using a novel portable HMP-O2 device in the first US randomized controlled trial. APPROACH AND RESULTS The PILOT trial (NCT03484455) was a multicenter, randomized, open-label, noninferiority trial, with participants randomized to HMP-O2 or SCS. HMP-O2 livers were preserved using the Lifeport Liver Transporter and Vasosol perfusion solution. The primary outcome was early allograft dysfunction. The noninferiority margin was 7.5%. From April 3, 2019, to July 12, 2022, 179 patients were randomized to HMP-O2 (n = 90) or SCS (n = 89). The per-protocol cohort included 63 HMP-O2 and 73 SCS recipients. Early allograft dysfunction occurred in 11.1% of HMP-O2 (n = 7) and 16.4% of SCS (n = 12) recipients. The risk difference between HMP-O2 and SCS was -5.33% (one-sided 95% upper confidence limit of 5.81%), establishing noninferiority. The risk of graft failure as predicted by Liver Graft Assessment Following Transplant score at 7 days (L-GrAFT7) was lower with HMP-O2 [median (IQR) 3.4% (2.4-6.5) vs. 4.5% (2.9-9.4), p = 0.024]. Primary nonfunction occurred only in SCS recipients (2.2%; n = 3, p = 0.10). Biliary strictures occurred in 16.4% of SCS (n = 12) and 6.3% of HMP-O2 (n = 4) recipients (p = 0.18). Nonanastomotic biliary strictures occurred only with SCS (n = 4). CONCLUSIONS HMP-O2 demonstrates safety and noninferior efficacy for liver graft preservation compared with SCS. The risk of early allograft failure by L-GrAFT7 was lower with HMP-O2, suggesting improved early clinical function. Recipients of HMP-O2 livers also demonstrated a lower incidence of primary nonfunction and biliary strictures, although these differences did not reach significance.
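The noninferiority arithmetic reported in this abstract can be checked with a short sketch. The trial's exact interval method is not stated here, so a simple Wald approximation is assumed for the one-sided upper limit; it will not reproduce the reported 5.81% exactly, but the conclusion relative to the 7.5% margin can still be verified.

```python
import math

# Per-protocol counts taken from the abstract
ead_hmp, n_hmp = 7, 63    # early allograft dysfunction with HMP-O2 (11.1%)
ead_scs, n_scs = 12, 73   # early allograft dysfunction with SCS (16.4%)
margin = 0.075            # prespecified noninferiority margin

p1, p2 = ead_hmp / n_hmp, ead_scs / n_scs
rd = p1 - p2  # risk difference, HMP-O2 minus SCS (reported as -5.33%)

# Wald standard error for a difference of two proportions (assumed method)
se = math.sqrt(p1 * (1 - p1) / n_hmp + p2 * (1 - p2) / n_scs)
upper = rd + 1.645 * se  # one-sided 95% upper confidence limit

print(f"risk difference = {rd:.2%}, Wald upper limit = {upper:.2%}")
print("noninferior" if upper < margin else "noninferiority not established")
```

Under this approximation the upper limit lands near 4.3% rather than the reported 5.81%, consistent with the trial having used a different, likely more conservative, interval; either way the limit falls below the 7.5% margin, matching the stated conclusion.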
2
Impact of Portable Normothermic Blood-Based Machine Perfusion on Outcomes of Liver Transplant: The OCS Liver PROTECT Randomized Clinical Trial. JAMA Surg 2022; 157:189-198. [PMID: 34985503 PMCID: PMC8733869 DOI: 10.1001/jamasurg.2021.6781]
Abstract
Question Can oxygenated portable normothermic perfusion of deceased donor livers for transplant improve outcomes compared with the current standard of care using ischemic cold storage? Findings In this multicenter randomized clinical trial of 300 recipients of liver transplants with the donor liver preserved by either normothermic perfusion or conventional ischemic cold storage, normothermic machine perfusion resulted in decreased early liver graft injury and ischemic biliary complications and greater organ utilization. Meaning In this study, portable normothermic oxygenated machine perfusion of donor liver grafts resulted in improved outcomes after liver transplant and in more livers being transplanted.
Importance Ischemic cold storage (ICS) of livers for transplant is associated with serious posttransplant complications and underuse of liver allografts. Objective To determine whether portable normothermic machine perfusion preservation of livers obtained from deceased donors using the Organ Care System (OCS) Liver ameliorates early allograft dysfunction (EAD) and ischemic biliary complications (IBCs). Design, Setting, and Participants This multicenter randomized clinical trial (International Randomized Trial to Evaluate the Effectiveness of the Portable Organ Care System Liver for Preserving and Assessing Donor Livers for Transplantation) was conducted between November 2016 and October 2019 at 20 US liver transplant programs. The trial compared outcomes for 300 recipients of livers preserved using either OCS (n = 153) or ICS (n = 147). Participants were actively listed for liver transplant on the United Network for Organ Sharing national waiting list. Interventions Transplants were performed for recipients randomly assigned to receive donor livers preserved by either conventional ICS or the OCS Liver initiated at the donor hospital. Main Outcomes and Measures The primary effectiveness end point was incidence of EAD.
Secondary end points included OCS Liver ex vivo assessment capability of donor allografts, extent of reperfusion syndrome, incidence of IBC at 6 and 12 months, and overall recipient survival after transplant. The primary safety end point was the number of liver graft–related severe adverse events within 30 days after transplant. Results Of 293 patients in the per-protocol population, the primary analysis population for effectiveness, 151 were in the OCS Liver group (mean [SD] age, 57.1 [10.3] years; 102 [67%] men), and 142 were in the ICS group (mean [SD] age, 58.6 [10.0] years; 100 [68%] men). The primary effectiveness end point was met by a significant decrease in EAD (27 of 150 [18%] vs 44 of 141 [31%]; P = .01). Livers preserved with the OCS showed significantly less histopathologic evidence of ischemia-reperfusion injury after reperfusion (eg, less moderate to severe lobular inflammation: 9 of 150 [6%] for OCS Liver vs 18 of 141 [13%] for ICS; P = .004). The OCS Liver resulted in significantly higher use of livers from donors after cardiac death (28 of 55 [51%] for the OCS Liver vs 13 of 51 [26%] for ICS; P = .007). The OCS Liver was also associated with a significant reduction in the incidence of IBC at 6 months (1.3% vs 8.5%; P = .02) and at 12 months (2.6% vs 9.9%; P = .02) after transplant. Conclusions and Relevance This multicenter randomized clinical trial provides the first indication, to our knowledge, that normothermic machine perfusion preservation of deceased donor livers reduces both posttransplant EAD and IBC. Use of the OCS Liver also resulted in increased use of livers from donors after cardiac death. Together these findings indicate that OCS Liver preservation is associated with superior posttransplant outcomes and increased donor liver use. Trial Registration ClinicalTrials.gov Identifier: NCT02522871
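The headline EAD comparison (27 of 150 vs 44 of 141) can be reproduced with a standard two-proportion z-test; this is a sketch of that calculation, not necessarily the exact test the investigators used.

```python
import math

# Early allograft dysfunction counts from the per-protocol analysis
ead_ocs, n_ocs = 27, 150   # OCS Liver arm (18%)
ead_ics, n_ics = 44, 141   # ischemic cold storage arm (31%)

p1, p2 = ead_ocs / n_ocs, ead_ics / n_ics
p_pool = (ead_ocs + ead_ics) / (n_ocs + n_ics)  # pooled proportion under H0

# Pooled-variance z statistic and normal-approximation two-sided p-value
se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_ocs + 1 / n_ics))
z = (p1 - p2) / se
p_two_sided = math.erfc(abs(z) / math.sqrt(2))

print(f"z = {z:.2f}, p = {p_two_sided:.3f}")
```

The result (z ≈ -2.6, p ≈ 0.009) is consistent with the reported P = .01.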
3
Robot-assisted kidney transplantation is a safe alternative approach for morbidly obese patients with end-stage renal disease. Int J Med Robot 2021; 17:e2293. [PMID: 34080270 DOI: 10.1002/rcs.2293]
Abstract
BACKGROUND Many centres deny obese patients with a body mass index (BMI) >35 access to kidney transplantation due to increased intraoperative and postoperative complications. METHODS From August 2017 to December 2019, 73 consecutive cases of kidney transplantation in morbidly obese patients were enrolled at a single university center at the initiation of a robotic transplant surgery program. Outcomes of patients who underwent robot-assisted kidney transplant (RAKT) were compared to frequency-matched patients undergoing open kidney transplant (OKT). RESULTS A total of 24 morbidly obese patients successfully underwent RAKT, and 49 obese patients received an OKT. The RAKT group developed fewer surgical site infections (SSI) than the OKT group. Graft function, creatinine, and glomerular filtration rate (GFR) were similar between groups 1 year after surgery. Graft and patient survival were 100% for both groups. CONCLUSIONS RAKT offers a safe alternative for morbidly obese patients, who may otherwise be denied access to OKT.
4
Effects of Initial Hepatic Artery Followed by Portal Reperfusion Technique on Deceased Donor Liver Transplant Outcomes. Exp Clin Transplant 2021; 19:671-675. [PMID: 33928876 DOI: 10.6002/ect.2020.0555]
Abstract
OBJECTIVES Although initial portal vein reperfusion of a liver allograft is nearly standardized, limited data suggest initial hepatic artery reperfusion may improve hemodynamics and posttransplant outcomes. MATERIALS AND METHODS We retrospectively reviewed orthotopic liver transplants performed between January 2013 and February 2018. Parameters of liver recipients with initial hepatic artery reperfusion were compared with those with initial portal vein reperfusion. RESULTS Of 204 recipients, 53 (26%) were initially perfused from the hepatic artery and 151 (74%) were initially perfused from the portal vein. Demographics between groups did not differ. At 3 months and 1 year, there were no significant differences between recipients with initial hepatic artery reperfusion versus portal vein reperfusion in the incidence of acute rejection (1.9% vs 7.9% and 7.5% vs 10.6%; not significant), hepatic artery thrombosis (1.9% vs 4.0% and 1.9% vs 7.3%; not significant), biliary leakage (7.5% vs 4.0% and 9.4% vs 6.6%; not significant), biliary strictures (7.5% vs 5.3% and 11.3% vs 7.9%; not significant), or portal or hepatic venous thrombosis/stenosis (5.7% vs 5.3% and 7.5% vs 7.9%; not significant). Furthermore, recipients with initial hepatic artery reperfusion and portal vein reperfusion were both hospitalized for a median of 8.5 days (interquartile range, 6.5-15.5 vs 7.0-14.0 days, respectively), and both groups were in the intensive care unit for a median of 3 days (interquartile range, 2-7 vs 2-4 days, respectively). Initial hepatic artery reperfusion was associated with significantly less intraoperative packed red blood cell transfusion (median, 11.9 U [interquartile range, 11.1-13.1 U] vs 15.5 U [interquartile range, 12.9-17.9 U]; P < .001). The 2 groups did not differ in terms of patient and graft survival.
CONCLUSIONS Initial reperfusion of liver allografts with arterial, rather than portal, blood offered hemodynamic benefits, had no deleterious effects on outcomes, and resulted in less intraoperative blood utilization.
5
Pre-exposure prophylaxis among men who have sex with men in Côte d'Ivoire: a quantitative study of acceptability. AIDS Care 2020; 33:1228-1236. [PMID: 32603610 DOI: 10.1080/09540121.2020.1785997]
Abstract
This cross-sectional study was conducted in 2018 in Côte d'Ivoire to assess PrEP acceptability among men who have sex with men (MSM). Two hundred and one men were asked about their intention to use PrEP if it were made available. Logistic regression accounting for the sampling design was used to analyze associations between high PrEP acceptability and different independent variables, including barriers and facilitators. Participants were mostly young (mean age = 25 years), educated (82% with secondary/postsecondary education), and single (95.5%). On average, 3.4 episodes of anal sex were reported monthly, and 37.8% of men did not use a condom at last sex. Most MSM (72.6%) had heard of PrEP before enrollment. Overall, 35.3% reported that they would very probably use PrEP if it were made available. In multivariate analysis, factors associated with high PrEP acceptability were condom use at last sexual intercourse (odds ratio (OR) = 2.51; 95% confidence interval (95%CI) = 1.45-4.33); insertive sex as compared to versatile sex (OR = 2.56; 95%CI = 1.14-5.67); free PrEP delivery (OR = 2.45; 95%CI = 1.07-5.59); concerns about side effects (OR = 0.66; 95%CI = 0.48-0.90); and concern that post-PrEP antiretroviral therapy could be ineffective (OR = 0.25; 95%CI = 0.14-0.44). PrEP implementation among MSM in Côte d'Ivoire should be accompanied by awareness-raising campaigns explaining its utility.
6
Hepatic Artery Thrombosis and Takotsubo Syndrome After Liver Transplantation - Which Came First? Am J Case Rep 2020; 21:e920263. [PMID: 32287173 PMCID: PMC7176589 DOI: 10.12659/ajcr.920263]
Abstract
BACKGROUND Takotsubo syndrome is a transient, reversible, stress-induced cardiomyopathy that affects only 1.4% of liver transplant patients and can cause complications including cardiogenic shock, arrhythmia, and thromboembolism. Hepatic artery thrombosis is also rare, affecting just 2-4% of these patients, but can have disastrous consequences. Here, we describe a case of concurrent takotsubo syndrome and hepatic artery thrombosis in a postoperative liver transplant recipient. CASE REPORT The patient was a 66-year-old man who underwent living donor liver transplantation for non-alcoholic steatohepatitis. On postoperative day 3, he became lethargic and tachycardic to the 120s. Work-up, including EKG, troponin I, BNP, and transthoracic echocardiogram, was characteristic of takotsubo syndrome. His LVEF of 15-20% was markedly reduced compared to his baseline of 50-55% from 6 months prior. Hepatic ultrasonography showed no hepatic arterial flow, prompting emergent return to the OR, where intraoperative evaluation revealed hepatic artery thrombosis. The graft was salvaged after hepatic artery thrombectomy and arterial anastomosis revision. We are unable to determine which event caused the other in this case, as both takotsubo syndrome and hepatic artery thrombosis manifested within the same time frame. CONCLUSIONS It is important to recognize takotsubo syndrome as a potential cause of cardiac dysfunction and hepatic artery thrombosis in liver transplant patients, and also to be aware that hepatic artery thrombosis can precipitate takotsubo syndrome.
7
Intraoperative glycemic control in patients undergoing orthotopic liver transplant: a single center prospective randomized study. BMC Anesthesiol 2020; 20:3. [PMID: 31901245 PMCID: PMC6942664 DOI: 10.1186/s12871-019-0918-0]
Abstract
Background Perioperative hyperglycemia is associated with poor outcomes, yet evidence to guide intraoperative goals and treatment modalities during non-cardiac surgery is lacking. End-stage liver disease is associated with altered glucose homeostasis; patients undergoing liver transplantation display large fluctuations in blood glucose (BG) and represent a population of great interest. Here, we conducted a randomized trial to compare the effects of strict versus conventional glycemic control during orthotopic liver transplant (OLT). Methods Following approval by the Institutional Review Board of the University of Michigan Medical School and informed consent, 100 adult patients undergoing OLT were recruited. Patients were randomized to either strict (target BG 80–120 mg/dL) or conventional (target BG 180–200 mg/dL) BG control, with block randomization for diabetic and nondiabetic patients. The primary outcomes were 1-year patient and graft survival, assessed on an intention-to-treat basis. Graft loss was defined as death or the need for retransplantation (www.unos.org). Three- and 5-year patient and graft survival and infectious and biliary complications were measured as secondary outcomes. Data were examined using univariate methods and Kaplan-Meier survival analysis. A sensitivity analysis was performed to compare patients with a mean BG of ≤120 mg/dL and those >120 mg/dL regardless of treatment group. Results There was no statistically significant difference in patient survival between conventional and strict control, respectively: 1 year, 88% vs 88% (p = 0.99); 3 years, 86% vs 84% (p = 0.77); 5 years, 82% vs 78% (p = 0.36). Graft survival was not different between conventional and strict control groups: 1 year, 88% vs 84% (p = 0.56); 3 years, 82% vs 76% (p = 0.46); 5 years, 78% vs 70% (p = 0.36). Conclusion There was no difference in patient or graft survival between intraoperative strict and conventional glycemic control during OLT.
Trial registration Clinical trial number and registry: www.clinicaltrials.gov NCT00780026. This trial was retrospectively registered on 10/22/2008.
8
Geographic variation in liver transplantation persists despite implementation of Share35. Hepatol Res 2018; 48:225-232. [PMID: 28603899 DOI: 10.1111/hepr.12922]
Abstract
AIM Geographic disparities persist in the USA despite locoregional organ sharing policies. The impact of national organ sharing policies on waiting-list mortality on a regional basis remains unknown. METHODS Data on all adult liver transplants between 1 February 2002 and 31 March 2015 were obtained from the United Network for Organ Sharing/Organ Procurement and Transplantation Network. Multivariable Cox proportional hazards models were constructed in a time-to-event analysis to estimate waiting-list mortality for the pre- and post-Share35 eras. RESULTS In the analyzed time period, 134 247 patients were listed for transplantation and 54 510 received organs (42.8%). Listing volume increased following the introduction of the Share35 organ sharing policy (15 976 candidates pre- vs. 18 375 post) without significant regional changes, as did the number of transplants (7210 pre- vs. 8224 post). Waiting-list mortality improved from 12.2% to 8.1% (P < 0.001). Adjusted waiting-list mortality ratios remained geographically disparate. Regions 10 and 11 had lower hazard ratios (HR) but still had increased mortality (HR 1.46, 95% confidence interval [CI] 1.34-1.60, P < 0.001; and HR 1.49, 95% CI 1.37-1.62, P < 0.001, respectively). Regions 3 and 6 had higher HRs with persistently elevated waiting-list mortality (HR 1.79, 95% CI 1.66-1.93, P < 0.001; and HR 1.29, 95% CI 1.16-1.45, P < 0.001, respectively). Model for End-stage Liver Disease (MELD) exception continued to confer a survival benefit (HR 0.65, 95% CI 0.63-0.68, P < 0.001). CONCLUSIONS Although overall waiting-list mortality has decreased, geographic disparities persist, though they appear reduced under the broader sharing policies enacted by Share35. The advantage afforded by MELD exception, while still present, was diminished by Share35 as organs are being shifted to MELD >35 candidates.
The disparities highlighted by our findings imply a need to review current allocation policies to best balance local, regional, and national transplant environments.
9
PROviding Better ACcess To ORgans: A comprehensive overview of organ-access initiatives from the ASTS PROACTOR Task Force. Am J Transplant 2017; 17:2546-2558. [PMID: 28742951 DOI: 10.1111/ajt.14441]
Abstract
The American Society of Transplant Surgeons (ASTS) PROviding better Access To Organs (PROACTOR) Task Force was created to inform ongoing ASTS organ access efforts. Task force members were charged with comprehensively cataloguing current organ access activities and organizing them according to stakeholder type. This white paper summarizes the task force findings and makes recommendations for future ASTS organ access initiatives.
11
Advanced non-alcoholic steatohepatitis cirrhosis: A high-risk population for pre-liver transplant portal vein thrombosis. World J Hepatol 2017; 9:139-146. [PMID: 28217250 PMCID: PMC5295147 DOI: 10.4254/wjh.v9.i3.139]
Abstract
AIM To examine if liver transplant recipients with high-risk non-alcoholic steatohepatitis (NASH) are at increased risk for pre-transplant portal venous thrombosis.
METHODS Data on all liver transplants in the United States from February 2002 through September 2014 were analyzed. Recipients were sorted into three distinct groups: High-risk (age > 60, body mass index > 30 kg/m2, hypertension and diabetes), low-risk and non-NASH cirrhosis. Multivariable logistic regression models were constructed.
RESULTS A total of 35 072 candidates underwent liver transplantation; of these recipients, 465 were transplanted for high-risk NASH and 2775 for low-risk NASH. Overall, 2626 (7.5%) recipients had pre-transplant portal vein thrombosis: 66 (14.2%) in the high-risk NASH group versus 328 (11.8%) in the low-risk NASH group. In general, all NASH recipients were less likely to be male or African American and more likely to be obese. In adjusted multivariable regression analyses, high-risk recipients had the greatest risk of pre-transplant portal vein thrombosis, with OR = 2.11 (95%CI: 1.60-2.76, P < 0.001) when referenced to the non-NASH group.
CONCLUSION Liver transplant candidates with high-risk NASH are at the greatest risk for portal vein thrombosis development prior to transplantation. These candidates may benefit from interventions to decrease their likelihood of clot formation and resultant downstream hepatic decompensating events. Prospective study is needed.
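From the counts reported in this abstract, an unadjusted odds ratio for high-risk NASH versus non-NASH can be reconstructed as a sanity check. The non-NASH cell counts below are derived by subtraction from the abstract's totals, and the paper's OR of 2.11 comes from an adjusted multivariable model, so exact agreement is not expected.

```python
# 2x2 table reconstructed from the abstract's totals (a derivation, not reported directly)
n_total, pvt_total = 35072, 2626
pvt_hr, n_hr = 66, 465          # high-risk NASH: PVT cases / group size
pvt_lr, n_lr = 328, 2775        # low-risk NASH: PVT cases / group size

# Non-NASH reference group, obtained by subtraction
n_ref = n_total - n_hr - n_lr           # remaining recipients
pvt_ref = pvt_total - pvt_hr - pvt_lr   # remaining portal vein thromboses

# Unadjusted odds ratio, high-risk NASH versus non-NASH
odds_hr = pvt_hr / (n_hr - pvt_hr)
odds_ref = pvt_ref / (n_ref - pvt_ref)
or_unadj = odds_hr / odds_ref
print(f"unadjusted OR = {or_unadj:.2f}")  # lands close to the adjusted OR of 2.11
```

The unadjusted estimate comes out near 2.2, in line with the adjusted OR of 2.11, suggesting covariate adjustment moved the estimate only slightly.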
12
Liver transplant recipients with portal vein thrombosis receiving an organ from a high-risk donor are at an increased risk for graft loss due to hepatic artery thrombosis. Transpl Int 2016; 29:1286-1295. [PMID: 27714853 DOI: 10.1111/tri.12855]
Abstract
We hypothesize that recipients with pretransplant portal vein thrombosis (PVT) receiving organs from high-risk donors (HRD) are at an increased risk of HAT. Data on all liver transplants in the United States from February 2002 to March 2015 were analyzed. Recipients were sorted into two groups: those with PVT and those without. HRDs were defined by donor risk index (DRI) >1.7. Multivariable logistic regression models were constructed to assess the independent risk factors for HAT with the resultant graft loss ≤90 days from transplantation. A total of 60 404 candidates underwent liver transplantation; of those recipients, 623 (1.0%) had HAT, of which 66.0% (n = 411) received organs from HRDs compared with 49.3% (n = 29 473) in recipients without HAT (P < 0.001); 2250 (3.7%) recipients had pretransplantation PVT and received organs from HRDs. On adjusted multivariable analysis, PVT with a HRD organ was the most significant independent risk factor (OR 3.56, 95% CI 2.52-5.02, P < 0.001) for the development of HAT. Candidates with pretransplant PVT who receive an organ from a HRD are at the highest risk for postoperative HAT independent of other measurable factors. Recipients with pretransplant PVT would benefit from careful donor selection and possibly anticoagulation perioperatively.
13
Pre-transplant portal vein thrombosis is an independent risk factor for graft loss due to hepatic artery thrombosis in liver transplant recipients. HPB (Oxford) 2016; 18:279-286. [PMID: 27017168 PMCID: PMC4814623 DOI: 10.1016/j.hpb.2015.10.008]
Abstract
BACKGROUND Hepatic artery thrombosis is an uncommon but catastrophic complication following liver transplantation. We hypothesize that recipients with portal vein thrombosis are at increased risk. METHODS Data on all liver transplants in the U.S. during the MELD era through September 2014 were obtained from UNOS. Status one, multivisceral, living donor, re-transplants, pediatric recipients and donation after cardiac death were excluded. Logistic regression models were constructed for hepatic artery thrombosis with resultant graft loss within 90 days of transplantation. RESULTS 63,182 recipients underwent transplantation; 662 (1.1%) recipients had early hepatic artery thrombosis; of those, 91 (13.8%) had pre-transplant portal vein thrombosis, versus 7.5% with portal vein thrombosis but no hepatic artery thrombosis (p < 0.0001). Portal vein thrombosis was associated with an increased independent risk of hepatic artery thrombosis (OR 2.17, 95% CI 1.71-2.76, p < 0.001) as was donor risk index (OR 2.02, 95% CI 1.65-2.48, p < 0.001). Heparin use at cross clamp, INR, and male donors were all significantly associated with lower risk. DISCUSSION Pre-transplant portal vein thrombosis is associated with post-transplant hepatic artery thrombosis independent of other factors. Recipients with portal vein thrombosis might benefit from aggressive coagulation management and careful donor selection. More research is needed to determine causal mechanism.
14
Increased risk of portal vein thrombosis in patients with cirrhosis due to nonalcoholic steatohepatitis. Liver Transpl 2015; 21:1016-1021. [PMID: 25845711 PMCID: PMC6615024 DOI: 10.1002/lt.24134]
Abstract
Portal vein thrombosis (PVT) is a common complication of cirrhosis sometimes implicated in hepatic decompensation. There are no consistent epidemiologic data to suggest an increased risk of thrombotic complications in nonalcoholic steatohepatitis (NASH); however, research suggests an increased risk of thrombosis. Our aim was to examine the independent association between NASH cirrhosis and PVT in patients who underwent liver transplantation (LT) in a cross-sectional study. Data on all LTs occurring in the United States between January 1, 2003 and December 31, 2012 were obtained from the United Network for Organ Sharing. Multivariable models were constructed to assess the statistical associations and risk factors for the development of PVT. A total of 33,368 patients underwent transplantation. Of these, 2096 (6.3%) had PVT. Of the patients with PVT, 12.0% had NASH. When we compared these patients to a composite of all other causes of cirrhosis, an increased prevalence of PVT was again found, with 10.1% having PVT at the time of transplantation versus 6.0% without NASH (P < 0.001). The strongest risk factor independently associated with a diagnosis of PVT in a multivariable analysis was NASH cirrhosis (odds ratio, 1.55; 95% confidence interval, 1.33-1.81; P < 0.001). NASH cirrhosis appears to predispose a patient to PVT independently of other risk factors. These epidemiological findings provide support for the idea that NASH is a prothrombotic state, and they should lead to more research in treatment and prevention in this population.
15
Photodynamic therapy provides local control of cholangiocarcinoma in patients awaiting liver transplantation. Am J Transplant 2014; 14:466-471. [PMID: 24373228 DOI: 10.1111/ajt.12597]
Abstract
Many transplant centers use endoscopically directed brachytherapy to provide locoregional control in patients with otherwise incurable cholangiocarcinoma (CCA) who are awaiting liver transplantation (LT). The use of endoscopic retrograde cholangiopancreatography (ERCP)-directed photodynamic therapy (PDT) as an alternative to brachytherapy for providing locoregional control in this patient population has not been studied. The aim of this study was to report on our initial experience using ERCP-directed PDT to provide local control in patients with unresectable CCA who were awaiting LT. Patients with unresectable CCA who underwent protocol-driven neoadjuvant chemoradiation and ERCP-directed PDT with the intent of undergoing LT were reviewed. Four patients with confirmed or suspected CCA met the inclusion criteria for protocol LT. All four patients (100%) successfully underwent ERCP-directed PDT. All patients had chemoradiation dose delays, and two patients had recurrent cholangitis despite PDT. None of these patients had progressive locoregional disease or distant metastasis following PDT. All four patients (100%) underwent LT. Intention-to-treat disease-free survival was 75% at mean follow-up of 28.1 months. In summary, ERCP-directed PDT is a reasonably well tolerated and safe procedure that may have benefit by maintaining locoregional tumor control in patients with CCA who are awaiting LT.
16
A prospective, randomized trial of complete avoidance of steroids in liver transplantation with follow-up of over 7 years. HPB (Oxford) 2013; 15:286-293. [PMID: 23458449 PMCID: PMC3608983 DOI: 10.1111/j.1477-2574.2012.00576.x]
Abstract
OBJECTIVES Steroids are a mainstay of treatment in orthotopic liver transplantation (OLT) and are associated with significant morbidity. This trial was conducted to assess the efficacy of steroid avoidance. METHODS Patients undergoing OLT between June 2002 and April 2005 were entered into a prospective, randomized trial of complete steroid avoidance and followed until November 2011. Recipients received either standard therapy (n = 50) or complete steroid avoidance (n = 50). Analyses were performed on an intention-to-treat basis. The mean follow-up of all recipients was 2095 ± 117 days. Sixteen (32%) recipients randomized to the steroid avoidance group ultimately received steroids for clinical indications. RESULTS Incidences of diabetes and hypertension prior to or after OLT were similar in both groups, as was the incidence of rejection. Patient and graft survival rates at 1, 3 and 5 years were lower in the steroid avoidance group than in the standard therapy group (patient survival: 1-year, 80% versus 86%; 3-year, 68% versus 76%; 5-year, 60% versus 72%; graft survival: 1-year, 76% versus 76%; 3-year, 64% versus 74%; 5-year, 56% versus 72%), but the differences were not statistically significant. CONCLUSIONS Complete steroid avoidance provides liver transplant recipients with minimal benefit and appears to result in a concerning trend towards decreased graft and recipient survival. The present data support the use of at least a short course of steroids after liver transplantation.
|
17
|
Incidence and risk factors of hepatocellular carcinoma recurrence after liver transplantation in the MELD era. Dig Dis Sci 2012; 57:806-12. [PMID: 21953139 PMCID: PMC3288660 DOI: 10.1007/s10620-011-1910-9] [Citation(s) in RCA: 38] [Impact Index Per Article: 3.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 07/07/2011] [Accepted: 09/02/2011] [Indexed: 02/06/2023]
Abstract
BACKGROUND AND AIMS Deceased donor liver transplantation (DDLT) rates for candidates with hepatocellular carcinoma (HCC) have significantly increased in the MELD era because of the extra priority given to these candidates. We examined the incidence and pre-DDLT radiological and donor factors associated with post-DDLT HCC recurrence in the MELD era. METHODS Outcomes of HCC candidates aged ≥18 years who underwent DDLT between 2/28/02 and 6/30/08 (n = 94) were reviewed. The primary outcome was biopsy-proven post-LT HCC recurrence at any site. Kaplan-Meier analysis was used to calculate the cumulative incidence, and Cox regression was used to identify the predictors of post-LT HCC recurrence. RESULTS The median age of the 94 candidates who met the study criteria was 54 years, 64% had hepatitis C, median laboratory MELD was 13, and median pre-LT AFP was 47 ng/mL. Based upon pre-DDLT imaging, 94% of candidates met the Milan criteria. The median waiting time to transplant was 47 days, and 27% received pre-DDLT loco-regional therapy. Seventeen (18%) developed HCC recurrence after a median of 2.1 years, with a cumulative incidence of 6.8%, 12%, and 19% at 1, 2, and 3 years post-DDLT. The pre-DDLT number of lesions (p = 0.015), largest lesion diameter (p = 0.008), and higher donor age (p = 0.002) were the significant predictors of HCC recurrence after adjusting for pre-LT loco-regional therapy and waiting time. Post-LT HCC recurrence (p < 0.0001) and higher donor age (p = 0.029) were associated with lower post-LT survival. CONCLUSIONS Post-LT HCC recurrence is higher in our MELD era cohort than the reported rate of 8% at 4 years in Mazzaferro et al.'s study. The risk of HCC recurrence was significantly associated with the number of lesions and the size of the largest lesion at the time of DDLT, as well as with older donor age. Risk stratification using a predictive model for post-LT HCC recurrence based on pre-LT imaging and donor factors may help guide candidate selection and tailoring of HCC surveillance strategies after LT.
|
18
|
Abstract
It is challenging to discuss the use of high-risk organs with patients, in part because of the lack of information about how patients view this topic. This study was designed to determine how patients think about organ quality and to test formats for risk communication. Semistructured interviews of 10 patients on the waiting list revealed limited understanding about the spectrum of organ quality and a reluctance to consider anything but the best organs. A computerized quantitative survey was then conducted with an interactive graph to elicit the risk of graft failure that patients would accept. Fifty-eight percent of the 95 wait-listed patients who completed the survey would accept only organs with a risk of graft failure of 25% or less at 3 years, whereas 18% would accept only organs with the lowest risk possible (19% at 3 years). Risk tolerance was increased when the organ quality was presented relative to average organs rather than the best organs and when feedback was provided about the implications for organ availability. More than three-quarters of the patients reported that they wanted an equal or dominant role in organ acceptance decisions. Men tended to prefer lower risk organs (mean acceptable risk = 29%) in comparison with women (mean acceptable risk = 35%, P = 0.04), but risk tolerance was not associated with other demographic or clinical characteristics (eg, the severity of liver disease). In summary, patients want to be involved in decisions about organ quality. Patients' risk tolerance varies widely, and their acceptance of high-risk organs can be facilitated if we present the risks of graft failure with respect to average organs and provide feedback about the implications for organ availability.
|
19
|
Value of delayed hypointensity and delayed enhancing rim in magnetic resonance imaging diagnosis of small hepatocellular carcinoma in the cirrhotic liver. J Magn Reson Imaging 2010; 32:360-6. [PMID: 20677263 DOI: 10.1002/jmri.22271] [Citation(s) in RCA: 94] [Impact Index Per Article: 6.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/19/2022] Open
Abstract
PURPOSE To determine the diagnostic utility of delayed hypointensity and delayed enhancing rim on magnetic resonance imaging (MRI) as indicators of hepatocellular carcinoma (HCC) in arterially enhancing nodules ≤5 cm in the cirrhotic liver, and to determine the features that best predict HCC. MATERIALS AND METHODS Gadolinium-enhanced MRI studies performed from January 2001 to December 2004 in patients with cirrhosis were evaluated for arterially enhancing nodules measuring ≤5 cm. Verification was via explant correlation, biopsy, or imaging follow-up. Sensitivity and specificity of diagnostic features of HCC were calculated. Features predictive of HCC were determined using the Generalized Estimating Equation approach. RESULTS In all, 116 arterially enhancing nodules were identified in 80 patients (<2 cm: n = 79; 2-5 cm: n = 37). Sensitivity and specificity of delayed hypointensity for HCC measuring ≤5 cm, 2-5 cm, and <2 cm were 0.54 (40 of 74) and 0.86 (36 of 42); 0.72 (23 of 32) and 0.80 (4 of 5); and 0.41 (17 of 42) and 0.87 (32 of 37), respectively. For the delayed enhancing rim, sensitivity and specificity were 0.64 (47 of 74) and 0.86 (36 of 42); 0.75 (24 of 32) and 1.0 (5 of 5); and 0.55 (23 of 42) and 0.83 (31 of 37), respectively. Lesion size (≥2 cm) and delayed enhancing rim, as main features and their interaction, were the most significant predictors of HCC. CONCLUSION Delayed hypointensity and enhancing rim improve the specificity of diagnosis of HCC of all sizes but are seen less frequently in small (<2 cm) HCC. Nodule size (≥2 cm) and delayed enhancing rim are the strongest predictors of HCC.
|
20
|
Long-term efficacy of nucleoside monotherapy in preventing HBV infection in HBsAg-negative recipients of anti-HBc-positive donor livers. Hepatol Int 2010; 4:707-15. [PMID: 21286341 DOI: 10.1007/s12072-010-9188-0] [Citation(s) in RCA: 19] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 02/14/2010] [Accepted: 07/09/2010] [Indexed: 01/10/2023]
Abstract
BACKGROUND AND AIM Transmission of hepatitis B virus (HBV) infection occurs in up to 87.5% of HBsAg-negative recipients of anti-HBc-positive donor livers in the absence of HBV prophylaxis, and there is no standardized prophylactic regimen to prevent HBV infection in this setting. The aim of this study was to determine the long-term efficacy of nucleoside analogue monotherapy in preventing HBV infection in these recipients. METHODS A retrospective study of HBsAg-negative patients receiving liver transplantation (LT) from anti-HBc-positive donors during a 10-year period. RESULTS Twenty patients were studied; mean age was 50.2 ± 8.3 years, 40% were men, and 90% were Caucasian. The median MELD score at the time of LT was 18 (12-40). None of the patients received hepatitis B immune globulin. Eighteen patients received nucleoside analogue monotherapy: 10 received lamivudine and 8 received entecavir. None of these 18 patients developed HBV infection after a median follow-up of 32 (1-75) months. One patient received a second course of hepatitis B vaccine 50 months after LT, achieving an anti-HBs titer above 1,000 mIU/mL; lamivudine was then discontinued, and the patient remained HBsAg negative 18 months after withdrawal. Two patients who were anti-HBs positive before LT were not started on HBV prophylaxis after LT; both developed HBV infection. CONCLUSIONS Nucleoside monotherapy is sufficient to prevent HBV infection in HBsAg-negative recipients of anti-HBc-positive donor livers. HBV prophylaxis is necessary even in anti-HBs-positive recipients of anti-HBc-positive donor livers.
|
21
|
Abstract
Changes in organ allocation policy in 2002 reduced the number of adult patients on the liver transplant waiting list, changed the characteristics of transplant recipients and increased the number of patients receiving simultaneous liver-kidney transplantation (SLK). The number of liver transplants peaked in 2006 and declined marginally in 2007 and 2008. During this period, there was an increase in donor age, the Donor Risk Index, the number of candidates receiving MELD exception scores and the number of recipients with hepatocellular carcinoma. In contrast, there was a decrease in retransplantation rates, and the number of patients receiving grafts from either a living donor or from donation after cardiac death. The proportion of patients with severe obesity, diabetes and renal insufficiency increased during this period. Despite increases in donor and recipient risk factors, there was a trend towards better 1-year graft and patient survival between 1998 and 2007. Of major concern, however, were considerable regional variations in waiting time and posttransplant survival. The current status of liver transplantation in the United States between 1999 and 2008 was analyzed using SRTR data. In addition to a general summary, we have included a more detailed analysis of liver transplantation for hepatitis C, retransplantation and SLK transplantation.
|
22
|
Abstract
The effects of occlusive portal vein thrombosis (PVT) on the survival of patients with cirrhosis are unknown. This was a retrospective cohort study at a single center. The main exposure variable was the presence of occlusive PVT. The primary outcome measure was time-dependent mortality. A total of 3295 patients were analyzed, and 148 (4.5%) had PVT. Variables independently predictive of mortality from the time of liver transplant evaluation included age [hazard ratio (HR), 1.02; 95% confidence interval (CI), 1.01-1.03], Model for End-Stage Liver Disease (MELD) score (HR, 1.10; 95% CI, 1.08-1.11), hepatitis C (HR, 1.44; 95% CI, 1.24-1.68), and PVT (HR, 2.61; 95% CI, 1.97-3.51). Variables independently associated with the risk of mortality from the time of liver transplant listing included age (HR, 1.02; 95% CI, 1.01-1.03), transplantation (HR, 0.65; 95% CI, 0.50-0.81), MELD (HR, 1.08; 95% CI, 1.06-1.10), hepatitis C (HR, 1.50; 95% CI, 1.18-1.90), and PVT (HR, 1.99; 95% CI, 1.25-3.16). The presence of occlusive PVT at the time of liver transplantation was associated with an increased risk of death at 30 days (odds ratio, 7.39; 95% CI, 2.39-22.83). In conclusion, patients with cirrhosis complicated by PVT have an increased risk of death.
|
23
|
An intention-to-treat analysis of liver transplantation for hepatocellular carcinoma using organ procurement transplant network data. Liver Transpl 2009; 15:859-68. [PMID: 19642139 DOI: 10.1002/lt.21778] [Citation(s) in RCA: 119] [Impact Index Per Article: 7.9] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Indexed: 02/07/2023]
Abstract
Single-center studies have shown acceptable long-term outcomes following orthotopic liver transplantation (OLT) for hepatocellular carcinoma (HCC) when tumors are within the Milan criteria. However, the overall survival and waiting list removal rates have not been described at a national level with pooled registry data. To evaluate this, a retrospective cohort of patients listed for OLT with a diagnosis of HCC between January 1998 and March 2006 was identified from Organ Procurement Transplant Network data. Analysis was performed from the time of listing. Adjusted Cox models were used to assess the relative effect of potential confounders on removal from the waiting list as well as survival from the time of wait listing. A total of 4482 patients with HCC were placed on the liver waiting list during the study period. Of these, 65% underwent transplantation, and 18% were removed from the list because of tumor progression or death. The overall 1- and 5-year intent-to-treat survival for all patients listed was 81% and 51%, respectively. The 1- and 5-year survival was 89% and 61% for those listed with tumors meeting the Milan criteria versus 70% and 32% for those exceeding the Milan criteria (P < 0.0001). On multivariate analysis, advanced liver failure manifested by Child-Pugh class B or C increased the risk of death, while age < 55 years, meeting the Milan criteria, and obtaining a liver transplant were associated with better survival. The current criteria for liver transplantation of candidates with HCC lead to acceptable 5-year survival while limiting the dropout rate.
|
24
|
Should heart, lung, and liver transplant recipients receive immunosuppression induction for kidney transplantation? Clin Transplant 2009; 24:67-72. [PMID: 19222505 DOI: 10.1111/j.1399-0012.2009.00973.x] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/28/2022]
Abstract
As the outcomes of heart, liver, and lung transplantation continue to improve, more patients will present for subsequent renal transplantation. It remains unclear whether these patients benefit from induction immunosuppression. We retrospectively reviewed induction immunosuppression in prior solid organ graft recipients who underwent renal transplant at our center from January 1, 1995 to March 30, 2007. The induction and non-induction groups were compared by univariate and Kaplan-Meier analyses. There were 21 patients in each group, with mean follow-up of 4.5-6.0 years. Forty-seven percent of patients receiving induction had a severe postoperative infection, compared with 28.6% in the non-induction group (p = NS). The one-year rejection rate in the induction group was 9.5%, compared with 14.3% for non-induction (p = NS). One-year graft survival was 81.0% and 95.2% in the induction and non-induction groups, respectively (p = NS). In summary, there is a trend toward lower patient and graft survival among patients undergoing induction. These trends could relate to selection bias in the decision to prescribe induction immunosuppression, but further study is needed to better define the risks and benefits of antibody-induction regimens in this population.
|
25
|
Impact of the model for end-stage liver disease allocation policy on the use of high-risk organs for liver transplantation. Gastroenterology 2008; 135:1568-74. [PMID: 19009713 DOI: 10.1053/j.gastro.2008.08.003] [Citation(s) in RCA: 51] [Impact Index Per Article: 3.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Indexed: 12/12/2022]
Abstract
BACKGROUND & AIMS Although priority for liver transplantation is determined by the model for end-stage liver disease (MELD) score, the quality of organs used is subject to physician discretion. We aimed to determine whether implementation of MELD affected the quality of organs transplanted, the type of patients who receive the higher-risk organs, and the impact of these changes on posttransplant survival. METHODS Data were analyzed from the United Network for Organ Sharing on adults who underwent deceased-donor liver transplantation between January 1, 1997, and August 1, 2007 (n = 47,985). Dependent variables included the donor risk index (a continuous variable that measures the risk of graft failure associated with a particular organ) and patient survival after transplantation. RESULTS The overall organ quality of transplanted livers has worsened since MELD implementation, with an increase in the donor risk index equivalent to a 4% increased risk of graft failure after adjusting for temporal trends (P < .001). This was accompanied by a shift from using the higher-risk organs in the more urgent patients (in the pre-MELD era) to using the higher-risk organs in the less urgent patients (in the post-MELD era). Posttransplant survival has worsened over time (hazard ratio, 1.017/y; P = .005) among the less urgent patients (MELD scores <20); mediation analysis suggests this change in survival was caused primarily by changes in organ quality. CONCLUSIONS As an unintended consequence of the MELD allocation policy, patients who are least in need of a liver transplant now receive the highest-risk organs. This has reduced posttransplant survival in recent years among patients with low MELD scores.
|
26
|
Comparison of histidine-tryptophan-ketoglutarate and University of Wisconsin preservation in renal transplantation. Am J Transplant 2008; 8:567-73. [PMID: 18162093 DOI: 10.1111/j.1600-6143.2007.02065.x] [Citation(s) in RCA: 35] [Impact Index Per Article: 2.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/25/2023]
Abstract
Histidine-tryptophan-ketoglutarate (HTK) is replacing University of Wisconsin (UW) solution as the preservation fluid for renal allografts in many centers, but recent large-scale data to support this transition are lacking. We conducted a retrospective analysis of patient and graft outcomes after renal transplantation at our center, comparing 475 consecutive living donor and 317 deceased donor transplants since the adoption of HTK with equal numbers of grafts preserved using UW solution. Data collected included donor and recipient age, race, sex, comorbidities and graft ischemia time. Graft and patient survival, as well as the incidence of delayed graft function (DGF), were studied by Kaplan-Meier and Cox regression analysis. No significant difference was seen in either patient or graft survival. Deceased donor kidneys in the HTK group had a higher incidence of DGF than the UW cohort, whereas this trend was reversed in the case of living donor organs. In multivariate analysis, HTK was associated with a significant risk reduction on the incidence of DGF. Prolonged preservation with HTK compared to UW was not associated with excess risk to the graft or patient. In summary, HTK demonstrated efficacy similar to UW in terms of patient and graft survival.
|
27
|
Abstract
Surgical complications following pediatric liver transplantation are common and expensive. We examined the incremental costs of surgical complications and determined who pays for these complications (center or payer). We reviewed the records of 36 pediatric liver transplant patients aged ≤12 years transplanted between July 1, 2002 and December 31, 2005. The association of recipient and financial data points was assessed. On univariate analysis, total hospital costs were significantly increased in patients with acute cellular rejection (ACR), primary nonfunction (PNF), hepatic artery thrombosis (HAT), biliary complications, and acute renal failure (ARF). Reimbursement by the payer was significantly increased in patients with PNF, HAT, biliary complications, and ARF. Hospital profits were significantly decreased in recipients with ACR and pneumonia. Multiple linear regression models (controlling for recipient factors) revealed that ARF and HAT were independently associated with a significant increase in median hospital costs (incremental costs of $238,990 and $125,650, respectively). ARF and HAT were also independently associated with a significant increase in median reimbursements (incremental costs of $231,611 and $125,287, respectively). No complications were independently associated with hospital margins. All parties (patients and families, physicians, payers, and medical centers) should benefit from quality improvement efforts, with payers having the largest financial interest.
|
28
|
Abstract
Over the past several years we have noted a marked decrease in the profitability of our kidney transplant program. Our hypothesis is that this reduction in kidney transplant institutional profitability is related to aggressive donor and recipient practices. The study population included all adults with Medicare insurance who received a kidney transplant at our center between 1999 and 2005. Adopting the hospital perspective, we fitted multivariable linear regression models to determine the independent effects of donor and recipient characteristics and era effects on total reimbursements and total hospital margin. We note statistically significant decreases in medical center incremental margins in cases with expanded criteria donors (ECDs; -$5,887) and in cases of delayed graft function (DGF; -$4,937). We also note that the medical center margin is independently associated with year, declining at a rate of $5,278 per year, related to both increasing costs and decreasing Medicare reimbursements. The financial losses associated with DGF and the use of ECD kidneys may resonate with other centers and could hinder efforts to expand kidney transplantation within the United States. The Centers for Medicare and Medicaid Services (CMS) should consider risk-adjusted reimbursement for kidney transplantation.
|
29
|
Biliary complications following liver transplantation in the model for end-stage liver disease era: effect of donor, recipient, and technical factors. Liver Transpl 2008; 14:73-80. [PMID: 18161843 DOI: 10.1002/lt.21354] [Citation(s) in RCA: 154] [Impact Index Per Article: 9.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Indexed: 02/06/2023]
Abstract
Biliary complications remain a significant problem following liver transplantation in the Model for End-Stage Liver Disease (MELD) era. We hypothesized that donor, recipient, and technical variables may differentially affect anastomotic biliary complications in MELD era liver transplants. We reviewed 256 deceased donor liver transplants after the institution of MELD at our center and evaluated these variables' association with anastomotic biliary complications. The bile leak rate was 18%, and the stricture rate was 23%. Univariate analysis revealed that recipient age, MELD, donor age, and warm ischemia were risk factors for leak, whereas a Roux limb or stent was protective. A bile leak was a risk factor for anastomotic stricture, whereas use of histidine tryptophan ketoglutarate (HTK) versus University of Wisconsin (UW) solution was protective. Additionally, use of a transcystic tube/stent was also protective. Multivariate analysis showed that warm ischemia was the only independent risk factor for a leak, whereas development of a leak was the only independent risk factor for a stricture. HTK versus UW use and transcystic tube/stent use were the only independent protective factors against stricture. Use of an internal stent trended in the multivariate analysis toward being protective against leaks and strictures, but this was not quite statistically significant. This represents one of the first MELD era studies of deceased donor liver transplants evaluating factors affecting the incidence of anastomotic bile leaks and strictures. Donor, recipient, and technical factors appear to differentially affect the incidence of anastomotic biliary complications, with warm ischemia, use of HTK, and use of a stent emerging as the most important variables.
|
30
|
Abstract
Obese patients are at higher risk for morbidity and mortality after liver transplantation (LT) than nonobese recipients. However, there are no reports assessing the survival benefit of LT according to recipient body mass index (BMI). A retrospective cohort of liver transplant candidates who were initially wait-listed between September 2001 and December 2004 was identified in the Scientific Registry of Transplant Recipients database. Adjusted Cox regression models were fitted to assess the association between BMI and liver transplant survival benefit (posttransplantation vs. waiting list mortality). During the study period, 25,647 patients were placed on the waiting list. Of these, 4,488 (17%) underwent LT by December 31, 2004. At wait-listing and transplantation, similar proportions were morbidly obese (BMI ≥40: 3.8% vs. 3.4%, respectively) and underweight (BMI <20: 4.5% vs. 4.0%, respectively). Underweight patients experienced a significantly higher covariate-adjusted risk of death on the waiting list (hazard ratio [HR] = 1.61; P < 0.0001) compared to normal weight candidates (BMI 20 to <25), but underweight recipients had a similar risk of posttransplantation death (HR = 1.28; P = 0.15) compared to recipients of normal weight. In conclusion, compared to patients on the waiting list with a similar BMI, all subgroups of liver transplant recipients demonstrated a significant (P < 0.0001) survival benefit, including morbidly obese and underweight recipients. Our results suggest that high or low recipient BMI should not be a contraindication for LT.
|
31
|
Seasonal variation in surgical outcomes as measured by the American College of Surgeons-National Surgical Quality Improvement Program (ACS-NSQIP). Ann Surg 2007; 246:456-62; discussion 463-5. [PMID: 17717449 PMCID: PMC1959349 DOI: 10.1097/sla.0b013e31814855f2] [Citation(s) in RCA: 148] [Impact Index Per Article: 8.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/21/2023]
Abstract
OBJECTIVE We hypothesize that the systems of care within academic medical centers are sufficiently disrupted at the beginning of a new academic year to affect patient outcomes. METHODS This observational multi-institutional cohort study was conducted by analysis of the National Surgical Quality Improvement Program-Patient Safety in Surgery Study database. The 30-day morbidity and mortality rates were compared between 2 periods of care: the early group (July 1 to August 30) and the late group (April 15 to June 15). Patient baseline characteristics were first compared between the early and late periods. A prediction model was then constructed via stepwise logistic regression, with a significance level of 0.05 for both entry and selection. RESULTS There was an 18% higher risk of postoperative morbidity in the early group (n = 9941) versus the late group (n = 10313) (OR 1.18, 95% CI 1.07-1.29, P = 0.0005, c-index 0.794). There was a 41% higher risk of mortality in the early group compared with the late group (OR 1.41, 95% CI 1.11-1.80, P = 0.005, c-index 0.938). No significant trends in patient risk over time were noted. CONCLUSION Our data suggest higher rates of postsurgical morbidity and mortality related to the time of the year. Further study is needed to fully describe the etiologies of the seasonal variation in outcomes.
|
32
|
Is routine ureteral stenting cost-effective in renal transplantation? J Urol 2007; 178:2509-13; discussion 2513. [PMID: 17937936 DOI: 10.1016/j.juro.2007.08.037] [Citation(s) in RCA: 24] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/10/2007] [Indexed: 02/08/2023]
Abstract
PURPOSE Recent collective reviews show that ureteral stenting decreases ureteroneocystostomy anastomotic complications following renal transplantation. We identified the specific morbidity associated with urinary complications following renal transplantation and quantified the health care resources required to treat these patients at a high volume center. MATERIALS AND METHODS Prospective databases were used to identify renal transplant patients who had urinary complications and to track postoperative hospital readmissions and admission diagnostic codes. Financial models were used to estimate the variable direct costs of prophylactic stent placement and removal. Cost-based analysis was performed to assess the financial feasibility of routine stenting following renal transplantation. RESULTS Patient-specific morbidity and hospital readmissions were significantly increased in transplant patients who had a urinary complication. The incremental hospital cost incurred by a renal transplant patient with urinary leakage during the first 12 months postoperatively was $20,121. Routine placement of an anastomotic stent was inexpensive: approximately 22 or 23 stents could be placed for the same incremental cost as treating 1 patient with a urinary complication in the hospital. CONCLUSIONS Urinary anastomotic complications following renal transplantation are highly morbid. Even with modest decreases in urinary complications, prophylactic ureteral stent placement is financially advantageous.
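The break-even arithmetic in this abstract can be sketched in a few lines. The $20,121 complication cost and the 22-23 stent break-even come from the abstract itself; the per-stent cost is an assumed figure back-calculated from those two numbers, not a value the paper reports:

```python
# Break-even sketch for routine prophylactic ureteral stenting.
# complication_cost is reported in the abstract; stent_cost is an ASSUMED
# per-stent variable direct cost, chosen only to illustrate the arithmetic.
complication_cost = 20_121  # incremental 12-month hospital cost of one urinary leak (USD)
stent_cost = 895            # assumed cost of placing and removing one stent (USD)

break_even_stents = complication_cost / stent_cost
print(f"{break_even_stents:.1f} stents per complication avoided")
```

Under this assumed stent cost, roughly 22-23 prophylactic stents can be placed for the cost of managing one urinary complication, consistent with the abstract's conclusion that even modest reductions in complications make routine stenting financially advantageous.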
|
33
|
Incremental costs of post-liver transplantation complications. J Am Coll Surg 2007; 206:89-95. [PMID: 18155573 DOI: 10.1016/j.jamcollsurg.2007.06.292] [Citation(s) in RCA: 33] [Impact Index Per Article: 1.9] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/17/2007] [Revised: 06/01/2007] [Accepted: 06/13/2007] [Indexed: 12/12/2022]
Abstract
BACKGROUND Complications after liver transplantation are common and expensive. The incremental costs of adult post-liver transplantation complications, and who pays for these complications (center or payor), are unknown. STUDY DESIGN We reviewed the medical and financial records (first 90 postoperative days) of all adult liver transplant recipients at our center between July 1, 2002, and October 30, 2005 (N = 214). The association of donor, recipient, and financial data points (total costs, reimbursements, and profits) was assessed using standard univariable analyses. The incremental costs of complications were determined with multiple linear regression models to control for the costs inherent to donor and recipient characteristics. RESULTS Univariate analyses demonstrated that both total hospital costs and reimbursements were substantially increased in patients with several different complications. Multiple linear regression analysis, controlling for recipient (age, gender, race, and laboratory Model for End-Stage Liver Disease [MELD] score) and donor factors (donor risk index), showed that increased hospital costs and hospital reimbursements were independently associated with laboratory MELD (incremental costs of $3,368 and $2,787, respectively, per MELD point) and pneumonia ($83,718 and $68,214, respectively). A negative profit margin for the medical center was independently associated with peritonitis ($21,760). Commercial insurance was associated with no change in total costs compared with a public insurer, but it was associated with decreased reimbursement and profit. CONCLUSIONS The incremental costs of complications in liver transplantation are high for both the medical center and the payor, but medical center profits are not affected substantially. The payor bears the financial burden for post-liver transplantation complications.
|
34
|
Effect of intraoperative hyperglycemia during liver transplantation. J Surg Res 2007; 140:227-33. [PMID: 17509267 DOI: 10.1016/j.jss.2007.02.019] [Citation(s) in RCA: 102] [Impact Index Per Article: 6.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/08/2007] [Revised: 02/08/2007] [Accepted: 02/11/2007] [Indexed: 01/08/2023]
Abstract
BACKGROUND Intensive blood glucose management has been shown to decrease mortality and infections in intensive care patients. The effect of intraoperative strict glucose control on surgical outcomes, including liver transplantation, has not been well evaluated. MATERIALS AND METHODS A retrospective review of all adult liver recipients transplanted between January 1, 2004 and July 6, 2006 was performed. Donor and recipient demographics, intraoperative variables, and outcomes were collected. Intraoperative glucose measurements were performed by the anesthesiology team, and hyperglycemia was treated with intravenous insulin bolus or continuous infusion. Patients with strict glycemic control (mean blood glucose <150 mg/dL) were compared with those with poor control (mean blood glucose ≥150 mg/dL). RESULTS During the study period, a total of 184 patients met criteria for analysis. Recipients with strict glycemic control (n = 60) had a mean glucose of 135 mg/dL compared with 184 mg/dL in the poorly controlled group (n = 124). Other than recipient age (strict versus poor control, 47 ± 2 versus 53 ± 1 years; P < 0.01), both groups had similar donor and recipient characteristics. Although the incidence of most postoperative complications was similar, poor glycemic control was associated with a significantly increased infection rate at 30 days posttransplant (48% versus 30%; P = 0.02) and an increased 1-year mortality (21.9% versus 8.8%; P = 0.05). CONCLUSIONS Intraoperative hyperglycemia during liver transplantation was associated with an increased risk of postoperative infection and mortality. Strict intraoperative glycemic control, possibly using insulin infusions, may improve outcomes following liver transplantation.
|
35
|
Abstract
We quantified the financial implications of surgical complications following pancreas transplantation. We reviewed medical and financial records of 49 pancreas transplant recipients at the University of Michigan Health System (UMHS) between 1/6/2002 and 11/22/2004. The association of donor, transplant recipient and financial variables was assessed. The median costs to UMHS of procedures and follow-up were $92,917 for recipients without surgical complications versus $108,431 when a surgical complication occurred, a difference of $15,514 (p = 0.03). Median reimbursement by the payer was $17,363 higher in patients with a surgical complication (p = 0.001). Similar trends (higher insurer costs) were noted when stratifying by payer (public and private) and specific procedure (SPK and PAK). All parties (patient, physician, payer and medical center) should benefit from quality improvement, with payers having a financial interest in pancreas transplant surgical quality initiatives.
|
36
|
Abstract
Urinary complications are common following renal transplantation. The aim of this study was to evaluate the risk factors associated with renal transplant urinary complications. We collected data on 1698 consecutive renal transplant patients. The association of donor, transplant, and recipient characteristics with urinary complications was assessed by univariable and multivariable Cox proportional hazards models, fitted to analyze time-to-event outcomes of urinary complications and graft failure. Urinary complications were observed in 105 (6.2%) recipients, with a 2.8% ureteral stricture rate, a 1.7% rate of combined leak and stricture, and a 1.6% rate of urine leaks. Seventy percent of these complications were definitively managed with a percutaneous intervention. Independent risk factors for a urinary complication included male recipient, African American recipient, and the "U"-stitch technique. Ureteral stricture was an independent risk factor for graft loss, while urinary leak was not. The laparoscopic donor technique (compared with open living donor nephrectomy) was not associated with more urinary complications. Our data suggest that several patient characteristics are associated with an increased risk of a urinary complication. The U-stitch technique should not be used for the ureteral anastomosis.
|
37
|
Who pays for biliary complications following liver transplant? A business case for quality improvement. Am J Transplant 2006; 6:2978-82. [PMID: 17294525 DOI: 10.1111/j.1600-6143.2006.01575.x] [Citation(s) in RCA: 40] [Impact Index Per Article: 2.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 02/06/2023]
Abstract
We use biliary complications following liver transplantation to quantify the financial implications of surgical complications and make a case for surgical improvement initiatives as a sound financial investment. We reviewed the medical and financial records of all liver transplant patients at the UMHS between July 1, 2002 and June 30, 2005 (N = 256). The association of donor, transplant, recipient, and financial data points was assessed using both univariable (Student's t-test, chi-square, and logistic regression) and multivariable (logistic regression) methods. UMHS made a profit of $6822 ± $39,087 on patients without a biliary complication while taking a loss of $5742 ± $58,242 on patients with a biliary complication (p = 0.04). Reimbursement by the payer was $5562 higher in patients with a biliary complication compared to patients without a biliary complication (p = 0.001). Using multivariable logistic regression analysis, the two independent risk factors for a negative margin were private insurance (compared to public) (OR 1.88, CI 1.10-3.24, p = 0.022) and biliary leak (OR 2.09, CI 1.06-4.13, p = 0.034). These findings underscore the important impact of surgical complications on transplant finances. Medical centers have a financial interest in transplant surgical quality improvement, but payers have the most to gain with improved surgical outcomes.
|
38
|
Abstract
OBJECTIVE This study examines donation after cardiac death (DCD) practices and outcomes in liver transplantation. SUMMARY BACKGROUND DATA Livers procured from DCD donors have recently been used to increase the number of deceased donors and bridge the gap between limited organ supply and the pool of waiting list candidates. Comprehensive evaluation of this practice and its outcomes has not been previously reported. METHODS A national cohort of all DCD and donation after brain-death (DBD) liver transplants between January 1, 2000 and December 31, 2004 was identified in the Scientific Registry of Transplant Recipients. Time to graft failure (including death) was modeled by Cox regression, adjusted for relevant donor and recipient characteristics. RESULTS DCD livers were used for 472 (2%) of 24,070 transplants. Annual DCD liver activity increased from 39 in 2000 to 176 in 2004. The adjusted relative risk of DCD graft failure was 85% higher than for DBD grafts (relative risk, 1.85; 95% confidence interval, 1.51-2.26; P < 0.001), corresponding to 3-month, 1-year, and 3-year graft survival rates of 83.0%, 70.1%, and 60.5%, respectively (vs. 89.2%, 83.0%, and 75.0% for DBD recipients). There was no significant association between transplant program DCD liver transplant volume and graft outcome. CONCLUSIONS The annual number of DCD livers used for transplant has increased rapidly. However, DCD livers are associated with a significantly increased risk of graft failure unrelated to modifiable donor or recipient factors. Appropriate recipients for DCD livers have not been fully characterized and recipient informed consent should be obtained before use of these organs.
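A Cox-model hazard ratio's confidence interval is symmetric on the log scale, so the point estimate should approximately equal the geometric mean of the CI bounds. A quick arithmetic check of the reported DCD-vs-DBD figures (HR 1.85, 95% CI 1.51-2.26) — an internal-consistency sketch, not a reanalysis of the registry data:

```python
import math

# Reported figures: HR 1.85, 95% CI 1.51-2.26 (DCD vs. DBD graft failure)
hr, lo, hi = 1.85, 1.51, 2.26

# Point estimate ≈ geometric mean of the CI bounds (log-scale symmetry)
geometric_mean = math.sqrt(lo * hi)
print(round(geometric_mean, 3))  # → 1.847, matching the reported HR of 1.85

# Implied standard error of log(HR): half-width of the log-scale CI / 1.96
se_log_hr = (math.log(hi) - math.log(lo)) / (2 * 1.96)
print(round(se_log_hr, 3))  # → 0.103
```

The close agreement between the geometric mean and the reported HR indicates the published interval is a standard Wald-type CI on the log hazard scale.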
|
39
|
Early pancreas transplant outcomes with histidine-tryptophan-ketoglutarate preservation: a multicenter study. Transplantation 2006; 82:136-9. [PMID: 16861954 DOI: 10.1097/01.tp.0000225764.21343.e3] [Citation(s) in RCA: 47] [Impact Index Per Article: 2.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/19/2022]
Abstract
Little is known about the use of histidine-tryptophan-ketoglutarate (HTK) preservation solution for pancreas preservation. We compared early pancreas graft outcomes at four pancreas transplant programs within the state of Michigan in 2002 and 2003 (University of Wisconsin [UW] era) with those in 2004 (HTK era). The primary endpoint was early graft loss. The UW group (n=41) and the HTK group (n=36) had similar outcomes with respect to: technical graft loss (9.8% vs. 8.3%, P=NS), 90-day graft function (90.2% vs. 86.1%, P=NS), and rate of pancreatic leak/abscess (12.2% vs. 11.1%, P=NS). There were also no significant differences in postoperative amylase and lipase levels between the two groups. The HTK group did have significantly more acute rejection within the first 180 days (25.0% vs. 9.8%, P<0.05). HTK is a suitable substitute for UW in the preservation of pancreas allografts.
|
40
|
Abstract
The optimal use of kidneys from small pediatric deceased donors remains undetermined. Using data from the Scientific Registry of Transplant Recipients, 2886 small (< 21 kg) pediatric donors between 1993 and 2002 were identified. Donor factors predictive of kidney recovery and transplantation (1343 en bloc; 1600 single) were identified by logistic regression. Multivariable Cox regression was used to assess the risk of graft loss. The rate of kidney recovery from small pediatric donors was significantly higher with increasing age, weight and height. The odds of transplant of recovered small donor kidneys were significantly higher with increasing age, weight, height and en bloc recovery (adjusted odds ratio = 65.8 vs. single; p < 0.0001), and significantly lower with increasing creatinine. Compared to en bloc, solitary transplants had a 78% higher risk of graft loss (p < 0.0001). En bloc transplants had a similar graft survival to ideal donors (p = 0.45) while solitary transplants had an increased risk of graft loss (p < 0.0001). En bloc recovery of kidneys from small pediatric donors may result in the highest probability of transplantation. Although limited by the retrospective nature of the study, kidneys transplanted en bloc had a similar graft survival to ideal donors but may not maximize the number of successfully transplanted recipients.
|
41
|
Abstract
INTRODUCTION It is unclear how to manage high-risk hemodialysis patients who present with an indwelling catheter. The National Kidney Foundation Practice Guidelines urge prompt removal of the catheter, but the guidelines do not specifically address the problem of patients whose only option is a femoral arteriovenous (AV) graft. METHODS This study was a retrospective review of all patients who underwent femoral AV graft placement for hemodialysis access between January 1, 1996 and January 1, 2003 at the University of Michigan Health System (UMHS). Graft patency is reported according to the standards developed by the Society of Vascular Surgery and the American Association of Vascular Surgeons. RESULTS Thirty patients were identified who had undergone femoral AV graft placement. The mean follow-up was 23 months (range 1-75 months). The patients had significant medical comorbidities and multiple previous access operations (mean 3; interquartile range 1-5). The 1-year secondary graft patency rate was 41%, the 2-year rate was 26%, and the 3-year rate was 21%. Infection was the cause of final graft loss in eight patients (50% of the graft losses, 27% of the total grafts placed). Among those who died (n=14), the mean time from femoral graft placement to death was 31.2 ± 27.5 months. Patient survival was low: 81% at 1 year, 68% at 2 years, and 54% at 3 years. CONCLUSIONS These complex patients, who have exhausted their upper extremity hemodialysis options, do poorly following femoral AV graft placement. Consideration should be given to long-term catheter-based access in some of these patients.
|
42
|
Abstract
The severity of illness in transplant patients and the complexity of transplant operations result in significant postoperative morbidity and mortality. Remarkable efforts have been made by transplant physicians to study and improve organ allocation, graft and patient survival, immunosuppression, and the long-term management of post-transplant complications. Less effort has been spent studying the actual transplant operation and systems of acute transplant care. The National Surgical Quality Improvement Program (NSQIP) has provided a standardized approach to quality improvement and has demonstrated significant potential for a reduction in postoperative morbidity and mortality in other surgical disciplines. Medical centers are under increasing pressure to measure surgical quality, and the nexus of transplant surgical quality improvement should not lie in the hands of CMS or JCAHO; rather, it should be created and developed within the transplant community. The time has come for a national transplant surgical quality improvement program based on the NSQIP infrastructure. Such a proactive approach toward quality improvement from the transplant community is an excellent investment for patients, providers, and health care payers.
|
43
|
More is not always better: a case of postrenal transplant large volume diuresis, hyponatremia, and postoperative seizure. Transpl Int 2006; 19:85-6. [PMID: 16359383 DOI: 10.1111/j.1432-2277.2005.00221.x] [Citation(s) in RCA: 9] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/30/2022]
|
44
|
Gender-related differences of renal mass supply and metabolic demand after living donor kidney transplantation. Clin Transplant 2006; 20:163-70. [PMID: 16640522 DOI: 10.1111/j.1399-0012.2005.00459.x] [Citation(s) in RCA: 23] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/29/2022]
Abstract
Kidney donation from female donors to male recipients has been reported to be associated with decreased allograft survival. Whether there was a gender-related inadequacy between donor nephron supply and recipient functional demand was investigated in this study. One hundred ninety-five living donor kidney transplant recipients who had neither ischemic injury, episodes of rejection, nor any complications were included. Weights and heights of both donors and recipients were recorded to calculate body surface area, lean body weight, and body mass index. The donated kidney was weighed just after cold flush, and the recipient's serum creatinine (Scr) was measured daily post-operatively. When the recipient's Scr reached baseline, a 24-h urine was collected for calculation of proteinuria (Upr), creatinine excretion (Ucr), and creatinine clearance (Ccr). The effect of donor and recipient gender was analysed by independent-sample t-test. On average, male donors and recipients were heavier and taller than females. However, the mass of kidneys donated by men and women was not statistically different. The gender-related differences in post-transplant Scr and Ucr of recipients were associated with differences in the parameters of the recipients' metabolic demands rather than with the weight of the implanted kidney (renal mass supply) or the pre-operative renal function of donors (functional supply). Early graft function is not determined by donor gender. The effect of recipient gender on graft function depends on metabolic demands, which are higher in male recipients.
|
45
|
Abstract
PURPOSE The aim of this study was to characterize the evolution of gram-negative antibiotic resistance during a study of empiric antibiotic rotation. METHODS We showed previously that quarterly rotation of a single antibiotic class is inferior to cycling two antibiotics per quarter for empiric treatment of gram-negative rod (GNR) infections, as evidenced by an increased incidence of antibiotic-resistant GNR (rGNR) infections. Resistance patterns were examined by quantifying GNRs resistant to one or more of the following drug classes: aminoglycosides, cephalosporins, carbapenems, fluoroquinolones, or piperacillin-tazobactam. For all rGNR isolates, the mean number of antibiotic classes to which an organism was resistant was calculated per quarter, as was the number of rGNR species. RESULTS Single-antibiotic rotation (SAR) was associated with significant increases in the incidence of piperacillin-tazobactam (p < 0.0005) and cephalosporin (p = 0.003) resistance, reaching nearly 25% and 30% of rGNR isolates, respectively, most notably during the quarter of designated cephalosporin use (VI). Multi-drug resistance emerged over time; resistant classes per resistant GNR isolate ranged from 1.2 in the dual-antibiotic rotation (DAR) period to 1.9 in the SAR period (p = 0.02). Resistance was evident in an increasing number of unique GNR species: on average, 1.3 species were isolated per month in the DAR period and 3.0 per month in the SAR period (p = 0.004), but proportionally, no single GNR species became significantly more resistant across time. Compared with only 5.8% in the DAR period, 29% noncompliance was observed in the SAR period, with a six-fold increase in the use of nonscheduled empiric antibiotics due to the presence of an organism resistant to the scheduled rotation drug. CONCLUSIONS A single-antibiotic rotation is associated with increased incidence and heterogeneity of resistant GNR isolates, as well as increased multiple-drug-class resistance. The attenuation of resistance observed in the single-antibiotic rotation may reflect the effect of unintended antibiotic heterogeneity driven by increasing resistance to the antibiotic class recommended for use each quarter. This suggests that reliance on a single antibiotic class for empiric treatment of GNR infection exerts sufficient pressure within the environment to encourage the development of diversified resistance, as well as cross-resistance over antibiotic classes, thus narrowing the availability of effective antibiotic treatment.
|
46
|
Preliminary analysis of early outcomes of a prospective, randomized trial of complete steroid avoidance in liver transplantation. Transplant Proc 2005; 37:1214-6. [PMID: 15848673 DOI: 10.1016/j.transproceed.2004.12.153] [Citation(s) in RCA: 26] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/11/2022]
Abstract
Steroids are a mainstay in liver transplantation for induction and maintenance immunosuppression but are associated with significant adverse effects. While prior studies have successfully limited the use of steroids, whether complete steroid avoidance will improve outcomes remains unclear. To further evaluate the need for steroids, consenting patients who underwent liver transplantation between June 2002 and May 2004 were entered into a prospective, randomized trial to receive either standard therapy (tacrolimus, mycophenolate mofetil, steroid induction/maintenance) or complete steroid avoidance (standard therapy without steroid induction/maintenance). Clinically suspected rejection was confirmed by biopsy and treated with pulse steroid therapy. Outcomes were compared on an intention-to-treat basis. Of the 72 patients enrolled, 36 (50%) were randomized to the steroid avoidance group, with a mean follow-up of 412 ± 41 days. Donor and recipient characteristics were similar between groups. The steroid avoidance group was more likely to have significant infections (52% vs 28%, P = .03). There was a trend toward an increased rate of acute rejection (25% vs 14%, P = .23). Twelve of 36 recipients (33%) enrolled in the steroid avoidance group later received steroids. The incidence of recurrent hepatitis C was similar between groups. The 1-year patient survival (90% vs 83%, P = .44) and graft survival (90% vs 81%, P = .27) were similar between groups. These data suggest that complete steroid avoidance in liver transplantation results in acceptable patient and graft survival. However, the potential long-term benefits of steroid avoidance, including a decrease in the severity of recurrent hepatitis C, remain under investigation.
|
47
|
Metabolic demand and renal mass supply affecting the early graft function after living donor kidney transplantation. Kidney Int 2005; 67:744-9. [PMID: 15673325 DOI: 10.1111/j.1523-1755.2005.67136.x] [Citation(s) in RCA: 39] [Impact Index Per Article: 2.1] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/29/2022]
Abstract
BACKGROUND Graft mass has been demonstrated to be a determinant of outcome after kidney transplantation. An insufficient nephron mass might fail to meet the metabolic demands of the recipient and lead to hyperfiltration. METHODS The study population was restricted to live donor transplants demonstrating immediate function that had neither ischemic injury, episodes of rejection, nor any complications resulting in a functional decrease of the graft. The donated kidney was weighed just after cold flush, and the recipient's serum creatinine (Scr) was measured daily postoperatively. When the recipient's Scr reached baseline, a 24-hour urine was collected for calculation of proteinuria (Upr), creatinine excretion (Ucr), and creatinine clearance (Ccr). Body weight, height, body surface area, lean body weight, and body mass index were recorded as parameters of the metabolic demands of donor and recipient. Pearson correlation and linear regression were carried out. RESULTS Graft function, as measured by Scr, Ucr, and Upr, was not directly correlated with graft weight but rather with the ratios of graft weight to the parameters of the recipient's metabolic demands. As recipient size increased, metabolic demand increased. The parameters of the recipient's metabolic demands were directly correlated with Scr and Ucr, rather than with Upr. CONCLUSION During living donor and recipient matching, both the potential size of the donated kidney and that of the recipient should be considered in terms of early graft function after transplantation.
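The analysis above relates early graft function to ratios of graft weight over recipient body-size parameters via Pearson correlation. A minimal, self-contained Pearson r is sketched below; the numbers are invented for illustration (a positively correlated toy series) and are not study data.

```python
# Minimal Pearson correlation coefficient, of the kind used in the study.
# The data below are hypothetical illustration values, not study data.

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5

# hypothetical graft-weight / recipient-body-weight ratios (g/kg) ...
ratio = [2.1, 2.4, 2.8, 3.1, 3.5, 3.9]
# ... and hypothetical creatinine clearance (mL/min) for the same recipients
ccr = [55, 61, 70, 74, 83, 90]

print(round(pearson_r(ratio, ccr), 3))  # strongly positive, close to 1
```

A correlation of the ratio (rather than raw graft weight) with Ccr mirrors the study's finding that supply relative to demand, not absolute graft mass, tracks early function.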
|
48
|
Abstract
HYPOTHESIS Preemptive cholecystectomy in cardiac transplant patients with radiographic biliary pathology reduces the morbidity and mortality of biliary tract disease following heart transplantation compared with expectant management. DESIGN AND SETTING Institutional survey at the University of Washington, Seattle. PATIENTS Cardiac transplant recipients between January 1, 1992, and January 1, 2001. MAIN OUTCOME MEASURE Clinical course of patients who were diagnosed as having biliary tract disease following heart transplantation and were managed expectantly (observed), compared with the course of patients who underwent an operation after diagnosis. RESULTS Sixty (35.7%) of 168 cardiac transplant patients were evaluated for biliary tract pathology. Of the 71.7% (43 of 60 patients) who had an abnormal radiographic evaluation, 46.5% (20 patients) had surgery on the biliary tract while the other patients were observed. Nine of the 23 patients who were followed up expectantly had cholelithiasis, 7 had gallbladder wall thickening, 5 had sludge in the gallbladder, and 2 had biliary dilatation. These patients were followed up for a mean ± SD of 3.7 ± 1.3 years; none developed biliary tract symptoms during this period. Cholecystectomies were completed for both emergent (7) and elective (14) indications. The mean ± SD length of stay for patients who had emergent operations was 24.3 ± 11.2 days, compared with 3.2 ± 2.8 days for patients who had elective operations. Seven (33%) of the 21 patients who had an operation had a significant complication, and 1 patient died. CONCLUSIONS These data suggest that the morbidity of an elective cholecystectomy in cardiac transplant patients is significant and equivalent to the morbidity associated with emergent procedures. Expectant management of patients with radiographic evidence of biliary tract pathology discovered after transplantation was safe in this series.
|
49
|
Abstract
Retransplantation for liver allograft failure associated with hepatitis C virus (HCV) has been increasing due to nearly universal posttransplant HCV recurrence and has been associated with poor outcomes. We report on the risk factors for death after retransplantation among liver recipients with HCV. A retrospective cohort of liver transplant recipients who underwent retransplantation between January 1997 and December 2002 was identified in the Scientific Registry of Transplant Recipients database. Cox regression was used to assess the relative effect of HCV diagnosis on mortality risk after retransplantation, adjusted for multiple covariates. Of 1,718 liver retransplantations during the study period, 464 (27%) were associated with a diagnosis of HCV infection. Based on Cox regression, retransplant recipients with HCV had a 30% higher covariate-adjusted mortality risk than those without an HCV diagnosis (hazard ratio [HR], 1.30; 95% confidence interval [CI], 1.10-1.54; P = 0.002). Other covariates associated with a significant relative risk of death after retransplantation included older recipient age, presence in an intensive care unit (ICU), serum creatinine, and donor age. Additional regression analysis revealed that the increase in mortality risk associated with HCV was concentrated between 3 and 24 months postretransplantation, among patients aged 18 to 39 at retransplant, and in patients retransplanted during the years 2000 to 2002. In conclusion, HCV liver recipients account for a considerable proportion of all retransplantations performed. Surprisingly, younger age predicted higher mortality for recipients with HCV undergoing liver retransplantation. This may reflect a willingness to retransplant younger patients with an increased severity of illness, or a more virulent HCV infection in this population. Although HCV was predictive of an increased risk of death, consideration of other characteristics of HCV patients, including donor and recipient age and the need for preoperative ICU care, may identify those at significantly higher risk.
|
50
|
Abstract
Black transplant recipients have decreased graft survival and increased rejection rates compared with white recipients. Because increased rejection rates may lead to more immunosuppression in black recipients, ethnic differences may exist in the outcomes of posttransplant infectious complications. All episodes of infection between December 1996 and October 1998 on the transplant services at the University of Virginia Health Sciences Center were prospectively evaluated. Parameters recorded included self-designated ethnicity, demographics, APACHE II scores, laboratory and microbiologic data, immunosuppression, episodes of rejection, and outcome measures. Evaluation of 303 episodes of infection demonstrated an increased mortality rate for white compared with black recipients (19% vs. 3%, P = 0.0006) despite a similar severity of illness (APACHE II score). Among renal transplant recipients, episodes of infection occurring in black recipients (n = 46) were also associated with a decreased mortality rate versus whites (n = 89) (0% vs. 15%, P = 0.006) and a shorter mean length of stay (12 ± 2 vs. 25 ± 4 days, P = 0.002) despite similar severity of illness and rejection rates. For posttransplant infections in liver transplant recipients, blacks (n = 23) demonstrated a trend toward decreased mortality (9% vs. 26%, P = 0.07) but equal lengths of stay despite similar APACHE II scores, rejection rates, and age. White liver transplant recipients had an increased incidence of viral infections (15% vs. 0%, P = 0.03). All other infecting organisms were similar. The unexpected finding of a significantly decreased rate of mortality associated with posttransplant infections in black recipients remains largely unexplained but may be related to subtle differences in immune response between racial or ethnic groups.
|