1. The efficacy and safety of sodium-glucose cotransporter-2 inhibitors in solid organ transplant recipients: A scoping review. Pharmacotherapy 2024. PMID: 38773917. DOI: 10.1002/phar.2928.
Abstract
Sodium-glucose cotransporter-2 (SGLT2) inhibitors are used for the treatment of diabetes and for their cardiovascular and kidney benefits in patients with or without diabetes. Use in solid organ transplant recipients is controversial because transplant recipients were excluded from the major clinical trials assessing SGLT2 inhibitors. The goal of this review was to assess the available literature regarding the use of SGLT2 inhibitors in solid organ transplant recipients. A PubMed search was conducted for studies published in English through December 31, 2023. Studies were excluded if they were meta-analyses, review articles, commentaries, single case reports, or in vitro studies, or did not involve the use of SGLT2 inhibitors in solid organ transplant recipients with a diabetic, cardiovascular, or kidney outcome being assessed. In the final review, 20 studies were included: kidney (n = 15), heart (n = 4), and liver/lung/kidney (n = 1) transplant recipients. SGLT2 inhibitors reduced A1c with efficacy similar to that seen in the general population and were weight neutral, with possible weight-reduction effects. Cardiovascular and kidney outcomes were not adequately assessed in the available studies. Adverse effects were reported to occur at a rate similar to that in the general population. SGLT2 inhibitors were initiated ≥1 year post-transplant in most transplant recipients included in these studies. The overall safety and antihyperglycemic efficacy of SGLT2 inhibitors in kidney and heart transplant recipients are similar to those in the general population. Data assessing SGLT2 inhibitor use in solid organ transplant recipients over longer durations are needed.
2. A Comprehensive Mixed-Method Approach to Characterize the Source of Diurnal Tacrolimus Exposure Variability in Children: Systematic Review, Meta-analysis, and Application to an Existing Data Set. J Clin Pharmacol 2024; 64:334-344. PMID: 37740566. DOI: 10.1002/jcph.2352.
Abstract
Tacrolimus is widely reported to display diurnal variation in pharmacokinetic parameters with twice-daily dosing. However, the contribution of chronopharmacokinetics versus food intake is unclear, with even less evidence in the pediatric population. The objectives of this study were to summarize the existing literature by meta-analysis and evaluate the impact of food composition on 24-hour pharmacokinetics in pediatric kidney transplant recipients. For the meta-analysis, 10 studies involving 253 individuals were included. The pooled effect sizes demonstrated significant differences in area under the concentration-time curve from time 0 to 12 hours (standardized mean difference [SMD], 0.27; 95% confidence interval [CI], 0.03-0.52) and maximum concentration (SMD, 0.75; 95% CI, 0.35-1.15) between morning and evening dose administration. However, there was significant between-study heterogeneity that was explained by food exposure. The effect size for minimum concentration was not significantly different overall (SMD, -0.09; 95% CI, -0.27 to 0.09) or across the food exposure subgroups. A 2-compartment model with a lag time, linear clearance, and first-order absorption best characterized the tacrolimus pharmacokinetics in pediatric participants. As expected, adding the time of administration and food composition covariates reduced the unexplained within-subject variability for the first-order absorption rate constant, but only caloric composition significantly reduced variability for lag time. The available data suggest food intake is the major driver of diurnal variation in tacrolimus exposure, but the associated changes are not reflected by trough concentrations alone.
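The pooled effect sizes above come from standard inverse-variance meta-analysis. As a rough illustration of that arithmetic only, the sketch below uses made-up per-study SMDs and standard errors (not the review's data) and simple fixed-effect weighting, which may differ from the model the authors actually fit:

```python
import math

def pooled_smd(smds, ses):
    """Fixed-effect inverse-variance pooling of standardized mean
    differences (SMDs). Returns the pooled SMD and its 95% CI."""
    weights = [1 / se**2 for se in ses]  # inverse-variance weights
    total = sum(weights)
    pooled = sum(w * d for w, d in zip(weights, smds)) / total
    se_pooled = math.sqrt(1 / total)
    ci = (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)
    return pooled, ci

# Hypothetical per-study values, for illustration only
smds = [0.31, 0.22, 0.40]
ses = [0.15, 0.20, 0.18]
est, (lo, hi) = pooled_smd(smds, ses)
print(f"pooled SMD = {est:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```

A random-effects model (e.g., DerSimonian-Laird) would additionally estimate between-study heterogeneity, which the abstract notes was substantial and explained by food exposure.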
3. The role of a photographic atlas in reducing unanticipated healthcare utilization following circumcision. J Pediatr Urol 2023; 19:642.e1-642.e6. PMID: 37481429. DOI: 10.1016/j.jpurol.2023.06.029.
Abstract
INTRODUCTION Circumcision is a common procedure that can evoke caregiver anxiety in the postoperative period due to unfamiliarity with the healing process. To mitigate unnecessary healthcare utilization such as phone calls and unanticipated clinic or emergency department (ED) visits, photographic atlases have been developed to better prepare caregivers for the recovery process. The objective of our study was to further investigate the efficacy of a photographic atlas in decreasing postoperative healthcare utilization, using a larger sample size and extended study period compared to previous studies. MATERIALS AND METHODS In this study, we compared a prospective intervention cohort of patients undergoing circumcision at our institution who received a photographic atlas during postoperative teaching to a retrospective cohort of patients who had not received it. Our primary outcome was unanticipated healthcare utilization, defined as postoperative telephone calls and unanticipated presentations to the urology clinic or ED. RESULTS The retrospective no-atlas cohort included 105 patients, and the prospective intervention atlas cohort included 80 patients. Both groups were similar with respect to age (p = 0.47) and other demographics. There was no statistically significant difference in healthcare utilization between the no-atlas and atlas cohorts. Specifically, we identified no difference in the number of phone calls to clinic staff (12 [11.4%] vs. 11 [13.8%], p = 0.64) or unanticipated postoperative clinic or ED visits (2 [1.9%] vs. 4 [5.0%], p = 0.41). DISCUSSION The use of a photographic atlas as part of caregiver support for circumcision patients did not demonstrate a statistically significant reduction in either postoperative phone calls or clinic/ED visits. The absolute number of caregiver phone calls decreased only slightly (from 12 to 11), with a small increase in follow-up presentations (from 2 to 4). The lack of significant change may be due to the already infrequent occurrence of these events following circumcision, as demonstrated by the no-atlas cohort. Other potential advantages of the atlas, such as improved caregiver confidence and satisfaction, may have been present but were not measured in this study. CONCLUSIONS Adding to the mixed results of previous studies, these findings do not support the idea that photographic atlases decrease unanticipated healthcare utilization in children undergoing circumcision. However, utilization was found to be low. Further studies are needed to determine other potential benefits of this form of education, such as improved caregiver confidence and satisfaction.
4. Artificial Intelligence Literacy: Developing a Multi-institutional Infrastructure for AI Education. Acad Radiol 2023; 30:1472-1480. PMID: 36323613. DOI: 10.1016/j.acra.2022.10.002.
Abstract
RATIONALE AND OBJECTIVES To evaluate the effectiveness of an artificial intelligence (AI) in radiology literacy course on participants from nine radiology residency programs in the Southeast and Mid-Atlantic United States. MATERIALS AND METHODS A week-long AI in radiology course was developed for participants from the nine programs. Ten 30-minute lectures utilizing a remote learning format covered basic AI terms and methods; clinical applications of AI in radiology from four different subspecialties; and special topics lectures on the economics of AI, ethics of AI, algorithm bias, and medicolegal implications of AI in medicine. A proctored hands-on clinical AI session allowed participants to directly use an FDA-cleared AI-assisted viewer and reporting system for advanced cancer. Pre- and post-course electronic surveys were distributed to assess participants' knowledge of AI terminology and applications and interest in AI education. RESULTS There were an average of 75 participants each day of the course (range: 50-120). Nearly all participants reported a lack of sufficient exposure to AI in their radiology training (96.7%, 90/93). Mean participant score on the pre-course AI knowledge evaluation was 8.3/15, with a statistically significant increase to 10.1/15 on the post-course evaluation (p = 0.04). A majority of participants reported an interest in continued AI in radiology education in the future (78.6%, 22/28). CONCLUSION A multi-institutional AI in radiology literacy course successfully improved the AI education of participants, with the majority of participants reporting a continued interest in AI in radiology education in the future.
5. Sweet and simple as syrup: A review and guidance for use of novel antihyperglycemic agents for post-transplant diabetes mellitus and type 2 diabetes mellitus after kidney transplantation. Clin Transplant 2023; 37:e14922. PMID: 36708369. DOI: 10.1111/ctr.14922.
Abstract
Uncontrolled type 2 diabetes mellitus (T2DM) and post-transplant diabetes mellitus (PTDM) increase morbidity and mortality after kidney transplantation. Conventional strategies for diabetes management in this population include metformin, sulfonylureas, meglitinides, and insulin. Limitations with these agents, as well as promising new antihyperglycemic agents, create a need and opportunity to explore additional options for transplant diabetes pharmacotherapy. Novel agents including sodium-glucose cotransporter-2 inhibitors (SGLT2i), glucagon-like peptide-1 receptor agonists (GLP1RA), and dipeptidyl peptidase IV inhibitors (DPP4i) demonstrate great promise for T2DM management in the non-transplant population. Moreover, many of these agents possess renoprotective, cardiovascular, and/or weight loss benefits in addition to improved glucose control, while having a reduced risk of hypoglycemia compared with certain conventional agents. This comprehensive review examines available literature evaluating the use of novel antihyperglycemic agents in kidney transplant recipients (KTR) with T2DM or PTDM. Formal Grading of Recommendations Assessment, Development, and Evaluation (GRADE) recommendations are provided to guide incorporation of these agents into post-transplant care. Available literature was evaluated to address the clinical questions of which agents provide the greatest short- and long-term benefits, timing of novel antihyperglycemic therapy initiation after transplant, monitoring parameters for these antihyperglycemic agents, and concomitant antihyperglycemic agent and immunosuppression regimen management. Current experience with novel antihyperglycemic agents is primarily limited to single-center retrospective studies and case series. With ongoing use and increasing comfort, further and more robust research promises greater understanding of the role of these agents and their place in therapy for kidney transplant recipients.
6. Predictive Capacity of Population Pharmacokinetic Models for the Tacrolimus Dose Requirements of Pediatric Solid Organ Transplant Recipients. Ther Drug Monit 2023; 45:95-101. PMID: 36624576. PMCID: PMC9832243. DOI: 10.1097/ftd.0000000000001002.
Abstract
BACKGROUND Transplant recipients require individualized tacrolimus doses to maximize graft survival. Multiple pediatric tacrolimus population pharmacokinetic (popPK) models incorporating CYP3A5 genotype and other covariates have been developed. Identifying the optimal popPK model is necessary for clinical implementation in pediatric solid organ transplant. The primary objective was to compare the dose prediction capabilities of the developed models in pediatric kidney and heart transplant recipients. METHODS Pediatric kidney or heart transplant recipients treated with tacrolimus and with available CYP3A5 genotype data were identified. The initial weight-based tacrolimus dose and first therapeutic tacrolimus dose were collected retrospectively. Three published popPK models were used to predict the tacrolimus dose required to achieve a tacrolimus trough concentration of 10 ng/mL. Model dose predictions were compared with the initial and first therapeutic doses using the Friedman test. The first therapeutic dose was plotted against the model-predicted dose. RESULTS The median initial dose was approximately 2-fold lower than the first therapeutic dose for CYP3A5 expressers. The Chen et al. model provided the closest estimates to the first therapeutic dose for kidney transplant recipients; however, all 3 models tended to underpredict the observed therapeutic dose. For heart transplant recipients, the Andrews et al. model predicted doses that were higher than the initial dose but similar to the actual therapeutic dose. CONCLUSIONS Weight-based tacrolimus dosing appears to underestimate tacrolimus dose requirements. The development of a separate popPK model is necessary for heart transplant recipients. A genotype-guided strategy based on the Chen et al. model provided the best dose estimates for kidney transplant recipients and should be prospectively evaluated.
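The published popPK models are far richer than any abstract can convey, but the baseline trough-targeting arithmetic they improve upon can be sketched. The snippet below assumes dose-proportional (linear) exposure, a crude simplification: real tacrolimus dosing must account for CYP3A5 genotype, interacting drugs, and nonlinearity, so this is illustrative only and not clinical guidance. The numbers are hypothetical:

```python
def proportional_dose_adjustment(current_dose_mg, measured_trough,
                                 target_trough=10.0):
    """Naive trough-based dose adjustment assuming linear PK:
    new dose = current dose x (target trough / measured trough)."""
    return current_dose_mg * (target_trough / measured_trough)

# Hypothetical patient: 4 mg total daily dose, measured trough 6.5 ng/mL,
# targeting the 10 ng/mL trough used in the study's model comparisons
new_dose = proportional_dose_adjustment(4.0, 6.5, target_trough=10.0)
print(f"suggested total daily dose = {new_dose:.1f} mg")
```

A popPK model replaces this single proportionality constant with covariate-driven clearance estimates (weight, CYP3A5 phenotype, time post-transplant), which is why genotype-guided predictions outperform weight-based dosing in expressers.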
7. Evaluation of a Standardized Tacrolimus Therapeutic Drug Monitoring Protocol in Stable Kidney Transplant Recipients. Prog Transplant 2022; 32:212-218. PMID: 35695240. DOI: 10.1177/15269248221107043.
Abstract
INTRODUCTION Transplant nurse coordinators have assisted in accurately adjusting tacrolimus doses under a collaborative practice agreement for kidney transplant recipients in the early post-operative period. This study evaluated the efficiency of a standardized tacrolimus therapeutic drug monitoring (TDM) protocol in stable outpatient recipients. DESIGN We conducted a single-center, retrospective study of adult patients who received a kidney transplant at least 3 years earlier and were taking immediate-release tacrolimus. Before September 2019, transplant coordinators consulted transplant providers for management of all tacrolimus trough levels (Pre-Arm). Under the standardized protocol, coordinators directly responded to out-of-range tacrolimus trough levels (Post-Arm). The primary outcome was the time to intervention for out-of-range levels. Secondary outcomes included adverse events, time in therapeutic range, coefficient of variation (CV), and protocol compliance. RESULTS Of 1712 levels (from 174 patients), 259 levels (15.1%) were out-of-range. The overall time to intervention was 13.2 hours shorter (95% CI: -26.4 to -0.1 hours; P = 0.048) in the Post-Arm. There was no rejection, graft loss, or death during the study period. The time in therapeutic range was 89.3% (17.6%) vs 89% (19.4%; P = 0.816) and the CV was 19.7% (15.8%) vs 18.4% (10.7%; P = 0.358) in the Pre-Arm and Post-Arm, respectively. Within the Post-Arm, the protocol required coordinators to independently intervene on 96 out-of-range levels (65.8%), which were accurately addressed 57.5% of the time. CONCLUSION Implementation of a standardized TDM protocol improved efficiency without compromising major clinical outcomes or intrapatient variability (IPV) of tacrolimus levels for stable kidney recipients in the outpatient setting.
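Two of the secondary metrics above, CV (intrapatient variability) and the out-of-range fraction, are simple to compute from a series of trough levels. A minimal sketch with hypothetical values (the goal range of 5-8 ng/mL below is assumed for illustration, not taken from the study):

```python
from statistics import mean, stdev

def trough_cv(troughs):
    """Coefficient of variation (%) of trough levels, a common
    intrapatient variability (IPV) metric."""
    return 100 * stdev(troughs) / mean(troughs)

def fraction_out_of_range(troughs, low, high):
    """Fraction of measured troughs falling outside [low, high]."""
    out = [t for t in troughs if t < low or t > high]
    return len(out) / len(troughs)

# Hypothetical troughs (ng/mL) against an assumed 5-8 ng/mL goal range
levels = [6.2, 7.1, 5.8, 9.4, 6.6, 4.7, 7.3, 6.9]
print(f"CV = {trough_cv(levels):.1f}%")
print(f"out-of-range fraction = {fraction_out_of_range(levels, 5, 8):.2f}")
```

Protocols like the one studied here trigger a coordinator intervention whenever `fraction_out_of_range` flags a level, so a lower time-to-intervention directly reflects workflow efficiency rather than a change in the underlying variability.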
8. The use of non-transplant biologics in solid organ transplant recipients: A practical review for the frontline clinician. Clin Transplant 2022; 36:e14743. PMID: 35690919. DOI: 10.1111/ctr.14743.
Abstract
Biologics have moved to the forefront of medicine for the management of autoimmune conditions, leading to improved quality of life. Many autoimmune conditions occur in solid organ transplant (SOT) recipients and persist following transplant. However, the use of biologics in this patient population is not well studied, and questions arise related to the risk of infection and adjustments to induction and maintenance immunosuppression. Guidelines have been published highlighting management strategies for biologics around the time of elective surgical procedures, but following them is not always feasible in urgent situations, especially with deceased donor transplantation. The aim of this review is to summarize the current literature regarding the use of these agents in solid organ transplant recipients, specifically addressing induction and maintenance immunosuppression and the need for alternative infection prevention strategies, to create a practical reference for the frontline clinician faced with this complex clinical scenario.
9. Evaluating pharmacokinetic drug-drug interactions of direct oral anticoagulants in patients with renal dysfunction. Expert Opin Drug Metab Toxicol 2022; 18:189-202. PMID: 35543017. DOI: 10.1080/17425255.2022.2074397.
Abstract
INTRODUCTION Drug transporters, metabolic enzymes, and renal clearance play significant roles in the pharmacokinetics of direct oral anticoagulants (DOACs). Recommendations for DOAC drug-drug interactions (DDIs) by the product labeling are limited to selected CYP3A4 and P-glycoprotein inhibitors and lack considerations for concomitant renal dysfunction. AREAS COVERED This review focuses on: 1) current recommendations for the management of pharmacokinetic DOAC DDIs and the evidence used to support them; 2) alterations in DOAC exposure in the setting of concomitant DDIs and mild, moderate, and severe renal impairment; 3) clinical outcomes associated with this combination; and 4) expert recommendations for the management of pharmacokinetic DOAC DDIs. English-language, full-text articles on apixaban, dabigatran, rivaroxaban, and edoxaban with a publication date up to 30 September 2021 were retrieved from PubMed. EXPERT OPINION Given the lack of supporting clinical data, empiric dose adjustments based on pharmacokinetic data alone should be avoided. When a considerable increase in a DOAC exposure is anticipated, it may be advisable to use an alternative DOAC or anticoagulant from a different class. Future research on identification of DOAC therapeutic ranges and target patient populations is needed to inform clinical utility of DOAC level monitoring to guide the management of DDIs.
10. Tacrolimus time in therapeutic range and long-term outcomes in heart transplant recipients. Pharmacotherapy 2021; 42:106-111. PMID: 34882822. DOI: 10.1002/phar.2653.
Abstract
STUDY OBJECTIVE Little is known about the association between tacrolimus time in therapeutic range (TTR) within the guideline-recommended targets and heart transplant (HT) patient outcomes. This study evaluated the association of early tacrolimus TTR with rejection and other clinical outcomes during an extended follow-up after HT. DESIGN This was a single-center retrospective cohort study. SETTING The study was conducted at Michigan Medicine (1/1/2006-12/31/2017). PATIENTS HT recipients ≥18 years of age were included. MEASUREMENT The primary end point was the effect of tacrolimus TTR on time to rejection over the entire follow-up period. MAIN RESULTS A total of 137 patients were included with a median follow-up of 53 months. Based on the median TTR of 58%, the patients were divided into the low tacrolimus TTR (n = 68) and high tacrolimus TTR (n = 69) cohort. The high tacrolimus TTR was associated with a significantly lower risk of rejection compared to the low tacrolimus TTR cohort (hazard ratio [HR] 0.63, 95% confidence interval [CI] 0.41-0.98; p = 0.04). A post hoc analysis revealed associations between rejection and TTR when high and low TTR groups were created at different levels. TTR <30% was associated with a 7-fold higher risk of rejection (HR 7.56; 95% CI 1.76-37.6; p < 0.01) and TTR >75% was associated with a 77% lower risk of rejection (HR 0.23; 95% CI 0.08-0.627; p < 0.01). CONCLUSIONS Patients in the higher tacrolimus TTR cohort had a lower risk of rejection. We observed correlations between higher risk of rejection with TTR <30% and lower risk of rejection with TTR >75%. Future studies should focus on validating the optimal TTR cutoff while also exploring a cutoff to delineate high-risk patients for which early interventions to improve tacrolimus TTR may be beneficial.
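TTR is classically computed by Rosendaal linear interpolation between consecutive measurements; the abstract does not state which method this study used, so the sketch below simply illustrates the interpolation approach with hypothetical trough data and an assumed 8-12 ng/mL target range:

```python
def rosendaal_ttr(times_h, levels, low, high):
    """Percent time in therapeutic range (TTR) by Rosendaal linear
    interpolation: each pair of consecutive levels is joined by a line,
    and the fraction of elapsed time spent inside [low, high] is summed."""
    def frac_in_range(c0, c1):
        # fraction of the linear segment c0 -> c1 lying within [low, high]
        if c0 == c1:
            return 1.0 if low <= c0 <= high else 0.0
        a, b = sorted(((low - c0) / (c1 - c0), (high - c0) / (c1 - c0)))
        return max(0.0, min(1.0, b) - max(0.0, a))

    in_range = total = 0.0
    for i in range(len(times_h) - 1):
        span = times_h[i + 1] - times_h[i]
        total += span
        in_range += span * frac_in_range(levels[i], levels[i + 1])
    return 100 * in_range / total

# Hypothetical troughs (ng/mL) against an assumed 8-12 ng/mL target range
times = [0, 48, 96, 168]           # hours since first measurement
troughs = [12.0, 9.0, 14.0, 10.0]
print(f"TTR = {rosendaal_ttr(times, troughs, 8, 12):.1f}%")
```

Because interpolation weights each level by the time elapsed around it, TTR penalizes sustained excursions more than isolated out-of-range values, which is why it can discriminate outcomes (as at the <30% and >75% cutoffs here) where a simple count of in-range troughs may not.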
11. A multi-center evaluation of hepatitis B reactivation with and without antiviral prophylaxis after kidney transplantation. Transpl Infect Dis 2021; 24:e13751. PMID: 34725887. DOI: 10.1111/tid.13751.
Abstract
BACKGROUND Hepatitis B virus (HBV) reactivation in hepatitis B surface antigen (HBsAg)-negative and hepatitis B core antibody (anti-HBc)-positive kidney transplant recipients ranges from 1.4% to 9.6%. Limited evidence is available regarding routine antiviral prophylaxis and identifiable risk factors for HBV reactivation in this population. METHODS In this multi-center retrospective study, we evaluated the prevalence of HBV reactivation in HBsAg-negative, anti-HBc-positive kidney transplant recipients who did or did not receive antiviral prophylaxis. The primary outcome was the prevalence of HBV reactivation, defined as a positive HBV DNA by PCR at any viral load at or above the minimal detection level. The principal safety outcomes were 1-year graft survival, 1-year all-cause mortality, biopsy-proven acute rejection (BPAR), and antibody-mediated rejection (AMR). RESULTS One hundred sixty-one patients met inclusion criteria, comprising two groups: antiviral prophylaxis (n = 14) and no antiviral prophylaxis (n = 147). Of patients who did not receive prophylaxis, only five (3.4%) experienced HBV reactivation, whereas one (7.1%) patient in the prophylaxis group experienced reactivation over a median follow-up of 1103 days (p = 0.43). Furthermore, there were no differences with respect to all secondary outcomes. Statistical analysis demonstrated delayed graft function to be a significant factor associated with HBV reactivation. CONCLUSION These results suggest that the prevalence of HBV reactivation in HBsAg-negative, anti-HBc-positive kidney transplant recipients is low, regardless of antiviral prophylaxis. Furthermore, there were no significant graft-related outcomes among those who did experience reactivation.
12. SLCO1B3 polymorphisms and clinical outcomes in kidney transplant recipients receiving mycophenolate. Pharmacogenomics 2021; 22:1111-1120. PMID: 34612072. DOI: 10.2217/pgs-2021-0102.
Abstract
Aim: To determine the influence of SLCO1B3 polymorphisms on outcomes in kidney transplant recipients. Materials & methods: We retrospectively evaluated 181 adult kidney transplant recipients receiving mycophenolate. Outcomes included treated biopsy-proven acute rejection (tBPAR), de novo donor-specific antibody (dnDSA) formation, graft survival, patient survival, and mycophenolate-related adverse effects among SLCO1B3 genotypes. Results: The presence of SLCO1B3 variants was not associated with an increased risk of tBPAR (HR: 1.45, 95% CI: 0.76-2.74), dnDSA (HR: 0.46, 95% CI: 0.16-1.36), or the composite of tBPAR or dnDSA (HR: 1.14, 95% CI: 0.64-2.03). Graft and patient survival were reduced among variant carriers; however, inconsistent findings with the primary analysis suggest these associations were not due to genotype. Adverse effects were similar between groups. Conclusion: The presence of SLCO1B3 polymorphisms was not predictive of rejection or dnDSA in kidney transplant recipients.
13. Comparison of standard versus low-dose valganciclovir regimens for cytomegalovirus prophylaxis in high-risk liver transplant recipients. Transpl Infect Dis 2021; 23:e13713. PMID: 34428337. DOI: 10.1111/tid.13713.
Abstract
PURPOSE The purpose of this study was to compare the safety and efficacy of two valganciclovir (VGCV) institutional dosing protocols for cytomegalovirus (CMV) prophylaxis in liver transplant (LT) recipients with CMV donor-seropositive/recipient-seronegative (D+/R-) status. METHODS This was a single-center review of CMV D+/R- adult LT recipients who received VGCV 450 mg/day for 90 days (low-dose) or VGCV 900 mg/day for 180 days (standard-dose). The primary outcome was the incidence of CMV disease at 1 year. Secondary outcomes included rates of CMV syndrome, end-organ disease, breakthrough infection, and resistance. Neutropenia, early discontinuation of VGCV, granulocyte colony-stimulating factor (G-CSF) use, biopsy-proven rejection (BPAR), graft loss, and death at 1 year were analyzed. RESULTS Ninety-six CMV D+/R- LT recipients were included. Although no difference in CMV disease was observed (low-dose 26% vs. standard-dose 23%, p = 0.71), 75% of CMV infections in the low-dose group presented with end-organ disease. Ganciclovir (GCV) resistance was observed only in the low-dose group (n = 2). Significantly more patients in the standard-dose group developed neutropenia (low-dose 10% vs. standard-dose 60%, p < 0.001). In the standard-dose group, 29% required early discontinuation of VGCV (vs. 5% in the low-dose group, p < 0.001), and 20% were treated with G-CSF. Both cohorts had similar rates of BPAR, graft loss, and death at 1 year. CONCLUSIONS VGCV 900 mg/day for 180 days had higher rates of hematologic adverse effects resulting in frequent treatment interruptions. However, the occurrence of two cases of GCV-resistant CMV disease raises concerns about routinely using low-dose VGCV prophylaxis.
14. Entrustable professional activities for pharmacy students: A primer for solid organ transplant preceptors. Am J Health Syst Pharm 2021; 78:zxab320. PMID: 34350946. DOI: 10.1093/ajhp/zxab320.
Abstract
PURPOSE The role of a solid organ transplant pharmacist is multifaceted and translates to diverse experiential and elective learning experiences that can be provided to pharmacy learners. Here we provide a guide to integrating pharmacy students into patient care and other pharmacist activities in solid organ transplantation. SUMMARY Thoughtful incorporation of learners into clinical practice and clinical research creates a positive learning environment for pharmacy students that can foster the development of core skills necessary for students to become "practice-ready" and "team-ready" pharmacy graduates and can equip them with valuable skills to incorporate into the specialty practice areas and careers they pursue. To help develop these educational experiences, attention to the list of core entrustable professional activities (EPAs) established by the American Association of Colleges of Pharmacy can help create a rich environment of learning with carefully cultivated tasks. Furthermore, learners can serve as transplant pharmacist extenders to assist in overall patient care and multidisciplinary involvement on the transplant team. This article serves as a "how-to" guide for applying the EPA framework to integrating pharmacy students in patient care and other pharmacist activities in solid organ transplantation and other specialty practice areas.
CONCLUSION As pharmacy preceptors design and operationalize their teaching to incorporate EPAs, they can benefit from recommendations tailored to specialty practice areas such as solid organ transplantation. Students may start and finish these experiences at different EPA levels, but continuance of training will allow them to achieve the final EPA level across the 6 EPA domains.
15. Impact of CYP3A5 phenotype on tacrolimus time in therapeutic range and clinical outcomes in pediatric renal and heart transplant recipients. Pharmacotherapy 2021; 41:649-657. PMID: 34129685. DOI: 10.1002/phar.2601.
Abstract
STUDY OBJECTIVE This study investigated the effect of CYP3A5 phenotype on time in therapeutic range (TTR) of tacrolimus post-transplant in pediatric patients. DESIGN AND DATA SOURCE This retrospective study assessed medical records of pediatric kidney and heart recipients with available CYP3A5 genotype for tacrolimus dosing, troughs, and the clinical events (biopsy-proven acute rejection [BPAR] and de novo donor-specific antibodies [dnDSA]). MEASUREMENTS AND MAIN RESULTS The primary outcome, mean TTR in the first 90 days post-transplant, was 9.0% (95% CI: -16.1, -1.9) lower in CYP3A5 expressers (p = 0.014) when adjusting for time to therapeutic concentration and organ type. There was no difference between CYP3A5 phenotypes in time to the first clinical event using TTR during the first 90 days. When applying TTR over the first year, there was a significant difference in event-free survival (EFS) which was 50.0% for CYP3A5 expressers/TTR < 35%, 45.5% for expressers/TTR ≥ 35%, 38.1% for nonexpressers/TTR < 35%, and 72.9% for nonexpressers/TTR ≥ 35% (log-rank p = 0.03). A post hoc analysis of EFS identified CYP3A5 expressers had lower EFS compared to nonexpressers in patients with TTR ≥ 35% (p = 0.04) but no difference among patients with TTR < 35% (p = 0.6). CONCLUSIONS The relationship between TTR and CYP3A5 phenotype suggests that achieving a TTR ≥ 35% during the first year may be a modifiable factor to attenuate the risk of BPAR and dnDSA.
|
16
|
A call for transplant stewardship: The need for expanded evidence-based evaluation of induction and biologic-based cost-saving strategies in kidney transplantation and beyond. Clin Transplant 2021; 35:e14372. [PMID: 34033140 DOI: 10.1111/ctr.14372] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Received: 03/17/2021] [Revised: 05/15/2021] [Accepted: 05/19/2021] [Indexed: 12/14/2022]
Abstract
Rising expenditures threaten healthcare sustainability. While transplant programs are typically considered profitable, transplant medications are expensive and frequently targeted for cost savings. This review aims to summarize available literature supporting cost-containment strategies used in solid organ transplant. Despite widespread use of these tactics, we found the available evidence to be fairly low quality. Strategies mainly focus on induction, particularly rabbit antithymocyte globulin (rATG), given its significant cost and the lack of consensus surrounding dosing. While there is higher-quality evidence for high single-dose rATG, and dose-rounding protocols to reduce waste are likely low risk, more aggressive strategies, such as dosing rATG by CD3+ target-attainment or on ideal-body-weight, have less robust support and did not always attain similar efficacy outcomes. Extrapolation of induction dosing strategies to rejection treatment is not supported by any currently available literature. Cost-saving strategies for supportive therapies, such as IVIG and rituximab also have minimal literature support. Deferral of high-cost agents to the outpatient arena is associated with minimal risk and increases reimbursement, although may increase complexity and cost-burden for patients and infusion centers. The available evidence highlights the need for evaluation of unique patient-specific clinical scenarios and optimization of therapies, rather than simple blanket application of cost-saving initiatives in the transplant population.
|
17
|
Evaluating the Impact of CYP3A5 Genotype on Post-Transplant Healthcare Resource Utilization in Pediatric Renal and Heart Transplant Recipients Receiving Tacrolimus. Pharmacogenomics Pers Med 2021; 14:319-326. [PMID: 33746516 PMCID: PMC7967030 DOI: 10.2147/pgpm.s285444] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Received: 10/10/2020] [Accepted: 01/11/2021] [Indexed: 01/10/2023]
Abstract
Purpose CYP3A5 genotype is a significant contributor to inter-individual tacrolimus exposure and may impact the time required to achieve therapeutic concentrations and the number of tacrolimus dose adjustments in transplant patients. Increased modifications to tacrolimus therapy may indicate a higher burden on healthcare resources. The purpose of this study was to evaluate whether CYP3A5 genotype was predictive of healthcare resource utilization in pediatric renal and heart transplant recipients. Patients and Methods Patients <18 years of age with a renal or heart transplant between 6/1/2014–12/31/2018 and tacrolimus-based immunosuppression were included. Secondary use samples were obtained for CYP3A5 genotyping. Clinical data were retrospectively collected from the electronic medical record. Healthcare resource utilization measures included the number of dose changes, number of tacrolimus concentrations, length of stay, number of clinical encounters, and total charges within the first year post-transplant. Rejection and donor-specific antibody (DSA) formation within the first year were also collected. The impact of CYP3A5 genotype was evaluated via univariate analysis for the first year and multivariable analysis at 30, 90, 180, 270, and 365 days post-transplant. Results Eighty-five subjects were included: 48 renal transplant recipients and 37 heart transplant recipients. CYP3A5 genotype was not associated with any outcomes in renal transplant recipients; however, a CYP3A5 expresser phenotype predicted more dose changes, more tacrolimus concentrations, longer length of stay, and higher total charges in heart transplant recipients. CYP3A5 genotype was not associated with rejection or DSA formation. Age and induction therapy were associated with higher total charges. Conclusion CYP3A5 genotype may predict healthcare resource utilization in the first year post-transplant, although this may be mitigated by differences in tacrolimus management. Future studies should evaluate the impact of genotype-guided tacrolimus dosing strategies on healthcare resource utilization.
|
18
|
Relationship between internal accuracy and load-bearing capacity of minimally invasive lithium disilicate occlusal veneers. Int J Prosthodont 2021; 34:365-372. [PMID: 33616560 DOI: 10.11607/ijp.6735] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.3] [Indexed: 11/12/2022]
Abstract
PURPOSE To test whether internal accuracy affects the load-bearing capacity of 0.5-mm-thick occlusal veneers made out of milled or heat-pressed lithium disilicate (LS2). MATERIALS AND METHODS Extracted human molars (N = 80) were divided into four groups (n = 20 each) depending on the bonding substrate (enamel [E] or dentin [D]) and the fabrication method (milling [CAD] or heat pressing [PRE]) for the occlusal LS2 veneers: (1) E-CAD, (2) D-CAD, (3) E-PRE, or (4) D-PRE. After restoration fabrication, the abutment teeth and the corresponding restorations were scanned and superimposed in order to measure the marginal and internal accuracy. After adhesive cementation, the specimens were thermomechanically aged and thereafter loaded until fracture. The load-bearing capacities (Fmax) were measured. Fmax and the marginal and internal accuracy between the groups were compared using Kruskal-Wallis test (P < .05) and pairwise group comparisons. In addition, the relationship between Fmax and the internal accuracy was analyzed using Spearman rank correlation. RESULTS Median Fmax values (and first and third quartiles) per group were as follows: 1,495 N (Q1: 932; Q3: 2,318) for E-CAD; 1,575 N (Q1: 1,314; Q3: 1,668) for E-PRE; 1,856 N (Q1: 1,555; Q3: 2,013) for D-CAD; and 1,877 N (Q1: 1,566; Q3: 2,131) for D-PRE. No statistical difference was found between the groups (P = .0981). Overall, the internal accuracy in the areas of the cusp (P < .0007) and fossa (P < .0001) showed significant differences. While no significant differences were detected in the marginal area (P = .3518), a significant correlation with a negative linear relationship was found between the 3D internal accuracy and the Fmax values (P = .0007). CONCLUSION An increase in the internal accuracy raised the load-bearing capacity of minimally invasive LS2 occlusal veneers. In general, the restorations bonded to dentin in the occlusal regions showed a better accuracy compared to those bonded to enamel.
|
19
|
The Clinical Conundrum of Cannabis: Current Practices and Recommendations for Transplant Clinicians: An Opinion of the Immunology/Transplantation PRN of the American College of Clinical Pharmacy. Transplantation 2021; 105:291-299. [PMID: 32413017 DOI: 10.1097/tp.0000000000003309] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.3] [Indexed: 01/20/2023]
Abstract
Cannabis, or marijuana, comprises many compounds with varying effects. It has become a treatment option for chronic diseases and debilitating symptoms, and evidence suggests that it has immunomodulatory and antiinflammatory properties. Transplant centers are more frequently facing issues about cannabis, as indications and legalization expand. As of February 2020, 33 states and the District of Columbia have legalized medical cannabis, and 14 have legalized recreational cannabis. Moreover, 8 states have passed legislation prohibiting the denial of transplant listing solely based on cannabis use. Studies demonstrate the potential for significant pharmacokinetic and pharmacodynamic interactions between cannabis and immunosuppression. Additionally, safety concerns include increased risk of myocardial infarction, ischemic stroke, tachyarrhythmias, malignancy, neurocognitive deficits, psychosis, other neuropsychiatric disorders, cannabis use disorder, respiratory symptoms, and infection. A recent retrospective database study found a negative association between documented cannabis use disorder and graft survival, but little additional evidence exists evaluating this relationship. In the absence of robust clinical data, transplant centers need a clear, reasoned, and systematic approach to cannabis. The results of our national survey, unfortunately, found little consensus among institutions. As both recreational and medicinal cannabis become more ubiquitous nationwide, transplant centers will need to develop comprehensive policies to address its use.
|
20
|
The Clinical Conundrum of Cannabis: Current Practices and Recommendations for Transplant Clinicians: An Opinion of the Immunology/Transplantation PRN of the American College of Clinical Pharmacy. Transplantation 2021. [DOI: 10.1097/tp.0000000000003309] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Indexed: 04/01/2023]
|
21
|
Oropharyngeal candidiasis outcomes in renal transplant recipients receiving nystatin versus no antifungal prophylaxis. Transpl Infect Dis 2021; 23:e13559. [PMID: 33387388 DOI: 10.1111/tid.13559] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Received: 07/28/2020] [Revised: 12/15/2020] [Accepted: 12/29/2020] [Indexed: 11/27/2022]
Abstract
OBJECTIVE To compare the incidence of oropharyngeal candidiasis (OC), or thrush, in renal transplant recipients receiving nystatin versus no antifungal prophylaxis. METHODS This was a single-center, retrospective, non-inferiority study of adult renal transplant recipients (RTRs) who received nystatin for 30 days for OC prophylaxis (nystatin group) or no antifungal prophylaxis therapy (No PPX group). The primary outcome was the incidence of OC within 3 months post-transplant. Secondary outcomes included time to OC occurrence and severity of OC. The pre-specified non-inferiority margin was 10%. RESULTS The incidence of OC within 3 months post-transplant among 257 RTRs was 7.8% (10/128) in the No PPX group and 4.7% (6/129) in the nystatin group, a risk difference of 3.2% (95% CI, -2.7% to 9.1%, non-inferiority P = .04). The median time to OC was 7.5 days (IQR 6.3-34.3 days) in the nystatin group and 9.5 days (IQR 5.3-30.5 days) in the No PPX group (P = .64). Esophageal candidiasis was observed in 10% (1/10) of RTRs with OC in the No PPX group compared to 16.7% (1/6) in the nystatin group (P = 1.00). All RTRs with OC achieved symptom resolution with fluconazole and/or nystatin. Two patients in the No PPX group required readmission for decreased oral intake, and OC was diagnosed and treated during their hospital stay. CONCLUSIONS In this retrospective study of adult RTRs, the absence of antifungal prophylaxis was non-inferior to 30-day nystatin prophylaxis with respect to the incidence of OC within 3 months of transplant. OC prophylaxis may not be warranted after renal transplant.
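The non-inferiority result above can be reproduced from the reported counts (10/128 vs. 6/129) with a simple Wald risk difference and 95% confidence interval. A quick sketch, assuming the standard normal critical value 1.96 (the study's exact method is not stated in the abstract):

```python
import math

def risk_difference_ci(e1, n1, e2, n2, z=1.96):
    """Risk difference (p1 - p2) with a Wald confidence interval."""
    p1, p2 = e1 / n1, e2 / n2
    rd = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return rd, rd - z * se, rd + z * se

# OC incidence: 10/128 without prophylaxis vs. 6/129 with nystatin
rd, lo, hi = risk_difference_ci(10, 128, 6, 129)
print(f"{rd:.1%} (95% CI {lo:.1%} to {hi:.1%})")  # matches the reported 3.2% (-2.7% to 9.1%)
print(hi < 0.10)  # non-inferior: CI upper bound below the 10% margin → True
```

Because the upper CI bound (9.1%) falls below the pre-specified 10% margin, non-inferiority is declared.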
|
22
|
Use of direct-acting oral anticoagulants in solid organ transplantation: A systematic review. Pharmacotherapy 2020; 41:28-43. [PMID: 33155327 DOI: 10.1002/phar.2485] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.8] [Received: 06/12/2020] [Revised: 08/27/2020] [Accepted: 09/10/2020] [Indexed: 12/13/2022]
Abstract
The use of direct-acting oral anticoagulants (DOACs) has increased secondary to the mounting evidence for comparable efficacy and potentially superior safety to vitamin K antagonists (VKAs) in the general population. However, insufficient data regarding DOAC use in solid organ transplant (SOT) recipients and numerous pharmacokinetic and pharmacodynamic considerations limit their use in this highly selected patient population. A systematic review of recent clinical evidence on the safety and efficacy of DOACs compared to VKAs in SOT recipients was conducted. Additional considerations including transplant-specific strategies for DOAC reversal and common pharmacokinetic/pharmacodynamic concerns were also reviewed. Although current evidence is limited to single-center retrospective analyses, DOACs, especially apixaban, appear to be a safe and effective alternative to VKAs for SOT recipients with stable graft function and without drug-drug interactions. Reliable data on DOAC reversal at the time of transplant surgery are lacking, and clinicians should consider idarucizumab, andexanet alfa, and other non-specific reversal agents on an individual patient basis. There is no evidence supporting deviations from the Food and Drug Administration labeling recommendations for DOAC dosing in the setting of drug-drug interactions, obesity, and renal function, especially in patients on hemodialysis.
|
23
|
Tacrolimus intrapatient variability in solid organ transplantation: A multiorgan perspective. Pharmacotherapy 2020; 41:103-118. [PMID: 33131078 DOI: 10.1002/phar.2480] [Citation(s) in RCA: 26] [Impact Index Per Article: 6.5] [Received: 07/16/2020] [Revised: 09/21/2020] [Accepted: 09/26/2020] [Indexed: 02/06/2023]
Abstract
BACKGROUND Tacrolimus therapy in solid organ transplant (SOT) recipients is challenging due to its narrow therapeutic window and pharmacokinetic variability both between patients and within a single patient. Intrapatient variability (IPV) of tacrolimus trough concentrations has become a novel marker of interest for predicting transplant outcomes. The purpose of this review is to evaluate the association of tacrolimus IPV with graft and patient outcomes and identify interventions to improve IPV in SOT recipients. METHODS A systematic review of the literature was performed using PubMed and Embase from database inception to September 20, 2020. Studies were eligible only if they evaluated an association between tacrolimus IPV and transplant outcomes. Both pediatric and adult studies were included. Measures of variability were limited to standard deviation, coefficient of variation, and time in therapeutic range. RESULTS Forty-four studies met the inclusion criteria. Studies were published between 2008 and 2020 and were observational in nature. The majority of data were published in adult kidney transplant recipients and identified associations with rejection, de novo donor-specific antibody (dnDSA) formation, graft loss, and patient survival. Evaluation of IPV-directed interventions was limited to small preliminary studies. CONCLUSIONS High tacrolimus IPV has been associated with poor outcomes, including acute rejection, dnDSA formation, graft loss, and patient mortality, in SOT recipients. Future research should prospectively explore IPV-directed interventions to improve transplant outcomes.
|
24
|
Observations from a systematic review of pharmacist‐led research in solid organ transplantation: An opinion paper of the American College of Clinical Pharmacy Immunology/Transplantation Practice and Research Network. J Am Coll Clin Pharm 2020. [DOI: 10.1002/jac5.1294] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Indexed: 11/11/2022]
|
25
|
Increasing net immunosuppression after BK polyoma virus infection. Transpl Infect Dis 2020; 23:e13472. [PMID: 32959930 DOI: 10.1111/tid.13472] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Received: 06/18/2020] [Revised: 08/21/2020] [Accepted: 09/06/2020] [Indexed: 11/29/2022]
Abstract
BACKGROUND Reducing immunosuppression can effectively treat BK viremia (BKV) and BK nephropathy, but has been associated with increased risks for acute rejection and development of donor-specific antibodies (DSA). To date there have been no systematic evaluations of re-escalating immunosuppression in transplant patients with resolving BKV. Importantly, the safety of this approach and impact on graft survival is unclear. METHODS We performed a single-center retrospective review of kidney transplant recipients between July 2011 and June 2013 who had immunosuppression reduction after developing BKV (plasma PCR ≥ 1000 copies/ml). Changes in immunosuppression and patient outcomes were tracked until occurrence of a complication event: biopsy-proven acute rejection (BPAR), detection of de novo DSA, or recurrent BKV. Patients were grouped according to whether or not net immunosuppression was eventually increased. RESULTS Out of 88 patients with BKV, 44 (50%) had net immunosuppression increased while the other 44 did not. Duration of viremia, peak viremia, induction, and sensitization status were similar between the two groups. In a Kaplan-Meier analysis, increasing immunosuppression was associated with less BPAR (P = .001) and a trend toward less de novo DSA development (P = .06). Death-censored graft survival (P = .27) was not different between the two groups. In the net immunosuppression increase group, recurrent BKV occurred in 22.7% without any BKV-related graft losses. CONCLUSION These findings support potential benefits of increasing immunosuppression in patients with low-level or resolved BKV, but prospective trials are needed to better understand such an approach.
|
26
|
Alternatives to immediate release tacrolimus in solid organ transplant recipients: When the gold standard is in short supply. Clin Transplant 2020; 34:e13903. [DOI: 10.1111/ctr.13903] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.3] [Received: 02/14/2020] [Revised: 04/25/2020] [Accepted: 05/07/2020] [Indexed: 12/28/2022]
|
27
|
Utilization of direct-acting oral anticoagulation in solid organ transplant patients: A national survey of institutional practices. Clin Transplant 2020; 34:e13853. [PMID: 32163212 DOI: 10.1111/ctr.13853] [Citation(s) in RCA: 13] [Impact Index Per Article: 3.3] [Received: 12/02/2019] [Revised: 02/12/2020] [Accepted: 03/05/2020] [Indexed: 01/05/2023]
Abstract
The safety and efficacy of direct-acting oral anticoagulants (DOACs) and reversal strategies are not well established in the solid organ transplant population. This was a survey of pharmacists to assess DOAC and urgent reversal practices among adult transplant programs in the United States. A 27-question survey was distributed to members of transplant pharmacy organization listservs between 5/28/19 and 6/30/19. A total of 115 responses were received from kidney (43.5%), heart (20.0%), lung (18.3%), liver (13.9%), and pancreas (4.4%) transplant programs. DOAC use prior to transplant was mostly prohibited in thoracic programs (77.3%) but more permissive in kidney transplant programs (64.0%). If permitted, apixaban (57.8%) was most preferred. At transplant surgery, reversal of DOAC was performed "as needed" (20.9%) or was not routine (18.3%). DOAC use post-transplant was more permissive (94.3%). A majority of responders follow FDA recommended dosing in the setting of drug-drug interactions (51.1%). Major factors influencing DOAC prescribing decisions included renal function, drug-drug interactions, and insurance. High clinical practice variability exists regarding DOAC utilization and urgent reversal strategies in pre-, peri-, and post-transplant stages. While more research is needed to refine the clinical landscape, many institutions are using DOAC therapy under the perception that they pose a similar risk of bleeding compared to a non-transplant population.
|
28
|
Safety and efficacy of direct‐acting oral anticoagulants versus warfarin in kidney transplant recipients: a retrospective single‐center cohort study. Transpl Int 2020; 33:740-751. [DOI: 10.1111/tri.13599] [Citation(s) in RCA: 12] [Impact Index Per Article: 3.0] [Received: 08/29/2019] [Revised: 09/24/2019] [Accepted: 02/24/2020] [Indexed: 12/26/2022]
|
29
|
Renal Outcomes of Liver Transplantation Recipients Receiving Standard Immunosuppression and Early Renal Sparing Immunosuppression: A Retrospective Single Center Study. Transplant Direct 2019; 5:e480. [PMID: 31579808 PMCID: PMC6739043 DOI: 10.1097/txd.0000000000000917] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.4] [Received: 05/03/2019] [Revised: 05/24/2019] [Accepted: 06/03/2019] [Indexed: 12/26/2022]
Abstract
New-onset stage 4-5 chronic kidney disease (CKD) after liver transplantation (LT) is associated with high morbidity, mortality, and economic burden. In 2010, we instituted an early renal sparing immunosuppression (RSI) protocol for LT recipients with severe renal dysfunction (pre-LT dialysis, estimated glomerular filtration rate [eGFR] <30 mL/min/1.73 m², or post-LT acute kidney injury) consisting of 2 doses of basiliximab for induction and delaying tacrolimus to post-LT day 4-7. We examined the effect of early RSI on post-LT renal outcomes. METHODS Data on all adults who had LT between January 1, 2010, and December 12, 2014 were collected. We calculated the renal risk index (RRI) score for each LT recipient (https://rri.med.umich.edu). The primary outcome was new-onset post-LT stage 4-5 CKD. RESULTS Of 214 LT recipients, 121 (57%) received early RSI and 93 (43%) received standard immunosuppression. Cumulative incidence of new-onset stage 4-5 CKD was higher in the early RSI group compared with standard immunosuppression (P = 0.03). Female sex and RRI score were the significant risk factors for development of post-LT stage 4-5 CKD in the entire study cohort as well as in LT recipients with RRI ≥ sixth decile (high-risk group). CONCLUSIONS Delaying tacrolimus initiation combined with basiliximab induction did not have a durable effect on long-term renal outcomes in high-risk LT recipients. Further studies are needed to identify effective strategies to preserve renal function by targeting patients at high risk for CKD progression.
|
30
|
Novel educational and goal-setting tool to improve knowledge of chronic kidney disease among liver transplant recipients: A pilot study. PLoS One 2019; 14:e0219856. [PMID: 31344043 PMCID: PMC6658055 DOI: 10.1371/journal.pone.0219856] [Citation(s) in RCA: 10] [Impact Index Per Article: 2.0] [Received: 02/22/2019] [Accepted: 07/02/2019] [Indexed: 11/25/2022]
Abstract
Introduction Liver transplant (LT) recipients have limited understanding of post-transplant chronic kidney disease (CKD) despite an excellent pre-existing framework of transplant care. This pilot study examined the efficacy and feasibility of a tailored educational and goal-setting tool in improving CKD knowledge among LT recipients with early-stage CKD. Methods In this prospective cohort study, we administered the CKD educational and goal-setting tool to 81 LT recipients between 7/1/2016 and 12/31/2017. We excluded patients who had simultaneous liver-kidney transplantation, had an eGFR <30 ml/min, were non-English speaking, were on hemodialysis, or were listed for kidney transplantation. The pre- and post-education knowledge scores were compared using a paired t-test. Linear regression was used to assess the independent predictors of change in knowledge score. Results Mean age was 56.3 years, 69.1% were male, 85.2% were Caucasian, and mean eGFR was 61.2 ± 20.0 ml/min. The CKD educational and goal-setting tool improved CKD knowledge scores among LT recipients (pre: 71.8 ± 16.6%, post: 83.3 ± 10.4%; p<0.001). In an adjusted model (r2 = 0.75), those with lower pre-education knowledge scores had the most improvement in their post-education knowledge scores (β = -83.2; p<0.001). Two-thirds stated their most important self-management goal and reported motivation to follow this goal. Time spent on the CKD education was approximately 15 minutes. Conclusions A simple LT-specific patient educational and goal-setting tool effectively improved CKD knowledge. Implementation of this tailored intervention will improve CKD awareness and may promote goal-setting in the target population.
|
31
|
Impact of CYP3A5 phenotype on tacrolimus concentrations after sublingual and oral administration in lung transplant. Pharmacogenomics 2019; 20:421-432. [PMID: 30983501 DOI: 10.2217/pgs-2019-0002] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.6] [Indexed: 01/05/2023]
Abstract
Aim: This study evaluated the impact of CYP3A5 genotype and other patient characteristics on sublingual (SL) tacrolimus exposure and compared the relationship with oral administration. Patients & methods: Tacrolimus concentrations were retrospectively collected for adult lung transplant recipients, who were genotyped for CYP3A5*3, CYP3A4*22, CYP3A7*1C, and POR*28. Regression analyses were performed to determine covariates that impacted the SL and oral tacrolimus concentration/dose ratios. Results: An interaction of CYP3A5 genotype and CYP3A inhibitor increased the SL concentration/dose, while cystic fibrosis decreased the SL concentration/dose. The oral concentration/dose was independently associated with these covariates and was increased by serum creatinine and number of tacrolimus doses. Conclusion: This study suggests personalized dosing strategies for tacrolimus likely need to consider characteristics beyond CYP3A5 genotype.
|
32
|
Secular Trends in the Cost of Immunosuppressants after Solid Organ Transplantation in the United States. Clin J Am Soc Nephrol 2019; 14:421-430. [PMID: 30819667 PMCID: PMC6419280 DOI: 10.2215/cjn.10590918] [Citation(s) in RCA: 11] [Impact Index Per Article: 2.2] [Received: 09/04/2018] [Accepted: 01/16/2019] [Indexed: 12/20/2022]
Abstract
BACKGROUND AND OBJECTIVES Immunosuppressive medications are critical for maintenance of graft function in transplant recipients but can represent a substantial financial burden to patients and their insurance carriers. DESIGN, SETTING, PARTICIPANTS, & MEASUREMENTS To determine whether availability of generic immunosuppressive medications starting in 2009 may have alleviated some of that burden, we used Medicare Part D prescription drug events between 2008 and 2013 to estimate the average annualized per-patient payments made by patients and Medicare in a large national sample of kidney, liver, and heart transplant recipients. Repeated measures linear regression was used to determine changes in payments over the study period. RESULTS Medicare Part D payments for two commonly used immunosuppressive medications, tacrolimus and mycophenolic acid (including mycophenolate mofetil and mycophenolate sodium), decreased overall by 48%-67% across organs and drugs from 2008 to 2013, reflecting decreasing payments for brand and generic tacrolimus (21%-54%), and generic mycophenolate (72%-74%). Low-income subsidy payments, which are additional payments made under Medicare Part D, also decreased during the study period. Out-of-pocket payments by patients who did not receive the low-income subsidy decreased by more than those who did receive the low-income subsidy (63%-79% versus 24%-44%). CONCLUSIONS The decline in payments by Medicare Part D and by transplant recipients for tacrolimus and mycophenolate between 2008 and 2013 suggests that the introduction of generic immunosuppressants during this period has resulted in substantial cost savings to Medicare and to patients, largely reflecting the transition from brand to generic products.
|
33
|
Knowledge of Chronic Kidney Disease Among Liver Transplant Recipients. Liver Transpl 2018; 24:1288-1292. [PMID: 30080951 DOI: 10.1002/lt.25302] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.8] [Received: 06/21/2018] [Accepted: 06/21/2018] [Indexed: 02/07/2023]
|
34
|
A national survey of valganciclovir dosing strategies in pediatric organ transplant recipients. Clin Transplant 2018; 32:e13369. [DOI: 10.1111/ctr.13369] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.5] [Received: 07/24/2018] [Accepted: 07/29/2018] [Indexed: 11/29/2022]
|
35
|
The adoption of generic immunosuppressant medications in kidney, liver, and heart transplantation among recipients in Colorado or nationally with Medicare part D. Am J Transplant 2018; 18:1764-1773. [PMID: 29603899 PMCID: PMC6537862 DOI: 10.1111/ajt.14722] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.3] [Received: 08/16/2017] [Revised: 02/20/2018] [Accepted: 02/26/2018] [Indexed: 01/25/2023]
Abstract
The transplant community is divided regarding whether substitution with generic immunosuppressants is appropriate for organ transplant recipients. We estimated the rate of uptake over time of generic immunosuppressants using US Medicare Part D Prescription Drug Event (PDE) and Colorado pharmacy claims (including both Part D and non-Part D) data from 2008 to 2013. Data from 26 070 kidney, 15 548 liver, and 6685 heart recipients from Part D, and 1138 kidney and 389 liver recipients from Colorado were analyzed. The proportions of patients with PDEs or claims for generic and brand-name tacrolimus or mycophenolate mofetil were calculated over time by transplanted organ and drug. Among Part D kidney, liver, and heart beneficiaries, the proportion dispensed generic tacrolimus reached 50%-56% at 1 year after first generic approval and 78%-81% by December 2013. The proportion dispensed generic mycophenolate mofetil reached 70%-73% at 1 year after generic market entry and 88%-90% by December 2013. There was wide interstate variability in generic uptake, with faster uptake in Colorado compared with most other states. Overall, generic substitution for tacrolimus and mycophenolate mofetil for organ transplant recipients increased rapidly following first availability, and utilization of generic immunosuppressants exceeded that of brand-name products within a year of market entry.
|
36
|
Incidence of psoriasiform diseases secondary to tumour necrosis factor antagonists in patients with inflammatory bowel disease: a nationwide population-based cohort study. Aliment Pharmacol Ther 2018; 48:196-205. [PMID: 29869804 DOI: 10.1111/apt.14822] [Citation(s) in RCA: 22] [Impact Index Per Article: 3.7] [Received: 02/27/2018] [Revised: 03/23/2018] [Accepted: 05/02/2018] [Indexed: 12/11/2022]
Abstract
BACKGROUND There are increasing reports of paradoxical psoriasiform diseases secondary to anti-tumour necrosis factor (TNF) agents. AIMS To determine the risks of paradoxical psoriasiform diseases secondary to anti-TNF agents in patients with inflammatory bowel disease (IBD). METHODS A nationwide population study was performed using the Korea National Health Insurance Claim Data. A total of 50 502 patients with IBD were identified between 2007 and 2016. We compared 5428 patients who were treated with any anti-TNF agent for more than 6 months (anti-TNF group) and 10 856 matched controls who had never taken anti-TNF agents (control group). RESULTS Incidence of psoriasis was significantly higher in the anti-TNF group (36.8 per 10 000 person-years) than in the control group (14.5 per 10 000 person-years) (hazard ratio [HR] 2.357, 95% confidence interval [CI] 1.668-3.331). Palmoplantar pustulosis (HR 9.355, 95% CI 2.754-31.780) and psoriatic arthritis (HR 2.926, 95% CI 1.640-5.218) also showed higher risks in the anti-TNF group. In subgroup analyses, HRs for psoriasis by IBD subtype were 2.549 (95% CI 1.658-3.920) in Crohn's disease and 2.105 (95% CI 1.155-3.836) in ulcerative colitis. Interestingly, men and younger (10-39 years) patients had significantly higher risks of palmoplantar pustulosis (HR 19.682 [95% CI 3.867-100.169] and HR 14.318 [95% CI 2.915-70.315], respectively), whereas women and older (≥40 years) patients showed similar rates between the two groups. CONCLUSIONS The risks of psoriasiform diseases are increased by anti-TNF agents in patients with IBD. Among psoriasiform diseases, the risk of palmoplantar pustulosis showed the largest increase, particularly in male and younger patients.
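The crude rates quoted (36.8 vs. 14.5 per 10,000 person-years) follow directly from event counts and follow-up time. A minimal sketch, with hypothetical counts chosen only to reproduce those rates; note the paper's HR of 2.357 comes from a Cox model, so the crude rate ratio below is just a rough analogue:

```python
def incidence_per_10k(events, person_years):
    # Crude incidence rate per 10,000 person-years (the unit in the abstract)
    return events / person_years * 10_000

# Hypothetical event counts and follow-up, not taken from the paper.
anti_tnf = incidence_per_10k(80, 21_739)    # ~36.8 per 10,000 PY
control  = incidence_per_10k(63, 43_448)    # ~14.5 per 10,000 PY
rate_ratio = anti_tnf / control             # ~2.54, unadjusted
```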
37
Detection and classification of the breast abnormalities in digital mammograms via regional Convolutional Neural Network. Annu Int Conf IEEE Eng Med Biol Soc 2018; 2017:1230-1233. [PMID: 29060098 DOI: 10.1109/embc.2017.8037053] [Indexed: 11/07/2022]
Abstract
Automatic detection and classification of masses in mammograms remain a major challenge and play a crucial role in assisting radiologists toward an accurate diagnosis. In this paper, we propose a novel computer-aided diagnosis (CAD) system based on one of the regional deep learning techniques: an ROI-based Convolutional Neural Network (CNN) called You Only Look Once (YOLO). Our proposed YOLO-based CAD system contains four main stages: mammogram preprocessing, feature extraction utilizing multiple convolutional deep layers, mass detection with a confidence model, and finally mass classification using a fully connected neural network (FC-NN). A set of training mammograms with the information of ROI masses and their types is used to train YOLO. The trained YOLO-based CAD system detects the masses and classifies their types into benign or malignant. Our results show that the proposed YOLO-based CAD system detects the mass location with an overall accuracy of 96.33%. The system also distinguishes between benign and malignant lesions with an overall accuracy of 85.52%. Our proposed system appears feasible as a CAD system capable of detection and classification at the same time. It also handles some challenging breast cancer cases, such as masses located in the pectoral muscles or in dense regions.
38
Effective ion charge (Zeff) measurements and impurity behavior in KSTAR. Rev Sci Instrum 2018; 89:043504. [PMID: 29716340 DOI: 10.1063/1.5004217] [Indexed: 06/08/2023]
Abstract
A visible bremsstrahlung detector array diagnostic system has been developed on the Korea Superconducting Tokamak Advanced Research (KSTAR) device to view the whole minor radius in a narrow region of the continuum free of spectral lines. Interference filters coupled with photomultiplier tubes are employed to determine the effective charge Zeff from visible bremsstrahlung data during neutral beam injection in the KSTAR plasma. The Zeff profiles are typically flat for L-mode plasmas and evolve to hollow profiles during the H-mode in KSTAR. The visible bremsstrahlung emission calculated from the Zeff profiles is consistent with values of Zeff measured by a visible spectrometer in the core plasma. The electron temperature is measured by X-ray imaging crystal spectrometry, and the electron density needed for the analysis is assumed to have a parabolic profile. The line-of-sight-averaged local bremsstrahlung emissivity is determined with low uncertainty, and the radial emissivity is obtained by using the Abel inversion technique. In addition, the dependence of the effective charge Zeff on the line-averaged electron density is evaluated, and Zeff is also determined to observe the effect of boronization.
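The abstract recovers radial emissivity from line-integrated brightness via Abel inversion. A minimal discrete sketch of the idea (onion peeling over hypothetical concentric shells, not the KSTAR analysis code): each chord's brightness is a path-length-weighted sum of shell emissivities, giving an upper-triangular system solved by back-substitution.

```python
import math

def path_lengths(radii):
    # radii: outer radii of concentric shells, ascending; shell j spans
    # [edges[j], edges[j+1]]. Chord i is tangent at impact parameter edges[i].
    n = len(radii)
    edges = [0.0] + list(radii)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        b = edges[i]
        for j in range(i, n):
            outer = math.sqrt(edges[j + 1] ** 2 - b ** 2)
            inner = math.sqrt(max(edges[j] ** 2 - b ** 2, 0.0))
            L[i][j] = 2.0 * (outer - inner)   # chord length inside shell j
    return L

def onion_peel(brightness, radii):
    # Solve the upper-triangular system brightness = L @ emissivity
    # from the outermost shell inward.
    L = path_lengths(radii)
    n = len(radii)
    eps = [0.0] * n
    for i in range(n - 1, -1, -1):
        s = sum(L[i][j] * eps[j] for j in range(i + 1, n))
        eps[i] = (brightness[i] - s) / L[i][i]
    return eps
```

Forward-projecting a known emissivity profile through `path_lengths` and inverting with `onion_peel` recovers it to floating-point accuracy.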
39
Development of a semi-continuous two-stage simultaneous saccharification and fermentation process for enhanced 2,3-butanediol production by Klebsiella oxytoca. Lett Appl Microbiol 2018; 66:300-305. [PMID: 29315769 DOI: 10.1111/lam.12845] [Received: 10/17/2017] [Revised: 12/12/2017] [Accepted: 12/13/2017] [Indexed: 11/28/2022]
Abstract
Klebsiella oxytoca naturally produces a large amount of 2,3-butanediol (2,3-BD), a promising chemical with wide industrial applications, along with various by-products. Previously, we developed a metabolically engineered K. oxytoca ΔldhA ΔpflB strain to reduce the formation of by-products. To improve 2,3-BD productivity and examine the stability of the K. oxytoca ΔldhA ΔpflB strain for industrial application, a semi-continuous two-stage simultaneous saccharification and fermentation (STSSF) process was developed. The STSSF with the K. oxytoca ΔldhA ΔpflB mutant using cassava as a carbon source could produce 108 ± 3.73 g(2,3-BD) l⁻¹ with a yield of 0.45 g(2,3-BD) g(glucose)⁻¹ and a productivity of 3.00 g(2,3-BD) l⁻¹ h⁻¹. No apparent changes in the final titre, yield and productivity of 2,3-BD were observed for up to 20 cycles of STSSF. Also, microbial contamination and spontaneous mutation of the host strain with potential detrimental effects on fermentation efficiency did not occur during the whole fermentation period. These results strongly suggest that the K. oxytoca ΔldhA ΔpflB mutant is stable and that the STSSF process is commercially exploitable. SIGNIFICANCE AND IMPACT OF THE STUDY There is growing interest in the production of 2,3-butanediol (2,3-BD) from renewable resources by microbial fermentation because of its wide applications in the specialty and commodity chemical industries. Klebsiella oxytoca usually produces 2,3-BD as a major end product during the fermentation of carbohydrates. This is the first study to provide a high-efficiency simultaneous saccharification and 2,3-BD fermentation process. Also, this study demonstrates the stability of a metabolically engineered 2,3-BD-overproducing K. oxytoca strain for industrial application.
40
Comparison of the Analgesic Effect of an Ice Cube versus 4% Lidocaine Cream in Intradermal Antibiotic Skin Testing. Hong Kong J Emerg Med 2017. [DOI: 10.1177/102490791201900505] [Indexed: 11/15/2022]
Abstract
Objective This study was performed to compare the analgesic effect of an ice cube with that of 4% lidocaine cream (L.M.X.4®) for intradermal skin testing. Methods In this prospective randomised study, healthy adult volunteers were divided into ice cube and lidocaine analgesic pretreatment groups. Randomisation was performed using a randomisation table in blocks of four. Intradermal skin testing was performed after applying ice in the ice cube group and 5 mg of lidocaine cream in the lidocaine group. After the intradermal skin test, the pain intensity was investigated using the visual analog scale (VAS) on questionnaires. We calculated that a minimum of 24 subjects were required for statistical power of 80% at a significance level of 0.05 (two-sided). The groups' VAS scores were compared using the Mann-Whitney U-test. Results The study population consisted of 35 volunteers: 17 in the ice cube group and 18 in the lidocaine group. There were no differences in demographic characteristics between the two groups. The median VAS score was 20 (interquartile range: 0-35) in the ice cube group and 70 (interquartile range: 50-80) in the lidocaine group (p<0.001). Conclusions The results suggested the utility of an ice cube as analgesic pretreatment for intradermal skin testing in the emergency department.
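The stated minimum of 24 subjects for 80% power at a two-sided significance level of 0.05 can be reproduced with the usual normal-approximation sample-size formula, assuming a hypothetical standardized effect size of about 1.15 (the abstract does not report the effect size used):

```python
from math import ceil
from statistics import NormalDist

def n_per_group(effect_size, alpha=0.05, power=0.80):
    # Normal-approximation sample size per group for a two-sided
    # two-sample comparison of means; effect_size = delta / sigma.
    z = NormalDist()
    z_a = z.inv_cdf(1 - alpha / 2)   # ~1.96 for alpha = 0.05
    z_b = z.inv_cdf(power)           # ~0.84 for 80% power
    return ceil(2 * (z_a + z_b) ** 2 / effect_size ** 2)

# A hypothetical standardized effect size of ~1.15 reproduces the
# 24-subject total quoted in the abstract.
per_group = n_per_group(1.15)   # 12 per group, 24 in total
```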
41
First insights into the function of the sawshark rostrum through examination of rostral tooth microwear. J Fish Biol 2017; 91:1582-1602. [PMID: 29034467 DOI: 10.1111/jfb.13467] [Received: 11/21/2016] [Accepted: 08/17/2017] [Indexed: 06/07/2023]
Abstract
Potential roles of the rostrum of sawsharks (Pristiophoridae), including predation and self-defence, were assessed through a variety of inferential methods. Comparison of microwear on the surface of the rostral teeth of sawsharks and sawfishes (Pristidae) shows that microwear patterns are alike and suggests that the elongate rostra in these two elasmobranch families are used for a similar purpose (predation). Raman spectroscopy indicates that the rostral teeth of both sawsharks and sawfishes are composed of hydroxyapatite, but differ in their collagen content. Sawfishes possess collagen throughout their rostral teeth, whereas collagen is present only in the centre of the rostral teeth of sawsharks, which may relate to differences in ecological use. The ratio of rostrum length to total length in the common sawshark Pristiophorus cirratus was found to be similar to that of the largetooth sawfish Pristis pristis but not the knifetooth sawfish Anoxypristis cuspidata. Analysis of the stomach contents of P. cirratus indicates that the diet consists of demersal fishes and crustaceans, with shrimp from the family Pandalidae being the most important dietary component. No prey item showed evidence of wounds inflicted by the rostral teeth. In light of the similarities with sawfishes in microwear patterns, rostral tooth chemistry and diet, it is hypothesised that sawsharks use their rostrum in a similar manner for predation (sensing and capturing prey) and possibly for self-defence.
42
Non-local means filter denoising for DEXA images. Annu Int Conf IEEE Eng Med Biol Soc 2017; 2017:572-575. [PMID: 29059937 DOI: 10.1109/embc.2017.8036889] [Indexed: 06/07/2023]
Abstract
The dual high- and low-energy images of Dual Energy X-ray Absorptiometry (DEXA) suffer from noise because of the low X-ray dose used. Denoising these DEXA images could be a key step in enhancing the Bone Mineral Density (BMD) map derived from a pair of high- and low-energy images, which could further improve the accuracy of diagnosis of bone fractures and osteoporosis. In this paper, we present a denoising technique for the dual high- and low-energy DEXA images via a non-local means filter (NLMF). The noise of the dual DEXA images is modeled based on both the source and detector noises of a DEXA system. Then, the parameters of the proposed NLMF are optimized for denoising utilizing experimental data from uniform phantoms. The optimized NLMF is tested and verified with the DEXA images of the uniform phantoms and a real human spine. The quantitative evaluation shows improvement of the Signal-to-Noise Ratio (SNR) for the high and low phantom images on the order of 30.36% and 27.02%, and for the high and low real spine images on the order of 22.28% and 33.43%, respectively. Our work suggests that denoising via NLMF could be a key preprocessing step for clinical DEXA imaging.
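The reported gains are percent improvements in SNR between the noisy and denoised images. A toy sketch of that metric with made-up pixel values from a flat region (not the paper's NLMF implementation or phantom data):

```python
from statistics import mean, stdev

def snr(region):
    # SNR of a nominally uniform region: mean signal over noise std.
    return mean(region) / stdev(region)

def improvement_pct(noisy, denoised):
    # Percent SNR gain of the denoised image over the noisy one,
    # the metric behind the ~22-33% figures in the abstract.
    return (snr(denoised) - snr(noisy)) / snr(noisy) * 100

# Hypothetical pixel values from a flat phantom region.
noisy    = [98, 104, 95, 103, 100, 101, 97, 102]
denoised = [99, 101, 99, 101, 100, 100, 99, 101]
```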
43
Do laryngoscopic findings reflect the characteristics of reflux in patients with laryngopharyngeal reflux? Clin Otolaryngol 2017; 43:137-143. [PMID: 28605121 DOI: 10.1111/coa.12914] [Accepted: 06/04/2017] [Indexed: 12/13/2022]
Abstract
OBJECTIVE To analyse the association between 24-hour multichannel intraluminal impedance-pH (24-h MII-pH) parameters and each item of the reflux finding score (RFS) to determine whether the laryngoscopic findings of the RFS could reflect the characteristics of reflux in patients with laryngopharyngeal reflux (LPR). STUDY DESIGN Prospective cohort study. SETTINGS Tertiary care referral medical centre. PARTICIPANTS Patients complaining of LPR symptoms were evaluated via a 24-hour MII-pH. Among them, 99 patients whose LPR was confirmed via 24-hour MII-pH were enrolled in this study. MAIN OUTCOME MEASURES Correlations between RFS ratings and 24-hour MII-pH parameters were evaluated and compared between patients with or without each laryngoscopic finding used in the RFS. RESULTS Subglottic oedema had a statistically significant positive correlation with number of non-acid LPR and non-acid full column reflux events. Ventricular obliteration and posterior commissure hypertrophy showed a significant correlation with non-acid exposure time and total reflux exposure time. We also found a significant correlation between granuloma/granulation score and number of acid LPR events. The numbers of non-acid LPR and full column reflux events in patients with subglottic oedema were significantly higher than those without subglottic oedema. CONCLUSION Among the laryngoscopic findings used in the RFS, subglottic oedema is specific for non-acid reflux episodes, and granuloma/granulation is specific for acid reflux episodes.
44
Deformation mechanisms to ameliorate the mechanical properties of novel TRIP/TWIP Co-Cr-Mo-(Cu) ultrafine eutectic alloys. Sci Rep 2017; 7:39959. [PMID: 28067248 PMCID: PMC5220307 DOI: 10.1038/srep39959] [Received: 11/01/2016] [Accepted: 11/28/2016] [Indexed: 11/09/2022]
Abstract
In the present study, the microstructural evolution and the modulation of the mechanical properties have been investigated for a Co-Cr-Mo (CCM) ternary eutectic alloy upon addition of a small amount of copper (0.5 and 1 at.%). The microstructural observations reveal distinct eutectic structures, a broken lamellar structure and a well-aligned lamellar structure, and an increasing volume fraction of Co lamellae with increasing copper addition. This microstructural evolution leads to improved plasticity from 1% to 10% without the typical tradeoff between overall strength and compressive plasticity. Moreover, investigation of the fractured samples indicates that the CCMCu alloy exhibits higher plastic deformability and combined mechanisms for improved plastic behavior. The improved plasticity of CCMCu alloys originates from several deformation mechanisms: i) slip, ii) deformation twinning, iii) strain-induced transformation and iv) shear banding. These results reveal that the mechanical properties of eutectic alloys in the Co-Cr-Mo system can be ameliorated by micro-alloying, such as Cu addition.
45
Ten-Year Experience With Bowel Transplantation at Seoul St. Mary's Hospital. Transplant Proc 2017; 48:473-8. [PMID: 27109981 DOI: 10.1016/j.transproceed.2015.12.065] [Received: 12/08/2015] [Accepted: 12/29/2015] [Indexed: 12/12/2022]
Abstract
A retrospective review of intestinal transplantation (ITx) at Seoul St. Mary's Hospital was made by collecting clinical data over the past 10 years. Fifteen consecutive cases from 2004 were analyzed. Five children and 10 adults (6 months to 69 years of age) were included. Primary diseases in adults included 4 mesenteric vessel thromboses, 2 strangulations, and 1 each of visceral myopathy, malignant gastrointestinal stromal tumor (GIST), mesenteric lymphangiectasis, and injury. Pediatric cases involved 2 Hirschsprung disease, 2 visceral myopathy, and 1 necrotizing enterocolitis. Three of 7 stomas were closed using a serial transverse enteroplasty procedure before transplantation. The 15 ITx comprised 3 living-donor and 12 deceased-donor procedures; 14 were isolated ITx and 1 was a modified multivisceral transplantation. Daclizumab, basiliximab, alemtuzumab, or basiliximab with rabbit antithymocyte globulin (rATG) was used for induction; tacrolimus monotherapy was used as the basic maintenance immunosuppressant; and an mTOR inhibitor was used for patients with renal dysfunction. Seven cases of acute cellular rejection were treated with rATG. Three cases of antibody-mediated rejection were treated with rituximab alone or with a rituximab and bortezomib combination. There were 4 cases of early mortality within 6 months after ITx. Causes of death were declamping shock, cardiac tamponade with acute cellular rejection, dysmotility, and sepsis. Surgical complications consisted of 1 feeding jejunostomy displacement and a minor leakage at a colo-colostomy site. One-year patient and graft survival was 73.33% (Kaplan-Meier survival curve). Although the total number of ITx is small, its social impact has been remarkable in changing the related laws and reimbursement policy in Korea.
46
Abstract
Viruses are obligate intracellular parasites that have small genomes with limited coding capacity; therefore, they extensively use host intracellular machinery for their replication and infection in host cells. In recent years, it was elucidated that plants have evolved intricate defense mechanisms to prevent or limit damage from such pathogens. Plants employ two major strategies to counteract virus infections: resistance (R) gene-mediated and RNA silencing-based defenses. In this review, plant defenses and viral counter defenses are described, as are recent studies examining the cross-talk between different plant defense mechanisms.
47
Use of complement binding assays to assess the efficacy of antibody mediated rejection therapy and prediction of graft survival in kidney transplantation. Hum Immunol 2016; 78:57-63. [PMID: 27894836 DOI: 10.1016/j.humimm.2016.11.009] [Received: 04/29/2016] [Revised: 09/17/2016] [Accepted: 11/23/2016] [Indexed: 10/20/2022]
Abstract
BACKGROUND The Luminex® single antigen bead assay (SAB) is the method of choice for monitoring the treatment for antibody-mediated rejection (AMR). A ⩾50% reduction of the dominant donor-specific antibody (IgG-DSA) mean fluorescence intensity (MFI) has been associated with improved kidney allograft survival, and C1q-fixing DSA activity is associated with poor outcomes in patients with AMR. We aimed to investigate if C1q-DSA can be used as a reliable predictor of response to therapy and allograft survival in patients with biopsy-proven AMR. METHODS We tested pre- and post-treatment sera of 30 kidney transplant patients receiving plasmapheresis and low-dose IVIG for biopsy-proven AMR. IgG-DSA and C1q-DSA MFI were measured and correlated with graft loss or survival. Patients were classified as nonresponders (NR) when treatment resulted in <50% reduction in MFI of IgG-DSA and/or C1q-DSA was detectable following therapy. RESULTS Differences in the percentage of patients deemed NR depended upon the end-point criterion (73% by reduction in IgG-DSA MFI vs. 50% by persistent C1q-DSA activity). None of the seven patients with <50% reduction of IgG-DSA but non-detectable C1q-DSA-fixing activity after therapy experienced graft loss, suggesting that C1q-DSA activity may better correlate with response. Reduction of C1q-DSA activity predicted graft survival better than IgG-DSA in the univariate Cox analysis (20.1% vs. 5.9% in NR; log-rank P-value=0.0147). CONCLUSIONS A rapid reduction of DSA concentration below the threshold required for complement activation is associated with better graft survival, and C1q-DSA is a better predictor of outcomes than IgG-DSA MFI reduction.
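The nonresponder definition combines two end-point criteria: a <50% reduction of the dominant IgG-DSA MFI and/or persistent C1q-DSA activity after therapy. A small sketch of that decision rule with hypothetical MFI values:

```python
def amr_response(pre_mfi, post_mfi, post_c1q_detectable):
    # Nonresponder if the dominant IgG-DSA MFI falls by <50% after therapy
    # and/or C1q-DSA remains detectable (the two criteria compared in the study).
    reduction = (pre_mfi - post_mfi) / pre_mfi
    return "nonresponder" if reduction < 0.5 or post_c1q_detectable else "responder"

# Hypothetical pre-/post-treatment MFI values:
amr_response(12_000, 4_000, False)  # 67% drop, C1q cleared -> responder
amr_response(12_000, 7_000, False)  # only a 42% drop -> nonresponder
amr_response(12_000, 4_000, True)   # C1q persists -> nonresponder
```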
48
Multicenter evaluation of efficacy and safety of low-dose versus high-dose valganciclovir for prevention of cytomegalovirus disease in donor and recipient positive (D+/R+) renal transplant recipients. Transpl Infect Dis 2016; 18:904-912. [DOI: 10.1111/tid.12609] [Received: 03/25/2016] [Revised: 05/17/2016] [Accepted: 06/29/2016] [Indexed: 12/01/2022]
49
Abstract
Solid-organ transplantation is often the last alternative in many patients with end-stage organ disease. Although advances in immunosuppressive regimens, surgical techniques, organ preservation, and overall management of transplant recipients have improved graft and patient survival, infectious complications remain problematic. Bacterial, fungal, viral, and parasitic infections are implicated after transplantation depending on numerous factors, such as degree of immunosuppression, type of organ transplant, host factors, and period after transplantation. Proper prophylactic and treatment strategies are imperative in the face of chronic immunosuppression, nosocomial and community pathogens, emerging drug resistance, drug-drug interactions, and medication toxicities. This review summarizes the pathophysiology, incidence, prevention, and treatment strategies of common post-transplant infections.
50
Effect of metronidazole use on tacrolimus concentrations in transplant patients treated for Clostridium difficile. Transpl Infect Dis 2016; 18:714-720. [PMID: 27501504 DOI: 10.1111/tid.12588] [Received: 10/27/2015] [Revised: 05/02/2016] [Accepted: 06/05/2016] [Indexed: 11/28/2022]
Abstract
BACKGROUND Two case reports suggest that metronidazole treatment for Clostridium difficile infections (CDI) increases tacrolimus (TAC) trough levels. The primary objective of this study was to determine the clinical significance of this potential interaction in transplant patients receiving CDI treatment. Currently, no robust literature exists to estimate the magnitude of the pharmacokinetic interaction between metronidazole and TAC. METHODS In this retrospective study, the effects of CDI and metronidazole treatment on TAC levels in 52 adult solid organ transplant patients were investigated. The primary outcome was the difference in dose-normalized TAC levels between baseline and symptom resolution in patients treated with metronidazole or vancomycin. The secondary outcome was the difference in dose-normalized TAC levels at baseline and CDI diagnosis. RESULTS The average change in log-transformed dose-normalized TAC levels from baseline to symptom resolution was 0.99 for metronidazole (n = 35) and 1.04 for vancomycin (n = 17) treatment. The mean difference between the groups was 0.96 (95% confidence interval: 0.74-1.24). No significant difference was found between dose-normalized TAC levels at CDI diagnosis and baseline (P = 0.37). CONCLUSION CDI treatment with metronidazole was not associated with a >30% increase in TAC levels compared with vancomycin. Both treatment groups required TAC dose adjustments to maintain goal TAC levels, and those treated with metronidazole did not require a significantly greater dose adjustment.
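Dose-normalized trough levels divide the measured concentration by the total daily dose, and the study's 0.99 and 1.04 figures are group averages of the resolution-to-baseline ratio on the log scale. A sketch with a hypothetical patient (values not from the paper):

```python
def dose_normalized(trough_ng_ml, daily_dose_mg):
    # Dose-normalized tacrolimus trough: ng/mL per mg of total daily dose.
    return trough_ng_ml / daily_dose_mg

def fold_change(baseline_dn, resolution_dn):
    # Ratio of dose-normalized levels at symptom resolution vs. baseline;
    # values near 1.0 (like the reported 0.99 and 1.04) indicate little
    # net change in TAC exposure per mg of dose.
    return resolution_dn / baseline_dn

# Hypothetical patient: 8.0 ng/mL on 4 mg/day at baseline,
# 9.0 ng/mL on the same dose at symptom resolution.
baseline   = dose_normalized(8.0, 4.0)   # 2.0 ng/mL per mg/day
resolution = dose_normalized(9.0, 4.0)   # 2.25 ng/mL per mg/day
```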