1
Associations of demographic and behavioural factors with glycaemic control in young adults with type 1 diabetes mellitus. Intern Med J 2016; 46:332-8. [PMID: 26748888] [DOI: 10.1111/imj.12991]
Abstract
BACKGROUND Despite recognised benefits of optimal glycaemic control in patients with type 1 diabetes mellitus (T1DM), good control is still difficult to achieve, particularly for adolescents and young adults. Recognition of factors that may assist early optimisation of glycaemic control is therefore important. AIMS We explored associations of demographic, social and behavioural factors with glycosylated haemoglobin (HbA1c) levels in participants with T1DM aged 18-25 years. METHODS A cross-sectional analysis was performed on young adults attending a dedicated multidisciplinary clinic at Fremantle Hospital, Western Australia from January to August 2014. RESULTS Data from 93 participants were analysed. Mean age was 21.4 ± 2.3 years, and 39.8% of the cohort were female. Longer duration of diabetes was associated with higher HbA1c (r = 0.25, P = 0.04). Men had lower HbA1c than women (8.2 ± 1.6 vs 9.2 ± 2.0%, P = 0.01). Increased frequency of clinic attendance was associated with lower HbA1c (r = -0.27, P = 0.02). Those engaged in work or study had lower HbA1c than those who were not (8.9 ± 2.1 vs 10.5 ± 2.1%, P = 0.03). Socioeconomic disadvantage, risk-taking behaviour, insulin pump use and distance travelled to clinic were not associated with differences in HbA1c. CONCLUSION In young adults with T1DM, geographical separation, socioeconomic disadvantage and risk-taking behaviours did not influence glycaemic control. Longer duration of diabetes identifies young adults at higher risk of poor control, while attendance at a multidisciplinary clinic and engagement in work or study were associated with better glycaemic control. Additional studies are warranted to clarify the role of behavioural interventions to improve diabetes management in young adults.
4
The efficacy and safety of 200 days valganciclovir cytomegalovirus prophylaxis in high-risk kidney transplant recipients. Am J Transplant 2010; 10:1228-37. [PMID: 20353469] [DOI: 10.1111/j.1600-6143.2010.03074.x]
Abstract
Late-onset cytomegalovirus (CMV) disease is a significant problem with a standard 3-month prophylaxis regimen. This multicentre, double-blind, randomized controlled trial compared the efficacy and safety of 200 days' versus 100 days' valganciclovir prophylaxis (900 mg once daily) in 326 high-risk (D+/R-) kidney allograft recipients. Significantly fewer patients in the 200-day group versus the 100-day group developed confirmed CMV disease up to month 12 posttransplant (16.1% vs. 36.8%; p < 0.0001). Confirmed CMV viremia was also significantly lower in the 200-day group (37.4% vs. 50.9%; p = 0.015 at month 12). There was no significant difference in the rate of biopsy-proven acute rejection between the groups (11% vs. 17%, respectively, p = 0.114). Adverse events occurred at similar rates between the groups and the majority were rated mild-to-moderate in intensity and not related to study medication. In conclusion, this study demonstrates that extending valganciclovir prophylaxis (900 mg once daily) to 200 days significantly reduces the incidence of CMV disease and viremia through to 12 months compared with 100 days' prophylaxis, without significant additional safety concerns associated with longer treatment. The number needed to treat to avoid one additional patient with CMV disease up to 12 months posttransplant is approximately 5.
5
Abstract
Travel to procure deceased donor organs is associated with risk to transplant personnel. In many instances, multiple teams are present for a given operation. We studied our statewide experience to determine how much excess travel this redundancy entails, and generated alternate models for organ recovery. We reviewed our organ procurement organization's experience with deceased donor operations between 2002 and 2008. Travel was expressed as cumulative person-miles between procurement team origin and donor hospital. A model of minimal travel was created, using thoracic and abdominal teams from the closest in-state center. A second model involved transporting donors to a dedicated procurement facility. Travel distance was recalculated using these models, and the mode and cost of travel were extrapolated from current practices. In 654 thoracic and 1469 abdominal donors studied, the mean travel for thoracic teams was 1066 person-miles and for abdominal teams was 550 person-miles. The mean distance traveled by thoracic and abdominal organs was 223 miles and 142 miles, respectively. Both hypothetical models showed reductions in team travel and reliance on air transport, with favorable costs and organ transport times compared to historical data. In summary, we found significant inefficiency in current practice, which may be alleviated using new paradigms for donor procurement.
6
Should heart, lung, and liver transplant recipients receive immunosuppression induction for kidney transplantation? Clin Transplant 2009; 24:67-72. [PMID: 19222505] [DOI: 10.1111/j.1399-0012.2009.00973.x]
Abstract
As the outcomes of heart, liver, and lung transplantation continue to improve, more patients will present for subsequent renal transplantation. It remains unclear whether these patients benefit from induction immunosuppression. We retrospectively reviewed the use of induction in solid organ graft recipients who underwent renal transplantation at our center from January 1, 1995 to March 30, 2007. The induction and non-induction groups were compared by univariate and Kaplan-Meier analyses. There were 21 patients in each group, with mean follow-up of 4.5-6.0 years. Forty-seven percent of patients receiving induction had a severe post-operative infection, compared with 28.6% in the non-induction group (p = NS). The one-yr rejection rate in the induction group was 9.5% compared with 14.3% for non-induction (p = NS). One-yr graft survival was 81.0% and 95.2% in the induction and non-induction groups, respectively (p = NS). In summary, there is a trend toward lower patient and graft survival among patients undergoing induction. These trends could relate to selection bias in the decision to prescribe induction immunosuppression, but further study is needed to better define the risks and benefits of antibody-induction regimens in this population.
7
Abstract
Over the past several years we have noted a marked decrease in the profitability of our kidney transplant program. Our hypothesis is that this reduction in kidney transplant institutional profitability is related to aggressive donor and recipient practices. The study population included all adults with Medicare insurance who received a kidney transplant at our center between 1999 and 2005. Adopting the hospital perspective, we used multivariable linear regression models to determine the independent effects of donor and recipient characteristics and era effects on total reimbursements and total hospital margin. We note statistically significant decreases in medical center incremental margins in cases with expanded criteria donors (ECDs) (-$5887) and in cases of delayed graft function (DGF) (-$4937). We also note that the medical center margin is independently associated with year, changing at a rate of -$5278 per year, related to both increasing costs and decreasing Medicare reimbursements. The financial loss associated with DGF and the use of ECD kidneys may resonate with other centers, and could hinder efforts to expand kidney transplantation within the United States. The Centers for Medicare and Medicaid Services (CMS) should consider risk-adjusted reimbursement for kidney transplantation.
8
Abstract
We quantified the financial implications of surgical complications following pancreas transplantation. We reviewed medical and financial records of 49 pancreas transplant recipients at the University of Michigan Health System (UMHS) between 1/6/2002 and 11/22/2004. The associations of donor, recipient, and financial variables were assessed. The median costs to UMHS of procedures and follow-up were $92,917 for recipients without surgical complications versus $108,431 when a surgical complication occurred, a difference of $15,514 (p = 0.03). Median reimbursement by the payer was $17,363 higher in patients with a surgical complication (p = 0.001). Similar trends (higher insurer costs) were noted when stratifying by payer (public and private) and specific procedure (simultaneous pancreas-kidney [SPK] and pancreas-after-kidney [PAK] transplantation). All parties (patient, physician, payer and medical center) should benefit from quality improvement, with payers having a financial interest in pancreas transplant surgical quality initiatives.
9
Abstract
Urinary complications are common following renal transplantation. The aim of this study is to evaluate the risk factors associated with renal transplant urinary complications. We collected data on 1698 consecutive renal transplant patients. The association of donor, transplant and recipient characteristics with urinary complications was assessed by univariable and multivariable Cox proportional hazards models, fitted to analyze time-to-event outcomes of urinary complications and graft failure. Urinary complications were observed in 105 (6.2%) recipients, with a 2.8% ureteral stricture rate, a 1.7% rate of leak and stricture, and a 1.6% rate of urine leaks. Seventy percent of these complications were definitively managed with a percutaneous intervention. Independent risk factors for a urinary complication included: male recipient, African American recipient, and the "U"-stitch technique. Ureteral stricture was an independent risk factor for graft loss, while urinary leak was not. Laparoscopic donor technique (compared to open living donor nephrectomy) was not associated with more urinary complications. Our data suggest that several patient characteristics are associated with an increased risk of a urinary complication. The U-stitch technique should not be used for the ureteral anastomosis.
10
Abstract
The success of clinical transplantation as a therapy for end-stage organ failure is limited by the availability of suitable organs for transplant. This article discusses continued efforts by the transplant community to collaboratively improve the organ supply. There were 7593 deceased organ donors in 2005. This represents an all-time high and a 6% increase over 2004. Increases were noted in deceased organ donation of all types of organs; notable is the increase in lung donation, which occurred in 17% of all deceased donors. The percentage of deceased donations that occurred following cardiac death has also reached a new high at 7%. The number of living donors decreased by 2%, from 7003 in 2004 to 6895 in 2005. This article discusses the continued efforts of the Organ Donation Breakthrough Collaborative and the Organ Transplantation Breakthrough Collaborative to support organ recovery and use and to encourage the expectation that for every deceased donor, all organs will be placed and transplanted.
11
Who pays for biliary complications following liver transplant? A business case for quality improvement. Am J Transplant 2006; 6:2978-82. [PMID: 17294525] [DOI: 10.1111/j.1600-6143.2006.01575.x]
Abstract
We used biliary complications following liver transplantation to quantify the financial implications of surgical complications and make a case for surgical improvement initiatives as a sound financial investment. We reviewed the medical and financial records of all liver transplant patients at the University of Michigan Health System (UMHS) between July 1, 2002 and June 30, 2005 (N = 256). The association of donor, transplant, recipient and financial data points was assessed using both univariable (Student's t-test, chi-square and logistic regression) and multivariable (logistic regression) methods. UMHS made a profit of $6822 ± $39,087 on patients without a biliary complication while taking a loss of $5742 ± $58,242 on patients with a biliary complication (p = 0.04). Reimbursement by the payer was $5562 higher in patients with a biliary complication compared to patients without a biliary complication (p = 0.001). Using multivariable logistic regression analysis, the two independent risk factors for a negative margin included private insurance (compared to public) (OR 1.88, CI 1.10-3.24, p = 0.022) and biliary leak (OR = 2.09, CI 1.06-4.13, p = 0.034). These findings underscore the important impact of surgical complications on transplant finances. Medical centers have a financial interest in transplant surgical quality improvement, but payers have the most to gain with improved surgical outcomes.
13
Abstract
Transplant physicians and candidates have become increasingly aware that donor characteristics significantly impact liver transplantation outcomes. Although the qualitative effects of individual donor variables are understood, the quantitative risk associated with combinations of characteristics is unclear. Using national data from 1998 to 2002, we developed a quantitative donor risk index. Cox regression models identified seven donor characteristics that independently predicted significantly increased risk of graft failure. Donor age over 40 years (and particularly over 60 years), donation after cardiac death (DCD), and split/partial grafts were strongly associated with graft failure, while African-American race, shorter height, cerebrovascular accident and 'other' causes of brain death were more modestly but still significantly associated with graft failure. Grafts with an increased donor risk index have been preferentially transplanted into older candidates (>50 years of age) with moderate disease severity (nonstatus 1 with lower model for end-stage liver disease (MELD) scores) and without hepatitis C. Quantitative assessment of the risk of donor liver graft failure using a donor risk index is useful to inform the process of organ acceptance.
14
Preliminary analysis of early outcomes of a prospective, randomized trial of complete steroid avoidance in liver transplantation. Transplant Proc 2005; 37:1214-6. [PMID: 15848673] [DOI: 10.1016/j.transproceed.2004.12.153]
Abstract
Steroids are a mainstay in liver transplantation for induction and maintenance immunosuppression but are associated with significant adverse effects. While prior studies have successfully limited the use of steroids, whether complete steroid avoidance will improve outcomes remains unclear. To further evaluate the need for steroids, consenting patients who underwent liver transplantation between June 2002 and May 2004 were entered into a prospective, randomized trial to receive either standard therapy (tacrolimus, mycophenolate mofetil, steroid induction/maintenance) or complete steroid avoidance (standard therapy without steroid induction/maintenance). Clinically suspected rejection was confirmed by biopsy and treated with pulse steroid therapy. Outcomes were compared on an intention-to-treat basis. Of the 72 patients enrolled, 36 (50%) were randomized to the steroid avoidance group, with a mean follow-up of 412 ± 41 days. Donor and recipient characteristics were similar between groups. The steroid avoidance group was more likely to have significant infections (52% vs 28%, P = .03). There was a trend toward an increased rate of acute rejection (25% vs 14%, P = .23). Twelve of 36 recipients (33%) enrolled in the steroid avoidance group later received steroids. The incidence of recurrent hepatitis C was similar between groups. The 1-year patient survival (90% vs 83%, P = .44) and graft survival (90% vs 81%, P = .27) rates were similar between groups. These data suggest complete steroid avoidance in liver transplantation results in acceptable patient and graft survival. However, the potential long-term benefits of steroid avoidance, including a decrease in severity of recurrent hepatitis C, remain under investigation.
15
Abstract
BACKGROUND Venous thrombosis remains an important cause of pancreatic graft loss. Nevertheless, reports are scarce of treatment alternatives to complete graft removal. We describe a case of surgical salvage of a partial pancreatic graft thrombosis. METHODS We used descriptive retrospective analysis. RESULTS A 36-year-old patient with juvenile-onset diabetes mellitus and previous living related renal transplant received a cadaveric pancreas transplant in the right iliac fossa with enteric exocrine drainage and standard vascular anastomosis. Two days after discharge from the hospital, he presented with severe right upper quadrant pain, nausea, vomiting, fever, and leukocytosis. He was taken to the operating room for exploration. The tail of the pancreas, which was kinked under the gallbladder, was necrotic and excised. The remainder of the pancreas looked normal. The patient recovered well from surgery and was discharged home 7 days later. CONCLUSIONS Partial pancreatectomy is an acceptable surgical alternative for incomplete graft thrombosis.
16
Outcomes of pediatric living donor renal transplant after laparoscopic versus open donor nephrectomy. Transplant Proc 2002; 34:3097-8. [PMID: 12493385] [DOI: 10.1016/s0041-1345(02)03610-2]
18
Abstract
Results of liver transplantation (LT) for hepatitis B have improved significantly with the use of hepatitis B immune globulin (HBIG) and/or lamivudine. The aim of this study is to review the long-term outcome of patients who underwent LT for hepatitis B. Records of 41 patients who underwent LT for hepatitis B and survived 3 months or longer post-LT were reviewed. Twenty patients were administered no immunoprophylaxis or short-term intramuscular HBIG, whereas 21 patients were administered high-dose intravenous (IV) HBIG. Median post-LT follow-up in these 2 groups was 76 months (range, 4 to 155 months) and 25 months (range, 4 to 68 months), respectively. Hepatitis B recurred in 15 (75%) and 4 patients (19%) who underwent LT in the pre-HBIG and post-HBIG eras, respectively. Cumulative rates of recurrent hepatitis B at 1 and 3 years post-LT were 66% and 77%, respectively, in the pre-HBIG era and 20% and 20% in the post-HBIG era (P <.001). Recurrent hepatitis B in the post-HBIG era correlated with antibody to hepatitis B surface antigen titer less than 100 IU/L. Nine patients with recurrent hepatitis B were administered lamivudine for 13 to 49 months (median, 28 months); 6 patients continued to have stable or improved liver disease, whereas 3 patients developed virological breakthrough with slow deterioration of liver disease. Long-term IV HBIG is effective in preventing recurrent hepatitis B. The risk for recurrent hepatitis B is negligible after the first year post-LT. Among patients with no virological breakthrough, lamivudine can stabilize or improve liver disease for up to 4 years in patients with recurrent hepatitis B post-LT.
19
Abstract
BACKGROUND Laparoscopic live donor nephrectomy for renal transplantation is being performed in increasing numbers with the goals of broadening organ supply while minimizing pain and duration of convalescence for donors. Relative advantages in terms of recovery provided by laparoscopy over standard open surgery have not been rigorously assessed. We hypothesized that laparoscopic as compared with open surgical live donor nephrectomy provides briefer, less intense, and more complete convalescence. METHODS Of 105 volunteer, adult, potential living renal donors interested in the laparoscopic approach, 70 were randomly assigned to undergo either hand-assisted laparoscopic or open surgical live donor nephrectomy at a single referral center. Objective data and subjective recovery information obtained with telephone interviews and validated questionnaires administered 2 weeks, 6 weeks, and 6-12 months postoperatively were compared between the 23 laparoscopic and 27 open surgical patients. RESULTS There was 47% less analgesic use (P=0.004), 35% shorter hospital stay (P=0.0001), 33% more rapid return to nonstrenuous activity (P=0.006), 23% sooner return to work (P=0.037), and 73% less pain 6 weeks postoperatively (P=0.004) in the laparoscopy group. Laparoscopic patients experienced complete recovery sooner (P=0.032) and had fewer long-term residual effects (P=0.0015). CONCLUSIONS Laparoscopic donor nephrectomy is associated with a briefer, less intense, and more complete convalescence compared with the open surgical approach.
20
Renal transplantation at the University of Michigan 1964 to 1999. Clinical Transplants 2001:139-48. [PMID: 11038632]
Abstract
The Michigan Kidney Transplant Program has existed for 35 years. Outcomes have improved dramatically as the one-year survival of cadaver kidney grafts increased from 25% to 85-90%. Patient deaths in the first year are now uncommon. Indications for renal transplantation have been extended to infants, the elderly, diabetics and to patients with other significant health problems who would not have been candidates in the past. Chronic administration of large doses of corticosteroids is no longer necessary and the associated morbidity is largely avoided. Improvements in immunosuppression, especially the introduction of cyclosporine, account for much of this progress. With success has come increasing demand. Unfortunately, the gap between the number of available donor kidneys and the number of patients listed for a cadaver transplant continues to increase rather than diminish. Greater acceptance of volunteer donation, as has occurred in our own program, will help to reduce this shortage. If the past forecasts the future, we can anticipate extraordinary advances during the next 35 years.
24
Hand-assisted laparoscopic donor nephrectomy: comparable donor/recipient outcomes, costs, and decreased convalescence as compared to open donor nephrectomy. Transplant Proc 2001; 33:1106-7. [PMID: 11267211] [DOI: 10.1016/s0041-1345(00)02804-9]
25
Differences in etiology for graft loss in female renal transplant recipients. Transplant Proc 2001; 33:1288-90. [PMID: 11267295] [DOI: 10.1016/s0041-1345(00)02481-7]
26
Abstract
BACKGROUND Acute rejection (AR) remains a major risk factor for the development of chronic renal allograft failure (CAF), which is a major cause of late graft loss. With the introduction of several newer immunosuppressive agents (e.g., mycophenolate mofetil, tacrolimus and Neoral) acute rejection rates have been steadily decreasing. However, the incidence of CAF has not decreased as dramatically as the incidence of acute rejection. One possible explanation is that the impact of AR on CAF is changing. The goal of this study was to analyze the relative impact of AR era on the development of CAF. METHODS We evaluated 63,045 primary renal transplant recipients reported to the USRDS from 1988 to 1997. CAF was defined as graft loss after 6 months posttransplantation, censored for death, acute rejection, thrombosis, infection, surgical complications, or recurrent disease. A Cox proportional hazards model correcting for 15 possible confounding factors evaluated the relative impact of AR on CAF. The era effect (years 1988-1989, 1990-1991, 1992-1993, 1994-1995 and 1996-1997) was evaluated by an era versus AR interaction term. RESULTS An AR episode within the first 6 months after transplantation was the most important risk factor for subsequent CAF (RR=2.4, CI 2.3-2.5). Compared with the reference group (1988-1989 with no rejection), having an AR episode in 1988-1989, 1990-1991, 1992-1993, 1994-1995, and 1996-1997 conferred a 1.67, 2.35, 3.4, 4.98 and 5.2-fold relative risk for the subsequent development of CAF (P<0.001). CONCLUSIONS Independently of known confounding variables, the impact of AR on CAF has significantly increased from 1988 to 1997. This effect may in part explain the relative lack of improvement in long-term renal allograft survival, despite a decline in AR rates.
27
Impact of pre-existing donor hypertension and diabetes mellitus on cadaveric renal transplant outcomes. Am J Kidney Dis 2000; 36:153-9. [PMID: 10873885] [DOI: 10.1053/ajkd.2000.8288]
Abstract
Hypertension (HTN) and diabetes mellitus (DM) predispose to systemic atherosclerosis with renal involvement. The prevalence of HTN and DM in cadaveric renal donors (affected donors) and the results of transplantation are unknown. We investigated these issues with national data from the US Renal Data System. A total of 4,035 transplants from affected donors were matched 1:1 with unaffected controls according to donor age and race, recipient race, and year of transplantation. Graft and patient survival were estimated. Among the 25,039 solitary renal transplantations performed between July 1, 1994, and June 30, 1997, cadaveric renal transplants from donors with HTN accounted for 15%, and donors with DM, 2%. Programs with 1-year cadaveric renal graft survival rates greater than 90% had 50% fewer affected donors compared with programs having 1-year cadaveric renal graft survival rates of 85% or less. Compared with donor-age-matched controls, transplants from affected donors were at minimally increased risk for primary nonfunction, delayed graft function, and acute rejection. Three-year graft survival rates were 71% in affected donor organs and 75% in controls (P = 0.001). Compared with controls, duration of HTN was an independent risk factor for graft survival (3-year graft survival rates, 75% versus 65%; relative risk = 1.36 for HTN >10 years; P < 0.001). A substantial fraction of cadaveric renal donors have preexisting HTN. Programs transplanting fewer affected donor kidneys had better than average results. Because the negative impact of donor HTN and DM on transplant outcome was of moderate degree except when the duration of donor HTN was greater than 10 years, use of affected donors should not be discouraged, but graft and patient survival analyses should account for their presence.
28
Abstract
BACKGROUND Antigen specific allograft tolerance is induced in mice by anti-CD2 plus anti-CD3epsilon monoclonal antibody (mAb) treatment. Because anti-CD2 mAb inhibits several aspects of anti-CD3epsilon driven T cell activation, we investigated what components of T cell activation are required or may be dispensed with for tolerance induction. Anti-CD3epsilon-mediated T cell activation depends on FcgammaR interactions. METHODS To assess the role of FcgammaR-mediated T cell activation in tolerance induction, FcgammaR binding IgG or non-binding IgG3 anti-CD3epsilon mAbs were examined. RESULTS These mAbs, administered in conjunction with anti-CD2, were equally effective in inducing tolerance. Moreover, in vivo administration of a blocking mAb directed against the FcgammaR, or the use of allograft recipients deficient in FcgammaR, had no effect on tolerance induction. Blocking IL-2 using mAb directed against IL-2 or IL-2R also did not prevent the induction of tolerance. These results suggest that complete T cell activation was not required for tolerance induction. However, substitution of a partially activating mAb, directed against the T cell receptor (TCR) beta subunit, for anti-CD3epsilon failed to synergize with anti-CD2 mAb to induce tolerance. The anti-TCRbeta mAb and anti-CD3epsilon mAb were found to differentially down modulate expression of TCR/CD3 complex subunits. In particular, anti-CD3epsilon caused transient down modulation of the TCRbeta receptor subunit and the TCRzeta signaling module, and this pattern was enhanced and prolonged by anti-CD2. Anti-TCRbeta caused persistent TCRzeta modulation but no TCRbeta modulation, and anti-CD2 did not influence this pattern. CONCLUSIONS These results suggest that, although full T cell activation is not required for the induction of tolerance by anti-CD2 plus anti-CD3epsilon mAb, a signal transduction pathway that is associated with TCRbeta and TCRzeta expression, and, specifically, is perturbed by mAb binding of the CD3epsilon epitope, is critical.
29
Renal transplantation in children with severe lower urinary tract dysfunction. J Urol 1999; 161:240-5. [PMID: 10037414]
Abstract
PURPOSE Renal transplantation in children with end stage renal disease due to congenital urological malformations has traditionally been associated with a poor outcome compared to transplantation in those with a normal urinary tract. In addition, the optimal urological treatment for such children remains unclear. To address these issues, we retrospectively reviewed our experience with renal transplantation in this population. MATERIALS AND METHODS Between 1986 and 1998, 12 boys and 6 girls with a mean age of 8.4 years and a severely dysfunctional lower urinary tract underwent a total of 15 living related and 6 cadaveric renal transplantations. Urological anomalies included posterior urethral valves in 8 cases, urogenital sinus anomalies in 4, the prune-belly syndrome in 2, and complete bladder duplication, ureterocele, lipomeningocele and the VATER syndrome in 1 each. In 11 children (61%) bladder augmentation or continent urinary diversion was performed, 2 (11%) have an intestinal conduit and 5 (28%) have a transplant into the native bladder. RESULTS In this group patient and overall allograft survival was 100% and 81%, respectively. These values were the same as those in all children who underwent renal transplantation at our center during this era. In the 17 children with a functioning transplant mean serum creatinine was 1.4 mg/dl. Technical complications occurred in 4 patients (22%), including transplant ureteral obstruction in 2 as well as intestinal conduit stomal stenosis and Mitrofanoff stomal incontinence. CONCLUSIONS Renal transplantation may be successfully performed in children with end stage renal disease due to severe lower urinary tract dysfunction. Bladder reconstruction, which may be required in the majority of these cases, appears to be safe when performed before or after the transplant. A multidisciplinary team approach to surgery is advantageous.
|
30
|
Incidence and management of biliary complications after 291 liver transplants following the introduction of transcystic stenting. Transplantation 1998; 66:1201-7. [PMID: 9825818 DOI: 10.1097/00007890-199811150-00015] [Citation(s) in RCA: 71] [Impact Index Per Article: 2.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 02/07/2023]
Abstract
BACKGROUND Biliary complications occur frequently after liver transplantation, and many are historically related to T tubes. Stents placed through the donor cystic duct have been used to attempt to reduce tube-related complications yet maintain access to the biliary tree. METHODS The outcomes of all liver transplant procedures performed at the University of Michigan between December 7, 1990 (when transcystic stenting was first used), and April 6, 1995, were analyzed retrospectively. Preoperative, perioperative, and postoperative variables were studied in relationship to biliary complications. The management of complications was also reviewed. RESULTS A total of 291 transplants qualified for study. The overall biliary complication rate was 25%, with no difference between the 237 patients who received transcystic stents, the 28 who received T tubes, and the 26 who received no tube. Among the complications patients experienced, 65% had stricture(s), 44% had stone or sludge formation, and 40% had a leak. Complications attributable solely to transcystic stents occurred in 4% of cases. Advanced age was the only preoperative variable associated with complications. Primary sclerosing cholangitis was associated with intrahepatic strictures, and prolonged cold ischemia time and rejection were associated with stone or sludge formation. Nonoperative management had the highest success rate for anastomotic stricture (76%) and the lowest for intrahepatic strictures (65%). Only one death was directly attributable to a biliary complication. CONCLUSION Transcystic stenting reduces the incidence of significant tube-related complications, but not the frequency of other biliary complications. Biliary complications can usually be managed percutaneously or endoscopically, although intrahepatic strictures and large, early leaks frequently require reoperation. Aggressive, early management of these complications can reduce excess mortality to less than 2%.
|
31
|
One center's experience with liver transplantation: alcohol use relapse over the long-term. LIVER TRANSPLANTATION AND SURGERY : OFFICIAL PUBLICATION OF THE AMERICAN ASSOCIATION FOR THE STUDY OF LIVER DISEASES AND THE INTERNATIONAL LIVER TRANSPLANTATION SOCIETY 1998; 4:S58-64. [PMID: 9742494] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [MESH Headings] [Subscribe] [Scholar Register] [Indexed: 02/08/2023]
|
32
|
Tolerance induction by anti-CD2 plus anti-CD3 monoclonal antibodies: evidence for an IL-4 requirement. JOURNAL OF IMMUNOLOGY (BALTIMORE, MD. : 1950) 1998; 161:1156-62. [PMID: 9686574] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Subscribe] [Scholar Register] [Indexed: 02/08/2023]
Abstract
Anti-CD2 mAb plus anti-CD3 mAb induce alloantigen specific tolerance. We sought to determine whether Th2 cytokines are involved in the induction of tolerance in this model. Addition of anti-IL-4 mAb or anti-IL-10 mAb to anti-CD2 plus anti-CD3 treatment abrogated tolerance and resulted in graft survivals of 26+/-4 and 25+/-5 days, respectively. Splenocytes from the anti-IL-4 mAb and anti-IL-10 groups had greater proliferation in response to alloantigen than either tolerant or naive groups. Cytokine analysis of MLR supernatants showed increased IL-10 in the tolerant group and increased IFN-gamma in the anti-IL-4 mAb treated group. Donor-specific alloantibody responses in untreated immune animals had a predominantly Th1 (IgG2a) alloantibody response, while the tolerogenic regimen reduced the ratio of IgG2a:IgG1 titers. The addition of anti-IL-4 mAb to the tolerogenic regimen partly restored the Th1-related IgG2a response. Tolerance did not develop in IL-4 knockout animals treated with anti-CD2 plus anti-CD3 (mean graft survival, 27+/-5 days). Restoration of IL-4 to IL-4 knockout animals by gene transfer with plasmid DNA resulted in prolongation of survival to 46+/-7 days, while adoptive transfer of wild-type splenocytes into IL-4 knockout recipients resulted in indefinite graft survival (>60 days) and indefinite survival of second donor-type grafts. IL-10 gene transfer to IL-4 knockout recipients did not prolong graft survival (28+/-4 days). These results demonstrate that tolerance in this model is mediated at least in part by Th2-type cells that secrete IL-4, promote IL-10 and IgG1 production, and inhibit alloantigen reactivity.
|
33
|
Monitoring for alcohol use relapse after liver transplantation for alcoholic liver disease. LIVER TRANSPLANTATION AND SURGERY : OFFICIAL PUBLICATION OF THE AMERICAN ASSOCIATION FOR THE STUDY OF LIVER DISEASES AND THE INTERNATIONAL LIVER TRANSPLANTATION SOCIETY 1997; 3:300-3. [PMID: 9346755 DOI: 10.1002/lt.500030317] [Citation(s) in RCA: 10] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 02/05/2023]
|
34
|
Effectiveness of right heart catheterization: time for a randomized trial. JAMA 1997; 277:109; author reply 113-4. [PMID: 8990323 DOI: 10.1001/jama.277.2.109b] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [MESH Headings] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Indexed: 02/03/2023]
|
35
|
The infant with primary hyperoxaluria and oxalosis: from diagnosis to multiorgan transplantation. ADVANCES IN RENAL REPLACEMENT THERAPY 1996; 3:315-25. [PMID: 8914696 DOI: 10.1016/s1073-4449(96)80012-4] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 02/03/2023]
Abstract
The care of an infant with primary hyperoxaluria and oxalosis is discussed. After an unheralded presentation, followed by 9 months of intensive treatment that included combined hemodialysis and peritoneal dialysis, the infant successfully underwent combined liver and kidney transplantation to definitively address both kidney failure and the underlying metabolic defect. Discussion of this approach, including ongoing input from the parents, addresses both the implications of undertaking the "best therapy" for this disease and the ethical dilemma posed by the decision whether or not to proceed with therapy.
|
36
|
A novel murine model for the assessment of human CD2-related reagents in vivo. THE JOURNAL OF IMMUNOLOGY 1996. [DOI: 10.4049/jimmunol.157.5.1863] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 01/01/2023]
|
37
|
A novel murine model for the assessment of human CD2-related reagents in vivo. JOURNAL OF IMMUNOLOGY (BALTIMORE, MD. : 1950) 1996; 157:1863-9. [PMID: 8757303] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Subscribe] [Scholar Register] [Indexed: 02/02/2023]
Abstract
CD2 is a T cell surface glycoprotein that mediates both cell-cell adhesion and transmembrane signal transduction. To construct a model for the in vivo evaluation of human (h)CD2 function and hCD2-related reagents, hCD2 transgenic mice and murine (m)CD2 knockout mice were crossed, and the F2 generation was selected for mCD2-hCD2+ animals by fluorescent flow cytometry. The mCD2-hCD2+ mice are healthy and have a normal distribution of mCD3, mCD4, and mCD8 in thymus, spleen, and lymph node. Therefore, expression of the hCD2 transgene does not appear to disrupt normal T cell development. The functionality of hCD2 was demonstrated by T lymphocyte proliferation upon stimulation by combined anti-CD2 plus anti-CD2R (anti-T11(2) plus anti-T11(3)) mAbs. Anti-T11(2) plus anti-T11(3) anti-human CD2 mAbs also induced proliferation of mCD2+hCD2+ F1 lymphocytes, but not mCD2+hCD2- wild-type murine lymphocytes. Either the anti-murine CD2 mAb or the human CD2-specific (anti-T11(1)) mAb inhibited proliferation in alloantigen-, PHA-, or anti-CD3 mAb-stimulated cultures, and each inhibited only cells bearing the appropriate cognate CD2. In vivo studies of immune function yielded results consistent with these in vitro assays. Thus, anti-T11(1) mAb suppressed contact sensitivity in vivo in the transgenic/knockout mice. mCD2-hCD2+ mice treated with anti-T11(1) mAb or LFA-3 fusion proteins also showed significant prolongation of cardiac allograft survival. This prolongation was associated with both depletion and down-modulation of CD2 on remaining T cells. These data suggest that the transgenic/knockout mice provide a useful in vivo model for the assessment of hCD2-related reagents and CD2 function, free from any potential interactions with mCD2 and mCD2 ligands.
|
38
|
Two-dimensional and dobutamine stress echocardiography in the preoperative assessment of patients with end-stage liver disease prior to orthotopic liver transplantation. Transplantation 1996; 61:1180-8. [PMID: 8610415 DOI: 10.1097/00007890-199604270-00011] [Citation(s) in RCA: 202] [Impact Index Per Article: 7.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 02/06/2023]
Abstract
Orthotopic liver transplantation is an established therapy for end-stage liver disease. This study evaluated the range of cardiovascular abnormalities in patients undergoing evaluation for orthotopic liver transplantation and determined the prognostic implications of abnormal echocardiographic features, including ischemia during dobutamine stress echocardiography, in predicting postoperative cardiac events. Two-dimensional echocardiography was performed in 190 patients for assessment of left ventricular function, valvular pathology, and pulmonary hypertension. Dobutamine stress echocardiography was performed in 165 patients for evaluation of inducible ischemia. Contrast echocardiography for detection of intrapulmonary shunting was performed in 125 patients at rest and in 99 during dobutamine stress. Left ventricular dysfunction, significant valvular regurgitation, and inducible ischemia were identified in <10% of patients. Pulmonary hypertension, left ventricular hypertrophy, and moderate or greater intrapulmonary shunting were present in 12%, 16%, and 26% of patients, respectively. Severe intrapulmonary shunting predicted death prior to transplantation (P=0.01). Of the 71 transplanted patients, major perioperative events included global left ventricular dysfunction in four patients and myocardial infarction in one patient with normal coronary arteries. No preoperative echocardiographic parameters, including ischemia on dobutamine echocardiography, predicted these perioperative events. No cardiac events related to obstructive coronary artery disease occurred in the 154 patients without ischemia on dobutamine stress echocardiography. The majority of patients with end-stage liver disease, including those with alcoholic cirrhosis, have normal cardiac function on two-dimensional echocardiography. Severe intrapulmonary shunting portends a poor prognosis in patients awaiting transplantation. A negative dobutamine stress echocardiogram appears useful in excluding patients at risk for perioperative cardiac events related to obstructive coronary artery disease.
|
39
|
Update of the Adult and Pediatric Liver Transplant Program at the University of Michigan. CLINICAL TRANSPLANTS 1996:203-16. [PMID: 9286569] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Subscribe] [Scholar Register] [Indexed: 02/05/2023]
Abstract
Significant technical innovations and improvements in immunosuppression have been introduced into our liver transplant program since its inception in 1985. The indications for transplantation have been extended to younger and older patients, and simultaneously more patients with comorbidities have been accepted for transplant. The net impact of these changes has been a continuing trend toward improved survival. Overall, patients with hepatitis B or malignancy have had poor survival rates. The introduction of prophylactic anti-hepatitis B immunoglobulin and lamivudine, and better selection of patients with malignancy may improve results for these patients in the future. As in other programs, our most vexing problem is the continuing scarcity of donor organs which has led to an ever-expanding waiting list, more deaths while awaiting transplant, and more suffering before transplantation. The introduction of living donor hepatic transplantation will be of some help in alleviating this shortage. We are confident that the evolution of our program into a joint multidisciplinary structure will provide more efficient, convenient and cost-effective care to our patients.
|
40
|
Abstract
BACKGROUND Long-term side effects of corticosteroids (CSs) result in major morbidity for recipients of orthotopic liver transplants (OLT). We instituted a program of CS withdrawal among OLT recipients to quantify the contribution of CS to adverse clinical sequelae and to determine whether long-term CS administration is necessary to avoid rejection. METHODS Recipients who had normal allograft function on CS, cyclosporine, and azathioprine more than 1 year after OLT were offered CS withdrawal over 12 to 22 weeks. Patients underwent routine clinical monitoring and laboratory studies. Continuous variables were compared by paired t test analysis. RESULTS CSs were discontinued in 51 recipients; 45 (88%) of 51 patients remain steroid-free after a mean follow-up of 13.8 months (range, 4 to 36). CS therapy was reinstituted in 6 patients who had abnormal transaminase levels during routine follow-up. Among the patients who remain off CS, there were no significant changes in blood pressure, transaminase, alkaline phosphatase, bilirubin, or glucose levels during the study period. Mean number of blood pressure medications decreased from 0.7 +/- 0.1 to 0.4 +/- 0.1 (p = 0.007). Cholesterol decreased from 217 +/- 8 mg/dl on CS to 204 +/- 9 mg/dl at 1 month (p = 0.0001), 183 +/- 10 mg/dl at 3 months (p = 0.0001), 198 +/- 8 mg/dl at 6 months (p = 0.04), 213 +/- 11 mg/dl at 12 months (p = 0.01), 209 +/- 16 mg/dl at 18 months (p = 0.02), and 183 +/- 19 mg/dl at 24 months (p = 0.2) off CS. Weight loss occurred in 88% of patients and averaged 9.5 pounds. CONCLUSIONS CS therapy can be successfully withdrawn without precipitating rejection in liver transplant recipients who have stable graft function 1 year after OLT. The incidence and severity of hypertension and hypercholesterolemia are reduced in patients whose CSs have been withdrawn.
|
41
|
Subtotal parathyroidectomy in dialysis-dependent and post-renal transplant patients. A 25-year single-center experience. ARCHIVES OF SURGERY (CHICAGO, ILL. : 1960) 1995; 130:538-42; discussion 542-3. [PMID: 7748094 DOI: 10.1001/archsurg.1995.01430050088015] [Citation(s) in RCA: 57] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 01/26/2023]
Abstract
OBJECTIVE To determine long-term results and durability of parathyroidectomy in patients with chronic renal failure and renal transplant recipients. DESIGN Retrospective chart review and structured telephone interviews. SETTING Tertiary-care academic medical center. PATIENTS Ninety-one consecutive patients (80 undergoing long-term dialysis, 11 with posttransplant hyperparathyroidism). Mean follow-up was 8 years (minimum follow-up, 2 years; longest follow-up, 25 years). The most common indications for operation were bone pain (70% [56/80]) and weakness (46% [37/80]) in patients with renal failure and hypercalcemia (91% [10/11]) in renal transplant recipients. INTERVENTION Subtotal parathyroidectomy without remnant gland implantation. MAIN OUTCOME MEASURES Postoperative morbidity and mortality, relief and recurrence of symptoms. RESULTS Symptoms were successfully ameliorated in 95% (86/91) of patients. Clinically significant complications occurred in 5% (5/91) of patients (one patient each with wound hematoma, wound infection, and permanent recurrent laryngeal nerve paralysis and two patients with permanent hypoparathyroidism). Recurrence occurred in five (5%) of 91 patients. Two of these patients required four operations each to eradicate all hyperfunctioning accessory glands. The other three recurrences were caused by hyperplasia of the remnant gland left in the neck. These were easily treated by simple excision, with no morbidity. The actuarial rate of recurrent hyperparathyroidism was 4.1% at 1 year and 11.7% at 20 years. Overall hospital mortality was 3% (3/91). None of the deaths was directly attributable to parathyroidectomy. CONCLUSIONS We recommend subtotal parathyroidectomy without remnant implantation as a safe and durable intervention for hyperparathyroidism associated with renal failure and following renal transplantation. This intervention is associated with an acceptably low recurrence rate over extremely long periods of follow-up.
|
42
|
Abstract
The combination of anti-CD2 plus anti-CD3 monoclonal antibodies (mAbs) synergistically prolongs allograft survival and induces antigen-specific tolerance. Since altered expression of cell surface molecules might be important for tolerance induction, the effect of anti-CD2 and anti-CD3 mAbs on the expression of adhesion molecules was analyzed on splenic T cells with an in vitro model. The anti-CD2 mAb, 12-15, alone had no effect on the expression of integrin alpha 4-chain epitopes recognized by two anti-CD49d (VLA-4 alpha) mAbs, R1-2 and PS/2. The anti-CD3 mAb, 2C11, caused R1-2 epitope expression to decrease, while PS/2 epitope expression remained unchanged. The combination of anti-CD2 and anti-CD3 mAbs further decreased R1-2 epitope expression while preserving PS/2 epitope expression. The expression of integrin beta 1 and beta 7 chains, each of which forms heterodimers with alpha 4 chains, also remained unchanged. Expression of other integrin, selectin, or immunoglobulin superfamily molecules (CD11a, CD18, CD44, CD45, CD48, CD54 and CD62L) was significantly increased by anti-CD2 or anti-CD3 mAbs. Decreased R1-2 epitope expression was anti-CD3 dependent and specifically augmented by anti-CD2 mAb. CD2-regulated decreases in R1-2 epitope expression correlated with increased cAMP and could be prevented by addition of high doses of IL-2, but were not affected by the addition of other cytokines. R1-2 alpha 4 epitope expression could be specifically restored by the divalent cation Mn2+, which also increased functional binding to the VCAM-1 ligand. Significantly, the R1-2 but not the PS/2 mAb prolonged graft survival in a cardiac allograft model. These results show that anti-CD2 and anti-CD3 mAbs selectively decrease integrin alpha 4 chain epitope expression on T cells through conformational regulation. Decreased expression of a CD49d epitope is unique in comparison to the up-modulation of other T-cell adhesion receptors. These changes correlate with functional effects and provide an additional mechanistic explanation for the synergistic effect of anti-CD2 plus anti-CD3 in producing tolerance.
|
43
|
Increased cAMP and cAMP-dependent protein kinase activity mediate anti-CD2 induced suppression of anti-CD3-driven interleukin-2 production and CD25 expression. Pathobiology 1995; 63:175-87. [PMID: 8866788 DOI: 10.1159/000163949] [Citation(s) in RCA: 8] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 02/02/2023] Open
Abstract
Anti-CD2 monoclonal antibody (mAb) can act synergistically with anti-CD3 to produce tolerance and diminish the anti-CD3-induced cytokine syndrome. Since interleukin(IL)-2 production and IL-2 receptor (IL-2R; CD25) expression are important determinants of CD3-driven T cell activation, the effects of anti-CD2 on anti-CD3-induced CD25 expression and IL-2 production were analyzed and related mechanistically to CD2-stimulated cAMP signaling with an in vitro model of T cell activation. The anti-CD2 mAb, 12-15, alone had no effect on splenic T cell CD25 expression and IL-2 production, while the anti-CD3 mAb, 145-2C11, caused significant increases in both CD25 expression and IL-2 production. The addition of anti-CD2 inhibited anti-CD3-induced increases in CD25 and IL-2. The inhibitory signal delivered by anti-CD2 was effective in many forms of T cell activation, since other stimuli which increased CD25, such as concanavalin A, phytohemagglutinin, and Staphylococcal enterotoxin B (SEB), could also be inhibited by anti-CD2. The inhibitory effect of anti-CD2 on CD25 could not be reversed by high doses of supplemental IL-2 added to the culture. Anti-CD2 increased cytoplasmic cAMP in a dose- and time-dependent manner. Reagents that increased cytoplasmic cAMP such as forskolin, cholera toxin, and 3-isobutyl-1-methylxanthine could mimic the inhibitory effect of anti-CD2 on anti-CD3-driven CD25 expression. Anti-CD2 also increased the activity of cAMP-dependent protein kinase (PKA). H8, a PKA antagonist, blocked the inhibitory effect of anti-CD2 on CD25 expression, further confirming the role of PKA in CD2-induced negative signaling. The use of paired agonists to PKA demonstrated that a type I PKA was the preferential enzyme isoform stimulated by CD2 ligation. These findings show that increased cAMP and PKA activity mediate anti-CD2-induced suppression of anti-CD3-driven IL-2 production and CD25 expression, and provide mechanisms for anti-CD2-induced immunosuppression and inhibition of the cytokine syndrome associated with anti-CD3 treatment.
|
44
|
Superior allograft survival in pediatric renal transplant recipients. Transplant Proc 1994; 26:24-5. [PMID: 8108958] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [MESH Headings] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/28/2023]
|
45
|
Cytomegalovirus and Epstein-Barr virus-acquired immunity after Sandoglobulin prophylaxis in the pediatric renal transplant population. Transplant Proc 1994; 26:20-1. [PMID: 8108942] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [MESH Headings] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/28/2023]
|
46
|
Abstract
The burn patient initially requires many of the same measures as any other trauma patient. Both depth and surface extent of the burn injury should be evaluated. Evaluation for smoke inhalation is important, since this is prevalent and life-threatening among burn victims. A treatment plan begins with a realistic appraisal of the probability of survival. Once goals of management have been established, treatment is aimed at both physiologic and aesthetic rehabilitation.
|
47
|
Frequency of transferable drug resistance in clinical isolates of Klebsiella, Aerobacter, and Escherichia. Antimicrob Agents Chemother 1976; 9:94-9. [PMID: 5396531 PMCID: PMC429482 DOI: 10.1128/aac.9.1.94] [Citation(s) in RCA: 27] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/15/2023] Open
Abstract
Twenty-seven patients with serious gram-negative infections were treated with ticarcillin in an average daily dosage of 237 mg/kg (range, 174 to 307 mg/kg). Ticarcillin was bactericidal for all infecting organisms in concentrations ranging from 31.2 to 125 μg/ml. Five of 8 patients (62%) with overwhelming Pseudomonas pneumonia were cured or improved, and 9 of 12 (75%) were cured of pneumonia caused by other gram-negative organisms. Of six extrapulmonary infections caused by Pseudomonas, five (83%) were cured or improved. In seven cases, the infecting organism reisolated during therapy was more resistant to ticarcillin than the primary isolate. The serum half-life of ticarcillin in three patients with renal failure was 11.2 ± 1.0 h, and during hemodialysis it decreased to 6.3 ± 1.8 h. There were two episodes of superinfection with resistant organisms. Thirteen patients (48%) manifested eosinophilia, one of whom had severe urticaria. Prolongation of bleeding time was attributable to ticarcillin in two patients. Ticarcillin appears to be effective for therapy of serious gram-negative infections in dosages 30 to 50% less than those recommended for carbenicillin.
|
48
|
In vivo and in vitro effects of dimethyl sulfoxide on streptomycin-sensitive and -resistant Escherichia coli. Ann N Y Acad Sci 1975; 243:269-77. [PMID: 1093463 DOI: 10.1111/j.1749-6632.1975.tb25365.x] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.0] [Reference Citation Analysis] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/25/2022]
|
49
|
Abstract
Deoxyribonucleic acid preparations from two strains of Pseudomonas aeruginosa newly isolated from clinical specimens each contained double-stranded satellite DNA. The satellite DNA in one strain consisted of two components with buoyant densities of 1.705 and 1.712 g cm−3 and comprised 28% of the total extracted DNA. The other strain contained 15% satellite DNA which was composed of components with buoyant densities of 1.711 and 1.718 g cm−3.
|
50
|
Abstract
Replication of the 50 and 58 moles per cent guanine plus cytosine (%GC) components of R factor 222 in Proteus mirabilis during growth in the presence and absence of chloramphenicol and after shifting exponential- and stationary-phase cells to conditions which inhibit host protein or deoxyribonucleic acid (DNA) synthesis was examined. Chloramphenicol reduced the growth rate but increased the amount of both R-factor components; the 58% GC component showed a larger proportionate increase. This was inferred to indicate reduced synthesis of an inhibitor that acts on both R-factor components and an initiator for replication of the 50% GC component. Replicative patterns observed after shifting exponential- and stationary-phase cells grown with or without chloramphenicol to minimal medium or chloramphenicol for one generation, or puromycin for 3 hr, corroborated this interpretation. After shifts of exponential cells from either medium, replication of the 50% GC components paralleled host replication, thus indicating a requirement for protein synthesis; replication of the 58% GC component increased due to reduced inhibitor synthesis. R-factor DNA remained constant after shifting stationary cells from drug-free medium, thus indicating that the cells contained effective concentrations of the regulatory inhibitor, whereas increased replication of the 58% GC component occurred after identical shifts of chloramphenicol-grown cells of the same chronological age. Similar responses were observed after shifts to 5 C or to medium containing streptomycin or tetracycline. Absence of replication of the 50% GC component after shifting to medium containing nalidixic acid or phenethanol and its hereditary persistence during growth indicated that the 50% GC replicon was attached to the membrane. Thus, in P. mirabilis the three replicons of R factor 222 are regulated as follows: The composite and transfer portion (RTF) replicons represented by the 50% GC component require protein synthesis and membrane attachment and are negatively regulated by an inhibitor; the 58% GC or resistance-determinants replicon exists cytoplasmically and is subject only to negative control.
|