1
The immune response to a fungus in pancreatic cancer samples. bioRxiv: The Preprint Server for Biology 2023:2023.03.28.534606. [PMID: 37034706 PMCID: PMC10081247 DOI: 10.1101/2023.03.28.534606]
Abstract
Pancreatic ductal adenocarcinoma (PDAC) is a poor-prognosis cancer with an aggressive growth profile that is often diagnosed at late stage and that has few curative or therapeutic options. PDAC growth has been linked to alterations in the pancreas microbiome, which could include the presence of the fungus Malassezia. We used RNA-sequencing to compare 14 paired tumor and normal (tumor-adjacent) pancreatic cancer samples and found Malassezia RNA in both the PDAC and normal tissues. Although the presence of Malassezia was not correlated with tumor growth, a set of immune- and inflammatory-related genes were up-regulated in the PDAC compared to the normal samples, suggesting that they are involved in tumor progression. Gene set enrichment analysis suggests that activation of the complement cascade pathway and inflammation could be involved in promoting PDAC growth.
2
Navajo Neurohepatopathy: A Case Report and Literature Review Emphasizing Clinicopathologic Diagnosis. Acta Gastroenterol Belg 2016; 79:463-469. [PMID: 28209105]
Abstract
Navajo Neurohepatopathy (NNH) is a rare hepatocerebral mitochondrial DNA (mtDNA) depletion syndrome (MDS) with nonspecific clinical or pathologic features aside from Navajo ancestry. Because of the rarity of NNH, diagnosis rests on close clinicopathologic correlation and appropriate tissue triage for quantitative mtDNA analysis. We present a new case of NNH in which the clinical presentation and H&E liver biopsy histology indicated the need for NNH workup. Quantitative analysis of mtDNA in liver tissue was significantly reduced, and mutational analysis of the MPV17 gene confirmed homozygosity for the NNH-associated missense mutation, R50Q. The patient is now one year post liver transplant and continues to have normal liver function tests but suffers multiple immunosuppression-associated co-morbidities. A comprehensive literature review is provided to assist in diagnosis and management of NNH. (Acta gastroenterol. belg., 2016, 79, 463-469).
3
Is ovarian and adrenal venous catheterization and sampling helpful in the investigation of hyperandrogenic women? Clin Endocrinol (Oxf) 2003; 59:34-43. [PMID: 12807501 DOI: 10.1046/j.1365-2265.2003.01792.x]
Abstract
OBJECTIVE To audit our practice of performing ovarian and adrenal venous catheterization and sampling in hyperandrogenic women who fail to suppress their elevated androgen levels following a 48-h low-dose dexamethasone suppression test (LDDST). We considered the technical success rate of catheterization, the extra information obtained in addition to the standard biochemical tests and imaging findings, and the impact of sampling on management decisions. DESIGN A retrospective analysis of the results of all ovarian and adrenal venous catheterizations performed at St Bartholomew's Hospital, London, in the years 1980-1996. PATIENTS AND METHODS Baseline ovarian and adrenal androgens were measured in all women presenting with symptoms and signs of hyperandrogenism. Those patients who failed to suppress their elevated testosterone (T), androstenedione (A4) and/or dehydroepiandrosterone-sulphate (DHEAS) levels following a LDDST to within the normal range or to less than 50% of the baseline value were investigated further with adrenal computed tomography (CT), ovarian ultrasound, and ovarian and adrenal venous catheterization and sampling. RESULTS Results were available in 38 patients. The overall catheterization success rate was: all four veins in 27%, three veins in 65%, two veins in 87%. The success rate for each individual vein was: right adrenal vein (RAV) 50%, right ovarian vein (ROV) 42%, left adrenal vein (LAV) 87% and left ovarian vein (LOV) 73%. Eight patients were found to have tumours by means of imaging (adrenal CT and ovarian ultrasound), three adrenal and five ovarian, seven of which underwent operation. In six of these patients the clinical presentation was suggestive of the presence of a tumour; in addition, the combination of imaging findings allowed the detection of suspicious adrenal and ovarian masses in all eight cases. The five patients with ovarian tumours had serum testosterone levels > 4.5 nmol/l. 
In a further eight patients, laparotomy was performed based on a combination of diagnostic and therapeutic indications; in two of these patients the catheterization results were suggestive of an ovarian tumour. All eight of these patients were shown histologically to have polycystic ovarian syndrome (PCOS), and no occult ovarian tumour was identified. None of the patients with nontumourous hyperandrogenism had a baseline testosterone level in excess of 7 nmol/l (median 4.4 nmol/l, range 2.5-7 nmol/l). CONCLUSIONS Our results suggest that ovarian and adrenal venous catheterization and sampling should not be performed routinely in women presenting with symptoms and signs of hyperandrogenism, even if they fail to suppress their elevated androgen levels following a formal 48-h LDDST. All patients presenting with symptoms and signs of hyperandrogenism and elevated androgen levels, in whom the suspicion of an androgen-secreting tumour is high, should have adrenal CT and ovarian ultrasound imaging to detect such a tumour. Venous catheterization and sampling should be reserved for patients in whom uncertainty remains, as the presence of a small ovarian tumour cannot be excluded on the basis of the biochemical and imaging studies used in this series alone. Its use should be restricted to units with expertise in this area.
4
In vivo staphylococcal superantigen-driven polyclonal Ig responses in mice: dependence upon CD4(+) cells and human MHC class II. Int Immunol 2001; 13:1291-300. [PMID: 11581174 DOI: 10.1093/intimm/13.10.1291]
Abstract
Staphylococcal enterotoxin (SE) B and seven other staphylococcal superantigens (SAg), despite promoting vigorous Ig production in human peripheral blood mononuclear cell cultures, are exceedingly poor at eliciting Ig responses in cultures of spleen cells from C57BL/10J (B10) or C3H/HeJ mice. In contrast, SEB elicits Ig responses in cultures of spleen cells from human MHC class II-transgenic mice. Whereas i.p. administration of SEB (0.2-20 microg) to non-transgenic B10 mice elicits very weak in vivo Ig responses, identical treatment of CD4(+) cell-intact (but not CD4(+) cell-depleted) human MHC class II-transgenic mice elicits dramatic increases in both splenic Ig-secreting cells and serum Ig levels. Over a 2-week period, the SEB-induced in vivo Ig responses peak and then plateau or fall in association with a preferential increase in splenic CD8(+) cells. Nevertheless, in vivo depletion of CD8(+) cells has no sustained effect on SEB-driven Ig responses. Taken together, these observations demonstrate that the effects of SAg on in vivo humoral immune responses are highly CD4(+) cell dependent, are substantially CD8(+) cell independent and can be successfully investigated using human MHC class II-transgenic mice. This model system may be useful in investigating the polyclonally activating effects of microbial products (prototypic environmental insults) on the development of systemic autoimmunity.
5
Abstract
The sonographic findings in 101 cats with splenic abnormalities are presented. Diagnosis was made by ultrasound-guided fine-needle aspirate or fine-needle biopsy (n = 91), ultrasound-guided core biopsy (n = 1), surgical core biopsy (n = 1), or necropsy (n = 10). Two cats had more than one diagnostic procedure (fine-needle aspirate and necropsy, or core biopsy and necropsy). The splenic abnormalities included lymphosarcoma (n = 30), mast cell tumor (n = 27), extramedullary hematopoiesis and/or lymphoid hyperplasia (n = 27), epithelial tumors (n = 6), mesenchymal tumors (n = 4), malignant histiocytosis (n = 2), myeloproliferative disease (n = 2), pyogranulomatous inflammation (n = 2), erythroleukemia (n = 1), eosinophilic syndrome (n = 1), hematoma (n = 1), and granulomatous splenitis (n = 1). Three cats had more than one splenic abnormality (mast cell tumor and metastatic carcinoma; pyogranulomatous inflammation and lymphoid hyperplasia; histiocytic lymphosarcoma and lymphoid hyperplasia). Pathognomonic changes were not seen for any of the diseases.
6
Abstract
Left-truncated and interval-censored data, termed dynamic cohort data, arise in longitudinal studies with rolling admissions and only occasional follow-up. The authors compared four approaches for analyzing such data: a constant hazard model; maximum likelihood estimation with flexible parametric models; the midpoint method, in which the midpoint of the last negative and first positive test result is used in a Cox proportional hazards model that accounts for left truncation; and a semiparametric method that uses imputed failure times in the Cox model. Using a simulation study, they assessed the performance of these approaches under conditions that can arise in observational studies: changes in disease incidence and changes in the underlying population. The simulation results indicated that the constant hazard model and midpoint method were inadequate and that the flexible parametric model was useful when enough parameters were used in modeling the baseline hazard. The semiparametric method ensured correct parameter (odds ratio) estimation when the baseline hazard was misspecified, but the trade-off was increased computational complexity. A study of the incidence of human immunodeficiency virus in patients repeatedly tested for the virus at a sexually transmitted disease clinic in New Orleans, Louisiana, illustrates the methods.
7
Abstract
BACKGROUND Hepatitis occurs frequently in patients with end-stage renal disease. In 1997, 0.7% of patients receiving a renal transplant were positive for hepatitis C antibodies. Concern has been raised as to whether these patients are at an increased mortality risk after renal transplantation compared with patients who are hepatitis C antibody negative. To help answer this question, we analyzed data from the United States Renal Data System from October 1988 through June 1998. METHODS Primary study endpoints were patient death and death-censored graft loss. Secondary study endpoints included cardiovascular, infectious, malignant, and infection-related death. Kaplan-Meier survival estimates as well as Cox proportional hazard models were used to evaluate the impact of hepatitis C antibody status on the study endpoints. RESULTS A total of 73,707 patients were analyzed. Patient survival by Kaplan-Meier analysis was higher in hepatitis C-positive patients, whereas death-censored graft survival trended lower in the very long term. By the Cox model, hepatitis C-positive adjusted patient survival is slightly superior to that of hepatitis C-negative patients. CONCLUSIONS Renal transplant recipients who are hepatitis C antibody positive do not have an increased risk of death after transplantation compared with hepatitis C-negative recipients. The current policy of transplanting hepatitis C-positive patients without active liver disease seems to incur no excess mortality risk.
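The Kaplan-Meier estimates used in registry analyses like this one come from the product-limit formula S(t) = Π(1 − d_i/n_i) over distinct event times. A bare-bones sketch (illustrative only, not the study's code; `events` uses 1 for an observed death and 0 for censoring):

```python
def kaplan_meier(times, events):
    """Product-limit survival estimate S(t) at each distinct event time.
    times: follow-up times; events: 1 = event observed, 0 = censored."""
    distinct = sorted({t for t, e in zip(times, events) if e == 1})
    surv, s = [], 1.0
    for t in distinct:
        n_at_risk = sum(1 for ti in times if ti >= t)          # still under observation
        d = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 1)
        s *= 1.0 - d / n_at_risk                               # multiply in this step's survival
        surv.append((t, s))
    return surv

# Four patients, deaths at t=1 and t=3, censored at t=2 and t=4:
# S(1) = 3/4 = 0.75, S(3) = 0.75 * 1/2 = 0.375.
print(kaplan_meier([1, 2, 3, 4], [1, 0, 1, 0]))
```

The Cox models in the abstract then adjust these unadjusted curves for covariates such as donor and recipient characteristics.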
8
Abstract
BACKGROUND The benefit of renal transplantation for patients with end-stage renal disease (ESRD) has been well documented. This benefit is seen throughout all age ranges of patients. However, it has been documented that older renal transplant recipients are at increased risk for death because of infectious causes when compared with younger recipients. The present study addresses whether this increased risk merely parallels an age-related increase in infectious mortality or is reflective of a particular vulnerability in older renal transplant recipients. METHODS Patients wait-listed and transplanted between 1988 and 1997 were analyzed utilizing the United States Renal Data System (USRDS) database. The primary study end point was patient death secondary to infection. Secondary end points included death secondary to cardiovascular cause and malignancy. Cox proportional hazard models were utilized with all pertinent variables. RESULTS Death related to infectious cause increased exponentially with increasing age in transplanted patients (fitted curve: 2.9e^(0.34x)), while it increased linearly with increasing age (fitted line: 1.9x + 8.6) for patients on the waiting list. Overall mortality increases with age were equal between the wait-listed and transplanted groups. CONCLUSIONS The overall survival benefit of transplantation is maintained in the older age groups. However, renal transplantation is associated with an increased risk for infectious death beyond the expected age-related increased risk in patients on the renal transplant waiting list. This may have an impact on future immunosuppressive regimens in this population.
9
Abstract
INTRODUCTION The importance of HLA matching for renal transplantation outcomes has been appreciated for several decades. It has been hypothesized that as pharmacologic immunosuppression becomes stronger and more specific, the impact of HLA matching may be vanishing. Mycophenolate mofetil (MMF) has been demonstrated to both decrease acute rejection and improve three-year graft survival. It is possible that with new immunosuppressive regimens containing MMF the relative effect of HLA matching may be altered. To determine the relative impact of HLA matching in patients on MMF, we undertook an analysis of the United States Renal Data System (USRDS). METHODS All primary, solitary renal transplants registered in the USRDS between January 1995 and June 1997, on initial immunosuppression that included either MMF or azathioprine (AZA), were followed until June 1998. Primary study end points were graft and patient survival. Kaplan-Meier analysis was performed to compare AZA- vs. MMF-treated patients by HLA mismatch. Cox proportional hazard models were used to investigate the interaction between HLA mismatch and AZA versus MMF therapy on the study endpoints. All multivariate analyses were corrected for 13 potential confounding pretransplant variables, including intention-to-treat immunosuppression. RESULTS A total of 19,675 patients were analyzed (8,459 on MMF and 11,216 on AZA). Overall three-year graft survival was higher in the MMF group than in the AZA group (87% vs. 84%, respectively; P<0.001). For both AZA and MMF, three-year graft survival improved with fewer HLA donor-recipient mismatches. Comparing zero antigen mismatches to six antigen mismatches, the relative improvement was comparable for patients on AZA (92.4% vs. 80.6%) and MMF (95.2% vs. 82.9%). By Cox proportional hazard model, the relative risk for graft loss decreased significantly in both the AZA- and MMF-treated patients with increased HLA matching.
CONCLUSION The use of MMF does not obviate the benefits of HLA matching, while HLA matching does not minimize the benefits of MMF on long term graft survival. Our study would suggest that HLA matching and MMF therapy are additive factors in decreasing the risk for renal allograft loss.
10
Abstract
BACKGROUND Despite the known differences in immunological reactivity between males and females, no differences in graft survival have been described among renal transplant recipients with regard to gender. To address this paradox, we analyzed data from 73,477 primary renal transplants collected in the US Renal Data System database. METHODS Logistic regression and Cox proportional hazard models were used to investigate the primary study end points, graft loss secondary to acute rejection (AR) or chronic allograft failure (CAF). CAF was defined as graft loss beyond 6 months, not attributable to death, recurrent disease, acute rejection, thrombosis, infection, noncompliance, or technical problems. The models adjusted for 15 covariates including immunosuppressive regimen, and donor and recipient characteristics. RESULTS The overall 8-year graft and patient survivals were significantly better in female renal transplant recipients compared with male recipients. However, graft survival censored for death was not significantly different by gender. By multivariate analysis, females had a 10% increased odds of AR (OR=1.10, CI 1.02-1.12), but conversely a 10% lower risk of graft loss secondary to CAF (RR=0.9, CI 0.85-0.96). The risk for CAF increased significantly with increasing age for both males and females, but this effect was greater for males than for females (P<0.001). CONCLUSION Although female renal transplant recipients have a death-censored graft survival similar to that of males, there are important differences in immunological behavior. Females have a higher risk of AR while having a decreased risk of graft loss secondary to CAF.
11
Abstract
BACKGROUND Simultaneous pancreas-kidney transplantation (SPK) ameliorates the progression of microvascular diabetic complications, but the procedure is associated with excess initial morbidity and an uncertain effect on patient survival when compared with solitary cadaveric or living donor renal transplantation. We evaluated mortality risks associated with SPK, solitary renal transplantation, and dialysis treatment in a national cohort of type 1 diabetics with end-stage nephropathy. METHODS A total of 13,467 adult type 1 diabetics enrolled on the renal and renal-pancreas transplant waiting list between 10/01/88 and 06/30/97 were followed until 06/30/98. Time-dependent mortality risks and life expectancy were calculated according to the treatment received subsequent to wait-list registration: SPK; cadaveric kidney only (CAD); living donor kidney only (LKD) transplantation; and dialysis [wait-listed, maintenance dialysis treatment (WLD)]. RESULTS Adjusted 10-year patient survival was 67% for SPK vs. 65% for LKD recipients (P=0.19) and 46% for CAD recipients (P<0.001). The excess initial mortality normally associated with renal transplantation was present, and the risk of early infectious death was 2-fold higher in SPK recipients. The time to achieve the same proportion of survivors as the WLD patients was 170, 95, and 72 days for SPK, CAD, and LKD recipients, respectively (P<0.001). However, the adjusted 5-year mortality risk (RR) using WLD as the reference and the expected remaining life years were 0.40, 0.45, and 0.75 and 23.4, 20.9, and 12.6 years for SPK, LKD, and CAD, respectively. There was no survival benefit in SPK recipients > or =50 years old (RR=1.38, P=0.81). CONCLUSIONS Among patients with type 1 DM with end-stage nephropathy, SPK transplantation before the age of 50 years was associated with long-term improvement in survival compared to solitary cadaveric renal transplantation or dialysis.
12
CT assessment of tumour response to treatment: comparison of linear, cross-sectional and volumetric measures of tumour size. Br J Radiol 2000; 73:1178-84. [PMID: 11144795 DOI: 10.1259/bjr.73.875.11144795]
Abstract
Changes in cross-sectional area are currently used to assess tumour response to treatment. The aims of this study were to validate a helical CT technique for volume determination using a series of phantoms and to compare tumour responses indicated by one-, two- and three-dimensional measures of tumour size change in patients treated for germ cell cancer or lymphoma. All studies were performed on a GE HiSpeed Advantage helical CT scanner with an Advantage Windows workstation. Phantom volumes were calculated using volume reconstruction software and compared with reference volumes determined by water displacement. 20 lymph node masses were studied on serial CT scans in 16 patients treated with chemotherapy for germ cell cancer or lymphoma. For each lesion the maximum diameter, maximum cross-sectional area and volume were determined before and after treatment. Tumour response was assessed using the standard World Health Organisation criteria (i.e. changes in cross-sectional area) and the newly proposed unidimensional Response Evaluation Criteria in Solid Tumours (RECIST). The CT volume measurement error was 1.0-5.1% for regularly shaped phantoms larger than 35 cm3. In the assessment of treatment response there was 90% agreement between one-dimensional (1D) and two-dimensional (2D) measurements and 100% agreement between 2D and three-dimensional (3D) measurements. CT volume measurements are accurate and reproducible, particularly for larger structures. Assessment of tumour response using 1D, 2D and 3D measures had limited influence on the classification of treatment response. However, the impact of CT assessment of tumour response using 1D, 2D and 3D measurements on clinical decisions and patient outcome remains to be determined.
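The two response criteria being compared reduce to percent-change thresholds on different measures of size. A hedged sketch of the partial-response cut-offs for a single lesion (the thresholds are the published ones: a >=30% diameter decrease for unidimensional RECIST and a >=50% cross-sectional area decrease for the WHO criteria; the formal criteria apply to sums over all target lesions, which we simplify away here):

```python
def recist_partial_response(diam_before: float, diam_after: float) -> bool:
    """RECIST (1D): partial response = >=30% decrease in longest diameter."""
    return (diam_before - diam_after) / diam_before >= 0.30

def who_partial_response(area_before: float, area_after: float) -> bool:
    """WHO (2D): partial response = >=50% decrease in cross-sectional area."""
    return (area_before - area_after) / area_before >= 0.50

# For an idealised spherical lesion the criteria nearly coincide:
# a 30% diameter decrease implies a 1 - 0.7**2 (about 51%) area decrease
# and a 1 - 0.7**3 (about 66%) volume decrease, which is one way to read
# the high 1D/2D/3D agreement reported above.
```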
13
Abstract
BACKGROUND Acute rejection (AR) remains a major risk factor for the development of chronic renal allograft failure (CAF), which is a major cause of late graft loss. With the introduction of several newer immunosuppressive agents (e.g., mycophenolate mofetil, tacrolimus and Neoral), acute rejection rates have been steadily decreasing. However, the incidence of CAF has not decreased as dramatically as the incidence of acute rejection. One possible explanation is that the impact of AR on CAF is changing. The goal of this study was to analyze the relative impact of AR era on the development of CAF. METHODS We evaluated 63,045 primary renal transplant recipients reported to the USRDS from 1988 to 1997. CAF was defined as graft loss after 6 months posttransplantation, censored for death, acute rejection, thrombosis, infection, surgical complications, or recurrent disease. A Cox proportional hazard model correcting for 15 possible confounding factors evaluated the relative impact of AR on CAF. The era effect (years 1988-1989, 1990-1991, 1992-1993, 1994-1995 and 1996-1997) was evaluated by an era versus AR interaction term. RESULTS An AR episode within the first 6 months after transplantation was the most important risk factor for subsequent CAF (RR=2.4, CI 2.3-2.5). Compared with the reference group (1988-1989 with no rejection), having an AR episode in 1988-1989, 1990-1991, 1992-1993, 1994-1995, and 1996-1997 conferred a 1.67-, 2.35-, 3.4-, 4.98- and 5.2-fold relative risk for the subsequent development of CAF (P<0.001). CONCLUSIONS Independently of known confounding variables, the impact of AR on CAF significantly increased from 1988 to 1997. This effect may in part explain the relative lack of improvement in long-term renal allograft survival, despite a decline in AR rates.
14
Abstract
Injuries to the atlanto-occipital region, which range from complete atlanto-occipital or atlantoaxial dislocation to nondisplaced occipital condyle avulsion fractures, are usually of critical clinical importance. At initial cross-table lateral radiography, measurement of the basion-dens and basion-posterior axial line intervals and comparison with normal measurements may help detect injury. Computed tomography (CT) with sagittal and coronal reformatted images permits optimal detection and evaluation of fracture and luxation. CT findings that may suggest atlanto-occipital injury include joint incongruity, focal hematomas, vertebral artery injury, capsular swelling, and, rarely, fractures through cranial nerve canals. Magnetic resonance (MR) imaging of the cervical spine with fat-suppressed gradient-echo T2-weighted or short-inversion-time inversion recovery sequences can demonstrate increased signal intensity in the atlantoaxial and atlanto-occipital joints, craniocervical ligaments, prevertebral soft tissues, and spinal cord. Axial gradient-echo MR images may be particularly useful in assessing the integrity of the transverse atlantal ligament. All imaging studies should be conducted with special attention to bone integrity and the possibility of soft-tissue injury. Atlanto-occipital injuries are now recognized as potentially survivable, although commonly with substantial morbidity. Swift diagnosis by the trauma radiologist is crucial for ensuring prompt, effective treatment and preventing delayed neurologic deficits in patients who survive such injuries.
15
Racial disparities in access to simultaneous pancreas-kidney transplantation in the United States. Am J Kidney Dis 2000; 36:526-33. [PMID: 10977784 DOI: 10.1053/ajkd.2000.9793]
Abstract
The purpose of our study is to assess the extent of racial differences in access to simultaneous pancreas-kidney (SPK) transplantation and evaluate the potential influence of socioeconomic factors on access to transplantation. We performed a retrospective analysis of the US Renal Data System and United Network for Organ Sharing data on all patients with end-stage renal disease (ESRD) due to diabetes mellitus from 1988 to 1996 (n = 562,814), including all dialysis, wait list, and transplant patients. Racial differences in incidence, prevalence, insurance coverage, employment status, and transplantation rates were calculated. Caucasians had the highest prevalence of ESRD caused by type 1 diabetes (73%), followed by blacks (22%), Hispanics (3%), Native Americans (2%), and others (<1%). Both blacks and Native Americans increased their annual prevalence of ESRD caused by insulin-dependent diabetes mellitus by 10%, compared with only a 3.5% increase in Caucasians, whereas incidence rates increased annually by almost 8% for both blacks and Native Americans compared with a 3% increase for Caucasians. However, Caucasians received 92% of all SPK transplants, whereas all other racial groups combined received only the small remainder. Lack of private insurance and unemployment status were associated with annual changes in both the incidence of ESRD caused by type 1 diabetes and SPK transplant rates. In conclusion, we observed striking racial disparities in access to SPK transplantation in the United States today, which may be related to employment status, access to private insurance, and subsequent health care. Our preliminary data support current efforts to encourage Medicare and Medicaid coverage for all patients requiring SPK transplantation regardless of racial or financial status.
16
Abstract
BACKGROUND Numerous factors are known to impact on patient survival after renal transplantation. Recent studies have confirmed a survival advantage for renal transplant patients over those waiting on dialysis. We aimed to investigate the hypothesis that longer waiting times are more deleterious than shorter waiting times, that is, to detect a "dose effect" for waiting time. METHODS We analyzed 73,103 primary adult renal transplants registered at the United States Renal Data System Registry from 1988 to 1997 for the primary endpoints of death with functioning graft and death-censored graft failure by Cox proportional hazard models. All models were corrected for donor and recipient demographics and other factors known to affect outcome after kidney transplantation. RESULTS A longer waiting time on dialysis is a significant risk factor for death-censored graft loss and patient death with functioning graft after renal transplantation (P < 0.001 each). Relative to preemptive transplants, waiting times of 6 to 12 months, 12 to 24 months, 24 to 36 months, 36 to 48 months, and over 48 months confer a 21, 28, 41, 53, and 72% increase in mortality risk after transplantation, respectively. Relative to preemptive transplants, waiting times of 0 to 6 months, 6 to 12 months, 12 to 24 months, and over 24 months confer a 17, 37, 55, and 68% increase in risk for death-censored graft loss after transplantation, respectively. CONCLUSIONS Longer waiting times on dialysis negatively impact post-transplant graft and patient survival. These data strongly support the hypothesis that patients who reach end-stage renal disease should receive a renal transplant as early as possible in order to enhance their chances of long-term survival.
17
Induction of proinflammatory cytokines from human respiratory epithelial cells after stimulation by nontypeable Haemophilus influenzae. Infect Immun 2000; 68:4430-40. [PMID: 10899840 PMCID: PMC98342 DOI: 10.1128/iai.68.8.4430-4440.2000]
Abstract
Nontypeable Haemophilus influenzae (NTHi) causes repeated respiratory infections in patients with chronic lung diseases. These infections are characterized by a brisk inflammatory response which results in the accumulation of polymorphonuclear cells in the lungs and is dependent on the expression and secretion of proinflammatory cytokines. We hypothesize that multiple NTHi molecules, including lipooligosaccharide (LOS), mediate cellular interactions with respiratory epithelial cells, leading to the production of proinflammatory cytokines. To address this hypothesis, we exposed 9HTEo- human tracheal epithelial cells to NTHi and compared the resulting profiles of cytokine gene expression and secretion using multiprobe RNase protection assays and enzyme-linked immunosorbent assays (ELISA), respectively. Dose-response experiments demonstrated a maximum stimulation of most cytokines tested, using a ratio of 100 NTHi bacterial cells to 1 9HTEo- tracheal epithelial cell. Compared with purified LOS, NTHi bacterial cells stimulated 3.6- and 4.5-fold increases in epithelial cell expression of interleukin-8 (IL-8) and IL-6 genes, respectively. Similar results were seen with epithelial cell macrophage chemotactic protein 1, IL-1alpha, IL-1beta, and tumor necrosis factor alpha expression. Polymyxin B completely inhibited LOS stimulation but only partially reduced NTHi whole cell stimulation. Taken together, these results suggest that multiple bacterial molecules including LOS contribute to the NTHi stimulation of respiratory epithelial cell cytokine production. Moreover, no correlation was seen between NTHi adherence to epithelial cells mediated by hemagglutinating pili, Hia, HMW1, HMW2, and Hap and epithelial cytokine secretion. These data suggest that bacterial molecules beyond previously described NTHi cell surface adhesins and LOS play a role in the induction of proinflammatory cytokines from respiratory epithelial cells.
18
African-American renal transplant recipients experience decreased risk of death due to infection: possible implications for immunosuppressive strategies. Transplantation 2000; 70:375-9. [PMID: 10933166 DOI: 10.1097/00007890-200007270-00024]
Abstract
INTRODUCTION African-American renal transplant recipients tend to experience more acute rejection episodes and have shorter graft survival than Caucasian renal transplant recipients. Various factors have been posited to be responsible for this difference, including relative under-immunosuppression. We reasoned that by looking at the balance of acute rejections versus death due to infection, we could ascertain whether African-American renal recipients might have more reserve to tolerate an increase in pharmacological immunosuppression. METHODS We analyzed the United States Renal Data System (USRDS) data from 1987 to 1997 regarding acute rejection episodes and infectious deaths. All other pertinent factors were gathered for a multivariate analysis. A total of 68,885 adult renal transplant recipients were analyzed. RESULTS When corrected for all covariates, the relative risk for acute rejection was higher (1.3) while the relative risk for infectious death was lower (0.7) in African-Americans compared with Caucasians (P<0.01). CONCLUSION Our study would indicate that relative to Caucasians, African-American renal transplant recipients are at decreased risk for infectious death and therefore may tolerate the more intensive immunosuppression that may be necessary to narrow the gap in acute rejection rates between African-American and Caucasian renal transplant recipients.
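The comparison above rests on relative risk estimates. As a reminder of the crude (unadjusted) form of that statistic, here is a minimal Python sketch; the counts are invented for illustration and are not USRDS data, and the published estimates were additionally adjusted for covariates:

```python
def relative_risk(events_exposed, n_exposed, events_unexposed, n_unexposed):
    """Crude relative risk: the ratio of event incidence in the
    exposed group to the incidence in the unexposed group."""
    incidence_exposed = events_exposed / n_exposed
    incidence_unexposed = events_unexposed / n_unexposed
    return incidence_exposed / incidence_unexposed

# Hypothetical counts: 130 rejections per 1,000 in one group
# versus 100 per 1,000 in the other gives RR = 1.3.
rr = relative_risk(130, 1000, 100, 1000)
```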
|
19
|
Abstract
BACKGROUND The elderly are the fastest growing segment of the end stage renal disease (ESRD) population. Older renal transplant recipients experience fewer acute rejection episodes than do younger patients. Despite this, death censored graft survival is no better in these older transplant recipients than in younger recipients. We examined the United States Renal Data System (USRDS) database to determine whether recipient age itself has an independent effect on the development of chronic allograft failure (CAF). METHODS We analyzed 59,509 patients from the files of the USRDS. To determine whether age was an independent risk factor for CAF, the population was analyzed separately for Caucasians, African-Americans, and other ethnic groups. All renal transplant recipients from 1988 to 1997 were examined. Both univariate and multivariate analyses were performed using chronic allograft failure as the outcome of interest. RESULTS Actuarial 8-year censored graft survival was significantly decreased in the older age groups: 67% for ages 18-49 vs. 61.8% for ages 50-64 vs. 50.7% for ages 65+ (P<0.001). In the multivariate analysis, recipient age was a strong and independent risk factor for the development of chronic allograft failure in Caucasians (RR 1.29 for ages 50-64, RR 1.67 for ages older than 65). These findings were reinforced by an analysis that was restricted to living donor transplants without acute rejection. CONCLUSION In Caucasians, increased recipient age is an independent risk factor for the development of chronic renal allograft failure.
|
20
|
Impact of pre-existing donor hypertension and diabetes mellitus on cadaveric renal transplant outcomes. Am J Kidney Dis 2000; 36:153-9. [PMID: 10873885 DOI: 10.1053/ajkd.2000.8288] [Citation(s) in RCA: 75] [Impact Index Per Article: 3.1] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/11/2022]
Abstract
Hypertension (HTN) and diabetes mellitus (DM) predispose to systemic atherosclerosis with renal involvement. The prevalence of HTN and DM in cadaveric renal donors (affected donors) and the results of transplantation are unknown. We investigated these issues with national data from the US Renal Data System. A total of 4,035 transplants from affected donors were matched 1:1 with unaffected controls according to donor age and race, recipient race, and year of transplantation. Graft and patient survival were estimated. Among the 25,039 solitary renal transplantations performed between July 1, 1994, and June 30, 1997, cadaveric renal transplants from donors with HTN accounted for 15%, and donors with DM, 2%. Programs with 1-year cadaveric renal graft survival rates greater than 90% had 50% fewer affected donors compared with programs having 1-year cadaveric renal graft survival rates of 85% or less. Compared with donor-age-matched controls, transplants from affected donors were at minimally increased risk for primary nonfunction, delayed graft function, and acute rejection. Three-year graft survival rates were 71% in affected donor organs and 75% in controls (P = 0.001). Compared with controls, duration of HTN was an independent risk factor for graft survival (3-year graft survival rates, 75% versus 65%; relative risk = 1.36 for HTN >10 years; P < 0.001). A substantial fraction of cadaveric renal donors have preexisting HTN. Programs transplanting fewer affected donor kidneys had better than average results. Because the negative impact of donor HTN and DM on transplant outcome was of moderate degree except when the duration of donor HTN was greater than 10 years, use of affected donors should not be discouraged, but graft and patient survival analyses should account for their presence.
|
21
|
Abstract
BACKGROUND Mycophenolate mofetil (MMF) has been shown to significantly decrease the number of acute rejection episodes in renal transplant recipients during the first year. A beneficial effect of MMF on long-term graft survival has been more difficult to demonstrate. This beneficial effect has not been detected, despite the impact of acute rejection on the development of chronic allograft nephropathy and experimental evidence that MMF may have a salutary effect on chronic allograft nephropathy independent of that of rejection. METHODS Data on 66,774 renal transplant recipients from the U.S. renal transplant scientific registry were analyzed. Patients who received a solitary renal transplant between October 1, 1988 and June 30, 1997 were studied. Cox proportional hazards regression was used to estimate relevant risk factors. Kaplan-Meier analysis was performed for censored graft survival. RESULTS MMF decreased the relative risk for development of chronic allograft failure (CAF) by 27% (risk ratio [RR] 0.73, P<0.001). This effect was independent of its outcome on acute rejection. Censored graft survival using MMF versus azathioprine was significantly improved by Kaplan-Meier analysis at 4 years (85.61% vs. 81.9%). The effect of an acute rejection episode on the risk of developing CAF seems to be increasing over time (RR=1.9, 1988-91; RR=2.9, 1992-94; RR=3.7, 1995-97). CONCLUSION MMF therapy decreases the risk of developing CAF. This improvement is only partly caused by the decrease in the incidence of acute rejection observed with MMF, but also by an effect independent of acute rejection.
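The censored graft survival figures above come from Kaplan-Meier analysis. The product-limit estimator behind such curves can be sketched in plain Python; the follow-up times and event flags below are invented for illustration and are not the registry data:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier product-limit estimate of the survival function.
    times: follow-up time for each subject.
    events: 1 = failure observed, 0 = censored.
    Returns (event_times, survival_probabilities) at each failure time."""
    data = sorted(zip(times, events))  # order subjects by follow-up time
    n_at_risk = len(data)
    surv = 1.0
    out_times, out_surv = [], []
    i = 0
    while i < len(data):
        t = data[i][0]
        # Failures and total subjects leaving the risk set at time t
        d = sum(e for (tt, e) in data if tt == t)
        n_leaving = sum(1 for (tt, _) in data if tt == t)
        if d > 0:
            surv *= 1.0 - d / n_at_risk  # product-limit update
            out_times.append(t)
            out_surv.append(surv)
        n_at_risk -= n_leaving
        while i < len(data) and data[i][0] == t:
            i += 1
    return out_times, out_surv

# Five hypothetical subjects: failures at t=1, 2, 3; censored at t=2, 4.
t, s = kaplan_meier([1, 2, 2, 3, 4], [1, 1, 0, 1, 0])
# s == [0.8, 0.6, 0.3]
```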
|
23
|
Abstract
UNLABELLED Long-term survival in renal transplant recipients with graft function. BACKGROUND Death with graft function (DWGF) is a common cause of graft loss. The risks and determinants of DWGF have not been studied in a recent cohort of renal transplant recipients. We performed a population-based survival analysis of U.S. patients with end-stage renal disease (ESRD) transplanted between 1988 and 1997. METHODS Registry data were used to evaluate long-term patient survival and cause-specific risks of DWGF in 86,502 adult (≥18 years) renal transplant recipients. RESULTS Out of 18,482 deaths, 38% (N = 7040) were deaths with graft function. This accounts for 42.5% of all graft loss. Patient survival with graft function was 97, 91, and 86% at 1, 5, and 10 years, respectively. The risk of DWGF decreased by 67% (RR = 0.33, P < 0.001) between 1988 and 1997. The adjusted rate of DWGF was 4.6, 0.8, 2.2, and 1.4 deaths per 1000 person-years for cardiovascular disease, stroke, infections, and malignancy, respectively. The suicide rate was 15.7 versus 9.0 deaths per 100,000 person-years in the general population (P < 0.001). In multivariate analysis, the following factors were independently and significantly predictive of DWGF: white recipient, age at transplantation, ESRD caused by hypertension or diabetes mellitus, length of pretransplant dialysis, delayed graft function, acute rejection, panel reactive antibody >30%, African American donor race, age >45 years, and donor death caused by cerebrovascular disease. CONCLUSIONS Patients with graft function have a high long-term survival. Although DWGF is a major cause of graft loss, the risk has declined substantially since 1990. Cardiovascular disease was the predominant reported cause of DWGF. Other causes vary by post-transplant time period. Attention to atherosclerotic risk factors may be the most important challenge to further improve the longevity of patients with successful renal transplants.
|
24
|
Cervical spine injury: a clinical decision rule to identify high-risk patients for helical CT screening. AJR Am J Roentgenol 2000; 174:713-7. [PMID: 10701614 DOI: 10.2214/ajr.174.3.1740713] [Citation(s) in RCA: 103] [Impact Index Per Article: 4.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/18/2022]
Abstract
OBJECTIVE We aimed to validate the routine use of a clinical decision rule to direct diagnostic imaging of adult blunt trauma patients at high risk for cervical spine injury. MATERIALS AND METHODS We previously developed and have since routinely used a prediction rule based on six clinical parameters to identify patients at greater than 5% risk of cervical spine injury to undergo screening helical CT of the cervical spine. During a 6-month period, 4285 screening imaging studies of the cervical spine were performed in adult blunt trauma patients. Six hundred one patients (398 males, 203 females; age range, 16-100 years; median age, 38 years) underwent helical CT, and the remainder underwent 3684 conventional radiographic examinations. Clinical and report data were extracted from the radiology department database, medical records, and the hospital trauma registry. Abnormal findings were independently confirmed by additional imaging studies, autopsy results, or clinical outcome. RESULTS The true-positive cervical spine injury rates in helical CT- and conventional radiography-screened patients who presented directly to our trauma center were 40 (8.7%) of 462 and seven (0.2%) of 3684, respectively. The cervical spine injury rate in patients who were transferred from outside institutions to our trauma center and who underwent helical CT was 37 (26.6%) of 139. This figure included 20 patients already known to have cervical spine fracture. CONCLUSION The clinical decision rule can distinguish patients at high and low risk of cervical spine injury, thus supporting its validity.
|
25
|
Growth, carcass characteristics, and incidence of ascites in broilers exposed to environmental fluctuations and oiled litter. Poult Sci 2000; 79:324-30. [PMID: 10735197 DOI: 10.1093/ps/79.3.324] [Citation(s) in RCA: 8] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/12/2022] Open
Abstract
The effects of diurnal temperature fluctuations and removal of respirable dust, by application of canola oil to straw litter, on growth, carcass traits, and the degree of ascites were evaluated with 1,200 male broilers studied in two replicated 6-wk trials. Each trial used four pens of 150 birds. The temperature treatment consisted of a fluctuation of 3 C in temperature above the required temperature during the day (0600 to 1800 h) and 3 C below the required temperature at night (1800 to 0600 h) for a 6 C change in daily temperature. The control temperature was constant. All pens had the same mean daily temperature. In each trial, one control temperature pen and one fluctuation temperature pen received bi-weekly applications of canola oil to the litter (1.1 L/m2 of oil over 6 wk). At 6 wk of age, 30 birds from each pen were killed for determination of breast muscle, fatpad, and heart weights. All birds were scored for lesions of ascites at time of processing. A score of 0 or 1 represented slight pericardial effusion, slight pulmonary congestion, and edema. A score of 4 represented birds with marked accumulation of ascitic fluid in one or more coelomic cavities (other than the pericardium) and advanced liver lesions. A cross-sectional image of each 4-mm heart slice (cross-section of the ventricles) was digitally recorded, and with image analysis we determined the right ventricular area (RVA), left ventricular area (LVA), and total heart area (HA). The final BW of the broilers were significantly different: the oiled-litter treatment (2,249 g) had a lower weight gain compared with the nonoiled-litter treatment (2,293 g). There were no differences in fatpad weight, shank length, lung weight, and percentage breast muscle between the main treatments. The Pectoralis minor and Pectoralis major weights were significantly heavier in the temperature fluctuation treatment than in the control temperature treatment, by 3.0 and 12.0 g, respectively.
The birds subjected to the control temperature treatment had a lower right ventricular weight (RVW) than the birds subjected to the fluctuating temperature treatment. Temperature fluctuations also resulted in a 1.4% increase in the incidence of mortality. Temperature fluctuations negatively impacted broiler growth due to heat loss, as did excessive litter oiling.
|
28
|
Abstract
We assessed a new dual-energy bone densitometer, the PRODIGY, that uses a narrow-angle fan-beam (4.5 degrees) oriented parallel to the longitudinal axis of the body (i.e., perpendicular to the usual orientation). High-resolution scans across the body can be stepped at 17 mm intervals. The energy-sensitive array detector uses cadmium zinc telluride, which allowed rapid photon counting. Spine and femur scans required 30 s, and total-body scans required 4-5 min; the dose was only 3.7 mrem and 0.04 mrem, respectively, or about 5 to 10 times lower than conventional fan-beam densitometry. We found only a small influence of soft-tissue thickness on bone mineral density (BMD) results. There was also a small (±1%) influence of height above the tabletop on BMD results. A software correction for object height allowed a first-order correction for the large magnification effects of position on bone mineral content (BMC) and area. Consequently, the results for BMC and area, as well as BMD, with PRODIGY corresponded closely to those obtained using the predecessor DPX densitometer, both in vitro and in vivo; there was a generally high correlation (r = 0.98-0.99) for BMD values. Spine and femur values for BMC, area and BMD averaged within 0.5% in vivo (n = 122), as did total-body BMC and BMD (n = 46). PRODIGY values for total-body lean tissue and fat also corresponded within 1% to DPX values. Regional and total-body BMD were measured with 0.5% precision in vitro and 1% precision in vivo. The new PRODIGY densitometer appears to combine the low dose and high accuracy of pencil-beam densitometry with the speed of fan-beam densitometers.
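The height correction mentioned above follows from point-source projection geometry. A sketch of the general idea, using textbook fan-beam magnification with made-up distances, not the PRODIGY's actual algorithm:

```python
def magnification(source_to_detector, source_to_object):
    """A point source magnifies an object's projection by the ratio
    of source-to-detector to source-to-object distance (m = SDD/SOD)."""
    return source_to_detector / source_to_object

def height_corrected_area(measured_area, source_to_detector, source_to_object):
    """Projected area scales with the square of the linear magnification,
    so dividing by m**2 gives a first-order correction for the object's
    position between source and detector."""
    m = magnification(source_to_detector, source_to_object)
    return measured_area / m ** 2

# Hypothetical geometry: SDD = 100 cm, bone at 80 cm from the source.
# A projected area of 125 cm**2 corrects to a true area of 80 cm**2.
true_area = height_corrected_area(125.0, 100.0, 80.0)
```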
|
29
|
Abstract
Both femora were measured on 61 normal adults using dual X-ray absorptiometry (DXA). In a subset of 31 subjects, each femur was scanned once using the conventional leg-positioning device supplied with the densitometer, and once using a new positioning device and software that allowed both legs to be measured simultaneously. In another subgroup (n = 30), subjects were measured three times using the new dual-femur approach to better assess precision error. The data were analyzed for differences owing to the different positioning devices and for differences between right and left sides. The correlation between results with the old and new positioners was high (r > 0.99, standard error of the estimate [SEE] = 0.01-0.02 g/cm²). There was no significant difference in the average bone mineral density (BMD) values between the old and new positioner. The precision errors for each femur alone with the dual-femur approach were similar to those reported for the single-femur scans (1 to 2%), but the precision errors for the combined femora were reduced by 30% as expected. The correlation between right and left sides was high (r = 0.94-0.96), and the SEE in predicting one side from the other was moderate for total, trochanteric, and femoral neck BMD (0.05, 0.05, and 0.06 g/cm², respectively). These SEE equate to about 0.5 standard deviation in terms of T-score. Differences in many individual cases between the right and left sides were significantly greater than the precision error. The new dual-femur software and leg positioner allows rapid measurement and analysis of both femora, thereby eliminating the uncertainty between sides.
|
30
|
Abstract
BACKGROUND/AIMS The purpose of this study was to investigate the frequency and characteristics of two hemodialysis sessions/week, to identify factors which influence or predict this prescription, and to examine the outcomes of patients receiving hemodialysis two times/week as compared to the more common treatment of three times/week. METHODS Data from a national sample of 15,067 adult hemodialysis patients were utilized to compare twice-weekly with thrice-weekly therapy by logistic regression. RESULTS Patients treated less than one year were more likely to be treated twice-weekly (6.1%) than patients on dialysis for one year or more (2.7%) (AOR = 1.49, p = 0.002). Treatment schedules also varied significantly by geographic region. Factors predictive of twice-weekly hemodialysis (p < 0.05) were older age, Caucasian race, female gender, higher serum albumin, lower serum creatinine levels, and lower body mass index. A higher estimated renal function at the start of ESRD was also predictive of a twice-weekly schedule among incident patients (AOR = 1.05, p = 0.05). In addition, Cox-adjusted survival analysis indicated a lower mortality risk (RR = 0.76, p = 0.02) for twice-weekly hemodialysis compared to thrice-weekly among prevalent patients. For incident patients, however, the results were not significant when adjusted for GFR at ESRD onset (RR = 0.85, p = 0.31). CONCLUSION Geographic differences in prescribed treatment remained unexplained by measured characteristics. The survival advantage associated with twice-weekly hemodialysis is likely to be related to patient selection and greater residual renal function.
|
32
|
Abstract
BACKGROUND Renal vascular thrombosis (RVT) is a rare but catastrophic complication of renal transplantation. Although a plethora of risk factors has been identified, a large proportion of cases of RVT remains unexplained. Uremic coagulopathy and dialysis modality may predispose to RVT. We investigated the impact of the pretransplant dialysis modality on the risk of RVT in adult renal transplant recipients. METHODS Renal transplant recipients (age 18 years or more) who were enrolled in the national registry between 1990 and 1996 (N = 84,513) were evaluated for RVT occurring within 30 days of transplantation. Each case was matched with two controls from the same transplant center and with the year of transplantation. The association between RVT and 18 factors was studied with multivariate conditional logistic regression. RESULTS Forty-nine percent of all cases of RVT (365 out of 743) occurred in repeat transplant recipients, with an adjusted odds ratio (OR) of 5.72 compared with first transplants (P < 0.001). The odds of RVT were significantly higher in peritoneal dialysis (PD)-treated than in hemodialysis (HD)-treated patients (OR = 1.87, P = 0.001). Change in dialysis modality was an independent predictor of RVT: switching from HD to PD (OR = 3.59, P < 0.001) and from PD to HD (OR = 1.62, P = 0.047). Compared with primary transplant recipients on HD (OR = 1.00), the highest odds of RVT were in repeat transplant recipients treated with PD (OR = 12.95, P < 0.001) and HD (OR = 4.50, P < 0.001). Other independent predictors of RVT were preemptive transplantation, relatively young and old donor age, diabetes mellitus and systemic lupus erythematosus as causes of end-stage renal disease, recipient gender, and lower panel reactive antibody levels (PRAs). CONCLUSIONS The strongest risk factors for RVT were retransplantation and prior PD treatment.
Prevention of RVT with perioperative anticoagulation should be studied in patients who have a constellation of the identified risk factors.
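The matched analysis above used multivariate conditional logistic regression. The unadjusted building block it generalizes is the 2x2-table odds ratio, which can be sketched with Woolf's (logit) confidence interval; the counts below are invented for illustration, not the registry data:

```python
import math

def odds_ratio(a, b, c, d):
    """Crude odds ratio with a 95% CI by Woolf's method for a 2x2 table:
    a = exposed cases, b = unexposed cases,
    c = exposed controls, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - 1.96 * se_log_or)
    hi = math.exp(math.log(or_) + 1.96 * se_log_or)
    return or_, (lo, hi)

# Hypothetical table: 20 exposed and 10 unexposed cases,
# 10 exposed and 20 unexposed controls gives OR = 4.0.
or_, ci = odds_ratio(20, 10, 10, 20)
```

Conditional logistic regression, unlike this crude estimate, conditions on the matched sets (here, two controls per case from the same center and transplant year) and adjusts for covariates simultaneously.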
|
33
|
Growth performance, carcass characteristics, and the incidence of ascites in broilers in response to feed restriction and litter oiling. Poult Sci 1999; 78:522-8. [PMID: 10230904 DOI: 10.1093/ps/78.4.522] [Citation(s) in RCA: 36] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/14/2022] Open
Abstract
The effect of feed restriction and the application of canola oil to broiler straw litter to contain respirable dust on growth performance, carcass traits, and the incidence of ascites was evaluated with 800 male broilers studied in two 6-wk periods. Two pens of birds were feed restricted. Two pens of birds received feed ad libitum for the 6-wk trial. One restricted and one ad libitum pen received biweekly addition of canola oil to the litter. At 6 wk of age, 30 birds from each pen were killed for determination of breast muscle, fat pad, and heart weights. All birds were scored for the incidence of ascites at processing. A cross-sectional image of each heart was digitally recorded and, using image analysis, the right ventricular area (RVA), left ventricular area (LVA), and total heart area (HA) were determined. The right ventricular wall was removed and its weight was expressed as a percentage of total heart weight (PRVW). The 40-d BW was significantly greater in the ad libitum birds (2.07 kg) than in the feed-restricted birds (1.86 kg). The right ventricular weight (RVW) (1.69 and 1.92 g) and the RVA (0.35 and 0.40 cm2) were also significantly different between the two feeding treatments. The ascites score was significantly correlated to the RVW (r = 0.50) and RVA (r = 0.52). The RVA was also correlated to the RVW (r = 0.63). Oiling the litter did not result in differences in carcass characteristics. Litter oiling significantly reduced the RVA of the ad libitum birds (0.36 cm2) compared to the ad libitum birds that did not have oiled litter (0.44 cm2). Feed restriction reduced the incidence of ascites, but also reduced gain. Litter oiling in the feed-restricted groups reduced the RVA, but did not reduce mortality.
|
34
|
CT appearance of the adrenal glands in adrenocorticotrophic hormone-dependent Cushing's syndrome. AJR Am J Roentgenol 1999; 172:997-1002. [PMID: 10587135 DOI: 10.2214/ajr.172.4.10587135] [Citation(s) in RCA: 69] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/18/2022]
Abstract
OBJECTIVE The purpose of this study was to describe the size and appearance of the adrenal glands on CT in patients with adrenocorticotrophic hormone (ACTH)-dependent Cushing's syndrome and to correlate gland dimensions with circulating cortisol and ACTH levels. MATERIALS AND METHODS We retrospectively analyzed clinical, biochemical, and imaging data for 53 patients referred for CT of the adrenals as part of an investigation for ACTH-dependent Cushing's syndrome at our institution between 1988 and 1997. Two observers, who were unaware of the endocrine data, measured the body and limb widths of the adrenal glands using an electronic cursor. RESULTS Of the 53 patients, 37 (70%) were shown to have enlarged adrenal glands on CT. The mean width of the adrenal limbs positively correlated with the circulating cortisol and ACTH levels. The adrenals were larger in patients with ectopic ACTH syndrome than in patients with Cushing's disease (p < .02). Ten patients (19%) had nodules that were 10 mm or greater in diameter. CONCLUSION The adrenal glands were often enlarged in patients with ACTH-dependent Cushing's syndrome, and the enlargement could be quantified on CT. However, having normal-sized adrenals (observed in 30% of the patients in our study) did not preclude such a diagnosis. We found that adrenal limb width positively correlates with ACTH and cortisol levels.
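The reported association between limb width and hormone levels is a correlation calculation; the abstract does not name the exact statistic, but assuming a plain Pearson product-moment correlation, it can be sketched as:

```python
def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient between two
    equal-length sequences of measurements."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Perfectly proportional made-up measurements correlate at r = 1.0.
r = pearson_r([1.0, 2.0, 3.0, 4.0], [2.0, 4.0, 6.0, 8.0])
```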
|
35
|
The R-type pyocin of Pseudomonas aeruginosa C is a bacteriophage tail-like particle that contains single-stranded DNA. Infect Immun 1999; 67:717-25. [PMID: 9916082 PMCID: PMC96378 DOI: 10.1128/iai.67.2.717-725.1999] [Citation(s) in RCA: 22] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/21/2022] Open
Abstract
Pseudomonas aeruginosa R-type pyocin particles have been described as bacteriocins that resemble bacteriophage tail-like structures. Because of their unusual structure, we reexamined whether they contained nucleic acids. Our data indicated that pyocin particles isolated from P. aeruginosa C (pyocin C) contain DNA. Probes generated from this DNA by the random-primer extension method hybridized to distinct bands in restriction endonuclease-digested P. aeruginosa C genomic DNA. These probes also hybridized to genomic DNA from 6 of 18 P. aeruginosa strains that produced R-type pyocins. Asymmetric PCR, complementary oligonucleotide hybridization, and electron microscopy indicated that pyocin C particles contained closed circular single-stranded DNA, approximately 4.0 kb in length. Examination of total intracellular DNA from mitomycin C-induced cultures revealed the presence of two extrachromosomal DNA molecules, a double-stranded molecule and a single-stranded molecule, which hybridized to pyocin DNA. Sequence analysis of 7,480 nucleotides of P. aeruginosa C chromosomal DNA containing the pyocin DNA indicated the presence of pyocin open reading frames with similarities to open reading frames from filamentous phages and cryptic phage elements. We did not observe any similarities to known phage structural proteins or previously characterized pseudomonal prt genes expressing R-type pyocin structural proteins. These studies demonstrate that pyocin particles from P. aeruginosa C are defective phages that contain a novel closed circular single-stranded DNA and that this DNA was derived from the chromosome of P. aeruginosa C.
|
36
|
Abstract
Computed tomography (CT) evaluation of the thymus and anterior mediastinum is an important aspect of the investigation of patients with ACTH-dependent Cushing's syndrome in order to exclude an ACTH-secreting carcinoid tumor. We have reviewed the CT imaging of the thymus and anterior mediastinum in a series of 85 patients (55 females; median age 41, range 7-77 yr) with active Cushing's syndrome as there are few data on the range of appearances in hypercortisolemic states. One patient had a thymic carcinoid tumor (24 x 18 mm). Of the others, 28/84 (33%) patients showed thymic remnant tissue, consisting of either nodule(s) at least 5 mm diameter (n = 21, mean diameters 12.5 ± 5 x 9.6 ± 4 mm), or triangular bilobed glands (n = 7, mean thickness of the body, right and left limbs 25 ± 7, 14 ± 3, and 12 ± 5 mm). Thymic involution appeared in 56/84 (67%) patients, ranging from small nodule(s) of less than 5 mm diameter to linear soft tissue strands and complete fatty replacement. Patients with thymic remnant tissue were younger than those with thymic involution (P < 0.05). The thymic carcinoid tumor could be distinguished from remnant tissue on the basis of age and size. The presence of anterior mediastinal nodule(s) in hypercortisolemia need not imply the presence of a thymic carcinoid tumor, although in older patients this should arouse suspicion.
|
37
|
Analysis of right ventricular areas to assess the severity of ascites syndrome in broiler chickens. Poult Sci 1999; 78:62-5. [PMID: 10023748 DOI: 10.1093/ps/78.1.62] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/14/2022] Open
Abstract
Ascites syndrome in broiler chickens is defined as a condition associated with pulmonary hypertension leading to right heart failure, increased central venous pressure, passive congestion of the liver, and accumulations of serous fluids in body cavities. The syndrome is currently seen in fast-growing broiler chickens associated with an increase in the weight, volume, and area of the right ventricle of the heart. The ratio of the right ventricle weight to the total heart mass has been used to assess the consequences of increased blood pressure. The right ventricle area (RVA) can be quantified using image analysis technology. Hearts were removed from 719 male broilers at slaughter (42 d). All birds were visually scored for the incidence of ascites. A score of 0 or 1 represented slight hydropericardium, slight right heart hypertrophy, and slight edema. A score of 4 was assigned to birds with marked accumulation of ascitic fluid in one or more coelomic cavities, pronounced dilation of the right heart, and prominent liver lesions. A cross-sectional image of each heart slice (a 4-mm-thick slice of the ventricles) was digitally recorded. Using image analysis software, the RVA, left ventricular area (LVA), and total heart area (HA) were determined. Because a slice of the heart was used in image analysis, the importance of maintaining the original shape was determined. Twenty hearts in five ranges of RVA size were scanned in four different positions, which have differing heart slice orientations and differing RVA shapes, for a comparison of positioning technique (placement) relating to the RVA. The shape of the heart slice for image analysis was observed not to be critical for the small RVA. For heart slices with large RVA values, it was found to be critical to analyze the heart slice in a standardized placement.
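At its core, the image-analysis step behind RVA, LVA, and HA amounts to counting labeled pixels in the digitized cross-section. A minimal sketch of that idea; the 0/1/2 labeling scheme is hypothetical, not the software the authors used:

```python
def area_fraction(mask, label):
    """Fraction of the total heart cross-section occupied by one region.
    mask is a 2-D list of per-pixel labels in a hypothetical scheme:
    0 = background, 1 = right ventricle, 2 = rest of the heart.
    Multiplying pixel counts by the physical area per pixel would
    give absolute areas such as RVA, LVA, and HA."""
    region_pixels = sum(row.count(label) for row in mask)
    heart_pixels = sum(len(row) - row.count(0) for row in mask)
    return region_pixels / heart_pixels

# Toy 3x3 cross-section: 2 right-ventricle pixels out of 6 heart pixels.
mask = [[0, 1, 1],
        [2, 2, 2],
        [0, 0, 2]]
rva_fraction = area_fraction(mask, 1)  # 2/6
```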
Collapse
|
38
|
What is your diagnosis? Foreign body obstruction of the duodenum with reflux of barium sulfate into the gallbladder and intrahepatic bile ducts. J Am Vet Med Assoc 1998; 213:1257-8. [PMID: 9810377] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [MESH Headings] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 02/09/2023]
|
39
|
|
40
|
Characterization of highly radiosensitive cell lines from a human ovarian small-cell cancer. Gynecol Oncol 1997; 67:147-53. [PMID: 9367698 DOI: 10.1006/gyno.1997.4840] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 02/05/2023]
Abstract
Cells were obtained at paracentesis from a patient with a rapidly growing ovarian tumor. A monolayer cell line (V7S), a xenograft tumor line (V7), and subsequently a xenograft-derived monolayer cell line (V7M) were established. Histological and immunohistochemical studies of the original tumor, xenograft, and cell lines provided a diagnosis of small-cell carcinoma of the ovary, which is consistent with the clinical course of the patient. V7S and V7M had a predominantly hypodiploid karyotype with a small tetraploid population. V7M, which has been in long-term culture, also showed a nonrandom translocation involving chromosomes 1 and 14 and monosomy of X. Radiobiologically, V7S, V7, and V7M showed marked radiosensitivity, with surviving fractions at 2 Gy, measured by clonogenic assay, of between 0.022 and 0.147. Split-dose experiments provided evidence that this radiosensitivity was not due to a deficiency in cellular repair. In vivo data from the xenograft (V7) revealed a highly radiosensitive tumor, corroborating the in vitro studies.
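The surviving fraction at 2 Gy (SF2) quoted above follows the standard clonogenic-assay arithmetic: colonies counted after irradiation are normalised to the number of cells plated and to the plating efficiency of unirradiated controls. A minimal sketch with illustrative numbers (not taken from the paper):

```python
def plating_efficiency(colonies, cells_plated):
    """Fraction of unirradiated cells that form colonies."""
    return colonies / cells_plated

def surviving_fraction(colonies, cells_plated, pe):
    """Clonogenic surviving fraction, normalised to plating efficiency."""
    return (colonies / cells_plated) / pe

# Illustrative counts only: control dish, then a dish irradiated with 2 Gy.
pe = plating_efficiency(colonies=50, cells_plated=100)      # PE = 0.5
sf2 = surviving_fraction(colonies=11, cells_plated=500, pe=pe)
# (11/500) / 0.5 = 0.044, within the 0.022-0.147 range reported above
```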
Collapse
|
41
|
Escherichia coli Associated Cellulitis in Broilers: Correlation with Systemic Infection and Microscopic Visceral Lesions, and Evaluation for Skin Trimming. Avian Dis 1997. [DOI: 10.2307/1592349] [Citation(s) in RCA: 10] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/10/2022]
|
42
|
Escherichia coli associated cellulitis in broilers: correlation with systemic infection and microscopic visceral lesions, and evaluation for skin trimming. Avian Dis 1997; 41:935-40. [PMID: 9454929] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 02/06/2023]
Abstract
In Alberta, cellulitis condemnations average 0.5% and are among the highest in Canada. Presently, all cellulitis-affected birds are condemned for fear of systemic infections and public health implications. In a slaughterhouse sample of 102 birds condemned with cellulitis, Escherichia coli was isolated from 83.3% of the lesions. All hearts were cultured, and E. coli was recovered from 11.2%. Gross lesions of perihepatitis, infected oviducts, and arthritis were found in 11.2%, 6.7%, and 2.9% of the birds, respectively. Serotyping suggested that visceral infection occurs independently of cellulitis in at least half of the cases. There was no correlation between microscopic visceral lesions and positive bacterial cultures. Two E. coli isolates of serogroup O157 produced no toxin, and neither isolate produced CS31A, F107, or F1845 fimbriae. Cellulitis lesions ranged from 0.55 to 218.9 cm2. All lesions under 16 cm2 and 64% of lesions up to 48 cm2 were considered suitable for trimming.
Collapse
|
43
|
The influence of experience on the reliability of goniometric and visual measurement of forefoot position. J Orthop Sports Phys Ther 1997; 25:192-202. [PMID: 9048325 DOI: 10.2519/jospt.1997.25.3.192] [Citation(s) in RCA: 41] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Indexed: 02/07/2023]
Abstract
Goniometric measurement of forefoot position relative to the rearfoot is a routine procedure used by rehabilitation specialists. This measurement is also frequently made by visual estimation. The influence of tester experience on the reliability of these two techniques at the forefoot is unknown. The purpose of this investigation was to directly examine the reliability of goniometric and visual estimation of forefoot position measurements when experienced and inexperienced testers perform the evaluation. Two clinicians (≥10 years' experience) and two physical therapy students were recruited as testers. Ten subjects (20-31 years old), free from pathology, were measured. Each foot was evaluated twice with the goniometer and twice with visual estimation by each tester. The intraclass correlation coefficient (ICC) and the method error, expressed as a coefficient of variation, were used as estimates of reliability. There was no dramatic difference in intratester or intertester reliability between experienced and inexperienced testers, regardless of the evaluation technique used. Estimates of intratester reliability [ICC(2,1)], when using the goniometer, ranged from 0.08 to 0.78 for the experienced examiners and from 0.16 to 0.65 for the inexperienced examiners. When using visual estimation, ICC(2,1) values ranged from 0.51 to 0.76 for the experienced examiners and from 0.53 to 0.57 for the inexperienced examiners. The estimate of intertester reliability [ICC(2,2)] for the goniometer was 0.38 for the experienced examiners and 0.42 for the inexperienced examiners. When using visual estimation, ICC(2,2) values were 0.81 for the experienced examiners and 0.72 for the inexperienced examiners. Although experience does not appear to influence forefoot position measurements, of the two evaluation techniques, visual estimation may be the more reliable.
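The ICC(2,1) and ICC(2,2) statistics quoted above come from the Shrout and Fleiss two-way random-effects model, in which the single-rater and average-of-k-raters coefficients are built from the same ANOVA mean squares. A pure-Python sketch of that computation (the toy data are illustrative, not from the study):

```python
def icc2(data):
    """ICC(2,1) and ICC(2,k) (Shrout & Fleiss) for an n-subjects x k-raters table."""
    n, k = len(data), len(data[0])
    grand = sum(map(sum, data)) / (n * k)
    row_means = [sum(row) / k for row in data]                       # per subject
    col_means = [sum(data[i][j] for i in range(n)) / n for j in range(k)]  # per rater
    msr = k * sum((m - grand) ** 2 for m in row_means) / (n - 1)     # subjects MS
    msc = n * sum((m - grand) ** 2 for m in col_means) / (k - 1)     # raters MS
    mse = sum((data[i][j] - row_means[i] - col_means[j] + grand) ** 2
              for i in range(n) for j in range(k)) / ((n - 1) * (k - 1))
    icc21 = (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)
    icc2k = (msr - mse) / (msr + (msc - mse) / n)
    return icc21, icc2k

# Three subjects, two raters; rater 2 scores a constant 1 unit higher.
scores = [[1, 2], [2, 3], [3, 4]]
single, average = icc2(scores)
```

With k = 2 raters, ICC(2,k) is the intertester ICC(2,2) reported above; note how the systematic offset between raters penalises the single-rater coefficient more than the average-rater one.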
Collapse
|
44
|
Determination of radiosensitivity in established and primary squamous cell carcinoma cultures using the micronucleus assay. Eur J Cancer 1997; 33:453-62. [PMID: 9155532 DOI: 10.1016/s0959-8049(97)89022-3] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 02/04/2023]
Abstract
In this study, the cytokinesis-block micronucleus assay (CBMN) was used to measure radiosensitivity in three established cell lines (SCC-61, V175 and V134) and 10 primary cell cultures of squamous cell carcinoma (SCC) of the head and neck. Assessment involved optimisation of the assay to determine cytochalasin-B (CB) concentration and sampling time postirradiation. A much closer correlation between dose-response data measured in the clonogenic and micronucleus assays was found when the micronucleus assay was performed under standardised conditions for each cell line (2 micrograms/ml CB; 48 h postirradiation) rather than under the predetermined optimised assay conditions. This indicates that, for these SCC cell lines, the CBMN assay may be able to predict in vitro radiosensitivity. To be of clinical use in predicting radiosensitivity, the CBMN assay also needs to be evaluated with primary cell cultures. In this study, no relationship was seen between micronucleus frequency at 2 or 6 Gy and patient clinical outcome 12 months following surgery and radiotherapy. Similarly, no association was observed between patient outcome and tumour stage, nodal stage or histology. These CBMN assay data from the primary cell cultures are presently inconclusive as a measure of patient tumour radiosensitivity.
Collapse
|
45
|
Abstract
This article outlines the ability of imaging techniques to stage intrathoracic non-small-cell lung cancer, particularly the extent of primary tumour (T stage), and the presence of nodal metastases (N stage). The detection of hilar and mediastinal lymph-node metastases by CT is covered initially, followed by an appraisal of MRI and radionuclide imaging techniques. Finally, the evaluation of mediastinal and chest-wall invasion by CT and MRI is described, and note is made of developing applications of ultrasound and endosonography. Computed tomography remains the standard technique, but its limitations are discussed, as is the value of other complementary imaging techniques.
Collapse
|
46
|
Magnetic resonance imaging of adrenocortical adenomas in childhood: correlation with computed tomography and ultrasound. Pediatr Radiol 1996; 26:794-9. [PMID: 8929380 DOI: 10.1007/bf01396204] [Citation(s) in RCA: 17] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Indexed: 02/03/2023]
Abstract
There are few descriptions of the magnetic resonance (MR) appearance of hyperfunctioning adrenocortical tumours, particularly those occurring in childhood. We studied five patients, two girls and three boys, aged 6-14.3 years, presenting with clinical syndromes of adrenocortical hyperfunction. The diagnoses were Cushing's syndrome (n = 2), virilisation (n = 2), and Conn's syndrome (n = 1). Biochemical features suggested an adrenal lesion in each case. MR and ultrasound were performed in all five cases, with CT in four. Each patient had a functional adrenal tumour secreting either cortisol, androgens or aldosterone alone, or a combination of cortisol, androgens and oestradiol. The histological diagnosis was adenoma in four cases and tumour of indeterminate nature in one case. MR clearly showed the tumours (diameter 1.0-7.5 cm), all the lesions being of high signal intensity relative to liver on T2-weighted sequences. CT revealed an adrenal mass in each of the four patients scanned, three of which enhanced after intravenous contrast medium injection. The multiplanar imaging of MR allowed better distinction from adjacent structures and also demonstrated an unenlarged contralateral adrenal gland. In the patient with a 1-cm Conn's adenoma the lesion was more easily seen on MR than CT. Ultrasound showed the four larger tumours but was unable to visualise the contralateral adrenal or the Conn's adenoma. In conclusion, the MR appearances of four adrenocortical adenomas and one indeterminate tumour in children are described. MR has been found to be at least equal to CT in the detection of these tumours, with some possible advantages. Both techniques are superior to ultrasound.
Collapse
|
47
|
The role of computed tomography in evaluation of subchondral osseous lesions in seven horses with chronic synovitis. Equine Vet J 1996; 28:480-8. [PMID: 9049498 DOI: 10.1111/j.2042-3306.1996.tb01621.x] [Citation(s) in RCA: 36] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 02/03/2023]
Abstract
Seven horses with severe, persistent lameness of sudden onset were evaluated with scintigraphy and/or computed tomography. The lameness was localised to the front fetlock joint in 2 horses and to the tibiotarsal joint in 5 horses. Five of the horses had a history of intra-articular injections of the involved joint prior to presentation. All horses had effusion of the affected joint and were positive to flexion tests. Intra-articular anaesthesia eliminated or improved the lameness in 4 cases, and a nerve conduction block proximal to the affected joint improved the lameness in another. Cytological examination of fluid from affected joints identified normal joint fluid (one horse) or elevated nucleated cell counts of 0.9 x 10^9/l to 36.8 x 10^9/l and total protein of 20-42 g/l (6 horses). The joint fluid of 2 of these horses cultured positive for bacteria. Initial radiographs were either normal (4 cases) or showed changes insufficient to explain the degree of lameness. In the 6 cases where scintigraphy was performed, intense focal isotope uptake was found in the suspected region, corresponding to the proximal portion of the first phalanx (2 cases), the distal tibia (2 cases), or the talus (3 cases). Computed tomography (CT) was performed because occult fracture or osteomyelitis was suspected, and knowledge of the precise anatomical location of the lesion was considered necessary to assess the need for surgery and to plan the surgical approach. Hypodense focal lesions with hyperdense haloes were found in the subchondral bone deep to the sagittal groove of the first phalanx (P1) (2 cases), in the cochlea of the distal tibia (2 cases), and in the intertrochlear portion of the talus (3 cases). Communication between the lesion and the joint space was demonstrated by CT in 5 cases. Post mortem examination of one case revealed synovitis and a chronic bone abscess (Brodie's abscess) communicating with the joint space.
Collapse
|
48
|
Abstract
The case is described of a 72 year old woman who presented with a two year history of exertional stridor in whom the diagnosis of myasthenia gravis was delayed. Although an uncommon cause, myasthenia gravis should be included in the differential diagnosis of stridor.
Collapse
|
49
|
Abstract
The inhibition of [3H]-thymidine incorporation into the DNA of mitogen-stimulated chronic lymphocytic leukaemia (CLL) lymphocytes by chlorambucil or gamma-irradiation in vitro was measured in a series of patients, some of whom were untreated, some treated, and some showing resistance to first-line or second-line treatment. There was evidence of resistance to irradiation developing in parallel with that to chlorambucil. The resistance to chlorambucil in CLL is therefore not necessarily due to altered drug transport or metabolism but to a more fundamental process affecting DNA damage.
Collapse
MESH Headings
- Antineoplastic Agents, Alkylating/pharmacology
- Chlorambucil/pharmacology
- DNA, Neoplasm/biosynthesis
- Drug Resistance, Neoplasm
- Female
- Gamma Rays
- Humans
- Leukemia, Lymphocytic, Chronic, B-Cell/metabolism
- Leukemia, Lymphocytic, Chronic, B-Cell/pathology
- Lymphocytes/drug effects
- Lymphocytes/metabolism
- Lymphocytes/radiation effects
- Male
- Radiation Tolerance
- Tumor Cells, Cultured/drug effects
- Tumor Cells, Cultured/metabolism
- Tumor Cells, Cultured/radiation effects
Collapse
|
50
|
Determination of radiation-induced damage in lymphocytes using the micronucleus and microgel electrophoresis 'Comet' assays. Eur J Cancer 1995; 31A:2320-3. [PMID: 8652263 DOI: 10.1016/0959-8049(95)00456-4] [Citation(s) in RCA: 11] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 02/01/2023]
Abstract
DNA damage assays may be useful as rapid predictors of normal tissue radiosensitivity in clinical samples. We measured in vitro radiation-induced (2 Gy) damage to lymphocytes from cancer patients and normal healthy donors using the micronucleus and microgel electrophoresis (Comet) assays simultaneously. For damage assessment, there was a good correlation (P < 0.001) between the mean comet lengths and the fraction of cells with comets. There was no correlation between initial damage, determined as the proportion of cells within a sample that formed comets, and the mean frequency of micronuclei per binucleate cell. However, there appeared to be an association between the determination of repair proficiency in the Comet assay and the mean frequency of micronuclei per binucleate cell in lymphocytes from cancer patients.
Collapse
|