1. Social network analysis reveals the failure of between-farm movement restrictions to reduce Salmonella transmission. J Dairy Sci 2024:S0022-0302(24)00816-6. PMID: 38788850. DOI: 10.3168/jds.2023-24554. Received: 12/16/2023; Accepted: 04/01/2024.
Abstract
An increasing number of countries are investigating options to stop the spread of the emerging zoonotic infection Salmonella (S.) Dublin, which spreads mainly among cattle and via cattle manure. Detailed surveillance and cattle movement data from an 11-year period in Denmark provided an opportunity to gain new knowledge on mitigation options through a combined social network and simulation modeling approach. The analysis revealed similar network trends for non-infected and infected cattle farms, despite the stringent cattle movement restrictions imposed on infected farms in the national control program. The strongest predictor of farms becoming infected was their cattle movement activity in the previous month, with twice the effect of local transmission. The simulation model indicated an endemic S. Dublin occurrence, with peaks in outbreak probabilities and sizes coinciding with observed peaks in cattle movement activity. Therefore, pre- and post-movement measures within a 1-mo time window may help reduce S. Dublin spread.

2. Comparison and interobserver reliability between a visual analog scale and the Wisconsin Calf Health Scoring Chart for detection of respiratory disease in dairy calves. J Dairy Sci 2024; 107:1102-1109. PMID: 37709013. DOI: 10.3168/jds.2023-23554. Received: 03/31/2023; Accepted: 08/15/2023.
Abstract
Respiratory disease is an ongoing challenge for calves in the dairy sector, with a relatively high prevalence and an impact on welfare and economics. Scoring protocols for detecting respiratory disease must be easy to implement, consistent between observers, and fast to use in daily management. This study was conducted on one Danish dairy farm from September 2020 through January 2021. The study included 126 heifer calves enrolled at 17 to 24 d of age. All calves were observed every second day for a period of 46 d. At each visit, all calves were scored with a new visual analog scale (VAS) and the Wisconsin Calf Health Scoring Chart (WCHSC). We calculated agreement between the 2 scoring systems based on the conditional probability of scoring higher or lower than a cutoff on the VAS, compared with a specified cutoff on the WCHSC used as the reference test. A generalized mixed effects regression model was developed to estimate the prevalence of respiratory disease and the overall agreement between the 2 scoring systems. The overall agreement between the VAS and WCHSC was 89.6%. The second part of the study assessed interobserver reliability between 2 experienced observers and between an experienced observer and veterinary students. Interobserver reliability was calculated by intraclass correlation coefficient and was 0.58 between experienced observers and 0.34 between an experienced observer and veterinary students, indicating moderate to poor reliability between the observers. It was possible to use the VAS as an alternative clinical scoring method that primarily focuses on the general condition of the individual calf rather than on specific categories of clinical signs. Our study setup lacked comparison with other diagnostic tools, e.g., thoracic ultrasound, to confirm the findings; this should be considered in future studies exploring the VAS as a screening tool for detection of respiratory disease in dairy calves.
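The cutoff-based agreement described in this abstract can be sketched in a few lines. The function and all scores and cutoffs below are invented for illustration; they are not the study's data or its exact statistical model.

```python
# Hypothetical sketch of cutoff-based agreement between two respiratory
# scoring systems (a continuous VAS and an ordinal reference chart).
# All values and cutoffs are invented for illustration.
def cutoff_agreement(vas, ref, vas_cutoff, ref_cutoff):
    """Fraction of animals classified the same way (at/above vs. below
    their respective cutoffs) by both scoring systems."""
    same = sum((v >= vas_cutoff) == (r >= ref_cutoff)
               for v, r in zip(vas, ref))
    return same / len(vas)

vas_scores = [12, 55, 78, 30, 90, 8]  # e.g., mm on a 0-100 VAS
ref_scores = [2, 5, 6, 5, 7, 1]       # e.g., reference chart totals
print(cutoff_agreement(vas_scores, ref_scores, vas_cutoff=50, ref_cutoff=5))
# prints 0.8333333333333334 (5 of 6 animals classified the same way)
```

The study's reported 89.6% agreement additionally accounted for repeated measurements per calf via a mixed effects model, which this raw-proportion sketch does not attempt.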

3. Assessment of Evaluation Tools for Integrated Surveillance of Antimicrobial Use and Resistance Based on Selected Case Studies. Front Vet Sci 2021; 8:620998. PMID: 34307513. PMCID: PMC8298032. DOI: 10.3389/fvets.2021.620998. Received: 10/24/2020; Accepted: 05/21/2021.
Abstract
Regular evaluation of integrated surveillance for antimicrobial use (AMU) and resistance (AMR) in animals, humans, and the environment is needed to ensure system effectiveness, but the question is how. In this study, six different evaluation tools were assessed after being applied to AMU and AMR surveillance in eight countries: (1) ATLASS: the Assessment Tool for Laboratories and AMR Surveillance Systems developed by the Food and Agriculture Organization (FAO) of the United Nations; (2) ECoSur: Evaluation of Collaboration for Surveillance tool; (3) ISSEP: Integrated Surveillance System Evaluation Project; (4) NEOH: developed by the EU COST Action "Network for Evaluation of One Health"; (5) PMP-AMR: the Progressive Management Pathway tool on AMR developed by the FAO; and (6) SURVTOOLS: developed in the FP7-EU project "RISKSUR." Each tool was scored using (i) 11 pre-defined functional aspects (e.g., workability concerning the need for data, time, and people); (ii) a strengths, weaknesses, opportunities, and threats (SWOT)-like approach to user experiences (e.g., things that I liked or that the tool covered well); and (iii) eight predefined content themes related to scope (e.g., development purpose and collaboration). PMP-AMR, ATLASS, ECoSur, and NEOH are evaluation tools that provide a scoring system to obtain semi-quantitative results, whereas ISSEP and SURVTOOLS result in a plan for how to conduct evaluation(s). ISSEP, ECoSur, NEOH, and SURVTOOLS allow for in-depth analyses and therefore require more complex data and information and specific training of the evaluator(s). PMP-AMR, ATLASS, and ISSEP were developed specifically for AMR-related activities; only ISSEP included production of a direct measure of "integration" and "impact on decision making." NEOH and ISSEP were perceived as the best tools for evaluation of One Health (OH) aspects, and ECoSur as best for evaluation of the quality of collaboration. PMP-AMR and ATLASS seemed to be the most user-friendly tools, particularly designed for risk managers. ATLASS was the only tool focusing specifically on laboratory activities. Our experience is that adequate resources are needed to perform evaluation(s). In most cases, evaluation would require the involvement of several assessors and/or stakeholders, taking from weeks to months to complete. This study can help direct future evaluators of integrated AMU and AMR surveillance toward the most adequate tool for their specific evaluation purpose.

4. Estimating Clinically Relevant Cut-Off Values for a High-Throughput Quantitative Real-Time PCR Detecting Bacterial Respiratory Pathogens in Cattle. Front Vet Sci 2021; 8:674771. PMID: 34113678. PMCID: PMC8185137. DOI: 10.3389/fvets.2021.674771. Received: 03/01/2021; Accepted: 04/28/2021.
Abstract
Bovine respiratory disease (BRD) results from interactions between pathogens, environmental stressors, and host factors. Obtaining a diagnosis of the causal pathogens is challenging, but the use of high-throughput real-time PCR (rtPCR) may help target preventive and therapeutic interventions. The aim of this study was to improve the interpretation of rtPCR results by analysing their associations with clinical observations. The objective was to develop and illustrate a field-data-driven statistical method to guide the selection of relevant quantification cycle cut-off values for pathogens associated with BRD for the high-throughput rtPCR system "Fluidigm BioMark HD," based on nasal swabs from calves. We used data from 36 herds enrolled in a Danish field study in which 340 calves within pre-determined age-groups were subject to clinical examination and nasal swabs up to four times. The samples were analysed with the rtPCR system. Each of the 1,025 observation units was classified as sick with BRD or healthy, based on clinical scores. The optimal rtPCR results to predict BRD were investigated for Pasteurella multocida, Mycoplasma bovis, Histophilus somni, Mannheimia haemolytica, and Trueperella pyogenes by interpreting scatterplots and results of mixed effects logistic regression models. The clinically relevant rtPCR cut-off suggested for P. multocida and M. bovis was ≤ 21.3; for H. somni it was ≤ 17.4, while no cut-off could be determined for M. haemolytica and T. pyogenes. The demonstrated approach can provide objective support in the choice of clinically relevant cut-offs. However, robust performance of the regression model requires a sufficient amount of suitable data.
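The study derived its cut-offs from scatterplots and mixed effects logistic regression; as a rough stand-in for that idea, the sketch below picks a quantification-cycle (Cq) cutoff by maximising Youden's J over a few candidate values. The function, candidates, and all Cq and disease data are invented for illustration, not taken from the paper.

```python
# Illustrative only: choose a Cq cutoff for calling a nasal-swab rtPCR
# result "clinically relevant" by maximising Youden's
# J = sensitivity + specificity - 1. Lower Cq means more target DNA,
# so a sample is called positive when cq <= cutoff.
def best_cq_cutoff(cq, sick, candidates):
    best = (None, -1.0)
    for c in candidates:
        tp = sum(q <= c and s for q, s in zip(cq, sick))
        fp = sum(q <= c and not s for q, s in zip(cq, sick))
        fn = sum(q > c and s for q, s in zip(cq, sick))
        tn = sum(q > c and not s for q, s in zip(cq, sick))
        sens = tp / (tp + fn) if tp + fn else 0.0
        spec = tn / (tn + fp) if tn + fp else 0.0
        j = sens + spec - 1.0
        if j > best[1]:
            best = (c, j)
    return best  # (cutoff, Youden's J)

cq = [15.2, 18.0, 20.9, 25.1, 28.4, 30.0]       # one pathogen, six calves
sick = [True, True, True, False, False, False]  # BRD status from clinical scores
print(best_cq_cutoff(cq, sick, candidates=[17.0, 21.0, 26.0]))
# prints (21.0, 1.0)
```

Unlike this sketch, the paper's regression approach also accounted for repeated sampling of the same calves and herds (random effects), which a simple Youden search cannot do.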

5. Veterinary Herd Health Consultancy and Antimicrobial Use in Dairy Herds. Front Vet Sci 2021; 7:547975. PMID: 33604361. PMCID: PMC7884328. DOI: 10.3389/fvets.2020.547975. Received: 04/01/2020; Accepted: 12/14/2020.
Abstract
The globally increasing level of antimicrobial resistance affects both human and animal health, making it necessary to identify ways to change our current use of antimicrobials. The veterinary herd health collaboration between veterinarians and dairy farmers provides a useful setting for changing antimicrobial use in livestock. However, farmers and veterinarians work in a complex agricultural setting influenced by socio-economic factors, which complicates their choices regarding antimicrobial usage. It is therefore necessary to be aware of the range of potential influencing factors and to integrate this knowledge in the relevant local settings. This manuscript presents a literature review of relevant factors relating to antimicrobial use within the veterinary herd health consultancy setting, including knowledge gaps relevant to changing the use of antimicrobials. An enriched version of the framework of the Theory of Planned Behaviour was used to organise the literature review. We identified diverging attitudes on correct treatment practices and perceptions of antimicrobial resistance among veterinarians and farmers, influenced by individual risk perception as well as social norms. Furthermore, disagreements about goal setting and the frequency of herd visits in relation to herd health consultancy can negatively influence the collaboration and the intention to change antimicrobial use. Farmers and veterinarians emphasise the importance of legislation and the role of the dairy industry in changing antimicrobial use, but the relevance of specific factors depends on the country-specific context. Overall, farmers and veterinarians must communicate better to understand each other's perspectives and establish common goals within the collaboration if they are to work efficiently to reduce antimicrobial use. Farmers and veterinarians both requested changes in individual behaviour; however, they also called for national and structural solutions in terms of balanced legislation and the availability of better diagnostics to facilitate a change in antimicrobial use practices. These various paths to achieving the desired changes in antimicrobial use illustrate the need to bridge the methodological research approaches of veterinary science and the social sciences for a better understanding of our potential to change antimicrobial use within the dairy farm animal sector.

6. A One Health Evaluation of the University of Copenhagen Research Centre for Control of Antibiotic Resistance. Front Vet Sci 2018; 5:194. PMID: 30186842. PMCID: PMC6110841. DOI: 10.3389/fvets.2018.00194. Received: 08/29/2017; Accepted: 07/27/2018.
Abstract
We applied the evaluation framework developed by the EU COST Action "Network for Evaluation of One Health" (NEOH) to assess the operations, supporting infrastructures, and outcomes of the research consortium "University of Copenhagen Research Centre for Control of Antibiotic Resistance" (UC-CARE). This 4-year research project was a One Health (OH) initiative with participants from 14 departments across four faculties, as well as stakeholders from industry and health authorities, aiming to produce new knowledge to reduce the development of antimicrobial resistance (AMR). This was a case study focusing on assessing beneficial and counter-productive characteristics that could affect the OH outcomes. The study was also used to provide feedback to NEOH about the evaluation framework. The framework and evaluation tools are described in the introduction paper of this special journal issue. Data for the evaluation were extracted from the research funding proposal and the mid-term UC-CARE project evaluation report, and supplemented with opinions elicited from project participants and stakeholders. Here, we describe the underlying system, the theory of change behind the initiative, and the questions adapted from the NEOH tools that we used for semi-open interviews with consortium members throughout the evaluation process. An online survey was used to obtain information from stakeholders. The NEOH evaluation tools were then used for the qualitative and quantitative evaluation of the OH characteristics of UC-CARE. Senior UC-CARE researchers were interested and willing to be interviewed. Young scientists were more difficult to engage in interviews, and only 25% of stakeholders answered the online survey. Interviewees mentioned that the main benefit of UC-CARE was an increased awareness and general understanding of AMR issues. All interviewees stated that the adopted OH approach was relevant given the complexity of AMR. However, some questioned its applicability and identified potentially counter-productive issues, mainly related to information sharing, collaboration, and working methods across the consortium. A more integrated project organization, more stakeholder involvement and time for the project, flexibility in planning, and a dedicated OH coordinator were suggested to allow for more knowledge exchange, potentially leading to a higher societal impact.

7. Application of the NEOH Framework for Self-Evaluation of One Health Elements of a Case-Study on Obesity in European Dogs and Dog-Owners. Front Vet Sci 2018; 5:163. PMID: 30083538. PMCID: PMC6064947. DOI: 10.3389/fvets.2018.00163. Received: 10/10/2017; Accepted: 06/28/2018.
Abstract
Obesity is a malnutrition disorder of global concern with increasing prevalence, driven by underlying societal, economic, and environmental mechanisms leading to changed physical activity patterns, eating behaviors, and diet compositions in both humans and their pet dogs. A questionnaire-based study was carried out as a joint effort across 11 European countries. It was conceived as a One Health (OH) initiative between scientists from the human and animal health sectors aiming to identify factors associated with obesity in dog owners and their dogs. Expected outcomes of this approach included new insights unachievable by single-sector research initiatives, potentially leading to new cross-sectorial solutions. We performed an internal evaluation among the actors of the obesity initiative using the evaluation framework developed by the "Network for Evaluation of One Health" (NEOH). It served as a case study for the NEOH consortium to illustrate the application and provide feedback on the utility of the framework. The evaluation was performed by a subgroup of scientists also involved in the obesity study group, and it consisted of: (1) the definition of the initiative and its context, (2) the description of the theory of change, and (3) the qualitative and quantitative process evaluation of operations and supporting infrastructures, scored on a scale from 0 to 1. In the One Health operations, the obesity study initiative scored medium-high on OH-thinking (0.5) and OH-planning (0.45), and relatively high on OH-working (0.7). The supporting infrastructure score was high for systemic organization (0.8), but lower for sharing (0.45) and learning (0.28). The calculated OH-index was 0.29 (on a scale from 0 to 1), indicating that the full potential of health integration and collaboration was not exploited in the initiative; the main issue identified was a lack of stakeholder engagement. The OH-ratio of 1.1 indicated an approximately equal focus on operations and supporting infrastructures. Hence, the evaluation identified potentially counterproductive as well as beneficial characteristics, which are further discussed in this paper in relation to the expected outcomes. The NEOH framework for evaluation requires that the evaluators have a good understanding of systems thinking and of the mechanisms of the health issue targeted by the initiative.

8. A longitudinal observational study of the dynamics of Mycoplasma bovis antibodies in naturally exposed and diseased dairy cows. J Dairy Sci 2018; 101:7383-7396. PMID: 29778474. DOI: 10.3168/jds.2017-14340. Received: 12/22/2017; Accepted: 04/06/2018.
Abstract
Mycoplasma bovis is an important pathogen causing disease and substantial economic losses in cattle. However, knowledge of the dynamics of antibody responses in individual cows in the face of an outbreak is currently extremely limited. The use of commercial antibody tests to support clinical decision-making and for surveillance purposes is therefore challenging. Our objective was to describe the dynamics of M. bovis antibody responses in 4 Danish dairy herds experiencing an acute outbreak of M. bovis-associated disease, and to compare the antibody dynamics between dairy cows with different disease manifestations. A total of 120 cows were examined using a standardized clinical protocol and categorized into 4 disease groups: "mastitis," "systemic," "nonspecific," and "none." Paired blood and milk samples were collected and tested using a commercial M. bovis antibody-detecting ELISA. Plots of raw data and generalized additive mixed models with cow and herd as random effects were used to describe serum and milk antibody dynamics relative to the estimated time of onset of clinical disease. Cows with mastitis had high optical density measurement (ODC%) of antibodies in both milk and serum at disease onset. The estimated mean ODC% in milk was below the manufacturer's cut-off for the other groups for the entire study period. The estimated mean serum ODC% in the "systemic" group was high at onset of disease and stayed above the cut-off until 65 d after disease onset. However, the lower 95% confidence interval (CI) for the mean ODC% was only above the manufacturer's cut-off between 7 and 17 d after onset of disease. The CI of the "systemic" and "none" groups did not overlap at any time between the day of disease onset and 65 d after disease onset, and the estimated mean ODC% for both the "nonspecific" and "none" groups were generally below the cut-off for the majority of the study period. 
In conclusion, the serum antibody responses were highly dynamic and showed a high level of variation between individual cows. This strongly suggests that serology is unlikely to be useful for individual diagnosis of M. bovis-associated disease in dairy cows. However, it might still be useful for herd- or group-level diagnosis. Antibodies in milk were only increased in cows with M. bovis mastitis, indicating that milk antibody measurements only have diagnostic utility for cows with mastitis.

9. Veterinary Expert Opinion on Potential Drivers and Opportunities for Changing Antimicrobial Usage Practices in Livestock in Denmark, Portugal, and Switzerland. Front Vet Sci 2018; 5:29. PMID: 29546044. PMCID: PMC5837977. DOI: 10.3389/fvets.2018.00029. Received: 10/19/2017; Accepted: 02/09/2018.
Abstract
Reducing antimicrobial use (AMU) in livestock is requested by public health authorities. Ideally, this should be achieved without jeopardizing production output or animal health and welfare. Thus, efficient measures must be identified and developed to target the drivers of AMU. Veterinarians play a central role in the identification and implementation of such interventions. Sixty-seven veterinarians with expertise in livestock production in Denmark, Portugal, and Switzerland participated in an expert opinion study investigating the experiences and opinions of veterinarians about the driving forces and practices related to AMU in the main livestock sectors (broiler, dairy cattle, fattening/veal calf, and pig industry) of the aforementioned countries. Opinions on potential factors influencing the choice of antimicrobials and on opportunities to reduce AMU were collected. Antibiograms are seldom used, mainly due to the time lag between testing and obtaining the results. The perceived percentage of treatment failures varied between countries and livestock sectors; however, little change was reported over time (2005-2015). The animal health problems most frequently leading to AMU in each livestock sector did not vary substantially between countries. Mandatory official interventions (i.e., binding measures applied by national or international authorities) were highlighted as having the biggest impact on AMU. There was variation in the experts' opinions regarding the feasibility and impact of interventions, both between countries and between livestock sectors. Nevertheless, improved biosecurity and education of veterinarians frequently received high scores. Most veterinarians believed that AMU can be reduced. The median potential reduction estimates varied from 1% in Swiss broilers to 50% in Portuguese broilers and in veal/fattening calves in all countries. We hypothesize that the differences in views could be related to disease epidemiology, animal husbandry, and socio-economic factors. A profound investigation of these disparities would provide the knowledge required for developing targeted strategies to tackle AMU and, consequently, resistance development. The experts also agreed that mandatory official interventions could have the greatest impact on antimicrobial consumption. Furthermore, improvement of biosecurity, education of veterinarians, the use of zinc oxide (in pigs), improved vaccination strategies, and the creation of treatment plans were the measures considered to have the largest potential to reduce AMU. This paper can inform policymakers in Europe and in countries with similar animal production regarding their AMU policy.

10. Review of transmission routes of 24 infectious diseases preventable by biosecurity measures and comparison of the implementation of these measures in pig herds in six European countries. Transbound Emerg Dis 2017; 65:381-398. PMID: 29124908. DOI: 10.1111/tbed.12758. Received: 05/16/2017.
Abstract
This study aimed to review the transmission routes of important infectious pig diseases and to translate these into biosecurity measures preventing or reducing the transmission between and within pig herds. Furthermore, it aimed to identify the level of implementation of these measures in different European countries and discuss the observed variations to identify potentials for improvement. First, a literature review was performed to show which direct and indirect transmission routes of 24 infectious pig diseases can be prevented through different biosecurity measures. Second, a quantitative analysis was performed using the Biocheck.UGent™, a risk-based scoring system to evaluate biosecurity in pig herds, to obtain an insight into the implementation of these biosecurity measures. The database contained farm-specific biosecurity data from 574 pig farms in Belgium, Denmark, France, Germany, the Netherlands and Sweden, entered between January 2014 and January 2016. Third, a qualitative analysis based on a review of literature and other relevant information resources was performed for every subcategory of internal and external biosecurity in the Biocheck.UGent™ questionnaire. The quantitative analysis indicated that at the level of internal, external and overall biosecurity, Denmark had a significantly distinct profile with higher external biosecurity scores and less variation than the rest of the countries. This is likely due to a widely used specific pathogen-free (SPF) system with extensive focus on biosecurity since 1971 in Denmark. However, the observed pattern may also be attributed to differences in data collection methods. The qualitative analysis identified differences in applied policies, legislation, disease status, pig farm density, farming culture and habits between countries that can be used for shaping country-specific biosecurity advice to attain improved prevention and control of important pig diseases in European pig farms.

11. Methods and processes of developing the Strengthening the Reporting of Observational Studies in Epidemiology - Veterinary (STROBE-Vet) statement. Prev Vet Med 2017; 134:188-196. PMID: 27836042. DOI: 10.1016/j.prevetmed.2016.09.005. Received: 01/11/2016; Revised: 09/06/2016; Accepted: 09/07/2016.
Abstract
BACKGROUND: The reporting of observational studies in veterinary research presents many challenges that often are not adequately addressed in published reporting guidelines.
OBJECTIVE: To develop an extension of the STROBE (Strengthening the Reporting of Observational Studies in Epidemiology) statement that addresses unique reporting requirements for observational studies in veterinary medicine related to health, production, welfare, and food safety.
DESIGN: A consensus meeting of experts was organized to develop an extension of the STROBE statement to address observational studies in veterinary medicine with respect to animal health, animal production, animal welfare, and food safety outcomes.
SETTING: Consensus meeting, May 11-13, 2014, in Mississauga, Ontario, Canada.
PARTICIPANTS: Seventeen experts from North America, Europe, and Australia attended the meeting. The experts were epidemiologists and biostatisticians, many of whom hold or have held editorial positions with relevant journals.
METHODS: Prior to the meeting, 19 experts completed a survey about whether they felt any of the 22 items of the STROBE statement should be modified and whether items should be added to address unique issues related to observational studies in animal species with health, production, welfare, or food safety outcomes. At the meeting, the participants were provided with the survey responses and relevant literature concerning the reporting of veterinary observational studies. During the meeting, each STROBE item was discussed to determine whether or not re-wording was recommended, and whether additions were warranted. Anonymous voting was used to determine whether there was consensus for each item change or addition.
RESULTS: The consensus was that six items needed no modifications or additions. Modifications or additions were made to the STROBE items numbered 1 (title and abstract), 3 (objectives), 5 (setting), 6 (participants), 7 (variables), 8 (data sources/measurement), 9 (bias), 10 (study size), 12 (statistical methods), 13 (participants), 14 (descriptive data), 15 (outcome data), 16 (main results), 17 (other analyses), 19 (limitations), and 22 (funding).
LIMITATION: Published literature was not always available to support modification to, or inclusion of, an item.
CONCLUSION: The methods and processes used in the development of this statement were similar to those used for other extensions of the STROBE statement. The use of this extension to the STROBE statement should improve the reporting of observational studies in veterinary research related to animal health, production, welfare, or food safety outcomes by recognizing the unique features of observational studies involving food-producing and companion animals, products of animal origin, aquaculture, and wildlife.

12.

Abstract
1. The performance of the scoring in the Danish footpad dermatitis (FPD) surveillance system was evaluated by determining inter-rater agreement in visual inspection of FPD in broilers between two independent raters (R1 and R2) and the official scoring at a Danish slaughterhouse. 2. FPD scores were evaluated in 1599 chicken feet. The two raters and the slaughterhouse scored equal proportions of score 0. So did R1 and R2 when assessing score 1 and the more severe lesion score 2, whereas the slaughterhouse scored a markedly higher proportion of score 1 and a lower proportion of score 2. Aggregated FPD flock scores ranged from 5 to 163 (R1 and R2) and from 8 to 107 (slaughterhouse). 3. The level of agreement between the two raters was high for scores 0, 1 and 2 and for flock scores. Agreement between raters and the slaughterhouse was lower when R1 and R2 recorded score 2 than when they recorded scores 0 and 1. 4. This study indicates that the occurrence and severity of lesions are underestimated in the official Danish FPD scoring system.

13. Comparison of Antimicrobial Consumption Patterns in the Swiss and Danish Cattle and Swine Production (2007-2013). Front Vet Sci 2017; 4:26. PMID: 28303244. PMCID: PMC5332391. DOI: 10.3389/fvets.2017.00026. Received: 11/23/2016; Accepted: 02/14/2017.
Abstract
Veterinary antimicrobial consumption patterns vary considerably across Europe. These differences are not only limited to the total amount consumed but are also observed with regards to the relative proportion of the various antimicrobial classes used. Currently, most of the data on veterinary antimicrobials are reported at sales level without any information on the consumption by different animal species. This hinders a proper comparison of antimicrobial consumption at the species level between countries. However, it is imperative to improve our understanding on antimicrobial usage patterns at the species level, as well as on the drivers contributing to those differences. This will allow for development of tailored interventions with the lowest possible risk for human health, while ensuring effective treatment of diseased livestock. An important step to attain such an objective is to perform detailed comparisons of the antimicrobial consumption in each species between countries. We compared antimicrobial consumption estimates for cattle and pigs in Switzerland and Denmark, in order to distinguish species-specific patterns and trends in consumption from 2007 to 2013. Swiss data were obtained from a previous study that assessed methodologies to stratify antimicrobial sales per species; Danish antimicrobial consumption estimates were assembled from Danish Integrated Antimicrobial Resistance Monitoring and Research Programme reports. A decrease in antimicrobial consumption in milligrams per kilogram of biomass was observed for both countries (4.5% in Denmark and 34.7% in Switzerland) when comparing 2013 to 2007. For pigs and cattle, the overall consumption per kilogram of biomass of most antimicrobial classes was higher in Switzerland than in Denmark. Large variations in the relative consumption of different antimicrobial classes were also evident. 
Sulfonamides/trimethoprim and tetracyclines were consumed in a higher proportion in Switzerland than in Denmark, whereas the relative consumption of penicillins was higher in Denmark. The differences observed in veterinary antimicrobial consumption are not solely related to animal demographic characteristics in these two countries. Other factors, such as the level of biosecurity and farming practices, veterinarians’ and farmers’ education, or governmental/industry programs put in place, might also partly explain these variations. These differences should be taken into account when aiming to implement targeted interventions to reduce antimicrobial consumption.
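The milligram-per-kilogram-of-biomass metric behind the trends above is simple to sketch. The sales and biomass figures below are invented for illustration and are not the study's data; only the direction of the Danish trend is mirrored.

```python
# Hypothetical sketch of the mg-per-kg-biomass consumption metric and the
# year-over-year change computed from it. All numbers are illustrative.

def mg_per_kg_biomass(total_mg_sold: float, total_kg_biomass: float) -> float:
    """Normalise antimicrobial sales by the live-animal biomass at risk."""
    return total_mg_sold / total_kg_biomass

def percent_change(old: float, new: float) -> float:
    """Relative change between two years, in percent."""
    return (new - old) / old * 100.0

# Illustrative values only: consumption falls from 50 to 47.75 mg/kg,
# i.e. a 4.5% decrease, mirroring the direction of the Danish trend.
old = mg_per_kg_biomass(5.0e12, 1.0e11)    # 50.0 mg/kg
new = mg_per_kg_biomass(4.775e12, 1.0e11)  # 47.75 mg/kg
print(round(percent_change(old, new), 1))  # -4.5
```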
|
14
|
Methods and Processes of Developing the Strengthening the Reporting of Observational Studies in Epidemiology-Veterinary (STROBE-Vet) Statement. J Food Prot 2016; 79:2211-2219. [PMID: 28221964 DOI: 10.4315/0362-028x.jfp-16-016] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/11/2022]
Abstract
Reporting of observational studies in veterinary research presents challenges that often are not addressed in published reporting guidelines. Our objective was to develop an extension of the STROBE (Strengthening the Reporting of Observational Studies in Epidemiology) statement that addresses unique reporting requirements for observational studies in veterinary medicine related to health, production, welfare, and food safety. We conducted a consensus meeting with 17 experts in Mississauga, Canada. Experts completed a premeeting survey about whether items in the STROBE statement should be modified or added to address unique issues related to observational studies in animal species with health, production, welfare, or food safety outcomes. During the meeting, each STROBE item was discussed to determine whether or not rewording was recommended, and whether additions were warranted. Anonymous voting was used to determine consensus. Six items required no modifications or additions. Modifications or additions were made to the STROBE items 1 (title and abstract), 3 (objectives), 5 (setting), 6 (participants), 7 (variables), 8 (data sources and measurement), 9 (bias), 10 (study size), 12 (statistical methods), 13 (participants), 14 (descriptive data), 15 (outcome data), 16 (main results), 17 (other analyses), 19 (limitations), and 22 (funding). The methods and processes used were similar to those used for other extensions of the STROBE statement. The use of this STROBE statement extension should improve reporting of observational studies in veterinary research by recognizing unique features of observational studies involving food-producing and companion animals, products of animal origin, aquaculture, and wildlife.
|
15
|
Explanation and Elaboration Document for the STROBE-Vet Statement: Strengthening the Reporting of Observational Studies in Epidemiology-Veterinary Extension. J Vet Intern Med 2016; 30:1896-1928. [PMID: 27859752 PMCID: PMC5115190 DOI: 10.1111/jvim.14592] [Citation(s) in RCA: 40] [Impact Index Per Article: 5.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/10/2016] [Revised: 06/24/2016] [Accepted: 08/29/2016] [Indexed: 01/15/2023] Open
Abstract
The STROBE (Strengthening the Reporting of Observational Studies in Epidemiology) statement was first published in 2007 and again in 2014. The purpose of the original STROBE was to provide guidance for authors, reviewers, and editors to improve the comprehensiveness of reporting; however, STROBE has a unique focus on observational studies. Although much of the guidance provided by the original STROBE document is directly applicable, it was deemed useful to map those statements to veterinary concepts, provide veterinary examples, and highlight unique aspects of reporting in veterinary observational studies. Here, we present the examples and explanations for the checklist items included in the STROBE-Vet statement. Thus, this is a companion document to the STROBE-Vet statement methods and process document, which describes the checklist and how it was developed.
|
16
|
Methods and Processes of Developing the Strengthening the Reporting of Observational Studies in Epidemiology - Veterinary (STROBE-Vet) Statement. J Vet Intern Med 2016; 30:1887-1895. [PMID: 27859753 PMCID: PMC5115188 DOI: 10.1111/jvim.14574] [Citation(s) in RCA: 35] [Impact Index Per Article: 4.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/10/2016] [Revised: 06/24/2016] [Accepted: 08/10/2016] [Indexed: 12/29/2022] Open
Abstract
Background Reporting of observational studies in veterinary research presents challenges that often are not addressed in published reporting guidelines. Objective To develop an extension of the STROBE (Strengthening the Reporting of Observational Studies in Epidemiology) statement that addresses unique reporting requirements for observational studies in veterinary medicine related to health, production, welfare, and food safety. Design Consensus meeting of experts. Setting Mississauga, Canada. Participants Seventeen experts from North America, Europe, and Australia. Methods Experts completed a pre‐meeting survey about whether items in the STROBE statement should be modified or added to address unique issues related to observational studies in animal species with health, production, welfare, or food safety outcomes. During the meeting, each STROBE item was discussed to determine whether or not rewording was recommended and whether additions were warranted. Anonymous voting was used to determine consensus. Results Six items required no modifications or additions. Modifications or additions were made to the STROBE items 1 (title and abstract), 3 (objectives), 5 (setting), 6 (participants), 7 (variables), 8 (data sources/measurement), 9 (bias), 10 (study size), 12 (statistical methods), 13 (participants), 14 (descriptive data), 15 (outcome data), 16 (main results), 17 (other analyses), 19 (limitations), and 22 (funding). Conclusion The methods and processes used were similar to those used for other extensions of the STROBE statement. The use of this STROBE statement extension should improve reporting of observational studies in veterinary research by recognizing unique features of observational studies involving food‐producing and companion animals, products of animal origin, aquaculture, and wildlife.
|
17
|
Factors associated with variation in bulk tank milk Mycoplasma bovis antibody-ELISA results in dairy herds. J Dairy Sci 2016; 99:3815-3823. [PMID: 26971142 DOI: 10.3168/jds.2015-10056] [Citation(s) in RCA: 16] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/03/2015] [Accepted: 01/24/2016] [Indexed: 11/19/2022]
Abstract
The relevance and limitations of using measurements of antibodies against Mycoplasma bovis in bulk tank milk (BTM) as a potentially cost-effective diagnostic tool for herd classification have not been evaluated before. Assuming that an increasing or high seroprevalence is a result of on-going or recent spread of M. bovis in a dairy herd, we tested the hypothesis that increasing prevalence of antibody-positive cows and young stock is associated with increasing BTM antibody ELISA values against M. bovis in Danish dairy herds with different courses of M. bovis infection. Furthermore, we tested whether herd size was associated with variations in the BTM responses. Thirty-nine Danish dairy herds, selected to represent 4 different herd-level infection groups [8 control herds, 14 acute outbreak herds, 7 herds with previous outbreaks, and 10 herds with elevated BTM ELISA values directed against M. bovis (>64% optical density)], were visited 4 to 5 times, approximately 3 mo apart. At each visit, 65 young stock were blood sampled. At the milk recording date closest to the herd visit date, 50 milk recording samples from individual lactating cows were randomly selected. In addition, a BTM sample was collected directly from the bulk tank by the dairies' milk truck drivers as part of the mandatory milk quality-control scheme. Blood and milk samples were tested for antibodies against M. bovis with a commercially available ELISA (Bio-X BIO K 302, Bio-X Diagnostics, Rochefort, Belgium). A linear mixed-effects model was used to analyze the effects of the prevalence of antibody-positive lactating cows and young stock and of herd size on the BTM M. bovis ELISA results. Herd was included as a random effect to account for clustering of BTM samples originating from the same herd. Increasing prevalence of antibody-positive lactating cows was the only variable associated with increasing M. bovis BTM ELISA optical density.
In contrast, the prevalence of antibody-positive young stock was not correlated with the BTM optical density measurement. In conclusion, some M. bovis-associated herd infections are detectable by BTM ELISA testing, but limitations exist, and further investigation of the effect of different clinical disease expressions in the herds is warranted.
|
18
|
PS-021 Use of an E-learning program to improve paediatric nurses’ dose calculation skills. Eur J Hosp Pharm 2014. [DOI: 10.1136/ejhpharm-2013-000436.372] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/03/2022] Open
|
19
|
Exposure assessment of extended-spectrum beta-lactamases/AmpC beta-lactamases-producing Escherichia coli in meat in Denmark. Infect Ecol Epidemiol 2014; 4:22924. [PMID: 24511370 PMCID: PMC3916710 DOI: 10.3402/iee.v4.22924] [Citation(s) in RCA: 22] [Impact Index Per Article: 2.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/27/2013] [Revised: 01/13/2014] [Accepted: 01/13/2014] [Indexed: 11/29/2022] Open
Abstract
Introduction Extended-spectrum beta-lactamases (ESBL) and AmpC beta-lactamases (AmpC) are of concern for veterinary and public health because of their ability to cause treatment failure due to antimicrobial resistance in Enterobacteriaceae. The main objective was to assess the relative contribution (RC) of different types of meat to the exposure of consumers to ESBL/AmpC and their potential importance for human infections in Denmark. Material and methods The prevalence of each genotype of ESBL/AmpC-producing E. coli in imported and nationally produced broiler meat, pork and beef was weighted by the meat consumption patterns. Data originated from the Danish surveillance program for antibiotic use and antibiotic resistance (DANMAP) from 2009 to 2011. DANMAP also provided data about human ESBL/AmpC cases in 2011, which were used to assess a possible genotype overlap. Uncertainty about the occurrence of ESBL/AmpC-producing E. coli in meat was assessed by inspecting beta distributions given the available genotype data for each type of meat. Results and discussion Broiler meat represented the largest part (83.8%) of the estimated ESBL/AmpC-contaminated pool of meat, compared with pork (12.5%) and beef (3.7%). CMY-2 was the genotype with the highest RC to human exposure (58.3%). However, this genotype is rarely found in human infections in Denmark. Conclusion The overlap between ESBL/AmpC genotypes in meat and human E. coli infections was limited. This suggests that meat might constitute a less important source of ESBL/AmpC exposure for humans in Denmark than previously thought – perhaps because the use of cephalosporins is restricted in cattle and banned in poultry and pigs. Nonetheless, more detailed surveillance data are required to determine the contribution of meat compared with other sources, such as travelling, pets, water resources, the community and hospitals, in the pursuit of a full source attribution model.
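The exposure-assessment logic described above can be sketched briefly: each meat type's genotype prevalence is weighted by its share of consumption, and uncertainty about prevalence is represented by a beta distribution. The counts and consumption shares below are invented placeholders, not DANMAP figures.

```python
import random

# Hypothetical sketch of the relative-contribution (RC) calculation:
# prevalence of positive samples per meat type, weighted by how much of
# that meat is consumed, renormalised to 100%. Numbers are illustrative.

meat = {
    # meat type: (positive samples, samples tested, share of meat consumed)
    "broiler": (30, 100, 0.30),
    "pork":    (5, 200, 0.50),
    "beef":    (2, 150, 0.20),
}

def relative_contributions(data):
    """Point estimate of each meat type's RC to the contaminated pool."""
    weighted = {m: (pos / n) * share for m, (pos, n, share) in data.items()}
    total = sum(weighted.values())
    return {m: 100 * w / total for m, w in weighted.items()}

def prevalence_draw(pos, n):
    """One draw from a Beta(pos + 1, n - pos + 1) distribution (uniform
    prior), in the spirit of the paper's beta-distribution inspection."""
    return random.betavariate(pos + 1, n - pos + 1)

rc = relative_contributions(meat)
print({m: round(v, 1) for m, v in rc.items()})
```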
|
20
|
Dynamic changes in antibody levels as an early warning of Salmonella Dublin in bovine dairy herds. J Dairy Sci 2013; 96:7558-64. [PMID: 24140322 DOI: 10.3168/jds.2012-6478] [Citation(s) in RCA: 7] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/13/2012] [Accepted: 09/09/2013] [Indexed: 11/19/2022]
Abstract
Salmonella Dublin is a bacterium that causes disease and production losses in cattle herds. In Denmark, a surveillance and control program was initiated in 2002 to monitor and reduce the prevalence of Salmonella Dublin. In dairy herds, the surveillance includes herd classification based on bulk tank milk measurements of antibodies directed against Salmonella Dublin at 3-mo intervals. In this study, an "alarm herd" concept, based on the dynamic progression of these repeated measurements, was formulated so that it has predictive power for a change in Salmonella Dublin herd classification from "likely free of infection" to "likely infected" in the following quarter of the year, thereby warning the farmer 3 mo earlier than the present system. The alarm herd concept was defined through aberrations from a stable development of antibody levels over time. For suitable parameter choices, alarm herd status was a positive predictor of a change in Salmonella Dublin status in dairy herds, in that alarm herds had a higher risk of changing status in the following quarter than nonalarm herds. This was despite the fact that both alarm and nonalarm herds had antibody levels that did not indicate the herds were "likely infected" according to the existing classification system in the current quarter. The alarm herd concept can be used as a new early-warning element in the existing surveillance program. Additionally, to improve the accuracy of herd classification, the alarm herd concept could be incorporated into a model including other known risk factors for change in herd classification. Furthermore, the model could be extended to other diseases monitored in similar ways.
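An early-warning rule in the spirit described above can be sketched as follows: flag a herd when its newest antibody measurement breaks upward from an otherwise stable series. The window size and threshold are hypothetical parameters, not the paper's calibrated choices.

```python
# Illustrative "alarm herd" rule: flag when the latest bulk tank milk
# antibody value (ODC%) jumps above the baseline formed by the previous
# measurements. Parameters below are invented for illustration.

def alarm(odc_history, window=3, threshold=15.0):
    """Flag if the latest ODC% exceeds the mean of the previous
    `window` measurements by more than `threshold` ODC% units."""
    if len(odc_history) < window + 1:
        return False  # not enough history for a stable baseline
    *earlier, latest = odc_history[-(window + 1):]
    baseline = sum(earlier) / len(earlier)
    return latest - baseline > threshold

print(alarm([8.0, 10.0, 9.0, 31.0]))  # True: jump of 22 ODC% over baseline 9
print(alarm([8.0, 10.0, 9.0, 12.0]))  # False: stable development
```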
|
21
|
Evaluation of milk yield losses associated with Salmonella antibodies in bulk tank milk in bovine dairy herds. J Dairy Sci 2013; 95:4873-4885. [PMID: 22916892 DOI: 10.3168/jds.2011-4332] [Citation(s) in RCA: 12] [Impact Index Per Article: 1.1] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/03/2011] [Accepted: 04/16/2012] [Indexed: 11/19/2022]
Abstract
The effect of Salmonella on milk production is not well established in cattle. The objective of this study was to investigate whether introduction of Salmonella into dairy cattle herds was associated with reduced milk yield and determine the duration of any such effect. Longitudinal data from 2005 through 2009 were used, with data from 12 mo before until 18 mo after the estimated date of infection. Twenty-eight case herds were selected based on an increase in the level of Salmonella-specific antibodies in bulk-tank milk from <10 corrected optical density percentage (ODC%) to ≥70 ODC% between 2 consecutive three-monthly measurements in the Danish Salmonella surveillance program. All selected case herds were conventional Danish Holstein herds. Control herds (n=40) were selected randomly from Danish Holstein herds with Salmonella antibody levels consistently <10 ODC%. A date of herd infection was randomly allocated to the control herds. Hierarchical mixed effect models with the outcome test-day yield of energy-corrected milk (ECM)/cow were used to investigate daily milk yield before and after the estimated herd infection date for cows in parities 1, 2, and 3+. Control herds were used to evaluate whether the effects in the case herds could be reproduced in herds without Salmonella infection. Herd size, days in milk, somatic cell count, season, and year were included in the models. Yield in first-parity cows was reduced by a mean of 1.4 kg (95% confidence interval: 0.5 to 2.3) of ECM/cow per day from 7 to 15 mo after the estimated herd infection date, compared with that of first-parity cows in the same herds in the 12 mo before the estimated herd infection date. Yield for parity 3+ cows was reduced by a mean of 3.0 kg (95% confidence interval: 1.3 to 4.8) of ECM/cow per day from 7 to 15 mo after herd infection compared with that of parity 3+ cows in the 12 mo before the estimated herd infection. 
We observed minor differences in yield in second-parity cows before and after herd infection and observed no difference between cows in control herds before and after the simulated infection date. Milk yield decreased significantly in affected herds and the reduction was detectable several months after the increase in bulk tank milk Salmonella antibodies. It took more than 1 yr for milk yield to return to preinfection levels.
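The core contrast behind the estimates above is a before/after comparison of daily ECM yield per cow. The actual analysis used hierarchical mixed models adjusting for herd size, days in milk, somatic cell count, season and year; the pure-Python sketch below only illustrates the crude difference in means on invented test-day yields.

```python
# Crude sketch of the before/after yield contrast: mean daily ECM per cow
# in the 12 mo before versus 7 to 15 mo after the estimated herd infection
# date. Yields below are illustrative, not study data.

def mean(xs):
    return sum(xs) / len(xs)

# Illustrative test-day ECM yields (kg/cow per day) for first-parity cows
ecm_pre = [24.1, 23.8, 24.5, 24.0, 23.9]   # 12 mo before infection
ecm_post = [22.7, 22.5, 22.8, 22.6, 22.7]  # 7 to 15 mo after infection

loss = mean(ecm_pre) - mean(ecm_post)
print(round(loss, 1))  # 1.4, mirroring the scale of the first-parity estimate
```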
|
22
|
Abstract
Bovine cysticercosis (BC) is a zoonotic, parasitic infection in cattle. Under the current EU meat inspection regulation, every carcass from bovines above 6 weeks of age is examined for BC. This method is costly and makes more sense in countries with a higher number of BC-infected animals than in countries with only a few lightly infected cases per year. The aim of the present case-control study was to quantify associations between potential herd-level risk factors and BC in Danish cattle herds. Risk factors can be used in the design of a risk-based meat inspection system targeted towards the animals with the highest risk of BC. Cases (n = 77) included herds that hosted at least one animal diagnosed with BC at meat inspection from 2006 to 2010. Control herds (n = 231) consisted of randomly selected herds that had not hosted any animals diagnosed with BC between 2004 and 2010. The answers from a questionnaire and register data from the Danish Cattle Database were grouped into meaningful variables and used to investigate risk factors for BC using a multivariable logistic regression model. Case herds were almost three times more likely than control herds to let all or most animals out grazing. Case herds were more than five times more likely than control herds to allow their animals access to risky water sources with sewage treatment plant effluent in proximity. Case herds were also more likely than control herds to share machinery or hire contractors. The risk decreased with increasing herd size, probably because larger herds in Denmark generally tend to keep cattle indoors. The results are useful for guiding future data recording that can be supplied by the farmer as food chain information and then used for differentiated meat inspection in low- and high-risk groups, enabling development of risk-based meat inspection systems.
|
23
|
Presence of natural genetic resistance in Fraxinus excelsior (Oleraceae) to Chalara fraxinea (Ascomycota): an emerging infectious disease. Heredity (Edinb) 2011; 106:788-97. [PMID: 20823903 PMCID: PMC3186218 DOI: 10.1038/hdy.2010.119] [Citation(s) in RCA: 96] [Impact Index Per Article: 7.4] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/22/2010] [Revised: 06/29/2010] [Accepted: 07/15/2010] [Indexed: 11/09/2022] Open
Abstract
Fraxinus excelsior, common ash native to Europe, is threatened by a recently identified pathogenic fungus, Chalara fraxinea, which causes extensive damage to ash trees across Europe. In Denmark, most stands are severely affected, leaving many trees with dead crowns. However, single trees show notably fewer symptoms. In this study, the impact of the emerging infectious disease on native Danish ash trees is assessed by estimating the presence of inherent resistance in natural populations. Disease symptoms were assessed from 2007 to 2009 at two different sites with grafted ramets of 39 selected clones representing native F. excelsior trees. Strong genetic variation in susceptibility to C. fraxinea infection was observed. No genetic or geographic structure could explain the differences, but strong genetic correlations with leaf senescence were observed. The results suggest that a small fraction of trees in the Danish ash population possess substantial resistance to the damage. Though this fraction is probably too low to avoid population collapse in most natural or managed ash forests, the observed presence of putative resistance against the emerging infectious disease in natural stands is likely to be of evolutionary importance. This provides prospects for future maintenance of the species through natural or artificial selection in favour of the remaining healthy individuals.
|
24
|
Culling decisions of dairy farmers during a 3-year Salmonella control study. Prev Vet Med 2011; 100:29-37. [PMID: 21481960 DOI: 10.1016/j.prevetmed.2011.03.001] [Citation(s) in RCA: 10] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/19/2010] [Revised: 03/10/2011] [Accepted: 03/10/2011] [Indexed: 11/28/2022]
Abstract
Salmonella enterica subsp. enterica serotypes lead to periodically increased morbidity and mortality in cattle herds. The bacteria can also cause serious infections in humans. Consequently, Denmark started a surveillance and control programme in 2002. The programme focuses on Salmonella Dublin, which is the most prevalent and most persistent serotype in the Danish cattle population. A field study in 10 dairy herds with persistent Salmonella infections was carried out over three years to gain experience with control procedures including risk assessment, targeted control actions and test-and-cull procedures. From autumn 2003 until the end of 2006, quarterly milk quality-control samples from all lactating cows and biannual blood samples from all young stock above the age of three months were tested using an indirect antibody ELISA. The most recent and previous test results were used to categorise all animals into risk groups. These risk groups and all individual ELISA results were communicated to the farmers as colour-coded lists four to six times per year. Farmers were advised to manage the risk of Salmonella transmission from cattle with repeatedly high ELISA results (flagged as "red") or cows with at least one recent moderately high ELISA result (flagged as "yellow") on the lists. Risk management included, e.g., culling or separation of the cows at calving. We analysed culling decisions using two models. For heifers, a hierarchical multivariable logistic model with herd as a random effect evaluated whether animals with red and yellow flags had a higher probability of being slaughtered or sold before first calving than animals without any risk flags.
For adult cows, a semi-parametric proportional hazards survival model was used to test the effect of the number of red and yellow flags on the hazard of culling at different time points, and its interaction with within-herd prevalence, while accounting for parity, stage of lactation, milk yield, somatic cell count and the clustering of animals within herds. This study illustrates how investigating the culling decisions herd managers make when they have access to the test status of individual animals and the overall apparent prevalence during control of an infection can yield useful new knowledge. Overall, herd managers were more likely to cull cattle with an increasing number of yellow and red flags than animals with no flags. However, cattle with yellow and red flags were more likely to be culled during periods of low or moderately high within-herd seroprevalence than during periods of high seroprevalence. These results provide valuable knowledge for modelling and planning of control strategies and for making recommendations to farmers about control options.
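The direction of the culling finding can be illustrated with crude risks by flag count. The study itself used a semi-parametric proportional hazards model with herd-level clustering; the counts below are invented for illustration only.

```python
# Hypothetical sketch: group animals by number of red/yellow risk flags
# and compare the crude culling proportion across groups. Counts invented.

# flag count: (animals culled, animals at risk)
by_flags = {0: (40, 400), 1: (18, 120), 2: (15, 60)}

for flags, (culled, at_risk) in sorted(by_flags.items()):
    risk = culled / at_risk
    print(f"{flags} flag(s): crude culling risk {risk:.2f}")
```

With these invented counts the crude risk rises with the number of flags, matching the overall direction reported above.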
|
25
|
The range of influence between cattle herds is of importance for the local spread of Salmonella Dublin in Denmark. Prev Vet Med 2008; 84:277-90. [PMID: 18242741 DOI: 10.1016/j.prevetmed.2007.12.005] [Citation(s) in RCA: 14] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/15/2022]
Abstract
The objective of the study was to estimate the range of influence between cattle herds with positive Salmonella Dublin herd status. Herd status was a binary outcome of high/low antibody levels to Salmonella Dublin in bulk-tank milk and blood samples collected from all cattle herds in Denmark for surveillance purposes. Two methods were used. Initially, a spatial generalised linear mixed model was developed with an exponential correlation function to estimate the range of influence simultaneously with the effect of potential risk factors. An iteratively reweighted generalised least squares procedure was used as a second method for verifying the range of influence estimates. With this iterative procedure, deviance residuals were calculated based on a generalised linear model and the range of influence was estimated based on the residuals using an exponential semivariogram. The range of influence was estimated for six different regions in Denmark using both methods. The analyses were performed on data collected during 1 year after initiation of the Salmonella Dublin surveillance program providing herd classifications for the 4th year-quarter of 2003 and 2 years later for the 4th year-quarter of 2005. The prevalence of dairy herds with a positive Salmonella Dublin herd classification status in this period had decreased from 22.1 to 17.0%. In non-dairy herds, the prevalence was nearly unchanged during the same period (3.4 and 3.7% in 4th quarter of 2003 and 2005, respectively). For all cattle herds, the range of influence was 2.3-6.4 km in 2003 and 1.5-8.3 km in 2005. There seemed to be no association between the range of influence and the density of herds in the different regions.
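The exponential correlation structure used to estimate the range of influence can be written down directly. With nugget c0, partial sill c and range parameter a, the semivariance at separation distance h is gamma(h) = c0 + c(1 - exp(-h/a)), and the "practical range" where gamma reaches about 95% of the sill is 3a. The parameter values below are illustrative, not the fitted Danish estimates.

```python
import math

# Sketch of an exponential semivariogram and its practical range.
# Parameter values are hypothetical placeholders.

def exponential_semivariogram(h, c0, c, a):
    """Semivariance at separation distance h (same units as a)."""
    return c0 + c * (1.0 - math.exp(-h / a))

c0, c, a = 0.1, 0.9, 1.0   # nugget, partial sill, range parameter (km)
practical_range = 3.0 * a  # distance where ~95% of the sill is reached
gamma = exponential_semivariogram(practical_range, c0, c, a)
print(round(practical_range, 1), round(gamma, 3))  # 3.0 0.955
```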
|
26
|
Growth inhibitory factors in bovine faeces impairs detection of Salmonella Dublin by conventional culture procedure. J Appl Microbiol 2007; 103:650-6. [PMID: 17714398 DOI: 10.1111/j.1365-2672.2007.03292.x] [Citation(s) in RCA: 19] [Impact Index Per Article: 1.1] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/28/2022]
Abstract
AIMS To analyse the relative importance of different biological and technical factors on the analytical sensitivity of conventional culture methods for detection of Salmonella Dublin in cattle faeces. METHODS AND RESULTS Faeces samples collected from six adult bovines from different Salmonella-negative herds were split into subpools and spiked with three strains of S. Dublin at a concentration of c. 10 CFU g⁻¹ faeces. Each of the 18 strain-pools was divided into two sets of triplicates of four volumes of faecal matter (1, 5, 10 and 25 g). The two sets were pre-enriched with and without novobiocin, followed by combinations of culture media (three types) and selective media (two types). The sensitivity of each combination and sources of variation in detection were determined by a generalized linear mixed model using a split-plot design. CONCLUSIONS Biological factors, such as faecal origin and S. Dublin strain, influenced the sensitivity more than technical factors. Overall, the modified semi-solid Rappaport Vassiliadis (MSRV) culture medium had the most reliable detection capability, whereas detection with selenite cystine broth and Mueller Kauffman tetrathionate broth combinations varied more in sensitivity and rarely reached the same level of detection as MSRV in this experiment. SIGNIFICANCE AND IMPACT OF THE STUDY The study showed that, for MSRV culture medium with xylose lysine decarboxylase agar as the indicative medium, the sensitivity of the faecal culture method may be improved by focusing on strain variations and the ecology of the faecal sample. Detailed investigation of the faecal flora (pathogens and normal flora) and its interaction with chemical factors may result in an improved method for detection of S. Dublin.
|
27
|
Effects of experimental immunosuppression in cattle with persistently high antibody levels to Salmonella Dublin lipopolysaccharide O-antigens. BMC Vet Res 2007; 3:17. [PMID: 17683640 PMCID: PMC1963323 DOI: 10.1186/1746-6148-3-17] [Citation(s) in RCA: 14] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/02/2007] [Accepted: 08/07/2007] [Indexed: 11/10/2022] Open
Abstract
BACKGROUND Salmonella Dublin (S. Dublin) is a zoonotic bacterium that is host-adapted to cattle. The bacterium can cause subclinical persistent infection in cattle (carriers), which may be reactivated. During reactivation, animals may shed bacteria, thus constituting a source of infection for other animals. Identification of such carriers is assumed to be critical in attempts to control and eradicate the infection. Some authors suggest that persistently high antibody levels in serum or milk are indicative of a carrier state in cattle. However, this has been questioned by other studies in which S. Dublin was not found in all animals suspected of being carriers based on antibody measurements when such animals were examined at slaughter. Some hypothesize that the failure to isolate bacteria from cattle with long-term high antibody levels is due to a latent infection stage that can later be reactivated, for instance during stress around calving or due to transportation. This study examined nine adult cattle with persistently high antibody responses to S. Dublin O-antigen based lipopolysaccharide for cultivable bacteria in faeces, milk and internal organs before and after transportation, isolation and experimental immunosuppression with dexamethasone sodium phosphate over a period of 7-14 days. RESULTS Clear signs of immunosuppression, expressed as leucocytosis and neutrophilia, were seen in all animals on days 3-5 after the first injections with dexamethasone sodium phosphate. No clinical signs or necropsy findings indicating salmonellosis were observed in any of the animals. No shedding of S. Dublin was found in faeces (collected four times daily) or milk (collected twice daily) at any point in time during the 7-14 day period. S. Dublin was recovered by a conventional culture method from tissue samples from mammary lymph nodes, spleen and liver collected from three animals at necropsy.
CONCLUSION In this study, immunosuppression by transportation stress or dexamethasone treatment did not lead to excretion of S. Dublin in milk or faeces from infected animals. The study questions the general conception that cattle with persistently high antibody levels against S. Dublin O-antigens in naturally infected herds should be considered high risk for transmission and therefore culled as part of effective intervention strategies. It is suggested that the location of S. Dublin-infected foci in the animal plays a major role in the risk of excreting bacteria.
|
28
|
Risk Factors for Changing Test Classification in the Danish Surveillance Program for Salmonella in Dairy Herds. J Dairy Sci 2007; 90:2815-25. [PMID: 17517722 DOI: 10.3168/jds.2006-314] [Citation(s) in RCA: 38] [Impact Index Per Article: 2.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/19/2022]
Abstract
A surveillance program in which all cattle herds in Denmark are classified into Salmonella infection categories has been in place since 2002. Dairy herds were considered test negative, and thus most likely free of infection, if Salmonella antibody measurements were consistently low in bulk tank milk samples collected every 3 mo. Herds were considered test positive, and thus most likely infected, if the 4-quarter moving average bulk tank milk antibody concentration was high or if there was a large increase in the most recent measurement compared with the average of the previous 3 samples. The objective of this study was to evaluate risk factors for changing from test negative to test positive, indicative of herds becoming infected from one quarter of the year to the next, and risk factors for changing from test positive to test negative, indicative of herds recovering from infection between 2 consecutive quarters. The Salmonella serotypes in question were Salmonella Dublin or other serotypes that cross-react with the Salmonella Dublin antigen in the ELISA (e.g., some Salmonella Typhimurium types). Two logistic regression models that accounted for repeated measurements at the herd level and controlled for herd size and regional effects were used. Data from 2003 were used for the analyses. A change from test negative to positive occurred in 2.0% of the quarterly observations (n = 21,007) from test-negative dairy herds. A change from test positive to negative occurred in 10.0% of quarterly observations (n = 6,168) from test-positive dairy herds. The higher the number of test-positive neighbor herds in the previous year-quarter, the more likely herds were to become test positive for Salmonella. The number of cattle purchased from test-positive herds was also associated with changing from test negative to positive, as was herd size: the larger the herd, the more likely it was to become test positive. The effect of herd size on recovery was less clear. Large herds consisting mainly of large breeds or having test-positive neighbors within a 2-km radius were less likely to change from test positive to negative, whereas breed and neighbor factors were not found to be important for small herds. Organic production was associated with remaining test positive, but not with becoming test positive. The results emphasize the importance of external and internal biosecurity measures to control Salmonella infections.
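The two-part classification rule described above (a high 4-quarter moving average, or a large jump in the latest sample relative to the previous three) can be sketched as a small function. The cutoff values below are hypothetical, chosen only for illustration; the program's actual thresholds are not given in the abstract.

```python
def classify_herd(odc_values, high_cutoff=25.0, jump_factor=2.0):
    """Classify a dairy herd from quarterly bulk tank milk antibody
    measurements (e.g. ODC%). `high_cutoff` and `jump_factor` are
    hypothetical illustration values, not the program's real thresholds."""
    if len(odc_values) < 4:
        raise ValueError("need at least four quarterly measurements")
    moving_avg = sum(odc_values[-4:]) / 4      # 4-quarter moving average
    prev_avg = sum(odc_values[-4:-1]) / 3      # average of the previous 3 samples
    latest = odc_values[-1]                    # most recent measurement
    if moving_avg >= high_cutoff or latest >= jump_factor * prev_avg:
        return "test positive"
    return "test negative"

# A herd with consistently low values stays test negative; a sudden jump
# or a high moving average flips it to test positive.
print(classify_herd([5, 6, 5, 7]))     # low and stable
print(classify_herd([5, 6, 5, 40]))    # large jump in the latest sample
```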
|
29
|
Use of IgG avidity ELISA to differentiate acute from persistent infection with Salmonella Dublin in cattle. J Appl Microbiol 2006; 100:144-52. [PMID: 16405694 DOI: 10.1111/j.1365-2672.2005.02758.x] [Citation(s) in RCA: 9] [Impact Index Per Article: 0.5] [Indexed: 11/30/2022]
Abstract
AIMS To investigate whether an immunoglobulin (Ig)G avidity ELISA can be used to differentiate between acute and persistent infection with Salmonella (S.) Dublin in cattle, and to determine whether the IgG isotype responses, IgG(1) and IgG(2), differ between acute and persistent infections. METHODS AND RESULTS Animals were selected from two herds with long-term infection (years) and two herds recently infected (<3 months). Forty-seven animals were categorized into groups based on the persistence of their antibody level in milk. Based on titres from two serial dilutions, the avidity index (AI) was calculated for IgG (IgG-AI), IgG(1) (IgG(1)-AI) and IgG(2) (IgG(2)-AI). The mean IgG-AI for suspected carrier animals with either persistently high (group 1) or persistently high to medium-high (group 2) antibody levels was significantly (P = 0.003) higher (32.1% and 38.4%) than for acutely infected animals (21.7% and 22.3%). The probability of being a suspected carrier was associated with IgG-AI, antibody level in the sample and age; however, the effect of age could be the result of biased sample selection. Specificities and sensitivities were calculated at a range of cut-off values for IgG-AI and IgG(1)-AI. Overall, IgG(2)-AI was high compared with IgG(1)-AI, and there was no difference in IgG(2)-AI between infection groups, nor in the IgG(2):IgG(1) ratio between acute and persistent infection groups. CONCLUSIONS Assuming that a persistently high antibody response is indicative of persistent infection with S. Dublin in cattle, it can be concluded that IgG-AI can aid in differentiating between acute and long-term infection at the herd level. However, for the test to be useful as an alternative to repeated sampling over time for detection of persistently infected carriers during control strategies in cattle herds, it needs to be optimized and studied further in a larger sample of well-characterized infections in cattle. The affinity of IgG(2) is higher than that of IgG(1) early in S. Dublin infection, and there appears to be no difference in IgG(2)-AI between the acute and chronic infection stages. SIGNIFICANCE AND IMPACT OF THE STUDY For decades, the strategies for detection of persistently infected cattle in S. Dublin-infected herds have involved repeated bacteriological culture of faecal samples or repeated antibody measurements over several months. Both methods are time-consuming and costly, making a new method for detection of carrier animals based on a single sampling highly desirable. This study illustrates a tool, IgG-AI, which may prove useful, although more validation of the method is required before it is used in practice.
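As a rough illustration of the avidity index idea, the sketch below uses the common chaotrope-wash definition (antibody binding that survives a urea wash, as a percentage of the untreated signal). The abstract computes AI from titres of two serial dilutions, so this formula and the example optical densities are assumptions, not the study's exact method.

```python
def avidity_index(od_with_urea, od_without_urea):
    """Avidity index (%) under the common chaotrope-wash definition:
    the fraction of antibody binding that survives a urea wash.
    NOTE: an assumed formula for illustration, not the study's method."""
    return 100.0 * od_with_urea / od_without_urea

# Hypothetical optical densities chosen to mirror the reported group means:
carrier_ai = avidity_index(0.64, 2.0)  # ~32%, like the persistently high group
acute_ai = avidity_index(0.44, 2.0)    # ~22%, like the acutely infected group
print(carrier_ai, acute_ai)
```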
|
30
|
Simulation model estimates of test accuracy and predictive values for the Danish Salmonella surveillance program in dairy herds. Prev Vet Med 2006; 77:284-303. [PMID: 16979767 DOI: 10.1016/j.prevetmed.2006.08.001] [Citation(s) in RCA: 28] [Impact Index Per Article: 1.6] [Received: 08/10/2005] [Revised: 06/29/2006] [Accepted: 08/10/2006] [Indexed: 10/24/2022]
Abstract
The Danish government and cattle industry instituted a Salmonella surveillance program in October 2002 to help reduce Salmonella enterica subsp. enterica serotype Dublin (S. Dublin) infections. All dairy herds are tested by measuring antibodies in bulk tank milk at 3-month intervals. The program is based on a well-established ELISA, but the overall accuracy and misclassification of the test program had not previously been investigated. We developed a model to simulate repeated bulk tank milk antibody measurements for dairy herds conditional on true infection status. The distributions of bulk tank milk antibody measurements for infected and noninfected herds were determined from field study data. Herd infection was defined as having either ≥1 Salmonella culture-positive fecal sample or ≥5% within-herd prevalence based on antibody measurements in serum or milk from individual animals. No distinction was made between S. Dublin and other Salmonella serotypes that cross-react in the ELISA. The simulation model was used to estimate the accuracy of herd classification for true herd-level prevalence values ranging from 0.02 to 0.5. Test program sensitivity was 0.95 across the range of prevalence values evaluated. Specificity was inversely related to prevalence and ranged from 0.83 to 0.98; for a true herd-level infection prevalence of 15%, the estimated specificity was 0.96. Also at the 15% herd-level prevalence, approximately 99% of herds classified as negative in the program would be truly noninfected and 80% of herds classified as positive would be infected. These predictive values were consistent with the primary goal of the surveillance program, which was to have confidence that herds classified negative would be free of Salmonella infection.
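The predictive values quoted above follow directly from Bayes' rule applied to the reported sensitivity, specificity and prevalence; a minimal check:

```python
def predictive_values(se, sp, prevalence):
    """Positive and negative predictive values from test sensitivity,
    specificity and true prevalence (Bayes' rule)."""
    p = prevalence
    ppv = se * p / (se * p + (1 - sp) * (1 - p))
    npv = sp * (1 - p) / (sp * (1 - p) + (1 - se) * p)
    return ppv, npv

# Values reported in the abstract: Se = 0.95, Sp = 0.96 at 15% prevalence.
ppv, npv = predictive_values(0.95, 0.96, 0.15)
print(round(ppv, 2), round(npv, 2))  # ~0.81 and ~0.99, matching the text
```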
|
31
|
Reduced prevalence of early preterm delivery in women with Type 1 diabetes and microalbuminuria--possible effect of early antihypertensive treatment during pregnancy. Diabet Med 2006; 23:426-31. [PMID: 16620272 DOI: 10.1111/j.1464-5491.2006.01831.x] [Citation(s) in RCA: 47] [Impact Index Per Article: 2.6] [Indexed: 11/30/2022]
Abstract
AIMS In normotensive women with Type 1 diabetes and microalbuminuria, we previously found preterm delivery (<34 weeks) in 23% of pregnancies. Antihypertensive treatment was initiated in late pregnancy when preeclampsia was diagnosed and diastolic blood pressure was >90 mmHg. From April 2000, our routine was changed and early antihypertensive treatment with methyldopa was initiated if antihypertensive treatment had been given prior to pregnancy, if urinary albumin excretion (UAE) was >2 g/24 h, or if blood pressure was >140/90 mmHg. The present study describes the impact of this more aggressive antihypertensive treatment on the prevalence of preterm delivery. METHODS The old cohort (1995-1999) consisted of 26 and the new cohort (2000-2003) of 20 pregnant women with Type 1 diabetes and microalbuminuria. All were referred before gestational week 17. RESULTS The cohorts were comparable with regard to age, diabetes duration, prepregnancy body mass index, HbA1c, blood pressure [121 (13)/71 (8) vs. 121 (14)/73 (8) mmHg; mean (SD)] and early UAE [69 (16-278) vs. 74 (30-287) mg/24 h; geometric mean and range]. Antihypertensive treatment was initiated in the old cohort at 29 (20-33) weeks (n = 9) and in the new cohort at 13 (0-34) weeks (n = 10). The prevalence of preterm delivery before 34 weeks was reduced from 23% to zero (P = 0.02), preterm delivery before 37 weeks from 62% to 40% (P = 0.15) and preeclampsia from 42% to 20% (P = 0.11). Perinatal mortality occurred in 4% vs. 0%. Birth weight was 3124 (767) g vs. 3279 (663) g. CONCLUSION Introduction of early antihypertensive treatment with methyldopa in normotensive pregnant women with Type 1 diabetes and microalbuminuria resulted in a significant reduction in preterm delivery before gestational week 34.
|
32
|
Molecular differentiation within and among island populations of the endemic plant Scalesia affinis (Asteraceae) from the Galápagos Islands. Heredity (Edinb) 2005; 93:434-42. [PMID: 15280895 DOI: 10.1038/sj.hdy.6800520] [Citation(s) in RCA: 22] [Impact Index Per Article: 1.2] [Indexed: 11/09/2022]
Abstract
Molecular variance was estimated in seven populations of the endemic species Scalesia affinis within and among islands of the Galápagos. The analysis, based on 157 polymorphic AFLP markers, revealed high differentiation among populations, most of which was partitioned among islands. In addition, the information content of the AFLP markers was tested with sets of discriminant analyses based on different numbers of markers, which indicated that the markers were highly informative in discriminating the populations. Although one of four populations from the island Isabela was sampled from a volcano 100 km away from the remaining populations, this population resembled the others on Isabela. The partitioning of molecular variance (AFLP) resulted in two groups, one consisting of the populations from Isabela and one of the populations from Santa Cruz and Floreana. The differentiation in two chloroplast microsatellites was higher than for the AFLP markers and was partitioned equally among populations within islands and among islands. Thus, gene flow via fruits within islands is as limited as among islands. The lower within-island differentiation in the nuclear AFLP markers may thus indicate that gene flow within islands is mostly accounted for by pollen transfer. S. affinis is the only species in the genus that is not listed in the 2000 IUCN Red List of Threatened Species. However, due to prominent grazing and land exploitation, some populations have recently been reduced markedly, which was reflected in lower diversity. As inbreeding depression is present in the species, these recent bottlenecks are a threat to the populations.
|
33
|
Salmonella Dublin infection in dairy cattle: risk factors for becoming a carrier. Prev Vet Med 2004; 65:47-62. [PMID: 15454326 DOI: 10.1016/j.prevetmed.2004.06.010] [Citation(s) in RCA: 54] [Impact Index Per Article: 2.7] [Received: 04/03/2003] [Revised: 06/02/2004] [Accepted: 06/21/2004] [Indexed: 11/15/2022]
Abstract
Long-term Salmonella Dublin carrier animals harbor the pathogen in lymph nodes and internal organs, can periodically shed bacteria through feces or milk, and thereby contribute to transmission of the pathogen within infected herds. Thus, it is of great interest to reduce the number of new carrier animals in cattle herds. An observational field study was performed to evaluate factors affecting the risk that dairy cattle become carrier animals after infection with Salmonella Dublin. Based on repeated sampling, cattle in 12 Danish dairy herds were categorized according to course of infection as either carriers (n = 157) or transiently infected (n = 87). The infection date for each animal was estimated from fecal excretion and antibody responses. The relationship between the course of infection (carrier versus transiently infected) and risk factors was analyzed using a random-effect multilevel, multivariable logistic regression model. The animals with the highest risk of becoming carriers were heifers infected between the age of 1 year and first calving, and cows infected around the time of calving. The risk was higher in the first two quarters of the year (late winter to spring) and when the prevalence of potential shedders in the herd was low. The risk also varied between herds; the herds with the highest risk of carrier development were those with clinical disease outbreaks during the study period. These findings are useful for future control strategies against Salmonella Dublin, because they show the importance of optimized calving and heifer management, and because they show that carriers are still produced even when the herd prevalence is low. The results raise new questions about the development of the carrier state in cattle after infection with low doses of Salmonella Dublin.
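The model form used above, a logistic regression with animal-level covariates and a herd-level random effect, can be sketched as follows. All coefficient values and predictor names here are hypothetical, chosen only to show how such a model turns risk factors into a carrier probability; they are not the study's estimates.

```python
import math

def carrier_probability(intercept, coefs, covariates, herd_effect=0.0):
    """P(carrier | covariates, herd) under a logistic model with a
    herd-level random intercept. Coefficients are on the log-odds scale.
    All values supplied below are hypothetical illustrations."""
    logit = intercept + herd_effect + sum(
        coefs[name] * value for name, value in covariates.items())
    return 1.0 / (1.0 + math.exp(-logit))

# Hypothetical log-odds ratios for the risk factors named in the abstract.
coefs = {
    "heifer_1yr_to_first_calving": 0.9,
    "infected_around_calving": 0.7,
    "first_half_of_year": 0.5,
    "low_shedder_prevalence": 0.4,
}
p = carrier_probability(-1.5, coefs,
                        {"heifer_1yr_to_first_calving": 1,
                         "infected_around_calving": 0,
                         "first_half_of_year": 1,
                         "low_shedder_prevalence": 1},
                        herd_effect=0.3)  # herd_effect > 0: e.g. an outbreak herd
print(round(p, 2))
```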
|
34
|
Evaluation of an indirect serum ELISA and a bacteriological faecal culture test for diagnosis of Salmonella serotype Dublin in cattle using latent class models. J Appl Microbiol 2004; 96:311-9. [PMID: 14723692 DOI: 10.1046/j.1365-2672.2004.02151.x] [Citation(s) in RCA: 41] [Impact Index Per Article: 2.1] [Indexed: 11/20/2022]
Abstract
AIMS To evaluate a conventional bacteriological test based on faecal culture and an indirect serum ELISA for detection of S. Dublin-infected cattle, and to compare the predictive values of the two tests in relation to prevalence. METHODS AND RESULTS A total of 4531 paired samples from cattle in 29 dairy herds were analysed for the presence of S. Dublin bacteria in faeces and for immunoglobulins directed against S. Dublin lipopolysaccharide in an indirect serum ELISA. Sensitivity and specificity were estimated at two ELISA cut-off values using a validation method based on latent class models, which presumably provides less biased results than traditional validation methods. Stratification of data into three age groups gave significantly better estimates of test performance of the ELISA. Receiver operating characteristic (ROC) curves were constructed to compare the overall performance of the ELISA between the three age groups. The sensitivity of the faecal culture test was low (6-14%). The ELISA appeared to have higher validity for animals aged 100-299 days than for older or younger animals. Overall, the negative predictive value of the ELISA was 2-10 times higher than that of the faecal culture test at realistic prevalences of infection in the test population. CONCLUSIONS The diagnostic sensitivity of the faecal culture test for detection of S. Dublin is poor, while its specificity is 1. The superior sensitivity and negative predictive value of the serum ELISA make this test preferable to faecal culture as an initial screening test and for certification of herds not infected with S. Dublin. SIGNIFICANCE AND IMPACT OF THE STUDY A quantitative estimate of the sensitivity of a faecal culture test for S. Dublin in a general population was provided, and the ELISA was shown to be an appropriate alternative diagnostic test. Preferably, samples from animals aged 100-299 days should be used, as these give the best overall performance of the ELISA. Plots of ROC curves and predictive values in relation to prevalence facilitate optimisation of the ELISA cut-off value.
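Constructing a ROC curve amounts to sweeping the ELISA cut-off and recording sensitivity and specificity at each value. The scores and cut-offs below are invented for illustration and are not data from the study:

```python
def roc_points(infected_scores, noninfected_scores, cutoffs):
    """(cutoff, sensitivity, specificity) for each candidate ELISA cut-off.
    A sample is called positive when its score is >= the cut-off."""
    points = []
    for c in cutoffs:
        se = sum(s >= c for s in infected_scores) / len(infected_scores)
        sp = sum(s < c for s in noninfected_scores) / len(noninfected_scores)
        points.append((c, se, sp))
    return points

# Hypothetical ELISA scores for truly infected and noninfected animals.
infected = [35, 50, 20, 80, 60, 45]
noninfected = [5, 10, 15, 8, 25, 12]
for cutoff, se, sp in roc_points(infected, noninfected, [10, 20, 30]):
    print(cutoff, se, sp)  # raising the cut-off trades sensitivity for specificity
```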
|
35
|
[Statins. A new osteoporosis prophylaxis?]. Ugeskr Laeger 2001; 163:2007-9. [PMID: 11307362] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Indexed: 02/19/2023]
|
36
|
K-ras mutations in sinonasal adenocarcinomas in patients occupationally exposed to wood or leather dust. Cancer Lett 1998; 126:59-65. [PMID: 9563649 DOI: 10.1016/s0304-3835(97)00536-3] [Citation(s) in RCA: 52] [Impact Index Per Article: 2.0] [Indexed: 02/07/2023]
Abstract
Of 39 males diagnosed with sinonasal adenocarcinomas over 30 years in the Lund University Hospital catchment area (1.5 million inhabitants), archival tumor tissue was available from 29. Of these, 16 had been exposed to wood dust and three had been exposed to leather dust. The intestinal-type and papillary adenocarcinomas were more common in the exposed patients (P = 0.0002, Fisher's exact test). The tumors from all but one of the 29 sinonasal adenocarcinomas could be analyzed for point mutations at codons 12, 13 and 61 of the K-ras gene. Four mutations were detected in the 28 tumors. The three mutations in the patients exposed to wood and leather dust were all G:C --> A:T transitions, with two at position 2 of codon 12 and one at position 2 of codon 13. The high proportion of G:C --> A:T mutations in this rare tumor may reflect a genotoxic agent in wood and leather dust.
|
37
|
Abstract
Somatic cell gene mutation arising in vivo may be considered a biomarker for genotoxicity. Assays detecting mutations of the haemoglobin and glycophorin A genes in red blood cells, and of the hypoxanthine-guanine phosphoribosyltransferase gene and human leucocyte antigens in T-lymphocytes, are available in humans. This MiniReview describes these assays and their application to studies of individuals exposed to genotoxic agents. Moreover, with the implementation of molecular biology techniques, mutation spectra can now be defined in addition to the quantitation of in vivo mutant frequencies. We describe current screening methods for unknown mutations, including denaturing gradient gel electrophoresis, single-strand conformation polymorphism analysis, heteroduplex analysis, chemical modification techniques and enzymatic cleavage methods. The advantage of mutation detection as a biomarker is that it integrates exposure and sensitivity in one measurement. With the analysis of mutation spectra, it may thus be possible to identify the causative genotoxic agent.
|
38
|
Detection of the plasma cholinesterase K variant by PCR using an amplification-created restriction site. Hum Hered 1996; 46:26-31. [PMID: 8825459 DOI: 10.1159/000154321] [Citation(s) in RCA: 35] [Impact Index Per Article: 1.3] [Indexed: 02/02/2023]
Abstract
Ten individuals registered at the Danish Cholinesterase Research Unit were examined at the DNA level for the presence of the K allele of plasma cholinesterase, using amplification-created restriction sites (ACRSs). A further nine members of a family registered at the unit were tested for mutations of the K and atypical variants. The frequency of the K allele was calculated from examination of normal material from 25 individuals, representing 50 random alleles. The results show that the ACRS method successfully demonstrates the presence of the K variant, whose frequency in the Danish population was found to be 0.18. We conclude that this technique is a reliable and rapid non-radioactive diagnostic assay for detecting the plasma cholinesterase K variant.
|
39
|
Detection of ten new mutations by screening the gene encoding factor IX of Danish hemophilia B patients. Thromb Haemost 1995; 73:774-8. [PMID: 7482402] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Indexed: 01/25/2023]
Abstract
Hemophilia B is caused by a wide range of mutations. In order to characterize the mutations among patients in Denmark, we have systematically screened the entire coding region, the promoter region and exon flanking sequences of the gene encoding factor IX using single strand conformation and heteroduplex analyses. Patients from 32 different families were examined, and point mutations (23 different) were found in all of them. Ten of the mutations have not been reported by others; they include a splice site mutation, a single base pair deletion, and missense mutations. Notably, the study contains a female patient and a previously described Leyden mutation. In ten families with sporadic cases of hemophilia B, all 10 mothers were found to be carriers. The origin of two of these mutations was established.
|
40
|
Abstract
Glutathione peroxidase, one of the major antioxidants in the human brain, has been found to have decreased activity in patients suffering from multiple sclerosis (MS). This study compares the activity of lymphocyte glutathione peroxidase (L-GSH-px) in MS patients suffering from acute relapses with clinically stable MS patients and with control patients referred with nondemyelinating neurological diseases. All three groups showed an increase of mean enzymatic activity (MEA) during the observation period. The highest MEA in this study was observed in the MS groups. However, there were no significant differences in the L-GSH-px activity in the three groups. These results are not in accordance with previous investigations, and the need for further research in this field is emphasized.
|
41
|
|
42
|
Analysis of the decreased NK (natural killer) activity in lung cancer patients, using whole blood versus separated mononuclear cells. JOURNAL OF CLINICAL & LABORATORY IMMUNOLOGY 1989; 29:71-7. [PMID: 2632804] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Indexed: 01/01/2023]
Abstract
The aim of this study was to analyze whether a whole blood assay would give a more correct measure of NK activity than assays using separated mononuclear cells (SMNC). We found that the NK activity of whole blood was higher than the NK activity of SMNC in the 28 lung cancer patients investigated (p = 0.01), whereas this difference between the assays could not be demonstrated in the 29 healthy controls. Since no differences were found between the NK activity of washed blood, SMNC, and monocyte-depleted lymphoid cells, there was no indication that the lower NK activity of SMNC in comparison with whole blood was due to cell loss or to a systematic disturbing effect due to monocytes. The possible effect of plasma factors on the whole blood NK activity was analyzed by comparing whole blood and washed blood. The NK activity of whole blood was increased in comparison with washed blood in the lung cancer patients (p < 0.0001) indicating a stimulatory effect of plasma. Further, the finding that the reactive capability of lymphocytes from cancer patients was higher than in controls could indicate preactivation of the lymphocytes from the cancer patients due to the presence of stimulatory plasma factors. The NK activity of lung cancer patients was lower than the NK activity of healthy controls. The difference was found to be smaller with whole blood than with SMNC as effector cells, although both differences were significant. The decreased NK activity of cancer patients could be due to blocking immune complexes (IC), but we found no evidence for circulating or cell-bound IC in the lung cancer patients.
|
43
|
A polyclonal IgM-RF enzyme-linked immunosorbent assay for the detection of circulating immune complexes. JOURNAL OF CLINICAL & LABORATORY IMMUNOLOGY 1988; 26:195-200. [PMID: 3199429] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Indexed: 01/04/2023]
Abstract
A microplate-adapted polyclonal IgM-rheumatoid factor enzyme-linked immunosorbent assay (pIgM-RF ELISA) for the detection of circulating immune complexes (cIC) is presented. The assay involves the competitive binding of cIC and horseradish peroxidase conjugated aggregated human IgG (HRP-AHG) to solid-phase bound polyclonal IgM-RF (pIgM-RF). Aggregated human IgG (AHG) inhibited the binding of HRP-AHG to pIgM-RF in a dose-dependent way. The detection limit of the assay was about 125 ng AHG/ml diluted serum. The coefficients of variation for the assay varied from 5.0 to 14.7% for intra-assay runs and from 4.5 to 13.8% for inter-assay runs. The levels of cIC in sera from 29 patients with systemic lupus erythematosus (SLE), 85 untreated patients with breast cancer and 105 blood bank donors were studied by the pIgM-RF ELISA. Increased levels of cIC were demonstrated in 41.4% of the SLE group, in 8.2% of the breast cancer group, and in 1.9% of the normal control group. The difference in cIC activity between the SLE group and the normal control group was statistically significant.
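The intra- and inter-assay precision figures above are coefficients of variation; a minimal sketch of the calculation, with hypothetical replicate absorbances:

```python
import statistics

def coefficient_of_variation(replicates):
    """CV (%) = 100 * sample standard deviation / mean, the usual
    measure of intra- or inter-assay precision for an ELISA."""
    return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

# Hypothetical replicate absorbances for one serum sample run four times.
intra_assay = [0.52, 0.55, 0.49, 0.51]
cv = coefficient_of_variation(intra_assay)
print(round(cv, 1))  # a few percent, comparable to the 4.5-14.7% reported
```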
|