101
|
Casey AL, Karpanen TJ, Nightingale P, Elliott TS. The risk of microbial contamination associated with six different needle-free connectors. Br J Nurs 2018; 27:S18-S26. PMID: 29368573. DOI: 10.12968/bjon.2018.27.2.s18.
Abstract
BACKGROUND: Needle-free connectors are widely used in clinical practice. The aim of this study was to identify any differences in microbial ingress between six different connectors (three neutral-displacement, one negative-displacement and two anti-reflux connectors). METHODS: Each connector underwent a 7-day clinical simulation involving repeated microbial contamination of the connector's injection ports with Staphylococcus aureus, followed by decontamination and then saline flushes through each connector. The simulation was designed as a surrogate marker for the potential risk of contamination in clinical practice. RESULTS: Increasing numbers of S. aureus were detected in the flushes over the 7 days of sampling despite adherence to a rigorous decontamination programme. Significant differences in the number of S. aureus recovered from the saline flush were also detected between some types of connectors. Two durations (5 and 15 seconds) of decontamination of the injection ports with 70% isopropyl alcohol (IPA) wipes were also investigated. There was no significant difference between the median number of S. aureus recovered in the saline flushes following a 5-second (165.5, 95% CI = 93-260) or a 15-second (75, 95% CI = 10-190) decontamination regimen. CONCLUSIONS: The findings suggest that there may be differences in the risk of internal microbial contamination between different types of connectors, and that even 15 seconds of decontamination may not fully eradicate microorganisms from the injection ports of some devices.
|
102
|
Zhang N, Huang H, Su B, Ma X, Li Y. A human behavior integrated hierarchical model of airborne disease transmission in a large city. Build Environ 2018; 127:211-220. PMID: 32287976. PMCID: PMC7115769. DOI: 10.1016/j.buildenv.2017.11.011.
Abstract
Epidemics of infectious diseases such as SARS, H1N1, and MERS threaten public health, particularly in large cities such as Hong Kong. We constructed a human behavior integrated hierarchical (HiHi) model based on the SIR (Susceptible, Infectious, and Recovered) model, the Wells-Riley equation, and population movement considering both spatial and temporal dimensions. The model considers more than 7 million people, 3 million indoor environments, and 2566 public transport routes in Hong Kong. Smallpox, which could be spread through airborne routes, is studied as an example. The simulation is based on people's daily commutes and indoor human behaviors, which were summarized by mathematical patterns. We found that 59.6%, 18.1%, and 13.4% of patients become infected in their homes, offices, and schools, respectively. If both work stoppage and school closure measures are taken when the number of infected people is greater than 1000, an infectious disease will be effectively controlled after 2 months. The peak number of infected people will be reduced by 25% compared to taking no action, and the time of peak infections will be delayed by about 40 days if 90% of the infected people go to hospital during the infectious period. When ventilation rates in indoor environments increase to five times their default settings, smallpox will be naturally controlled. Residents of Kowloon and the north part of Hong Kong Island have a high risk of infection from airborne infectious diseases. Our HiHi model reduces the calculation time for infection rates to an acceptable level while preserving accuracy.
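The HiHi model described above couples population movement with per-venue infection probabilities. A minimal sketch of how its two named building blocks fit together — a Wells-Riley infection probability feeding an SIR-style daily update — might look as follows (all parameter values are illustrative assumptions, not taken from the paper):

```python
import math

def wells_riley_p(I, q, p, t, Q):
    """Wells-Riley probability of infection in one indoor venue:
    I infectors, q quanta/h per infector, p breathing rate (m^3/h),
    t exposure time (h), Q outdoor ventilation rate (m^3/h)."""
    return 1.0 - math.exp(-I * q * p * t / Q)

def sir_step(S, Inf, R, p_inf, gamma=0.1):
    """One daily SIR update in which the force of infection comes from
    the Wells-Riley probability rather than a fixed contact rate beta."""
    new_inf = S * p_inf
    new_rec = Inf * gamma
    return S - new_inf, Inf + new_inf - new_rec, R + new_rec

# Toy run: 10,000 susceptibles sharing a venue with infectors, 8 h/day.
S, Inf, R = 10000.0, 5.0, 0.0
for day in range(60):
    p = wells_riley_p(I=Inf, q=10, p=0.48, t=8, Q=2000)
    S, Inf, R = sir_step(S, Inf, R, p)
```

Raising Q (ventilation) shrinks the per-venue probability, which is consistent with the abstract's observation that increasing ventilation rates fivefold brings the epidemic under control.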
|
103
|
Moazeni M, Nikaeen M, Hadi M, Moghim S, Mouhebat L, Hatamzadeh M, Hassanzadeh A. Estimation of health risks caused by exposure to enteroviruses from agricultural application of wastewater effluents. Water Res 2017; 125:104-113. PMID: 28841422. DOI: 10.1016/j.watres.2017.08.028.
Abstract
Agricultural reuse of wastewater is a common practice worldwide, especially in arid and semiarid areas, due to freshwater scarcity. Wastewater irrigation in the Middle East, one of the most water-stressed regions in the world, could be a key factor for socio-economic development, but the microbial contamination of untreated or partially treated wastewater is a serious public health concern. Potential transmission of enteric viral infections through wastewater reuse in agricultural activities represents a real health risk for exposed individuals. Accordingly, it is important to assess the health risks associated with wastewater reuse. A quantitative microbial risk assessment (QMRA) with Monte-Carlo simulation was used to estimate the annual risk of enterovirus (EV) infection and the disease burden for farmers and consumers of wastewater-irrigated lettuce in Iran, a semiarid country in the Middle East. Risk analysis was performed based on measured concentrations of EV in the effluent of two activated sludge wastewater treatment plants (WWTPs). Wastewater effluent sampling was carried out over a nine-month period, and the presence of total and fecal coliforms and EV was determined. Fecal coliform bacteria were found at high levels that exceeded the guideline limit for wastewater reuse in agriculture. EVs were detected in 40% of samples, with the highest frequency in summer, at a mean of 12 and 16 pfu/ml for WWTP-A and WWTP-B, respectively. Statistical analysis showed no correlation between the concentrations of fecal coliforms and EV. The estimated infection risk for EVs was 8.8 × 10⁻¹ and 8.2 × 10⁻¹ per person per year (pppy) for farmers at WWTP-A and WWTP-B, respectively, about 2 logs higher than the tolerable infection risk of 2 × 10⁻³ pppy. The estimated infection risk and disease burden for lettuce consumers were lower, but still above the guideline limits. The median disease burden for consumption of lettuce irrigated with activated sludge effluents was about 10⁻³ Disability Adjusted Life Years (DALY) pppy, which exceeded the WHO guideline threshold of 10⁻⁴ DALY pppy. The results of the study indicate that activated sludge effluents require an additional reduction of EVs to achieve an acceptable level of risk for agricultural reuse of wastewater.
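The QMRA approach described above can be sketched as a Monte-Carlo simulation with an exponential dose-response model, where the annual risk is the complement of escaping infection at every exposure event. The dose-response parameter r, the ingested volume, and the number of exposure events below are illustrative assumptions, not the study's actual inputs:

```python
import math
import random

def annual_infection_risk(mean_conc_pfu_per_ml, ingested_ml, r,
                          events_per_year, n_sims=5000, seed=1):
    """Monte-Carlo estimate of the median annual infection risk.
    Exponential dose-response: P_event = 1 - exp(-r * dose);
    annual risk = 1 - prod(1 - P_event) over all exposure events."""
    rng = random.Random(seed)
    risks = []
    for _ in range(n_sims):
        p_escape_year = 1.0
        for _ in range(events_per_year):
            # Variable effluent quality, drawn around the measured mean.
            conc = rng.expovariate(1.0 / mean_conc_pfu_per_ml)
            dose = conc * ingested_ml
            p_escape_year *= math.exp(-r * dose)
        risks.append(1.0 - p_escape_year)
    risks.sort()
    return risks[len(risks) // 2]  # median annual risk

# Hypothetical farmer scenario: ~1 mL accidental ingestion, 50 events/year.
median_risk = annual_infection_risk(12.0, 1.0, r=0.014, events_per_year=50)
```

With parameters of roughly this magnitude, the simulated annual risk sits far above a 10⁻³-order tolerable level, mirroring the abstract's ~2-log exceedance for farmers.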
|
104
|
Chastagner A, Pion A, Verheyden H, Lourtet B, Cargnelutti B, Picot D, Poux V, Bard É, Plantard O, McCoy KD, Leblond A, Vourc'h G, Bailly X. Host specificity, pathogen exposure, and superinfections impact the distribution of Anaplasma phagocytophilum genotypes in ticks, roe deer, and livestock in a fragmented agricultural landscape. Infect Genet Evol 2017; 55:31-44. PMID: 28807858. DOI: 10.1016/j.meegid.2017.08.010.
Abstract
Anaplasma phagocytophilum is a bacterial pathogen mainly transmitted by Ixodes ricinus ticks in Europe. It infects wild mammals, livestock, and, occasionally, humans. Roe deer are considered to be the major reservoir, but the genotypes they carry differ from those that are found in livestock and humans. Here, we investigated whether roe deer were the main source of the A. phagocytophilum genotypes circulating in questing I. ricinus nymphs in a fragmented agricultural landscape in France. First, we assessed pathogen prevalence in 1837 I. ricinus nymphs (sampled along georeferenced transects) and 79 roe deer. Prevalence was dramatically different between ticks and roe deer: 1.9% versus 76%, respectively. Second, using high-throughput amplicon sequencing, we characterized the diversity of the A. phagocytophilum genotypes found in 22 infected ticks and 60 infected roe deer; the aim was to determine the frequency of co-infections. Only 22.7% of infected ticks carried genotypes associated with roe deer. This finding fits with others suggesting that cattle density is the major factor explaining infected tick density. To explore epidemiological scenarios capable of explaining these patterns, we constructed compartmental models that focused on how A. phagocytophilum exposure and infection dynamics affected pathogen prevalence in roe deer. At the exposure levels predicted by the results of this study and the literature, the high prevalence in roe deer was only seen in the model in which superinfections could occur during all infection phases and when the probability of infection post exposure was above 0.43. We then interpreted these results from the perspective of livestock and human health.
|
105
|
Di Minno G, Navarro D, Perno CF, Canaro M, Gürtler L, Ironside JW, Eichler H, Tiede A. Pathogen reduction/inactivation of products for the treatment of bleeding disorders: what are the processes and what should we say to patients? Ann Hematol 2017; 96:1253-1270. PMID: 28624906. PMCID: PMC5486800. DOI: 10.1007/s00277-017-3028-4.
Abstract
Patients with blood disorders (including leukaemia, platelet function disorders and coagulation factor deficiencies) or acute bleeding receive blood-derived products, such as red blood cells, platelet concentrates and plasma-derived products. Although the risk of pathogen contamination of blood products has fallen considerably over the past three decades, contamination is still a topic of concern. In order to counsel patients and obtain informed consent before transfusion, physicians are required to keep up to date with current knowledge on residual risk of pathogen transmission and methods of pathogen removal/inactivation. Here, we describe pathogens relevant to transfusion of blood products and discuss contemporary pathogen removal/inactivation procedures, as well as the potential risks associated with these products: the risk of contamination by infectious agents varies according to blood product/region, and there is a fine line between adequate inactivation and functional impairment of the product. The cost implications of implementing pathogen inactivation technology are also considered.
|
106
|
Fraile A, McLeish MJ, Pagán I, González-Jara P, Piñero D, García-Arenal F. Environmental heterogeneity and the evolution of plant-virus interactions: Viruses in wild pepper populations. Virus Res 2017; 241:68-76. PMID: 28554561. DOI: 10.1016/j.virusres.2017.05.015.
Abstract
Understanding host-pathogen interactions requires analyses to address the multiplicity of scales in heterogeneous landscapes. Anthropogenic influence on plant communities, especially cultivation, is a major cause of environmental heterogeneity. We have approached the analysis of how environmental heterogeneity determines plant-virus interactions by studying virus infection in a wild plant currently undergoing incipient domestication, the wild pepper or chiltepin, across its geographical range in Mexico. We have shown previously that anthropogenic disturbance is associated with higher infection and disease risk, and with disrupted patterns of host and virus genetic spatial structure. We now show that anthropogenic factors, species richness, host genetic diversity and density in communities supporting chiltepin differentially affect infection risk according to the virus analysed. We also show that in addition to these factors, a broad range of abiotic and biotic variables meaningful to continental scales, have an important role on the risk of infection depending on the virus. Last, we show that natural virus infection of chiltepin plants in wild communities results in decreased survival and fecundity, hence negatively affecting fitness. This important finding paves the way for future studies on plant-virus co-evolution.
|
107
|
Kyrgidis A, Yavropoulou MP, Lagoudaki R, Andreadis C, Antoniades K, Kouvelas D. Increased CD14+ and decreased CD14- populations of monocytes 48 h after zolendronic acid infusion in breast cancer patients. Osteoporos Int 2017; 28:991-999. PMID: 27858122. DOI: 10.1007/s00198-016-3807-0.
Abstract
UNLABELLED: It has been proposed that bisphosphonates cause osteonecrosis of the jaws through impairment of monocyte function and proliferation. Such changes have been confirmed in jaw tissues ex vivo. In this clinical study, we report for the first time a similar pattern of changes in peripheral blood monocytes. INTRODUCTION: The aim of this study was to examine the effect of zolendronic acid administration on peripheral blood white cell populations, seeking a plausible pathophysiological link between bisphosphonates and osteonecrosis of the jaw. METHODS: Twenty-four breast cancer patients under zolendronic acid treatment for bone metastasis were included. Peripheral blood samples were obtained prior to and 48 h following zolendronic acid administration. Flow cytometry was performed, gated on the leukocyte, monocyte, and granulocyte populations, for the CD4/CD8/CD3, CD3/CD16+56/CD45/CD19, CD14/CD123, and CD14/CD23 stains. RESULTS: We recorded a number of changes in the white cell populations 48 h after zolendronic acid administration. Most importantly, among the monocyte populations, we detected statistically significantly increased populations of CD14+/CD23+ (p = 0.038), CD14+/CD23- (p = 0.028), CD14+/CD123+ (p = 0.070, trend), and CD14+/CD123- (p = 0.043). In contrast, statistically significantly decreased populations of CD14-/CD23+ (p = 0.037) and CD14-/CD123+ (p = 0.003) were detected. CONCLUSIONS: Our results provide evidence supporting the hypothesis that bisphosphonate administration modifies the monocyte-mediated immune response. An increase in CD14+ peripheral blood monocyte (PBMC) populations along with a decrease in CD14- PBMC populations was recorded. The latter finding is in accordance with the limited currently existing evidence and warrants further elucidation.
|
108
|
Dobner J, Kaser S. Body mass index and the risk of infection - from underweight to obesity. Clin Microbiol Infect 2017; 24:24-28. PMID: 28232162. DOI: 10.1016/j.cmi.2017.02.013.
Abstract
BACKGROUND: Nutritional status is a well-known risk factor for metabolic and endocrine disorders. Recent studies suggest that dietary intake also affects immune function and, as a consequence, infection risk. AIMS: This review aims to give an overview of the effect of body weight on infection rates at different periods of life. SOURCES: Clinically relevant prospective, cross-sectional and case-control community-based studies are summarized. CONTENT: In children and adolescents, underweight is a significant risk factor for infection, especially in developing countries, probably reflecting malnutrition and poor hygienic standards. Data from industrialized countries suggest that infection rates are also increased in obese children and adolescents. Similarly, several studies suggest a U-shaped relationship, with increased infection rates in both underweight and obese adults. In the latter, infections of the skin and respiratory tract as well as surgical-site infections have consistently been reported to be more common than in normal-weight participants. Paradoxically, mortality of critically ill patients was reduced in obesity in some studies. IMPLICATIONS: Several studies in children and adults suggest that both underweight and obesity are associated with increased infection risk. However, confounding factors such as malnutrition, hygienic status and underlying disease or co-morbidities might hamper accurate assessment of the impact of body weight on infection risk.
|
109
|
Chao LW, Szrek H, Leite R, Ramlagan S, Peltzer K. Do Customers Flee From HIV? A Survey of HIV Stigma and Its Potential Economic Consequences on Small Businesses in Tshwane (Pretoria), South Africa. AIDS Behav 2017; 21:217-226. PMID: 27385027. PMCID: PMC5218977. DOI: 10.1007/s10461-016-1463-1.
Abstract
HIV stigma and discrimination affect care-seeking behavior and may also affect entrepreneurial activity. We interview 2382 individuals in Pretoria, South Africa, and show that respondents believe that businesses with known HIV+ workers may lose up to half of their customers, although the impact depends on the type of business. Survey respondents' fear of getting HIV from consuming everyday products sold by the business-despite a real infection risk of zero-was a major factor driving perceived decline in customers, especially among food businesses. Respondents' perceptions of the decline in overall life satisfaction when one gets sick from HIV and the respondent's dislike of people with HIV were also important predictors of potential customer exit. We suggest policy mechanisms that could improve the earnings potential of HIV+ workers: reducing public health scare tactics that exacerbate irrational fear of HIV infection risk and enriching public health education about HIV and ARVs to improve perceptions about people with HIV.
|
110
|
Microbiologic contamination of a positive- and a neutral-displacement needleless intravenous access device in clinical use. Am J Infect Control 2016; 44:1678-1680. PMID: 27566872. DOI: 10.1016/j.ajic.2016.06.027.
Abstract
The use of a positive-displacement needleless intravenous access device was associated with lower microbial contamination rates compared with a neutral-displacement device when used on central venous catheters in hemato-oncology patients. In addition, rates of central line-associated bloodstream infection did not differ when either device was used.
|
111
|
Clegg TA, Graham DA, O'Sullivan P, McGrath G, More SJ. Temporal trends in the retention of BVD+ calves and associated animal and herd-level risk factors during the compulsory eradication programme in Ireland. Prev Vet Med 2016; 134:128-138. PMID: 27836034. DOI: 10.1016/j.prevetmed.2016.10.010.
Abstract
The national BVD eradication programme in Ireland started on a voluntary basis in 2012, becoming compulsory in 2013. The programme relies on accurate identification and prompt removal of BVD+ calves. However, a minority of herd owners have chosen to retain BVD+ animals (defined as still being alive more than seven weeks after the date of the initial test), typically with a view to fattening them to obtain some salvage value. During each year of the programme, additional measures have been introduced and implemented to encourage prompt removal of BVD+ animals. The objective of this study was to describe temporal trends in the retention of BVD+ calves and associated animal- and herd-level risk factors during the first three years of the compulsory eradication programme in Ireland. The study population included all BVD+ calves born in Ireland in 2013-2015. A parametric survival model was developed to model the time from the initial BVD test until the animal was slaughtered/died on farm or until 31 December 2015 (whichever was earlier). A total of 29,504 BVD+ animals, from 13,917 herds, were included in the study. The proportion of BVD+ animals that were removed from the herd within 7 weeks of the initial test date increased from 43.7% in 2013 to 70.3% in 2015. BVD+ animals born in 2015 had a much lower survival time (median = 33 days) compared to the 2013 birth cohort (median = 62 days), with a year-on-year reduction in the survival of BVD+ calves. In the initial parametric survival models, all interactions with herd type were significant; therefore, separate models were developed for beef and dairy herds. Overall, the results of the survival models were similar, with birth year, BVD+ status, herd size, county of birth and birth month consistently identified as risk factors independent of herd type (beef or dairy) or the number of BVD+ animals (single or multiple) in the herd. In addition, the presence of a registered mobile telephone number was identified as a risk factor in all models except for dairy herds with a single BVD+ animal, while the sex of the BVD+ calf was only identified as a risk factor in this model. Significant progress has been made in addressing the issue of retention of BVD+ calves; however, there is a need for further improvement. A number of risk factors associated with retention have been identified, suggesting areas to which future efforts can be directed.
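A parametric survival model of the kind described above assumes a distributional form for survival time; with a Weibull distribution, the survival function and the median have closed forms. The shape parameter and the fitted scales below are illustrative assumptions chosen only so the medians roughly match the reported 62-day (2013) and 33-day (2015) cohorts, not the study's estimates:

```python
import math

def weibull_survival(t, lam, k):
    """Parametric (Weibull) survival function S(t) = exp(-(t/lam)^k)."""
    return math.exp(-((t / lam) ** k))

def weibull_median(lam, k):
    """Median survival time: solve S(t) = 0.5  =>  t = lam * ln(2)^(1/k)."""
    return lam * (math.log(2.0) ** (1.0 / k))

# Assumed shape; scales back-solved from the reported cohort medians.
k = 1.5
lam_2013 = 62.0 / (math.log(2.0) ** (1.0 / k))
lam_2015 = 33.0 / (math.log(2.0) ** (1.0 / k))
```

By construction, half of the 2013-style cohort survives past 62 days while half of the 2015-style cohort survives past 33 days, reproducing the year-on-year reduction in retention times.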
|
112
|
Wu Y, Tung TC, Niu JL. On-site measurement of tracer gas transmission between horizontal adjacent flats in a residential building and cross-infection risk assessment. Build Environ 2016; 99:13-21. PMID: 32288039. PMCID: PMC7116928. DOI: 10.1016/j.buildenv.2016.01.013.
Abstract
Airborne transmission is a major mode of spread of respiratory infectious diseases, whose frequent epidemics impose a serious social burden. Identifying possible routes of airborne transmission and predicting the potential infection risk are important for infectious disease control. In the present study, an internal spread route between horizontally adjacent flats induced by air infiltration was investigated. On-site measurements were conducted using the tracer gas technique. Two measurement scenarios, a closed-window mode and an open-window mode, were compared. Using the calculated air change rate and mass fraction, the cross-infection risk was estimated with the Wells-Riley model. Tracer gas concentrations in receptor rooms were found to be one order of magnitude lower than in the source room, and the infection risks were correspondingly one order lower. Opening windows results in a larger air change rate on the one hand, but a higher mass fraction on the other. A higher mass fraction does not necessarily result in a higher infection risk, because the pathogen concentration in the source room is reduced by the higher air change rate. In the present study, opening windows could significantly reduce the infection risk in the index room but only slightly reduce the risks in receptor rooms. The mass fraction of air originating from the index room in the receptor units could reach 0.28, and the relative cross-infection risk through the internal transmission route could be 9%, higher than for the external spread route through single-sided window flushing. The study implies that the horizontal transmission route induced by air infiltration should not be underestimated.
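The trade-off reported above — opening windows raises the mass fraction reaching the neighbouring flat but dilutes the source-room concentration — can be illustrated with a small Wells-Riley calculation. All numbers (quanta emission, room volume, air change rates, mass fractions) are made up for illustration and are not the paper's measurements:

```python
import math

def quanta_conc(I, q, ach, volume):
    """Steady-state quanta concentration in the source flat (quanta/m^3):
    generation I*q (quanta/h) diluted by ventilation ach*volume (m^3/h)."""
    return I * q / (ach * volume)

def wells_riley_risk(conc, p_breath, hours):
    """Wells-Riley infection probability for an occupant inhaling air
    at the given quanta concentration."""
    return 1.0 - math.exp(-conc * p_breath * hours)

# Closed vs. open windows: assume opening raises both the air change
# rate and the inter-flat mass fraction by the same factor of 10.
results = {}
for label, ach, frac in [("closed", 0.5, 0.0028), ("open", 5.0, 0.028)]:
    c_src = quanta_conc(I=1, q=10, ach=ach, volume=150)
    results[label] = (
        wells_riley_risk(c_src, p_breath=0.48, hours=8),          # index room
        wells_riley_risk(frac * c_src, p_breath=0.48, hours=8),   # receptor room
    )
```

Under these assumptions the index-room risk drops sharply with open windows, while the receptor-room risk barely changes — the extra mass fraction is offset by the diluted source concentration, just as the abstract describes.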
|
113
|
Chen J, Ma J, White SK, Cao Z, Zhen Y, He S, Zhu W, Ke C, Zhang Y, Su S, Zhang G. Live poultry market workers are susceptible to both avian and swine influenza viruses, Guangdong Province, China. Vet Microbiol 2015; 181:230-5. PMID: 26476563. PMCID: PMC7119354. DOI: 10.1016/j.vetmic.2015.09.016.
Abstract
Highlights: This cross-sectional study analyzed previous zoonotic infections among animal workers. We examined 3 avian, 2 swine, 1 canine and 1 human influenza A viruses. Swine and poultry farm workers are more easily infected by their respective species' virus subtypes. LPM workers faced a significantly higher infection risk from both avian and swine viruses. Previously infected H5N1 and H7N9 cases were detected in this study.

Guangdong Province is recognized for its dense populations of humans, pigs, poultry and pets. In order to evaluate the threat of viral infection faced by those working with animals, a cross-sectional, sero-epidemiological study was conducted in Guangdong between December 2013 and January 2014. Individuals working with swine, at poultry farms, or at live poultry markets (LPM), veterinarians, and controls not exposed to animals were enrolled in this study, and 11 (4 human, 3 swine, 3 avian, and 1 canine) influenza A viruses were used in hemagglutination inhibition (HI) assays (7 strains) and cross-reactivity tests (9 strains), with 5 strains used in both. Univariate analysis was performed to identify which variables were significantly associated with seropositivity. Odds ratios (OR) revealed that swine workers had a significantly higher risk of elevated antibodies against A/swine/Guangdong/L6/2009(H1N1), a classical swine virus, and A/swine/Guangdong/SS1/2012(H1N1), a Eurasian avian-like swine virus, than non-exposed controls. Poultry farm workers were at a higher risk of infection with avian influenza H7N9 and H9N2. LPM workers were at a higher risk of infection with 3 subtypes of avian influenza: H5N1, H7N9, and H9N2. Interestingly, the OR also indicated that LPM workers were at risk of H1N1 swine influenza virus infection, perhaps due to the presence of pigs in the LPMs. While partial confounding by cross-reactive antibodies against human viruses or vaccines cannot be ruled out, our data suggest that animal-exposed people are more likely to have antibodies against animal influenza viruses.
|
114
|
Future risk of bovine tuberculosis recurrence among higher risk herds in Ireland. Prev Vet Med 2014; 118:71-9. PMID: 25441049. DOI: 10.1016/j.prevetmed.2014.11.013.
Abstract
Within the Irish national bovine tuberculosis (bTB) eradication programme, controls are tighter on higher-risk herds, known as H-herds. These H-herds are defined as herds that have previously had a bTB restriction (also known as a bTB episode), with at least 2 animals positive to the single intradermal comparative tuberculin test (SICTT) or with a bTB lesion detected at slaughter. Such herds are considered at higher risk of recurrence following the end of the bTB episode. In this study, we examined if, and when, the future bTB risk of H-herds returned to a level comparable to that of herds with no history of bTB. In addition, the proportion of bTB episodes in 2012 that could be attributed to the recent introduction of an infected animal was estimated, providing an update of earlier work. The study population consisted of all Irish herds that were not bTB restricted at the start of 2012 and had at least one whole-herd SICTT in 2012, with the herd being the unit of interest. The outcome measure was a bTB restriction, defined as any herd in which at least 1 standard SICTT reactor or an animal with a bTB lesion at slaughter was identified in 2012. A logistic regression model was used to model the probability of a herd being restricted in 2012. Herds that were previously restricted had significantly higher odds of being restricted in 2012 compared to herds that had not been. Similarly, the odds of being restricted in 2012 decreased as the time since the previous restriction increased, but increased as the severity of the previous restriction increased. The odds of being restricted also increased with an increase (although not linear) in herd size, the number of animals greater than 1 year of age purchased in 2011, the county incidence rate and the proportion of cows in the herd. The recent introduction of an infected animal accounted for 7.4% (6.7-8.2%) of herd restrictions. This study confirms the key role of past bTB history in determining the future risk of Irish herds, with the odds related to both the severity of and the time since the previous restriction. It also illustrates the difficulty in clearly defining H-herds, noting that risk persists for extended periods following a bTB restriction, regardless of breakdown severity. There is a need for robust controls on H-herds for an extended period post-derestriction.
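For context, the odds ratios such a logistic regression reports generalize the crude odds ratio from a 2x2 table. A minimal sketch with entirely hypothetical counts follows; note that the study's adjusted odds ratios come from the fitted model, not from this unadjusted formula:

```python
import math

def odds_ratio(a, b, c, d):
    """Crude odds ratio for a 2x2 table:
        a = previously restricted & restricted in 2012,
        b = previously restricted & clear in 2012,
        c = no history & restricted in 2012,
        d = no history & clear in 2012.
    Returns (OR, 95% Wald CI) via log(OR) +/- 1.96 * SE."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, (lo, hi)

# Hypothetical counts for herds with vs. without a previous restriction.
or_hist, ci = odds_ratio(a=300, b=1700, c=900, d=17100)
```

With these made-up counts the crude OR is about 3.35, i.e. previously restricted herds would have roughly three-fold higher odds of a 2012 restriction, qualitatively consistent with the finding above.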
|
115
|
Witard OC, Turner JE, Jackman SR, Kies AK, Jeukendrup AE, Bosch JA, Tipton KD. High dietary protein restores overreaching induced impairments in leukocyte trafficking and reduces the incidence of upper respiratory tract infection in elite cyclists. Brain Behav Immun 2014; 39:211-9. PMID: 24120932. DOI: 10.1016/j.bbi.2013.10.002.
Abstract
The present study examined whether a high-protein diet prevents the impaired leukocyte redistribution in response to acute exercise caused by a large volume of high-intensity exercise training. Eight cyclists (VO₂max: 64.2 ± 6.5 mL·kg⁻¹·min⁻¹) undertook two separate weeks of high-intensity training while consuming either a high-protein diet (3 g protein·kg body mass⁻¹·day⁻¹) or an energy- and carbohydrate-matched control diet (1.5 g protein·kg body mass⁻¹·day⁻¹). High-intensity training weeks were preceded by a week of normal-intensity training under the control diet. Leukocyte and lymphocyte sub-population responses to acute exercise were determined at the end of each training week. Self-reported symptoms of upper respiratory tract infection (URTI) were monitored daily by questionnaire. Undertaking high-intensity training with a high-protein diet restored leukocyte kinetics to levels similar to those observed during normal-intensity training: CD8+ T-lymphocyte mobilization (normal-intensity: 29,319 ± 13,130 cells/μL × ~165 min vs. high-intensity with protein: 26,031 ± 17,474 cells/μL × ~165 min, P > 0.05) and CD8+ T-lymphocyte egress (normal-intensity: 624 ± 264 cells/μL vs. high-intensity with protein: 597 ± 478 cells/μL, P > 0.05). This pattern was driven by effector-memory populations mobilizing (normal-intensity: 6,145 ± 6,227 cells/μL × ~165 min vs. high-intensity with protein: 6,783 ± 8,203 cells/μL × ~165 min, P > 0.05) and extravasating from blood (normal-intensity: 147 ± 129 cells/μL vs. high-intensity with protein: 165 ± 192 cells/μL, P > 0.05). High-intensity training while consuming a high-protein diet was associated with fewer symptoms of URTI than high-intensity training with a normal diet (P < 0.05). To conclude, a high-protein diet might reduce the incidence of URTI in athletes, potentially mediated by preventing training-induced impairments in immune surveillance.
|
116
|
Pogłód R, Rosiek A, Łętowska M. [Emerging infectious diseases in the context of blood safety]. Acta Haematol Pol 2013; 44:284-293. PMID: 32226059. PMCID: PMC7094095. DOI: 10.1016/j.achaem.2013.07.022. (Article in Polish.)
|