1
|
Improving modelling for epidemic responses: reflections from members of the UK infectious disease modelling community on their experiences during the COVID-19 pandemic. Wellcome Open Res 2024; 9:12. PMID: 38784437; PMCID: PMC11112301; DOI: 10.12688/wellcomeopenres.19601.1.
Abstract
Background The COVID-19 pandemic both relied on and placed significant burdens on the experts involved from the research and public health sectors. The sustained high pressure of a pandemic on responders, such as healthcare workers, can lead to lasting psychological impacts including acute stress disorder, post-traumatic stress disorder, burnout, and moral injury, which can impact individual wellbeing and productivity. Methods As members of the infectious disease modelling community, we convened a reflective workshop to understand the professional and personal impacts of response work on our community and to propose recommendations for future epidemic responses. The attendees represented a range of career stages, institutions, and disciplines. This piece was produced collectively by those present at the session, based on our shared experiences. Results Key issues we identified at the workshop were lack of institutional support, insecure contracts, unequal credit and recognition, and mental health impacts. Our recommendations include rewarding impactful work, fostering academia-public health collaboration, decreasing dependence on key individuals by developing teams, increasing transparency in decision-making, and implementing sustainable work practices. Conclusions Despite limitations in representation, this workshop provided valuable insights into the UK COVID-19 modelling experience and guidance for future public health crises. Recognising and addressing the issues highlighted is crucial, in our view, for ensuring the effectiveness of epidemic response work in the future.
|
2
|
Impact of opioid-free analgesia on pain severity and patient satisfaction after discharge from surgery: multispecialty, prospective cohort study in 25 countries. Br J Surg 2024; 111:znad421. PMID: 38207169; PMCID: PMC10783642; DOI: 10.1093/bjs/znad421.
Abstract
BACKGROUND Balancing opioid stewardship and the need for adequate analgesia following discharge after surgery is challenging. This study aimed to compare the outcomes for patients discharged with opioid versus opioid-free analgesia after common surgical procedures. METHODS This international, multicentre, prospective cohort study collected data from patients undergoing common acute and elective general surgical, urological, gynaecological, and orthopaedic procedures. The primary outcomes were patient-reported time in severe pain measured on a numerical analogue scale from 0 to 100% and patient-reported satisfaction with pain relief during the first week following discharge. Data were collected by in-hospital chart review and patient telephone interview 1 week after discharge. RESULTS The study recruited 4273 patients from 144 centres in 25 countries; 1311 patients (30.7%) were prescribed opioid analgesia at discharge. Patients reported being in severe pain for 10 (i.q.r. 1-30)% of the first week after discharge and rated satisfaction with analgesia as 90 (i.q.r. 80-100) of 100. After adjustment for confounders, opioid analgesia on discharge was independently associated with increased pain severity (risk ratio 1.52, 95% c.i. 1.31 to 1.76; P < 0.001) and re-presentation to healthcare providers owing to side-effects of medication (OR 2.38, 95% c.i. 1.36 to 4.17; P = 0.004), but not with satisfaction with analgesia (β coefficient 0.92, 95% c.i. -1.52 to 3.36; P = 0.468) compared with opioid-free analgesia. Although opioid prescribing varied greatly between high-income and low- and middle-income countries, patient-reported outcomes did not. CONCLUSION Opioid analgesia prescription on surgical discharge is associated with a higher risk of re-presentation owing to side-effects of medication and increased patient-reported pain, but not with changes in patient-reported satisfaction. Opioid-free discharge analgesia should be adopted routinely.
|
3
|
Retrospective evaluation of real-time estimates of global COVID-19 transmission trends and mortality forecasts. PLoS One 2023; 18:e0286199. PMID: 37851661; PMCID: PMC10584190; DOI: 10.1371/journal.pone.0286199.
Abstract
From 8th March 2020 up to the time of writing, we have been producing near real-time weekly estimates of SARS-CoV-2 transmissibility and forecasts of deaths due to COVID-19 for all countries with evidence of sustained transmission, shared online. We also developed a novel heuristic to combine weekly estimates of transmissibility to produce forecasts over a 4-week horizon. Here we present a retrospective evaluation of the forecasts produced between 8th March and 29th November 2020 for 81 countries. We evaluated the robustness of the forecasts produced in real-time using relative error, coverage probability, and comparisons with null models. During the 39-week period covered by this study, both the short- and medium-term forecasts captured the epidemic trajectory well across different waves of COVID-19 infections, with small relative errors over the forecast horizon. The model was well calibrated, with 56.3% and 45.6% of the observations lying in the 50% Credible Interval in 1-week and 4-week ahead forecasts respectively. The retrospective evaluation of our models shows that simple transmission models calibrated using routine disease surveillance data can reliably capture the epidemic trajectory in multiple countries. The medium-term forecasts can be used in conjunction with the short-term forecasts of COVID-19 mortality as a useful planning tool as countries continue to relax public health measures.
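As an illustrative sketch (toy numbers, not the study's data or code), the calibration statistic quoted above, the fraction of observations falling inside the 50% credible interval, can be computed like this:

```python
# Illustrative sketch, not the authors' code: empirical coverage of a
# 50% credible interval, the calibration measure quoted in the abstract.
# The observations and intervals below are made-up toy values.

def coverage(observations, intervals):
    """Fraction of observations falling inside their (lower, upper) interval."""
    hits = sum(lo <= y <= hi for y, (lo, hi) in zip(observations, intervals))
    return hits / len(observations)

observed_deaths = [120, 95, 140, 210, 80]
cri_50 = [(100, 130), (90, 110), (150, 180), (190, 220), (70, 95)]

print(coverage(observed_deaths, cri_50))  # 0.8
```

For a well-calibrated 50% interval this fraction should sit near 0.5, which is how the 56.3% and 45.6% figures above are read.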
|
4
|
Branched flows of flexural elastic waves in non-uniform cylindrical shells. PLoS One 2023; 18:e0286420. PMID: 37235628; DOI: 10.1371/journal.pone.0286420.
Abstract
Propagation of elastic waves along the axis of cylindrical shells is of great current interest owing to the ubiquity and technological importance of such structures. Geometric imperfections and spatial variations of properties are inevitable in these waveguides. Here we report the existence of branched flows of flexural waves within them. The location of high-amplitude motion, away from the launch location, scales as a power law with respect to the variance, and linearly with respect to the correlation length, of the spatial variation in the bending stiffness. These scaling laws are then theoretically derived from the ray equations. Numerical integration of the ray equations also exhibits this behaviour, consistent with finite-element simulations as well as the theoretically derived scaling. The exponents in the scaling appear to be universal, matching similar past observations for waves in other physical contexts, as well as for dispersive flexural waves in elastic plates.
|
5
|
The societal value of SARS-CoV-2 booster vaccination in Indonesia. Vaccine 2023; 41:1885-1891. PMID: 36781331; PMCID: PMC9889258; DOI: 10.1016/j.vaccine.2023.01.068.
Abstract
OBJECTIVES To estimate the expected socio-economic value of booster vaccination in terms of averted deaths and averted closures of businesses and schools using simulation modelling. METHODS The value of booster vaccination in Indonesia is estimated by comparing simulated societal costs under a twelve-month, 187-million-dose Moderna booster vaccination campaign to costs without boosters. The costs of an epidemic and its mitigation consist of lost lives, economic closures and lost education; cost-minimising non-pharmaceutical mitigation is chosen for each scenario. RESULTS The cost-minimising non-pharmaceutical mitigation depends on the availability of vaccines: the differences between the two scenarios are 14 to 19 million years of in-person education and $153 to $204 billion in economic activity. The value of the booster campaign ranges from $2,500 ($1,400-$4,100) to $2,800 ($1,700-$4,600) per dose in the first year, depending on life-year valuations. CONCLUSIONS The societal benefits of booster vaccination are substantial. Much of the value of vaccination resides in the reduced need for costly non-pharmaceutical mitigation. We propose cost minimisation as a tool for policy decision-making and valuation of vaccination, taking into account all socio-economic costs, and not averted deaths alone.
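The cost-minimisation idea described above can be sketched in a few lines. The policies and cost figures here are invented for illustration and are not taken from the paper's model:

```python
# Illustrative sketch (invented numbers, not the study's model): pick the
# non-pharmaceutical mitigation policy whose total societal cost, combining
# lost lives, economic closures, and lost education, is lowest.

policies = {
    "no_closures":    {"lives": 90.0, "economy": 0.0,  "education": 0.0},
    "school_closure": {"lives": 60.0, "economy": 5.0,  "education": 30.0},
    "full_closure":   {"lives": 20.0, "economy": 80.0, "education": 40.0},
}

def total_cost(costs):
    """Total societal cost of one policy (arbitrary common units)."""
    return sum(costs.values())

best = min(policies, key=lambda name: total_cost(policies[name]))
print(best)  # no_closures (total 90.0 with these toy numbers)
```

With boosters lowering the "lives" column, a less restrictive policy can become cost-minimising, which is how much of the vaccine's value arises in the authors' framing.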
|
6
|
Climate change and communicable diseases in the Gulf Cooperation Council (GCC) countries. Epidemics 2023; 42:100667. PMID: 36652872; DOI: 10.1016/j.epidem.2023.100667.
Abstract
A review of the extant literature reveals the extent to which the spread of communicable diseases will be significantly impacted by climate change. Specific research into how this will likely be observed in the countries of the Gulf Cooperation Council (GCC) is, however, greatly lacking. This report summarises the unique public health challenges faced by the GCC countries in the coming century, and outlines the need for greater investment in public health research and disease surveillance to better forecast the imminent epidemiological landscape. Significant data gaps currently exist regarding vector occurrence, spatial climate measures, and communicable disease case counts in the GCC - presenting an immediate research priority for the region. We outline policy work necessary to strengthen public health interventions, and to facilitate evidence-driven mitigation strategies. Such research will require a transdisciplinary approach, utilising existing cross-border public health initiatives, to ensure that such investigations are well-targeted and effectively communicated.
|
7
|
Hospitalisation and mortality risk of SARS-CoV-2 variant omicron sub-lineage BA.2 compared to BA.1 in England. Nat Commun 2022; 13:6053. PMID: 36229438; PMCID: PMC9559149; DOI: 10.1038/s41467-022-33740-9.
Abstract
The Omicron variant of SARS-CoV-2 became the globally dominant variant in early 2022. A sub-lineage of the Omicron variant (BA.2) was identified in England in January 2022. Here, we investigated hospitalisation and mortality risks of COVID-19 cases with the Omicron sub-lineage BA.2 (n = 258,875) compared to BA.1 (n = 984,337) in a large cohort study in England. We estimated the risk of hospital attendance, hospital admission or death using multivariable stratified proportional hazards regression models. After adjustment for confounders, BA.2 cases had lower or similar risks of death (HR = 0.80, 95% CI 0.71-0.90), hospital admission (HR = 0.88, 95% CI 0.83-0.94) and any hospital attendance (HR = 0.98, 95% CI 0.95-1.01). These findings that the risk of severe outcomes following infection with BA.2 SARS-CoV-2 was slightly lower or equivalent to the BA.1 sub-lineage can inform public health strategies in countries where BA.2 is spreading.
|
8
|
Onset and window of SARS-CoV-2 infectiousness and temporal correlation with symptom onset: a prospective, longitudinal, community cohort study. Lancet Respir Med 2022; 10:1061-1073. PMID: 35988572; PMCID: PMC9388060; DOI: 10.1016/s2213-2600(22)00226-0.
Abstract
Background Knowledge of the window of SARS-CoV-2 infectiousness is crucial in developing policies to curb transmission. Mathematical modelling based on scarce empirical evidence and key assumptions has driven isolation and testing policy, but real-world data are needed. We aimed to characterise infectiousness across the full course of infection in a real-world community setting. Methods The Assessment of Transmission and Contagiousness of COVID-19 in Contacts (ATACCC) study was a UK prospective, longitudinal, community cohort of contacts of newly diagnosed, PCR-confirmed SARS-CoV-2 index cases. Household and non-household exposed contacts aged 5 years or older were eligible for recruitment if they could provide informed consent and agree to self-swabbing of the upper respiratory tract. The primary objective was to define the window of SARS-CoV-2 infectiousness and its temporal correlation with symptom onset. We quantified viral RNA load by RT-PCR and infectious viral shedding by enumerating cultivable virus daily across the course of infection. Participants completed a daily diary to track the emergence of symptoms. Outcomes were assessed with empirical data and a phenomenological Bayesian hierarchical model. Findings Between Sept 13, 2020, and March 31, 2021, we enrolled 393 contacts from 327 households (the SARS-CoV-2 pre-alpha and alpha variant waves); and between May 24, 2021, and Oct 28, 2021, we enrolled 345 contacts from 215 households (the delta variant wave). 173 of these 738 contacts were PCR positive for more than one timepoint, 57 of which were at the start of infection and comprised the final study population. The onset and end of infectious viral shedding were captured in 42 cases and the median duration of infectiousness was 5 (IQR 3–7) days. Although 24 (63%) of 38 cases had PCR-detectable virus before symptom onset, only seven (20%) of 35 shed infectious virus presymptomatically. 
Symptom onset was a median of 3 days before both peak viral RNA and peak infectious viral load (viral RNA IQR 3–5 days, n=38; plaque-forming units IQR 3–6 days, n=35). Notably, 22 (65%) of 34 cases and eight (24%) of 34 cases continued to shed infectious virus 5 days and 7 days post-symptom onset, respectively (survival probabilities 67% and 35%). Correlation of lateral flow device (LFD) results with infectious viral shedding was poor during the viral growth phase (sensitivity 67% [95% CI 59–75]), but high during the decline phase (92% [86–96]). Infectious virus kinetic modelling suggested that the initial rate of viral replication determines the course of infection and infectiousness. Interpretation Less than a quarter of COVID-19 cases shed infectious virus before symptom onset; under a crude 5-day self-isolation period from symptom onset, two-thirds of cases released into the community would still be infectious, but with reduced infectious viral shedding. Our findings support a role for LFDs to safely accelerate deisolation but not for early diagnosis, unless used daily. These high-resolution, community-based data provide evidence to inform infection control guidance. Funding National Institute for Health and Care Research.
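The LFD sensitivity figures above are simple conditional proportions. As an illustrative sketch with invented data (not the study's), sensitivity against culture-confirmed infectious shedding can be computed as:

```python
# Illustrative sketch (invented data): LFD sensitivity against culture-
# confirmed infectious shedding. Each pair is (lfd_positive, culture_positive)
# for one sampling day; sensitivity = true positives / all culture-positives.

def sensitivity(pairs):
    true_pos = sum(1 for lfd, culture in pairs if lfd and culture)
    culture_pos = sum(1 for _, culture in pairs if culture)
    return true_pos / culture_pos

growth_phase = [(True, True), (False, True), (False, True), (True, False)]
decline_phase = [(True, True), (True, True), (False, True), (False, False)]

print(round(sensitivity(growth_phase), 2))   # 0.33
print(round(sensitivity(decline_phase), 2))  # 0.67
```

Computed per phase like this, the study's contrast (67% in growth vs 92% in decline) is what supports using LFDs for de-isolation rather than early diagnosis.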
|
9
|
Broadening symptom criteria improves early case identification in SARS-CoV-2 contacts. Eur Respir J 2022; 60:2102308. PMID: 34824057; PMCID: PMC8620106; DOI: 10.1183/13993003.02308-2021.
Abstract
BACKGROUND The success of case isolation and contact tracing for the control of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) transmission depends on the accuracy and speed of case identification. We assessed whether inclusion of additional symptoms alongside three canonical symptoms (CS), i.e. fever, cough and loss or change in smell or taste, could improve case definitions and accelerate case identification in SARS-CoV-2 contacts. METHODS Two prospective longitudinal London (UK)-based cohorts of community SARS-CoV-2 contacts, recruited within 5 days of exposure, provided independent training and test datasets. Infected and uninfected contacts completed daily symptom diaries from the earliest possible time-points. Diagnostic information gained by adding symptoms to the CS was quantified using likelihood ratios and area under the receiver operating characteristic curve. Improvements in sensitivity and time to detection were compared with penalties in terms of specificity and number needed to test. RESULTS Of 529 contacts within two cohorts, 164 (31%) developed PCR-confirmed infection and 365 (69%) remained uninfected. In the training dataset (n=168), 29% of infected contacts did not report the CS. Four symptoms (sore throat, muscle aches, headache and appetite loss) were identified as early-predictors (EP) which added diagnostic value to the CS. The broadened symptom criterion "≥1 of the CS, or ≥2 of the EP" identified PCR-positive contacts in the test dataset on average 2 days earlier after exposure (p=0.07) than "≥1 of the CS", with only modest reduction in specificity (5.7%). CONCLUSIONS Broadening symptom criteria to include individuals with at least two of muscle aches, headache, appetite loss and sore throat identifies more infections and reduces time to detection, providing greater opportunities to prevent SARS-CoV-2 transmission.
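The broadened case definition above is a simple set rule. A minimal sketch, with hypothetical symptom sets rather than study data:

```python
# Illustrative sketch: the broadened case definition from the abstract,
# ">=1 canonical symptom (CS), or >=2 early-predictor symptoms (EP)".

CS = {"fever", "cough", "smell_or_taste_change"}
EP = {"sore_throat", "muscle_aches", "headache", "appetite_loss"}

def meets_broadened_criterion(symptoms):
    """True if the symptom set triggers testing under the broadened rule."""
    symptoms = set(symptoms)
    return len(symptoms & CS) >= 1 or len(symptoms & EP) >= 2

print(meets_broadened_criterion({"cough"}))                    # True: 1 CS
print(meets_broadened_criterion({"headache", "sore_throat"}))  # True: 2 EP
print(meets_broadened_criterion({"headache"}))                 # False
```

The second case is exactly the group the canonical-symptoms-only rule misses, which is where the earlier detection reported above comes from.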
|
10
|
Assessment of mortality and hospital admissions associated with confirmed infection with SARS-CoV-2 Alpha variant: a matched cohort and time-to-event analysis, England, October to December 2020. Euro Surveill 2022; 27:2100377. PMID: 35593163; PMCID: PMC9121661; DOI: 10.2807/1560-7917.es.2022.27.20.2100377.
Abstract
Background The emergence of the SARS-CoV-2 Alpha variant in England coincided with a rapid increase in the number of PCR-confirmed COVID-19 cases in areas where the variant was concentrated. Aim Our aim was to assess whether infection with Alpha was associated with more severe clinical outcomes than the wild type. Methods Laboratory-confirmed infections with genomically sequenced SARS-CoV-2 Alpha and wild type between October and December 2020 were linked to routine healthcare and surveillance datasets. We conducted two statistical analyses to compare the risk of hospital admission and death within 28 days of testing between Alpha and wild-type infections: a matched cohort study and an adjusted Cox proportional hazards model. We assessed differences in disease severity by comparing hospital admission and mortality, including length of hospitalisation and time to death. Results Of 63,609 COVID-19 cases sequenced in England between October and December 2020, 6,038 had the Alpha variant. In the matched cohort analysis, we matched 2,821 cases with Alpha to 2,821 cases with wild type. In the time-to-event analysis, we observed a 34% increased risk of hospitalisation associated with Alpha compared with wild type, but no significant difference in the risk of mortality. Conclusion We found evidence of increased risk of hospitalisation after adjusting for key confounders, suggesting increased infection severity associated with the Alpha variant. Rapid assessments of the relative morbidity in terms of clinical outcomes and mortality associated with emerging SARS-CoV-2 variants compared with dominant variants are required to assess the overall impact of SARS-CoV-2 mutations.
|
11
|
Abstract
BACKGROUND A rapid increase in coronavirus disease 2019 (Covid-19) cases due to the omicron (B.1.1.529) variant of severe acute respiratory syndrome coronavirus 2 in highly vaccinated populations has aroused concerns about the effectiveness of current vaccines. METHODS We used a test-negative case-control design to estimate vaccine effectiveness against symptomatic disease caused by the omicron and delta (B.1.617.2) variants in England. Vaccine effectiveness was calculated after primary immunization with two doses of BNT162b2 (Pfizer-BioNTech), ChAdOx1 nCoV-19 (AstraZeneca), or mRNA-1273 (Moderna) vaccine and after a booster dose of BNT162b2, ChAdOx1 nCoV-19, or mRNA-1273. RESULTS Between November 27, 2021, and January 12, 2022, a total of 886,774 eligible persons infected with the omicron variant, 204,154 eligible persons infected with the delta variant, and 1,572,621 eligible test-negative controls were identified. At all time points investigated and for all combinations of primary course and booster vaccines, vaccine effectiveness against symptomatic disease was higher for the delta variant than for the omicron variant. No effect against the omicron variant was noted from 20 weeks after two ChAdOx1 nCoV-19 doses, whereas vaccine effectiveness after two BNT162b2 doses was 65.5% (95% confidence interval [CI], 63.9 to 67.0) at 2 to 4 weeks, dropping to 8.8% (95% CI, 7.0 to 10.5) at 25 or more weeks. Among ChAdOx1 nCoV-19 primary course recipients, vaccine effectiveness increased to 62.4% (95% CI, 61.8 to 63.0) at 2 to 4 weeks after a BNT162b2 booster before decreasing to 39.6% (95% CI, 38.0 to 41.1) at 10 or more weeks. Among BNT162b2 primary course recipients, vaccine effectiveness increased to 67.2% (95% CI, 66.5 to 67.8) at 2 to 4 weeks after a BNT162b2 booster before declining to 45.7% (95% CI, 44.7 to 46.7) at 10 or more weeks. 
Vaccine effectiveness after a ChAdOx1 nCoV-19 primary course increased to 70.1% (95% CI, 69.5 to 70.7) at 2 to 4 weeks after an mRNA-1273 booster and decreased to 60.9% (95% CI, 59.7 to 62.1) at 5 to 9 weeks. After a BNT162b2 primary course, the mRNA-1273 booster increased vaccine effectiveness to 73.9% (95% CI, 73.1 to 74.6) at 2 to 4 weeks; vaccine effectiveness fell to 64.4% (95% CI, 62.6 to 66.1) at 5 to 9 weeks. CONCLUSIONS Primary immunization with two doses of ChAdOx1 nCoV-19 or BNT162b2 vaccine provided limited protection against symptomatic disease caused by the omicron variant. A BNT162b2 or mRNA-1273 booster after either the ChAdOx1 nCoV-19 or BNT162b2 primary course substantially increased protection, but that protection waned over time. (Funded by the U.K. Health Security Agency.).
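In a test-negative design like the one above, vaccine effectiveness is conventionally estimated as one minus the odds ratio comparing vaccination odds in cases versus controls. A minimal sketch with invented counts (not the study's data, which were adjusted for confounders):

```python
# Illustrative sketch (toy counts, not study data): unadjusted vaccine
# effectiveness from a test-negative design, VE = 1 - OR, where OR compares
# the odds of vaccination in test-positive cases vs test-negative controls.

def vaccine_effectiveness(vacc_cases, unvacc_cases, vacc_controls, unvacc_controls):
    odds_ratio = (vacc_cases * unvacc_controls) / (unvacc_cases * vacc_controls)
    return 1 - odds_ratio

# Toy numbers: 100 vaccinated / 300 unvaccinated cases,
#              400 vaccinated / 200 unvaccinated controls.
ve = vaccine_effectiveness(100, 300, 400, 200)
print(f"{ve:.1%}")  # 83.3%
```

The study's estimates follow this logic within strata of time since vaccination, which is how the waning figures (e.g. 65.5% at 2 to 4 weeks, falling to 8.8%) are produced.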
|
14
|
The impact of the COVID-19 pandemic on patterns of attendance at emergency departments in two large London hospitals: an observational study. BMC Health Serv Res 2021; 21:1008. PMID: 34556119; PMCID: PMC8460185; DOI: 10.1186/s12913-021-07008-9.
Abstract
BACKGROUND Hospitals in England have undergone considerable change to address the surge in demand imposed by the COVID-19 pandemic. The impact of this on emergency department (ED) attendances is unknown, especially for non-COVID-19 related emergencies. METHODS This analysis is an observational study of ED attendances at the Imperial College Healthcare NHS Trust (ICHNT). We calibrated auto-regressive integrated moving average time-series models of ED attendances using historic (2015-2019) data. Forecasted trends were compared to present-year ICHNT data for the period between March 12, 2020 (when England implemented the first COVID-19 public health measure) and May 31, 2020. We compared ICHNT trends with publicly available regional and national data. Lastly, we compared hospital admissions made via the ED and in-hospital mortality at ICHNT during the present year to the historic 5-year average. RESULTS ED attendances at ICHNT decreased by 35% during the period after the first lockdown was imposed on March 12, 2020 and before May 31, 2020, reflecting broader trends seen for ED attendances across all England regions, which fell by approximately 50% for the same time frame. For ICHNT, the decrease in attendances was mainly amongst those aged < 65 years and those arriving by their own means (e.g. personal or public transport), and was not correlated with any of the spatial dependencies analysed, such as increasing distance from postcode of residence to the hospital. Emergency admissions of patients without COVID-19 after March 12, 2020 fell by 48%; we did not observe a significant change to the crude mortality risk in patients without COVID-19 (RR 1.13, 95%CI 0.94-1.37, p = 0.19). CONCLUSIONS Our study findings reflect broader trends seen across England and give an indication of how emergency healthcare seeking has drastically changed.
At ICHNT, we find that a larger proportion arrived by ambulance and that hospitalisation outcomes of patients without COVID-19 did not differ from previous years. The extent to which these findings relate to ED avoidance behaviours compared to having sought alternative emergency health services outside of hospital remains unknown. National analyses and strategies to streamline emergency services in England going forward are urgently needed.
|
15
|
Optimal national prioritization policies for hospital care during the SARS-CoV-2 pandemic. Nat Comput Sci 2021; 1:521-531. PMID: 38217250; PMCID: PMC10766519; DOI: 10.1038/s43588-021-00111-1.
Abstract
In response to unprecedented surges in the demand for hospital care during the SARS-CoV-2 pandemic, health systems have prioritized patients with COVID-19 for life-saving hospital care to the detriment of other patients. In contrast to these ad hoc policies, we develop a linear programming framework to optimally schedule elective procedures and allocate hospital beds among all planned and emergency patients to minimize years of life lost. Leveraging a large dataset of administrative patient medical records, we apply our framework to the National Health Service in England and show that an extra 50,750-5,891,608 years of life can be gained compared with prioritization policies that reflect those implemented during the pandemic. Notable health gains are observed for neoplasms, diseases of the digestive system, and injuries and poisoning. Our open-source framework provides a computationally efficient approximation of a large-scale discrete optimization problem that can be applied globally to support national-level care prioritization policies.
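As a deliberately simplified stand-in for the linear program described above (not the authors' formulation), a greedy heuristic that ranks procedures by life-years gained per bed-day conveys the intuition of allocating scarce beds to minimise years of life lost. All names and numbers here are hypothetical:

```python
# Deliberate simplification, not the authors' LP: greedily schedule the
# procedures with the highest life-years gained per bed-day until hospital
# capacity is exhausted. All entries are hypothetical.

procedures = [
    ("neoplasm_surgery", 6.0, 4),   # (name, life-years gained, bed-days needed)
    ("digestive_surgery", 3.0, 2),
    ("minor_elective", 0.5, 1),
]

def greedy_schedule(procs, capacity):
    """Return (scheduled procedure names, total life-years gained)."""
    scheduled, gained = [], 0.0
    for name, life_years, days in sorted(procs, key=lambda p: p[1] / p[2], reverse=True):
        if days <= capacity:
            scheduled.append(name)
            capacity -= days
            gained += life_years
    return scheduled, gained

plan, life_years = greedy_schedule(procedures, capacity=5)
print(plan, life_years)  # ['neoplasm_surgery', 'minor_elective'] 6.5
```

An exact linear program, as used in the paper, can beat this greedy rule precisely in the capacity-constrained cases where the ranking alone is suboptimal.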
|
16
|
Fine-scale estimation of key life-history parameters of malaria vectors: implications for next-generation vector control technologies. Parasit Vectors 2021; 14:311. PMID: 34103094; PMCID: PMC8188720; DOI: 10.1186/s13071-021-04789-0.
Abstract
Background Mosquito control has the potential to significantly reduce the malaria burden in a region, but to influence public health policy it must also be shown to be cost-effective. Gaps in our knowledge of mosquito population dynamics mean that mathematical modelling of vector control interventions has typically made simplifying assumptions about key aspects of mosquito ecology. Often, these assumptions can distort the predicted efficacy of vector control, particularly for next-generation tools such as gene drive, which are highly sensitive to local mosquito dynamics. Methods We developed a discrete-time stochastic mathematical model of mosquito population dynamics to explore the influence of fine-scale egg-laying behaviour and larval density dependence on parameter estimation. The model was fitted to longitudinal mosquito population count data using particle Markov chain Monte Carlo methods. Results By modelling the fine-scale behaviour of egg-laying under varying density-dependence scenarios we refine our life-history parameter estimates, and in particular we see how model assumptions affect the population growth rate (Rm), a crucial determinant of vector control efficacy. Conclusions Subsequent application of these new parameter estimates to gene drive models shows how the understanding and implementation of fine-scale processes, when deriving parameter estimates, may have a profound influence on successful vector control. The consequences of this may be of crucial interest when devising future public health policy.
Supplementary Information The online version contains supplementary material available at 10.1186/s13071-021-04789-0.
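The particle MCMC fitting described above rests on a particle-filter estimate of the model likelihood from count data. A minimal bootstrap particle filter for a stochastic (Ricker-type) population model with Poisson sampling can be sketched as follows; the dynamics, the noise scale, and the parameters r, K and q are illustrative assumptions, not the authors' model:

```python
import math
import random

def particle_filter_loglik(counts, r, K, q, n_particles=500, seed=1):
    """Bootstrap particle filter log-likelihood for a stochastic
    Ricker-type population model observed through Poisson sampling.
    counts : observed mosquito counts per time step
    r, K   : growth rate and carrying capacity (hypothetical values)
    q      : sampling fraction linking true abundance to counts
    """
    rng = random.Random(seed)
    particles = [float(counts[0]) / q] * n_particles
    loglik = 0.0
    for y in counts:
        # Propagate: Ricker dynamics with log-normal environmental noise
        particles = [
            max(1e-6, n * math.exp(r * (1.0 - n / K) + rng.gauss(0.0, 0.1)))
            for n in particles
        ]
        # Weight: Poisson observation density P(y | q * n)
        weights = []
        for n in particles:
            lam = q * n
            weights.append(math.exp(y * math.log(lam) - lam - math.lgamma(y + 1)))
        mean_w = sum(weights) / n_particles
        loglik += math.log(mean_w)
        # Resample particles proportional to their weights
        particles = rng.choices(particles, weights=weights, k=n_particles)
    return loglik
```

In a particle MCMC scheme, this log-likelihood estimate would be plugged into a Metropolis-Hastings acceptance ratio for proposed parameter values.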
|
17
|
Leveraging community mortality indicators to infer COVID-19 mortality and transmission dynamics in Damascus, Syria. Nat Commun 2021; 12:2394. [PMID: 33888698 PMCID: PMC8062464 DOI: 10.1038/s41467-021-22474-9]
Abstract
The COVID-19 pandemic has resulted in substantial mortality worldwide. However, to date, countries in the Middle East and Africa have reported considerably lower mortality rates than in Europe and the Americas. Motivated by reports of an overwhelmed health system, we estimate the likely under-ascertainment of COVID-19 mortality in Damascus, Syria. Using all-cause mortality data, we fit a mathematical model of COVID-19 transmission to reported mortality, estimating that 1.25% of COVID-19 deaths (sensitivity range 1.00%–3.00%) have been reported as of 2 September 2020. By 2 September, we estimate that 4,380 (95% CI: 3,250–5,550) COVID-19 deaths in Damascus may have been missed, with 39.0% (95% CI: 32.5%–45.0%) of the population in Damascus estimated to have been infected. Accounting for under-ascertainment corroborates reports of exceeded hospital bed capacity and is validated by community-uploaded obituary notifications, which confirm extensive unreported mortality in Damascus.
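The under-ascertainment adjustment amounts to rescaling reported deaths by the estimated reporting fraction. A one-line sketch with made-up inputs (the paper's estimates came from a fitted transmission model, not this direct division):

```python
def adjust_for_underascertainment(reported_deaths, ascertainment):
    """Scale reported deaths to an estimate of true deaths given an
    assumed ascertainment fraction (e.g. 0.0125 for 1.25% reported).
    Returns (estimated true deaths, estimated missed deaths)."""
    true_deaths = reported_deaths / ascertainment
    return true_deaths, true_deaths - reported_deaths
```

For example, 50 reported deaths at 1.25% ascertainment imply roughly 4,000 true deaths, of which about 3,950 were missed.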
|
18
|
Genetic evidence for the association between COVID-19 epidemic severity and timing of non-pharmaceutical interventions. Nat Commun 2021; 12:2188. [PMID: 33846321 PMCID: PMC8041850 DOI: 10.1038/s41467-021-22366-y]
Abstract
Unprecedented public health interventions including travel restrictions and national lockdowns have been implemented to stem the COVID-19 epidemic, but the effectiveness of non-pharmaceutical interventions is still debated. We carried out a phylogenetic analysis of more than 29,000 publicly available whole genome SARS-CoV-2 sequences from 57 locations to estimate the time that the epidemic originated in different places. These estimates were examined in relation to the dates of the most stringent interventions in each location as well as to the number of cumulative COVID-19 deaths and phylodynamic estimates of epidemic size. Here we report that the time elapsed between epidemic origin and maximum intervention is associated with different measures of epidemic severity and explains 11% of the variance in reported deaths one month after the most stringent intervention. Locations where strong non-pharmaceutical interventions were implemented earlier experienced much less severe COVID-19 morbidity and mortality during the period of study.
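The "explains 11% of the variance" statement is an R² from regressing a severity measure on the time elapsed between epidemic origin and maximum intervention. The coefficient of determination for a simple least-squares fit can be computed in closed form, shown here on toy data:

```python
def r_squared(x, y):
    """Fraction of variance in y explained by a simple least-squares
    linear fit on x (the squared Pearson correlation)."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return (sxy * sxy) / (sxx * syy)
```

A perfectly linear relationship gives R² = 1; the paper's reported 11% corresponds to R² = 0.11 between delay-to-intervention and reported deaths.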
|
19
|
Abstract
Yellow fever (YF) is a viral, vector-borne, haemorrhagic fever endemic in tropical regions of Africa and South America. The vaccine for YF is considered safe and effective, but intervention strategies need to be optimised; one of the tools for this is mathematical modelling. We refine and expand an existing modelling framework for Africa to account for transmission in South America. We fit to YF occurrence and serology data. We then estimate the subnational forces of infection for the entire endemic region. Finally, using demographic and vaccination data, we examine the impact of vaccination activities. We estimate that there were 109,000 (95% credible interval [CrI] [67,000–173,000]) severe infections and 51,000 (95% CrI [31,000–82,000]) deaths due to YF in Africa and South America in 2018. We find that mass vaccination activities in Africa reduced deaths by 47% (95% CrI [10%–77%]). This methodology allows us to evaluate the effectiveness of vaccination and illustrates the need for continued vigilance and surveillance of YF.
|
20
|
Abstract
The threat of terrorism and the rise of extremist movements across the globe pose some of the greatest challenges the world currently faces. While the psychological study of terrorism has faced serious conceptual and methodological problems, the nascent field has advanced, and its evidence, theories, and models have grown in sophistication. The current article explores the role of social or collective identity in instigating, propagating, and diminishing engagement in violent extremism. Specifically, it examines how, when a fundamental need to belong is challenged through threats and uncertainty, people may join entitative groups that can fuse personal and social identities. These identities can be further amplified through ingroup and outgroup processes, leading to involvement in violent extremism. The paper also explores how identity can mediate the stress of the extremist lifestyle and sustain engagement in violence. To illustrate these processes, the article draws on interviews with Northern Irish paramilitaries. Finally, the paper explores the role of identity in moderating violent extremism and suggests approaches to promote desistance from violent extremism.
|
21
|
Clinical characteristics and predictors of outcomes of hospitalized patients with COVID-19 in a multi-ethnic London NHS Trust: a retrospective cohort study. Clin Infect Dis 2020; 73:e4047-e4057. [PMID: 32766823 PMCID: PMC7454410 DOI: 10.1093/cid/ciaa1091]
Abstract
Background Emerging evidence suggests ethnic minorities are disproportionately affected by coronavirus disease 2019 (COVID-19). Detailed clinical analyses of multicultural hospitalized patient cohorts remain largely undescribed. Methods We performed regression, survival, and cumulative competing risk analyses to evaluate factors associated with mortality in patients admitted for COVID-19 to 3 large London hospitals between 25 February and 5 April, censored as of 1 May 2020. Results Of 614 patients (median age, 69 [interquartile range, 25] years; 62% male), 381 (62%) were discharged alive, 178 (29%) died, and 55 (9%) remained hospitalized at censoring. Severe hypoxemia (adjusted odds ratio [aOR], 4.25 [95% confidence interval {CI}, 2.36–7.64]), leukocytosis (aOR, 2.35 [95% CI, 1.35–4.11]), thrombocytopenia (aOR, 1.01 [95% CI, 1.00–1.01], increase per 10⁹/L decrease), severe renal impairment (aOR, 5.14 [95% CI, 2.65–9.97]), and low albumin (aOR, 1.06 [95% CI, 1.02–1.09], increase per gram decrease) were associated with death. Forty percent (n = 244) of patients were from black, Asian, and other minority ethnic (BAME) groups, 38% (n = 235) were white, and ethnicity was unknown for 22% (n = 135). BAME patients were younger and had fewer comorbidities. Although the unadjusted odds of death did not differ by ethnicity, after adjusting for age, sex, and comorbidities, black patients were at higher odds of death than white patients (aOR, 1.69 [95% CI, 1.00–2.86]). This association was stronger when further adjusting for admission severity (aOR, 1.85 [95% CI, 1.06–3.24]). Conclusions BAME patients were overrepresented in our cohort; when accounting for demographic and clinical profile at admission, black patients were at increased odds of death. Further research is needed into the biological drivers of differences in COVID-19 outcomes by ethnicity.
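The adjusted odds ratios above come from multivariable logistic regression; as a simpler illustration of the underlying quantity, an unadjusted odds ratio with a Wald confidence interval can be computed directly from a 2×2 table (the counts here are hypothetical, not from the study):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed deaths, b = exposed survivors,
    c = unexposed deaths, d = unexposed survivors."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi
```

An adjusted OR additionally conditions on covariates (age, sex, comorbidities), which is why the crude and adjusted estimates in the study differ.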
|
22
|
Electromagnetic Sensors for Underwater Scour Monitoring. Sensors 2020; 20:s20154096. [PMID: 32717822 PMCID: PMC7436009 DOI: 10.3390/s20154096]
Abstract
Scour jeopardises the safety of many civil engineering structures with foundations in riverbeds, and it is the leading cause of bridge collapse worldwide. Current approaches for bridge scour risk management rely mainly on visual inspections, which provide unreliable estimates of scour and of its effects, particularly given the difficulty of visually monitoring riverbed erosion around submerged foundations. Thus, there is a need for systems capable of continuously monitoring the evolution of scour at bridge foundations, even during extreme flood events. This paper illustrates the development and deployment of a scour monitoring system consisting of smart probes equipped with electromagnetic sensors, the first application of this type of sensing probe to a real case study for continuous scour monitoring. Designed to observe changes in the permittivity of the medium around bridge foundations, the sensors allow detection of scour depth and assessment of whether the scour hole has been refilled. The monitoring system was installed on the A76 200 Bridge in New Cumnock (south-west Scotland) and has provided a continuous record of scour for nearly two years. The scour data registered after a peak flood event, validated against actual measurements of scour during a bridge inspection, show the potential of the technology to provide continuous scour measurements, even during extreme flood events, thus avoiding the deployment of divers for underwater examination.
|
23
|
Staying Engaged in Terrorism: Narrative Accounts of Sustaining Participation in Violent Extremism. Front Psychol 2020; 11:1338. [PMID: 32625152 PMCID: PMC7313378 DOI: 10.3389/fpsyg.2020.01338]
Abstract
Research exploring radicalization pathways and how and why people become involved in terrorism has expanded since the 9/11 attacks. Likewise, over the last decade research exploring de-radicalization and desistance from terrorism has grown in an attempt to promote exit from extremist or terror groups. However, research on how individuals sustain engagement in terrorism and their involvement with extremist organizations, often in the face of great adversity, is largely absent from the literature. To address this scarcity, this study analyzed accounts of engagement in violent extremism produced by Northern Irish loyalist and republican paramilitaries, exploring how their paramilitary lifestyle, their perpetration of acts of political violence, and the pressure of countering threats posed by rival groups and the State security forces affected them. The analysis utilized a hybrid of thematic analysis and interpretative phenomenological analysis (IPA). The themes raised through the analysis reflected the psychological, social and economic hardship associated with this lifestyle. The narrative accounts also illustrated psychological changes arising from engagement in violence and from insulation within tightly knit extremist groups. As most of the participants faced incarceration during their paramilitary careers, themes also reflected the impact imprisonment had on them. The themes explored factors that sustained their involvement, including the role of identity development and identity fusion in sustaining their extremism, the impact of insulated group membership, feelings of efficacy, dehumanization processes, community support, and beliefs in the utility of violence.
|
24
|
COVID-19 and the difficulty of inferring epidemiological parameters from clinical data - Authors' reply. Lancet Infect Dis 2020; 21:28. [PMID: 32473660 PMCID: PMC7255722 DOI: 10.1016/s1473-3099(20)30443-6]
|
25
|
28 Can Comprehensive Geriatric Assessment be Achieved in the Emergency Department? Age Ageing 2020. [DOI: 10.1093/ageing/afz183.28]
Abstract
Background
Our National Health Service is facing unprecedented challenges in accommodating our frailer healthcare users. The gold standard tool for the identification and management of frailty is the Comprehensive Geriatric Assessment (CGA), which has been shown to lead to better outcomes in terms of morbidity and mortality.
Introduction
With a largely elderly demographic profile in the East of England, the Norfolk and Norwich University Hospital opened the first Older People’s Emergency Department (OPED) in the UK in 2017. This work reviews the effectiveness of a geriatrician-led CGA in a dedicated OPED, which operates during daylight hours, compared to usual care in Accident & Emergency (A&E).
Methods
99 patients assessed in OPED and 99 patients assessed overnight in A&E during February 2019 were included in this retrospective study. Electronic case notes for each patient were reviewed by the authors and results were expressed as percentages.
Results
OPED outperformed A&E in all components of the CGA; strongest areas included assessing for pain, falls risk and activities of daily living. Both departments performed well in reviewing medications and assessing for safeguarding concerns. Areas for improvement include assessing for mood disorders, sensory impairment, discussing Do Not Attempt Cardiopulmonary Resuscitation status, and end of life care plans. The average length of stay of OPED patients was only 7.3 days compared to 8.7 days in A&E, and 89% of OPED patients were discharged back to their usual residences compared to 87% in A&E.
Conclusions
The improved CGA process in OPED has led to better outcomes, notably through a reduction in the average length of inpatient stay. Nevertheless, certain components of the CGA still require improvement. Further examination is needed to assess long-term mortality to support the use of CGA in the emergency setting.
|
26
|
Hospital staff awareness of operating theatre supply cost in a regional centre. Aust J Rural Health 2020; 28:87-88. [DOI: 10.1111/ajr.12582]
|
27
|
Estimating the standardised ileal digestible tryptophan requirement of pigs kept under commercial conditions in the immediate post-weaning period. Anim Feed Sci Technol 2020. [DOI: 10.1016/j.anifeedsci.2019.114342]
|
28
|
|
29
|
Wave propagation and scattering in reinforced concrete beams. J Acoust Soc Am 2019; 146:3283. [PMID: 31795707 DOI: 10.1121/1.5131644]
Abstract
Steel reinforcement bars (rebars) are vital to the strength of reinforced concrete (RC) structures, but can become damaged due to corrosion. Such damage is generally invisible, and non-destructive testing methods are needed to assess their integrity. Guided wave methods are popular because they are capable of detecting damage using sensors placed remotely from the damage site, which is often unknown. This paper predicts free wave propagation in RC beams, from which the concept of a guided wave based damage detection method emerges. The wave solutions are obtained using the wave finite element framework, where a short section of the beam's cross section is modeled with conventional finite elements (FE) and periodic boundary conditions are subsequently applied. Reinforcement elements are used in the FE model of the cross section as a neat and efficient means of coupling the concrete to the rebars and imposing prestress. The results show that prestress, important for static behavior, has a negligible effect on wave dispersion. An RC beam with a damaged section is modeled by coupling three waveguides, the center waveguide being identical to the outer ones except for a thickness loss in one rebar. Only small differences in cut-on frequencies are observed between the damaged and undamaged sections. However, these small differences give rise to strong reflection of some waves at frequencies close to cut-on. Below cut-on, most incident power is transmitted but experiences wave mode conversion, whereas above cut-on most power is transmitted to the same wave type. These observations form the basis for ongoing work to develop a damage detection technique premised on wave reflection near cut-on.
|
30
|
A systematic review of MERS-CoV seroprevalence and RNA prevalence in dromedary camels: Implications for animal vaccination. Epidemics 2019; 29:100350. [PMID: 31201040 PMCID: PMC6899506 DOI: 10.1016/j.epidem.2019.100350]
Abstract
Most adult dromedaries in Africa and the Middle East have been infected with MERS-CoV. Seroprevalence increases with age, while active infection is more common in calves. Prevalence is higher at sites where different dromedary populations mix. Further study is needed to determine if prevalence of infection varies seasonally.
Human infection with Middle East Respiratory Syndrome Coronavirus (MERS-CoV) is driven by recurring dromedary-to-human spill-over events, leading decision-makers to consider dromedary vaccination. Dromedary vaccine candidates in the development pipeline are showing hopeful results, but gaps in our understanding of the epidemiology of MERS-CoV in dromedaries must be addressed to design and evaluate potential vaccination strategies. We aim to bring together existing measures of MERS-CoV infection in dromedary camels to assess the distribution of infection, highlighting knowledge gaps and implications for animal vaccination. We systematically reviewed the published literature on MEDLINE, EMBASE and Web of Science that reported seroprevalence and/or prevalence of active MERS-CoV infection in dromedary camels from both cross-sectional and longitudinal studies. 60 studies met our eligibility criteria. Qualitative syntheses determined that MERS-CoV seroprevalence increased with age up to 80–100% in adult dromedaries supporting geographically widespread endemicity of MERS-CoV in dromedaries in both the Arabian Peninsula and countries exporting dromedaries from Africa. The high prevalence of active infection measured in juveniles and at sites where dromedary populations mix should guide further investigation – particularly of dromedary movement – and inform vaccination strategy design and evaluation through mathematical modelling.
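The finding that seroprevalence rises with age toward saturation is what a simple catalytic model predicts under a constant force of infection. A sketch of that model and its single-point inversion follows; real analyses of the reviewed data would fit age-stratified likelihoods rather than invert one point:

```python
import math

def seroprev(age, foi):
    """Catalytic model: probability of having been infected by a given
    age under a constant force of infection (foi, per year)."""
    return 1.0 - math.exp(-foi * age)

def fit_foi(age, observed_prev):
    """Invert the catalytic model for a single (age, prevalence) point."""
    return -math.log(1.0 - observed_prev) / age
```

With a high force of infection, prevalence approaches 100% in adults, consistent with the 80–100% adult dromedary seroprevalence reported above.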
|
31
|
POLICI: A web application for visualising and extracting yellow fever vaccination coverage in Africa. Vaccine 2019; 37:1384-1388. [PMID: 30770224 DOI: 10.1016/j.vaccine.2019.01.074]
Abstract
Recent yellow fever (YF) outbreaks have highlighted the increasing global risk of urban spread of the disease. In the context of recurrent vaccine shortages, preventive vaccination activities require accurate estimates of existing population-level immunity. We present POLICI (POpulation-Level Immunization Coverage - Imperial), an interactive online tool for visualising and extracting YF vaccination coverage estimates in Africa. We calculated single year age-disaggregated sub-national population-level vaccination coverage for 1950-2050 across the African endemic zone by collating vaccination information and inputting it into a demographic model. This was then implemented on an open interactive web platform. POLICI interactively displays age-disaggregated, population-level vaccination coverages at the first subnational administrative level, through numerous downloadable and customisable visualisations. POLICI is available at https://polici.shinyapps.io/yellow_fever_africa/. POLICI offers an accessible platform for relevant stakeholders in global health to access and explore vaccination coverages. These estimates have already been used to inform the WHO strategy to Eliminate Yellow fever Epidemics (EYE).
|
32
|
Bayesian inference of transmission chains using timing of symptoms, pathogen genomes and contact data. PLoS Comput Biol 2019; 15:e1006930. [PMID: 30925168 PMCID: PMC6457559 DOI: 10.1371/journal.pcbi.1006930]
Abstract
There exists significant interest in developing statistical and computational tools for inferring 'who infected whom' in an infectious disease outbreak from densely sampled case data, with most recent studies focusing on the analysis of whole genome sequence data. However, genomic data can be poorly informative of transmission events if mutations accumulate too slowly to resolve individual transmission pairs or if multiple pathogen lineages coexist within a host, and there has been little focus on incorporating other types of outbreak data. We present here a methodology that uses contact data for the inference of transmission trees in a statistically rigorous manner, alongside genomic and temporal data. Contact data are frequently collected in outbreaks of pathogens spread by close contact, including Ebola virus (EBOV), severe acute respiratory syndrome coronavirus (SARS-CoV) and Mycobacterium tuberculosis (TB), and routinely used to reconstruct transmission chains. As an improvement over previous, ad hoc approaches, we developed a probabilistic model that relates a set of contact data to an underlying transmission tree and integrated this into the outbreaker2 inference framework. By analyzing simulated outbreaks under various contact tracing scenarios, we demonstrate that contact data significantly improve our ability to reconstruct transmission trees, even under realistic limitations on the coverage of the contact tracing effort and the amount of non-infectious mixing between cases. Indeed, contact data are equally or more informative than fully sampled whole genome sequence data in certain scenarios. We then use our method to analyze the early stages of the 2003 SARS outbreak in Singapore and, for the first time, describe the range of transmission scenarios consistent with contact data and genetic sequences in a probabilistic manner. This simple yet flexible model can easily be incorporated into existing tools for outbreak reconstruction and should permit a better integration of genomic and epidemiological data for inferring transmission chains.
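The idea of combining timing, genetic and contact data into a single inference can be caricatured as summing per-stream log-scores for each candidate infector. The densities and parameter values below are illustrative stand-ins, not outbreaker2's actual model:

```python
import math

def score_infector(onset_i, onset_j, snp_dist, had_contact,
                   mean_si=8.0, mu=2.0, contact_odds=10.0):
    """Toy log-score for 'j infected i' combining three data streams:
    - timing: exponential serial-interval density on the onset gap
    - genetics: Poisson density on the SNP distance, mean mu
    - contact: a fixed log-odds bonus when a contact was reported
    All parameter values are illustrative assumptions."""
    gap = onset_i - onset_j
    if gap <= 0:
        return float("-inf")  # infector must precede infectee
    log_timing = -gap / mean_si - math.log(mean_si)
    log_genetic = snp_dist * math.log(mu) - mu - math.lgamma(snp_dist + 1)
    log_contact = math.log(contact_odds) if had_contact else 0.0
    return log_timing + log_genetic + log_contact

def most_likely_infector(onset_i, candidates):
    """candidates: list of (id, onset, snp_dist, had_contact)."""
    return max(candidates,
               key=lambda c: score_infector(onset_i, c[1], c[2], c[3]))[0]
```

A full MCMC sampler would instead explore whole transmission trees, but the per-pair scoring shows how a reported contact can outweigh a slightly worse genetic match.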
|
33
|
Abstract
Background Reconstructing individual transmission events in an infectious disease outbreak can provide valuable information and help inform infection control policy. Recent years have seen considerable progress in the development of methodologies for reconstructing transmission chains using both epidemiological and genetic data. However, only a few of these methods have been implemented in software packages, and with little consideration for customisability and interoperability. Users are therefore limited to a small number of alternative, incompatible tools with fixed functionality, or forced to develop their own algorithms at considerable personal effort. Results Here we present outbreaker2, a flexible framework for outbreak reconstruction. This R package re-implements and extends the original model introduced with outbreaker, but most importantly also provides a modular platform allowing users to specify custom models within an optimised inferential framework. As a proof of concept, we implement the within-host evolutionary model introduced with TransPhylo, which is very distinct from the original genetic model in outbreaker, and demonstrate how even complex models can be successfully included with minimal effort. Conclusions outbreaker2 provides a valuable starting point for future outbreak reconstruction tools and represents a unifying platform that promotes customisability and interoperability. Implemented in R, outbreaker2 joins a growing body of tools for outbreak analysis.
|
34
|
Assessment of implementation of the health management information system at the district level in southern Malawi. Malawi Med J 2018; 29:240-246. [PMID: 29872514 PMCID: PMC5811996 DOI: 10.4314/mmj.v29i3.3]
Abstract
Background Despite Malawi's introduction of a health management information system (HMIS) in 1999, the country's health sector still lacks accurate, reliable, complete, consistent and timely health data to inform effective planning and resource management. Methods A cross-sectional survey was conducted in which qualitative and quantitative data were collected through in-depth interviews, document review, and focus group discussions. Study participants comprised 10 HMIS officers and 10 district health managers from 10 districts in the Southern Region of Malawi. The study was conducted from March to April 2012. Quantitative data were analysed using Microsoft Excel, and qualitative data were summarised and analysed using thematic analysis. Results The study established that, based on the Ministry of Health's minimum requirements, only 1 of the 10 HMIS officers was qualified for the post. The HMIS officers stated that HMIS data collectors from the district hospital, health facilities, and the community included medical assistants, nurse-midwives, statistical clerks, and health surveillance assistants (HSAs). Challenges with the system included inadequate resources, knowledge gaps, understaffing, and a lack of training and refresher courses, which collectively contribute to unreliable information and therefore poorly informed decision-making, according to the respondents. The HMIS officers further commented that missing values arose from incomplete registers and data gaps. Furthermore, improper comprehension of some terms by HSAs and statistical clerks led to incorrectly recorded data. Conclusions The inadequate qualifications among the diverse group of data collectors, along with the varying availability and utilisation of different data collection tools, contributed to data inaccuracies. Nevertheless, HMIS was useful for the development of District Implementation Plans (DIPs) and planning for other projects. To reduce data inconsistencies, HMIS indicators should be revised and data collection tools should be harmonised.
|
35
|
Repurposing isoxazoline veterinary drugs for control of vector-borne human diseases. Proc Natl Acad Sci U S A 2018; 115:E6920-E6926. [PMID: 29967151 PMCID: PMC6055183 DOI: 10.1073/pnas.1801338115]
Abstract
Isoxazolines are oral insecticidal drugs currently licensed for ectoparasite control in companion animals. Here we propose their use in humans for the reduction of vector-borne disease incidence. Fluralaner and afoxolaner rapidly killed Anopheles, Aedes, and Culex mosquitoes and Phlebotomus sand flies after feeding on a drug-supplemented blood meal, with IC50 values ranging from 33 to 575 nM, and were fully active against strains with preexisting resistance to common insecticides. Based on allometric scaling of preclinical pharmacokinetics data, we predict that a single human median dose of 260 mg (IQR, 177-407 mg) for afoxolaner, or 410 mg (IQR, 278-648 mg) for fluralaner, could provide an insecticidal effect lasting 50-90 days against mosquitoes and Phlebotomus sand flies. Computational modeling showed that seasonal mass drug administration of such a single dose to a fraction of a regional population would dramatically reduce clinical cases of Zika and malaria in endemic settings. Isoxazolines therefore represent a promising new component of drug-based vector control.
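The dose predictions above rely on allometric scaling of preclinical pharmacokinetics. A generic form of the weight-based scaling rule is sketched below; the 0.75 exponent is the conventional metabolic-scaling assumption and the inputs are hypothetical, while the paper's full pharmacokinetic model is more involved:

```python
def human_equivalent_dose(animal_dose_mg_per_kg, animal_kg, human_kg=60.0,
                          exponent=0.75):
    """Allometric scaling: total dose is assumed to scale with body
    weight**exponent, so the mg/kg dose scales with weight**(exponent - 1).
    Returns the total human dose in mg."""
    hed_per_kg = animal_dose_mg_per_kg * (animal_kg / human_kg) ** (1.0 - exponent)
    return hed_per_kg * human_kg
```

For example, a 25 mg/kg dose in a 10 kg dog scales to roughly 16 mg/kg in a 60 kg adult, i.e. a total dose on the order of 1 g.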
|
36
|
The seasonal influence of climate and environment on yellow fever transmission across Africa. PLoS Negl Trop Dis 2018; 12:e0006284. [PMID: 29543798 PMCID: PMC5854243 DOI: 10.1371/journal.pntd.0006284]
Abstract
Background Yellow fever virus (YFV) is a vector-borne flavivirus endemic to Africa and Latin America. Ninety per cent of the global burden occurs in Africa where it is primarily transmitted by Aedes spp, with Aedes aegypti the main vector for urban yellow fever (YF). Mosquito life cycle and viral replication in the mosquito are heavily dependent on climate, particularly temperature and rainfall. We aimed to assess whether seasonal variations in climatic factors are associated with the seasonality of YF reports. Methodology/Principal findings We constructed a temperature suitability index for YFV transmission, capturing the temperature dependence of mosquito behaviour and viral replication within the mosquito. We then fitted a series of multilevel logistic regression models to a dataset of YF reports across Africa, considering location and seasonality of occurrence for seasonal models, against the temperature suitability index, rainfall and the Enhanced Vegetation Index (EVI) as covariates alongside further demographic indicators. Model fit was assessed by the Area Under the Curve (AUC), and models were ranked by Akaike’s Information Criterion which was used to weight model outputs to create combined model predictions. The seasonal model accurately captured both the geographic and temporal heterogeneities in YF transmission (AUC = 0.81), and did not perform significantly worse than the annual model which only captured the geographic distribution. The interaction between temperature suitability and rainfall accounted for much of the occurrence of YF, which offers a statistical explanation for the spatio-temporal variability in transmission. Conclusions/Significance The description of seasonality offers an explanation for heterogeneities in the West-East YF burden across Africa. Annual climatic variables may indicate a transmission suitability not always reflected in seasonal interactions. 
This finding, in conjunction with forecasted data, could highlight areas of increased transmission and provide insights into the occurrence of large outbreaks, such as those seen in Angola, the Democratic Republic of the Congo and Brazil. In this article, we describe the development of a model to quantify the seasonal dynamics of yellow fever virus (YFV) transmission across Africa. YFV is a flavivirus transmitted, within Africa, primarily by Aedes spp.; it causes an estimated 78,000 deaths a year despite the existence of a safe and effective vaccine. The importance of sufficient vaccination, made difficult by a global shortage, has been highlighted by recent large-scale, devastating outbreaks in Angola, the Democratic Republic of the Congo and Brazil. Here we describe a novel way of parameterising the effect of temperature on YFV transmission and implement statistical models to predict both the geographic and temporal heterogeneities in transmission, while demonstrating their robustness in comparison to models that simply predict geographic distribution. We believe this quantification of seasonality could lead to more precise applications of vaccination campaigns and vector-control programmes. In turn, this would help maximise their impact, which is especially vital with limited resources, and could contribute to lessening the risk of large-scale outbreaks. Moreover, the methods described here could be applied to other Aedes-borne diseases and so provide a useful tool in understanding, and combatting, several other important diseases such as dengue and Zika.
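The AIC-based model averaging described in this abstract can be sketched in a few lines: models are ranked by AIC, relative likelihoods are converted to Akaike weights, and the weights combine each model's predicted probability of YF occurrence. This is a minimal illustration, not the authors' code; the AIC values and predictions below are hypothetical.

```python
import math

def akaike_weights(aics):
    """Convert a list of AIC scores into normalised Akaike weights."""
    best = min(aics)
    rel_likelihoods = [math.exp(-0.5 * (a - best)) for a in aics]
    total = sum(rel_likelihoods)
    return [r / total for r in rel_likelihoods]

def combined_prediction(predictions, aics):
    """Average each model's predicted occurrence probability, weighted by its Akaike weight."""
    weights = akaike_weights(aics)
    return sum(w * p for w, p in zip(weights, predictions))

# Hypothetical AICs for three candidate seasonal models
aics = [1204.2, 1206.9, 1215.4]
weights = akaike_weights(aics)
```

The weight of each model decays exponentially with its AIC distance from the best model, so poorly ranked models contribute little to the combined prediction.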
Collapse
|
37
|
When are pathogen genome sequences informative of transmission events? PLoS Pathog 2018; 14:e1006885. [PMID: 29420641 PMCID: PMC5821398 DOI: 10.1371/journal.ppat.1006885] [Citation(s) in RCA: 65] [Impact Index Per Article: 10.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/07/2017] [Revised: 02/21/2018] [Accepted: 01/18/2018] [Indexed: 01/19/2023] Open
Abstract
Recent years have seen the development of numerous methodologies for reconstructing transmission trees in infectious disease outbreaks from densely sampled whole genome sequence data. However, a fundamental and as yet poorly addressed limitation of such approaches is the requirement for genetic diversity to arise on epidemiological timescales. Specifically, the position of infected individuals in a transmission tree can only be resolved by genetic data if mutations have accumulated between the sampled pathogen genomes. To quantify and compare the useful genetic diversity expected from genetic data in different pathogen outbreaks, we introduce here the concept of ‘transmission divergence’, defined as the number of mutations separating whole genome sequences sampled from transmission pairs. Using parameter values obtained by literature review, we simulate outbreak scenarios alongside sequence evolution, using two models described in the literature, to characterise the transmission divergence of ten major outbreak-causing pathogens. We find that while mean values vary significantly between the pathogens considered, their transmission divergence is generally very low, with many outbreaks characterised by large numbers of genetically identical transmission pairs. We describe the impact of transmission divergence on our ability to reconstruct outbreaks using two outbreak reconstruction tools, the R packages outbreaker and phybreak, and demonstrate that, in agreement with previous observations, genetic sequence data of rapidly evolving pathogens such as RNA viruses can provide valuable information on individual transmission events. Conversely, sequence data of pathogens with lower mean transmission divergence, including Streptococcus pneumoniae, Shigella sonnei and Clostridium difficile, provide little to no information about individual transmission events. 
Our results highlight the informational limitations of genetic sequence data in certain outbreak scenarios, and demonstrate the need to expand the toolkit of outbreak reconstruction tools to integrate other types of epidemiological data. The increasing availability of genetic sequence data has sparked an interest in using pathogen whole genome sequences to reconstruct the history of individual transmission events in an infectious disease outbreak. However, such methodologies rely on pathogen genomes mutating rapidly enough to discriminate between infected individuals, an assumption that remains to be investigated. To determine the pathogen outbreaks for which genetic data are expected to be informative of transmission events, we introduce here the concept of ‘transmission divergence’, defined as the number of mutations separating pathogen genome sequences sampled from transmission pairs. We characterise the transmission divergence of ten major outbreak-causing pathogens using simulations and find significant variation between diseases, with viral outbreaks generally exhibiting higher transmission divergence than bacterial ones. We reconstruct these outbreaks using the R packages outbreaker and phybreak and find that genetic sequence data, though useful for rapidly evolving pathogens, provide little to no information about outbreaks with low transmission divergence, such as those of Streptococcus pneumoniae and Shigella sonnei. Our results demonstrate the need to incorporate other sources of outbreak data, such as contact tracing data and spatial location data, into outbreak reconstruction tools.
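The core quantity in this abstract, transmission divergence, can be approximated by treating the number of mutations between a donor-recipient genome pair as a Poisson draw with mean equal to the per-genome mutation rate times the generation interval. The sketch below is an illustrative simplification (the paper uses two more detailed models from the literature), and the RNA-virus-like parameter values are hypothetical.

```python
import math
import random

def poisson_draw(rng, lam):
    """Knuth's Poisson sampler; adequate for the small means involved here."""
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def simulate_transmission_divergence(rate_per_site_per_day, genome_length,
                                     mean_generation_days, n_pairs, seed=1):
    """Mutations separating each donor-recipient genome pair, modelled as
    Poisson(per-genome mutation rate x generation interval)."""
    rng = random.Random(seed)
    lam = rate_per_site_per_day * genome_length * mean_generation_days
    return [poisson_draw(rng, lam) for _ in range(n_pairs)]

# Hypothetical parameters: ~1e-6 substitutions/site/day, 13 kb genome,
# 14-day generation interval -> mean divergence ~0.18 mutations per pair
divergences = simulate_transmission_divergence(1e-6, 13000, 14, 2000)
frac_identical = sum(d == 0 for d in divergences) / len(divergences)
```

Even with these fast-evolving-virus parameters, most simulated transmission pairs are genetically identical, which illustrates why sequence data alone often cannot resolve individual transmission events.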
Collapse
|
38
|
Social movements, structural violence, and conflict transformation in Northern Ireland: The role of loyalist paramilitaries. PEACE AND CONFLICT: JOURNAL OF PEACE PSYCHOLOGY 2018. [DOI: 10.1037/pac0000274] [Citation(s) in RCA: 10] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/08/2022]
|
39
|
Diabetes insipidus after discontinuation of vasopressin infusion for septic shock. J Clin Pharm Ther 2017; 43:287-290. [PMID: 28895166 DOI: 10.1111/jcpt.12627] [Citation(s) in RCA: 12] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/18/2017] [Accepted: 08/21/2017] [Indexed: 12/25/2022]
Abstract
WHAT IS KNOWN AND OBJECTIVE Despite widespread use of vasopressin for the treatment of septic shock, few cases of diabetes insipidus (DI) following its discontinuation have been reported. CASE SUMMARY A 54-year-old man presented with pneumonia progressing to septic shock, requiring norepinephrine and vasopressin for refractory hypotension. After clinical improvement, the patient developed polyuria and severe hypernatremia on 3 separate occasions upon discontinuation of vasopressin, with prompt recovery each time it was resumed. WHAT IS NEW AND CONCLUSION The occurrence of DI upon discontinuation of vasopressin infusion appears to be rare, but its incidence may be underestimated owing to a paucity of published reports. The actual incidence and underlying mechanism of this phenomenon remain to be elucidated.
Collapse
|
40
|
604 Effect of milk replacer fat content during calfhood and cereal type and supplemental saturated fat inclusion in the finishing ration on the performance and carcass composition of young Holstein Friesian bulls. J Anim Sci 2017. [DOI: 10.2527/asasann.2017.604] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/13/2022] Open
|
41
|
Care-Seeking for Diarrhoea in Southern Malawi: Attitudes, Practices and Implications for Diarrhoea Control. INTERNATIONAL JOURNAL OF ENVIRONMENTAL RESEARCH AND PUBLIC HEALTH 2016; 13:ijerph13111140. [PMID: 27854311 PMCID: PMC5129350 DOI: 10.3390/ijerph13111140] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Subscribe] [Scholar Register] [Received: 07/19/2016] [Revised: 10/21/2016] [Accepted: 10/24/2016] [Indexed: 11/18/2022]
Abstract
This paper examined care-seeking behaviour and its associated risk factors when a family member had diarrhoea. Data were obtained from a survey conducted in Chikwawa, a district in Southern Malawi. Chikwawa faces a number of environmental and socioeconomic problems, and diarrhoea morbidity in the district is currently estimated at 24.4%, statistically higher than the national average of 17%. Using hierarchically structured data from a survey of 1403 households nested within 33 communities, a series of two-level binary logistic regression models with Bayesian estimation were used to determine predictors of care-seeking behaviour. The results show that 68% of mothers used oral rehydration solutions (ORS) the last time a child in their family had diarrhoea. However, when asked what action they take when a member of their household has diarrhoea, two-thirds of the mothers said they visit a health facility. Most respondents (73%) mentioned distance and transport costs as the main obstacles to accessing their nearest health facility, and the same proportion of respondents mentioned prolonged waiting time and absence of health workers as the main obstacles encountered at the health facilities. The main predictor variables when a member of the family had diarrhoea were maternal age, distance to the nearest health facility, school level, relative wealth, household diarrhoea endemicity, and household size, while the main predictor variables when a child had diarrhoea were the existence of a village health committee (VHC), distance to the nearest health facility, and maternal age. Most households use ORS for the treatment of diarrhoea, and village health committees and health surveillance assistants (HSAs) are important factors in this choice of treatment. Health education messages on the use and efficacy of ORS, to ensure proper and prescribed handling, are important. 
There is a need for a comprehensive approach addressing several dimensions: management and proper coordination of the delivery of resources and services; availability of adequate healthcare workers at all levels; affordability and accessibility of healthcare resources and services for all communities; acceptability and quality of care; intensification of health education messages on the use and management of ORS; and prompt and timely treatment of diarrhoeal illness.
Collapse
|
42
|
|
43
|
Abstract
Despite evidence that intensive rehabilitation speeds recovery from acute illness, several studies on British rehabilitation units have shown that the time spent by patients in therapeutic activities is low and that levels of 'engagement' are poor. We carried out an observational study of patient activity on four rehabilitation wards for the elderly (51 patients observed at half-hourly intervals between 8 a.m. and 5 p.m. on five successive days). Patients were found to be engaged in therapeutically useful activities at only 17% of the observation points. When time spent in the therapy departments (where activities were not monitored) was excluded the proportion of useful activities fell to 11%. Similar patterns of activity were seen in all patient subgroups. An intervention scheme was therefore devised, whereby an hourly activities programme tailored to the needs of each patient was worked out by therapists and ward staff, to be supervised by nurses. One nurse also organized regular group activities. The intervention programme, which required no extra resources, was instituted on two of the four wards. A repeat survey conducted two months later showed a 55% increase in the proportion of time spent in useful activities on the two intervention wards but no change on the other two wards.
Collapse
|
44
|
Comparison of Northern Irish Children’s Attitudes to War and Peace Before and After the Paramilitary Ceasefires. INTERNATIONAL JOURNAL OF BEHAVIORAL DEVELOPMENT 2016. [DOI: 10.1080/016502597385144] [Citation(s) in RCA: 13] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/17/2022]
Abstract
This study compares the attitudes of young people in Northern Ireland to conflict and conflict resolution, before and after the 1994 ceasefire announcements. Content analysis of the responses of 117 adolescents aged 14-15 years showed differences in their attitudes to war and peace and in their strategies to attain peace. Concepts of war as static and unchanging showed a significant difference after the ceasefire. In addition, the perception of war as a struggle between national leaders before the ceasefire shifted significantly to a more general view of war in terms of war activities and their negative consequences. Perceptions of peace as “active” showed a marked swing after the ceasefire to a more abstract view of peace as freedom, justice, and liberty. Before the ceasefire, adolescents were reluctant to provide strategies to attain peace, but after the ceasefire, strategies were suggested with more confidence. Results also indicated that adolescents prefer an alternative to violence in the resolution of conflict. Although the proportion of adolescents who said the country was at peace did not change significantly after the ceasefire, the percentage who expressed ambivalent feelings about the status of Northern Ireland in terms of peace increased significantly. This suggests that, at the time of this study, many young people had not fully accepted the reality of the peace process.
Collapse
|
45
|
WS10.6 Are cystic fibrosis patients frail? Is 50 the new 80? J Cyst Fibros 2016. [DOI: 10.1016/s1569-1993(16)30119-9] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/16/2022]
|
46
|
How would a decision to leave the European Union affect medical research and health in the United Kingdom? J R Soc Med 2016; 109:216-218. [PMID: 27222366 DOI: 10.1177/0141076816652027] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/15/2022] Open
|
47
|
Accounting for uncertainty in the quantification of the environmental impacts of Canadian pig farming systems. J Anim Sci 2016; 93:3130-43. [PMID: 26115299 DOI: 10.2527/jas.2014-8403] [Citation(s) in RCA: 26] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/13/2022] Open
Abstract
The objective of the study was to develop a life cycle assessment (LCA) for pig farming systems that would account for uncertainty and variability in input data and allow systematic environmental impact comparisons between production systems. The environmental impacts of commercial pig production for 2 regions in Canada (Eastern and Western) were compared using a cradle-to-farm gate LCA. These systems had important contrasting characteristics such as typical feed ingredients used, herd performance, and expected emission factors from manure management. The study used detailed production data supplied by the industry and incorporated uncertainty/variation in all major aspects of the system including life cycle inventory data for feed ingredients, animal performance, energy inputs, and emission factors. The impacts were defined using 5 metrics (global warming potential, acidification potential, eutrophication potential (EP), abiotic resource use, and nonrenewable energy use) and were expressed per kilogram carcass weight at farm gate. Eutrophication potential was further separated into marine EP (MEP) and freshwater EP (FEP). Uncertainties in the model inputs were separated into 2 types: uncertainty in the data used to describe the system (α uncertainties) and uncertainty in impact calculations or background data that affects all systems equally (β uncertainties). The impacts of pig production in the 2 regions were systematically compared based on the differences in the systems (α uncertainties). The method of ascribing uncertainty influenced the outcomes. In eastern systems, EP, MEP, and FEP were lower (P < 0.05) when assuming that all uncertainty in the emission factors for leaching from manure application was β. This was mainly due to increased EP resulting from field emissions for typical ingredients in western diets. When uncertainty in these emission factors was assumed to be α, only FEP was lower in eastern systems (P < 0.05). 
The environmental impacts for the other impact categories were not significantly different between the 2 systems, despite their aforementioned differences. In conclusion, a probabilistic approach was used to develop an LCA that, for the first time, systematically dealt with uncertainty in the data when comparing multiple environmental impact measures in pig farming systems. The method was used to identify differences between Canadian pig production systems but can also be applied to comparisons between other agricultural systems that include inherent variation.
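The α/β distinction in this abstract can be illustrated with a small Monte Carlo sketch: β uncertainties are drawn once per iteration and applied to both systems (so they cancel out of the comparison), while α uncertainties are drawn independently per system. This is an illustrative sketch only, not the authors' LCA model; all distributions and impact values below are hypothetical.

```python
import random

def compare_systems(n_draws=10000, seed=42):
    """Monte Carlo estimate of P(eastern impact < western impact).
    beta: shared emission-factor uncertainty, one draw applied to both systems.
    alpha: system-specific uncertainty, drawn independently for each system."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(n_draws):
        beta = rng.gauss(1.0, 0.2)           # shared (beta) multiplier, cancels in comparison
        east = beta * rng.gauss(3.0, 0.3)    # alpha: eastern system impact (hypothetical units)
        west = beta * rng.gauss(3.4, 0.3)    # alpha: western system impact (hypothetical units)
        wins += east < west
    return wins / n_draws
```

Because the β multiplier scales both systems identically within each draw, only the α uncertainties determine how often one system outperforms the other, which is why the abstract reports that the choice of ascribing uncertainty to α or β changed which differences were significant.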
Collapse
|
48
|
The Ecological Dynamics of Fecal Contamination and Salmonella Typhi and Salmonella Paratyphi A in Municipal Kathmandu Drinking Water. PLoS Negl Trop Dis 2016; 10:e0004346. [PMID: 26735696 PMCID: PMC4703202 DOI: 10.1371/journal.pntd.0004346] [Citation(s) in RCA: 54] [Impact Index Per Article: 6.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/29/2015] [Accepted: 12/09/2015] [Indexed: 11/20/2022] Open
Abstract
One of the UN sustainable development goals is to achieve universal access to safe and affordable drinking water by 2030. It is locations like Kathmandu, Nepal, a densely populated city in South Asia with endemic typhoid fever, where this goal is most pertinent. Aiming to understand the public health implications of water quality in Kathmandu, we subjected weekly water samples from 10 sources for one year to a range of chemical and bacteriological analyses. We additionally aimed to detect the etiological agents of typhoid fever and longitudinally assess microbial diversity by 16S rRNA gene surveying. We found that the majority of water sources exhibited chemical and bacterial contamination exceeding WHO guidelines. Further analysis of the chemical and bacterial data indicated site-specific pollution, symptomatic of highly localized fecal contamination. Rainfall was found to be a key driver of this fecal contamination, correlating with nitrates and evidence of S. Typhi and S. Paratyphi A, for which DNA was detectable in 333 (77%) and 303 (70%) of 432 water samples, respectively. 16S rRNA gene surveying outlined a spectrum of fecal bacteria in the contaminated water, forming complex communities again displaying location-specific temporal signatures. Our data signify that the municipal water in Kathmandu is a predominant vehicle for the transmission of S. Typhi and S. Paratyphi A. This study represents the first extensive spatiotemporal investigation of water pollution in an endemic typhoid fever setting and implicates highly localized human waste as the major contributor to poor water quality in the Kathmandu Valley. Aiming to understand the ecology of municipal drinking water and measure the potential exposure to pathogens that cause typhoid fever (Salmonella Typhi and Salmonella Paratyphi A) in Kathmandu, Nepal, we collected water samples from 10 water sources weekly for one year and subjected them to comprehensive chemical, bacteriological and molecular analyses. 
We found that Kathmandu drinking water exhibits longitudinal fecal contamination in excess of WHO guidelines. The chemical composition of water indicated site-specific pollution profiles, which were likely driven by localized contamination with human fecal material. We additionally found that Salmonella Typhi and Salmonella Paratyphi A could be detected throughout the year in every water sampling location, but specifically peaked after the monsoons. A microbiota analysis (a method for studying bacterial diversity in biological samples) revealed the water to be contaminated by complex populations of fecal bacteria, which again exhibited a unique profile by both location and time. This study shows that Salmonella Typhi and Salmonella Paratyphi A can be longitudinally detected in drinking water in Kathmandu and represents the first major investigation of the spatiotemporal dynamics of drinking water pollution in an endemic typhoid setting.
Collapse
|
49
|
Community knowledge variation, bed-net coverage and the role of a district healthcare system, and their implications for malaria control in southern Malawi. ACTA ACUST UNITED AC 2015. [DOI: 10.1080/10158782.2012.11441496] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/18/2022]
|
50
|
|