51. Moraru A, de Almeida MM, Degryse JM. PALTEM: What Parameters Should Be Collected in Disaster Settings to Assess the Long-Term Outcomes of Famine? Int J Environ Res Public Health 2018; 15:857. [PMID: 29693637 PMCID: PMC5981896 DOI: 10.3390/ijerph15050857]
Abstract
Evidence suggests that nutritional status during fetal development and early life leaves an imprint on the genome, which affects health outcomes not only in the exposed individual as an adult but also in his or her offspring. The purpose of this study is to provide an overview of the relevant parameters that need to be collected to assess the long-term and transgenerational health outcomes of famine. A literature search was conducted for the most pertinent articles on the epigenetic effects of famine. The results were compiled, synthesized and discussed with an expert in genetics for critical input and validation. Prenatal and early-life exposure to famine was associated with metabolic, cardiovascular, respiratory, reproductive, neuropsychiatric and oncologic diseases. We propose a set of parameters to be collected in disaster settings to assess the long-term outcomes of famine: PALTEM (parameters to assess long-term effects of malnutrition).
52. Brand MP, Peeters PH, van Gils CH, Elias SG. Pre-adult famine exposure and subsequent colorectal cancer risk in women. Int J Epidemiol 2018; 46:612-621. [PMID: 27585673 DOI: 10.1093/ije/dyw121]
Abstract
Background Nutritional deprivation during growth and development may contribute to colorectal cancer (CRC) risk in later life. Methods We studied 7906 women who were aged 0-21 years during the 1944-45 Dutch famine and enrolled in the Prospect-EPIC study between 1993 and 1997. We used Cox proportional hazards analyses to estimate hazard ratios (HRs) and 95% confidence intervals (CIs) for colorectal (proximal, distal and rectal) cancer risk across self-reported famine exposure and exposure-age categories, while adjusting for potential confounders. Results During a median of 17.3 years of follow-up, 245 CRC cases occurred. Moderately and severely famine-exposed women showed 24% and 44% higher CRC risk, respectively, compared with women who reported no exposure [HR moderate 1.24 (95% CI: 0.93-1.64); HR severe 1.44 (1.03-2.03); P trend 0.027]. This relation was attenuated after adjustment for potential confounders [adjusted HR moderate 1.15 (0.87-1.53); HR severe 1.35 (0.96-1.90); P trend 0.091]. Stratified results suggested that severe famine exposure between 10 and 17 years of age was particularly related to CRC risk [adjusted HR moderate 1.39 (0.91-2.11); HR severe 1.76 (1.10-2.83); P trend 0.019; P interaction (famine × age 10-17 y) 0.096]. Overall, we found no differences in famine effects across CRC subsites, but age-at-exposure-stratified results suggested an increased risk of proximal CRC in those aged 10-17 years during exposure to the famine [adjusted HR moderate 2.14 (1.06-4.32), HR severe 2.96 (1.35-6.46); P trend 0.005]. Overall and within age-at-exposure categories, tests for subsite-specific heterogeneity in famine effects were not significant. Conclusions Our findings suggest that severe exposure to a short period of caloric restriction in pre-adult women may relate to CRC risk decades later.
53. Xu H, Zhang Z, Li L, Liu J. Early life exposure to China's 1959-61 famine and midlife cognition. Int J Epidemiol 2018; 47:109-120. [PMID: 29126190 PMCID: PMC6075478 DOI: 10.1093/ije/dyx222]
Abstract
Background Existing studies of the 1944-45 Dutch famine found little evidence of an association between early-life malnutrition and midlife cognition. Methods Among 2446 rural participants born between 1958 and 1963 in the China Health and Retirement Longitudinal Study, we examined the effects of exposure to China's 1959-61 Great Leap Forward famine during prenatal and early postnatal life on four cognitive measures in 2011 (baseline) and on changes in cognition between 2011 and 2013 (first follow-up). We obtained difference-in-differences (DID) estimates of the famine effects by exploiting temporal variation in the timing and duration of famine exposure across six birth cohorts born between 1958 and 1963, together with geographical variation in famine severity at the prefecture level. Results After adjusting for gender, marital status and provincial fixed effects, we found that the 1961 cohort, which experienced full-term prenatal and partial-term postnatal exposure to the famine, had lower scores on the Telephone Interview of Cognitive Status (TICS), a test of drawing pentagons, and general cognition at age 50 years compared with the unexposed 1963 cohort. After adjustment for education, the famine effects on drawing pentagons and general cognition were fully attenuated, but the effect on TICS persisted. We also found a robust negative famine effect on the longitudinal change in general cognition during the 2-year follow-up in the 1959 cohort. Conclusions Severe nutritional deprivation during prenatal and postnatal periods has a lasting impact on cognitive performance in Chinese adults in their early 50s.
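[Editor's note] For readers unfamiliar with the difference-in-differences design described above, a minimal sketch follows. It is an illustration only, not the authors' code: the file name and column names (cognition, exposed, severity, female, married, province, prefecture) are hypothetical stand-ins for the CHARLS variables.

    # Difference-in-differences sketch: the famine effect is identified by the
    # interaction of birth-cohort exposure with prefecture-level famine severity.
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("charls_analysis.csv")  # hypothetical analysis file

    # exposed:severity is the DID term; C(province) adds provincial fixed
    # effects, mirroring the adjustment set described in the abstract.
    model = smf.ols(
        "cognition ~ exposed * severity + female + married + C(province)",
        data=df,
    )
    result = model.fit(cov_type="cluster", cov_kwds={"groups": df["prefecture"]})
    print(result.summary())

Clustering standard errors at the prefecture level, where famine severity varies, is one common choice; the published analysis may have handled inference differently.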
54. Chang X, Song P, Wang M, An L. The Risks of Overweight, Obesity and Abdominal Obesity in Middle Age after Exposure to Famine in Early Life: Evidence from the China's 1959-1961 Famine. J Nutr Health Aging 2018; 22:1198-1204. [PMID: 30498826 DOI: 10.1007/s12603-018-1144-z]
Abstract
OBJECTIVE Several studies have revealed that exposure to famine in early life is associated with higher body mass index (BMI) and waist circumference, but most used cross-sectional data and defined those born before or after the famine period as non-exposed participants, which ignores the effects of age. Our objective was to study the effects of undernutrition in early life on overweight, obesity and abdominal obesity at ages 54-56. METHODS This was a retrospective cohort study with status at ages 54-56 as the outcome. 1092 participants born between 1959 and 1961 from the 2015 wave of the China Health and Retirement Longitudinal Study (CHARLS) were defined as exposed, and 1616 participants born between 1955 and 1957 from the 2011 wave of CHARLS were defined as controls. We used prevalence odds ratios (ORs) to estimate the risks of overweight, obesity and abdominal obesity, stratified by famine severity and sex for comparison. RESULTS The exposed group had higher risks of overweight (OR 1.357, 95% CI 1.067, 1.727) and obesity (OR 1.356, 95% CI 1.001, 1.836) in women but not in men. Participants in the exposed group were more likely to have abdominal obesity (OR 1.362, 95% CI 1.139, 1.629), regardless of famine severity and gender. CONCLUSION Undernutrition in early life increased the risks of overweight and obesity in women but not in men, and it increased the risk of abdominal obesity in both men and women.
55. Vaiserman AM. Early-Life Nutritional Programming of Type 2 Diabetes: Experimental and Quasi-Experimental Evidence. Nutrients 2017; 9:236. [PMID: 28273874 PMCID: PMC5372899 DOI: 10.3390/nu9030236]
Abstract
Consistent evidence from both experimental and human studies suggests that inadequate nutrition in early life can contribute to the risk of developing metabolic disorders, including type 2 diabetes (T2D), in adult life. In human populations, most findings supporting a causative relationship between early-life malnutrition and subsequent risk of T2D were obtained from quasi-experimental studies (‘natural experiments’). Prenatal and/or early postnatal exposure to famine has been shown to be associated with a higher risk of T2D in many cohorts around the world. Recent studies have highlighted the importance of epigenetic regulation of gene expression as a possible major contributor to the link between early-life famine exposure and T2D in adulthood. Findings from these studies suggest that prenatal exposure to famine may induce persistent epigenetic changes that have adaptive significance in postnatal development but can predispose to metabolic disorders, including T2D, in later life. In this review, quasi-experimental data on the developmental programming of T2D are summarized, and recent research findings on the changes in DNA methylation that mediate these effects are discussed.
56. Ekamper P, Bijwaard G, van Poppel F, Lumey LH. War-related excess mortality in The Netherlands, 1944-45: New estimates of famine- and non-famine-related deaths from national death records. Hist Methods 2017; 50:113-128. [PMID: 30416230 PMCID: PMC6226247 DOI: 10.1080/01615440.2017.1285260]
Abstract
Although several estimates exist of famine-related deaths in the west of The Netherlands during the last stage of World War II, no such information exists for war-related excess mortality among the civilian population of other areas of the country. Previously unavailable data files from Statistics Netherlands allow researchers to estimate the number of war-related excess deaths during the last stage of the war for the whole country. Using a seasonally adjusted mortality model combined with a difference-in-difference approach, this study estimates the number of excess deaths between January 1944 and July 1945 at a total of close to 91,000 (75%) excess deaths. Almost half of all war-related excess mortality during the last year of the war occurred outside the west.
57. Xie SH, Lagergren J. A possible link between famine exposure in early life and future risk of gastrointestinal cancers: Implications from age-period-cohort analysis. Int J Cancer 2016; 140:636-645. [PMID: 27778325 DOI: 10.1002/ijc.30485]
Abstract
The Chinese famine in 1958-1962 was one of the worst in human history, but its potential influence on cancer risks is uncertain. Using cancer incidence data in Shanghai, China, during 1983-2007, we calculated age-specific incidence rates of gastrointestinal cancers in birth cohorts exposed to the Chinese famine in different periods of life and a non-exposed reference cohort. Age-period-cohort regressions estimated the overall relative risks of gastrointestinal cancers in each birth cohort. A total of 212,098 new cases of gastrointestinal cancer were identified during the study period (129,233 males and 82,865 females), among whom 18,146 had esophageal cancer, 71,011 gastric cancer, 55,864 colorectal cancer, 42,751 liver cancer, 9,382 gallbladder cancer and 14,944 had pancreatic cancer. The risk of esophageal, gastric, colorectal and liver cancers was higher in cohorts exposed to the Chinese famine in early life than in the reference cohort, except for esophageal cancer in women. The risk of esophageal, liver and colorectal cancers was particularly high in men exposed to famine during early childhood (0-9 years). There were no clear associations between famine exposure and the risk of pancreatic or gallbladder cancer. This study suggests an increased risk of esophageal, gastric, liver and colorectal cancers associated with childhood exposure to the Chinese famine. These findings indicate a need for further investigations confirming the results and identifying the underlying mechanisms.
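[Editor's note] As background on the age-period-cohort framework used above, a minimal sketch of the model in our own notation (not taken from the paper):

    \[
        \log \lambda(a, p) = f(a) + g(p) + h(c), \qquad c = p - a,
    \]

where \(\lambda(a, p)\) is the incidence rate at age \(a\) in calendar period \(p\) and \(c\) indexes birth cohort. Because \(c = p - a\) exactly, the three effect functions are identified only up to a linear trend, so an additional constraint must be imposed; the cohort effects \(h(c)\) then carry the famine-exposure contrast of interest, since exposure status is fixed by year of birth.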
58. Wang J, Li Y, Han X, Liu B, Hu H, Wang F, Li X, Yang K, Yuan J, Yao P, Miao X, Wei S, Wang Y, Liang Y, Zhang X, Guo H, Yang H, Hu FB, Wu T, He M. Exposure to the Chinese Famine in Childhood Increases Type 2 Diabetes Risk in Adults. J Nutr 2016; 146:2289-2295. [PMID: 27629572 DOI: 10.3945/jn.116.234575]
Abstract
BACKGROUND Evidence shows that exposure to poor conditions in early life is associated with an increased risk of chronic diseases in adults. OBJECTIVE We investigated whether exposure to the Chinese famine (1959-1961) in the fetal stage or in childhood (0-9 y) was associated with type 2 diabetes (T2D) and hyperglycemia in adulthood. METHODS We included 7801 subjects aged 56.4 ± 3.3 y from the Dongfeng-Tongji cohort. Subjects were classified into late-, middle-, and early-childhood-exposed, fetal-exposed, and unexposed groups. Excess mortality rate was used to evaluate the severity of famine. Logistic regression models were used to analyze the famine-dysglycemia associations. Generalized linear models were used to assess the famine effects on dysglycemia risk during the 5-y follow-up period among 3100 subjects. RESULTS In descriptive analyses, the risk of T2D was significantly greater in the middle-childhood-exposed group (OR: 1.44; 95% CI: 1.10, 1.87; P = 0.007), and the risk of hyperglycemia was higher in the middle- and late-childhood-exposed groups than in the unexposed group (OR: 1.54; 95% CI: 1.26, 1.88 and OR: 1.51; 95% CI: 1.23, 1.85, respectively). In sex-specific analyses, women exposed in middle childhood (OR: 1.55; 95% CI: 1.16, 2.06) and late childhood (OR: 1.40; 95% CI: 1.05, 1.87) had a higher risk of T2D than unexposed women. This association was not found in men. Similar associations were found for hyperglycemia risk. Moreover, subjects who experienced severe famine in childhood had a 38% higher T2D risk (95% CI: 1.05, 1.81) than those exposed to less severe famine. In retrospective cohort analyses, participants who experienced famine in middle childhood had a higher hyperglycemia risk relative to the unexposed group (RR: 2.06; 95% CI: 1.08, 3.90). CONCLUSION Exposure to the Chinese famine in childhood was related to an increased risk of adulthood T2D and hyperglycemia, particularly in women.
59. Kobyliansky E, Torchinsky D, Kalichman L, Karasik D. Leukocyte telomere length pattern in a Chuvash population that experienced mass famine in 1922-1923: a retrospective cohort study. Am J Clin Nutr 2016; 104:1410-1415. [PMID: 27733399 DOI: 10.3945/ajcn.116.138040]
Abstract
BACKGROUND To our knowledge, there are no experimental studies that have addressed the effects of starvation on the maintenance of telomere length, and the two epidemiologic studies that have addressed this topic yielded conflicting results. OBJECTIVE We characterized leukocyte telomere length (LTL) in a Chuvash population comprising survivors of the mass famine of 1922-1923 and these survivors' descendants. DESIGN The tested cohort consisted of native Chuvash men (n = 687) and women (n = 647) who were born between 1909 and 1980 and resided in small villages of the Chuvash Republic of the Russian Federation. Data were gathered during three expeditions undertaken in 1994, 1999, and 2002. This sampling design allowed us to treat age and birth year as independent variables, i.e., after adjustment for age, to analyze how LTL correlates with birth year across the interval 1909-1980. Telomere length was measured in DNA from peripheral blood leukocytes with a quantitative polymerase chain reaction technique. RESULTS The main observations were as follows: 1) leukocyte telomeres were shorter in men born after 1923 (i.e., after the mass famine) than in men born before 1922 (i.e., before the mass famine); 2) shorter telomeres were stably inherited by men of ensuing generations; and 3) LTL showed no correlation with birth year in women. CONCLUSIONS Our study does not provide direct evidence of leukocyte telomere shortening in famine survivors. However, the comparative analysis of LTL in the survivors and their descendants suggests that such an effect did take place. The study also implies that mass famine may be associated with telomere shortening in male descendants of famine survivors. This observation is in agreement with the "thrifty telomere hypothesis," which predicts that longer telomeres are disadvantageous in nutritionally marginal environments.
60. Welburn SC, Molyneux DH, Maudlin I. Beyond Tsetse – Implications for Research and Control of Human African Trypanosomiasis Epidemics. Trends Parasitol 2016; 32:230-241. [PMID: 26826783 DOI: 10.1016/j.pt.2015.11.008]
Abstract
Epidemics of both forms of human African trypanosomiasis (HAT) are confined to spatially stable foci in Sub-Saharan Africa, while tsetse distribution is widespread. Infection rates of Trypanosoma brucei gambiense in tsetse are extremely low and cannot account for the catastrophic epidemics of Gambian HAT (gHAT) seen over the past century. Here we examine the origins of gHAT epidemics and the evidence implicating human genetics in HAT epidemiology. We discuss how stress can cause the breakdown of heritable tolerance in silent disease carriers, generating gHAT outbreaks, and consider how peculiarities in the epidemiologies of gHAT and Rhodesian HAT (rHAT) affect strategies for disease control.
61.
Abstract
OBJECTIVES Early-life adversity has been shown to be associated with cardiovascular disease and mortality in later life, but little is known about the mechanisms that underlie this association. Prenatal undernutrition, a severe early-life stressor, is associated with double the risk of coronary heart disease and increased blood pressure responses to psychological stress. In the present study, we tested the hypothesis that prenatal undernutrition induces alterations in the autonomic nervous system, which may increase the risk of developing heart disease. METHODS We studied autonomic function in 740 men and women (mean [SD] age, 58 [0.9] years) who were members of the Dutch famine birth cohort. We compared those exposed to famine during early (n = 64), mid (n = 107), or late gestation (n = 127) to those unexposed to famine in utero (n = 442). Participants underwent a series of 3 psychological stressors (Stroop, mirror tracing, and speech) while their blood pressure and heart rate were recorded continuously. RESULTS Data had sufficient quality in 602 participants for derivation of autonomic function indices by spectral analysis. The stress protocol led to significant sample-level changes in systolic blood pressure, heart rate, and all cardiovascular control measures (all p values < .001). None of the autonomic function parameters, at rest or in response to stress, differed significantly (all p values > .050) according to prenatal famine exposure. CONCLUSIONS Prenatal undernutrition was not associated with autonomic function in late adulthood. We conclude that altered autonomic function does not seem to explain our previous findings of increased coronary heart disease risk among those exposed to famine prenatally.
62. Akerkar S. Development of a normative framework for disaster relief: learning from colonial famine histories in India. Disasters 2015; 39 Suppl 2:219-243. [PMID: 26395110 DOI: 10.1111/disa.12155]
Abstract
Contemporary academic debates on the history of the colonial Famine Codes in India, also considered the first coded and institutionalised normative frameworks for natural disaster response on the subcontinent, generally rest on one of two perspectives. The first focuses on their economic rationale, whereas the second holds that they constitute an anti-famine contract between the colonial masters and the people of India. This paper demonstrates that both of these viewpoints are limited in scope and that they simplify the nature of governance instituted through famine response practices in colonial India. It links this history to current disaster response policies and practices in India and shows that the discussion on the development of normative frameworks underlying disaster response is far from over. The paper goes on to evaluate the development of normative frameworks for disaster response and recovery, which remains embroiled in the politics of governmentality.
63. Fuller P. Changing disaster relief regimes in China: an analysis using four famines between 1876 and 1962. Disasters 2015; 39 Suppl 2:146-165. [PMID: 26395106 DOI: 10.1111/disa.12152]
Abstract
Once afflicted by frequent episodes of famine, China, and particularly the Chinese state, is growing in importance as a player in the overseas aid and development sector. This paper examines four famines in modern China (defined as the period since the First Opium War of 1839-42) to shed light on the changing nature of state involvement in disaster relief in the country, while also demonstrating the breadth and diversity of relief agency in the past. It makes the case that traditional disaster relief principles and methods were active well into the twentieth century, and that the statist model of today's People's Republic is not an essential characteristic of Chinese humanitarian organisation. Rather, the extent to which the Chinese state will continue to assume a dominant role in the country's re-emerging civic and charity sector is, as in earlier times, a function of the political developments and struggles that lie ahead.
64. Ekamper P, van Poppel F, Stein AD, Bijwaard GE, Lumey LH. Prenatal famine exposure and adult mortality from cancer, cardiovascular disease, and other causes through age 63 years. Am J Epidemiol 2015; 181:271-279. [PMID: 25632050 DOI: 10.1093/aje/kwu288]
Abstract
Nutritional conditions in early life may affect adult health, but prior studies of mortality have been limited to small samples. We evaluated the relationship between pre-/perinatal famine exposure during the Dutch Hunger Winter of 1944-1945 and mortality through age 63 years among 41,096 men born in 1944-1947 and examined at age 18 years for universal military service in the Netherlands. Of these men, 22,952 had been born around the time of the Dutch famine in 6 affected cities; the remainder served as unexposed controls. Cox proportional hazards models were used to estimate hazard ratios for death from cancer, heart disease, other natural causes, and external causes. After 1,853,023 person-years of follow-up, we recorded 1,938 deaths from cancer, 1,040 from heart disease, 1,418 from other natural causes, and 523 from external causes. We found no increase in mortality from cancer or cardiovascular disease after prenatal famine exposure. However, there were increases in mortality from other natural causes (hazard ratio = 1.24, 95% confidence interval: 1.03, 1.49) and external causes (hazard ratio = 1.46, 95% confidence interval: 1.09, 1.97) after famine exposure in the first trimester of gestation. Further follow-up of the cohort is needed to provide more accurate risk estimates of mortality from specific causes of death after nutritional disturbances during gestation and very early life.
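[Editor's note] For orientation, a minimal sketch of the kind of cause-specific Cox proportional hazards model described above, using the lifelines library; the data layout and column names (years, died, trim1, birth_year) are hypothetical, and the published models include covariates not shown here.

    # Cause-specific Cox model: hazard of death from one cause as a function of
    # first-trimester famine exposure, with deaths from other causes censored.
    import pandas as pd
    from lifelines import CoxPHFitter

    df = pd.read_csv("cohort.csv")  # hypothetical file: one row per man

    cph = CoxPHFitter()
    cph.fit(df[["years", "died", "trim1", "birth_year"]],
            duration_col="years",  # follow-up time from the age-18 examination
            event_col="died")      # 1 = death from the cause under study
    cph.print_summary()            # hazard ratios are exp(coef)

Fitting one such model per cause (cancer, heart disease, other natural causes, external causes) reproduces the structure of the analysis, though not its exact specification.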
65. Huang C, Guo C, Nichols C, Chen S, Martorell R. Elevated levels of protein in urine in adulthood after exposure to the Chinese famine of 1959-61 during gestation and the early postnatal period. Int J Epidemiol 2014; 43:1806-1814. [PMID: 25298393 DOI: 10.1093/ije/dyu193]
Abstract
BACKGROUND Animal models have suggested that undernutrition during gestation and the early postnatal period may adversely affect kidney development and compromise renal function. As natural experiments, famines provide an opportunity to test such potential effects in humans. We assessed whether exposure to the Chinese famine of 1959-1961 during gestation and early postnatal life was associated with levels of proteinuria among female adults three decades after the famine. METHODS We measured famine intensity using the cohort size shrinkage index and constructed a difference-in-difference model to compare levels of proteinuria, measured with a dipstick test of random urine specimens, among Chinese women (n = 70,543) whose famine exposure status varied across birth cohorts (born before, during or after the famine) and counties of residence with different degrees of famine intensity. RESULTS Famine exposure was associated with a greater risk [odds ratio (OR) = 1.54; 95% confidence interval (CI): 1.04, 2.28; P = 0.029] of having a higher level of proteinuria among women born during the famine years (1959-61) compared with the unexposed post-famine-born cohort (1964-65) in rural samples. No association was observed among urban samples. Results were robust to adjustment for covariates. CONCLUSIONS Severe undernutrition during gestation and the early postnatal period may have long-term effects on levels of proteinuria in humans, but the effect sizes may be small.
66. Mendis N, Lin YR, Faucher SP. Comparison of virulence properties of Pseudomonas aeruginosa exposed to water and grown in rich broth. Can J Microbiol 2014; 60:777-781. [PMID: 25352257 DOI: 10.1139/cjm-2014-0519]
Abstract
Pseudomonas aeruginosa is an opportunistic pathogen that can infect susceptible patients suffering from cystic fibrosis, immunosuppression, and severe burns. Nosocomial- and community-acquired infection is likely due to contact with water sources contaminated with P. aeruginosa. Most of what is known about the virulence properties of P. aeruginosa was derived from studies using fairly rich broths, which do not represent conditions found in water, such as low nutrient concentrations. Here, we compare biofilm production, invasion of epithelial cells, cytotoxicity, and pyocyanin production of P. aeruginosa in water with P. aeruginosa grown in rich broth. Since tap water is variable, we used a defined water medium, Fraquil, to ensure reproducibility of the results. We found that P. aeruginosa does not readily form biofilm in Fraquil. Pseudomonas aeruginosa is equally able to attach to and invade epithelial cells but is more cytotoxic after incubation in water for 30 days than when it is grown in rich broth. Moreover, P. aeruginosa produces less pyocyanin when exposed to water. Our results show that P. aeruginosa seems to have different properties when exposed to water than when grown in rich broth.
67. Navarro-Colorado C, Mahamud A, Burton A, Haskew C, Maina GK, Wagacha JB, Ahmed JA, Shetty S, Cookson S, Goodson JL, Schilperoord M, Spiegel P. Measles outbreak response among adolescent and adult Somali refugees displaced by famine in Kenya and Ethiopia, 2011. J Infect Dis 2014; 210:1863-1870. [PMID: 25117754 DOI: 10.1093/infdis/jiu395]
Abstract
BACKGROUND The refugee complexes of Dadaab, Kenya, and Dollo-Ado, Ethiopia, experienced measles outbreaks during June-November 2011, following a large influx of refugees from Somalia. METHODS Line-lists from health facilities were used to describe the outbreaks in terms of age, sex, vaccination status, arrival date, attack rates (ARs), and case fatality ratios (CFRs) for each camp. Vaccination data and coverage surveys were reviewed. RESULTS In Dadaab, 1370 measles cases and 32 deaths (CFR, 2.3%) were reported. A total of 821 cases (60.1%) were aged ≥15 years, 906 (82.1%) had arrived in the camps in 2011, and 1027 (79.6%) were unvaccinated. Camp-specific ARs ranged from 212 to 506 cases per 100,000 people. In Dollo-Ado, 407 cases and 23 deaths (CFR, 5.7%) were reported. Adults aged ≥15 years accounted for 178 cases (43.7%) and 6 deaths (26.0%). Camp-specific ARs ranged from 21 to 1100 cases per 100,000 people. Immunization activities that were part of the outbreak responses initially targeted children aged 6 months to 14 years and were later expanded to include individuals up to 30 years of age. CONCLUSIONS The target age group for outbreak-response immunization activities at the start of the outbreaks was inconsistent with the numbers of cases among unvaccinated adolescents and adults in the new population. When populations are displaced from areas affected by measles outbreaks, health authorities should consider vaccinating adults in routine and outbreak-response activities.
68. Kim DJ, Yoo HS, Lee H. [Effects of the periodical spread of rinderpest on famine, epidemic, and tiger disasters in the late 17th century]. Uisahak 2014; 23:1-56. [PMID: 24804681 PMCID: PMC10565085 DOI: 10.13081/kjmh.2014.23.1]
Abstract
This study traces the causes of the recurrent rinderpest outbreaks, epidemics, famines, and tiger disasters recorded in the Joseon Dynasty Chronicle and the Seungjeongwon Journals during the great catastrophes of the late 17th century, the Gyeongsin famine (1670-1671) and the Eulbyeong famine (1695-1696), treating them as biological exchanges set in motion by the new arrival of rinderpest in the early 17th century. It thereby challenges existing studies that interpret these catastrophes as evidence of a "little ice age." Rinderpest spread from parts of Manchuria in May 1636 through Joseon, where it raged nationwide, and onward to western Japan. The newly arrived disease became indigenized in Joseon, where it was localized and broke out periodically as it adjusted to changes in the population of immune cattle, governed by their life spans and reproduction rates. Because the strain that arrived in the early 17th century was highly pathogenic and its high mortality persisted into the late 17th century, it recurred in periodic waves; by contrast, long-indigenized human epidemics such as smallpox and measles had circulated continuously since much earlier times. When the newly indigenizing rinderpest and the long-indigenized human epidemics unexpectedly overlapped, the human toll of epidemics changed markedly. Outbreaks of rinderpest caused famine through the loss of farming cattle; famine in turn spread epidemics among people; and human losses led to the neglect of farming cattle, which again fueled outbreaks of rinderpest. As sources of infection multiplied and hosts with low immunity increased, more human lives and more farming cattle were lost, culminating in great famine. The periodic outbreaks of rinderpest, alongside the routine prevalence of various epidemics in the 17th century, also affected domestic and wild animals, and famines occurred on a scale incomparable with earlier ones: people facing famine could no longer care for their animals, and the growing number of neglected cattle may in turn have intensified livestock epidemics such as rinderpest. In the great Gyeongsin and Eulbyeong famines of the late 17th century, rinderpest, human epidemics, and famine became linked and mutually reinforcing; the recurring cycle of epidemics-famines-rinderpest-great famines formed a larger cycle with synergistic effects, resulting in eco-economic-historical catastrophes accompanied by large-scale casualties. The Gyeongsin and Eulbyeong famines are therefore better understood as events produced by the recurrence of these periodic disaster factors in 1670-1671 and 1695-1696, and in particular by biological exchanges centered on rinderpest, than as "little ice age" phenomena driven by relatively long-term cooling.
69. Ekamper P, van Poppel F, Stein AD, Lumey LH. Independent and additive association of prenatal famine exposure and intermediary life conditions with adult mortality between age 18-63 years. Soc Sci Med 2013; 119:232-239. [PMID: 24262812 DOI: 10.1016/j.socscimed.2013.10.027]
Abstract
OBJECTIVES To quantify the relation between prenatal famine exposure and adult mortality, taking into account mediating effects of intermediary life conditions. DESIGN Historical follow-up study. SETTING The Dutch famine (Hunger Winter) of 1944-1945, which occurred towards the end of World War II in the occupied Netherlands. STUDY POPULATION From 408,015 Dutch males born in 1944-1947 and examined for military service at age 18, we selected for follow-up all men born at the time of the famine in six affected cities in the western Netherlands (n = 25,283), and samples of unexposed time (n = 10,667) and place (n = 9,087) controls. These men were traced and followed for mortality through the national population and death record systems. OUTCOME MEASURE All-cause mortality between ages 18 and 63 years, analysed using Cox proportional hazards models adjusted for intermediary life conditions. RESULTS An increase in mortality was seen after famine exposure in early gestation (HR 1.12; 95% confidence interval (CI): 1.01-1.24) but not late gestation (HR 1.04; 95% CI: 0.96-1.13). Among intermediary life conditions at age 18 years, educational level was inversely associated with mortality, and mortality was elevated in men whose fathers had manual rather than non-manual occupations (HR 1.08; CI: 1.02-1.16) and in men who were declared unfit for military service (HR 1.44; CI: 1.31-1.58). Associations of intermediate factors with mortality were independent of famine exposure in early life, and associations between prenatal famine exposure and adult mortality were independent of social class and education at age 18. CONCLUSIONS Timing of exposure in relation to the stage of pregnancy may be of critical importance for later health outcomes, independent of intermediary life conditions.
70. Food Security-A Commentary: What Is It and Why Is It So Complicated? Foods 2012; 1:18-27. [PMID: 28239088 PMCID: PMC5302220 DOI: 10.3390/foods1010018]
Abstract
Every year over 10 million people die of hunger and hunger-related diseases. Nearly six million of these are children under the age of five; that is one child's death approximately every six seconds. Understanding how this still occurs amid the ever-increasing social enlightenment of the 21st century, and under the auspices of a vigilant global development community, is one of the key challenges of our time. The science of food security aims to address such concerns. By understanding the multiplicity of the phenomenon, practitioners of global multilateral hegemony seek to shape appropriate policy to address these issues. The difficulty, however, is that the phenomenon is increasingly wrapped up inside an ever-growing bundle of societal aspirations including, inter alia, under-nutrition, poverty, sustainability, free trade, national self-sufficiency and reducing female subjugation. Any solutions therefore involve fully understanding just what is included, implied, understood or excluded within the food security catchall. Until such time as consensus can be found that adequately binds the phenomenon within a fixed, delineated concept, current efforts to address the multitude of often divergent threads only serve to dilute effort and confound attempts to bring these unacceptable figures under control once and for all.
71. Ocho DL, Struik PC, Price LL, Kelbessa E, Kolo K. Assessing the levels of food shortage using the traffic light metaphor by analyzing the gathering and consumption of wild food plants, crop parts and crop residues in Konso, Ethiopia. J Ethnobiol Ethnomed 2012; 8:30. [PMID: 22871123 PMCID: PMC3502140 DOI: 10.1186/1746-4269-8-30]
Abstract
BACKGROUND Humanitarian relief agencies use scales to assess levels of critical food shortage in order to target and allocate food efficiently to the neediest. These scales are often labor-intensive. A lesser-used approach is assessing the gathering and consumption of wild food plants. Such gathering per se is not a reliable signal of emerging food stress, but the gathering and consumption of certain specific plant species can be considered markers of food shortage, as they indicate that people are compelled to eat very poor or even health-threatening food. METHODS We used the traffic light metaphor to indicate normal (green), alarmingly low (amber) and fully depleted (red) food supplies, and identified these conditions for Konso (Ethiopia) on the basis of the wild food plants (WFPs), crop parts not used for human consumption under normal conditions (CPs) and crop residues (CRs) being gathered and consumed. Plant specimens were collected for expert identification and deposition in the National Herbarium. Two hundred twenty individual households free-listed WFPs, CPs and CRs gathered and consumed during times of food stress. The species list from the free-listing, further enriched through key informant interviews and our own field observations, was categorized through focus group discussions into species used under green, amber and red conditions. RESULTS The study identified 113 WFPs (120 products/food items) whose gathering and consumption reflect the three traffic light conditions: 25 food items for red, 30 for amber and 65 for green. We also obtained reliable information on 21 products/food items (from 17 crops) normally not consumed as food, reflecting red or amber conditions, and on 10 crop residues (from various crops) plus one recycled food item used as emergency foods in the study area, clearly indicating the severity of food stress (red) that households are dealing with. The traffic light metaphor proved useful for identifying and closely monitoring the types of WFPs, CPs and CRs collected and consumed, and their time of collection, by subsistence households in rural settings. Examples of plant material consumed only under severe food stress included WFPs with health-threatening features such as Dobera glabra (Forssk.) Juss. ex Poir. and inkutayata, parts of 17 crops with 21 food items conventionally not used as food (for example, maize tassels, husks and empty pods), ten crop residues (for example, bran from various crops) and one recycled food item (tata). CONCLUSIONS We have complemented the conventional seasonal food security assessment tool used by humanitarian partners by providing an easy, cheap tool for grading the food stress encountered by subsistence farmers. In cognizance of the environmental and socio-cultural differences within Ethiopia and across the globe, we recommend analogous studies elsewhere in Ethiopia and in other parts of the world where recurrent food stress occurs and where communities intensively use WFPs, CPs and CRs to cope with it.
72. Kim JJ, Guha-Sapir D. Famines in Africa: is early warning early enough? Glob Health Action 2012; 5:18481. [PMID: 22745628 PMCID: PMC3384989 DOI: 10.3402/gha.v5i0.18481]
Abstract
Following the second Sahelian famine in 1984-1985, major investments were made to establish early warning systems. These systems help ensure that timely warnings and vulnerability information are available to decision makers to anticipate and avert food crises. In the recent crisis in the Horn of Africa, alarming levels of acute malnutrition were documented from March 2010, and by August 2010 an impending food crisis was forecast. Despite these warnings, the situation went largely unheeded and deteriorated further, with malnutrition growing in severity and scope. By the time the United Nations officially declared famine on 20 July 2011 and the humanitarian community sluggishly went into response mode, levels of malnutrition and mortality had exceeded catastrophic thresholds. At that time, an estimated 11 million people were in desperate and immediate need of food. With warnings of food crises in the Sahel and South Sudan, and forecasts of drought returning to the Horn, there is an immediate need to institutionalize change in the health response during humanitarian emergencies. Early warning systems are only effective if they trigger an early response.
73. Brown AS. The environment and susceptibility to schizophrenia. Prog Neurobiol 2011; 93:23-58. [PMID: 20955757 PMCID: PMC3521525 DOI: 10.1016/j.pneurobio.2010.09.003]
Abstract
In the present article the putative role of environmental factors in schizophrenia is reviewed and synthesized. Accumulating evidence from recent studies suggests that environmental exposures may play a more significant role in the etiopathogenesis of this disorder than previously thought. This expanding knowledge base is largely a consequence of refinements in the methodology of epidemiologic studies, including birth cohort investigations, and in preclinical research that has been inspired by the evolving literature on animal models of environmental exposures. This paper is divided into four sections. In the first, the descriptive epidemiology of schizophrenia is reviewed. This includes general studies on incidence, prevalence, and differences in these measures by urban-rural, neighborhood, migrant, and season of birth status, as well as time trends. In the second section, we discuss the contribution of environmental risk factors acting during fetal and perinatal life; these include infections [e.g. rubella, influenza, Toxoplasma gondii (T. gondii), herpes simplex virus type 2 (HSV-2)], nutritional deficiencies (e.g., famine, folic acid, iron, vitamin D), paternal age, fetal/neonatal hypoxic and other obstetric insults and complications, and other exposures [e.g. lead, rhesus (Rh) incompatibility, maternal stress]. Other putative neurodevelopmental determinants, including cannabis, socioeconomic status, trauma, and infections during childhood and adolescence are also covered. In the third section, these findings are synthesized and their implications for prevention and uncovering biological mechanisms, including oxidative stress, apoptosis, and inflammation, are discussed. Animal models, including maternal immune activation, have yielded evidence suggesting that these exposures cause brain and behavioral phenotypes that are analogous to findings observed in patients with schizophrenia. In the final section, future studies including new, larger, and more rigorous epidemiologic investigations, and research on translational and clinical neuroscience, gene-environment interactions, epigenetics, developmental trajectories and windows of vulnerability, are elaborated upon. These studies are aimed at confirming observed risk factors, identifying new environmental exposures, elucidating developmental mechanisms, and shedding further light on genes and exposures that may not be identified in the absence of these integrated approaches. The study of environmental factors in schizophrenia may have important implications for the identification of causes and prevention of this disorder, and offers the potential to complement, and refine, existing efforts on explanatory neurodevelopmental models.
74.
Abstract
Regional variations are observed in the outcome of schizophrenia, but the reasons remain unclear. The outcome of schizophrenia is reported to be better in India. In this report, based on census data, we highlight the substantially greater mortality observed among the mentally ill than among the general population during famines in India in the 19th century. A possible selection against the most severe forms of schizophrenia could account for a greater occurrence of better-outcome phenotypes. Population histories and environmental influences, including epigenetics, need to be considered to further investigate differences between schizophrenia phenotypes.
75. Xu MQ, Sun WS, Liu BX, Feng GY, Yu L, Yang L, He G, Sham P, Susser E, St. Clair D, He L. Prenatal malnutrition and adult schizophrenia: further evidence from the 1959-1961 Chinese famine. Schizophr Bull 2009; 35:568-576. [PMID: 19155344 PMCID: PMC2669578 DOI: 10.1093/schbul/sbn168]
Abstract
OBJECTIVE Evidence from the 1944-1945 Dutch Hunger Winter and the 1959-1961 Chinese famine suggests that those conceived or in early gestation during famine have a 2-fold increased risk of developing schizophrenia in adult life. We tested this hypothesis in a second Chinese population and also determined whether risk differed between urban and rural areas. METHOD The risk of schizophrenia was examined in Liuzhou prefecture of the Guangxi autonomous region. Rates were compared among those conceived before, during, and after the famine years. Based on the decline in birth rates, we predicted that those born in 1960 and 1961 would have been exposed to the famine during conception or early gestation. All psychiatric case records in the Liuzhou psychiatric hospital for the years 1971 through 2001 were examined, and clinical/sociodemographic data were extracted by psychiatrists blind to exposure status. Data on births and deaths in the famine years were also available, and cumulative mortality was estimated from later demographic surveys. Evidence of famine was verified, and results were adjusted for mortality. Relative risks (RRs) for schizophrenia were calculated for the region as a whole and for urban and rural areas separately. RESULTS The mortality-adjusted RR for schizophrenia was 1.50 for those born in 1960 and 2.05 for those born in 1961. However, the effect was confined to the rural areas: RR = 1.68 (1960) and RR = 2.25 (1961). CONCLUSIONS We observe a 2-fold increased risk of schizophrenia among those conceived or in early gestation at the height of famine, with risk related to the severity of famine conditions.