51
Abstract
BACKGROUND Adenocarcinoma of the lung, once considered minimally related to cigarette smoking, has become the most common type of lung cancer in the United States. The increased incidence of this cancer might be explained by advances in diagnostic technology (i.e., increased ability to perform biopsies on tumors in smaller, more distal airways), changes in cigarette design (e.g., the adoption of filter tips), or changes in smoking practices. We examined data from the Connecticut Tumor Registry and two American Cancer Society studies to explore these possibilities. METHODS Connecticut Tumor Registry data from 1959 through 1991 were analyzed to determine whether the increase in lung adenocarcinoma observed during that period could be best described by birth cohort effects (i.e., generational changes in cigarette smoking) or calendar period effects (i.e., diagnostic advances). Associations between cigarette smoking and death from specific types of lung cancer during the first 2 years of follow-up in Cancer Prevention Study I (CPS-I, initiated in 1959) and Cancer Prevention Study II (CPS-II, initiated in 1982) were also examined. RESULTS Adenocarcinoma incidence in Connecticut increased nearly 17-fold in women and nearly 10-fold in men from 1959 through 1991. The increases followed a clear birth cohort pattern, paralleling gender and generational changes in smoking more than diagnostic advances. Cigarette smoking became more strongly associated with death from lung adenocarcinoma in CPS-II compared with CPS-I, with relative risks of 19.0 (95% confidence interval [CI] = 8.3-47.7) for men and 8.1 (95% CI = 4.5-14.6) for women in CPS-II and 4.6 (95% CI = 1.7-12.6) for men and 1.5 (95% CI = 0.3-7.7) for women in CPS-I. CONCLUSIONS The increase in lung adenocarcinoma since the 1950s is more consistent with changes in smoking behavior and cigarette design than with diagnostic advances.
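The relative risks and confidence intervals quoted above are standard cohort measures. As a rough sketch, with hypothetical counts rather than the CPS data, a relative risk and its Wald-type 95% CI can be computed as:

```python
import math

def relative_risk_ci(a, n1, b, n0, z=1.96):
    """Relative risk of death for exposed vs. unexposed in a cohort,
    with a Wald-type 95% confidence interval on the log scale.
    a deaths among n1 exposed; b deaths among n0 unexposed."""
    rr = (a / n1) / (b / n0)
    se_log_rr = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n0)
    lower = math.exp(math.log(rr) - z * se_log_rr)
    upper = math.exp(math.log(rr) + z * se_log_rr)
    return rr, lower, upper

# Hypothetical: 30 deaths among 1,000 smokers vs. 10 among 1,000 never-smokers
rr, lo, hi = relative_risk_ci(30, 1000, 10, 1000)  # rr = 3.0
```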
Affiliation(s)
- M J Thun
- Epidemiology and Surveillance Research, American Cancer Society, Atlanta, GA 30329-4251, USA.

52
Abstract
With advances in molecular genetic technology, more studies will examine gene-environment interaction in disease etiology. If the primary purpose of the study is to estimate the effect of gene-environment interaction in disease etiology, one can do so without employing controls. The case-only design has been promoted as an efficient and valid method for screening for gene-environment interaction. The authors derive a method for estimating sample size requirements, present sample size estimates, and compare the minimum sample sizes required to detect gene-environment interaction in case-only studies with those required in case-control studies. Assuming independence between exposure and genotype in the population, the authors show that the case-only design is more efficient than a case-control design in detecting gene-environment interaction. They also illustrate a method to estimate sample size when information on marginal effects (relative risk) of exposure and genotype is available from previous studies.
Affiliation(s)
- Q Yang
- Epidemic Intelligence Service, Centers for Disease Control and Prevention, Atlanta, GA, USA

53
Flanders WD, Lin L, Pirkle J, Caudill S. The authors reply. Am J Epidemiol 1997. DOI: 10.1093/oxfordjournals.aje.a009357
54
Freedman DS, Gates L, Flanders WD, Van Assendelft OW, Barboriak JJ, Joesoef MR, Byers T. Black/white differences in leukocyte subpopulations in men. Int J Epidemiol 1997; 26:757-64. PMID: 9279607. DOI: 10.1093/ije/26.4.757
Abstract
BACKGROUND Although counts of leukocytes differ substantially between blacks and whites, and are predictive of ischaemic heart disease (IHD), racial differences in counts of leukocyte subpopulations have received less attention. METHODS We examined black/white differences in leukocyte subpopulations among 3467 white and 493 black men aged 31-45 years who had previously served in the US Army. Laboratory determinations were performed at a central location during 1985-1986. RESULTS Black men had an 840 cell/microliter (or 15%) lower mean total leukocyte count than did white men, largely due to a 960 cell/microliter (or 25%) lower mean neutrophil count. Although black men also had a 20% lower mean monocyte count (approximately 70 cells/microliter) than did white men, their mean lymphocyte count was 10% higher (approximately 200 cells/microliter). Counts of various leukocyte subpopulations were associated with cigarette smoking, haemoglobin levels, platelet counts, and several other characteristics, but black/white differences in counts of neutrophils, lymphocytes, monocytes and other subpopulations could not be attributed to any of the examined covariates. CONCLUSIONS Despite the relatively low counts of leukocytes and neutrophils among black men, their lymphocyte counts are generally higher than those among white men. It is possible that black/white differences in counts of various cell types may influence race-specific rates of IHD, and future studies should attempt to assess the importance of leukocyte subpopulations in the development of clinical disease.
Affiliation(s)
- D S Freedman
- National Center for Chronic Disease Prevention and Health Promotion, Centers for Disease Control, Atlanta, GA 30341-3724, USA

55
Malilay J, Flanders WD, Brogan D. [A modified cluster-sampling method for rapid needs assessment after a disaster]. Rev Panam Salud Publica 1997. Spanish. DOI: 10.1590/s1020-49891997000700002
56
Esteban E, Rubin CH, McGeehin MA, Flanders WD, Baker MJ, Sinks TH. Evaluation of Infant Diarrhea Associated with Elevated Levels of Sulfate in Drinking Water: A Case-Control Investigation in South Dakota. Int J Occup Environ Health 1997; 3:171-176. PMID: 9891115. DOI: 10.1179/oeh.1997.3.3.171
Abstract
The objective of this study was to assess the association between infant diarrhea and ingestion of water containing elevated sulfate levels. The authors identified 274 mothers of infants born in 19 South Dakota counties with high water sulfate concentrations. Demographic information and seven-day-recall dietary and health data were obtained by telephone interviews. Sulfate in drinking water was measured from samples submitted by the participants. Logistic regression was used to estimate the risk for diarrhea (≥3 loose stools in 24 hours). Questionnaires were completed for 274 households: 69% drank municipal water and 54% reported using the water in the infants' diets. Thirty-nine infants developed diarrhea. Of the 170 households that submitted water samples, 141 (83%) were using the water in the infants' diets. The median sulfate level of the water samples was 264 mg/L. Twenty-five of the infants developed diarrhea. Average infant daily sulfate intake was not significantly associated with an increased diarrhea rate. There was no significant association between sulfate ingestion and the incidence of diarrhea for the range of sulfate levels studied. There was no evidence of a dose-response or threshold effect. However, because of the small number of the most highly exposed infants, the possibility of such an association should be further evaluated.
Affiliation(s)
- E Esteban
- CDC-NCEH, 4770 Buford Highway (F46), Atlanta, GA 30341-3724, USA

57
Abstract
Risk assessment is the process of estimating the likelihood that an adverse effect may result from exposure to a specific health hazard. The process traditionally involves hazard identification, dose-response assessment, exposure assessment, and risk characterization to answer "How many excess cases of disease A will occur in a population of size B due to exposure to agent C at dose level D?" For natural hazards, however, we modify the risk assessment paradigm to answer "How many excess cases of outcome Y will occur in a population of size B due to natural hazard event E of severity D?" Using a modified version involving hazard identification, risk factor characterization, exposure characterization, and risk characterization, we demonstrate that epidemiologic modeling and measures of risk can quantify the risks from natural hazard events. We further extend the paradigm to address mitigation, the equivalent of risk management, to answer "What is the risk for outcome Y in the presence of prevention intervention X relative to the risk for Y in the absence of X?" We use the preventable fraction to estimate the efficacy of mitigation, or reduction in adverse health outcomes as a result of a prevention strategy under ideal circumstances, and further estimate the effectiveness of mitigation, or reduction in adverse health outcomes under typical community-based settings. By relating socioeconomic costs of mitigation to measures of risk, we illustrate that prevention effectiveness is useful for developing cost-effective risk management options.
Affiliation(s)
- J Malilay
- Disaster Assessment and Epidemiology Section, Centers for Disease Control and Prevention, Atlanta, Georgia 30341-3724, USA

58
Barrett DH, Resnick HS, Foy DW, Dansky BS, Flanders WD, Stroup NE. Combat exposure and adult psychosocial adjustment among U.S. Army veterans serving in Vietnam, 1965-1971. J Abnorm Psychol 1996; 105:575-81. PMID: 8952190. DOI: 10.1037/0021-843x.105.4.575
Abstract
This study investigated the relationship between combat exposure and adult antisocial behavior in a sample of 2,490 male Army veterans of the Vietnam War who completed questionnaires about their psychological functioning. After adjustment for history of childhood behavior problems, posttraumatic stress disorder diagnosis, and demographic and military characteristics, it was found that veterans who experienced high and very high levels of combat were twice as likely to report adult antisocial behavior as veterans with no or low levels of combat and were also more likely to meet criteria for antisocial personality disorder. The results indicate that exposure to traumatic events during late adolescence or early adulthood is associated with multiple adult adjustment problems in vocational, interpersonal, and societal functioning. Treatment focusing on the effects of the trauma is likely to be necessary but not sufficient for improving affected veterans' behavior.
Affiliation(s)
- D H Barrett
- Division of Environmental Hazards and Health Effects, National Center for Environmental Health, Centers for Disease Control and Prevention, Atlanta, Georgia 30341-3724, USA.

59
Abstract
Case-control studies using parents of case subjects as the control subjects provide an innovative way to study associations of genetic markers with disease risk. This approach, sometimes called the haplotype-relative risk method, has received recent attention because the use of parents as control subjects may reduce or eliminate the confounding associated with differences in race, ethnicity, or genetic background. We provide a new method for analysis of such case-parental control studies. The method of analysis is noniterative and yields simple estimates of risk ratios associated with genetic markers. It easily accommodates the situation in which data are available from only one parent. Although we illustrate the approach for a locus with two alleles, the analyses extend immediately to loci with multiple alleles.
Affiliation(s)
- W D Flanders
- Rollins School of Public Health, Emory University, Department of Epidemiology, Atlanta, GA 30322, USA

60
Letizia C, Kapik B, Flanders WD. Suicidal risk during controlled clinical investigations of fluvoxamine. J Clin Psychiatry 1996; 57:415-21. PMID: 9746450
Abstract
BACKGROUND Suicide is a serious risk in major depressive disorder. Paradoxical emergence of suicidal ideation or behavior during antidepressant treatment has been reported in isolated cases. An evaluation was undertaken to assess the risk of suicidality during treatment with fluvoxamine, a selective serotonin reuptake inhibitor. METHOD Meta-analyses were conducted on pooled data from double-blind, randomized, placebo-controlled, parallel-group clinical trials. The primary outcome measure was the suicide item of the Hamilton Rating Scale for Depression. Tests for emergence of substantial suicidal ideation and improvement or worsening in suicidal ideation were performed using the Mantel-Haenszel adjusted incidence difference. The Breslow-Day test was used to test for lack of homogeneity across trials. Secondary analysis, which consisted of Pearson's chi-square test, was used to confirm the Mantel-Haenszel result. RESULTS In comparison to placebo, fluvoxamine was associated with significantly greater improvement in suicidal ideation (p = .01) and significantly less worsening of suicidal ideation (p < .01). No differences were found in the emergence of substantial suicidal ideation. CONCLUSION These findings demonstrate that fluvoxamine is not associated with an increased risk of emergence of substantial suicidal thoughts among depressed patients. On the contrary, the results are suggestive of a protective effect of fluvoxamine upon the risk of suicidal ideation.
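The Mantel-Haenszel adjusted incidence difference pools per-trial differences in event proportions with weights that favor larger trials. A minimal sketch of the pooled point estimate, using hypothetical trial counts rather than the fluvoxamine data:

```python
def mh_incidence_difference(strata):
    """Mantel-Haenszel pooled incidence (risk) difference across trials.
    Each stratum is (events_drug, n_drug, events_placebo, n_placebo)."""
    numerator = denominator = 0.0
    for a, n1, b, n0 in strata:
        weight = n1 * n0 / (n1 + n0)  # standard MH weight for a risk difference
        numerator += weight * (a / n1 - b / n0)
        denominator += weight
    return numerator / denominator

# Two hypothetical trials: worsening events on drug vs. placebo
trials = [(3, 100, 8, 100), (2, 50, 5, 60)]
```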
Affiliation(s)
- C Letizia
- Solvay Pharmaceuticals, Marietta, GA 30062, USA

61
Khoury MJ, Flanders WD. Nontraditional epidemiologic approaches in the analysis of gene-environment interaction: case-control studies with no controls! Am J Epidemiol 1996; 144:207-13. PMID: 8686689. DOI: 10.1093/oxfordjournals.aje.a008915
Abstract
Although case-control studies are suitable for assessing gene-environment interactions, choosing appropriate control subjects is a valid concern in these studies. The authors review three nontraditional study designs that do not include a control group: 1) the case-only study, 2) the case-parental control study, and 3) the affected relative-pair method. In case-only studies, one can examine the association between an exposure and a genotype among case subjects only. Odds ratios are interpreted as a synergy index on a multiplicative scale, with independence assumed between the exposure and the genotype. In case-parental control studies, one can compare the genotypic distribution of case subjects with the expected distribution based on parental genotypes when there is no association between genotype and disease; the effect of a genotype can be stratified according to case subjects' exposure status. In affected relative-pair studies, the distribution of alleles identical by descent between pairs of affected relatives is compared with the expected distribution based on the absence of genetic linkage between the locus and the disease; the analysis can be stratified according to exposure status. Some or all of these methods have certain limitations, including linkage disequilibrium, confounding, assumptions of Mendelian transmission, an inability to measure exposure effects directly, and the use of a multiplicative scale to test for interaction. Nevertheless, they provide important tools to assess gene-environment interaction in disease etiology.
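In the case-only design described above, the synergy index is simply the odds ratio of the genotype-by-exposure cross-classification among cases. A sketch with hypothetical counts:

```python
def case_only_synergy_index(both, genotype_only, exposure_only, neither):
    """Case-only odds ratio: a multiplicative interaction (synergy) index.
    Valid only if genotype and exposure are independent in the source
    population; counts are cases cross-classified by the two factors."""
    return (both * neither) / (genotype_only * exposure_only)

# Hypothetical case series: 40 cases with both genotype and exposure,
# 20 with genotype only, 20 with exposure only, 40 with neither
synergy = case_only_synergy_index(40, 20, 20, 40)  # 4.0
```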
Affiliation(s)
- M J Khoury
- Division of Birth Defects and Developmental Disabilities, National Center for Environmental Health, Centers for Disease Control and Prevention, Atlanta, GA 30341-3724, USA

62
Abstract
BACKGROUND During a record-setting heat wave in Chicago in July 1995, there were at least 700 excess deaths, most of which were classified as heat-related. We sought to determine who was at greatest risk for heat-related death. METHODS We conducted a case-control study in Chicago to identify risk factors associated with heat-related death and death from cardiovascular causes from July 14 through July 17, 1995. Beginning on July 21, we interviewed 339 relatives, neighbors, or friends of those who died and 339 controls matched to the case subjects according to neighborhood and age. RESULTS The risk of heat-related death was increased for people with known medical problems who were confined to bed (odds ratio as compared with those who were not confined to bed, 5.5) or who were unable to care for themselves (odds ratio, 4.1). Also at increased risk were those who did not leave home each day (odds ratio, 6.7), who lived alone (odds ratio, 2.3), or who lived on the top floor of a building (odds ratio, 4.7). Having social contacts such as group activities or friends in the area was protective. In a multivariate analysis, the strongest risk factors for heat-related death were being confined to bed (odds ratio, 8.2) and living alone (odds ratio, 2.3); the risk of death was reduced for people with working air conditioners (odds ratio, 0.3) and those with access to transportation (odds ratio, 0.3). Deaths classified as due to cardiovascular causes had risk factors similar to those for heat-related death. CONCLUSIONS In this study of the 1995 Chicago heat wave, those at greatest risk of dying from the heat were people with medical illnesses who were socially isolated and did not have access to air conditioning. In future heat emergencies, interventions directed to such persons should reduce deaths related to the heat.
Affiliation(s)
- J C Semenza
- Epidemic Intelligence Service, Centers for Disease Control and Prevention, Atlanta, GA 30341-3724, USA

63
Abstract
Because of previously reported associations among the total leukocyte count, cigarette smoking, and risk of cardiovascular disease, we examined the relation of cigarette smoking to various leukocyte subpopulations among 3467 men aged 31 to 45 years. The median total leukocyte count was 36% higher (7840 vs. 5760 cells/microliter) among current cigarette smokers than among men who had never smoked. Both stratification and regression analyses were used to examine independent associations with leukocyte subpopulations. At equivalent counts of other subpopulations, CD4+ lymphocytes and neutrophils were the cell types most strongly associated with cigarette smoking; each standard deviation change in counts of these subpopulations increased the odds of current (vs. never) smoking by approximately threefold. Furthermore, whereas 15% of the 238 men with relatively low (< 25th percentile) counts of both neutrophils and CD4+ lymphocytes were cigarette smokers, 96% of the 249 men with relatively high counts of both subpopulations were current smokers. Counts of T lymphocytes also tended to be higher among the 32 men with self-reported ischemic heart disease than among other men. These results, along with previous reports of immunologically active T lymphocytes in atherosclerotic plaques, suggest that this subpopulation may be of particular interest in studies examining the relation of leukocytes to cardiovascular disease.
Affiliation(s)
- D S Freedman
- National Center for Chronic Disease Prevention and Health Promotion, Centers for Disease Control and Prevention, Atlanta, GA 30341-3724, USA

64
Abstract
To identify risk factors for uterine fibroids, a case-control design was used to analyze data from control subjects enrolled in the Cancer and Steroid Hormone Study. Case patients were 201 women who reported a history of uterine fibroids, and control subjects were 1503 women without fibroids, individually matched by age to case patients. Reporting of fibroids was more frequent among premenopausal women, women who had frequent Papanicolaou (Pap) smears, women who used oral contraceptives and had infrequent Pap smears, and women with higher education. Reporting of fibroids was less frequent among women with a lower body mass index who were current or long-time smokers.
Affiliation(s)
- A R Samadi
- School of Public Health, Emory University, Atlanta, GA 30329, USA

65
Yoon PW, Freeman SB, Sherman SL, Taft LF, Gu Y, Pettay D, Flanders WD, Khoury MJ, Hassold TJ. Advanced maternal age and the risk of Down syndrome characterized by the meiotic stage of chromosomal error: a population-based study. Am J Hum Genet 1996; 58:628-33. PMID: 8644722. PMCID: PMC1914585
Abstract
The identification of DNA polymorphisms makes it possible to classify trisomy 21 according to the parental origin and stage (meiosis I [MI], meiosis II [MII], or postzygotic mitotic) of the chromosomal error. Studying the effect of parental age on these subgroups could shed light on parental exposures and their timing. From 1989 through 1993, 170 infants with trisomy 21 and 267 randomly selected control infants were ascertained in a population-based, case-control study in metropolitan Atlanta. Blood samples for genetic studies were obtained from case infants and their parents. Using logistic regression, we independently examined the association between maternal and paternal age and subgroups of trisomy 21 defined by parental origin and meiotic stage. The distribution of trisomy 21 by origin was 86% maternal (75% MI and 25% MII), 9% paternal (50% MI and 50% MII), and 5% mitotic. Compared with women <25 years of age, women > or = 40 years old had an odds ratio of 5.2 (95% confidence interval, 1.0-27.4) for maternal MI (MMI) errors and 51.4 (95% confidence interval, 2.3-999.0) for maternal MII (MMII) errors. Birth-prevalence rates for women > or = 40 years old were 4.2/1000 births for MMI errors and 1.9/1000 for MMII errors. These results support an association between advanced maternal age and both MMI and MMII errors. The association with MI does not pinpoint the timing of the error; however, the association with MII implies that there is at least one maternal-age related mechanism acting around the time of conception.
Affiliation(s)
- P W Yoon
- Birth Defects and Genetic Diseases Branch, National Center for Environmental Health, Centers for Disease Control and Prevention, Atlanta, GA, USA

66
Abstract
The authors assessed the reliability of alcohol intake as recalled from 10 years in the past in a cohort of 2,907 US adults. Participants reported their drinking habits in the First National Health and Nutrition Examination Survey interview during 1971-1975. During a follow-up interview in 1982-1984, they were asked to recall their drinking habits 10 years earlier and to report their current habits. In general, the correlation for recalled alcohol intake versus reported intake at baseline was good (r = 0.7). For all subgroups stratified by race, sex, education, smoking status, and disease status, the age-adjusted correlations for recalled alcohol intake versus baseline intake were equal to or higher than those for current alcohol intake versus baseline intake. The reliability of recall of alcohol intake in the past differs among subgroups with different age and education levels. Recalled alcohol intake was also highly correlated with current alcohol intake; in particular, current heavier drinkers tended to underestimate their previous amount of drinking, an effect that was independent of other factors. These data suggest that although recalled alcohol intake is a better predictor of past intake than are reports of current intake, current drinking habits may be an important influencing factor in the estimation of alcohol intake as recalled from the distant past.
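The reliability statistic reported here is a Pearson correlation between recalled and baseline intake. A sketch of the computation on a tiny synthetic sample (illustrative numbers, not the survey data):

```python
from statistics import fmean

def pearson_r(x, y):
    """Pearson correlation coefficient between paired measurements."""
    mx, my = fmean(x), fmean(y)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    syy = sum((yi - my) ** 2 for yi in y)
    return sxy / (sxx * syy) ** 0.5

# Synthetic drinks/week: baseline report vs. 10-year recall
baseline = [0, 2, 5, 7, 14, 21]
recalled = [0, 3, 4, 8, 12, 18]
r = pearson_r(baseline, recalled)  # high, comparable to the r = 0.7 reported
```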
Affiliation(s)
- S Liu
- Chronic Disease Prevention Branch, Centers for Disease Control and Prevention, Atlanta, GA 30333, USA

67
Abstract
Screening programs, such as annual mammography, are undertaken to reduce mortality and/or morbidity from chronic diseases such as cancer. Matched case-control studies have been used to assess the effectiveness of screening programs because of their relative simplicity and low cost. In such studies, the exposure history for controls consists of the number of screening examinations received prior to the date of diagnosis of the matched case. The authors know of no methodological evaluations that demonstrate the validity of such case-control studies. To examine the possible existence of bias due to design rules, the authors developed a simple deterministic model, which is used to calculate expected screening and disease patterns in a cohort. Cases and matched controls are selected from the cohort, and their screening histories are used to calculate an odds ratio, as is commonly done in practice. Results utilizing this simple model suggest that systematic inclusion of the examination from which diagnosis is made, which is the approach typically used in practice, leads to a positive bias (odds ratio > 1) in the absence of any real effect. Systematic exclusion of this examination appears to lead to a negative bias (odds ratio < 1). Although this simple approach has several limitations, the results suggest that a commonly used method of conducting case-control studies may yield biased odds ratios. Possible methods to reduce this bias may exist, such as defining exposure intervals differently.
Affiliation(s)
- R S Hosek
- Emory University School of Public Health, Division of Epidemiology, Atlanta, GA 30329, USA

68
Abstract
We propose a method to screen for matrilineal inheritance in mitochondrial disorders by comparing the risk of disease in a person whose mother, maternal grandmother, maternal aunt, or maternal uncle is affected with the risk of disease in a person whose father, paternal grandmother, paternal aunt, or paternal uncle is affected, using a modification of the reconstructed cohort design. Sampling of pedigrees is accomplished via probands and must not be influenced by family history. The cohort of the proband's offspring, and offspring of the proband's siblings, can be analyzed using survival analysis methods such as the Cox proportional hazards model, Bonney's [(1986) Biometrics 42:611-625] model, and Liang's [(1991) Genet Epidemiol 8:329-338] model. Mitochondrial transmission can be distinguished from X-linked transmission by examining sex-specific patterns of disease expression in matrilineally transmitted diseases. To illustrate our epidemiologic method, we apply our screening method to pedigrees of two disorders which have been proposed to have a mitochondrial DNA component to their inheritance.
Affiliation(s)
- F Mili
- Department of Epidemiology, Rollins School of Public Health, Emory University, Atlanta, GA 30322, USA

69
Malilay J, Flanders WD, Brogan D. A modified cluster-sampling method for post-disaster rapid assessment of needs. Bull World Health Organ 1996; 74:399-405. PMID: 8823962. PMCID: PMC2486880
Abstract
The cluster-sampling method can be used to conduct rapid assessment of health and other needs in communities affected by natural disasters. It is modelled on WHO's Expanded Programme on Immunization method of estimating immunization coverage, but has been modified to provide (1) estimates of the population remaining in an area, and (2) estimates of the number of people in the post-disaster area with specific needs. This approach differs from that used previously in other disasters where rapid needs assessments only estimated the proportion of the population with specific needs. We propose a modified n x k survey design to estimate the remaining population, severity of damage, the proportion and number of people with specific needs, the number of damaged or destroyed and remaining housing units, and the changes in these estimates over a period of time as part of the survey.
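Under the modified n x k design sketched above, a simple design-based estimate of the remaining population scales the mean count per sampled housing unit up to all units in the area. A simplified illustration with hypothetical numbers; this is not the authors' full estimator, which also covers damage categories and variance estimation:

```python
def estimate_remaining_population(people_per_cluster, k, total_housing_units):
    """Estimate the total remaining population from an n x k cluster sample.
    people_per_cluster: people counted in the k sampled units of each of
    n clusters. total_housing_units: units in the affected area, e.g. from
    pre-disaster census data (hypothetical input here)."""
    sampled_units = len(people_per_cluster) * k
    mean_per_unit = sum(people_per_cluster) / sampled_units
    return mean_per_unit * total_housing_units

# 3 clusters of 7 units each, 10/20/30 people counted, 700 units in the area
estimate = estimate_remaining_population([10, 20, 30], 7, 700)  # about 2000
```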
Affiliation(s)
- J Malilay
- Disaster Assessment and Epidemiology Section, Centers for Disease Control and Prevention, Atlanta, GA 30341-3724, USA

70
Abstract
In many case-control studies of common diseases, investigators use family history information to assess familial aggregation of the disease and the influence of genetic factors. Positive family history among first-degree relatives is often used as a risk factor, and its odds ratio is calculated. Although the limitations of this approach have been discussed, it is not clear how much impact such limitations could have on measuring familial aggregation. To assess this impact, we compare odds ratios derived from using a positive family history in case-control studies with measures of relative risk derived from comparing lifetime risks of disease among first-degree relatives of case subjects with those among first-degree relatives of control subjects. Positive family history is a function of the number of relatives, the background risk of disease, the age distribution of relatives, and the correlation in risk among relatives. It can be shown that even without case-control differences in the number or ages of relatives, positive family history tends to overestimate relative risk measures applied to individual relatives. This overestimation is accentuated with increasing frequency of the disease, with increasing number of relatives, and for diseases with earlier age at onset. It is further affected by even small case-control differences in family size and age distribution of relatives. As such, positive family history is not a stable indicator of familial aggregation across different case-control studies of the same disease.
Affiliation(s)
- M J Khoury
- Birth Defects and Genetic Diseases Branch, Centers for Disease Control and Prevention, Atlanta, GA 30333, USA

71
Abstract
OBJECTIVES This study was undertaken to examine changes in smoking-specific death rates from the 1960s to the 1980s. METHODS In two prospective studies, one from 1959 to 1965 and the other from 1982 to 1988, death rates from lung cancer, coronary heart disease, and other major smoking-related diseases were measured among more than 200,000 current smokers and 480,000 lifelong nonsmokers in each study. RESULTS From the first to the second study, lung cancer death rates (per 100,000) among current cigarette smokers increased from 26 to 155 in women and from 187 to 341 in men; the increase persisted after current daily cigarette consumption and years of smoking were controlled for. Rates among nonsmokers were stable. In contrast, coronary heart disease and stroke death rates decreased by more than 50% in both smokers and nonsmokers. The all-cause rate difference between smokers and nonsmokers doubled for women but was stable for men. CONCLUSIONS Premature mortality (the difference in all-cause death rates between smokers and nonsmokers) doubled in women and continued unabated in men from the 1960s to the 1980s. Lung cancer surpassed coronary heart disease as the largest single contributor to smoking-attributable death among White middle-class smokers.
Affiliation(s)
- M J Thun
- Department of Epidemiology and Surveillance, American Cancer Society, Atlanta, GA 30329-4251, USA
72
Abstract
Polybrominated biphenyl (PBB), a flame-retardant material, was introduced into the food chain in Michigan in 1973 due to a manufacturing and distribution mistake. Following public concern about the long-term health effects of PBB in humans, a cohort of PBB-exposed Michigan residents was assembled in 1975. We initiated this study to determine the half-life of PBB in human sera and to understand how continued body burden relates to the possible adverse health consequences of PBB exposure. To determine the half-life, eligible persons were selected from the cohort if they had at least two PBB measurements 1 year apart and had an initial level > or = 20 ppb. There were 163 persons who met the criteria, with a median PBB level of 45.5 ppb. The estimated half-life is 10.8 years (95% CI, 9.2-14.7 years). The body burden of PBB in exposed persons will decrease only gradually over time. For persons with an initial level of 45.5 ppb of PBB, it will take more than 60 years for their PBB levels to fall below the current level of detection of 1 ppb.
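The time course follows from first-order (exponential) elimination; a minimal sketch using the abstract's point estimates reproduces the roughly 60-year figure:

```python
import math

half_life = 10.8        # years (point estimate reported in the abstract)
initial = 45.5          # ppb, median initial serum PBB level
detection_limit = 1.0   # ppb, current level of detection

# Under first-order decay, level(t) = initial * 0.5 ** (t / half_life),
# so the time for the level to reach the detection limit is:
t = half_life * math.log2(initial / detection_limit)
print(round(t, 1))  # 59.5
```

The point estimate gives about 59.5 years; the abstract's "more than 60 years" is consistent with this once rounding and the confidence interval on the half-life (9.2-14.7 years) are taken into account.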
Affiliation(s)
- D H Rosen
- Centers for Disease Control and Prevention, Atlanta, GA 30341-3724, USA
73
Abstract
Misclassification of exposure is a serious problem in epidemiology. Methods for addressing misclassification are available, but most are based on limiting assumptions such as availability of a "gold standard" measure of true exposure, or availability of two tests of exposure whose performance is nondifferential. In this paper, we discuss a method that allows the investigator to correct for differential misclassification in case-control studies. Our method only requires two potentially imperfect tests for measuring exposure. Importantly, the sensitivity and specificity of each test when applied to cases may differ from the sensitivity and specificity when applied to controls. The approach does require two subgroups of cases, such that each test's sensitivity and specificity is the same across these subgroups, and requires analogous subgroups for controls. We exemplify our approach in several ways: with hypothetical data, with data from a case-control study of birth defects and service in Vietnam, and with a small Monte Carlo study. Finally, we discuss limitations of the method.
Affiliation(s)
- W D Flanders
- Division of Epidemiology, Emory University School of Public Health, Atlanta, GA 30329, USA
74
Abstract
BACKGROUND One of the epidemiologist's most basic tasks is estimation of disease occurrence. To perform this task, the epidemiologist frequently models variability in disease occurrence using one of three distributions--the binomial, the Poisson or the exponential distribution. Although epidemiologists often use them and their properties appear in standard texts, we know of no text or review that compares and contrasts epidemiological application of these distributions. METHODS In this commentary, we discuss these three basic distributions. We note key assumptions as well as limitations, and compare results from analyses based on each distribution. RESULTS AND CONCLUSIONS We illustrate that the three distributions, although superficially different, often lead to similar results. We argue that epidemiologists should often obtain similar results regardless of which distribution they use. We also point out that application of all three distributions can be inappropriate if assumptions of independence or homogeneity of risks fail to hold. Finally, we briefly review how these basic distributions can be used to justify use of other distributions, such as the Gaussian distribution, for studying disease-exposure associations.
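The near-equivalence of the three distributions for rare outcomes can be checked directly (hypothetical cohort figures): the binomial distribution of case counts is close to its Poisson approximation, and the exponential (constant-hazard) risk over a short interval is close to the simple rate-times-time approximation:

```python
import math

def binom_pmf(k: int, n: int, p: float) -> float:
    """P(k cases) when each of n people independently has risk p."""
    return math.comb(n, k) * p**k * (1 - p) ** (n - k)

def poisson_pmf(k: int, mu: float) -> float:
    """P(k cases) under a Poisson model with mean mu = n * p."""
    return math.exp(-mu) * mu**k / math.factorial(k)

n, p = 10_000, 0.0003   # hypothetical cohort size and per-person risk
for k in range(5):      # the two models give nearly identical probabilities
    print(k, round(binom_pmf(k, n, p), 4), round(poisson_pmf(k, n * p), 4))

lam, t = 0.0003, 1.0    # constant hazard over one year
print(1 - math.exp(-lam * t), lam * t)  # exponential risk vs. rate * time
```

The agreement degrades when the independence or homogeneity assumptions fail, which is the caveat the commentary emphasizes.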
Affiliation(s)
- W D Flanders
- Emory University School of Public Health, Division of Epidemiology, Atlanta, GA 30329, USA
75
Abstract
The authors sought to determine whether current cigarette smoking was associated with impotence among middle-aged men. This is a secondary analysis of a cross-sectional survey of 4,462 US Army Vietnam-era veterans aged 31-49 years who took part in the Vietnam Experience Study in 1985-1986. The main outcome measurement was the odds ratio for reported impotence, which was calculated by comparing current smokers with nonsmokers while controlling for multiple confounders. The study sample consisted of 1,162 never smokers, 1,292 former smokers, and 2,008 current smokers. The prevalence of impotence was 2.2% among never smokers, 2.0% among former smokers, and 3.7% among current smokers (p = 0.005). The unadjusted odds ratio (OR) of the association between smoking and reported impotence was 1.8 (95% confidence interval (CI) 1.2-2.6). The association held even after adjustments were made for confounders, including vascular disease, psychiatric disease, hormonal factors, substance abuse, marital status, race, and age (OR = 1.5, 95% CI 1.0-2.2). Neither years smoked nor number of cigarettes smoked daily was a significant predictor of impotence among current smokers. The authors concluded that, among the men in this study, a higher percentage of cigarette smokers reported impotence than did nonsmokers. This observation could not be totally explained by comorbidity factors related to smoking.
Affiliation(s)
- D M Mannino
- Air Pollution and Respiratory Health Branch, Centers for Disease Control and Prevention (CDC), Atlanta, GA
76
Flanders WD, Shipp CC, FitzGerald DM, Lin LS. Analysis of variations in mortality rates with small numbers. Health Serv Res 1994; 29:461-71. [PMID: 7928372] [PMCID: PMC1070017]
Abstract
OBJECTIVE We present a Monte Carlo technique to evaluate if observed mortality rates differ from model-predicted rates for situations when the number of deaths is small. DATA SOURCES We used Medicare hospital claims and model-predicted mortality rates from the Health Care Financing Administration (HCFA) for the 169 acute care hospitals in Georgia. The HCFA data provided model-predicted mortality rates at 30 days postadmission for 17 conditions and procedures of interest. The model-predicted rates calculated by HCFA were adjusted for patient factors, including demographic characteristics, principal diagnosis, and comorbidities. STUDY DESIGN We test the hypothesis that model-predicted 30-day mortality rates at the 169 hospitals differ significantly from the observed 30-day mortality rates. Our approach uses a test statistic that resembles a chi-square statistic, and Monte Carlo simulations to estimate the distribution of the test statistic under the null hypothesis of no differences between the observed and predicted rates. We illustrate the method using two conceptually similar simulation models. We use results of the simulations to estimate p-values and compare these results with p-values associated with the nominal chi-square distribution. DATA EXTRACTION METHODS We extracted 30-day observed and predicted mortality rates for Medicare beneficiaries for federal fiscal year 1990 for 17 conditions and procedures of interest. PRINCIPAL FINDINGS If the number of deaths in some hospitals is small, p-values calculated using the nominal chi-square distribution can be misleading, thus supporting the usefulness of our simulation method. CONCLUSIONS The Monte Carlo simulation is an appropriate approach to the analysis of hospital mortality or small area analysis for situations in which the number of deaths is small.
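A minimal sketch of this idea (entirely hypothetical hospitals and predicted rates, not the authors' code): compute a chi-square-like statistic comparing observed with predicted deaths, then estimate its null distribution by simulating deaths at each hospital from the predicted rates.

```python
import random

def mc_pvalue(deaths_obs, n_patients, p_pred, n_sim=2000, seed=1):
    """Monte Carlo p-value for a chi-square-like statistic comparing
    observed hospital death counts with model-predicted rates; useful
    when expected deaths are too few for the nominal chi-square."""
    def stat(obs):
        return sum((o - n * p) ** 2 / (n * p * (1 - p))
                   for o, n, p in zip(obs, n_patients, p_pred))
    t_obs = stat(deaths_obs)
    rng = random.Random(seed)
    exceed = 0
    for _ in range(n_sim):
        # simulate each hospital's deaths under the predicted rate
        sim = [sum(rng.random() < p for _ in range(n))
               for n, p in zip(n_patients, p_pred)]
        if stat(sim) >= t_obs:
            exceed += 1
    return exceed / n_sim

# Hypothetical: five small hospitals, predicted 30-day mortality 10%
print(mc_pvalue([3, 1, 0, 2, 5], [20, 15, 10, 25, 30], [0.10] * 5))
```

With counts this small, the simulated null distribution can differ noticeably from the nominal chi-square distribution, which is the paper's point.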
77
Affiliation(s)
- H Austin
- Division of Epidemiology, Emory University School of Public Health, Atlanta, GA 30329
78
Mannino DM, Etzel RA, Flanders WD. Do the medical history and physical examination predict low lung function? Arch Intern Med 1993; 153:1892-7. [PMID: 8250649]
Abstract
BACKGROUND We sought to determine whether an abnormal respiratory history or chest physical examination could be used to identify men with low lung function. METHODS We analyzed pulmonary function, physical examination, and questionnaire data from 4461 middle-aged male Vietnam-era army veterans. MAIN RESULTS The study sample consisted of 1161 never smokers, 1292 former smokers, and 2008 current smokers. Clinical indicators of respiratory disease (respiratory symptoms, respiratory signs, or a history of respiratory disease) were present in 26.1% of the never smokers, 31.7% of the former smokers, and 47.2% of the current smokers. We defined low forced expiratory volume in 1 second as a value less than 81.2% of the predicted value. Seven percent of the never smokers, 8% of the former smokers, and 17.3% of the current smokers demonstrated low forced expiratory volume in 1 second. Among those with a clinical indicator for spirometry, only 11% of the never smokers, 13% of the former smokers, and 21% of the current smokers actually had a low forced expiratory volume in 1 second. Among those without a clinical indicator, 6% of the never smokers, 6% of the former smokers, and 14% of the current smokers actually had a low forced expiratory volume in 1 second. CONCLUSIONS The use of clinical indicators as a basis for obtaining pulmonary function tests in middle-aged men misses many with low lung function, especially current smokers.
Affiliation(s)
- D M Mannino
- National Center for Environmental Health, Centers for Disease Control and Prevention, Atlanta, Ga
79
Philen RM, Hill RH, Flanders WD, Caudill SP, Needham L, Sewell L, Sampson EJ, Falk H, Kilbourne EM. Tryptophan contaminants associated with eosinophilia-myalgia syndrome. The Eosinophilia-Myalgia Studies of Oregon, New York and New Mexico. Am J Epidemiol 1993; 138:154-9. [PMID: 8356958] [DOI: 10.1093/oxfordjournals.aje.a116841]
Abstract
Eosinophilia-myalgia syndrome (EMS) has been linked to ingestion of tryptophan contaminated with 1,1'-ethylidene-bis[L-tryptophan] (EBT), but other contaminants have received little study. The authors identified 101 lots of L-tryptophan that had been consumed either by persons with EMS or by asymptomatic tryptophan users and quantified the amounts of EBT and five other contaminants in each lot. After stratification of case and noncase lots by time of manufacture to adjust for the strong sequential pattern over time among case and noncase lots, higher EBT levels were still associated with a lot's case status, but the association lacked statistical significance (p = 0.120, odds ratio = 1.56, 95% confidence interval 0.758-3.23). While these findings do not rule out the possibility that EBT is the etiologic agent in EMS, they raise the possibility that other chemical contaminants in manufactured tryptophan modify the effects of EBT or that the causal agent of EMS is an entirely distinct compound.
Affiliation(s)
- R M Philen
- Division of Environmental Hazards and Health Effects, National Center for Environmental Health, Centers for Disease Control and Prevention, Atlanta, GA 30341-3724
80
Abstract
Information bias is among the most serious and common problems in epidemiology. Approaches have been developed to reduce information bias by correcting for known amounts of misclassification. Unfortunately, in most studies, the extent of exposure misclassification cannot be easily estimated. We discuss the application to case-control studies of an approach originally proposed by Hui and Walter in 1980 to estimate the sensitivity and specificity of two independent classification schemes (Hui SL, Walter SD. Biometrics 1980;36:167-171). In this paper, we propose using the EM algorithm to provide a simple numeric technique for implementing their method that seems to converge for most real-world data. Our approach allows inclusion of a measure of non-independence of the two classification schemes, and we assess the influence of non-independence on the odds ratio. Finally, we provide a simple variance estimate for the odds ratio based on the delta method and maximum likelihood theory. We exemplify our results and method with data from a case-control study of sudden infant death syndrome in which data on some variables were obtained from both maternal interviews and medical records.
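The full Hui-Walter/EM machinery is beyond a short example, but the basic correction it builds on — recovering true exposure counts, and hence the odds ratio, once a test's sensitivity and specificity are taken as known — can be sketched as follows (data and performance figures entirely hypothetical):

```python
def corrected_count(obs_pos: float, n: int, se: float, sp: float) -> float:
    """Back-correct an observed number of test-positive subjects for
    known sensitivity (se) and specificity (sp) of the exposure measure:
    obs_pos = se * true + (1 - sp) * (n - true), solved for true."""
    return (obs_pos - (1 - sp) * n) / (se + sp - 1)

def corrected_or(a_obs, n_cases, b_obs, n_ctrls, se, sp):
    a = corrected_count(a_obs, n_cases, se, sp)   # true exposed cases
    b = corrected_count(b_obs, n_ctrls, se, sp)   # true exposed controls
    return (a * (n_ctrls - b)) / (b * (n_cases - a))

# Hypothetical: 60/100 cases and 40/100 controls classified exposed,
# with sensitivity 0.9 and specificity 0.8 in both groups
print(round(corrected_or(60, 100, 40, 100, 0.9, 0.8), 2))  # 3.33
# The naive (uncorrected) odds ratio would be 2.25 -- nondifferential
# misclassification has biased it toward the null
```

The Hui-Walter approach estimates se and sp from two imperfect tests rather than assuming them, and the EM algorithm discussed in the paper is one numerical route to those estimates.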
Affiliation(s)
- C D Drews
- Division of Epidemiology, Emory University School of Public Health, Atlanta, GA 30329
81
Hill RH, Caudill SP, Philen RM, Bailey SL, Flanders WD, Driskell WJ, Kamb ML, Needham LL, Sampson EJ. Contaminants in L-tryptophan associated with eosinophilia myalgia syndrome. Arch Environ Contam Toxicol 1993; 25:134-142. [PMID: 8346973] [DOI: 10.1007/bf00230724]
Abstract
In late 1989, an epidemic of eosinophilia-myalgia syndrome (EMS) that resulted in several thousand cases of the syndrome and 36 deaths was recognized in the United States. Physicians in New Mexico linked the epidemic to the ingestion of L-tryptophan (LT). Results of studies indicated that one or more trace contaminants in LT were likely causes of the EMS epidemic. Investigators traced the LT that was taken by most patients with EMS to a single manufacturer, Showa Denko K.K. of Japan. We now report results of high performance liquid chromatographic analysis of LT samples from this manufacturer. Three sets of blind-coded samples were analyzed: the priority case lot set, which included 54 case-associated LT lots and 50 noncase-associated LT lots that were taken by case and control subjects who used only one brand of LT; the single lot case set, which included 73 case-associated LT lots and 25 noncase associated LT lots taken by case and control subjects who used only a single lot of LT; and the South Carolina tablet set, which included LT tablets taken by case subjects (n = 26) and by control subjects (n = 52). We statistically compared the concentration of each contaminant in case-associated, noncase-associated, and control samples of each sample set. The analyses showed that there were more than 60 minor contaminants in the LT from Showa Denko K.K., and that six of these contaminants were associated with EMS. The structures of three contaminants are known, but the identities of the other three contaminants are currently unknown.(ABSTRACT TRUNCATED AT 250 WORDS)
Affiliation(s)
- R H Hill
- Division of Environmental Health Laboratory Sciences, Centers for Disease Control and Prevention, Atlanta, Georgia 30333
82
Abstract
We use a simple, empirical model to describe the healthy worker effect mortality pattern. Under this simple model, internal comparisons of risk with increasing cumulative exposure will tend to be biased away from the null because of the healthy worker effect. We illustrate the potential magnitude of the bias in a simple situation and show that controlling for time since hire, by means of standard epidemiologic methods, eliminates the bias. Time since hire also is a concern of occupational epidemiologists because of the issue of induction time; sufficient time may not have elapsed among recently hired workers for an exposure to manifest its effect on disease occurrence. Provision for an adequate induction period can be addressed, like the concern raised in this paper, by restricting the analysis to workers first employed many years before the start of the follow-up period.
Affiliation(s)
- W D Flanders
- Division of Epidemiology, Emory University School of Public Health, Atlanta, GA 30329
83
Abstract
BACKGROUND The lifetime risk of developing breast cancer in U.S. women, often quoted as one in nine, is a commonly cited cancer statistic. However, many estimates have used cancer rates derived from total rather than the cancer-free population and have not properly accounted for multiple cancers in the same individual. PURPOSE Our purpose was to provide a revised method for calculating estimates of the lifetime risk of developing breast cancer and to aid in interpretation of the estimates. METHODS A multiple decrement life table was derived by applying age-specific incidence and mortality rates from cross-sectional data to a hypothetical cohort of women. Incidence, mortality, and population data from 1975-1988 were used, representing the geographic areas of the National Cancer Institute's Surveillance, Epidemiology, and End Results (SEER) Program. The incidence rates reflected only the first breast primary cancer; mortality rates reflected causes other than breast cancer. The population denominator used in calculating incidence rates was adjusted to reflect only those women without previously diagnosed breast cancers in the hypothetical cohort. RESULTS Our calculations showed an overall lifetime risk for developing invasive breast cancer of approximately one in eight with use of 1987-1988 SEER data, although up to age 85, it was still the commonly quoted one in nine. CONCLUSION Our estimate was calculated assuming constant age-specific rates derived from 1987-1988 SEER data. Because incidence and mortality rates change over time, conditional risk estimates over the short term (10 or 20 years) may be more reliable. A large portion of the rise in the lifetime risk of breast cancer estimated using 1975-1977 data (one in 10.6) to an estimate using 1987-1988 data (one in eight) may be attributed to 1) early detection of prevalent cases due to increased use of mammographic screening and 2) lower mortality due to causes other than breast cancer. 
A common misperception is that the lifetime risk estimate assumes that all women live to a particular age (e.g., 85 or 95). In fact, the calculation assumes that women can die from causes other than breast cancer at any possible age. Cutting off the lifetime risk calculation at age 85 assumes that no women develop breast cancer after that age. While the lifetime risk of developing breast cancer rose over the period 1975-1977 to 1987-1988, the lifetime risk of dying of breast cancer increased from one in 30 to one in 28, reflecting generally flat mortality trends.
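The multiple decrement logic — women leave the cancer-free pool through either a first breast cancer diagnosis or death from another cause — can be sketched with entirely hypothetical age-specific rates (the sketch is illustrative only; SEER-based estimates use observed rates in single-year detail):

```python
# Hypothetical annual rates per cancer-free woman, in 10-year age bands
incidence       = {30: 0.0005, 40: 0.0015, 50: 0.0025, 60: 0.0035,
                   70: 0.0040, 80: 0.0040}
other_mortality = {30: 0.0010, 40: 0.0020, 50: 0.0050, 60: 0.0120,
                   70: 0.0300, 80: 0.0800}

alive_cancer_free = 1.0   # proportion of the hypothetical cohort at risk
lifetime_risk = 0.0
for band in sorted(incidence):
    for _ in range(10):   # one-year steps within each 10-year band
        new_cases = alive_cancer_free * incidence[band]
        lifetime_risk += new_cases
        # women leave the at-risk pool by diagnosis or by other-cause death
        alive_cancer_free -= new_cases + alive_cancer_free * other_mortality[band]
print(round(lifetime_risk, 3))
```

Note how lower other-cause mortality mechanically raises the lifetime cancer risk, one of the two explanations the abstract offers for the rise from one in 10.6 to one in eight.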
Affiliation(s)
- E J Feuer
- Division of Cancer Prevention and Control, National Cancer Institute, Bethesda, Md 20892
84
Mili F, Khoury MJ, Flanders WD, Greenberg RS. Risk of childhood cancer for infants with birth defects. I. A record-linkage study, Atlanta, Georgia, 1968-1988. Am J Epidemiol 1993; 137:629-38. [PMID: 8470664] [DOI: 10.1093/oxfordjournals.aje.a116720]
Abstract
To evaluate the risk of childhood cancer among infants with serious birth defects, the authors linked records of the population-based registry of the Georgia Center for Cancer Statistics for 1975 to 1988 with records of the population-based Metropolitan Atlanta Congenital Defects Program for 1968 to 1987. During the study period, birth defects were diagnosed in 19,373 infants younger than 1 year of age, and cancer was diagnosed in 400 children younger than 15 years of age. The observed number of children with a defect who developed cancer was compared with the number expected on the basis of the cancer registry rates. Of the 19,373 children with birth defects, 31 developed cancer (standardized incidence ratio (SIR) = 2.2, 95% confidence interval (CI) 1.5-3.2). Two associations were found: of 532 children with Down's syndrome (trisomy 21), three developed acute leukemia (SIR = 50.8, 95% CI 10.5-148.5) while of 746 children with pyloric stenosis, four developed cancer (SIR = 7.5, 95% CI 2.0-19.3). These data show that children with selected birth defects are at increased risk for specific childhood cancers. Such record-linkage can reveal new associations, which can in turn help researchers understand underlying mechanisms common to teratogenesis and carcinogenesis.
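The standardized incidence ratio itself is simple arithmetic — observed cases divided by the number expected from registry rates. The expected counts below are back-calculated from the reported SIRs and are therefore approximations, not figures from the paper:

```python
def sir(observed: int, expected: float) -> float:
    """Standardized incidence ratio: observed over expected cases."""
    return observed / expected

# Expected counts back-calculated from the abstract's SIRs (approximate)
print(round(sir(31, 14.1), 1))   # all birth defects, all cancers -> ~2.2
print(round(sir(3, 0.059), 1))   # Down's syndrome, acute leukemia -> ~50.8
```

The very small expected count behind the Down's syndrome result is why its confidence interval (10.5-148.5) is so wide.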
Affiliation(s)
- F Mili
- National Center for Environmental Health, Centers for Disease Control, Atlanta, GA
85
Mili F, Lynch CF, Khoury MJ, Flanders WD, Edmonds LD. Risk of childhood cancer for infants with birth defects. II. A record-linkage study, Iowa, 1983-1989. Am J Epidemiol 1993; 137:639-44. [PMID: 8470665] [DOI: 10.1093/oxfordjournals.aje.a116721]
Abstract
To attempt to confirm associations found in a companion study in Atlanta, Georgia between Down's syndrome and acute leukemia and between pyloric stenosis and childhood cancer, the authors used the State Health Registry of Iowa to link the records of infants and children with cancer for 1983 to 1989 with the records of infants with birth defects for 1983 to 1988. During the study period, birth defects were diagnosed in 10,891 infants younger than 1 year of age, and cancer was diagnosed in 396 children younger than 8 years of age. The authors compared the observed number of children with a defect who developed cancer with the number expected on the basis of the cancer registry rates. Of the 10,891 children with birth defects, 16 developed cancer (standardized incidence ratio (SIR) = 2.0, 95% confidence interval (CI) 1.2-3.3). Of 251 children with Down's syndrome (trisomy 21), two developed leukemia (SIR = 32.1, 95% CI 3.9-116.0). None of the infants with cancer had pyloric stenosis (SIR = 0.0, 95% CI 0.0-6.7). The results of this study supported the association found in the Atlanta study between Down's syndrome and leukemia, but did not support the association found there between pyloric stenosis and childhood cancer. This study, however, had a shorter follow-up period and a smaller number of subjects than the Atlanta study.
Affiliation(s)
- F Mili
- National Center for Environmental Health, Centers for Disease Control, Atlanta, GA
86
Thun MJ, Namboodiri MM, Calle EE, Flanders WD, Heath CW. Aspirin use and risk of fatal cancer. Cancer Res 1993; 53:1322-7. [PMID: 8443812]
Abstract
Aspirin and other nonsteroidal antiinflammatory drugs inhibit prostaglandin synthesis and tumor growth in many experimental systems, but it is unclear which of these tumor models are relevant to humans. We have reported reduced risk of fatal colon cancer among persons who used aspirin in a large prospective study. This analysis examines other fatal cancers in relation to aspirin among 635,031 adults in that study who provided information in 1982 on the frequency and duration of their aspirin use and did not report cancer. Death rates were measured through 1988. Death rates decreased with more frequent aspirin use for cancers of the esophagus, stomach, colon, and rectum but not generally for other cancers. For each digestive tract cancer, death rates were approximately 40% lower among persons who used aspirin 16 times/month or more for at least 1 year compared to those who used no aspirin. The trend of decreasing risk with more frequent aspirin use was strongest among persons who had used aspirin for 10 years or more; it remained statistically significant, except for esophageal cancer, in multivariate analyses that adjusted for other known risk factors. Biases such as early detection or aspirin avoidance among cases do not appear to explain the results. Our data suggest that regular, prolonged use of aspirin may reduce the risk of fatal cancer of the esophagus, stomach, colon, and rectum. Future epidemiological and basic research should examine all digestive tract cancers in considering the chemopreventive or therapeutic potential of nonsteroidal antiinflammatory drugs.
Affiliation(s)
- M J Thun
- American Cancer Society, Atlanta, Georgia 30329
87
Abstract
Recall bias or report bias is said to occur when associations are distorted or created because case informants report events differently from controls. Some investigators have suggested that this bias can be prevented by choosing controls who have conditions similar to those found in the case group. We use the term "restricted-control group" for such a control series. Although using a restricted-control series may eliminate differential misclassification, this approach will usually not eliminate nondifferential misclassification and may create selection bias. In this article, we present a way to algebraically examine the effects of misclassification and selection bias on observed associations. We use this method to compare the impact of recall bias in a study using a population control group with the effects of selection bias and nondifferential misclassification if a restricted-control group is used. Our approach is exemplified using data from a case-control study of sudden infant death syndrome. Our findings show that even when recall bias exists, the observed association can be closer to the true association when a population control series is used than when a restricted-control group is used.
Affiliation(s)
- C Drews
- Division of Epidemiology, Emory University School of Medicine, Atlanta, GA
88
Abstract
OBJECTIVES Proven screening technologies exist for both breast and cervical cancer, but they are underused by many women. We sought to evaluate the effect of demographic characteristics on the underuse of mammography and Pap smear screening. METHODS We analyzed responses from 12,252 women who participated in the 1987 National Health Interview Survey Cancer Control Supplement. Demographic profiles were produced to target severely underserved groups of women. RESULTS Low income was a strong predictor of mammography underuse, as was Hispanic ethnicity and other race, low educational attainment, age greater than 65, and residence in a rural area. A strong predictor of never having had a Pap smear was never having been married; however, the importance of this characteristic is difficult to interpret in the absence of data on sexual activity. Hispanic women and women of other races of all ages and all income levels underused Pap smear screening, as did older women, particularly older Black women. CONCLUSIONS The tendency of women to underuse screening technologies varies greatly across levels of basic demographic characteristics. The importance of these characteristics differs for mammography screening versus Pap smear screening.
Affiliation(s)
- E E Calle
- American Cancer Society, Atlanta, GA 30329
89
Mili F, Flanders WD, Boring JR, Annest JL, DeStefano F. The associations of alcohol drinking and drinking cessation to measures of the immune system in middle-aged men. Alcohol Clin Exp Res 1992; 16:688-94. [PMID: 1356316] [DOI: 10.1111/j.1530-0277.1992.tb00662.x]
Abstract
To estimate the association between the immunologic responses of the cell-mediated and humoral systems and alcohol drinking, we used data from the Vietnam Experience Study conducted by the Centers for Disease Control. That study, conducted from 1985 to 1986, was based on a random sample of 4462 male, Vietnam-era, U.S. veterans. Using linear regression, we evaluated how (1) the number of alcoholic drinks the subjects consumed per month and (2) drinking cessation among former drinkers were associated with their relative and absolute T, B, CD4, and CD8 lymphocyte counts and immunoglobulin A (IgA), IgM, and IgG levels. We used geometric means, and percentage differences in geometric means, of the immune measures to quantify the associations, adjusting for covariates. Measures of immune status differed among the drinking categories, and the differences generally changed after adjustment for covariates. After adjustment, higher alcohol consumption was associated with higher IgA and IgM levels, higher relative T and CD4 lymphocyte counts and CD4-to-CD8 ratios, and with lower IgG levels, lower relative B and CD8 lymphocyte counts, and lower absolute lymphocyte and lymphocyte subset counts. Among former drinkers, we found no clear-cut pattern in measures of immunity for a few years after cessation; thereafter, values tended to return toward those of nondrinkers as abstinence continued.
Affiliation(s)
- F Mili
- Agent Orange Projects, Centers for Disease Control, Atlanta, GA
90
Thun MJ, Calle EE, Namboodiri MM, Flanders WD, Coates RJ, Byers T, Boffetta P, Garfinkel L, Heath CW. Risk factors for fatal colon cancer in a large prospective study. J Natl Cancer Inst 1992; 84:1491-500. [PMID: 1433333] [DOI: 10.1093/jnci/84.19.1491]
Abstract
BACKGROUND Diet, physical activity, obesity, aspirin use, and family history may all modify the risk of colon cancer, but few epidemiologic studies are large enough to examine these factors simultaneously. PURPOSE We prospectively assessed the relationship of diet and other factors to risk of fatal colon cancer. METHODS Using data from Cancer Prevention Study II, an ongoing prospective mortality study, we studied 764,343 adults who, in 1982, completed a questionnaire on diet and other risk factors and did not report cancer or other major illness. We assessed mortality through August 1988 and identified 1150 deaths from colon cancer (611 men and 539 women). Multivariate analyses were used to compare these case patients with 5746 matched control subjects drawn from the cohort. RESULTS Risk of fatal colon cancer decreased with more frequent consumption of vegetables and high-fiber grains (P for trend = .031 in men and .0012 in women). The relative risk (RR) for the highest versus lowest quintile of vegetable intake was 0.76 in men (95% confidence interval [CI] = 0.57-1.02) and 0.62 in women (95% CI = 0.45-0.86). Dietary consumption of vegetables and grains and regular use of aspirin were the only factors with an independent and statistically significant association with fatal colon cancer. Participants who consumed the fewest vegetables and grains and used no aspirin had a higher risk than those who consumed the most vegetables and grains and used aspirin 16 or more times per month: for men in the former category, the RR was 2.4 (95% CI = 1.1-5.3); for women, it was 2.9 (95% CI = 1.3-6.7). Weaker associations were seen for physical inactivity, obesity, total dietary fat, and family history. No associations were seen with consumption of red meat or total or saturated fat in either sex, but this finding must be interpreted cautiously. CONCLUSIONS These findings support recommendations that increased consumption of vegetables and grains may reduce the risk of fatal colon cancer. Regular use of low doses of aspirin may prove to be an important supplemental measure.
Affiliation(s)
- M J Thun
- Epidemiology Division, Emory University School of Public Health, Atlanta, Ga
91
Freedman DS, Byers T, Barboriak JJ, Flanders WD, Duncan A, Yip R, Meilahn EN. The relation of prothrombin times to coronary heart disease risk factors among men aged 31-45 years. Am J Epidemiol 1992; 136:513-24. [PMID: 1442715] [DOI: 10.1093/oxfordjournals.aje.a116529]
Abstract
Although levels of coagulation factor VII and fibrinogen are predictive of cardiovascular disease, relatively little data describe hemostatic characteristics in healthy populations. The cross-sectional associations between the prothrombin time, a measure of the activity of the extrinsic and common pathways of coagulation, and traits associated with the risk of cardiovascular disease were therefore examined among 3,604 white and 514 black, male, US Army veterans aged 31-45 years. The prothrombin time measurements, performed in 1985 and 1986, were precise, with an intraclass correlation of 0.98 (202 pairs). Overall, the mean prothrombin time was 12.4 seconds (standard deviation, 0.4 seconds), and 11 percent of the men had a value of less than 12 seconds. Many of the observed associations with the prothrombin time paralleled those that have been reported with clotting factor VII and fibrinogen. The mean prothrombin time was 0.15 seconds shorter among whites than among blacks and was 0.2 seconds shorter among current cigarette smokers than among men who had never smoked. Inverse associations were also seen with relative weight and with levels of total cholesterol and triglycerides (r = -0.09 to -0.16). All associations were statistically significant at the 0.01 level, and the examined characteristics could jointly account for about 12 percent of the variability in prothrombin times. Additional data on characteristics related to coagulation may help elucidate the natural history of cardiovascular disease and aid in the design of clinical trials.
Affiliation(s)
- D S Freedman
- Division of Nutrition, Centers for Disease Control, Atlanta, GA 30333
92
Abstract
In linear regression analyses, we must often transform the dependent variable to meet the statistical assumptions of normality, variance stability, or linearity. Transformations, however, can complicate the interpretation of results because they change the scale on which the dependent variable is measured. In this setting, the inclusion of product terms or the transformation of some independent (or predictor) variables may further complicate interpretation. In this article, we present some interpretations of linear models that include transformations or product terms. We illustrate these interpretations using regression analyses designed to study determinants of serum testosterone levels. These examples show how one can present results using simple measures, such as medians, and interpret regression parameters.
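The interpretive point above can be sketched with a toy example (the data and variable names are hypothetical, not from the article's testosterone analysis): when the dependent variable is log-transformed, exponentiated regression coefficients become multiplicative effects, which can be reported as ratios of medians rather than differences in means.

```python
import math

# Hypothetical data: predictor x (e.g., age in years) and a positive,
# right-skewed outcome y (e.g., serum testosterone) -- illustrative values only.
x = [30, 35, 40, 45, 50, 55, 60]
y = [620, 580, 610, 540, 500, 470, 430]

# Fit log(y) = b0 + b1 * x by ordinary least squares
# (closed form for a single predictor).
log_y = [math.log(v) for v in y]
n = len(x)
mean_x = sum(x) / n
mean_ly = sum(log_y) / n
b1 = (sum((xi - yi_m) * (yi - mean_ly)
          for xi, yi, yi_m in zip(x, log_y, [mean_x] * n))
      / sum((xi - mean_x) ** 2 for xi in x))
b0 = mean_ly - b1 * mean_x

# Because the outcome is log-transformed, exp(b1) is the multiplicative
# change in the median of y per one-unit increase in x (assuming errors
# symmetric on the log scale); exp(10 * b1) is the change per decade.
ratio_per_unit = math.exp(b1)
ratio_per_decade = math.exp(10 * b1)
```

With these declining illustrative values, both ratios are below 1, i.e. the fitted median outcome falls multiplicatively with each additional unit of the predictor.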
Affiliation(s)
- W D Flanders
- Emory University School of Public Health, Division of Epidemiology, Atlanta, GA 30329
93
Abstract
OBJECTIVE To examine the variation in the risk for mortality among patients treated at renal dialysis facilities within a defined geographic area. SETTING All free-standing and hospital-based dialysis facilities in a single southeastern state reported to the registry. DESIGN Cohort of dialysis patients followed for 1 year by an end-stage renal disease registry. PATIENTS Patients (n = 3612) aged 20 years and older receiving treatment at the dialysis facilities reporting to the registry during 1987. MEASUREMENTS Demographic, comorbid, and severity of illness indicators were abstracted from patient records. Facility-specific risk estimates were derived from a Cox proportional hazards model. RESULTS Facility-specific mortality rates ranged between 2.0 and 10.5 deaths per 10,000 patient days. Mortality rates were higher among older persons; whites; those with a history of diabetic nephropathy, angina, or congestive heart failure; and patients with either nutritional or functional status impairment. Facility-specific prevalence of each mortality risk factor varied widely. The unadjusted risk for death in a facility at the 75th percentile of risk was 1.3 times that of a facility at the median, whereas at the 25th percentile, it was 0.68 times as likely--a twofold range of risk. Controlling for differences in the prevalence of patient characteristics did not change the interquartile range in risks, and a facility's adjusted risk estimate showed a strong correlation with its unadjusted estimate (R2 = 0.566; P < .0001). CONCLUSIONS Patient attributes associated with increased risk for mortality vary widely among dialysis facilities. Adjustment for these differences did not, however, substantially change either the degree of variation in mortality risks or the relative ranking of a facility's mortality.
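The "twofold range of risk" quoted in the results follows directly from the two percentile estimates; a quick check using the numbers given in the abstract:

```python
# Risk for death relative to the median facility, as reported above.
rr_p75 = 1.30  # facility at the 75th percentile of risk
rr_p25 = 0.68  # facility at the 25th percentile of risk

# Ratio between the 75th- and 25th-percentile facilities:
# close to 2, i.e. the "twofold range of risk" in the results.
interquartile_ratio = rr_p75 / rr_p25
```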
94
Khoury MJ, James LM, Flanders WD, Erickson JD. Interpretation of recurring weak associations obtained from epidemiologic studies of suspected human teratogens. Teratology 1992; 46:69-77. [PMID: 1641813] [DOI: 10.1002/tera.1420460110]
Abstract
Epidemiological studies of suspected human teratogens not infrequently lead to recurring weak or moderate associations (relative risks or odds ratios ranging from greater than 1 to 3 for adverse effects and from 1/3 to less than 1 for protective effects) between specific defects and prenatal exposures. Examples of such associations include cigarette smoking and oral clefts (odds ratios between 1 and 2) and periconceptional multivitamin/folic acid supplementation and neural tube defects (odds ratios from 1/3 to 1). In this paper, we illustrate that low relative risk recurring in well-designed studies may reflect underlying biologic mechanisms and should not be readily dismissed. Low relative risks could be the result of a combination of the following factors: 1) unmeasured confounding, 2) exposure misclassification (often related to the inability to pinpoint relevant dose and timing), 3) outcome misclassification (related to the etiologic heterogeneity of birth defects), 4) biologic interactions (related to teratogenic effects in population subgroups defined by genetic susceptibility or the presence of other exposures), and 5) differential prenatal survival (related to the combined impact of the exposure and the defect on prenatal survival). These issues can be addressed in epidemiologic studies by using biological markers of exposure and susceptibility, dysmorphologic evaluation of affected infants, subgroup analysis for etiologic heterogeneity, a search for biologic interactions, and the use of prospective cohort studies. Finally, low relative risks in the face of common exposures can reflect an important public health contribution of the exposure to the occurrence of the defect in the population.
Affiliation(s)
- M J Khoury
- Birth Defects and Genetic Diseases Branch, National Center for Environmental Health and Injury Control, Centers for Disease Control, Atlanta, Georgia 30333
95
Abstract
Interpretation of observational studies is difficult, particularly in cross-sectional studies, because the direction of cause and effect may be difficult to assess: Did the "outcome" affect the measured exposure level, or did the exposure affect the outcome? In this paper, the authors describe a pattern, the "checkmark pattern," which can arise in cross-sectional studies. This pattern is characterized by higher levels of the outcome in an unexposed comparison group than in some subgroups of the exposed. The pattern, if seen in certain types of observational studies, suggests that the "outcome" variable may have affected the measured exposure level. Recognition of the pattern may help the epidemiologist to decipher the causal sequence. Two examples illustrate the issues involved.
Affiliation(s)
- W D Flanders
- Emory University School of Public Health, Atlanta, GA
96
97
O'Brien TR, Flanders WD, Decoufle P. Use of the Mantel-Haenszel chi-square overestimates precision in studies with sparse data. J Occup Med 1991; 33:1081-3. [PMID: 1753307]
Abstract
Probability values or "test-based" confidence limits computed on the basis of the Mantel-Haenszel chi-square statistic may be invalid if minimum cell-size requirements are not met. In 30 studies of occupational proportionate mortality published from 1985 through 1987, the Mantel-Haenszel chi-square was used in potential violation of cell-size requirements in 21 studies. Sixteen (76%) of these included at least one value that, when compared with an exact test based on the Poisson distribution, was erroneously reported as statistically significant at the 0.05 level. We conclude that by using the Mantel-Haenszel chi-square with sparse data, some epidemiologists overestimate precision.
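A minimal sketch of the problem the authors describe, using hypothetical counts: when the expected count is small, a "test-based" p-value from the normal approximation can cross the 0.05 threshold even though an exact Poisson test does not.

```python
import math

# Hypothetical sparse-data example, in the spirit of the
# proportionate-mortality studies discussed above.
observed = 3    # observed deaths from a given cause
expected = 0.8  # expected deaths -- too small for the chi-square approximation

# "Test-based" two-sided p-value from the normal approximation
# z = (O - E) / sqrt(E), as used with a Mantel-Haenszel-style chi-square.
z = (observed - expected) / math.sqrt(expected)
p_normal = 2.0 * (1.0 - 0.5 * (1.0 + math.erf(z / math.sqrt(2.0))))

def poisson_upper_tail(k, lam):
    """P(X >= k) for X ~ Poisson(lam), computed from the lower tail."""
    lower = sum(math.exp(-lam) * lam ** i / math.factorial(i) for i in range(k))
    return 1.0 - lower

# Exact two-sided Poisson p-value (doubling the upper tail).
p_exact = min(1.0, 2.0 * poisson_upper_tail(observed, expected))
```

With these counts, p_normal is about 0.014 while p_exact is about 0.095: the approximate test reports significance at the 0.05 level that the exact test does not support, the error pattern the paper documents.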
Affiliation(s)
- T R O'Brien
- Centers for Disease Control, Public Health Service, Atlanta, Ga 30333
98
Abstract
Nested case-control studies, or case-control studies within a cohort, combine the advantages of cohort studies with the efficiency of case-control studies. Case-control studies can often be viewed as having two stages; the first stage consists of vital status, disease, and basic covariate ascertainment, and the second stage consists of detailed covariate and exposure ascertainment. Breslow and Cain (1988) and Breslow and Zhao (1988) recently showed that conventional analyses of such two-stage studies may ignore some of the available information. In this paper, we show how one can adapt the pseudo-likelihood analyses developed by Kalbfleisch and Lawless (1988) to the analysis of data from two-stage case-control studies.
Affiliation(s)
- W D Flanders
- Department of Epidemiology and Biostatistics, Emory University School of Medicine, Atlanta, GA 30329
99
Mili F, Flanders WD, Boring JR, Annest JL, Destefano F. The associations of race, cigarette smoking, and smoking cessation to measures of the immune system in middle-aged men. Clin Immunol Immunopathol 1991; 59:187-200. [PMID: 2009639] [DOI: 10.1016/0090-1229(91)90017-5]
Abstract
To estimate the association between the immunologic responses of the cell-mediated and humoral systems and race or tobacco smoking, we used data from the Vietnam Experience Study conducted by the Centers for Disease Control. That study, done from 1985 to 1986, was based on a random sample of 4462 male, Vietnam-era, U.S. veterans. Racial groups were white, black, Hispanic, Asian, and American Indian. We used linear regression to evaluate how (i) the race of the subjects, (ii) the number of pack-years of cigarettes the subjects smoked, and (iii) the smoking cessation of certain subjects were associated with their relative and absolute T, B, CD4, and CD8 lymphocyte counts and immunoglobulin A (IgA), IgM, and IgG levels. The results indicated that immune status was associated with race and smoking history and that, generally, the associations remained after adjustment for covariates. For example, the average IgA level and absolute CD8 lymphocyte count for blacks were, respectively, 19 and 16% higher than those for whites. On the other hand, smokers had lower immunoglobulin levels and relative CD8 cell counts and higher counts for other lymphocytes of the cell-mediated system than nonsmokers. For example, the average absolute B count of heavy smokers was 37% higher than that of nonsmokers. The pattern after cigarette smoking cessation was consistent with a reversible effect of smoking and a return toward immune levels of nonsmokers.
Affiliation(s)
- F Mili
- Centers for Disease Control, Public Health Service, U.S. Department of Health and Human Services, Atlanta, Georgia 30333
100
Steinberg KK, Thacker SB, Smith SJ, Stroup DF, Zack MM, Flanders WD, Berkelman RL. A meta-analysis of the effect of estrogen replacement therapy on the risk of breast cancer. JAMA 1991; 265:1985-90. [PMID: 1826136]
Abstract
To quantify the effect of estrogen replacement therapy on breast cancer risk, we combined dose-response slopes of the relative risk of breast cancer against the duration of estrogen use across 16 studies. Using this summary dose-response slope, we calculated the proportional increase in risk of breast cancer for each year of estrogen use. For women who experienced any type of menopause, risk did not appear to increase until after at least 5 years of estrogen use. After 15 years of estrogen use, we found a 30% increase in the risk of breast cancer (relative risk, 1.3; 95% confidence interval [CI], 1.2 to 1.6). The increase in risk was largely due to results of studies that included premenopausal women or women using estradiol (with or without progestin), studies for which the estimated relative risk was 2.2 (CI, 1.4 to 3.4) after 15 years. Among women with a family history of breast cancer, those who had ever used estrogen replacement had a significantly higher risk (3.4; CI, 2.0 to 6.0) than those who had not (1.5; CI, 1.2 to 1.7).
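The pooling step described above can be sketched as an inverse-variance weighted average of study-specific dose-response slopes; the slopes and standard errors below are illustrative, not the 16 studies actually combined in the article.

```python
import math

# Illustrative log-relative-risk slopes per year of estrogen use
# and their standard errors -- hypothetical values.
slopes = [0.020, 0.015, 0.030, 0.010]
ses = [0.010, 0.008, 0.015, 0.012]

# Fixed-effect (inverse-variance) pooled slope and standard error.
weights = [1.0 / se ** 2 for se in ses]
pooled_slope = sum(w * b for w, b in zip(weights, slopes)) / sum(weights)
pooled_se = math.sqrt(1.0 / sum(weights))

# Project the pooled slope to 15 years of use to get a relative risk
# with a 95% confidence interval, as in the abstract's summary measure.
years = 15
rr_15y = math.exp(pooled_slope * years)
ci_low = math.exp((pooled_slope - 1.96 * pooled_se) * years)
ci_high = math.exp((pooled_slope + 1.96 * pooled_se) * years)
```

The pooled standard error is smaller than that of any single study, which is the main gain of combining dose-response slopes across studies.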
Affiliation(s)
- K K Steinberg
- Center for Environmental Health and Injury Control, Centers for Disease Control, Atlanta, GA 30333