1. A Novel Approach for Conducting a Catchment Area Analysis of Breast Cancer by Age and Stage for a Community Cancer Center. Cancer Epidemiol Biomarkers Prev 2024; 33:646-653. [PMID: 38451180 PMCID: PMC11062816 DOI: 10.1158/1055-9965.epi-23-1125]
Abstract
BACKGROUND The U.S. Preventive Services Task Force recently issued an updated draft recommendation statement to initiate breast cancer screening at age 40, reflecting well-documented disparities in breast cancer-related mortality that disproportionately impact younger Black women. This study applied a novel approach to identify hotspots of breast cancer diagnosed before age 50 and/or at an advanced stage to improve breast cancer detection within these communities. METHODS Cancer registry data for 3,497 women with invasive breast cancer diagnosed or treated between 2012 and 2020 at the Helen F. Graham Cancer Center and Research Institute (HFGCCRI) and who resided in the HFGCCRI catchment area, defined as New Castle County, Delaware, were geocoded and analyzed using spatial intensity methods. Standardized incidence ratios stratified by age and race were calculated for each hotspot. RESULTS Four hotspots were identified: two for breast cancer diagnosed before age 50, one for advanced breast cancer, and one for advanced breast cancer diagnosed before age 50. Younger Black women were overrepresented in these hotspots relative to the full catchment area. CONCLUSIONS The novel use of spatial methods to analyze a community cancer center catchment area identified geographic areas with higher rates of breast cancer with poor prognostic factors and evidence that these areas made an outsized contribution to racial disparities in breast cancer. IMPACT Identifying and prioritizing hotspot breast cancer communities for community outreach and engagement activities designed to improve breast cancer detection have the potential to reduce the overall burden of breast cancer and narrow racial disparities in breast cancer.
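The standardized incidence ratios used here reduce to observed-over-expected case counts, with expected counts obtained by applying catchment-wide reference rates to each hotspot's population. A minimal sketch with entirely hypothetical counts and rates (the paper's strata and data are not shown):

```python
import pandas as pd

# Hypothetical age-stratified data for one hotspot; reference rates are
# catchment-wide cases per person, by age group.
hotspot = pd.DataFrame({
    "age_group": ["<50", "50+"],
    "observed_cases": [38, 61],
    "population": [12000, 15000],
})
reference_rate = pd.Series({"<50": 0.0019, "50+": 0.0035})

# Expected cases = reference rate x hotspot population, summed over strata.
expected = (hotspot.set_index("age_group")["population"] * reference_rate).sum()
observed = hotspot["observed_cases"].sum()

sir = observed / expected  # SIR > 1: more cases than the catchment rates predict
print(f"SIR = {sir:.2f} (observed {observed}, expected {expected:.1f})")
```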
2. Disparities in Cancer Stage Outcomes by Catchment Areas for a Comprehensive Cancer Center. JAMA Netw Open 2024; 7:e249474. [PMID: 38696166 PMCID: PMC11066700 DOI: 10.1001/jamanetworkopen.2024.9474]
Abstract
Importance The National Cancer Institute comprehensive cancer centers (CCCs) lack spatial and temporal evaluation of their self-designated catchment areas. Objective To identify disparities in cancer stage at diagnosis within and outside a CCC's catchment area across a 10-year period using spatial and statistical analyses. Design, Setting, and Participants This cross-sectional, population-based study conducted between 2010 and 2019 utilized cancer registry data for the Johns Hopkins Sidney Kimmel CCC (SKCCC). Eligible participants included patients with cancer in the contiguous US who received treatment for cancer, a diagnosis of cancer, or both at SKCCC. Patients were geocoded to zip code tabulation areas (ZCTAs). Individual-level variables included sociodemographic characteristics, smoking and alcohol use, treatment type, cancer site, and insurance type. Data analysis was performed between March and July 2023. Exposures Distances between SKCCC and ZCTAs were computed to generate a catchment area of the closest 75% of patients and outer zones in 5% increments for comparison. Main Outcomes and Measures The primary outcome was cancer stage at diagnosis, defined as early-stage, late-stage, or unknown stage. Multinomial logistic regression was used to determine associations of catchment area with stage at diagnosis. Results This study had a total of 94 007 participants (46 009 male [48.94%] and 47 998 female [51.06%]; 30 195 aged 22-45 years [32.12%]; 4209 Asian [4.48%]; 2408 Hispanic [2.56%]; 16 004 non-Hispanic Black [17.02%]; 69 052 non-Hispanic White [73.45%]; and 2334 with other or unknown race or ethnicity [2.48%]), including 47 245 patients (50.26%) who received a diagnosis of early-stage cancer, 19 491 (20.73%) who received a diagnosis of late-stage cancer, and 27 271 (29.01%) with unknown stage. Living outside the main catchment area was associated with higher odds of late-stage cancers for those who received only a diagnosis (odds ratio [OR], 1.50; 95% CI, 1.10-2.05) or only treatment (OR, 1.44; 95% CI, 1.28-1.61) at SKCCC. Non-Hispanic Black patients (OR, 1.16; 95% CI, 1.10-1.23) and those with Medicaid (OR, 1.65; 95% CI, 1.46-1.86) and no insurance at time of treatment (OR, 2.12; 95% CI, 1.79-2.51) also had higher odds of receiving a late-stage cancer diagnosis. Conclusions and Relevance In this cross-sectional study of CCC data from 2010 to 2019, patients residing outside the main catchment area, non-Hispanic Black patients, and patients with Medicaid or no insurance had higher odds of late-stage diagnoses. These findings suggest that disadvantaged populations and those living outside of the main catchment area of a CCC may face barriers to screening and treatment. Care-sharing agreements among CCCs could address these issues.
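The multinomial model described here treats one stage category as the reference outcome. A minimal statsmodels sketch with synthetic data and hypothetical variable names standing in for the registry extract:

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical patient-level data: stage (0=early, 1=late, 2=unknown) and an
# indicator for residence outside the main catchment area.
df = pd.DataFrame({
    "stage": [0, 1, 2, 0, 1, 0, 2, 1, 0, 0] * 30,
    "outside_catchment": [0, 1, 0, 0, 0, 1, 1, 1, 0, 0] * 30,
    "age": [44, 67, 58, 51, 72, 63, 49, 70, 55, 60] * 30,
})

X = sm.add_constant(df[["outside_catchment", "age"]])
fit = sm.MNLogit(df["stage"], X).fit(disp=0)

# Columns give log-odds of late and unknown stage relative to early stage;
# exponentiating yields odds ratios like those reported in the abstract.
print(fit.params)
```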
3. Mapping Cumulative Risk in Delaware: Approach and Implications for Health Equity. Journal of Public Health Management and Practice 2024; 30:E112-E123. [PMID: 38320288 PMCID: PMC11009089 DOI: 10.1097/phh.0000000000001859]
Abstract
BACKGROUND Addressing health equity requires attention to upstream determinants of health, including environmental and social factors that act in tandem to increase communities' exposure to and vulnerability to toxicants. Cumulative risk assessment, which evaluates combined risks from environmental and social factors, is a useful approach for estimating potential drivers of health disparities. We developed a cumulative risk score of multiple indices of environmental and social conditions and assessed block group-level differences in New Castle County, Delaware. METHODS This cross-sectional study used choropleth maps to visualize the distribution of environmental, social, and cumulative risks and Moran's I statistics to assess spatial clustering of cumulative risk across the county and among individual block groups. RESULTS Findings indicate that environmental risk rarely occurs without social risk and that environmental and social risks co-occur in distinct areas, resulting in large-scale clustering of cumulative risk. Areas of higher cumulative risk had more Black residents and people of lower socioeconomic status. CONCLUSIONS Replicable measures of cumulative risk can show how environmental and social risks are inequitably distributed by race and socioeconomic status, as seen here in New Castle County. Such measures can support upstream approaches to reduce health disparities resulting from histories of environmental racism.
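For the clustering step, global and local Moran's I statistics are the standard tools; a minimal libpysal/esda sketch, assuming a hypothetical block-group shapefile with a cum_risk column (the paper's exact inputs are not shown):

```python
import geopandas as gpd
from libpysal.weights import Queen
from esda.moran import Moran, Moran_Local

bg = gpd.read_file("block_groups.shp")   # hypothetical path and columns
w = Queen.from_dataframe(bg)             # contiguity-based spatial weights
w.transform = "r"                        # row-standardize

# Global Moran's I: is cumulative risk clustered county-wide?
mi = Moran(bg["cum_risk"].values, w)
print(mi.I, mi.p_sim)

# Local Moran's I: which individual block groups sit in high-high clusters?
lisa = Moran_Local(bg["cum_risk"].values, w)
bg["high_high"] = (lisa.q == 1) & (lisa.p_sim < 0.05)
```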
4. Low-Dose Radiation Associated Mortality Risks of Site-Specific Solid Tumors in U.S. Shipyard Workers. J Occup Environ Med 2024:00043764-990000000-00531. [PMID: 38527177 DOI: 10.1097/jom.0000000000003099]
Abstract
BACKGROUND U.S. nuclear capable shipyard workers have increased potential for occupational radiation exposure. OBJECTIVE To examine solid tumor mortality risks at low doses. METHODS A total of 437,937 workers employed from 1957 to 2004 at eight U.S. shipyards were studied. RESULTS Radiation workers with a median lifetime dose of 0.82 mSv had a significantly lower solid tumor mortality risk [relative risk (RR): 0.96, 95% confidence interval (CI): 0.94-0.98] than non-radiation workers. Among 153,930 radiation workers, the RRs of solid tumors increased with increasing dose categories without statistical significance. The >0 to <25 mSv dose category had a significantly lower RR (0.95, 95% CI: 0.91-0.99) vs. zero dose, and the excess relative risk was 0.05/100 mSv (95% CI: 0.01-0.08). CONCLUSION Solid tumor risk might increase with radiation dose, but not linearly at low doses. Actual mortality risk may depend on the dose received.
5. The Burden of Back and Neck Strains and Sprains in Professional Baseball Players. Clin Spine Surg 2024:01933606-990000000-00259. [PMID: 38366348 DOI: 10.1097/bsd.0000000000001579]
Abstract
STUDY DESIGN A retrospective case series study. OBJECTIVE To analyze the epidemiology of diagnoses of back and neck strains and sprains among Major League (MLB) and Minor League (MiLB) Baseball players. BACKGROUND Baseball players perform unique sets of repetitive movements that may predispose them to neck and back strains and sprains. Data are lacking concerning the epidemiology of these diagnoses in this population. MATERIALS AND METHODS De-identified data on neck/back strains and sprains were collected from all MLB and MiLB teams from 2011 to 2016 using the MLB-commissioned Health and Injury Tracking System database. Diagnosis rates of conditions related to cervical, thoracic, and lumbar musculature and their impact on days missed due to injury, player participation, and season- or career-ending status were assessed. Injury rates were reported as injuries per 1000 athlete exposures (AEs). RESULTS There were 3447 cases of neck/back strains and sprains in professional baseball players from 2011 to 2016: 721 occurred in MLB versus 2726 in MiLB. Of these injuries, 136 were season-ending (26 in MLB, 110 in MiLB) and 22 were career-ending (2 in MLB, 20 in MiLB). The total days missed were 39,118 (8838 from MLB and 30,280 from MiLB). Excluding season- or career-ending injuries, the mean days missed were 11.8 (12.7 and 11.6 in MLB and MiLB, respectively). The median days missed were 4 (3 and 5 in MLB and MiLB, respectively). Combining MLB and MiLB, the pitcher injury rate was 1.893 per 1000 AEs versus 0.743 per 1000 AEs for other position players (P < 0.0001). CONCLUSION There was a high incidence of neck/back strains and sprains in MLB and MiLB players, with nearly 40,000 aggregate days missed in our 6-year study period. The median days missed were lower than the mean days missed, indicating a right-skewed distribution driven by outliers. Pitchers had over double the rate of injuries compared with other position players. LEVEL OF EVIDENCE Level III.
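The reported rates are simple arithmetic: injuries divided by athlete-exposures, scaled by 1000. A toy check with made-up exposure counts chosen to roughly reproduce the published rates (the abstract itself reports only the rates):

```python
# Hypothetical counts; rate per 1000 AEs = injuries / exposures * 1000.
pitcher_injuries, pitcher_aes = 1200, 634_000
other_injuries, other_aes = 2247, 3_024_000

pitcher_rate = pitcher_injuries / pitcher_aes * 1000   # ~1.893
other_rate = other_injuries / other_aes * 1000         # ~0.743
print(f"rate ratio = {pitcher_rate / other_rate:.2f}")  # pitchers vs. others
```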
6. Evaluating the performance of Bayesian geostatistical prediction with physical barriers in the Chesapeake Bay. Environmental Monitoring and Assessment 2024; 196:255. [PMID: 38345642 DOI: 10.1007/s10661-024-12401-y]
Abstract
The Chesapeake Bay is one of the most widely studied bodies of water in the United States and around the world. Routine monitoring of water quality indicators (e.g., salinity) relies on fixed sampling stations throughout the Bay. Utilizing this rich monitoring data, various methods produce surface predictions of water quality indicators to further characterize the health of the Bay as well as to support wildlife and human health research studies. Bayesian approaches for geostatistical modelling are becoming increasingly popular and can be preferred over frequentist approaches because full and exact inference can be computed, along with more accurate characterization of uncertainty. Traditional geostatistical prediction methods assume a Euclidean distance between two points when characterizing spatial dependence as a function of distance. However, Euclidean approaches may not be appropriate in estuarine environments when water-land boundaries are crossed during the modelling process. In this study, we compare stationary and barrier INLA geostatistical models with a classic kriging geostatistical model to predict salinity in the Chesapeake Bay during 4 months in 2019. Cross-validation is conducted for each approach to evaluate model performance based on prediction accuracy and precision. The results provide evidence that the two Bayesian-based models outperformed ordinary kriging, especially when examining prediction accuracy (most notably in the tributaries). We also suggest that the non-Euclidean model accounts for the appropriate water-based distances between sampling locations and is likely better at characterizing the uncertainty. However, more complex bodies of water may better showcase the capabilities and efficacy of the physical barrier INLA model.
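For the classical baseline compared here, a minimal ordinary-kriging sketch using the pykrige package with synthetic station data; the stationary and barrier models are typically fit with R-INLA and are not shown:

```python
import numpy as np
from pykrige.ok import OrdinaryKriging

# Hypothetical monitoring stations: lon, lat, and salinity (ppt).
rng = np.random.default_rng(0)
lon = rng.uniform(-77.0, -76.0, 50)
lat = rng.uniform(37.0, 39.5, 50)
sal = 15 + 5 * (lat - 37.0) + rng.normal(0, 0.5, 50)

ok = OrdinaryKriging(lon, lat, sal, variogram_model="spherical")

# Predict at a held-out location; ss is the kriging variance (uncertainty).
z_pred, ss = ok.execute("points", np.array([-76.4]), np.array([38.2]))
print(float(z_pred[0]), float(ss[0]))
```

The straight-line distances implicit in this baseline are exactly the assumption the barrier model relaxes when predictions would otherwise cross land.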
7. Geographic variation in HCV treatment penetration among people who inject drugs in Baltimore, MD. J Viral Hepat 2023; 30:810-818. [PMID: 37382024 PMCID: PMC10527489 DOI: 10.1111/jvh.13864]
Abstract
We evaluated geographic heterogeneity in hepatitis C virus (HCV) treatment penetration among people who inject drugs (PWID) across Baltimore, MD since the advent of direct-acting antivirals (DAAs) using space-time clusters of HCV viraemia. Using data from a community-based cohort of PWID, the AIDS Linked to the IntraVenous Experience (ALIVE) study, we identified space-time clusters with higher-than-expected rates of HCV viraemia between 2015 and 2019 using scan statistics. We used Poisson regression to identify covariates associated with HCV viraemia and used the regression-fitted values to detect adjusted space-time clusters of HCV viraemia in Baltimore city. Overall, in the cohort, HCV viraemia fell from 77% in 2015 to 64%, 49%, 39% and 36% from 2016 to 2019. In Baltimore city, the percentage of census tracts where prevalence of HCV viraemia was ≥85% dropped from 57% to 34%, 25%, 22% and 10% from 2015 to 2019. We identified two clusters of higher-than-expected HCV viraemia in the unadjusted analysis that lasted from 2015 to 2017 in East and West Baltimore and one adjusted cluster of HCV viraemia in West Baltimore from 2015 to 2016. Differences in age, sex, race, HIV status, and neighbourhood deprivation did not explain the significant space-time clusters. However, residing in a cluster with higher-than-expected viraemia was associated with age, sex, educational attainment and higher levels of neighbourhood deprivation. Nearly 4 years after DAAs became available, HCV treatment has penetrated all PWID communities across Baltimore city. While nearly all census tracts experienced improvements, change was more gradual in areas with higher levels of poverty.
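One way to generate the regression-fitted values used for adjusted cluster detection is a Poisson model with a log-offset for the number tested; a minimal sketch with hypothetical tract-level data (the study's covariates and scan software are not shown):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical tract-level data: viraemic count, persons tested, covariates.
df = pd.DataFrame({
    "viraemic": [41, 12, 30, 7, 25],
    "tested": [60, 35, 44, 28, 40],
    "median_age": [44, 39, 47, 36, 42],
    "deprivation": [0.8, 0.2, 0.6, 0.1, 0.5],
})

X = sm.add_constant(df[["median_age", "deprivation"]])
model = sm.GLM(df["viraemic"], X, family=sm.families.Poisson(),
               offset=np.log(df["tested"])).fit()

# Covariate-adjusted expected counts feed the adjusted space-time scan.
df["expected"] = model.fittedvalues
print(model.params)
```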
8. Geographic Disparities in Potential Accessibility to Gynecologic Oncologists in the United States From 2001 to 2020. Obstet Gynecol 2023; 142:688-697. [PMID: 37535956 DOI: 10.1097/aog.0000000000005284]
Abstract
OBJECTIVE To use a spatial modeling approach to capture potential disparities of gynecologic oncologist accessibility in the United States at the county level between 2001 and 2020. METHODS Physician registries identified the 2001-2020 gynecologic oncology workforce and were aggregated to each county. The at-risk cohort (women aged 18 years or older) was stratified by race and ethnicity and rurality demographics. We computed the distance from at-risk women to physicians. Relative access scores were computed by a spatial model for each contiguous county. Access scores were compared across urban or rural status and racial and ethnic groups. RESULTS Between 2001 and 2020, the gynecologic oncologist workforce increased. By 2020, there were 1,178 active physicians and 98.3% practiced in urban areas (37.3% of all counties). Geographic disparities were identified, with 1.09 physicians per 100,000 women in urban areas compared with 0.1 physicians per 100,000 women in rural areas. In total, 2,862 counties (57.4 million at-risk women) lacked an active physician. Additionally, there was no increase in rural physicians, with only 1.7% practicing in rural areas in 2016-2020 relative to 2.2% in 2001-2005 ( P =.35). Women in racial and ethnic minority populations, such as American Indian or Alaska Native and Hispanic women, exhibited the lowest level of access to physicians across all time periods. For example, 23.7% of American Indian or Alaska Native women did not have access to a physician within 100 miles between 2016 and 2020, which did not improve over time. Non-Hispanic Black women experienced an increase in relative accessibility, with a 26.2% increase by 2016-2020. However, Asian or Pacific Islander women exhibited significantly better access than non-Hispanic White, non-Hispanic Black, Hispanic, and American Indian or Alaska Native women across all time periods. CONCLUSION Although the U.S. gynecologic oncologist workforce increased steadily over 20 years, this has not translated into evidence of improved access for many women from rural and underrepresented areas. However, health care utilization and cancer outcomes may not be influenced only by distance and availability. Policies and pipeline programs are needed to address these inequities in gynecologic cancer care.
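The abstract does not name the spatial access model; a two-step floating catchment area (2SFCA) score is one common formulation, sketched here with entirely hypothetical distances, capacities, and populations:

```python
import numpy as np

# 3 physicians (rows) x 4 counties (cols): distances in miles, physician
# capacity, and counts of at-risk women. All numbers are made up.
dist = np.array([[10., 40., 90., 150.],
                 [30., 20., 60., 120.],
                 [80., 70., 30., 200.]])
capacity = np.array([1.0, 1.0, 1.0])
women = np.array([50_000, 80_000, 20_000, 10_000])
within = dist <= 100.0                        # reachable within 100 miles

# Step 1: provider-to-population ratio within each physician's catchment.
ratio = capacity / (within * women).sum(axis=1)

# Step 2: each county sums the ratios of its reachable physicians.
access = (within * ratio[:, None]).sum(axis=0)
print(access)  # counties with no reachable physician score 0
```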
9. Neighborhood factors and triple negative breast cancer: The role of cumulative exposure to area-level risk factors. Cancer Med 2023. [PMID: 36916687 DOI: 10.1002/cam4.5808]
Abstract
BACKGROUND Despite similar incidence rates among Black and White women, breast cancer mortality rates are 40% higher among Black women. More than half of the racial difference in breast cancer mortality can be attributed to triple negative breast cancer (TNBC), an aggressive subtype of invasive breast cancer that disproportionately affects Black women. Recent research has implicated neighborhood conditions in the etiology of TNBC. This study investigated the relationship between cumulative neighborhood-level exposures and TNBC risk. METHODS This single-institution retrospective study was conducted on a cohort of 3316 breast cancer cases from New Castle County, Delaware (from 2012 to 2020), an area of the country with elevated TNBC rates. Cases were stratified into TNBC and "Non-TNBC" diagnosis and geocoded by residential address. Neighborhood exposures included census tract-level measures of unhealthy alcohol use, metabolic dysfunction, breastfeeding, and environmental hazards. An overall cumulative risk score was calculated based on tract-level exposures. RESULTS Univariate analyses showed each tract-level exposure was associated with greater TNBC odds. In multivariate analyses that controlled for patient-level race and age, tract-level exposures were not associated with TNBC odds. However, in a second multivariate model that included patient-level variables and considered tract-level risk factors as a cumulative exposure risk score, each one unit increase in cumulative exposure was significantly associated with a 10% increase in TNBC odds. Higher cumulative exposure risk scores were found in census tracts with relatively high proportions of Black residents. CONCLUSIONS Cumulative exposure to neighborhood-level risk factors that disproportionately affect Black communities was associated with greater TNBC risk.
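A minimal sketch of a cumulative exposure score entering a logistic model, with synthetic data and hypothetical variable names standing in for the standardized tract-level measures:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic case-level data: TNBC indicator, age, and tract-level exposures
# already standardized to z-scores (names are placeholders).
rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "tnbc": rng.binomial(1, 0.15, n),
    "age": rng.normal(58, 12, n),
    "z_alcohol": rng.normal(0, 1, n),
    "z_metabolic": rng.normal(0, 1, n),
    "z_environment": rng.normal(0, 1, n),
})

# Cumulative risk score: sum of the standardized tract-level exposures.
df["cum_score"] = df[["z_alcohol", "z_metabolic", "z_environment"]].sum(axis=1)

fit = smf.logit("tnbc ~ cum_score + age", data=df).fit(disp=0)
print(np.exp(fit.params["cum_score"]))  # OR per 1-unit increase in the score
```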
10. Agreement in extreme precipitation exposure assessment is modified by race and social vulnerability. Frontiers in Epidemiology 2023; 3:1128501. [PMID: 38455887 PMCID: PMC10911001 DOI: 10.3389/fepid.2023.1128501]
Abstract
Epidemiologic investigations of extreme precipitation events (EPEs) often rely on observations from the nearest weather station to represent individuals' exposures, and due to structural factors that determine the siting of weather stations, levels of measurement error and misclassification bias may differ by race, class, and other measures of social vulnerability. Gridded climate datasets provide higher spatial resolution that may improve measurement error and misclassification bias. However, similarities in the ability to identify EPEs among these types of datasets have not been explored. In this study, we characterize the overall and temporal patterns of agreement among three commonly used meteorological data sources in their identification of EPEs in all census tracts and counties in the conterminous United States over the 1991-2020 U.S. Climate Normals period and evaluate the association between sociodemographic characteristics with agreement in EPE identification. Daily precipitation measurements from weather stations in the Global Historical Climatology Network (GHCN) and gridded precipitation estimates from the Parameter-elevation Relationships on Independent Slopes Model (PRISM) and the North American Land Data Assimilation System (NLDAS) were compared in their ability to identify EPEs defined as the top 1% of precipitation events or daily precipitation >1 inch. Agreement among these datasets is fair to moderate from 1991 to 2020. There are spatial and temporal differences in the levels of agreement between ground stations and gridded climate datasets in their detection of EPEs in the United States from 1991 to 2020. Spatial variation in agreement is most strongly related to a location's proximity to the nearest ground station, with areas furthest from a ground station demonstrating the lowest levels of agreement. These areas have lower socioeconomic status, a higher proportion of Native American population, and higher social vulnerability index scores. The addition of ground stations in these areas may increase agreement, and future studies intending to use these or similar data sources should be aware of the limitations, biases, and potential for differential misclassification of exposure to EPEs. Most importantly, vulnerable populations should be engaged to determine their priorities for enhanced surveillance of climate-based threats so that community-identified needs are met by any future improvements in data quality.
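Agreement between binary EPE indicators from two sources is often summarized with Cohen's kappa; a minimal sketch with simulated daily flags (whether the study used kappa or another agreement statistic is not stated in the abstract):

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score

# Simulated daily EPE flags (1 = extreme precipitation day) for one location:
# a ground station versus a gridded product that half-tracks the station.
rng = np.random.default_rng(2)
station = rng.binomial(1, 0.01, 10_000)
gridded = np.where(rng.random(10_000) < 0.5, station,
                   rng.binomial(1, 0.01, 10_000))

kappa = cohen_kappa_score(station, gridded)
print(f"kappa = {kappa:.2f}")  # ~0.5: 'moderate' on the Landis-Koch scale
```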
11. Geospatial analysis of firearm injuries in an urban setting: Individual rather than community characteristics affect firearm injury risk. Am J Surg 2023; 225:1062-1068. [PMID: 36702734 DOI: 10.1016/j.amjsurg.2023.01.014]
Abstract
BACKGROUND The relationship between individual/socioeconomic characteristics and firearm injury risk in an urban center was evaluated. METHODS A hospital registry was used to identify individuals in Baltimore City who experienced interpersonal firearm injury in 2019 (FA). Injuries that did not satisfy this criterion were used as a comparison group (NF). Socioeconomic characteristics were linked to home address at the block group level. Regression analysis was used to determine predictors of firearm injury. Clusters of high and low firearm relative to non-firearm injuries were identified. RESULTS A total of 1293 individuals were included (FA = 277, NF = 1016). The FA group lived in communities with lower income (p = 0.005), higher poverty (p = 0.007), and more Black residents (p < 0.001). Individual-level factors were stronger predictors of firearm injury than community factors on multivariate regression, with Black race associated with 5-fold higher odds of firearm injury (p < 0.001). Firearm injury clustered in areas of low socioeconomic status. CONCLUSIONS Individual factors have a greater influence on firearm injury risk than community factors. Prevention efforts should target young, Black men in urban centers.
12. Low-Dose Radiation Risks of Lymphohematopoietic Cancer Mortality in U.S. Shipyard Workers. Radiat Res 2022:489323. [PMID: 36520982 DOI: 10.1667/rade-22-00092.1]
Abstract
The linear, non-threshold (LNT) hypothesis of cancer induction derived from studies of populations exposed to moderate-to-high acute radiation doses may not be indicative of cancer risks associated with lifetime radiation exposures less than 100 mSv. The objective of this study was to examine risks and dose-response patterns of lymphohematopoietic cancer (LHC) and its types associated with low radiation exposure while adjusting for possible confounding factors. A retrospective cohort of 437,937 U.S. nuclear shipyard workers (153,930 radiation and 284,007 non-radiation workers) was followed from 1957 to 2011, with 3,699 LHC deaths observed. The risk of LHC in radiation workers was initially compared to the risk in non-radiation workers. Time-dependent accumulated radiation dose, lagged 2 years, was used in categorical and continuous dose analyses among radiation workers to examine LHC risks and possible dose-response relationships based on Poisson regression models. These analyses controlled for sex, race, time-dependent age, calendar time, socioeconomic status, solvent-related last job, and age at first hire. The median lifetime radiation dose for the radiation worker population was 0.82 mSv and the 95th percentile dose was 83.63 mSv. The study shows: (1) LHC mortality for radiation workers was significantly lower than for non-radiation workers [relative risk (RR): 0.927; 95% confidence interval (95% CI): 0.865, 0.992; P = 0.030]. Among LHC types, the risks for lymphoid leukemia and lymphomas in radiation workers were significantly lower than in non-radiation workers, while the risks for the remaining LHC types did not show any statistically significant difference. (2) In the categorical dose analysis among radiation workers, the sample-size-weighted linear trends of relative risks (RRs) for LHC and its types across five dose categories (>0 to <25, 25 to <50, 50 to <100, 100 to <200, and ≥200 mSv) vs. 0 mSv were not statistically significant, although there was an elevation of RR for chronic myeloid leukemia only in the 50 to <100 mSv category (RR: 2.746; 95% CI: 1.002, 7.521; P = 0.049) vs. 0 mSv. (3) The Poisson regression analyses among radiation workers using time-dependent radiation dose as a continuous variable showed an excess relative risk (ERR) for LHC at 100 mSv of 0.094 (95% CI: -0.037, 0.225; P = 0.158) and, for leukemia excluding chronic lymphoid leukemia, of 0.178 (95% CI: -0.085, 0.440; P = 0.440). The ERRs and their linear trends for all other types were not statistically significant.
13. Temporal and Spatial Differences between Symptomatic and Asymptomatic Malaria Infections in the Chittagong Hill Districts, Bangladesh. Am J Trop Med Hyg 2022; 107:1210-1217. [PMID: 36122682 PMCID: PMC9768271 DOI: 10.4269/ajtmh.21-0121]
Abstract
Mapping asymptomatic malaria infections, which contribute to the transmission reservoir, is important for elimination programs. This analysis compared the spatiotemporal patterns of symptomatic and asymptomatic Plasmodium falciparum malaria infections in a cohort study of ∼25,000 people living in a rural hypoendemic area of about 179 km2 in a small area of the Chittagong Hill Districts of Bangladesh. Asymptomatic infections were identified by active surveillance; symptomatic clinical cases presented for care. Infections were identified by a positive rapid diagnostic test and/or microscopy. Fifty-three subjects with asymptomatic P. falciparum infection were compared with 572 subjects with symptomatic P. falciparum between mid-October 2009 and mid-October 2012 with regard to seasonality, household location, and extent of spatial clustering. We found increased spatial clustering of symptomatic compared with asymptomatic infections, and the areas of high intensity were only sometimes overlapping. Symptomatic cases had a distinct seasonality, unlike asymptomatic infections, which were detected year-round. In a comparison of 42 symptomatic Plasmodium vivax and 777 symptomatic P. falciparum cases from mid-October 2009 through mid-March 2015, we found substantial spatial overlap in areas with high infection rates, but the areas with the greatest concentration of infection differed. Detection of both symptomatic P. falciparum and symptomatic P. vivax infections was greater during the May-to-October high season, although a greater proportion of P. falciparum cases occurred during the high season compared with P. vivax. These findings reinforce that passive malaria surveillance and treatment of symptomatic cases will not eliminate the asymptomatic reservoirs that occur distinctly in time and space.
14. The association between vacant housing demolition and safety and health in Baltimore, MD. Prev Med 2022; 164:107292. [PMID: 36228876 DOI: 10.1016/j.ypmed.2022.107292]
Abstract
We measured the association between vacant housing demolitions and changes in crime and emergency department (ED) visits in Baltimore, MD. We included 646 block groups in Baltimore, 224 of which experienced at least one demolition from 2012 to 2019. The exposure was the number of demolitions completed in a block group during the previous quarter. Crime (all, property, and violent) and ED visits (all, adults, children, and for specific causes) were examined as the change in the rate per 1000 people from the previous quarter to the current quarter and analyzed using multivariable mixed effects regression models. Demolitions were associated with a small decrease in total ED visits (difference = -0.068 per 1000 people from the previous quarter to the current quarter, 95% CI -0.119, -0.018) but no significant change in crime. For each demolition, the rate of total child ED visits was 0.452 lower compared to the previous quarter (95% CI -0.777, -0.127). Demolitions were associated with small decreases in adult injury-related ED visits in the short term.
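A minimal sketch of a mixed-effects model of quarter-over-quarter rate changes with a block-group random intercept, using synthetic data and hypothetical variable names:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic block-group panel: change in ED visit rate per 1000 people from
# the previous quarter, and demolitions completed in the previous quarter.
df = pd.DataFrame({
    "block_group": ["a", "a", "b", "b", "c", "c", "d", "d"] * 10,
    "ed_rate_change": [0.1, -0.3, 0.0, -0.1, 0.2, -0.4, 0.05, -0.2] * 10,
    "lag_demolitions": [0, 2, 1, 0, 0, 3, 0, 1] * 10,
})

# The coefficient on lag_demolitions is the per-demolition change in the
# outcome, analogous to the differences reported in the abstract.
model = smf.mixedlm("ed_rate_change ~ lag_demolitions", df,
                    groups=df["block_group"]).fit()
print(model.params["lag_demolitions"])
```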
15. Pectoralis muscle injuries in Major and Minor League Baseball. J Shoulder Elbow Surg 2022; 31:e363-e368. [PMID: 35183743 DOI: 10.1016/j.jse.2022.01.134]
Abstract
BACKGROUND AND HYPOTHESIS Although shoulder and elbow injuries in professional baseball players have been thoroughly studied, little is known about the frequency and impact of pectoralis muscle injuries in this population. The purpose of this study was to use the official league injury surveillance system to describe pectoralis muscle injuries in professional baseball players in Major League Baseball (MLB) and Minor League Baseball (MiLB). Specifically, (1) player demographic characteristics, (2) return to play (RTP), (3) injury mechanism, (4) throwing- and batting-side dominance, and (5) injury rate per athlete exposure (AE) were characterized to guide future injury prevention strategies. METHODS The MLB Health and Injury Tracking System database was used to compile all pectoralis muscle injuries in MLB and MiLB athletes in the 2011-2017 seasons. Injury-related data including diagnosis (tear or rupture vs. strain), player demographic characteristics, injury timing, need for surgical intervention, RTP, and mechanism of injury were recorded. Subanalyses of throwing- and batting-side dominance, as well as MLB vs. MiLB injury frequency, were performed. RESULTS A total of 138 pectoralis muscle injuries (32 MLB and 106 MiLB injuries) were reported in the study period (5 tears or ruptures and 133 strains), with 5 of these being recurrent injuries. Operative intervention was performed in 4 athletes (2.9%). Of the 138 injuries, 116 (84.1%) resulted in missed days of play, with a mean time to RTP of 19.5 days. Starting pitchers sustained the greatest proportion of pectoralis injuries (48.1%), with pitching being the most common activity at the time of injury (45.9%). A majority of injuries (86.5%) were sustained during non-contact play. Overall, 87.5% of injuries occurred on the player's dominant throwing side and 81.3% occurred on the player's dominant batting side. There was no significant difference in the rate of pectoralis injuries in the MLB regular season (0.584 per 10,000 AEs) vs. the MiLB regular season (0.425 per 10,000 AEs) (P = .1018). CONCLUSION Pectoralis muscle injuries are most frequently non-contact injuries, most commonly sustained by pitchers. An understanding of these injuries can guide athletic trainers and management in expectation management and decision making, in addition to directing future efforts at injury prevention.
16. Geospatial Food Environment Exposure and Obesity among Low Income Baltimore City Children: Associations Differ by Data Source and Processing Method. Journal of Hunger & Environmental Nutrition 2022. [DOI: 10.1080/19320248.2022.2090882]
17. The Most Common Type, Severity, and Expected Frequency of Injuries Vary by Defensive Position in Professional Baseball Players. Am J Sports Med 2022; 50:2534-2541. [PMID: 35763569 DOI: 10.1177/03635465221104490]
Abstract
BACKGROUND Location, frequency, and severity of in-game injuries by defensive position played have never been determined in professional baseball players. HYPOTHESIS Catchers would have a higher frequency of hip and knee injuries; infielders and outfielders would have a higher frequency of general lower extremity injuries; and pitchers would have a higher frequency and severity of shoulder and elbow/forearm injuries. STUDY DESIGN Descriptive epidemiology study. METHODS The Major League Baseball Health and Injury Tracking System database was queried for all injuries in Major League Baseball and Minor League Baseball during the 2011-2019 seasons. Injuries were stratified by the following variables: athlete's level of play at the time of injury, anatomic region injured, whether the injury occurred during a game, and position played at the time of injury (infielder, outfielder, catcher, or pitcher). Number of days missed from competition immediately after an injury was used as a surrogate for injury severity: mild (0 days missed), moderate (1-5 days), and severe (>5 days). Observed versus expected injury ratios were calculated for each anatomic region based on position played, and ratios were adjusted by the number of players per position type during a standard inning of play. RESULTS A total of 112,405 work-related injuries were reported, with the majority of injuries (86,520; 77%) occurring in Minor League Baseball athletes. Injuries to the leg, hand, shoulder, torso, and foot were the most common for athletes in both leagues, while hip/groin injuries were the least common. Catchers sustained the most in-game defensive head/neck injuries, while infielders and outfielders had the highest number of knee injuries. Starting and relief pitchers had the greatest total proportion of in-game defensive injuries across every other body region. Infielders and outfielders sustained injuries less frequently than expected across all body regions, while pitchers experienced more injuries than expected for all body parts. Catchers experienced more injuries than expected to the head/neck, hand, hip/groin, knee, and foot, and were more likely than other position players to sustain a knee injury that was categorized as severe based on time missed. CONCLUSION The location, severity, and frequency of injuries vary by defensive position among professional baseball players.
18. Aggregated spatial intensity as a method for estimating point-level exposures within area-level units: The case of tobacco retailer exposure in census tracts. Spat Spatiotemporal Epidemiol 2022; 41:100482. [PMID: 35691649 PMCID: PMC9193981 DOI: 10.1016/j.sste.2022.100482]
Abstract
BACKGROUND Aggregating point-level events to area-level units can produce misleading interpretations when displayed via choropleth maps. We developed the aggregated intensity method to share point-level location information across unit boundaries prior to aggregation. This method was applied to tobacco retailers among census tracts in New Castle County, DE. METHODS Aggregated intensity uses kernel density estimation to generate spatially continuous expected counts of events per unit area, then aggregates these results to area-level units. We calculated a relative difference measure to compare aggregated intensity to observed counts. RESULTS Aggregated intensity produces estimates of event exposure unconstrained by boundaries. The relative difference between aggregated intensity and counts is greater for units with many events proximal to their borders. The appropriateness of aggregated intensity depends on events' spatial influence and proximity to unit boundaries, as well as computational inputs. CONCLUSIONS Aggregated intensity may facilitate more spatially realistic estimates of exposure to point-level events.
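A minimal sketch of the aggregated-intensity idea: fit a kernel density to event locations, evaluate it on a fine grid, and sum the implied expected counts within each unit, so events near a border contribute to both sides. The Gaussian kernel, grid step, and toy square tracts are assumptions, not the paper's choices:

```python
import numpy as np
import geopandas as gpd
from scipy.stats import gaussian_kde
from shapely.geometry import Point, box

# Synthetic retailer locations in a 2x1 region split into two square tracts.
rng = np.random.default_rng(3)
xy = rng.uniform(0, 2, size=(2, 100))
tracts = gpd.GeoDataFrame({"tract": ["A", "B"]},
                          geometry=[box(0, 0, 1, 1), box(1, 0, 2, 1)])

kde = gaussian_kde(xy)  # spatially continuous intensity surface

# Evaluate on a fine grid and convert density to expected counts per cell.
step = 0.02
gx, gy = np.meshgrid(np.arange(0, 2, step) + step / 2,
                     np.arange(0, 1, step) + step / 2)
density = kde(np.vstack([gx.ravel(), gy.ravel()]))
cells = gpd.GeoDataFrame(
    {"expected": density * step**2 * xy.shape[1]},
    geometry=[Point(x, y) for x, y in zip(gx.ravel(), gy.ravel())])

# Aggregate expected counts to tracts and compare with raw observed counts.
print(gpd.sjoin(cells, tracts).groupby("tract")["expected"].sum())
```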
19. Nested Spatial and Temporal Modeling of Environmental Conditions Associated With Genetic Markers of Vibrio parahaemolyticus in Washington State Pacific Oysters. Front Microbiol 2022; 13:849336. [PMID: 35432254 PMCID: PMC9007611 DOI: 10.3389/fmicb.2022.849336]
Abstract
The Pacific Northwest (PNW) is one of the largest commercial harvesting areas for Pacific oysters (Crassostrea gigas) in the United States. Vibrio parahaemolyticus, a bacterium naturally present in estuarine waters, accumulates in shellfish and is a major cause of seafood-borne illness. Growers, consumers, and public-health officials have raised concerns about rising vibriosis cases in the region. Vibrio parahaemolyticus genetic markers (tlh, tdh, and trh) were estimated using a most-probable-number (MPN)-PCR technique in Washington State Pacific oysters regularly sampled between May and October from 2005 to 2019 (N = 2,836); environmental conditions were also measured at each sampling event. Multilevel mixed-effects regression models were used to assess relationships between environmental measures and genetic markers as well as genetic marker ratios (trh:tlh, tdh:tlh, and tdh:trh), accounting for variation across space and time. Spatial and temporal dependence were also accounted for in the model structure. Model fit improved when including environmental measures from previous weeks (1-week lag for air temperature, 3-week lag for salinity). Positive associations were found between tlh and surface water temperature, specifically between 15 and 26°C, and between trh and surface water temperature up to 26°C. tlh and trh were negatively associated with 3-week-lagged salinity in the most saline waters (>27 ppt). There was also a positive relationship between tissue temperature and tdh, but only above 20°C. The tdh:tlh ratio displayed non-linear relationships that were the inverse of those observed for tlh. The non-linear associations found between the genetic targets and environmental measures demonstrate the complex habitat suitability of V. parahaemolyticus. Additional associations with both spatial and temporal variables also suggest there are influential unmeasured environmental conditions that could further explain bacterium variability. Overall, these findings confirm previous ecological risk factors for vibriosis in Washington State, while also identifying new associations between lagged temporal effects and pathogenic markers of V. parahaemolyticus.
20.
Abstract
STUDY DESIGN A retrospective case series study. OBJECTIVE To analyze the epidemiology of diagnoses of degenerative cervical and lumbar spinal conditions among Major League Baseball (MLB) and Minor League Baseball (MiLB) players. SUMMARY OF BACKGROUND DATA Repetitive high-energy forces in professional baseball players may predispose them to degenerative cervical and lumbar spinal conditions. There is a lack of data concerning the epidemiology of these injuries in professional baseball. METHODS Deidentified data on spine injuries were collected from all MLB and MiLB teams from 2011 to 2016 from the MLB-commissioned Health and Injury Tracking System database. Rates of diagnoses of common degenerative spinal conditions as well as their impact on days missed due to injury, necessitation of surgery, and player participation and career-ending status were assessed. Injury rates were reported as injuries per 1000 athlete-exposures in concordance with prior studies. RESULTS From 2011 to 2016, 4246 days of play were missed due to 172 spine-related injuries; 73.3% were related to the lumbar spine and 26.7% to the cervical spine. There were similar rates of surgery required for these injuries (18.3% of lumbar injuries vs. 13.0% of cervical injuries, P = 0.2164). The mean age of players with cervical injuries was higher compared with the lumbar group (27.5 vs. 25.4, P = 0.0119). The average number of days missed due to lumbar injuries was significantly higher than for cervical injuries (34.1 vs. 21.6 d, P = 0.0468). Spine injury rates for pitchers were significantly higher than those of other position players (0.086 per 1000 athlete-exposures vs. 0.037, P < 0.0001). CONCLUSION Neurologic diagnoses relating to the cervical and lumbar spine lead to substantial disability among MLB and MiLB players as well as days missed from play. Pitchers have over double the rate of injury compared with other position players. Lumbar conditions were associated with significantly higher numbers of days missed from play. LEVEL OF EVIDENCE Level 4.
21.
Abstract
BACKGROUND Injury rates in baseball players of all ages are increasing. Identifying modifiable risk factors is paramount to implementing injury prevention programs. PURPOSE/HYPOTHESIS The purpose was to evaluate the influence of weather (temperature, humidity, atmospheric pressure, and heat index) and game factors (start time, duration, single vs doubleheader) on injury rates in professional baseball players. We hypothesized that colder temperatures would be associated with significantly more injuries per game. STUDY DESIGN Case-control study; Level of evidence, 3. METHODS This was a retrospective database study. Two data sets were combined: one containing all injuries in Major and Minor League Baseball between 2011 and 2017 and one containing all games played in Major and Minor League Baseball during the same period, to determine the number of injuries per game. Temperature, humidity, atmospheric pressure, and heat index were determined for each game using data from the US Environmental Protection Agency. Additional game variables included the level of play, the turf type (natural vs artificial grass), the stadium type (open vs dome vs retractable), the game start time, the game duration, and whether the game was a doubleheader. A multivariate analysis was then conducted to determine which factors were associated with the number of injuries per game. RESULTS In total, our analysis included 33,587 injuries and 76,747 games. A total of 25,776 (33.6%) games contained an injury, and 41% of injuries occurred in games with multiple injuries, with up to 9 injuries in a single game. The multivariate analysis identified significant associations between game duration and injuries per game (P < .001; effect size, 0.013) and the level of play and injuries per game (P < .001; effect size, 0.011). There were significant associations between the venue type (P < .001), the game start time (P < .001), humidity (P < .001), the turf type (P = .016), and barometric pressure (P = .031); however, the effect size for each was <0.001, suggesting that these factors are clinically unimportant. Our overall model produced an R2 of 0.04, indicating that these variables predicted only 4% of the variance in injury risk. CONCLUSION In professional baseball, the weather is not associated with injury risk; however, game duration may contribute to injury risk.
22. Racial disparities in triple negative breast cancer: toward a causal architecture approach. Breast Cancer Res 2022; 24:37. [PMID: 35650633 PMCID: PMC9158353 DOI: 10.1186/s13058-022-01533-z]
Abstract
BACKGROUND Triple negative breast cancer (TNBC) is an aggressive subtype of invasive breast cancer that disproportionately affects Black women and contributes to racial disparities in breast cancer mortality. Prior research has suggested that neighborhood effects may contribute to this disparity beyond individual risk factors. METHODS The sample included a cohort of 3316 breast cancer cases diagnosed between 2012 and 2020 in New Castle County, Delaware, a geographic region of the US with elevated rates of TNBC. Multilevel methods and geospatial mapping evaluated whether the race, income, and race/income versions of the neighborhood Index of Concentration at the Extremes (ICE) metric could efficiently identify census tracts (CT) with higher odds of TNBC relative to other forms of invasive breast cancer. Odds ratios (OR) and 95% confidence intervals (CI) were reported; p-values < 0.05 were significant. Additional analyses examined area-level differences in exposure to metabolic risk factors, including unhealthy alcohol use and obesity. RESULTS The ICE-Race, ICE-Income, and ICE-Race/Income metrics were each associated with greater census tract odds of TNBC on a bivariate basis. However, only ICE-Race was significantly associated with higher odds of TNBC after adjustment for patient-level age and race (most disadvantaged CT: OR = 2.09; 95% CI 1.40-3.13), providing support for neighborhood effects. Higher counts of alcohol and fast-food retailers, and correspondingly higher rates of unhealthy alcohol use and obesity, were observed in CTs that were classified into the most disadvantaged ICE-Race quintile and had the highest odds of TNBC. CONCLUSION The use of ICE can facilitate the monitoring of cancer inequities and advance the study of racial disparities in breast cancer.
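The ICE metric itself is a simple contrast: for area i, ICE_i = (advantaged_i - deprived_i) / total_i, ranging from -1 (all residents in the deprived extreme) to +1 (all in the privileged extreme). A toy sketch with hypothetical tract counts:

```python
import pandas as pd

# Hypothetical tract counts; for ICE-Race/Income the extremes are typically
# high-income White versus low-income Black residents.
df = pd.DataFrame({
    "tract": ["t1", "t2", "t3"],
    "advantaged": [900, 300, 100],
    "deprived": [50, 400, 800],
    "total": [2000, 1500, 1200],
})

df["ice"] = (df["advantaged"] - df["deprived"]) / df["total"]
# With real data, quintiles of this score define the exposure categories.
df["tercile"] = pd.qcut(df["ice"], q=3, labels=False)
print(df)
```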
23. Reducing Exposure to Tobacco Retailers with Residential Zoning Policy: Insights from a Geospatial Analysis of Wilmington, Delaware. Cities & Health 2022; 6:752-764. [PMID: 36570619 PMCID: PMC9783014 DOI: 10.1080/23748834.2021.1935141]
Abstract
Cigarette use remains the leading preventable cause of premature mortality in the US, with declines in smoking rates slowing in recent years. One promising target for improved tobacco control is the expanded regulation of tobacco retailers. Evaluations of such policy attempts have largely produced mixed results to date. The objective of this study was to assess the potential of using a novel, residentially-focused zoning approach to produce a more targeted and equitable reduction in tobacco retailers in high-risk urban settings. We focused on Wilmington, Delaware, a city characterized by high poverty rates, a majority Black population, a disparate number of tobacco retailers, and an elevated smoking prevalence. Through the use of geospatial analyses, we observed disproportionately higher counts of convenience store tobacco retailers in medium- and high-density residential zones in Wilmington relative to the surrounding county. By linking electronic health record (EHR) data from a local health care system and US Census Bureau data, we further found that approximately 80% of Wilmington smokers and 60% of Wilmington youth lived in these residential zones. These findings highlight the potential to more equitably reduce tobacco retailer exposure through a residentially-focused zoning approach. Tobacco control policy and research implications are considered.
24. The Lyme and Tickborne Disease Dashboard: A map-based resource to promote public health awareness and research collaboration. PLoS One 2021; 16:e0260122. [PMID: 34851988 PMCID: PMC8635336 DOI: 10.1371/journal.pone.0260122]
Abstract
With the incidence of Lyme and other tickborne diseases on the rise in the US and globally, there is a critical need for data-driven tools that communicate the magnitude of this problem and help guide public health responses. We present the Johns Hopkins Lyme and Tickborne Disease Dashboard (https://www.hopkinslymetracker.org/), a new tool that harnesses the power of geography to raise awareness and fuel research and scientific collaboration. The dashboard is unique in applying a geographic lens to tickborne diseases, aiming not only to become a global tracker of tickborne diseases but also to contextualize their complicated geography with a comprehensive set of maps and spatial data sets representing a One Health approach. We share our experience designing and implementing the dashboard, describe the main features, and discuss current limitations and future directions.
25. A Population Health Assessment in a Community Cancer Center Catchment Area: Triple negative breast cancer, alcohol use, and obesity in New Castle County, Delaware. Cancer Epidemiol Biomarkers Prev 2021; 31:108-116. [PMID: 34737210 DOI: 10.1158/1055-9965.epi-21-1031]
Abstract
BACKGROUND The National Cancer Institute (NCI) requires designated cancer centers to conduct catchment area assessments to guide cancer control and prevention efforts designed to reduce the local cancer burden. We extended and adapted this approach to a community cancer center catchment area with elevated rates of triple negative breast cancer (TNBC). METHODS Cancer registry data for 462 TNBC and 2,987 Not-TNBC cases diagnosed between 2012 and 2020 at the Helen F. Graham Cancer Center & Research Institute (HFGCCRI), located in New Castle County, Delaware, were geocoded to detect areas of elevated risk ('hot spots') and decreased risk ('cold spots'). Next, electronic health record (EHR) data on obesity and alcohol use disorder (AUD) and catchment-area measures of fast-food and alcohol retailers were used to assess for spatial relationships between TNBC hot spots and potentially modifiable risk factors. RESULTS Two hot and two cold spots were identified for TNBC within the catchment area. The hot spots accounted for 11% of the catchment area but nearly a third of all TNBC cases. Higher rates of unhealthy alcohol use and obesity were observed within the hot spots. CONCLUSIONS The use of spatial methods to analyze cancer registry and other secondary data sources can inform cancer control and prevention efforts within community cancer center catchment areas, where limited resources can preclude the collection of new primary data. IMPACT Targeting community outreach and engagement activities to TNBC hot spots offers the potential to reduce the population-level burden of cancer efficiently and equitably.
26. Black, white, or green? The effects of racial composition and socioeconomic status on neighborhood-level tobacco outlet density. Ethnicity & Health 2021; 26:1012-1027. [PMID: 31124377 PMCID: PMC6875694 DOI: 10.1080/13557858.2019.1620178]
Abstract
Objective: To compare predominantly-Black and predominantly-White Maryland areas with similar socioeconomic status to examine the role of both race and socioeconomic status on tobacco outlet availability and tobacco outlet access. Design: Maryland tobacco outlet addresses were geocoded with 2011-2015 American Community Survey sociodemographic data. Two-sample t-tests were conducted comparing the mean values of sociodemographic variables and tobacco outlet density per Census Tract, and spatial lag based regression models were conducted to analyze the direct association between covariables and tobacco outlet density while accounting for spatial dependence between and within jurisdictions. Results: Predominantly-White jurisdictions had lower tobacco outlet availability and access than predominantly-Black jurisdictions, despite similar socioeconomic status. Spatial lag model results showed that median household income and vacant houses had consistent associations with tobacco outlet density across most of the jurisdictions analyzed, and place-based spatial lag models showed direct associations between predominantly-Black jurisdictions and tobacco outlet availability and access. Conclusion: Predominantly-White areas have lower levels of tobacco outlet density than predominantly-Black areas, despite both areas having similar socioeconomic statuses.
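A minimal sketch of a spatial lag model with pysal's spreg, assuming a hypothetical tract shapefile and placeholder column names (the study's exact specification is not shown):

```python
import geopandas as gpd
from libpysal.weights import Queen
import spreg

tracts = gpd.read_file("md_tracts.shp")   # hypothetical path and columns
w = Queen.from_dataframe(tracts)          # contiguity-based weights
w.transform = "r"

y = tracts[["outlet_density"]].values
X = tracts[["median_income", "pct_vacant"]].values

# Maximum-likelihood spatial lag model: rho captures dependence of a tract's
# outlet density on that of its neighbors.
model = spreg.ML_Lag(y, X, w=w, name_y="outlet_density",
                     name_x=["median_income", "pct_vacant"])
print(model.summary)
```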
Collapse
|
27
|
Operationalizing the Population Health Framework: Clinical Characteristics, Social Context, and the Built Environment. Popul Health Manag 2021; 24:454-462. [PMID: 34406088 DOI: 10.1089/pop.2020.0170] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/13/2022] Open
Abstract
As a framework, population health emphasizes health outcomes for entire populations, the broad range of determinants of these outcomes, and the comparative effectiveness of medical and public health interventions. In practice, however, many contemporary population health programs instead focus on small subsets of patients who account for a disproportionate share of health care utilization, often with disappointing results. The authors proposed a new approach to operationalize population health in clinical settings, with the example of tobacco use. Electronic health record (EHR) data from a mid-Atlantic health system were used to: (1) define and describe a hospital-based population of current smokers, (2) analyze the demographic characteristics of the population to consider how the social context may impact treatment, and (3) join EHR data with public licensing data on tobacco retail locations to assess the relationship between the built environment and smoking status. Out of a total of 20,310 unique adult admissions to the health system, 3749 (18.5%) were current smokers. Compared to never smokers, current smokers were significantly younger, more likely to be male, more likely to be Black/African American, less likely to be Hispanic/Latino/a, and more likely to be on Medicaid or be self-pay. Current vs. former smokers had significantly higher exposure to tobacco retail locations, even after adjusting for demographic and other covariates. By defining populations around leading modifiable medical determinants of health, and accounting for the larger context of sociodemographic factors and the built environment, health systems can invest in comprehensive programs designed to produce the greatest population health returns.
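The retail-exposure step (joining geocoded patient addresses to licensed retailer locations) can be illustrated with a k-d tree radius query; the coordinates, cohort size, and 1,500-m radius below are assumptions for demonstration, not the study's values.

```python
# Minimal sketch of counting licensed tobacco retailers within a fixed
# radius of each geocoded patient address (toy data).
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(10)
patients = rng.uniform(0, 20_000, size=(1_000, 2))   # geocoded addresses (meters)
retailers = rng.uniform(0, 20_000, size=(300, 2))    # licensed retail locations

tree = cKDTree(retailers)
exposure = np.array([len(ix) for ix in tree.query_ball_point(patients, r=1_500)])

print("mean retailers within 1,500 m:", exposure.mean().round(2))
# `exposure` can then be merged back onto the EHR cohort and entered into a
# regression of smoking status on demographics plus retail exposure.
```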
Collapse
|
28
|
The co-occurrence of smoking and alcohol use disorder in a hospital-based population: Applying a multimorbidity framework using geographic information system methods. Addict Behav 2021; 118:106883. [PMID: 33714034 DOI: 10.1016/j.addbeh.2021.106883] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/22/2020] [Revised: 02/08/2021] [Accepted: 02/15/2021] [Indexed: 11/27/2022]
Abstract
Tobacco and alcohol use are leading causes of premature mortality in the US and concurrent use is associated with even greater health risks. In a cross-sectional study, 20,310 patients admitted to a mid-Atlantic acute health care system between July 1, 2018 and June 30, 2019 were categorized according to smoking and alcohol use disorder (AUD) status. Of the total admissions, 1464 (7.2%) were current smokers with an AUD. These patients were younger (52.4 vs. 63.9 years), more likely to be male (64.1% vs. 38.0%) and covered by Medicaid (46.9% vs. 11.6%), and resided in proximity to higher counts of tobacco (10.3 vs. 4.72) and alcohol (2.24 vs. 1.14) retailers than never smokers without an AUD. Clinically, these patients had higher rates of other substance use disorders (60.4% vs. 6.1%), depression (64.6% vs. 34.8%), HIV/AIDS (3.3% vs. 0.6%), and liver disease (40.7% vs. 13.2%) than never smokers without an AUD. Patients who concurrently smoke and have an AUD face unique and serious health risks. A multimorbidity framework can guide clinical and community-based interventions for individuals with concurrent psychiatric and chronic medical conditions, complex social needs, and adverse environmental exposures.
Collapse
|
29
|
Time Out of Play Due to Illness in Major and Minor League Baseball. Clin J Sport Med 2021; 31:e137-e143. [PMID: 31219928 DOI: 10.1097/jsm.0000000000000756] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 12/10/2018] [Accepted: 04/09/2019] [Indexed: 02/02/2023]
Abstract
OBJECTIVE To generate a summative report on the most commonly diagnosed illnesses in Major League Baseball (MLB) and Minor League Baseball (MiLB) athletes with specific attention to their impact based on time out of play. DESIGN Retrospective analysis. SETTING Injury and illness data from the MLB Health and Injury Tracking System. PARTICIPANTS All MLB and MiLB athletes active between 2011 and 2016. ASSESSMENT OF RISK FACTORS Illnesses were defined as atraumatic medical diagnoses that occurred during the MLB or MiLB season and resulted in at least 1 day out of play. MAIN OUTCOME MEASURES Incidence of illness diagnoses and resulting time out of play. RESULTS Eight thousand eight hundred thirty-four illnesses were reported, representing 14.7% of all diagnoses resulting in time out of play. Total days missed (DM) due to illness were 39 614, with a mean of 4.6 days (SD 9.9) and a median of 2 DM per diagnosis. The incidence of illness was 20.3 per 100 athletes per season. The most common diagnosis was nonspecific viral illness (15.3%), followed by gastroenteritis (13.6%), other gastrointestinal illness (8.3%), influenza (7.0%), and upper respiratory infection (6.2%). Appendicitis (15.2%) and Epstein-Barr virus/cytomegalovirus (9.1%) were the most common season-ending diagnoses. CONCLUSIONS Illnesses represent a significant cause of time out of play in MLB and MiLB. Prevention efforts should focus on limiting the spread of communicable viral, respiratory, and gastrointestinal disease among players, as the majority of diagnoses fell into these categories. This work may be used to guide future research into illness treatment and prevention in professional baseball.
Collapse
|
30
|
Examining Batting Performance After a Sports-Related Concussion Among Major League Baseball Position Players. Am J Sports Med 2021; 49:790-797. [PMID: 33513029 DOI: 10.1177/0363546520987232] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Indexed: 01/31/2023]
Abstract
BACKGROUND An ongoing challenge in sports-related concussion (SRC) is determining full recovery. This study examines performance metrics in baseball after an SRC and provides a template for assessment of return-to-performance parameters. PURPOSE To determine whether batting performance returns to baseline after an SRC. STUDY DESIGN Descriptive epidemiological study. METHODS Participants were all Major League Baseball (MLB) position players with confirmed SRCs that occurred during the 2011-2015 seasons. A retrospective review and assessment of performance metrics before and after injury were conducted as defined relative to the number of plate appearances (PAs) to yield reliable performance statistics. Seven batting metrics were considered as outcomes in longitudinal regressions: batting average, on-base percentage, slugging percentage, on-base plus slugging, bases on balls, strikeouts, and home runs. Metrics were calculated for each player 60, 30, and 14 days before their SRCs, as well as for the 14, 30, and 60 days after returning to play. Other variables controlled for included defensive position, player age at the time of SRC, number of days missed, mechanism of injury, whether the player completed a rehabilitation stint, and year in which the mild traumatic brain injury (MTBI) occurred (2011-2015). RESULTS A total of 77 MTBI case events occurred in MLB position players over 5 seasons. These injuries resulted in a mean 11.4 days lost to injury. For all performance metrics using 60 or 30 days before MTBI as baseline, no statistically significant differences were found in batting performance. In total, 63 events met PA criteria before injury. Varying the PA cutoff thresholds to be more inclusive or more restrictive yielded similar regression results. For the 48 events that met PA criteria before and after injury, most performance metrics showed no significant performance change after MTBI and, in some events, a slight though mostly nonsignificant performance improvement after MTBI. CONCLUSIONS MLB position players who are medically cleared to return to play after an SRC perform at the same offensive performance levels as their preinjury statistics when an adequate number of PAs is used to compare performance before and after injury.
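The windowed before/after comparison can be sketched in pandas; the column names, dates, and toy plate-appearance log below are illustrative, not the study's data.

```python
# Sketch of computing a batting metric over fixed windows before injury
# and after return to play (toy single-player data).
import pandas as pd

pa = pd.DataFrame({  # one row per plate appearance
    "date": pd.to_datetime(["2014-05-01", "2014-05-20", "2014-06-10",
                            "2014-07-05", "2014-07-20", "2014-08-15"]),
    "hit": [1, 0, 1, 0, 1, 1],
    "at_bat": [1, 1, 1, 1, 1, 1],
})
injury, rtp = pd.Timestamp("2014-06-15"), pd.Timestamp("2014-07-01")

def window_avg(df, start, end):
    """Batting average over plate appearances in [start, end)."""
    w = df[(df.date >= start) & (df.date < end)]
    return w.hit.sum() / w.at_bat.sum() if w.at_bat.sum() else float("nan")

pre_30 = window_avg(pa, injury - pd.Timedelta(days=30), injury)
post_30 = window_avg(pa, rtp, rtp + pd.Timedelta(days=30))
print(f"AVG 30d pre: {pre_30:.3f}   AVG 30d post: {post_30:.3f}")
```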
Collapse
|
31
|
Characterizing the spatial relationship between smoking status and tobacco retail exposure: Implications for policy development and evaluation. Health Place 2021; 68:102530. [PMID: 33609995 PMCID: PMC7986985 DOI: 10.1016/j.healthplace.2021.102530] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 08/12/2020] [Revised: 02/03/2021] [Accepted: 02/04/2021] [Indexed: 11/20/2022]
Abstract
Tobacco retail density and smoking prevalence remain elevated in marginalized communities, underscoring the need for strategies to address these place-based disparities. The spatial variation of smokers and tobacco retailers is often measured by aggregating them to area-level units (e.g., census tracts), but spatial statistical methods that use point-level data, such as spatial intensity and K-functions, can better describe their geographic patterns. We applied these methods to a case study in New Castle County, DE to characterize the cross-sectional spatial relationship between tobacco retailers and smokers, finding that current smokers experience greater tobacco retail exposure and clustering relative to former smokers. We discuss how analysis at different geographic scales can provide complementary insights for tobacco control policy.
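Ripley's K, one of the point-pattern statistics mentioned above, can be illustrated with a naive (edge-uncorrected) implementation; the simulated "current" and "former" smoker locations below are stand-ins for the study's point data.

```python
# Naive numpy implementation of Ripley's K for comparing clustering in two
# point sets (toy window and points; no edge correction).
import numpy as np

def ripley_k(points, radii, area):
    """Uncorrected Ripley's K: mean neighbor count within r, scaled by intensity."""
    n = len(points)
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)                      # exclude self-pairs
    lam = n / area                                   # point intensity
    return np.array([(d < r).sum() / (n * lam) for r in radii])

rng = np.random.default_rng(2)
clustered = rng.normal(5, 0.5, size=(150, 2))        # "current smokers" (clustered)
dispersed = rng.uniform(0, 10, size=(150, 2))        # "former smokers" (CSR-like)

radii = np.linspace(0.1, 2.0, 10)
area = 100.0                                         # 10 x 10 study window
for r, k_cur, k_for in zip(radii, ripley_k(clustered, radii, area),
                           ripley_k(dispersed, radii, area)):
    print(f"r={r:.1f}  K_current={k_cur:8.2f}  K_former={k_for:7.2f}  CSR={np.pi*r**2:6.2f}")
```

Under complete spatial randomness K(r) is approximately pi r squared, so the clustered set's K rising well above that benchmark is the signature of the excess clustering described in the abstract.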
Collapse
|
32
|
A syndromic surveillance tool to detect anomalous clusters of COVID-19 symptoms in the United States. Sci Rep 2021; 11:4660. [PMID: 33633250 PMCID: PMC7907397 DOI: 10.1038/s41598-021-84145-5] [Citation(s) in RCA: 19] [Impact Index Per Article: 6.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/13/2020] [Accepted: 02/12/2021] [Indexed: 11/13/2022] Open
Abstract
Coronavirus SARS-COV-2 infections continue to spread across the world, yet effective large-scale disease detection and prediction remain limited. COVID Control: A Johns Hopkins University Study is a novel syndromic surveillance approach that collects body temperature and COVID-like illness (CLI) symptoms across the US using a smartphone app and applies spatio-temporal clustering techniques and cross-correlation analysis to create publicly available maps of abnormal symptomatology incidence. The cross-correlation analysis identifies optimal temporal lags between symptoms and a range of COVID-19 outcomes, with new loss of taste/smell showing the highest correlations. We also identified temporal clusters of change in taste/smell entries and confirmed COVID-19 incidence in Baltimore City and County. Further, we utilized an extended simulated dataset to showcase our analytics in Maryland. The resulting clusters can serve as indicators of emerging COVID-19 outbreaks and support syndromic surveillance as an early warning system for disease prevention and control.
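The lagged cross-correlation idea can be sketched as follows; the simulated series and the 0-14-day lag grid are assumptions for illustration, not the study's data.

```python
# Sketch: correlate a daily symptom series with case counts at a range of
# temporal lags and report the lag that maximizes correlation (toy series).
import numpy as np

rng = np.random.default_rng(3)
days = 120
cases = np.convolve(rng.poisson(20, days + 14), np.ones(7) / 7, mode="same")
symptoms = np.roll(cases, -5) + rng.normal(0, 0.5, days + 14)  # symptoms lead by ~5 days
symptoms, cases = symptoms[:days], cases[:days]

def lagged_corr(lead, target, lag):
    """Correlation of the leading series against the target shifted by `lag` days."""
    if lag == 0:
        return np.corrcoef(lead, target)[0, 1]
    return np.corrcoef(lead[:-lag], target[lag:])[0, 1]

lags = range(0, 15)
corrs = [lagged_corr(symptoms, cases, lag) for lag in lags]
best = max(lags, key=lambda l: corrs[l])
print(f"optimal lag = {best} days (r = {corrs[best]:.2f})")
```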
Collapse
|
33
|
A case-control analysis of traceback investigations for Vibrio parahaemolyticus infections (vibriosis) and pre-harvest environmental conditions in Washington State, 2013-2018. THE SCIENCE OF THE TOTAL ENVIRONMENT 2021; 752:141650. [PMID: 32898797 PMCID: PMC7674187 DOI: 10.1016/j.scitotenv.2020.141650] [Citation(s) in RCA: 9] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 06/08/2020] [Revised: 07/25/2020] [Accepted: 08/10/2020] [Indexed: 05/28/2023]
Abstract
BACKGROUND Vibrio parahaemolyticus is a major cause of seafood-borne illness. It is naturally prevalent in brackish waters and accumulates in shellfish. Vibriosis cases are rising globally, likely due to rising temperatures. OBJECTIVES To identify associations between vibriosis in Washington State and pre-harvest environmental and V. parahaemolyticus genetic measurements sampled from shellfish. METHODS Successful vibriosis traceback investigations were spatiotemporally matched to routine intertidal oyster (Crassostrea gigas) sampling events, which included measurements of temperature, salinity, and V. parahaemolyticus genetic targets (thermolabile hemolysin: tlh; thermostable direct hemolysin: tdh; thermostable direct-related hemolysin: trh). Unmatched sampling events were treated as controls. Associations were evaluated using logistic regression models. RESULTS Systematic differences were observed across Washington harvesting zones. These included positive associations between the odds of vibriosis and all three genetic targets in South Puget Sound, with a large odds ratio (OR) of 13.0 (95% CI: 1.5, 115.0) for a 1-log10 increase in tdh when total bacterium abundance was low (tlh < 1 log10 MPN/g). A positive association also occurred for a 1 °C increase in tissue temperature (OR = 1.20; 95% CI: 1.10, 1.30), while a negative association occurred for a similar increase in water temperature (OR = 0.70; 95% CI: 0.59, 0.81). In contrast, the coastal bays displayed positive associations for water temperature (OR = 2.16; 95% CI: 1.15, 4.05) and for a 1-log10 increase in the tdh:trh ratio (OR = 5.85; 95% CI: 1.06, 32.26). DISCUSSION The zonal variation in associations indicates unique pathogenic strain prominence, suggesting tdh+/trh+ strains in South Puget Sound, such as the O4:K12 serotype, and tdh+/trh- strains in the coastal bays. The temperature discrepancy between water and oyster tissue suggests that South Puget Sound pathogenic strains flourish with exposure to relatively warm air during low tide. These findings identify new ecological risk factors for vibriosis in Washington State that can be used in future prevention efforts.
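The matched case-control structure reduces to a logistic regression of traceback status on environmental and genetic measurements; the statsmodels sketch below uses simulated data and effect sizes, not the study's.

```python
# Hedged sketch of the case-control logistic model: odds of a sampling event
# matching a vibriosis traceback vs. environmental/genetic measures (toy data).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 400
tdh = rng.uniform(0, 3, n)                 # log10 MPN/g tdh (hypothetical)
tissue_temp = rng.normal(20, 4, n)         # oyster tissue temperature, deg C
logit = -4 + 1.0 * tdh + 0.18 * tissue_temp
case = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(np.column_stack([tdh, tissue_temp]))
fit = sm.Logit(case, X).fit(disp=0)

or_ci = np.exp(np.column_stack([fit.params, fit.conf_int()]))
for name, (o, lo, hi) in zip(["intercept", "tdh (per 1-log10)", "tissue temp (per 1 C)"],
                             or_ci):
    print(f"{name}: OR = {o:.2f} (95% CI {lo:.2f}, {hi:.2f})")
```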
Collapse
|
34
|
Preventing Concussions From Foul Tips and Backswings in Professional Baseball: Catchers' Perceptions of and Experiences With Conventional and Hockey-Style Masks. Clin J Sport Med 2021; 31:e1-e7. [PMID: 30358617 DOI: 10.1097/jsm.0000000000000679] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Indexed: 02/02/2023]
Abstract
OBJECTIVE To understand catchers' preferences for mask type and perceptions regarding safety, comfort, and fit, and determine whether mask type is correlated with self-reported concussion and related symptoms after impacts from foul tips or backswings. DESIGN Cross-sectional study. SETTING Survey of active baseball catchers. PARTICIPANTS Professional baseball catchers. INTERVENTION From May 1, 2015, to June 30, 2015, an online survey was administered in English and Spanish to all Major and Minor League catchers (n = 836). MAIN OUTCOME MEASURES Survey items addressed the type of mask routinely and previously used (conventional or hockey style); brand and material (steel or titanium); perceptions regarding safety, comfort, and fit; and experiences with concussions. RESULTS The sample consisted of 596 catchers, of whom 26% reported being diagnosed with a concussion. Some concussions occurred from non-baseball activities, such as car accidents or off-the-field incidents. Of those that occurred playing baseball, 35% resulted from a foul tip. Once catchers entered professional baseball, the use of a conventional mask rose significantly: at the time the survey was conducted, 71% of catchers reported wearing conventional-style masks and 30% hockey-style masks (P < 0.05). Wearers of both conventional and hockey-style masks were significantly more likely to select hockey-style masks as providing better overall safety and protection than conventional masks (P < 0.05). CONCLUSIONS This research supports foul tips as an important cause of concussion in catchers and provides important information about catchers' preferences for masks that are not perceived as the safest and strongest. Future research should supplement these data by conducting laboratory testing to determine which masks are stronger and by collecting qualitative data to explore why some players are more likely to wear a mask type that they perceive as offering less safety or protection.
Collapse
|
35
|
A systematic review of post-harvest interventions for Vibrio parahaemolyticus in raw oysters. THE SCIENCE OF THE TOTAL ENVIRONMENT 2020; 745:140795. [PMID: 32731065 DOI: 10.1016/j.scitotenv.2020.140795] [Citation(s) in RCA: 12] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 03/31/2020] [Revised: 06/29/2020] [Accepted: 07/05/2020] [Indexed: 06/11/2023]
Abstract
BACKGROUND Non-cholera Vibrio bacteria are a major cause of foodborne illness in the United States. Raw oysters are commonly implicated in gastroenteritis caused by pathogenic Vibrio parahaemolyticus. In response to outbreaks in 1997-1998, the US Food and Drug Administration developed a nationwide quantitative microbial risk assessment (QMRA) of V. parahaemolyticus in raw oysters in 2005. The QMRA identified information gaps that new research may address. Incidence of sporadic V. parahaemolyticus illness has recently increased and, as oyster consumption increases and sea temperatures rise, V. parahaemolyticus outbreaks may become more frequent, posing health concerns. Updated and region-specific QMRAs will improve the accuracy and precision of risk of infection estimates. OBJECTIVES We identify research to support an updated QMRA of V. parahaemolyticus from oysters harvested in Chesapeake Bay and Puget Sound, focusing on observational and experimental research on post-harvest practices (PHPs) published from 2004 to 2019. METHODS A predefined search strategy was applied to PubMed, Embase, Scopus, Science.gov, NAL Agricola, and Google Scholar. Study eligibility criteria were defined using a population, intervention, comparator, and outcome statement. Reviewers independently coded abstracts for inclusion/exclusion using predefined criteria. Data were extracted and study quality and relevance evaluated based on published guidance for food safety risk assessments. Findings were synthesized using a weight of evidence approach. RESULTS Of 12,174 articles retrieved, 93 were included for full-text review. Twenty-seven studies were found to be high quality and high relevance, including studies on cold storage, high hydrostatic pressure, depuration, disinfection, and other PHPs. High hydrostatic pressure consistently emerged as the most effective PHP in reducing the abundance of V. parahaemolyticus. DISCUSSION Limitations of the knowledge base and review approach involve the type and quantity of data reported. Future research should focus on PHPs for which few or no high-quality, high-relevance studies exist, such as irradiation and relaying.
Collapse
|
36
|
Neighborhood and Network Characteristics and the HIV Care Continuum among Gay, Bisexual, and Other Men Who Have Sex with Men. J Urban Health 2020; 97:592-608. [PMID: 29845586 PMCID: PMC7560681 DOI: 10.1007/s11524-018-0266-2] [Citation(s) in RCA: 9] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Indexed: 10/16/2022]
Abstract
In order for treatment as prevention to work as a national strategy to contain the HIV/AIDS epidemic in the United States (US), the HIV care continuum must become more robust, retaining more individuals at each step. The majority of people living with HIV/AIDS (PLWHA) in the US are gay, bisexual, and other men who have sex with men (MSM). Within this population, there are distinct race- and ethnicity-based disparities in rates of HIV infection, engagement, and retention in HIV care, and viral suppression. Compared with White MSM, HIV-infected Black MSM are less likely to be on anti-retroviral therapy (ART), adhere to ART, and achieve viral suppression. Among MSM living in urban areas, falling off the continuum may be influenced by factors beyond the individual level, with new research identifying key roles for network- and neighborhood-level characteristics. To inform multi-level and multi-component interventions, particularly to support Black MSM living in urban areas, a clearer understanding of the pathways of influence among factors at various levels of the social ecology is required. Here, we review and apply the empirical literature and relevant theoretical perspectives to develop a series of potential pathways of influence that may be further evaluated. Results of research based on these pathways may provide insights into the design of interventions, urban planning efforts, and assessments of program implementation, resulting in increased retention in care, ART adherence, and viral suppression among urban-dwelling, HIV-infected MSM.
Collapse
|
37
|
Abstract
BACKGROUND Gastrocnemius injuries are a common lower extremity injury in elite baseball players. There are no current epidemiological studies focused on gastrocnemius injuries in professional baseball players that provide information on the timing, distribution, and characteristics of such injuries. HYPOTHESIS Gastrocnemius injury in professional baseball players is a common injury that is influenced by factors such as age, player position, and time of season. STUDY DESIGN Descriptive epidemiological study. METHODS Based on Major League Baseball's (MLB's) Health and Injury Tracking System (HITS) database, gastrocnemius injuries that caused time out of play for MLB and Minor League Baseball (MiLB) players during the 2011-2016 seasons were identified. Player characteristics, including age, level of play, and position at time of injury, were collected. Injury-specific factors analyzed included date of injury, time of season, days missed, and activity leading to injury. RESULTS A total of 402 gastrocnemius injuries (n = 145, MLB; n = 257, MiLB) occurred during the 2011-2016 seasons. MLB players were significantly older at the time of injury (30.1 years, MLB; 23.9 years, MiLB; P < .001). Base running (36.1%) was the most common activity causing the injury, followed by fielding (23.6%), with 50.3% of base-running injuries sustained on the way to first base. In MLB players, gastrocnemius injuries were most common in infielders (48.3%), followed by pitchers (27.6%) and then outfielders (17.9%), while for MiLB players the injuries were more evenly distributed (33.5%, 28.8%, and 30.7%, respectively). The frequency of injuries in MLB players dropped off after the start of the regular season, whereas MiLB players had a consistent injury rate throughout the year. CONCLUSION Gastrocnemius injuries are a common cause of lower extremity injury in professional baseball players, resulting in significant time out of play. Base running, particularly to first base, was the most common activity during injury. Outfielders had the fewest injuries; however, they required the longest time to recover. This study provides the first investigation to date with the HITS database to examine the characteristics and distribution of gastrocnemius injuries in professional baseball players, offering insight into risk factors, injury prevention, and recovery expectations.
Collapse
|
38
|
Low-Income Black and Hispanic Children's Neighborhood Food Environments and Weight Trajectories in Early Childhood. Acad Pediatr 2020; 20:784-792. [PMID: 31783182 PMCID: PMC7324231 DOI: 10.1016/j.acap.2019.11.013] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 04/01/2019] [Revised: 11/22/2019] [Accepted: 11/23/2019] [Indexed: 10/25/2022]
Abstract
BACKGROUND High obesity rates among young black and Hispanic children place them at a higher risk for adult obesity and its comorbidities. Neighborhoods with predominantly racial and ethnic minority residents have fewer healthful food options, which may contribute to obesity disparities. Few studies have assessed the relationship between neighborhood food environments and obesity in this population. METHODS Electronic health records from 2 pediatric primary care clinics serving predominantly low-income, black, and Hispanic children were used to create a cohort of 3724 2- to 5-year-olds, encompassing 7256 visits from 2007 to 2012 (mean 1.9 visits per patient, range: 1-5 visits per child). Longitudinal regression was used to model the association between mean body mass index z-score (BMI-z) over time and 3 measures of the neighborhood food environment: healthful food availability, availability of stores accepting the Special Supplemental Nutrition Program for Women, Infants and Children (WIC) benefits, and fast food availability. RESULTS Compared to peers in neighborhoods with no or few stores accepting WIC, children in neighborhoods with many WIC stores had higher BMI-z at age 2 years (average difference of 0.272; 95% confidence interval: 0.041-0.503; P = .021). No relationship was found for healthful food or fast food availability. Although children in neighborhoods with low fast food availability did not have statistically significantly different BMI-z at age 2 as compared to children in areas with high fast food availability, they did have a statistically significantly higher change in average BMI-z over time (0.006 per month, 0.000-0.012, P = .024). CONCLUSIONS Access to WIC stores was associated with lower obesity rates and more healthful average BMI-z over time and represents a potentially important neighborhood food environment characteristic influencing racial/ethnic disparities in childhood obesity among young black and Hispanic children. More studies are needed to assess what aspects of WIC stores may underlie the observed association.
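A longitudinal model of this shape can be sketched with a child-level random intercept in statsmodels; the variable names and simulated data below are illustrative assumptions, not the study's specification.

```python
# Sketch of a longitudinal BMI-z model with child-level random effects and a
# neighborhood food-environment covariate (simulated toy cohort).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n_kids, max_visits = 300, 4
rows = []
for kid in range(n_kids):
    wic = rng.integers(0, 2)                         # many vs. few WIC stores nearby
    base = rng.normal(0.3 + 0.25 * wic, 0.6)         # child-specific intercept
    for visit in range(rng.integers(1, max_visits + 1)):
        age = rng.uniform(24, 60)                    # age in months
        rows.append(dict(kid=kid, wic=wic, age_mo=age,
                         bmiz=base + 0.002 * age + rng.normal(0, 0.3)))
df = pd.DataFrame(rows)

fit = smf.mixedlm("bmiz ~ age_mo + wic", df, groups="kid").fit()
print(fit.summary())
```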
Collapse
|
39
|
Improving the efficiency of reactive case detection for malaria elimination in southern Zambia: a cross-sectional study. Malar J 2020; 19:175. [PMID: 32381005 PMCID: PMC7206707 DOI: 10.1186/s12936-020-03245-1] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/20/2020] [Accepted: 04/23/2020] [Indexed: 01/20/2023] Open
Abstract
Background Reactive case detection (RCD) seeks to enhance malaria surveillance and control by identifying and treating parasitaemic individuals residing near index cases. In Zambia, this strategy starts with passive detection of symptomatic incident malaria cases at local health facilities or by community health workers, with subsequent home visits to screen-and-treat residents in the index case and neighbouring (secondary) households within a 140-m radius using rapid diagnostic tests (RDTs). However, a small circular radius may not be the most efficient strategy to identify parasitaemic individuals in low-endemic areas with hotspots of malaria transmission. To evaluate whether RCD efficiency could be improved by increasing the probability of identifying parasitaemic residents, environmental risk factors and a larger screening radius (250 m) were assessed in a region of low malaria endemicity. Methods Between January 12, 2015 and July 26, 2017, 4170 individuals residing in 158 index and 531 secondary households were enrolled and completed a baseline questionnaire in the catchment area of Macha Hospital in Choma District, Southern Province, Zambia. Plasmodium falciparum prevalence was measured using PfHRP2 RDTs and quantitative PCR (qPCR). A Quickbird™ high-resolution satellite image of the catchment area was used to create environmental risk factors in ArcGIS, and generalized estimating equations were used to evaluate associations between risk factors and secondary households with parasitaemic individuals. Results The parasite prevalence in secondary (non-index case) households was 0.7% by RDT and 1.8% by qPCR. Overall, 8.5% (n = 45) of secondary households had at least one resident with parasitaemia by qPCR or RDT. The risk of a secondary household having a parasitaemic resident was significantly increased in proximity to higher order streams and marginally with increasing distance from index households. The adjusted ORs for proximity to third- and fifth-order streams were 2.97 (95% CI 1.04–8.42) and 2.30 (95% CI 1.04–5.09), respectively, and the adjusted OR for each 50-m increase in distance to the index household was 1.24 (95% CI 0.98–1.58). Conclusion Using proximity to streams as a screening tool identified 16% (n = 3) more malaria-positive secondary households than a 140-m circular screening radius. This analysis highlights the potential use of environmental risk factors as a screening strategy to increase RCD efficiency.
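The household-level risk model can be sketched with generalized estimating equations, clustering secondary households on their index case; all values below are simulated stand-ins for the study's data.

```python
# Hedged GEE sketch: binomial outcome (any parasitaemic resident) regressed on
# environmental covariates, with exchangeable correlation within index-case
# clusters (toy data).
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(6)
n = 531
df = pd.DataFrame({
    "index_id": rng.integers(0, 158, n),             # clustering by index household
    "near_stream": rng.binomial(1, 0.3, n),          # proximity to higher-order stream
    "dist_50m": rng.uniform(0, 5, n),                # distance to index, 50-m units
})
logit = -3 + 1.0 * df.near_stream + 0.2 * df.dist_50m
df["positive"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

fit = smf.gee("positive ~ near_stream + dist_50m", groups="index_id", data=df,
              family=sm.families.Binomial(),
              cov_struct=sm.cov_struct.Exchangeable()).fit()
print(np.exp(fit.params))                            # odds ratios
```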
Collapse
|
40
|
A Spatiotemporal Analysis of Organ-Specific Lupus Flares in Relation to Atmospheric Variables and Fine Particulate Matter Pollution. Arthritis Rheumatol 2020; 72:1134-1142. [PMID: 32017464 DOI: 10.1002/art.41217] [Citation(s) in RCA: 10] [Impact Index Per Article: 2.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/29/2019] [Accepted: 01/24/2020] [Indexed: 02/04/2023]
Abstract
OBJECTIVE To identify potential clusters of systemic lupus erythematosus (SLE) organ-specific flares and their relationship to fine particulate matter pollution (PM2.5), temperature, ozone concentration, resultant wind, relative humidity, and barometric pressure in the Hopkins Lupus Cohort, using spatiotemporal cluster analysis. METHODS A total of 1,628 patients who fulfilled the Systemic Lupus International Collaborating Clinics classification criteria for SLE and who had a home address recorded were included in the analysis. Disease activity was assessed using the Lupus Activity Index. Rash, joint involvement, serositis, and neurologic, pulmonary, renal, and hematologic activity were each quantified on a 0-3 visual analog scale (VAS). An organ-specific flare was defined as an increase in VAS of ≥1 point compared to the previous visit. Spatiotemporal clusters were detected using SaTScan software. Regression models were used for cluster adjustment and included individual, county-level, and environmental variables. RESULTS Significant clusters unadjusted for environmental variables were identified for joint flares (P < 0.05; n = 3), rash flares (P < 0.05; n = 4), hematologic flares (P < 0.05; n = 3), neurologic flares (P < 0.05; n = 2), renal flares (P < 0.001; n = 4), serositis (P < 0.001; n = 2), and pulmonary flares (P < 0.001; n = 2). The majority of the clusters identified changed in significance, temporal extent, or spatial extent after adjustment for environmental variables. CONCLUSION We describe the first spatiotemporal clusters of lupus organ-specific flares. Seasonal, as well as multi-year, cluster patterns were identified, differing in extent and location for the various organ-specific flare types. Further studies focusing on each individual organ-specific flare are needed to better understand the driving forces behind these observed changes.
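The flare definition itself is a simple within-patient difference; a minimal pandas sketch of that step (illustrative data, not the cohort's):

```python
# A flare is an increase of >= 1 on the 0-3 VAS relative to the previous
# visit, computed within each patient.
import pandas as pd

visits = pd.DataFrame({
    "patient": [1, 1, 1, 2, 2, 2],
    "visit_date": pd.to_datetime(
        ["2019-01-05", "2019-03-02", "2019-05-11",
         "2019-02-01", "2019-04-15", "2019-06-20"]),
    "renal_vas": [0, 2, 2, 1, 1, 3],
})
visits = visits.sort_values(["patient", "visit_date"])
visits["renal_flare"] = visits.groupby("patient")["renal_vas"].diff() >= 1
print(visits)   # first visit per patient is NaN-diff, hence no flare
```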
Collapse
|
41
|
Abstract
Background: Numerous studies have investigated injuries and treatments in the baseball athlete. The majority of these studies have focused on the throwing shoulder and elbow. However, more recent literature is reporting injuries to other regions in this cohort, including the knee, head, hip, and hamstring. Purpose/Hypothesis: The purpose of the current study was to determine the number and type of injuries in Major League Baseball (MLB) and Minor League Baseball (MiLB) players that do not occur during the actual game but are related to baseball participation. Our hypothesis was that there would be a substantial number of injuries that occurred in professional baseball players during non-game situations. Study Design: Descriptive epidemiological study. Methods: Deidentified, anonymous data were collected from the 2011 through 2016 seasons from the MLB Health and Injury Tracking System (HITS) medical record database. All injuries that were identified as a primary diagnosis and resulted in at least 1 day out of play from both MLB and MiLB were examined. Injuries were categorized as occurring during the game (“game” injuries) or not during the game. A “non-game” injury was defined as occurring at any time other than during the scheduled game from the first to last pitch. Results: There were 51,548 total injuries in MLB and MiLB players from 2011 to 2016, almost 40% of which were attributed to non–game-related injuries (n = 19,201; 37.2%). The remainder occurred during a game (n = 32,347; 62.8%). A significantly greater percentage of non-game injuries were season ending (10.8%) compared with the percentage of game-related season-ending injuries (8.4%) (P < .0001). Pitchers had significantly more non–game-related injuries than game-related injuries (P < .0001). Conclusion: A large number of injuries occur in professional baseball outside of actual games. MiLB players, specifically pitchers, are particularly at risk for these types of injuries. It is feasible that the overall injury rate in professional baseball players could be reduced by analyzing these injuries in more detail to develop prevention strategies.
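The season-ending comparison can be illustrated with a standard two-proportion z-test; note that the counts below are back-calculated from the reported percentages (10.8% of 19,201 non-game vs. 8.4% of 32,347 game injuries), which is an approximation.

```python
# Two-proportion z-test on season-ending injury rates, using counts
# reconstructed from the abstract's percentages (approximate).
import numpy as np
from statsmodels.stats.proportion import proportions_ztest

season_ending = np.array([round(0.108 * 19_201), round(0.084 * 32_347)])
totals = np.array([19_201, 32_347])
stat, p = proportions_ztest(season_ending, totals)
print(f"z = {stat:.2f}, p = {p:.2e}")  # consistent with the reported P < .0001
```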
Collapse
|
42
|
Associations of Environmental Conditions and Vibrio parahaemolyticus Genetic Markers in Washington State Pacific Oysters. Front Microbiol 2019; 10:2797. [PMID: 31866972 PMCID: PMC6904363 DOI: 10.3389/fmicb.2019.02797] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/21/2019] [Accepted: 11/18/2019] [Indexed: 01/05/2023] Open
Abstract
Vibrio parahaemolyticus is a naturally occurring bacterium in estuarine waters and is a major cause of seafood-borne illness. The bacterium has been consistently identified in Pacific Northwest waters, and elevated rates of vibriosis in Washington State have raised concerns among growers, risk managers, and consumers of Pacific oysters (Crassostrea gigas). To better understand pre-harvest variation of V. parahaemolyticus in the region, the abundance of total and potentially pathogenic strains of the bacterium in a large number of Washington State Pacific oyster samples was compared with environmental conditions at the time of sampling. The Washington Department of Health regularly sampled oysters between June and September at over 21 locations from 2014 to 2018, resulting in over 946 samples. V. parahaemolyticus strains carrying three genetic markers, tlh, trh, and tdh, were enumerated in oyster tissue using a most probable number-PCR analysis. Tobit regressions and seemingly unrelated estimations were used to formally assess relationships between environmental measures and genetic markers. All genetic markers were positively associated with temperature, independent of the abundance of other genetic markers. Surface water temperature displayed a non-linear relationship, with no association observed for any genetic marker in the warmest waters. There were also stark differences between surface and shore water temperature models. Salinity was not found to be substantially associated with any of the genetic variables. The relative abundance of tdh+ strains given total V. parahaemolyticus abundance (pathogenic ratio tdh:tlh) was negatively associated with water temperature in colder waters and decreased exponentially as total V. parahaemolyticus abundance increased. Strains carrying the trh gene had a pronounced positive association with strains carrying the tdh gene but were also negatively associated with the tdh:tlh pathogenic ratio. These results suggest ecological relationships of competition, growth, and survival among V. parahaemolyticus strains in the oyster tissue matrix. This work also improves the overall understanding of environmental associations with V. parahaemolyticus in Washington State Pacific oysters, laying the groundwork for future risk mitigation efforts in the region.
Collapse
|
43
|
Associations among neighborhood greenspace, neighborhood violence, and children's asthma control in an urban city. Ann Allergy Asthma Immunol 2019; 123:608-610. [PMID: 31610235 PMCID: PMC6915955 DOI: 10.1016/j.anai.2019.10.003] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/12/2019] [Revised: 10/01/2019] [Accepted: 10/03/2019] [Indexed: 11/19/2022]
|
44
|
Adjusted, non-Euclidean cluster detection of Vibrio parahaemolyticus in the Chesapeake Bay, USA. GEOSPATIAL HEALTH 2019; 14. [PMID: 31724370 DOI: 10.4081/gh.2019.783] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 04/23/2019] [Accepted: 09/12/2019] [Indexed: 06/10/2023]
Abstract
Vibrio parahaemolyticus (V. parahaemolyticus) is a naturally occurring bacterium found in estuaries, such as the Chesapeake Bay (USA), that can cause vibriosis, a food- and waterborne illness, in humans. Tracking the spatial and temporal distribution of V. parahaemolyticus in the Chesapeake Bay, which varies in part due to water temperature, salinity, and other environmental variables, can help identify areas and time periods of high risk. These observations can support interventions used to reduce the burden of vibriosis. Spatial and spatiotemporal clusters of high V. parahaemolyticus abundance were identified among surface water samples in the Chesapeake Bay between 2007 and 2010. While Euclidean distances between geographic points are often used for cluster detection in spatial analyses, non-Euclidean distances should be considered given the complex nature of the Chesapeake Bay shoreline. Comparison of the two methods consistently showed that non-Euclidean cluster detection provided unique and more reasonable clusters than the Euclidean approach. Residuals from univariate and multivariate models were used to identify how clusters changed after controlling for environmental variables. Most clusters tended to decrease in space, time, or significance after adjustment, suggesting these covariates contributed to the original formation of the clusters and as such are useful observation tools for vibriosis risk managers. Clusters that remained after adjustment suggest areas for further study and intervention. These findings reinforce the importance of using non-Euclidean distances when tracking the spatiotemporal variation of V. parahaemolyticus, as well as the benefits of cluster detection methods for V. parahaemolyticus risk management in estuaries.
Collapse
|
45
|
Returning to our roots: The use of geospatial data for nurse-led community research. Res Nurs Health 2019; 42:467-475. [PMID: 31599459 DOI: 10.1002/nur.21984] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/30/2019] [Accepted: 09/23/2019] [Indexed: 12/22/2022]
Abstract
In the early 20th century, the public health nurse Lillian Wald addressed the social determinants of health (SDOH) through her work in New York City and her advocacy to improve policy on workplace conditions, education, recreation, and housing. In the early 21st century, addressing the SDOH is a renewed priority and provides nurse researchers with an opportunity to return to our roots. The purpose of this methods paper is to examine how incorporating geospatial data and spatial methodologies into community research can enhance analyses of the complex relationships between social determinants and health. Geospatial technologies, software for mapping and working with geospatial data, statistical methods, and unique considerations are discussed. An exemplar for using geospatial data is presented regarding associations between neighborhood greenspace, neighborhood violence, and children's asthma control. This innovative use of geospatial data illustrates a new frontier in investigating nontraditional connections between the environment and SDOH outcomes.
Collapse
|
46
|
Associations of Distance to Trauma Care, Community Income, and Neighborhood Median Age With Rates of Injury Mortality. JAMA Surg 2019; 153:535-543. [PMID: 29417146 DOI: 10.1001/jamasurg.2017.6133] [Citation(s) in RCA: 37] [Impact Index Per Article: 7.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/05/2023]
Abstract
Importance Rural, low-income, and historically underrepresented minority communities face substantial barriers to trauma care and experience high injury incidence and mortality rates. Characteristics of injury incident locations may contribute to poor injury outcomes. Objective To examine the association of injury scene characteristics with injury mortality. Design, Setting, and Participants In this cross-sectional study, data from trauma center and emergency medical services provided by emergency medical services companies and designated trauma centers in the state of Maryland from January 1, 2015, to December 31, 2015, were geocoded by injury incident locations and linked with injury scene characteristics. Participants included adults who experienced traumatic injury in Maryland and were transported to a designated trauma center or died while in emergency medical services care at the incident scene or in transit. Exposures The primary exposures of interest were geographic characteristics of injury incident locations, including distance to the nearest trauma center, designation level and ownership status of the nearest trauma center, and land use, as well as community-level characteristics such as median age and per capita income. Main Outcomes and Measures Odds of death were estimated with multilevel logistic regression, controlling for individual demographic measures and measures of injury and health. Results Of the 16 082 patients included in this study, 8716 (52.4%) were white, and 5838 (36.3%) were African American. Most patients were male (10 582; 65.8%) and younger than 65 years (12 383; 77.0%). Odds of death increased by 8.0% for every 5-mile increase in distance to the nearest trauma center (OR, 1.08; 95% CI, 1.01-1.15; P = .03). Compared with privately owned level 1 or 2 centers, odds of death increased by 49.9% when the nearest trauma center was level 3 (OR, 1.50; 95% CI, 1.06-2.11; P = .02), and by 80.7% when the nearest trauma center was publicly owned (OR, 1.81; 95% CI, 1.39-2.34; P < .001). At the zip code tabulation area level, odds of death increased by 16.0% for every 5-year increase in median age (OR, 1.16; 95% CI, 1.03-1.30; P = .02), and decreased by 26.6% when the per capita income was greater than $25 000 (OR, 0.73; 95% CI, 0.54-0.99; P = .05). Conclusions and Relevance Injury scene characteristics are associated with injury mortality. Odds of death are highest for patients injured in communities with higher median age or lower per capita income and at locations farthest from level 1 or 2 trauma centers.
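The "per 5-mile" odds ratio is just the exponentiated per-mile logistic coefficient scaled by the increment; a small worked example that back-solves from the reported value:

```python
# Mapping a per-unit logistic coefficient to the "per 5-mile" odds ratios
# reported above; the coefficient is derived from the published OR for
# illustration only.
import numpy as np

or_per_5mi = 1.08                 # reported odds ratio per 5-mile increase
beta_per_mile = np.log(or_per_5mi) / 5
print(f"implied per-mile coefficient: {beta_per_mile:.4f}")
print(f"OR per 10 miles: {np.exp(beta_per_mile * 10):.3f}")
```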
Collapse
|
47
|
Vibrio parahaemolyticus in the Chesapeake Bay: Operational In Situ Prediction and Forecast Models Can Benefit from Inclusion of Lagged Water Quality Measurements. Appl Environ Microbiol 2019; 85:e01007-19. [PMID: 31253685 PMCID: PMC6696964 DOI: 10.1128/aem.01007-19] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/02/2019] [Accepted: 06/24/2019] [Indexed: 12/18/2022] Open
Abstract
Vibrio parahaemolyticus is a leading cause of seafood-borne gastroenteritis. Given its natural presence in brackish waters, there is a need to develop operational forecast models that can sufficiently predict the bacterium's spatial and temporal variation. This work attempted to develop V. parahaemolyticus prediction models using frequently measured time-indexed and -lagged water quality measures. Models were built using a large data set (n = 1,043) of surface water samples from 2007 to 2010 previously analyzed for V. parahaemolyticus in the Chesapeake Bay. Water quality variables were classified as time indexed, 1-month lag, and 2-month lag. Tobit regression models were used to account for V. parahaemolyticus measures below the limit of quantification and to simultaneously estimate the presence and abundance of the bacterium. Models were evaluated using cross-validation and metrics that quantify prediction bias and uncertainty. Presence classification models containing only one type of water quality parameter (e.g., temperature) performed poorly, while models with additional water quality parameters (i.e., salinity, clarity, and dissolved oxygen) performed well. Lagged variable models performed similarly to time-indexed models, and lagged variables occasionally contained a predictive power that was independent of or superior to that of time-indexed variables. Abundance estimation models were less effective, primarily due to a restricted number of samples with abundances above the limit of quantification. These findings indicate that an operational in situ prediction model is attainable but will require a variety of water quality measurements and that lagged measurements will be particularly useful for forecasting. Future work will expand variable selection for prediction models and extend the spatial-temporal extent of predictions by using geostatistical interpolation techniques. IMPORTANCE Vibrio parahaemolyticus is one of the leading causes of seafood-borne illness in the United States and across the globe. Exposure often occurs from the consumption of raw shellfish. Despite public health concerns, there have been only sporadic efforts to develop environmental prediction and forecast models for the bacterium preharvest. This analysis used commonly sampled water quality measurements of temperature, salinity, dissolved oxygen, and clarity to develop models for V. parahaemolyticus in surface water. Predictors also included measurements taken months before water was tested for the bacterium. Results revealed that the use of multiple water quality measurements is necessary for satisfactory prediction performance, challenging current efforts to manage the risk of infection based upon water temperature alone. The results also highlight the potential advantage of including historical water quality measurements. This analysis shows promise and lays the groundwork for future operational prediction and forecast models.
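A Tobit (left-censored) regression like the one described can be hand-rolled with scipy, since statsmodels has no built-in Tobit; the data, covariates, and limit of quantification below are simulated assumptions, not the study's.

```python
# Hedged Tobit sketch: maximize the censored-normal likelihood directly,
# treating measures below the limit of quantification (LOQ) as left-censored.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(7)
n, loq = 500, 0.5                                    # LOQ on log10 scale (assumed)
temp = rng.normal(22, 5, n)                          # water temperature
sal = rng.normal(12, 4, n)                           # salinity
latent = -2 + 0.15 * temp + 0.05 * sal + rng.normal(0, 1, n)
y = np.maximum(latent, loq)                          # observed, censored at LOQ
censored = latent <= loq
X = np.column_stack([np.ones(n), temp, sal])

def neg_loglik(theta):
    beta, log_sigma = theta[:-1], theta[-1]
    sigma = np.exp(log_sigma)
    mu = X @ beta
    ll_obs = norm.logpdf(y[~censored], mu[~censored], sigma)   # uncensored density
    ll_cen = norm.logcdf((loq - mu[censored]) / sigma)         # P(latent <= LOQ)
    return -(ll_obs.sum() + ll_cen.sum())

res = minimize(neg_loglik, x0=np.zeros(X.shape[1] + 1), method="BFGS")
print("beta:", np.round(res.x[:-1], 3), " sigma:", np.exp(res.x[-1]).round(3))
```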
Collapse
|
48
|
Development and Evaluation of Geostatistical Methods for Non-Euclidean-Based Spatial Covariance Matrices. MATHEMATICAL GEOSCIENCES 2019; 51:767-791. [PMID: 31827631 PMCID: PMC6905632 DOI: 10.1007/s11004-019-09791-y] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 06/16/2018] [Accepted: 02/23/2019] [Indexed: 06/01/2023]
Abstract
Customary and routine practice of geostatistical modeling assumes that inter-point distances are a Euclidean metric (i.e., as the crow flies) when characterizing spatial variation. There are many real-world settings, however, in which a non-Euclidean distance is more appropriate, for example in complex bodies of water. Yet if such a distance is used with current semivariogram functions, the resulting spatial covariance matrices are no longer guaranteed to be positive-definite. Previous attempts to address this issue for geostatistical prediction (i.e., kriging) models transform the non-Euclidean space into a Euclidean metric, such as through multi-dimensional scaling (MDS); these attempts, though, estimate spatial covariances only after distances are scaled. An alternative method is proposed to re-estimate a spatial covariance structure originally based on a non-Euclidean distance metric to ensure validity. This method is compared to the standard use of Euclidean distance, as well as a previously utilized MDS method. All methods are evaluated using cross-validation assessments on both simulated and real-world experiments. Results show a high level of bias in prediction variance for the previously developed MDS method that has not been highlighted previously. Conversely, the proposed method offers a preferred tradeoff between prediction accuracy and prediction variance and at times outperforms the existing methods on both sets of metrics. Overall, the results indicate that this proposed method can provide improved geostatistical predictions while ensuring valid results when the use of non-Euclidean distances is warranted.
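The core problem, and a crude eigenvalue-clipping repair (shown only as a contrast to the re-estimation approach proposed in the paper), can be demonstrated on toy data; the distorted metric below is a stand-in for, say, shortest in-water paths around a shoreline.

```python
# Plugging a non-Euclidean distance into a standard exponential covariance
# model can break positive-definiteness; the eigenvalues reveal this.
import numpy as np

rng = np.random.default_rng(8)
n = 40
pts = rng.uniform(0, 10, size=(n, 2))

euclid = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
# Toy "non-Euclidean" distance: a randomly distorted metric (an assumption,
# not real in-water distances) that violates the triangle inequality.
noneuc = euclid * (1 + 0.8 * rng.uniform(0, 1, size=(n, n)))
noneuc = np.triu(noneuc, 1) + np.triu(noneuc, 1).T   # symmetrize, zero diagonal

def exp_cov(d, sill=1.0, rang=3.0):
    """Exponential covariance model applied to a distance matrix."""
    return sill * np.exp(-d / rang)

for name, d in [("Euclidean", euclid), ("non-Euclidean", noneuc)]:
    eigmin = np.linalg.eigvalsh(exp_cov(d)).min()
    print(f"{name}: smallest eigenvalue = {eigmin:.4f} "
          f"({'valid' if eigmin > -1e-10 else 'NOT positive-definite'})")

# Crude repair by eigenvalue clipping, one alternative to re-estimation.
vals, vecs = np.linalg.eigh(exp_cov(noneuc))
c_fixed = vecs @ np.diag(np.clip(vals, 1e-8, None)) @ vecs.T
print("repaired smallest eigenvalue:", np.linalg.eigvalsh(c_fixed).min())
```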
Collapse
|
49
|
Methods for Evaluating the Association Between Alcohol Outlet Density and Violent Crime. Alcohol Clin Exp Res 2019; 43:1714-1726. [PMID: 31157919 DOI: 10.1111/acer.14119] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/26/2018] [Accepted: 05/21/2019] [Indexed: 11/27/2022]
Abstract
BACKGROUND The objective of this analysis was to compare four measurement methods (counts, proximity, mean distance, and spatial access) for calculating alcohol outlet density and violent crime using data from Baltimore, Maryland. METHODS Violent crime data (n = 11,815) were obtained from the Baltimore City Police Department and included homicides, aggravated assaults, rapes, and robberies in 2016. We calculated alcohol outlet density and violent crime at the census block (CB) level (n = 13,016). We then weighted these CB-level measures to the census tract level (n = 197) and conducted a series of regressions. Negative binomial regression was used for count outcomes and linear regression for proximity and spatial access outcomes. Choropleth maps, partial R2, Akaike's Information Criterion, and root mean squared error guided determination of which models yielded lower error and better fit. RESULTS The inference depended on the measurement methods used. Eight models that used a count of alcohol outlets and/or violent crimes failed to detect an association between outlets and crime, and 3 other count-based models detected an association in the opposite direction. Proximity, mean distance, and spatial access methods consistently detected an association between outlets and crime and produced comparable model fits. CONCLUSIONS Proximity, mean distance, and spatial access methods yielded the best model fits and had the lowest levels of error in this urban setting. Spatial access methods may offer conceptual strengths over proximity and mean distance. Conflicting findings in the field may be in part due to error in the way that researchers measure alcohol outlet density.
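The four measures can be sketched against toy coordinates; the 1-unit buffer, the 5-nearest-neighbor mean distance, and the inverse-distance-squared access score below are assumed operationalizations, not the paper's exact definitions.

```python
# Compact sketch of count, proximity, mean-distance, and spatial-access
# measures for census-block centroids vs. outlet points (toy data).
import numpy as np

rng = np.random.default_rng(9)
blocks = rng.uniform(0, 10, size=(50, 2))            # census-block centroids
outlets = rng.uniform(0, 10, size=(120, 2))          # alcohol outlet locations

d = np.linalg.norm(blocks[:, None] - outlets[None, :], axis=-1)

counts = (d <= 1.0).sum(axis=1)                      # outlets within a 1-unit buffer
proximity = d.min(axis=1)                            # distance to nearest outlet
mean_dist = np.sort(d, axis=1)[:, :5].mean(axis=1)   # mean distance to 5 nearest
access = (1.0 / (d + 0.1) ** 2).sum(axis=1)          # inverse-distance-squared score

print("block 0:", counts[0], proximity[0].round(2),
      mean_dist[0].round(2), access[0].round(2))
```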
Collapse
|