1
Grant S, Mayo-Wilson E, Kianersi S, Naaman K, Henschel B. Open Science Standards at Journals that Inform Evidence-Based Policy. Prevention Science 2023;24:1275-1291. [PMID: 37178346] [DOI: 10.1007/s11121-023-01543-z]
Abstract
Evidence-based policy uses intervention research to inform consequential decisions about resource allocation. Research findings are often published in peer-reviewed journals. Because detrimental research practices associated with closed science are common, journal articles report more false positives and exaggerated effect sizes than would be desirable. Journal implementation of standards that promote open science, such as the Transparency and Openness Promotion (TOP) guidelines, could reduce detrimental research practices and improve the trustworthiness of research evidence on intervention effectiveness. We evaluated TOP implementation at 339 peer-reviewed journals that have been used to identify evidence-based interventions for policymaking and programmatic decisions. None of the ten open science standards in TOP was implemented by most journals in their policies (instructions to authors), procedures (manuscript submission systems), or practices (published articles). Journals implementing at least one standard typically encouraged, but did not require, an open science practice. We discuss why and how journals could improve implementation of open science standards to safeguard evidence-based policy.
Affiliation(s)
- Sean Grant
- HEDCO Institute for Evidence-Based Educational Practice, College of Education, University of Oregon, Eugene, OR 97403-1215, USA
- Richard M. Fairbanks School of Public Health, Indiana University, Indianapolis, IN, USA
- Evan Mayo-Wilson
- Gillings School of Global Public Health, University of North Carolina, Chapel Hill, NC, USA
- School of Public Health-Bloomington, Indiana University, Bloomington, IN, USA
- Sina Kianersi
- School of Public Health-Bloomington, Indiana University, Bloomington, IN, USA
- Channing Division of Network Medicine, Department of Medicine, Brigham and Women's Hospital and Harvard Medical School, Boston, MA, USA
- Kevin Naaman
- School of Public Health-Bloomington, Indiana University, Bloomington, IN, USA
- School of Education, Indiana University, Bloomington, IN, USA
- Beate Henschel
- School of Public Health-Bloomington, Indiana University, Bloomington, IN, USA
2
Lee-Easton MJ, Magura S, Maranda MJ, Landsverk J, Rolls-Royce J, Green B, DeCamp W, Abu-Obaid R. A Scoping Review of the Influence of Evidence-Based Program Resources (EBPR) Websites for Behavioral Health. Administration and Policy in Mental Health and Mental Health Services Research 2023;50:379-391. [PMID: 36564667] [PMCID: PMC10191876] [DOI: 10.1007/s10488-022-01245-8]
Abstract
Evidence-based program resources (EBPR) websites evaluate behavioral health programs, practices, or policies (i.e., interventions) according to a predetermined set of research criteria and standards, usually resulting in a summary rating of the strength of an intervention's evidence base. This study is a mixed-methods analysis of the peer-reviewed academic literature on the influence of EBPRs on clinical practice and policy in the behavioral health field. Using an existing framework for a scoping review, we searched PubMed, Web of Science, SCOPUS, and ProQuest for research articles that were published between January 2002 and March 2022, referenced one or more EBPRs, and presented data showing the influence of one or more EBPRs on behavioral health. A total of 210 articles met the inclusion criteria and were classified into five distinct categories of influence, the most important of which was showing the direct impact of one or more EBPRs on behavioral health (8.1% of articles), defined as documenting observable changes in interventions or organizations that are at least partly due to information obtained from EBPR(s). These included impacts at the state legislative and policy-making, community intervention, provider agency, and individual practitioner levels. The majority of influences identified in the study were indirect demonstrations of how EBPRs are used in various ways. However, more studies are needed to learn about the direct impact of information from EBPRs on the behavioral health field, including impacts on clinician practice and treatment outcomes for consumers.
Affiliation(s)
- Miranda J Lee-Easton
- The Evaluation Center, Western Michigan University, 1903 W. Michigan Avenue, Kalamazoo, MI, 49008, USA
- Stephen Magura
- The Evaluation Center, Western Michigan University, 1903 W. Michigan Avenue, Kalamazoo, MI, 49008, USA
- Michael J Maranda
- The Evaluation Center, Western Michigan University, 1903 W. Michigan Avenue, Kalamazoo, MI, 49008, USA
- John Landsverk
- Oregon Social Learning Center, 10 Shelton McMurphey Blvd, Eugene, OR, 97401, USA
- Jennifer Rolls-Royce
- Chadwick Center, Rady Children's Hospital, 3020 Children's Way-Mailcode 5131, San Diego, CA, 92123, USA
- Brandn Green
- Development Services Group Inc, 7315 Wisconsin Ave #800E, Bethesda, MD, 20814, USA
- Whitney DeCamp
- Department of Sociology, Western Michigan University, 1903 W. Michigan Avenue, Kalamazoo, MI, 49008, USA
- Ruqayyah Abu-Obaid
- The Evaluation Center, Western Michigan University, 1903 W. Michigan Avenue, Kalamazoo, MI, 49008, USA
3
Lee-Easton MJ, Magura S. Discrepancies in Ratings of Behavioral Healthcare Interventions Among Evidence-Based Program Resources Websites. Inquiry: A Journal of Medical Care Organization, Provision and Financing 2023;60:469580231186836. [PMID: 37462104] [DOI: 10.1177/00469580231186836]
Abstract
Decision makers in the behavioral health disciplines could benefit from tools to assist them in identifying and implementing evidence-based interventions. One such tool is an evidence-based program resources (EBPR) website. Prior studies documented that multiple EBPRs rating the same intervention may disagree, but research on the reasons for such conflicts is sparse. The present study examines how EBPRs rate interventions and the sources of disagreement between EBPRs when rating the same intervention. This study hypothesizes that EBPRs may disagree about intervention ratings because they use different rating paradigms, different studies as evidence of intervention effectiveness, or both. This study identified 15 EBPRs for inclusion. One author (M.J.L.E.) coded each EBPR for the "tiers of evidence" it used to classify behavioral health interventions and the criteria it used when rating interventions. The author then computed one Jaccard index of similarity for the criteria shared between each pair of EBPRs that co-rated interventions, and one for the studies used by EBPR rating pairs when rating the same program. The authors used a combination of chi-square, correlation, and binary logistic regression analyses to analyze the data. There was a statistically significant negative correlation between the number of Cochrane Risk of Bias criteria shared between 2 EBPRs and the likelihood of those 2 EBPRs agreeing on an intervention rating (r = -.12, P ≤ .01). There was no relationship between the number of studies evaluated by 2 EBPRs and the likelihood of those EBPRs agreeing on an intervention rating. The major reason for disagreements between EBPRs rating the same intervention in this study was differences in the rating criteria used by the EBPRs; the studies used by the EBPRs to rate programs do not appear to have an impact.
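The abstract's pairwise comparison rests on the Jaccard index, |A ∩ B| / |A ∪ B|, applied to the criteria (or studies) shared by two EBPRs. A minimal sketch in Python, using hypothetical criteria sets rather than the study's actual data:

```python
def jaccard(a: set, b: set) -> float:
    """Jaccard similarity of two sets: |A intersect B| / |A union B|."""
    if not a and not b:
        return 1.0  # convention: two empty sets are treated as identical
    return len(a & b) / len(a | b)

# Hypothetical rating criteria for two EBPRs (illustrative only)
ebpr1 = {"random assignment", "outcome measure validity", "attrition", "fidelity"}
ebpr2 = {"random assignment", "attrition", "baseline equivalence"}

# 2 shared criteria out of 5 distinct criteria overall
print(round(jaccard(ebpr1, ebpr2), 3))  # -> 0.4
```

In the study's design, one such index was computed per co-rating pair of EBPRs for criteria and another for the studies each pair drew on, giving the similarity measures that were then correlated with rating agreement.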