1
Wiechert S, Leistra P, Ben-Shakhar G, Pertzov Y, Verschuere B. Open science practices in the false memory literature. Memory 2024:1-13. PMID: 39101456; DOI: 10.1080/09658211.2024.2387108.
Abstract
In response to the replication crisis in psychology, the scientific community has advocated open science practices to promote transparency and reproducibility. Although existing reviews indicate inconsistent and generally low adoption of open science in psychology, a current-day, detailed analysis is lacking. Recognising the significant impact of false memory research in legal contexts, we conducted a preregistered systematic review to assess the integration of open science practices within this field, analysing 388 publications from 2015 to 2023 (including 15 replications and 3 meta-analyses). Our findings indicated a significant yet varied adoption of open science practices. Most studies (86.86%) adhered to at least one measure, with publication accessibility being the most consistently adopted practice at 73.97%. While data sharing demonstrated the most substantial growth, reaching about 75% by 2023, preregistration and analysis script sharing lagged, with 20-25% adoption in 2023. This review highlights a promising trend towards enhanced research quality, transparency, and reproducibility in false memory research. However, the inconsistent implementation of open science practices may still challenge the verification, replication, and interpretation of research findings. Our study underscores the need for a comprehensive adoption of open science to improve research reliability and validity substantially, fostering trust and credibility in psychology.
Affiliation(s)
- Sera Wiechert
- Department of Clinical Psychology, University of Amsterdam, Amsterdam, The Netherlands
- Department of Psychology, The Hebrew University of Jerusalem, Jerusalem, Israel
- Phaedra Leistra
- Department of Experimental Psychology, Ghent University, Ghent, Belgium
- Gershon Ben-Shakhar
- Department of Psychology, The Hebrew University of Jerusalem, Jerusalem, Israel
- Yoni Pertzov
- Department of Psychology, The Hebrew University of Jerusalem, Jerusalem, Israel
- Bruno Verschuere
- Department of Clinical Psychology, University of Amsterdam, Amsterdam, The Netherlands
2
Johnson AL, Bouvette M, Rangu N, Morley T, Schultz A, Torgerson T, Vassar M. Data-Sharing Across Otolaryngology: Comparing Journal Policies and Their Adherence to the FAIR Principles. Ann Otol Rhinol Laryngol 2024; 133:105-110. PMID: 37431814; DOI: 10.1177/00034894231185642.
Abstract
OBJECTIVE Data-sharing plays an essential role in advancing scientific understanding. Here, we aim to identify the commonalities and differences in data-sharing policies endorsed by otolaryngology journals and to assess their adherence to the FAIR (findable, accessible, interoperable, reusable) principles. METHODS Data-sharing policies were searched for among 111 otolaryngology journals, as listed by Scimago Journal & Country Rank. Policies of the top biomedical journals, as ranked by Google Scholar Metrics, were extracted as a comparison. The FAIR principles for scientific data management and stewardship were used as the extraction framework. Extraction was performed in a masked, independent fashion. RESULTS Of the 111 ranked otolaryngology journals, 100 met inclusion criteria. Of those 100 journals, 79 provided data-sharing policies. There was a clear lack of standardization across policies, along with specific gaps in accessibility and reusability that need to be addressed. Seventy-two policies (of 79; 91%) designated that metadata should have globally unique and persistent identifiers. Seventy-one policies (of 79; 90%) specified that metadata should clearly include the identifier of the data they describe. Fifty-six policies (of 79; 71%) outlined that metadata should be richly described with a plurality of accurate and relevant attributes. CONCLUSION Otolaryngology journals have varying data-sharing policies, and adherence to the FAIR principles appears to be moderate. This calls for increased data transparency, allowing results to be reproduced, confirmed, and debated.
Affiliation(s)
- Austin L Johnson
- Department of Otolaryngology, The University of Texas Medical Branch, Galveston, TX, USA
- Max Bouvette
- University of Oklahoma College of Medicine, Oklahoma City, OK, USA
- Nitin Rangu
- University of Oklahoma College of Medicine, Oklahoma City, OK, USA
- Timothy Morley
- Alabama College of Osteopathic Medicine, Dothan, AL, USA
- Adam Schultz
- Oklahoma State University Center for Health Sciences, Tulsa, OK, USA
- Trevor Torgerson
- Department of Head and Neck Surgery & Communication Sciences, Duke University Medical Center, Durham, NC, USA
- Matt Vassar
- Oklahoma State University Center for Health Sciences, Tulsa, OK, USA
3
Prosser AMB, Hamshaw RJT, Meyer J, Bagnall R, Blackwood L, Huysamen M, Jordan A, Vasileiou K, Walter Z. When open data closes the door: A critical examination of the past, present and the potential future for open data guidelines in journals. Br J Soc Psychol 2023; 62:1635-1653. PMID: 36076340; PMCID: PMC10946880; DOI: 10.1111/bjso.12576.
Abstract
Opening data promises to improve research rigour and democratize knowledge production. But it also presents practical, theoretical, and ethical considerations for qualitative researchers in particular. Discussion about open data in qualitative social psychology predates the replication crisis. However, the nuances of this ongoing discussion have not been translated into current journal guidelines on open data. In this article, we summarize ongoing debates about open data from qualitative perspectives, and through a content analysis of 261 journals we establish the state of current journal policies for open data in the domain of social psychology. We critically discuss how current common expectations for open data may not be adequate for establishing qualitative rigour, can introduce ethical challenges, and may place those who wish to use qualitative approaches at a disadvantage in peer review and publication processes. We advise that future open data guidelines should aim to reflect the nuance of arguments surrounding data sharing in qualitative research, and move away from a universal "one-size-fits-all" approach to data sharing. This article outlines the past, present, and the potential future of open data guidelines in social-psychological journals. We conclude by offering recommendations for how journals might more inclusively consider the use of open data in qualitative methods, whilst recognizing and allowing space for the diverse perspectives, needs, and contexts of all forms of social-psychological research.
Affiliation(s)
- Monique Huysamen
- Department of Social Care and Social Work, Manchester Metropolitan University, Manchester, UK
- Abbie Jordan
- Department of Psychology, University of Bath, Bath, UK
- Zoe Walter
- School of Psychology, The University of Queensland, St Lucia, Queensland, Australia
4
Karhulahti VM. Reasons for qualitative psychologists to share human data. Br J Soc Psychol 2023; 62:1621-1634. PMID: 36068662; DOI: 10.1111/bjso.12573.
Abstract
Qualitative data sharing practices in psychology have not developed as rapidly as those in parallel quantitative domains. This is often explained by numerous epistemological, ethical and pragmatic issues concerning qualitative data types. In this article, I provide an alternative to the frequently expressed, often reasonable, concerns regarding the sharing of qualitative human data by highlighting three advantages of qualitative data sharing. I argue that sharing qualitative human data is not by default 'less ethical', 'riskier' and 'impractical' compared with quantitative data sharing, but in some cases more ethical, less risky and easier to manage for sharing because (1) informed consent can be discussed, negotiated and validated; (2) the shared data can be curated by special means; and (3) the privacy risks are mainly local instead of global. I hope this alternative perspective further encourages qualitative psychologists to share their data when it is epistemologically, ethically and pragmatically possible.
Affiliation(s)
- Veli-Matti Karhulahti
- Department of Music, Art and Culture Studies, Faculty of Humanities and Social Sciences, University of Jyväskylä, Jyväskylä, Finland
5
El Amin M, Borders JC, Long HL, Keller MA, Kearney E. Open Science Practices in Communication Sciences and Disorders: A Survey. J Speech Lang Hear Res 2023; 66:1928-1947. PMID: 36417765; PMCID: PMC10554559; DOI: 10.1044/2022_jslhr-22-00062.
Abstract
PURPOSE Open science is a collection of practices that seek to improve the accessibility, transparency, and replicability of science. Although these practices have garnered interest in related fields, it remains unclear whether open science practices have been adopted in the field of communication sciences and disorders (CSD). This study aimed to survey the knowledge, implementation, and perceived benefits and barriers of open science practices in CSD. METHOD An online survey was disseminated to researchers in the United States actively engaged in CSD research. Four core open science practices were examined: preregistration, self-archiving, gold open access, and open data. Data were analyzed using descriptive statistics and regression models. RESULTS Two hundred twenty-two participants met the inclusion criteria. Most participants were doctoral students (38%) or assistant professors (24%) at R1 institutions (58%). Participants reported low knowledge of preregistration and gold open access. There was, however, a high level of desire to learn more about all practices. Implementation of open science practices was also low, most notably for preregistration, gold open access, and open data (< 25%). Predictors of knowledge and participation, as well as perceived barriers to implementation, are discussed. CONCLUSION Although participation in open science appears low in the field of CSD, participants expressed a strong desire to learn more in order to engage in these practices in the future. Supplemental Material and Open Science Form: https://doi.org/10.23641/asha.21569040.
Affiliation(s)
- Mariam El Amin
- Communication Sciences and Disorders, University of Georgia, Athens
- James C. Borders
- Department of Biobehavioral Sciences, Teachers College, Columbia University, New York, NY
- Elaine Kearney
- Department of Speech, Language and Hearing Sciences, Boston University, Boston, MA
6
Zhang L, Ma L. Is open science a double-edged sword? Data sharing and the changing citation pattern of Chinese economics articles. Scientometrics 2023; 128:2803-2818. PMID: 37101973; PMCID: PMC10028759; DOI: 10.1007/s11192-023-04684-8.
Abstract
Data sharing is an important part of open science (OS), and more and more institutions and journals have been enforcing open data (OD) policies. OD is advocated as a way to increase academic influence and promote scientific discovery and development, but this proposition has not been well substantiated. This study explores the nuanced effects of OD policies on the citation pattern of articles, using the case of Chinese economics journals. China Industrial Economics (CIE) is the first and so far only Chinese social science journal to adopt a compulsory OD policy, requiring all published articles to share original data and processing code. We use article-level data and a difference-in-differences (DID) approach to compare the citation performance of articles published in CIE and 36 comparable journals. First, we find that the OD policy quickly increased the number of citations: each article on average received 0.25, 1.19, 0.86, and 0.44 more citations in the first four years after publication, respectively. We also find that the citation benefit of the OD policy rapidly decreased over time, even becoming negative in the fifth year after publication. This changing citation pattern suggests that an OD policy can be a double-edged sword: it can quickly increase citation performance but simultaneously accelerate the aging of articles.
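The difference-in-differences comparison described in this abstract can be sketched as a simple 2x2 contrast on synthetic data. This is a minimal illustration only: the function, the simulated effect size, and the Poisson citation model are assumptions for demonstration, not the study's actual data or code.

```python
import numpy as np

def did_estimate(citations, treated, post):
    """2x2 difference-in-differences:
    (post-policy change in the treated journal) minus
    (post-policy change in the comparison journals)."""
    citations = np.asarray(citations, dtype=float)
    treated = np.asarray(treated, dtype=bool)
    post = np.asarray(post, dtype=bool)
    treated_change = citations[treated & post].mean() - citations[treated & ~post].mean()
    control_change = citations[~treated & post].mean() - citations[~treated & ~post].mean()
    return treated_change - control_change

# Synthetic illustration: articles in the policy-adopting journal gain
# ~1 extra citation after adoption, on top of a time trend shared with
# the comparison journals.
rng = np.random.default_rng(0)
n = 4000
treated = rng.random(n) < 0.5                    # published in the policy journal
post = rng.random(n) < 0.5                       # published after the policy
lam = 3.0 + 0.5 * post + 1.0 * (treated & post)  # shared trend + policy effect
citations = rng.poisson(lam)

print(f"DID estimate: {did_estimate(citations, treated, post):.2f}")
```

With real data one would instead estimate a regression with journal and year fixed effects; the abstract's year-by-year effects (0.25, 1.19, 0.86, 0.44, then negative) suggest dynamic, per-year treatment effects rather than a single pooled contrast.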
Affiliation(s)
- Liwei Zhang
- School of Innovation and Entrepreneurship, Shandong University, 72 Binhai Road, Jimo District, Qingdao, Shandong Province 266237, China
- Liang Ma
- School of Public Administration and Policy, Renmin University of China, 59 Zhongguancun Avenue, Haidian District, Beijing 100872, China
7
Johnson AL, Anderson JM, Bouvette M, Pinero I, Rauh S, Johnson B, Kee M, Heigle B, Tricco AC, Page MJ, McCall Wright P, Vassar M. Clinical trial data-sharing policies among journals, funding agencies, foundations, and other professional organizations: a scoping review. J Clin Epidemiol 2023; 154:42-55. PMID: 36375641; DOI: 10.1016/j.jclinepi.2022.11.009.
Abstract
BACKGROUND AND OBJECTIVES To identify the similarities and differences in data-sharing policies for clinical trial data that are endorsed by biomedical journals, funding agencies, and other professional organizations, and to determine the beliefs and opinions regarding data-sharing policies for clinical trials discussed in articles published in biomedical journals. METHODS Two searches were conducted: a bibliographic search for published articles that present beliefs, opinions, similarities, and differences regarding policies governing the sharing of clinical trial data, and a gray literature search (non-peer-reviewed publications) to identify important data-sharing policies in selected biomedical journals, foundations, funding agencies, and other professional organizations. RESULTS A total of 471 articles were included after database searching and screening, with 45 from the bibliographic search and 426 from the gray literature search. A total of 424 data-sharing policies were included. Fourteen of the 45 published articles from the bibliographic search (31.1%) discussed only advantages of data-sharing policies, 27 (60%) discussed both advantages and disadvantages, and 4 (8.9%) discussed only disadvantages. A total of 216 journals (of 270; 80%) specified a data-sharing policy provided by the journal itself. One hundred industry data-sharing policies were included, and 32 (32%) referenced a data-sharing policy on their website. One hundred and thirty-six organizations (of 327; 42%) specified a data-sharing policy. CONCLUSION Advantages of data-sharing were discussed more often than disadvantages in the literature. Additionally, we found a wide variety of commonalities and differences, such as the lack of standardization between policies and inadequately addressed details regarding the accessibility of research data, in the data-sharing policies endorsed by biomedical journals, funding agencies, and other professional organizations. Our study may not include information on all data-sharing policies, and our data are limited to the entities' descriptions of each policy.
Affiliation(s)
- Austin L Johnson
- Oklahoma State University Center for Health Sciences, Tulsa, OK, USA; The University of Texas Medical Branch, Galveston, TX, USA.
- Israel Pinero
- The University of Texas Medical Branch, Galveston, TX, USA
- Shelby Rauh
- Oklahoma State University Center for Health Sciences, Tulsa, OK, USA
- Bradley Johnson
- Oklahoma State University Center for Health Sciences, Tulsa, OK, USA
- Micah Kee
- Oklahoma State University Center for Health Sciences, Tulsa, OK, USA
- Benjamin Heigle
- Oklahoma State University Center for Health Sciences, Tulsa, OK, USA
- Andrea C Tricco
- Li Ka Shing Knowledge Institute, St. Michael's Hospital, Toronto, Ontario, Canada; Epidemiology Division, Dalla Lana School of Public Health and the Institute of Health Policy, Management and Evaluation, University of Toronto, Toronto, Ontario, Canada; Queen's Collaboration for Health Care Quality, Joanna Briggs Institute Centre of Excellence, Queen's University, Kingston, Ontario, Canada
- Matthew J Page
- School of Public Health and Preventive Medicine, Monash University, Melbourne, Australia
- Matt Vassar
- Oklahoma State University Center for Health Sciences, Tulsa, OK, USA
8
Priestley DR, Staph J, Koneru SD, Rajtmajer SM, Cwiek A, Vervoordt S, Hillary FG. Establishing ground truth in the traumatic brain injury literature: if replication is the answer, then what are the questions? Brain Commun 2022; 5:fcac322. PMID: 36601624; PMCID: PMC9806718; DOI: 10.1093/braincomms/fcac322.
Abstract
The replication crisis poses important challenges to modern science. Central to this challenge is re-establishing ground truths, or the most fundamental theories that serve as the bedrock of a scientific community. However, the goal of identifying the hypotheses with the greatest support is non-trivial given the unprecedented rate of scientific publishing. In this era of high-volume science, the goal of this study is to sample from one research community within clinical neuroscience (traumatic brain injury) and track major trends that have shaped this literature over the past 50 years. To do so, we first conduct a decade-wise (1980-2019) network analysis to examine the scientific communities that shape this literature. To establish the robustness of our findings, we utilized searches from separate search engines (Web of Science; Semantic Scholar). As a second goal, we sought to determine the most highly cited hypotheses influencing the literature in each decade. As a third goal, we searched for any papers referring to 'replication' or efforts to reproduce findings within our >50 000 paper dataset. From this search, 550 papers were analysed to determine the frequency and nature of formal replication studies over time. Finally, to maximize transparency, we provide a detailed procedure for the creation and analysis of our dataset, including a discussion of each of our major decision points, to facilitate similar efforts in other areas of neuroscience. We found that the unparalleled rate of scientific publishing within the brain injury literature, combined with the scarcity of clear hypotheses in individual publications, is a challenge both to evaluating accepted findings and to determining paths forward to accelerate science. Additionally, while the conversation about reproducibility has increased over the past decade, the rate of published replication studies continues to be a negligible proportion of the research. Meta-science and computational methods offer a critical opportunity to assess the state of the science and illuminate pathways forward, but ultimately structural change is needed in the brain injury literature and perhaps others.
Affiliation(s)
- Sai D Koneru
- College of Information Sciences and Technology, Penn State University, University Park, PA 16802, USA
- Sarah M Rajtmajer
- College of Information Sciences and Technology, Penn State University, University Park, PA 16802, USA
- Andrew Cwiek
- Department of Psychology, Penn State University, University Park, PA 16802, USA
- Samantha Vervoordt
- Department of Psychology, Penn State University, University Park, PA 16802, USA
- Frank G Hillary
- Correspondence to: Frank G. Hillary, Professor of Psychology, 313 Bruce V. Moore Building, University Park, PA, USA
9
Kaye LK, Rousaki A, Joyner LC, Barrett LA, Orchard LJ. The Online Behaviour Taxonomy: A conceptual framework to understand behaviour in computer-mediated communication. Comput Human Behav 2022. DOI: 10.1016/j.chb.2022.107443.
10
Heacock ML, Lopez AR, Amolegbe SM, Carlin DJ, Henry HF, Trottier BA, Velasco ML, Suk WA. Enhancing Data Integration, Interoperability, and Reuse to Address Complex and Emerging Environmental Health Problems. Environ Sci Technol 2022; 56:7544-7552. PMID: 35549252; PMCID: PMC9227711; DOI: 10.1021/acs.est.1c08383.
Abstract
Environmental health sciences (EHS) span many diverse disciplines. Within the EHS community, the National Institute of Environmental Health Sciences Superfund Research Program (SRP) funds multidisciplinary research that aims to address pressing and complex questions about how people are exposed to hazardous substances and the related health consequences, with the goal of identifying strategies to reduce exposures and protect human health. While disentangling the interrelationships that contribute to environmental exposures and their effects on human health over the course of life remains difficult, advances in data science and data sharing offer a path forward to explore data across disciplines and reveal new insights. Multidisciplinary SRP-funded teams are well positioned to examine how best to integrate EHS data across diverse research domains to address multifaceted environmental health problems. As such, SRP supported collaborative research projects designed to foster and enhance the interoperability and reuse of diverse and complex data streams. This perspective synthesizes those experiences as a landscape view of the challenges identified while working to increase the FAIR-ness (Findable, Accessible, Interoperable, and Reusable) of EHS data, and of opportunities to address them.
Affiliation(s)
- Michelle L. Heacock
- Superfund Research Program, National Institute of Environmental Health Sciences (NIEHS), National Institutes of Health (NIH), Department of Health and Human Services (DHHS), Research Triangle Park, North Carolina 27709, United States. Tel: 984-287-3267
- Sara M. Amolegbe
- Superfund Research Program, NIEHS, NIH, DHHS, Research Triangle Park, North Carolina 27709, United States
- Danielle J. Carlin
- Superfund Research Program, NIEHS, NIH, DHHS, Research Triangle Park, North Carolina 27709, United States
- Heather F. Henry
- Superfund Research Program, NIEHS, NIH, DHHS, Research Triangle Park, North Carolina 27709, United States
- Brittany A. Trottier
- Superfund Research Program, NIEHS, NIH, DHHS, Research Triangle Park, North Carolina 27709, United States
- William A. Suk
- Superfund Research Program, NIEHS, NIH, DHHS, Research Triangle Park, North Carolina 27709, United States
11
Pragmatic Reductionism: On the Relation between Contingency and Metacontingency. Behav Soc Issues 2022. DOI: 10.1007/s42822-022-00097-z.
12
Anderson JM, Johnson A, Rauh S, Johnson B, Bouvette M, Pinero I, Beaman J, Vassar M. Perceptions and Opinions Towards Data-Sharing: A Survey of Addiction Journal Editorial Board Members. J Sci Pract Integr 2022. PMID: 38804666; PMCID: PMC11129878; DOI: 10.35122/001c.35597.
Abstract
Background We surveyed addiction journal editorial board members to better understand their opinions towards data-sharing. Methods Survey items consisted of Likert-type (e.g., one-to-five scale), multiple-choice, and free-response questions. Journal websites were searched for names and email addresses. Emails were distributed using SurveyMonkey. Descriptive statistics were used to characterize the responses. Results We received 178 responses (of 1039; 17.1%). Of these, 174 individuals agreed to participate in our study (97.8%). Most respondents did not know whether their journal had a data-sharing policy. Board members "somewhat agree" that addiction journals should recommend but not require data-sharing for submitted manuscripts [M=4.09 (SD=0.06); 95% CI: 3.97-4.22]. Items with the highest perceived benefit ratings were "secondary data use (e.g., meta-analysis)" [M=3.44 (SD=0.06); 95% CI: 3.31-3.56] and "increased transparency" [M=3.29 (SD=0.07); 95% CI: 3.14-3.43]. Items perceived to be the greatest barriers to data-sharing included "lack of metadata standards" [M=3.21 (SD=0.08); 95% CI: 3.06-3.36], "no incentive" [M=3.43 (SD=0.07); 95% CI: 3.30-3.57], "inadequate resources" [M=3.53 (SD=0.05); 95% CI: 3.42-3.63], and "protection of privacy" [M=3.22 (SD=0.07); 95% CI: 3.07-3.36]. Conclusion Our results suggest addiction journal editorial board members believe data-sharing is of importance to the research community. However, most board members are unaware of their journals' data-sharing policies, and most believe data-sharing should be recommended but not required. Future efforts aimed at better understanding common reservations about and benefits of data-sharing, as well as avenues to optimize data-sharing while minimizing potential risks, are warranted.
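A side note on the intervals reported above: the bracketed "SD" values behave like standard errors of the mean, since a normal-approximation interval M ± 1.96 × SE reproduces the reported 95% CIs to within rounding. This is an inference from the numbers, not something the abstract states; the sketch below simply checks two of the reported items.

```python
def wald_ci(mean, se, z=1.96):
    """Normal-approximation confidence interval: mean ± z * SE."""
    return (round(mean - z * se, 2), round(mean + z * se, 2))

# Values taken from the abstract; reported intervals shown for comparison.
print(wald_ci(4.09, 0.06))  # reported 95% CI: 3.97-4.22
print(wald_ci(3.53, 0.05))  # reported 95% CI: 3.42-3.63
```

Small last-digit discrepancies are expected, since the authors may have used a t-based interval or unrounded inputs.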
Affiliation(s)
- Shelby Rauh
- Center for Health Sciences, Oklahoma State University
- Jason Beaman
- Center for Health Sciences, Oklahoma State University
- Matt Vassar
- Center for Health Sciences, Oklahoma State University
13
Trassi AP, Leonard SJ, Rodrigues LD, Rodas JA, Santos FH. Mediating factors of statistics anxiety in university students: a systematic review and meta-analysis. Ann N Y Acad Sci 2022; 1512:76-97. PMID: 35211989; DOI: 10.1111/nyas.14746.
Abstract
Statistics plays a key role in many areas of modern society, including technology, social and behavior studies, economics, and the sciences. Statistics anxiety (SA) has a detrimental impact on academic experiences in university populations, although the mediating factors remain underexplored. We conducted the first systematic review and meta-analysis focused on SA in university students in the context of statistical performance, individual differences in statistical learning, self-perceptions regarding the statistics course and instructor, and sociodemographic factors. Searches were carried out in the PsycINFO, PubMed, Scielo, and Web of Science databases according to our preregistration. Forty studies were selected for systematic review. Seventeen were included in a series of six meta-analyses concerning academic achievement, attitudes, self-perception, procrastination, and gender. The findings reveal learning strategies, procrastination, self-efficacy, and self-awareness as predictors of SA. However, the impact of sociodemographic data in these moderators is still uncharted. We conclude with a critical appraisal of the selected studies and present future directions for research in SA.
Affiliation(s)
- Sophie J Leonard
- UCD School of Psychology, University College Dublin, Dublin, Ireland
- Jose A Rodas
- UCD School of Psychology, University College Dublin, Dublin, Ireland; University of Guayaquil, Guayaquil, Ecuador
- Flávia H Santos
- UNESP, São Paulo State University, Bauru, Brazil; UCD School of Psychology, University College Dublin, Dublin, Ireland
14
Burke NL, Frank GKW, Hilbert A, Hildebrandt T, Klump KL, Thomas JJ, Wade TD, Walsh BT, Wang SB, Weissman RS. Open science practices for eating disorders research. Int J Eat Disord 2021; 54:1719-1729. PMID: 34555191; PMCID: PMC9107337; DOI: 10.1002/eat.23607.
Abstract
This editorial seeks to encourage the increased application of three open science practices in eating disorders research: Preregistration, Registered Reports, and the sharing of materials, data, and code. For each of these practices, we introduce updated International Journal of Eating Disorders author and reviewer guidance. Updates include the introduction of open science badges; specific instructions about how to improve transparency; and the introduction of Registered Reports of systematic or meta-analytical reviews. The editorial also seeks to encourage the study of open science practices. Open science practices pose considerable time and other resource burdens. Therefore, research is needed to help determine the value of these added burdens and to identify efficient strategies for implementing open science practices.
Collapse
Affiliation(s)
- Natasha L. Burke
- Department of Psychology, Fordham University, Bronx, New York, USA
- Guido K. W. Frank
- Department of Psychiatry, University of California San Diego, San Diego, California, USA
- Anja Hilbert
- Department of Psychosomatic Medicine and Psychotherapy, Integrated Research and Treatment Center Adiposity Diseases, Behavioral Medicine Research Unit, Leipzig, Germany
- Thomas Hildebrandt
- Center of Excellence in Eating and Weight Disorders, Icahn School of Medicine at Mount Sinai, New York, New York, USA
- Kelly L. Klump
- Department of Psychology, Michigan State University, East Lansing, Michigan, USA
- Jennifer J. Thomas
- Eating Disorders Clinical and Research Program, Massachusetts General Hospital and Department of Psychiatry, Harvard Medical School, Boston, Massachusetts, USA
- Tracey D. Wade
- Blackbird Initiative, Órama Institute for Mental Health and Well-Being, Flinders University, Adelaide, South Australia, Australia
- B. Timothy Walsh
- New York State Psychiatric Institute and Department of Psychiatry, Columbia University Irving Medical Center, New York, New York, USA
- Shirley B. Wang
- Department of Psychology, Harvard University, Cambridge, Massachusetts, USA
Collapse
15
Rosenfeld DL, Balcetis E, Bastian B, Berkman ET, Bosson JK, Brannon TN, Burrow AL, Cameron CD, Chen S, Cook JE, Crandall C, Davidai S, Dhont K, Eastwick PW, Gaither SE, Gangestad SW, Gilovich T, Gray K, Haines EL, Haselton MG, Haslam N, Hodson G, Hogg MA, Hornsey MJ, Huo YJ, Joel S, Kachanoff FJ, Kraft-Todd G, Leary MR, Ledgerwood A, Lee RT, Loughnan S, MacInnis CC, Mann T, Murray DR, Parkinson C, Pérez EO, Pyszczynski T, Ratner K, Rothgerber H, Rounds JD, Schaller M, Silver RC, Spellman BA, Strohminger N, Swim JK, Thoemmes F, Urganci B, Vandello JA, Volz S, Zayas V, Tomiyama AJ. Psychological Science in the Wake of COVID-19: Social, Methodological, and Metascientific Considerations. PERSPECTIVES ON PSYCHOLOGICAL SCIENCE 2021; 17:311-333. [PMID: 34597198 DOI: 10.1177/1745691621999374] [Citation(s) in RCA: 15] [Impact Index Per Article: 5.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/10/2023]
Abstract
The COVID-19 pandemic has extensively changed the state of psychological science, from what research questions psychologists can ask to which methodologies psychologists can use to investigate them. In this article, we offer a perspective on how to optimize new research in the pandemic's wake. Because this pandemic is inherently a social phenomenon, an event that hinges on human-to-human contact, we focus on socially relevant subfields of psychology. We highlight specific psychological phenomena that have likely shifted as a result of the pandemic and discuss theoretical, methodological, and practical considerations of conducting research on these phenomena. After this discussion, we evaluate metascientific issues that have been amplified by the pandemic. We aim to demonstrate how theoretically grounded views on the COVID-19 pandemic can help make psychological science stronger, not weaker, in its wake.
Collapse
Affiliation(s)
- Brock Bastian
- Melbourne School of Psychological Sciences, University of Melbourne
- Elliot T Berkman
- Department of Psychology, University of Oregon; Center for Translational Neuroscience, University of Oregon
- C Daryl Cameron
- Department of Psychology, The Pennsylvania State University; Rock Ethics Institute, The Pennsylvania State University
- Serena Chen
- Department of Psychology, University of California, Berkeley
- Kurt Gray
- Department of Psychology and Neuroscience, University of North Carolina, Chapel Hill
- Martie G Haselton
- Department of Psychology, University of California, Los Angeles; Department of Communication, University of California, Los Angeles; Institute for Society and Genetics, University of California, Los Angeles
- Nick Haslam
- Melbourne School of Psychological Sciences, University of Melbourne
- Yuen J Huo
- Department of Psychology, University of California, Los Angeles
- Frank J Kachanoff
- Department of Psychology and Neuroscience, University of North Carolina, Chapel Hill
- Mark R Leary
- Department of Psychology and Neuroscience, Duke University
- Randy T Lee
- Department of Psychology, Cornell University
- Steve Loughnan
- School of Philosophy, Psychology, and Language Sciences, The University of Edinburgh
- Traci Mann
- Department of Psychology, University of Minnesota
- Efrén O Pérez
- Department of Psychology, University of California, Los Angeles; Department of Political Science, University of California, Los Angeles
- Tom Pyszczynski
- Department of Psychology, University of Colorado at Colorado Springs
- Mark Schaller
- Department of Psychology, University of British Columbia
- Roxane Cohen Silver
- Department of Psychological Science, University of California, Irvine; Department of Medicine, University of California, Irvine; Program in Public Health, University of California, Irvine
- Nina Strohminger
- Department of Legal Studies and Business Ethics, Wharton School of Business, University of Pennsylvania; Department of Psychology, University of Pennsylvania
- Janet K Swim
- Department of Psychology, The Pennsylvania State University
- Felix Thoemmes
- Department of Human Development, Cornell University; Department of Psychology, Cornell University
- Sarah Volz
- Department of Psychology, University of Minnesota
Collapse
16
17
Kim Y. A study of the determinants of psychologists' data sharing and open data badge adoption. LEARNED PUBLISHING 2021. [DOI: 10.1002/leap.1388] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/06/2022]
Affiliation(s)
- Youngseek Kim
- Department of Library and Information Science, Sungkyunkwan University, 25-2 Sungkyunkwan-ro, Jongno-gu, Seoul 03063, Republic of Korea
Collapse
18
Kim Y. An empirical study of research ethics and their role in psychologists’ data sharing intentions using consequentialism theory of ethics. JOURNAL OF LIBRARIANSHIP AND INFORMATION SCIENCE 2021. [DOI: 10.1177/09610006211008967] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/16/2022]
Abstract
The purpose of this study is to examine how the different ethical dimensions of egoism, utilitarianism, and deontology each contribute to the formation of psychologists' research ethics for data sharing, and how those research ethics ultimately shape psychologists' decisions about whether to engage in data sharing. This research utilized the consequentialism theory of ethics as its theoretical framework to develop a research model of psychologists' data sharing as mediated by research ethics. It conducted an online survey of psychologists in US academic institutions and collected a total of 362 valid responses. It then employed structural equation modeling to evaluate the research model and related hypotheses regarding psychologists' data sharing intentions as mediated by the profession's research ethics. This research found that perceived career benefit, perceived community benefit, and the norm of data sharing all significantly contribute to the formation of psychologists' research ethics for data sharing, and that these research ethics, along with perceived community benefit and the norm of data sharing, significantly influence psychologists' data sharing intentions. This study suggests that the consequentialism theory of ethics explains well psychologists' formation of research ethics for data sharing and their decisions to engage in data sharing. It also suggests that research communities can better promote researchers' data sharing behaviors by stimulating their research ethics through the different ethical dimensions of egoism (career benefit), utilitarianism (community benefit), and deontology (norm of data sharing).
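The mediation structure tested here (ethical dimensions → research ethics → sharing intentions) can be illustrated with a toy simulation. The data, path coefficients, and variable names below are invented for illustration; they are not the study's survey data, and a single indirect effect estimated by ordinary least squares is only a small piece of a full structural equation model:

```python
import random

def cov(xs, ys):
    """Sample covariance (cov(x, x) is the sample variance)."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(xs) - 1)

rng = random.Random(7)
n = 5000
X = [rng.gauss(0, 1) for _ in range(n)]                          # e.g. norm of data sharing
M = [0.5 * x + rng.gauss(0, 1) for x in X]                       # mediator: research ethics
Y = [0.6 * m + 0.2 * x + rng.gauss(0, 1) for x, m in zip(X, M)]  # sharing intention

a = cov(X, M) / cov(X, X)  # a-path: predictor -> mediator
# b-path: coefficient of M in the two-predictor regression Y ~ M + X
det = cov(M, M) * cov(X, X) - cov(X, M) ** 2
b = (cov(M, Y) * cov(X, X) - cov(X, Y) * cov(X, M)) / det
indirect = a * b  # mediated (indirect) effect, ~0.5 * 0.6 = 0.3 by construction
print(round(a, 2), round(b, 2), round(indirect, 2))
```

The indirect effect a·b is what "mediated by research ethics" means operationally: the part of the predictor's influence on sharing intentions that passes through the mediator.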
Collapse
19
Lishner DA. Sorting the File Drawer: A Typology for Describing Unpublished Studies. PERSPECTIVES ON PSYCHOLOGICAL SCIENCE 2021; 17:252-269. [PMID: 33645325 DOI: 10.1177/1745691620979831] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/15/2022]
Abstract
A typology of unpublished studies is presented to describe various types of unpublished studies and the reasons for their nonpublication. Reasons for nonpublication are classified by whether they stem from an awareness of the study results (result-dependent reasons) or not (result-independent reasons) and whether the reasons affect the publication decisions of individual researchers or reviewers/editors. I argue that result-independent reasons for nonpublication are less likely to introduce motivated reasoning into the publication decision process than are result-dependent reasons. I also argue that some reasons for nonpublication would produce beneficial as opposed to problematic publication bias. The typology of unpublished studies provides a descriptive scheme that can facilitate understanding of the population of study results across the field of psychology, within subdisciplines of psychology, or within specific psychology research domains. The typology also offers insight into different publication biases and research-dissemination practices and can guide individual researchers in organizing their own file drawers of unpublished studies.
Collapse
20
Ashworth M, Palikara O, Burchell E, Purser H, Nikolla D, Van Herwegen J. Online and Face-to-Face Performance on Two Cognitive Tasks in Children With Williams Syndrome. Front Psychol 2021; 11:594465. [PMID: 33613354 PMCID: PMC7889503 DOI: 10.3389/fpsyg.2020.594465] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/13/2020] [Accepted: 12/24/2020] [Indexed: 12/02/2022] Open
Abstract
There has been an increase in cognitive assessment via the Internet, especially since the coronavirus disease 2019 pandemic heightened the need for remote psychological assessment. This is the first study to investigate the appropriateness of conducting cognitive assessments online with children with a neurodevelopmental condition and intellectual disability, namely, Williams syndrome (WS). This study compared Raven's Colored Progressive Matrices (RCPM) and British Picture Vocabulary Scale (BPVS) scores from two different groups of children with WS aged 10–11 years who were assessed online (n = 14) or face-to-face (RCPM n = 12; BPVS n = 24). Bayesian t-tests showed that children's RCPM scores were similar across testing conditions, but suggested that BPVS scores were higher for participants assessed online. The differences between task protocols are discussed in line with these findings, as well as the implications for neurodevelopmental research.
Collapse
Affiliation(s)
- Maria Ashworth
- Department of Psychology and Human Development, UCL Institute of Education, University College London, London, United Kingdom
- Olympia Palikara
- Department of Education Studies, University of Warwick, Coventry, United Kingdom
- Elizabeth Burchell
- Department of Psychological Sciences, Birkbeck, University of London, London, United Kingdom
- Harry Purser
- Department of Psychology, Nottingham Trent University, Nottingham, United Kingdom
- Dritan Nikolla
- Department of Psychology, Kingston University, Kingston upon Thames, United Kingdom
- Jo Van Herwegen
- Department of Psychology and Human Development, UCL Institute of Education, University College London, London, United Kingdom
Collapse
21
Blask K, Gerhards L, Jalynskij M. PsyCuraDat: Designing a User-Oriented Curation Standard for Behavioral Psychological Research Data. Front Psychol 2021; 11:579397. [PMID: 33584413 PMCID: PMC7874086 DOI: 10.3389/fpsyg.2020.579397] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/02/2020] [Accepted: 12/14/2020] [Indexed: 11/26/2022] Open
Abstract
Starting from the observation that data sharing in general and sharing of reusable behavioral data in particular is still scarce in psychology, we set out to develop a curation standard for behavioral psychological research data rendering data reuse more effective and efficient. Specifically, we propose a standard that is oriented toward the requirements of the psychological research process, thus considering the needs of researchers in their role as data providers and data users. To this end, we suggest that researchers should describe their data on three documentation levels reflecting researchers’ central decisions during the research process. In particular, these levels describe researchers’ decisions on the concrete research design that is most suitable to address the corresponding research question, its operationalization as well as a precise description of the subsequent data collection and analysis process. Accordingly, the first documentation level represents, for instance, researchers’ decision on the concrete hypotheses, inclusion/exclusion criteria and the number of measurement points as well as a conceptual presentation of all substantial variables included in the design. On the second level these substantial variables are presented within an extended codebook allowing for the linkage between the conceptual research design and the actually operationalized variables as presented within the data. Finally, the third level includes all materials, data preparation and analyses scripts as well as a detailed procedure graphic that allows the data user to link the information from all three documentation levels at a single glance. After a comprehensive presentation of the standard, we will offer some arguments for its integration into the psychological research process.
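The three documentation levels the standard proposes could be serialized as machine-readable metadata along the following lines. Every field name and file path below is an illustrative stand-in, not the standard's actual schema:

```python
import json

# Hypothetical curation record; all names are illustrative, not the PsyCuraDat schema.
curation_record = {
    "level_1_research_design": {
        "hypotheses": ["H1: condition A speeds responses relative to condition B"],
        "inclusion_exclusion_criteria": ["age 18-35", "normal or corrected vision"],
        "measurement_points": 2,
        "substantial_variables": ["condition", "response_time"],
    },
    "level_2_codebook": {
        # Links each conceptual variable to the column actually present in the data.
        "condition": {"type": "factor", "levels": ["A", "B"], "maps_to": "cond"},
        "response_time": {"type": "numeric", "unit": "ms", "maps_to": "rt"},
    },
    "level_3_materials": {
        "stimuli": "materials/stimuli.zip",
        "preparation_script": "scripts/prepare.R",
        "analysis_script": "scripts/analyze.R",
        "procedure_graphic": "docs/procedure.pdf",
    },
}

print(json.dumps(curation_record, indent=2))
```

The second level is the linking layer: a data user can trace each conceptual variable of the design down to the concrete column in the shared dataset.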
Collapse
Affiliation(s)
- Katarina Blask
- Archiving and Publication Services, Leibniz Institute for Psychology, Trier, Germany
- Lea Gerhards
- Archiving and Publication Services, Leibniz Institute for Psychology, Trier, Germany
- Maria Jalynskij
- Department of Psychology, University of Koblenz-Landau, Landau, Germany
Collapse
22
Does open data boost journal impact: evidence from Chinese economics. Scientometrics 2021; 126:3393-3419. [PMID: 33612885 PMCID: PMC7882418 DOI: 10.1007/s11192-021-03897-z] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/24/2020] [Accepted: 02/02/2021] [Indexed: 11/23/2022]
Abstract
To encourage research transparency and replication, more and more journals have been requiring authors to share the original datasets and analytic procedures supporting their publications. Does open data boost journal impact? In this article, we report one of the first empirical studies to assess the effects of open data on journal impact. China Industrial Economics (CIE) mandated that authors open their research data at the end of 2016, making it the first Chinese journal to embrace open data and providing a natural experiment for policy evaluation. We use data on 37 Chinese economics journals from 2001 to 2019 and apply the synthetic control method to causally estimate the effects of open data; our results show that open data has significantly increased the citations of journal articles. On average, the current- and second-year citations of articles published in CIE increased by one to four times, and articles published before the open data policy also benefited from a spillover effect. Our findings suggest that journals can leverage compulsory open data to develop their reputation and amplify their academic impact.
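The synthetic control logic can be sketched as follows, with made-up citation counts and three hypothetical donor journals (the study itself uses 37 journals over 2001–2019 and a proper optimizer rather than a grid search):

```python
from itertools import product

# Hypothetical yearly citations: 4 pre-treatment years, 2 post-treatment years.
treated = [100, 110, 120, 130, 180, 200]          # journal adopting open data
donors = [
    [90, 100, 110, 120, 125, 130],                # donor journal A
    [120, 125, 130, 140, 150, 155],               # donor journal B
    [80, 95, 105, 115, 120, 125],                 # donor journal C
]
PRE = 4  # number of pre-treatment years

def synthetic(weights, t):
    """Outcome of the synthetic journal (weighted donor average) in year t."""
    return sum(w * d[t] for w, d in zip(weights, donors))

def pre_mse(weights):
    """Mean squared pre-treatment gap between treated and synthetic journal."""
    return sum((treated[t] - synthetic(weights, t)) ** 2 for t in range(PRE)) / PRE

# Coarse grid search over the weight simplex (weights >= 0, summing to 1).
steps = [i / 20 for i in range(21)]
best_w, best_err = None, float("inf")
for w1, w2 in product(steps, steps):
    w3 = 1 - w1 - w2
    if w3 < -1e-9:
        continue
    w = (w1, w2, max(w3, 0.0))
    err = pre_mse(w)
    if err < best_err:
        best_w, best_err = w, err

# Estimated policy effect: mean post-treatment gap (actual minus synthetic).
gaps = [treated[t] - synthetic(best_w, t) for t in range(PRE, len(treated))]
effect = sum(gaps) / len(gaps)
print("weights:", best_w, "pre-fit MSE:", round(best_err, 2), "effect:", round(effect, 1))
```

The synthetic journal is the counterfactual: how the treated journal's citations would plausibly have evolved without the open data mandate, so the post-treatment gap is read as the policy effect.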
Collapse
23
Carpenter TP, Law KC. Optimizing the scientific study of suicide with open and transparent research practices. Suicide Life Threat Behav 2021; 51:36-46. [PMID: 33624871 DOI: 10.1111/sltb.12665] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Indexed: 11/29/2022]
Abstract
Suicide research is vitally important, yet, like psychology research more broadly, it faces methodological challenges. In recent years, researchers have raised concerns about standard practices in psychological research, concerns that apply to suicide research and raise questions about its robustness and validity. In the present paper, we review these concerns and the corresponding solutions put forth by the "open science" community. These include using open science platforms, pre-registering studies, ensuring reproducible analyses, using high-powered studies, ensuring open access to research materials and products, and conducting replication studies. We build upon existing guides, address specific obstacles faced by suicide researchers, and offer a clear set of recommended practices. In particular, we consider challenges that suicide researchers may face in seeking to adopt "open science" practices (e.g., prioritizing large samples) and suggest possible strategies the field may use to ensure robust and transparent research despite these challenges.
Collapse
Affiliation(s)
- Keyne C Law
- Seattle Pacific University, Seattle, Washington, USA
Collapse
24
Riley WT, Bibb K, Hargrave S, Fearon P. Publication rates from biomedical and behavioral and social science R01s funded by the National Institutes of Health. PLoS One 2020; 15:e0242271. [PMID: 33186405 PMCID: PMC7665634 DOI: 10.1371/journal.pone.0242271] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/19/2020] [Accepted: 10/29/2020] [Indexed: 11/29/2022] Open
Abstract
Prior research has shown a serious lack of research transparency resulting from the failure to publish study results in a timely manner. The National Institutes of Health (NIH) has increased its use of publication rate and time to publication as metrics for grant productivity. In this study, we analyze the publications associated with all R01 and U01 grants funded from 2008 through 2014, providing sufficient time for these grants to publish their findings, and identify predictors of time to publication based on a number of variables, including whether a grant was coded as a behavioral and social sciences research (BSSR) grant. Overall, 2.4% of the 27,016 R01 and U01 grants did not have a publication associated with the grant within 60 months of the project start date, and this rate of zero publications was higher for BSSR grants (4.6%) than for non-BSSR grants (1.9%). Mean time to first publication was 15.2 months, longer for BSSR grants (22.4 months) than non-BSSR grants (13.6 months). Survival curves showed that non-BSSR grants reached first publication more rapidly than BSSR grants. Cox regression models showed that human research (vs. animal, neither, or both) and clinical trials research (vs. not) are the strongest predictors of time to publication and failure to publish, but even after accounting for these and other predictors, BSSR grants continued to show longer times to first publication and a greater risk of no publications than non-BSSR grants. These findings indicate that even with liberal criteria for publication (any publication associated with a grant), a small percentage of R01 and U01 grantees fail to publish in a timely manner, and that a number of factors, including human research, clinical trial research, child research, not being an early stage investigator, and conducting behavioral and social sciences research, lengthen time to first publication.
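The survival-curve analysis can be illustrated with a minimal Kaplan-Meier estimator. The grant publication times below are invented for illustration, not the study's records:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier estimate. times: months to first publication per grant;
    events: 1 = published, 0 = censored (still unpublished at follow-up).
    Returns [(t, estimated probability of remaining unpublished beyond t)]."""
    at_risk = len(times)
    surv, curve = 1.0, []
    for t in sorted(set(times)):
        published = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 1)
        if published:
            surv *= 1 - published / at_risk
            curve.append((t, surv))
        at_risk -= sum(1 for ti in times if ti == t)  # drop published and censored
    return curve

# Ten hypothetical grants; 60 = still unpublished at the 60-month window (censored).
months = [10, 12, 15, 15, 20, 24, 30, 40, 60, 60]
events = [1, 1, 1, 1, 1, 1, 1, 1, 0, 0]
for t, s in kaplan_meier(months, events):
    print(t, round(s, 3))
```

Comparing two such curves (BSSR vs. non-BSSR) is what "more rapid reduction of risk" means: the faster-publishing group's curve drops sooner. The Cox models in the study then adjust these comparisons for covariates.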
Collapse
Affiliation(s)
- William T. Riley
- Office of Behavioral and Social Sciences Research, National Institutes of Health, Bethesda, MD, United States of America
- Katrina Bibb
- Lexical Intelligence, Rockville, MD, United States of America
- Sara Hargrave
- Office of Behavioral and Social Sciences Research, National Institutes of Health, Bethesda, MD, United States of America
- Paula Fearon
- Lexical Intelligence, Rockville, MD, United States of America
Collapse
25
Opening Pandora's Box: Peeking inside Psychology's data sharing practices, and seven recommendations for change. Behav Res Methods 2020; 53:1455-1468. [PMID: 33179123 PMCID: PMC8367918 DOI: 10.3758/s13428-020-01486-1] [Citation(s) in RCA: 11] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 09/13/2020] [Indexed: 11/17/2022]
Abstract
Open data sharing is a valuable practice that can enhance the impact, reach, and transparency of a research project. While widely advocated by many researchers and mandated by some journals and funding agencies, little is known about detailed practices across psychological science. In a pre-registered study, we show that overall, few research papers directly link to available data in many, though not all, journals. Most importantly, even where open data could be identified, the majority lacked completeness and reusability, conclusions that closely mirror those reported outside of psychology. Exploring the reasons behind these findings, we offer seven specific recommendations for engineering and incentivizing improved practices, so that the potential of open data can be better realized across psychology and social science more generally.
Collapse
26
Lukács G, Specker E. Dispersion matters: Diagnostics and control data computer simulation in Concealed Information Test studies. PLoS One 2020; 15:e0240259. [PMID: 33007043 PMCID: PMC7531802 DOI: 10.1371/journal.pone.0240259] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/16/2020] [Accepted: 09/23/2020] [Indexed: 01/12/2023] Open
Abstract
Binary classification has numerous applications. For one, lie detection methods typically aim to classify each tested person either as "liar" or as "truthteller" based on the given test results. To infer practical implications, as well as to compare different methods, it is essential to assess diagnostic efficiency, for example by demonstrating the number of correctly classified persons. However, this is not always straightforward. In Concealed Information Tests (CITs), the key predictor value (the probe-irrelevant difference) for "truthtellers" is always similar (zero on average), and "liars" are always distinguished by a larger value (i.e., a larger number resulting from the CIT, as compared to the zero baseline). Thus, in general, the larger the predictor values a given CIT method obtains for "liars" on average, the better the method is assumed to be. This has indeed been assumed in countless studies; therefore, when comparing the classification efficiencies of two different designs, the mean "liar" predictor values in the two designs were simply compared to each other (hence not collecting "truthteller" data, to spare resources). We show, based on the meta-data of 12 different experimental designs collected in response time-based CIT studies, that differences in dispersion (i.e., variance in the data, e.g., the extent of random deviations from the zero average in the case of "truthtellers") can substantially influence classification efficiency, to the point that, in extreme cases, one design may even be superior in classification despite having a smaller mean "liar" predictor value. We also introduce a computer simulation procedure to estimate classification efficiency in the absence of "truthteller" data, and validate this procedure via a meta-analysis comparing outcomes based on empirical data versus simulated data.
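The core point, that dispersion can outweigh a larger mean "liar" effect, can be illustrated with a small simulation. The normal distributions and parameter values below are invented for illustration and are not taken from the 12 analyzed designs:

```python
import random

def simulate_auc(liar_mean, liar_sd, truth_sd, n=4000, seed=1):
    """Estimate AUC as the probability that a random 'liar' predictor value
    exceeds a random 'truthteller' value (Mann-Whitney formulation)."""
    rng = random.Random(seed)
    liars = [rng.gauss(liar_mean, liar_sd) for _ in range(n)]
    truths = [rng.gauss(0.0, truth_sd) for _ in range(n)]  # zero on average
    return sum(l > t for l, t in zip(liars, truths)) / n

# Design A: smaller mean "liar" effect, but low dispersion.
auc_a = simulate_auc(liar_mean=30, liar_sd=20, truth_sd=20)
# Design B: larger mean "liar" effect, but much higher dispersion.
auc_b = simulate_auc(liar_mean=40, liar_sd=60, truth_sd=60)
print(round(auc_a, 3), round(auc_b, 3))  # design A classifies better
```

What drives classification is effectively the mean liar effect relative to the spread of the two distributions, so comparing mean effects alone (with no truthteller data) can rank designs incorrectly.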
Collapse
Affiliation(s)
- Gáspár Lukács
- Department of Cognition, Emotion, and Methods in Psychology, University of Vienna, Vienna, Austria
- Department of Security and Crime Science, University College London, London, United Kingdom
- Eva Specker
- Department of Cognition, Emotion, and Methods in Psychology, University of Vienna, Vienna, Austria
Collapse
27
Abstract
A consensus on the importance of open data and reproducible code is emerging. How should data and code be shared to maximize the key desiderata of reproducibility, permanence, and accessibility? Research assets should be stored persistently in formats that are not software restrictive, and documented so that others can reproduce and extend the required computations. The sharing method should be easy to adopt by already busy researchers. We suggest the R package standard as a solution for creating, curating, and communicating research assets. The R package standard, with extensions discussed herein, provides a format for assets and metadata that satisfies the above desiderata and facilitates reproducibility, open access, and sharing of materials through online platforms like GitHub and the Open Science Framework. We discuss a stack of R resources that help users create reproducible collections of research assets, from experiments to manuscripts, in the RStudio interface. We created an R package, vertical, to help researchers incorporate these tools into their workflows, and discuss its functionality at length in an online supplement. Together, these tools may increase the reproducibility and openness of psychological science.
Collapse
28
Quantity Over Quality? Reproducible Psychological Science from a Mixed Methods Perspective. COLLABRA: PSYCHOLOGY 2020. [DOI: 10.1525/collabra.284] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/17/2022] Open
Abstract
A robust dialogue about the (un)reliability of psychological science findings has emerged in recent years. In response, metascience researchers have developed innovative tools to increase rigor, transparency, and reproducibility, stimulating rapid improvement and adoption of open science practices. However, existing reproducibility guidelines are geared toward purely quantitative study designs. This leaves some ambiguity as to how such guidelines should be implemented in mixed methods (MM) studies, which combine quantitative and qualitative research. Drawing on extant literature, our own experiences, and feedback from 79 self-identified MM researchers, the current paper addresses two main questions: (a) how and to what extent do existing reproducibility guidelines apply to MM study designs; and (b) can existing reproducibility guidelines be improved by incorporating best practices from qualitative research and epistemology? In answer, we offer 10 key recommendations for use within and outside of MM research. Finally, we argue that good science and good ethical practice are mutually reinforcing and lead to meaningful, credible science.
Collapse
29
Actionable recommendations for narrowing the science-practice gap in open science. ORGANIZATIONAL BEHAVIOR AND HUMAN DECISION PROCESSES 2020. [DOI: 10.1016/j.obhdp.2020.02.007] [Citation(s) in RCA: 30] [Impact Index Per Article: 7.5] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/15/2023]
30
Kassam-Adams N, Olff M. Embracing data preservation, sharing, and re-use in traumatic stress research. Eur J Psychotraumatol 2020; 11:1739885. [PMID: 32341765 PMCID: PMC7170380 DOI: 10.1080/20008198.2020.1739885] [Citation(s) in RCA: 8] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 02/21/2020] [Revised: 03/01/2020] [Accepted: 03/03/2020] [Indexed: 02/07/2023] Open
Abstract
This editorial argues that it is time for the traumatic stress field to join the growing international movement towards Findable, Accessible, Interoperable, and Re-usable (FAIR) research data, and that we are well-positioned to do so. The field has a huge, largely untapped resource in the enormous number of rich potentially re-usable datasets that are not currently shared or preserved. We have several promising shared data resources created via international collaborative efforts by traumatic stress researchers, but we do not yet have common standards for data description, sharing, or preservation. And, despite the promise of novel findings from data sharing and re-use, there are a number of barriers to researchers' adoption of FAIR data practices. We present a vision for the future of FAIR traumatic stress data, and a call to action for the traumatic stress research community and individual researchers and research teams to help achieve this vision.
Collapse
Affiliation(s)
- Nancy Kassam-Adams
- Department of Pediatrics, Children’s Hospital of Philadelphia and University of Pennsylvania Perelman School of Medicine, Philadelphia, PA, USA
- Miranda Olff
- Psychiatry, University of Amsterdam (Universiteit van Amsterdam), Amsterdam, Netherlands
Collapse
31
Rahnev D, Desender K, Lee ALF, Adler WT, Aguilar-Lleyda D, Akdoğan B, Arbuzova P, Atlas LY, Balcı F, Bang JW, Bègue I, Birney DP, Brady TF, Calder-Travis J, Chetverikov A, Clark TK, Davranche K, Denison RN, Dildine TC, Double KS, Duyan YA, Faivre N, Fallow K, Filevich E, Gajdos T, Gallagher RM, de Gardelle V, Gherman S, Haddara N, Hainguerlot M, Hsu TY, Hu X, Iturrate I, Jaquiery M, Kantner J, Koculak M, Konishi M, Koß C, Kvam PD, Kwok SC, Lebreton M, Lempert KM, Ming Lo C, Luo L, Maniscalco B, Martin A, Massoni S, Matthews J, Mazancieux A, Merfeld DM, O'Hora D, Palser ER, Paulewicz B, Pereira M, Peters C, Philiastides MG, Pfuhl G, Prieto F, Rausch M, Recht S, Reyes G, Rouault M, Sackur J, Sadeghi S, Samaha J, Seow TXF, Shekhar M, Sherman MT, Siedlecka M, Skóra Z, Song C, Soto D, Sun S, van Boxtel JJA, Wang S, Weidemann CT, Weindel G, Wierzchoń M, Xu X, Ye Q, Yeon J, Zou F, Zylberberg A. The Confidence Database. Nat Hum Behav 2020; 4:317-325. [PMID: 32015487 PMCID: PMC7565481 DOI: 10.1038/s41562-019-0813-1] [Citation(s) in RCA: 49] [Impact Index Per Article: 12.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/06/2019] [Accepted: 12/11/2019] [Indexed: 11/09/2022]
Abstract
Understanding how people rate their confidence is critical for the characterization of a wide range of perceptual, memory, motor and cognitive processes. To enable the continued exploration of these processes, we created a large database of confidence studies spanning a broad set of paradigms, participant populations and fields of study. The data from each study are structured in a common, easy-to-use format that can be easily imported and analysed using multiple software packages. Each dataset is accompanied by an explanation regarding the nature of the collected data. At the time of publication, the Confidence Database (which is available at https://osf.io/s46pr/) contained 145 datasets with data from more than 8,700 participants and almost 4 million trials. The database will remain open for new submissions indefinitely and is expected to continue to grow. Here we show the usefulness of this large collection of datasets in four different analyses that provide precise estimations of several foundational confidence-related effects.
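Pooling such standardized datasets is straightforward precisely because every file shares one layout. The sketch below assumes the database's common column naming (e.g., Subj_idx, Stimulus, Response, Confidence; treat these as assumptions) and uses an inline stand-in file rather than a real dataset:

```python
import csv, io
from collections import defaultdict
from statistics import mean

# Inline stand-in for one dataset file (real files live at https://osf.io/s46pr/).
fake_dataset = io.StringIO(
    "Subj_idx,Stimulus,Response,Confidence\n"
    "1,left,left,4\n"
    "1,right,left,2\n"
    "2,left,left,3\n"
    "2,right,right,4\n"
)

by_subject = defaultdict(list)
for row in csv.DictReader(fake_dataset):
    by_subject[row["Subj_idx"]].append(int(row["Confidence"]))

# Per-subject mean confidence rating across trials.
mean_conf = {subj: mean(ratings) for subj, ratings in by_subject.items()}
print(mean_conf)
```

Because the same few lines work for any file in the collection, analyses can loop over all datasets and aggregate confidence effects across paradigms and populations.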
Affiliation(s)
- Dobromir Rahnev
- School of Psychology, Georgia Institute of Technology, Atlanta, GA, USA.
- Kobe Desender
- Department of Neurophysiology and Pathophysiology, University Medical Center Hamburg-Eppendorf, Hamburg, Germany
- Department of Experimental Psychology, Ghent University, Ghent, Belgium
- Alan L F Lee
- Department of Applied Psychology and Wofoo Joseph Lee Consulting and Counselling Psychology Research Centre, Lingnan University, Tuen Mun, Hong Kong
- William T Adler
- Center for Neural Science, New York University, New York, NY, USA
- David Aguilar-Lleyda
- Centre d'Économie de la Sorbonne, CNRS & Université Paris 1 Panthéon-Sorbonne, Paris, France
- Başak Akdoğan
- Department of Psychology, Columbia University, New York, NY, USA
- Polina Arbuzova
- Bernstein Center for Computational Neuroscience, Berlin, Germany
- Berlin School of Mind and Brain, Humboldt-Universität zu Berlin, Berlin, Germany
- Institute of Psychology, Humboldt-Universität zu Berlin, Berlin, Germany
- Lauren Y Atlas
- National Center for Complementary and Integrative Health, National Institutes of Health, Bethesda, MD, USA
- National Institute of Mental Health, National Institutes of Health, Bethesda, MD, USA
- National Institute on Drug Abuse, National Institutes of Health, Baltimore, MD, USA
- Fuat Balcı
- Department of Psychology, Koç University, Istanbul, Turkey
- Ji Won Bang
- Department of Ophthalmology, New York University (NYU) School of Medicine, NYU Langone Health, New York, NY, USA
- Indrit Bègue
- Department of Psychiatry and Mental Health, University Hospitals of Geneva and University of Geneva, Geneva, Switzerland
- Damian P Birney
- School of Psychology, University of Sydney, Sydney, New South Wales, Australia
- Timothy F Brady
- Department of Psychology, University of California, San Diego, La Jolla, CA, USA
- Andrey Chetverikov
- Donders Institute for Brain, Cognition and Behavior, Radboud University, Nijmegen, the Netherlands
- Torin K Clark
- Smead Aerospace Engineering Sciences, University of Colorado, Boulder, CO, USA
- Rachel N Denison
- Department of Psychology and Center for Neural Science, New York University, New York, NY, USA
- Troy C Dildine
- National Center for Complementary and Integrative Health, National Institutes of Health, Bethesda, MD, USA
- Department of Clinical Neuroscience, Karolinska Institutet, Solna, Sweden
- Kit S Double
- Department of Education, University of Oxford, Oxford, UK
- Yalçın A Duyan
- Department of Psychology, Koç University, Istanbul, Turkey
- Nathan Faivre
- Laboratoire de Psychologie et Neurocognition, Université Grenoble Alpes, Grenoble, France
- Kaitlyn Fallow
- Department of Psychology, University of Victoria, Victoria, British Columbia, Canada
- Elisa Filevich
- Bernstein Center for Computational Neuroscience, Berlin, Germany
- Berlin School of Mind and Brain, Humboldt-Universität zu Berlin, Berlin, Germany
- Institute of Psychology, Humboldt-Universität zu Berlin, Berlin, Germany
- Regan M Gallagher
- School of Psychology, University of Queensland, Brisbane, Queensland, Australia
- Department of Experimental & Applied Psychology, Vrije Universiteit Amsterdam, Amsterdam, the Netherlands
- School of Psychological Sciences, Monash University, Melbourne, Victoria, Australia
- Sabina Gherman
- Institute of Neuroscience and Psychology, University of Glasgow, Glasgow, UK
- Feinstein Institute for Medical Research, Manhasset, NY, USA
- Nadia Haddara
- School of Psychology, Georgia Institute of Technology, Atlanta, GA, USA
- Marine Hainguerlot
- Erasmus School of Economics, Erasmus University Rotterdam, Rotterdam, the Netherlands
- Tzu-Yu Hsu
- Graduate Institute of Mind, Brain, and Consciousness, Taipei Medical University, Taipei, Taiwan
- Xiao Hu
- Collaborative Innovation Center of Assessment toward Basic Education Quality, Beijing Normal University, Beijing, China
- Iñaki Iturrate
- National Institute of Neurological Disorders and Stroke, National Institutes of Health, Bethesda, MD, USA
- Matt Jaquiery
- Department of Experimental Psychology, University of Oxford, Oxford, UK
- Justin Kantner
- Department of Psychology, California State University, Northridge, CA, USA
- Marcin Koculak
- Consciousness Lab, Institute of Psychology, Jagiellonian University, Krakow, Poland
- Mahiko Konishi
- Laboratoire de Sciences Cognitives et de Psycholinguistique, Department d'Etudes Cognitives, ENS, PSL University, EHESS, CNRS, Paris, France
- Christina Koß
- Bernstein Center for Computational Neuroscience, Berlin, Germany
- Institute of Psychology, Humboldt-Universität zu Berlin, Berlin, Germany
- Peter D Kvam
- Department of Psychology, University of Florida, Gainesville, FL, USA
- Sze Chai Kwok
- Shanghai Key Laboratory of Brain Functional Genomics, Key Laboratory of Brain Functional Genomics Ministry of Education, School of Psychology and Cognitive Science, East China Normal University, Shanghai, China
- Shanghai Key Laboratory of Magnetic Resonance, East China Normal University, Shanghai, China
- NYU-ECNU Institute of Brain and Cognitive Science, NYU Shanghai, Shanghai, China
- Maël Lebreton
- Swiss Center for Affective Science and LaBNIC, Department of Basic Neuroscience, University of Geneva, Geneva, Switzerland
- Karolina M Lempert
- Department of Psychology, University of Pennsylvania, Philadelphia, PA, USA
- Chien Ming Lo
- Graduate Institute of Mind, Brain, and Consciousness, Taipei Medical University, Taipei, Taiwan
- Brain and Consciousness Research Centre, TMU Shuang-Ho Hospital, New Taipei City, Taiwan
- Liang Luo
- Collaborative Innovation Center of Assessment toward Basic Education Quality, Beijing Normal University, Beijing, China
- Brian Maniscalco
- Department of Bioengineering, University of California, Riverside, Riverside, CA, USA
- Antonio Martin
- Graduate Institute of Mind, Brain, and Consciousness, Taipei Medical University, Taipei, Taiwan
- Sébastien Massoni
- Université de Lorraine, Université de Strasbourg, CNRS, BETA, Nancy, France
- Julian Matthews
- School of Psychological Sciences, Monash University, Melbourne, Victoria, Australia
- Philosophy Department, Monash University, Monash, Victoria, Australia
- Audrey Mazancieux
- Laboratoire de Psychologie et Neurocognition, Université Grenoble Alpes, Grenoble, France
- Daniel M Merfeld
- Otolaryngology-Head and Neck Surgery, The Ohio State University, Columbus, OH, USA
- Denis O'Hora
- School of Psychology, National University of Ireland Galway, Galway, Ireland
- Eleanor R Palser
- Department of Neurology, University of California, San Francisco, San Francisco, CA, USA
- Psychology and Language Sciences, University College London, London, UK
- Institute of Neurology, University College London, London, UK
- Borysław Paulewicz
- SWPS University of Social Sciences and Humanities, Katowice Faculty of Psychology, Katowice, Poland
- Michael Pereira
- Laboratory of Cognitive Neuroscience, Ecole Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
- Caroline Peters
- Bernstein Center for Computational Neuroscience, Berlin, Germany
- Berlin School of Mind and Brain, Humboldt-Universität zu Berlin, Berlin, Germany
- Institute of Psychology, Humboldt-Universität zu Berlin, Berlin, Germany
- Gerit Pfuhl
- Department of Psychology, UiT the Arctic University of Norway, Tromso, Norway
- Fernanda Prieto
- Faculty of Psychology, Universidad del Desarrollo, Santiago, Chile
- Manuel Rausch
- Catholic University of Eichstätt-Ingolstadt, Eichstätt, Germany
- Samuel Recht
- Laboratoire des Systèmes Perceptifs, Département d'Études Cognitives, École normale supérieure-PSL University, CNRS, Paris, France
- Gabriel Reyes
- Faculty of Psychology, Universidad del Desarrollo, Santiago, Chile
- Marion Rouault
- Département d'Études Cognitives, École Normale Supérieure-PSL University, CNRS, EHESS, INSERM, Paris, France
- Jérôme Sackur
- Département d'Études Cognitives, École Normale Supérieure-PSL University, CNRS, EHESS, INSERM, Paris, France
- École Polytechnique, Palaiseau, France
- Saeedeh Sadeghi
- Department of Human Development, Cornell University, Ithaca, NY, USA
- Jason Samaha
- Department of Psychology, University of California, Santa Cruz, Santa Cruz, CA, USA
- Tricia X F Seow
- School of Psychology, Trinity College Dublin, Dublin, Ireland
- Medha Shekhar
- School of Psychology, Georgia Institute of Technology, Atlanta, GA, USA
- Maxine T Sherman
- Sackler Centre for Consciousness Science, Brighton, UK
- Brighton and Sussex Medical School, University of Sussex, Brighton, UK
- Marta Siedlecka
- Consciousness Lab, Institute of Psychology, Jagiellonian University, Krakow, Poland
- Zuzanna Skóra
- Consciousness Lab, Institute of Psychology, Jagiellonian University, Krakow, Poland
- Chen Song
- Cardiff University Brain Research Imaging Centre, School of Psychology, Cardiff University, Cardiff, UK
- David Soto
- Basque Center on Cognition, Brain and Language, San Sebastian, Spain
- Ikerbasque, Basque Foundation for Science, Bilbao, Spain
- Sai Sun
- Divisions of Biology and Biological Engineering and Computation and Neural Systems, California Institute of Technology, Pasadena, CA, USA
- Jeroen J A van Boxtel
- School of Psychological Sciences, Monash University, Melbourne, Victoria, Australia
- Discipline of Psychology, University of Canberra, Canberra, Australian Capital Territory, Australia
- Shuo Wang
- Department of Chemical and Biomedical Engineering and Rockefeller Neuroscience Institute, West Virginia University, Morgantown, WV, USA
- Michał Wierzchoń
- Consciousness Lab, Institute of Psychology, Jagiellonian University, Krakow, Poland
- Xinming Xu
- Shanghai Key Laboratory of Brain Functional Genomics, Key Laboratory of Brain Functional Genomics Ministry of Education, School of Psychology and Cognitive Science, East China Normal University, Shanghai, China
- Qun Ye
- Shanghai Key Laboratory of Brain Functional Genomics, Key Laboratory of Brain Functional Genomics Ministry of Education, School of Psychology and Cognitive Science, East China Normal University, Shanghai, China
- Jiwon Yeon
- School of Psychology, Georgia Institute of Technology, Atlanta, GA, USA
- Futing Zou
- Shanghai Key Laboratory of Brain Functional Genomics, Key Laboratory of Brain Functional Genomics Ministry of Education, School of Psychology and Cognitive Science, East China Normal University, Shanghai, China
- Ariel Zylberberg
- Department of Brain and Cognitive Sciences, University of Rochester, Rochester, NY, USA
32
Rowhani-Farid A, Aldcroft A, Barnett AG. Did awarding badges increase data sharing in BMJ Open? A randomized controlled trial. ROYAL SOCIETY OPEN SCIENCE 2020; 7:191818. [PMID: 32269804 PMCID: PMC7137948 DOI: 10.1098/rsos.191818] [Citation(s) in RCA: 15] [Impact Index Per Article: 3.8] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 10/16/2019] [Accepted: 02/27/2020] [Indexed: 05/06/2023]
Abstract
Sharing data and code are important components of reproducible research. Data sharing in research is widely discussed in the literature; however, there are no well-established evidence-based incentives that reward data sharing, nor randomized studies that demonstrate the effectiveness of data sharing policies at increasing data sharing. A simple incentive, such as an Open Data Badge, might provide the change needed to increase data sharing in health and medical research. This study was a parallel group randomized controlled trial (protocol registration: doi:10.17605/OSF.IO/PXWZQ) with two groups, control and intervention, with 80 research articles published in BMJ Open per group, with a total of 160 research articles. The intervention group received an email offer for an Open Data Badge if they shared their data along with their final publication and the control group received an email with no offer of a badge if they shared their data with their final publication. The primary outcome was the data sharing rate. Badges did not noticeably motivate researchers who published in BMJ Open to share their data; the odds of awarding badges were nearly equal in the intervention and control groups (odds ratio = 0.9, 95% CI [0.1, 9.0]). Data sharing rates were low in both groups, with just two datasets shared in each of the intervention and control groups. The global movement towards open science has made significant gains with the development of numerous data sharing policies and tools. What remains to be established is an effective incentive that motivates researchers to take up such tools to share their data.
Affiliation(s)
- Anisa Rowhani-Farid
- Department of Pharmaceutical Health Services Research, University of Maryland, Baltimore, MD, USA
- School of Public Health and Social Work, Queensland University of Technology, Brisbane, Australia
- Adrian G. Barnett
- School of Public Health and Social Work, Queensland University of Technology, Brisbane, Australia
33
Vasilev MR, Yates M, Slattery TJ. Do Readers Integrate Phonological Codes Across Saccades? A Bayesian Meta-Analysis and a Survey of the Unpublished Literature. J Cogn 2019; 2:43. [PMID: 31750415 PMCID: PMC6838770 DOI: 10.5334/joc.87] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/05/2019] [Accepted: 10/08/2019] [Indexed: 11/28/2022] Open
Abstract
It is commonly accepted that phonological codes can be activated parafoveally during reading and later used to aid foveal word recognition, a finding known as the phonological preview benefit. However, a closer look at the literature shows that this effect may be less consistent than what is sometimes believed. To determine the extent to which phonology is processed parafoveally, a Bayesian meta-analysis of 27 experiments and a survey of the unpublished literature were conducted. While the results were generally consistent with the phonological preview benefit (>90% probability of a true effect in gaze durations), the size of the effect was small. Readers of alphabetical orthographies obtained a modest benefit of only 4 ms in gaze durations. Interestingly, Chinese readers showed a larger effect (6-14 ms in size). There was no difference in the magnitude of the phonological preview benefit between homophone and pseudo-homophone previews, thus suggesting that the modest processing advantage is indeed related to the activation of phonological codes from the parafoveal word. Simulations revealed that the results are relatively robust to missing studies, although the effects may be 19-22% smaller if all missing studies found a null effect. The results suggest that while phonology can be processed parafoveally, this happens only to a limited extent. Because phonological priming effects in single-word recognition are small (10-13 ms; Rastle & Brysbaert, 2006) and there is a loss of visual acuity in the parafovea, it is argued that large phonological preview benefit effects may be unlikely.
Affiliation(s)
- Mark Yates
- University of South Alabama, Department of Psychology, US
34

35
Mihura JL, Bombel G, Dumitrascu N, Roy M, Meadows EA. Why We Need a Formal Systematic Approach to Validating Psychological Tests: The Case of the Rorschach Comprehensive System. J Pers Assess 2018; 101:374-392. [PMID: 29723065 DOI: 10.1080/00223891.2018.1458315] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/17/2022]
Abstract
This article documents and discusses the importance of using a formal systematic approach to validating psychological tests. To illustrate, results are presented from a systematic review of the validity findings cited in the Rorschach Comprehensive System (CS; Exner, 2003) test manual, originally conducted during the manuscript review process for Mihura, Meyer, Dumitrascu, and Bombel's (2013) CS meta-analyses. Our review documents (a) the degree to which the CS test manual reports validity findings for each test variable, (b) whether these findings are publicly accessible or unpublished studies coordinated by the test developer, and (c) the presence and nature of data discrepancies between the CS test manual and the cited source. Implications are discussed for the CS in particular, the Rorschach more generally, and psychological tests more broadly. Notably, a history of intensive scrutiny of the Rorschach has resulted in more stringent standards applied to it, even though its scales have more published and supportive construct validity meta-analyses than any other psychological test. Calls are made for (a) a mechanism to correct data errors in the scientific literature, (b) guidelines for test developers' key unpublished studies, and (c) systematic reviews and meta-analyses to become standard practice for all psychological tests.
Affiliation(s)
- Manali Roy
- Department of Psychology, University of Toledo