1. Wendelborn C, Anger M, Schickhardt C. Promoting Data Sharing: The Moral Obligations of Public Funding Agencies. Science and Engineering Ethics 2024; 30:35. PMID: 39105890; PMCID: PMC11303567; DOI: 10.1007/s11948-024-00491-3.
Abstract
Sharing research data has great potential to benefit science and society. However, data sharing is still not common practice. Since public research funding agencies have a particular impact on research and researchers, the question arises: are public funding agencies morally obligated to promote data sharing? We argue from a research ethics perspective that public funding agencies have several pro tanto obligations requiring them to promote data sharing. However, there are also pro tanto obligations that speak against promoting data sharing in general, as well as against particular instruments of such promotion. We examine and weigh these obligations and conclude that, all things considered, funders ought to promote the sharing of data. Even the instrument of mandatory data sharing policies can be justified under certain conditions.
Affiliation(s)
- Christian Wendelborn
- Section for Translational Medical Ethics, German Cancer Research Center (DKFZ), National Center for Tumor Diseases (NCT) Heidelberg, Heidelberg, Germany.
- University of Konstanz, Konstanz, Germany.
- Michael Anger
- Section for Translational Medical Ethics, German Cancer Research Center (DKFZ), National Center for Tumor Diseases (NCT) Heidelberg, Heidelberg, Germany
- Christoph Schickhardt
- Section for Translational Medical Ethics, German Cancer Research Center (DKFZ), National Center for Tumor Diseases (NCT) Heidelberg, Heidelberg, Germany

2. Kaufmann E. Teachers' judgment accuracy: A replication check by psychometric meta-analysis. PLoS One 2024; 19:e0307594. PMID: 39052673; PMCID: PMC11271880; DOI: 10.1371/journal.pone.0307594.
Abstract
Teachers' judgment accuracy is a core competency in their daily work. Given its importance, several meta-analyses have estimated how accurately teachers judge students' academic achievement, operationalized as the correlation between teachers' judgments of students' academic abilities and students' scores on achievement tests. In our study, we updated the databases of previous meta-analyses and combined the data using a psychometric meta-analysis to explain variation in results across studies. Our results demonstrate the importance of considering aggregation and publication bias and of correcting for the most important artifacts (e.g., sampling and measurement error); they also show that most studies fail to report the data needed to conduct a meta-analysis according to current best practices. We find that previous reviews have underestimated teachers' judgment accuracy and overestimated the variance in estimates of teachers' judgment accuracy across studies, because at least 10% of this variance may be associated with common artifacts. We conclude that ignoring artifacts, as in classical meta-analysis, may lead one to erroneously conclude that moderator variables, rather than artifacts, explain the variation. We describe how online data repositories could improve the scientific process and how psychometric meta-analysis can be used to synthesize results and assess replicability.
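As a concrete illustration of the artifact correction discussed above (not code from the paper): in a Hunter-Schmidt-style psychometric meta-analysis, an observed judgment-accuracy correlation is disattenuated for measurement error in both variables before pooling. A minimal sketch with hypothetical reliabilities:

```python
import math

def disattenuate(r_observed: float, rel_judgment: float, rel_test: float) -> float:
    """Correct an observed correlation for measurement error in both variables
    (classical attenuation formula: r_true = r_obs / sqrt(rxx * ryy))."""
    return r_observed / math.sqrt(rel_judgment * rel_test)

# Hypothetical values: observed teacher-judgment/test-score correlation of .63,
# judgment reliability .80, achievement-test reliability .90.
r_corrected = disattenuate(0.63, rel_judgment=0.80, rel_test=0.90)
print(f"corrected r = {r_corrected:.2f}")  # ~0.74: artifacts deflate the observed r
```

This is the mechanism by which classical, uncorrected meta-analyses can underestimate judgment accuracy.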
Affiliation(s)
- Esther Kaufmann
- Research Methods, Assessment and iScience, Department of Psychology, University of Konstanz, Konstanz, Germany

3. Luijken K, Lohmann A, Alter U, Claramunt Gonzalez J, Clouth FJ, Fossum JL, Hesen L, Huizing AHJ, Ketelaar J, Montoya AK, Nab L, Nijman RCC, Penning de Vries BBL, Tibbe TD, Wang YA, Groenwold RHH. Replicability of simulation studies for the investigation of statistical methods: the RepliSims project. Royal Society Open Science 2024; 11:231003. PMID: 38234442; PMCID: PMC10791519; DOI: 10.1098/rsos.231003.
Abstract
Results of simulation studies evaluating the performance of statistical methods can have a major impact on the way empirical research is implemented. However, there is so far limited evidence on the replicability of simulation studies. Eight highly cited statistical simulation studies were selected, and their replicability was assessed by teams of replicators with formal training in quantitative methodology. The teams used the information in the original publications to write simulation code with the aim of replicating the results. The primary outcome was the feasibility of replication based on the information reported in the original publications and supplementary materials. Replicability varied greatly: some original studies provided detailed information leading to almost perfect replication of results, whereas others did not provide enough information to implement any of the reported simulations. Factors facilitating replication included the availability of code, detailed reporting or visualization of data-generating procedures and methods, and replicator expertise. Replicability was impeded mainly by lack of information and by the poor sustainability of information sources. We encourage researchers publishing simulation studies to transparently report all relevant implementation details, either in the research paper itself or in easily accessible supplementary material, and to make their simulation code publicly available using permanent links.
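In the spirit of these reporting recommendations, a replicable simulation script states every implementation detail a replicator would need: the data-generating mechanism, all parameter values, the number of repetitions, and the random seed. A minimal hedged sketch (the scenario is invented, not one of the eight replicated studies):

```python
import numpy as np
from scipy import stats

# All implementation details a replicator needs, stated explicitly.
PARAMS = {"n_per_group": 50, "true_mean_diff": 0.5, "sd": 1.0,
          "n_reps": 10_000, "seed": 20240101}

def simulate_power(p: dict) -> float:
    """Estimate the power of a two-sample t-test under the stated
    data-generating mechanism (normal data, equal variances)."""
    rng = np.random.default_rng(p["seed"])  # seeded: reruns give identical results
    rejections = 0
    for _ in range(p["n_reps"]):
        a = rng.normal(0.0, p["sd"], p["n_per_group"])
        b = rng.normal(p["true_mean_diff"], p["sd"], p["n_per_group"])
        if stats.ttest_ind(a, b).pvalue < 0.05:
            rejections += 1
    return rejections / p["n_reps"]

print(PARAMS, "estimated power:", simulate_power(PARAMS))
```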
Affiliation(s)
- K. Luijken
- Department of Clinical Epidemiology, Leiden University Medical Centre, Leiden, The Netherlands
- Department of Epidemiology, Julius Center for Health Sciences and Primary Care, University Medical Center Utrecht, University Utrecht, Utrecht, The Netherlands
- A. Lohmann
- Department of Clinical Epidemiology, Leiden University Medical Centre, Leiden, The Netherlands
- U. Alter
- Department of Psychology, York University, Toronto, Ontario, Canada
- J. Claramunt Gonzalez
- Methodology and Statistics Unit, Institute of Psychology, Leiden University, Leiden, The Netherlands
- F. J. Clouth
- Department of Methodology and Statistics, Tilburg University, Tilburg, The Netherlands
- Netherlands Comprehensive Cancer Organisation (IKNL), Utrecht, The Netherlands
- J. L. Fossum
- Department of Psychology, University of California, Los Angeles, CA, USA
- Department of Psychology, Seattle Pacific University, Seattle, WA, USA
- L. Hesen
- Department of Clinical Epidemiology, Leiden University Medical Centre, Leiden, The Netherlands
- A. H. J. Huizing
- TNO (Netherlands Organization for Applied Scientific Research), Expertise Group Child Health, Leiden, The Netherlands
- J. Ketelaar
- Department of Clinical Epidemiology, Leiden University Medical Centre, Leiden, The Netherlands
- A. K. Montoya
- Department of Psychology, University of California, Los Angeles, CA, USA
- L. Nab
- Department of Clinical Epidemiology, Leiden University Medical Centre, Leiden, The Netherlands
- R. C. C. Nijman
- Department of Clinical Epidemiology, Leiden University Medical Centre, Leiden, The Netherlands
- B. B. L. Penning de Vries
- Department of Clinical Epidemiology, Leiden University Medical Centre, Leiden, The Netherlands
- Department of Biomedical Data Sciences, Leiden University Medical Centre, Leiden, The Netherlands
- T. D. Tibbe
- Department of Psychology, University of California, Los Angeles, CA, USA
- Y. A. Wang
- Department of Psychology, University of Toronto, Toronto, Ontario, Canada
- R. H. H. Groenwold
- Department of Clinical Epidemiology, Leiden University Medical Centre, Leiden, The Netherlands
- Department of Biomedical Data Sciences, Leiden University Medical Centre, Leiden, The Netherlands

4. Nosek BA, Hardwicke TE, Moshontz H, Allard A, Corker KS, Dreber A, Fidler F, Hilgard J, Struhl MK, Nuijten MB, Rohrer JM, Romero F, Scheel AM, Scherer LD, Schönbrodt FD, Vazire S. Replicability, Robustness, and Reproducibility in Psychological Science. Annu Rev Psychol 2021; 73:719-748. PMID: 34665669; DOI: 10.1146/annurev-psych-020821-114157.
Abstract
Replication, an important, uncommon, and misunderstood practice, is gaining appreciation in psychology. Achieving replicability is important for making research progress. If findings are not replicable, then prediction and theory development are stifled. If findings are replicable, then interrogation of their meaning and validity can advance knowledge. Assessing replicability can be productive for generating and testing hypotheses by actively confronting current understandings to identify weaknesses and spur innovation. For psychology, the 2010s might be characterized as a decade of active confrontation. Systematic and multi-site replication projects assessed current understandings and observed surprising failures to replicate many published findings. Replication efforts highlighted sociocultural challenges, such as disincentives to conduct replications and a tendency to frame replication as a personal attack rather than a healthy scientific practice, and they raised awareness that replication contributes to self-correction. Nevertheless, innovation in doing and understanding replication and its cousins, reproducibility and robustness, has positioned psychology to improve research practices and accelerate progress.
Affiliation(s)
- Brian A Nosek
- Department of Psychology, University of Virginia, Charlottesville, Virginia 22904, USA
- Center for Open Science, Charlottesville, Virginia 22903, USA
- Tom E Hardwicke
- Department of Psychology, University of Amsterdam, 1012 ZA Amsterdam, The Netherlands
- Hannah Moshontz
- Addiction Research Center, University of Wisconsin-Madison, Madison, Wisconsin 53706, USA
- Aurélien Allard
- Department of Psychology, University of California, Davis, California 95616, USA
- Katherine S Corker
- Psychology Department, Grand Valley State University, Allendale, Michigan 49401, USA
- Anna Dreber
- Department of Economics, Stockholm School of Economics, 113 83 Stockholm, Sweden
- Fiona Fidler
- School of BioSciences, University of Melbourne, Parkville VIC 3010, Australia
- Joe Hilgard
- Department of Psychology, Illinois State University, Normal, Illinois 61790, USA
- Michèle B Nuijten
- Meta-Research Center, Tilburg University, 5037 AB Tilburg, The Netherlands
- Julia M Rohrer
- Department of Psychology, Leipzig University, 04109 Leipzig, Germany
- Felipe Romero
- Department of Theoretical Philosophy, University of Groningen, 9712 CP Groningen, The Netherlands
- Anne M Scheel
- Department of Industrial Engineering and Innovation Sciences, Eindhoven University of Technology, 5612 AZ Eindhoven, The Netherlands
- Laura D Scherer
- University of Colorado Anschutz Medical Campus, Aurora, Colorado 80045, USA
- Felix D Schönbrodt
- Department of Psychology, Ludwig Maximilian University of Munich, 80539 Munich, Germany
- Simine Vazire
- School of Psychological Sciences, University of Melbourne, Parkville VIC 3052, Australia

5. Samota EK, Davey RP. Knowledge and Attitudes Among Life Scientists Toward Reproducibility Within Journal Articles: A Research Survey. Front Res Metr Anal 2021; 6:678554. PMID: 34268467; PMCID: PMC8276979; DOI: 10.3389/frma.2021.678554.
Abstract
We constructed a survey to understand how authors and scientists view the issues around reproducibility, focusing on interactive elements, such as interactive figures embedded within online publications, as a solution for enabling the reproducibility of experiments. We report the views of 251 researchers, comprising authors who have published in eLife and researchers at the Norwich Biosciences Institutes (NBI). The survey also examines to what extent researchers attempt to reproduce experiments themselves. There is an increasing range of tools that attempt to support reproducible research by making code, data, and analyses available to the community for reuse. We wanted to collect information about attitudes at the consumer end of the spectrum, where life scientists interact with research outputs to interpret scientific results. Static plots and figures within articles are a central part of this interpretation, and we therefore asked respondents to consider various features of an interactive figure within a research article that would allow them to better understand and reproduce a published analysis. The majority (91%) of respondents reported that published research becomes more reproducible when authors describe their methodology (methods and analyses) in detail. Respondents also believe that interactive figures in published papers would benefit them, both in the papers they read and for the readers of their own papers. While interactive figures are one potential solution for consuming the results of research more effectively, we also review the equally pressing technical and cultural demands on researchers that must be addressed to achieve greater success in reproducibility in the life sciences.
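The survey discusses interactive figures in the abstract; it does not endorse a specific tool. As one hypothetical way to produce such a figure, a plotly chart can be exported as a self-contained HTML file in which readers can zoom, pan, and inspect the underlying data points (a sketch under that assumption, using plotly's bundled example data):

```python
import plotly.express as px

# Hypothetical example data; any tidy data frame works.
df = px.data.iris()
fig = px.scatter(df, x="sepal_width", y="sepal_length", color="species",
                 hover_data=["petal_length", "petal_width"])
# A self-contained HTML figure: readers can zoom, pan, and read off raw values,
# rather than interpreting a static plot.
fig.write_html("interactive_figure.html", include_plotlyjs="cdn")
```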
Affiliation(s)
- Evanthia Kaimaklioti Samota
- Earlham Institute, Norwich, United Kingdom
- School of Biological Sciences, University of East Anglia, Norwich, United Kingdom

6. Page MJ, Moher D, Fidler FM, Higgins JPT, Brennan SE, Haddaway NR, Hamilton DG, Kanukula R, Karunananthan S, Maxwell LJ, McDonald S, Nakagawa S, Nunan D, Tugwell P, Welch VA, McKenzie JE. The REPRISE project: protocol for an evaluation of REProducibility and Replicability In Syntheses of Evidence. Syst Rev 2021; 10:112. PMID: 33863381; PMCID: PMC8052676; DOI: 10.1186/s13643-021-01670-0.
Abstract
BACKGROUND: Investigations of transparency, reproducibility and replicability in science have been directed largely at individual studies. It is just as critical to explore these issues in syntheses of studies, such as systematic reviews, given their influence on decision-making and future research. We aim to explore various aspects of the transparency, reproducibility and replicability of several components of systematic reviews with meta-analysis of the effects of health, social, behavioural and educational interventions.
METHODS: The REPRISE (REProducibility and Replicability In Syntheses of Evidence) project consists of four studies. We will evaluate the completeness of reporting and the sharing of review data, analytic code and other materials in a random sample of 300 systematic reviews of interventions published in 2020 (Study 1). We will survey authors of systematic reviews to explore their views on sharing review data, analytic code and other materials, and their understanding of and opinions about replication of systematic reviews (Study 2). We will then evaluate the extent of variation in results when we (a) independently reproduce meta-analyses using the same computational steps and analytic code (if available) as the original review (Study 3), and (b) crowdsource teams of systematic reviewers to independently replicate a subset of methods (searches for studies, selection of studies for inclusion, collection of outcome data, and synthesis of results) in a sample of the original reviews; 30 reviews will be replicated by one team each and 2 reviews will be replicated by 15 teams (Study 4).
DISCUSSION: The REPRISE project takes a systematic approach to determining how reliable systematic reviews of interventions are. We anticipate that the results of the REPRISE project will inform strategies to improve the conduct and reporting of future systematic reviews.
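Study 3's computational reproduction amounts to re-running the pooling step from the extracted effect estimates and comparing the result with the published value. A minimal sketch of what such a re-run might look like for a fixed-effect inverse-variance meta-analysis (illustrative numbers, not REPRISE data):

```python
import math

# Hypothetical extracted study results: (effect estimate, standard error).
studies = [(0.42, 0.15), (0.18, 0.10), (0.30, 0.12)]

# Fixed-effect inverse-variance pooling: weight each study by 1/SE^2.
weights = [1 / se**2 for _, se in studies]
pooled = sum(w * est for (est, _), w in zip(studies, weights)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))

print(f"pooled estimate = {pooled:.3f} (SE {pooled_se:.3f})")
# Comparing this recomputed value against the published pooled estimate is
# the core of the reproducibility check.
```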
Affiliation(s)
- Matthew J Page
- School of Public Health and Preventive Medicine, Monash University, 553 St. Kilda Road, Melbourne, Victoria, 3004, Australia.
- David Moher
- Centre for Journalology, Clinical Epidemiology Program, Ottawa Hospital Research Institute, Ottawa, Canada
- School of Epidemiology and Public Health, Faculty of Medicine, University of Ottawa, Ottawa, Canada
- Fiona M Fidler
- School of BioSciences, University of Melbourne, Melbourne, Australia
- School of Historical and Philosophical Studies, University of Melbourne, Melbourne, Australia
- Julian P T Higgins
- Population Health Sciences, Bristol Medical School, University of Bristol, Bristol, UK
- Sue E Brennan
- School of Public Health and Preventive Medicine, Monash University, 553 St. Kilda Road, Melbourne, Victoria, 3004, Australia
- Neal R Haddaway
- Mercator Research Institute on Global Commons and Climate Change, Berlin, Germany
- African Centre for Evidence, University of Johannesburg, Johannesburg, South Africa
- Stockholm Environment Institute, Linnégatan 87D, Stockholm, Sweden
- The SEI Centre of the Collaboration for Environmental Evidence, Stockholm, Sweden
- Daniel G Hamilton
- School of BioSciences, University of Melbourne, Melbourne, Australia
- Raju Kanukula
- School of Public Health and Preventive Medicine, Monash University, 553 St. Kilda Road, Melbourne, Victoria, 3004, Australia
- Sathya Karunananthan
- Clinical Epidemiology Program, Ottawa Hospital Research Institute, Ottawa, Canada
- Lara J Maxwell
- Faculty of Medicine, University of Ottawa, Ottawa, Canada
- Steve McDonald
- School of Public Health and Preventive Medicine, Monash University, 553 St. Kilda Road, Melbourne, Victoria, 3004, Australia
- Shinichi Nakagawa
- Evolution & Ecology Research Centre and School of Biological, Earth and Environmental Sciences, University of New South Wales, Sydney, Australia
- David Nunan
- Centre for Evidence-Based Medicine, Oxford University, Oxford, UK
- Peter Tugwell
- School of Epidemiology and Public Health, Faculty of Medicine, University of Ottawa, Ottawa, Canada
- Department of Medicine, Faculty of Medicine, University of Ottawa, Ottawa, Canada
- Bruyère Research Institute, Ottawa, Canada
- Vivian A Welch
- School of Epidemiology and Public Health, Faculty of Medicine, University of Ottawa, Ottawa, Canada
- Bruyère Research Institute, Ottawa, Canada
- Joanne E McKenzie
- School of Public Health and Preventive Medicine, Monash University, 553 St. Kilda Road, Melbourne, Victoria, 3004, Australia

7. de Winter JCF, Petermeijer SM, Kooijman L, Dodou D. Replicating five pupillometry studies of Eckhard Hess. Int J Psychophysiol 2021; 165:145-205. PMID: 33766646; DOI: 10.1016/j.ijpsycho.2021.03.003.
Abstract
Several papers by Eckhard Hess from the 1960s and 1970s report that the pupils dilate or constrict according to the interest value, arousing content, or mental demands of visual stimuli. However, Hess mostly used small sample sizes and undocumented luminance control. In a first experiment (N = 182) and a second preregistered experiment (N = 147), we replicated five studies of Hess using modern equipment. Our experiments (1) did not support the hypothesis of gender differences in pupil diameter change with respect to baseline (PC) when viewing stimuli of different interest value, (2) showed that solving more difficult multiplications yields a larger PC in the seconds before providing an answer and a larger maximum PC, but a smaller PC at a fixed time after the onset of the multiplication, (3) did not support the hypothesis that participants' PC mimics the pupil diameter in a pair of schematic eyes but not in single-eyed or three-eyed stimuli, (4) did not support the hypothesis of gender differences in PC when watching a video of a male trying to escape a mob, and (5) supported the hypothesis that arousing words yield a higher PC than non-arousing words. Although we did not observe consistent gender differences in PC, additional analyses showed gender differences in eye movements towards erogenous zones. Furthermore, PC strongly correlated with the luminance of the locations where participants looked. Overall, our replications confirm Hess's findings that pupils dilate in response to mental demands and stimuli of an arousing nature. Hess's hypotheses regarding pupil mimicry and gender differences in pupil dilation did not replicate.
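For readers unfamiliar with the PC measure used throughout this abstract, baseline correction of a pupil trace is conceptually simple; a hedged sketch (the paper's actual preprocessing, e.g., blink handling, baseline windows, and luminance control, is more involved, and subtractive correction is our assumption here):

```python
import numpy as np

def pupil_change(trace_mm: np.ndarray, baseline_mm: np.ndarray) -> np.ndarray:
    """Pupil diameter change with respect to baseline (PC): the stimulus-period
    trace minus the mean pupil diameter during the pre-stimulus baseline."""
    return trace_mm - baseline_mm.mean()

# Hypothetical samples (mm): a pre-stimulus baseline, then the stimulus period.
baseline = np.array([3.1, 3.0, 3.2, 3.1])
stimulus = np.array([3.2, 3.4, 3.5, 3.6])
print(pupil_change(stimulus, baseline))  # positive values indicate dilation
```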
Affiliation(s)
- J C F de Winter
- Faculty of Mechanical, Maritime and Materials Engineering, Delft University of Technology, the Netherlands.
- S M Petermeijer
- Faculty of Mechanical, Maritime and Materials Engineering, Delft University of Technology, the Netherlands
- L Kooijman
- Faculty of Mechanical, Maritime and Materials Engineering, Delft University of Technology, the Netherlands
- D Dodou
- Faculty of Mechanical, Maritime and Materials Engineering, Delft University of Technology, the Netherlands

8. Efficient Scientific Self-Correction in Times of Crisis. The New Common 2021. PMCID: PMC7978759; DOI: 10.1007/978-3-030-65355-2_23.
Abstract
Science has been invaluable in combating the COVID-19 pandemic and its consequences. However, science is not flawless: especially research that is performed and written up under high time pressure may be susceptible to errors. Luckily, one of the core principles of science is its ability to self-correct. Traditionally, scientific self-correction is achieved through replication, but this takes time and resources; both of which are scarce. In this chapter, I argue for an additional, more efficient self-correction mechanism: analytical reproducibility checks.

9. Hardwicke TE, Bohn M, MacDonald K, Hembacher E, Nuijten MB, Peloquin BN, deMayo BE, Long B, Yoon EJ, Frank MC. Analytic reproducibility in articles receiving open data badges at the journal Psychological Science: an observational study. Royal Society Open Science 2021; 8:201494. PMID: 33614084; PMCID: PMC7890505; DOI: 10.1098/rsos.201494.
Abstract
For any scientific report, repeating the original analyses upon the original data should yield the original outcomes. We evaluated analytic reproducibility in 25 Psychological Science articles awarded open data badges between 2014 and 2015. Initially, 16 (64%, 95% confidence interval [43,81]) articles contained at least one 'major numerical discrepancy' (>10% difference) prompting us to request input from original authors. Ultimately, target values were reproducible without author involvement for 9 (36% [20,59]) articles; reproducible with author involvement for 6 (24% [8,47]) articles; not fully reproducible with no substantive author response for 3 (12% [0,35]) articles; and not fully reproducible despite author involvement for 7 (28% [12,51]) articles. Overall, 37 major numerical discrepancies remained out of 789 checked values (5% [3,6]), but original conclusions did not appear affected. Non-reproducibility was primarily caused by unclear reporting of analytic procedures. These results highlight that open data alone is not sufficient to ensure analytic reproducibility.
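The >10% criterion for a "major numerical discrepancy" translates directly into a check that a reanalysis pipeline can apply to every reported value; a minimal sketch (the function name is ours, not from the paper):

```python
def is_major_discrepancy(reported: float, reproduced: float,
                         threshold: float = 0.10) -> bool:
    """Flag a reproduced value that differs from the reported one by more
    than 10%, the criterion for 'major numerical discrepancies' above."""
    if reported == 0:
        return reproduced != 0  # avoid division by zero; any change is major
    return abs(reproduced - reported) / abs(reported) > threshold

print(is_major_discrepancy(reported=0.45, reproduced=0.41))  # ~8.9% -> False
print(is_major_discrepancy(reported=0.45, reproduced=0.30))  # ~33%  -> True
```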
Affiliation(s)
- Tom E. Hardwicke
- Department of Psychology, University of Amsterdam, Amsterdam, The Netherlands
- Meta-Research Innovation Center Berlin (METRIC-B), QUEST Center for Transforming Biomedical Research, Charité – Universitätsmedizin, Berlin, Germany
- Manuel Bohn
- Department of Comparative Cultural Psychology, Max Planck Institute for Evolutionary Anthropology, Leipzig, Germany
- Kyle MacDonald
- Department of Communication, University of California, Los Angeles, CA, USA
- Emily Hembacher
- Department of Psychology, Stanford University, Stanford, CA, USA
- Michèle B. Nuijten
- Department of Methodology and Statistics, Tilburg School of Social and Behavioral Sciences, Tilburg University, Tilburg, The Netherlands
- Bria Long
- Department of Psychology, Stanford University, Stanford, CA, USA
- Erica J. Yoon
- Department of Psychology, Stanford University, Stanford, CA, USA
- Michael C. Frank
- Department of Psychology, Stanford University, Stanford, CA, USA

10. How accurately do teachers’ judge students? Re-analysis of Hoge and Coladarci (1989) meta-analysis. Contemporary Educational Psychology 2020. DOI: 10.1016/j.cedpsych.2020.101902.

11. Nuijten MB, Polanin JR. "statcheck": Automatically detect statistical reporting inconsistencies to increase reproducibility of meta-analyses. Res Synth Methods 2020; 11:574-579. PMID: 32275351; PMCID: PMC7540394; DOI: 10.1002/jrsm.1408.
Abstract
We present statcheck, an R package and web app that automatically detects statistical reporting inconsistencies in primary studies and meta-analyses. Previous research has shown a high prevalence of inconsistently reported p-values, meaning that a p-value recalculated from the reported test statistic and degrees of freedom does not match the author-reported p-value. Such inconsistencies affect the reproducibility and evidential value of published findings. statcheck can help researchers identify statistical inconsistencies so that they may correct them. In this paper, we provide an overview of the prevalence and consequences of statistical reporting inconsistencies, discuss statcheck in more detail, and give an example of how it can be used in a meta-analysis. We end with recommendations concerning the use of statcheck in meta-analyses and make a case for better reporting standards for statistical results.
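statcheck itself is an R package; its core consistency check, recomputing p from the reported test statistic and degrees of freedom, can be sketched in a few lines (a simplified illustration, not statcheck's actual code):

```python
from scipy import stats

def p_from_t(t_value: float, df: int) -> float:
    """Recompute the two-tailed p-value from a reported t statistic and df."""
    return 2 * stats.t.sf(abs(t_value), df)

def consistent(reported_p: float, recomputed_p: float, decimals: int = 2) -> bool:
    """Crude consistency check: does the recomputed p match the reported p at
    the reported precision? (statcheck's real rules also handle rounding,
    one-tailed tests, and inequalities such as 'p < .05'.)"""
    return round(recomputed_p, decimals) == round(reported_p, decimals)

# Example: a paper reports 't(48) = 2.20, p = .03'.
p = p_from_t(2.20, 48)
print(f"recomputed p = {p:.4f}")   # approximately .03
print(consistent(0.03, p))          # True: reported p is consistent
print(consistent(0.01, p))          # False: would be flagged as inconsistent
```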
Affiliation(s)
- Michèle B. Nuijten
- Department of Methodology and Statistics, Tilburg University, Tilburg, The Netherlands
- Joshua R. Polanin
- Research & Evaluation, American Institutes for Research, Washington, DC, USA

12. Pietschnig J, Siegel M, Eder JSN, Gittler G. Effect Declines Are Systematic, Strong, and Ubiquitous: A Meta-Meta-Analysis of the Decline Effect in Intelligence Research. Front Psychol 2019; 10:2874. PMID: 31920891; PMCID: PMC6930891; DOI: 10.3389/fpsyg.2019.02874.
Abstract
Empirical sciences in general, and psychological science in particular, are plagued by replicability problems and biased published effect sizes. Although dissemination-bias-related phenomena such as publication bias, time-lag bias, and visibility bias are well known and have been intensively studied, another variant of effect-distorting mechanism, the so-called decline effect, has not. Conceptually, decline effects are rooted in low initial (exploratory) study power due to strategic researcher behavior and can be expected to yield disproportionately large effect declines. Although decline effects have been documented in individual meta-analytic investigations, systematic evidence for decline effects in the psychological literature has to date been unavailable. We therefore present in this meta-meta-analysis a systematic investigation of the decline effect in intelligence research. In all, data from 22 meta-analyses comprising 36 meta-analytical and 1,391 primary effect sizes (N = 697,000+) published in the journal Intelligence were included in our analyses. Two different analytic approaches showed consistent evidence for a higher prevalence of cross-temporal effect declines than effect increases, yielding a ratio of about 2:1. Moreover, effect declines were considerably stronger when referenced to the initial primary study within a meta-analysis, yielding about twice the magnitude of effect increases. Effect misestimations were more substantial when initial studies had smaller sample sizes and reported larger effects, indicating suboptimal initial study power as the main driver of effect misestimations in initial studies. Post hoc comparisons of the power of initial versus subsequent studies were consistent with this interpretation, showing substantially lower initial study power for declining than for increasing effects. Our findings add another facet to the ever-accumulating evidence of non-trivial effect misestimations in the scientific literature. We therefore stress the necessity of more rigorous protocols for designing and conducting primary research and for reporting findings in exploratory and replication studies. Increasing transparency in scientific processes, such as data sharing, (exploratory) study preregistration, and self- (or independent) replication preceding the publication of exploratory findings, may be suitable approaches to strengthen the credibility of empirical research in general and psychological science in particular.
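The two analytic perspectives described above, cross-temporal trends and initial-versus-subsequent comparisons, reduce to simple operations on dated effect sizes. A toy sketch with invented numbers (not the meta-meta-analytic models actually fitted in the paper):

```python
# Hypothetical (year, effect size) pairs from one meta-analysis.
effects = [(1998, 0.62), (2003, 0.41), (2008, 0.35), (2014, 0.30)]

# Perspective 1: cross-temporal trend (a negative least-squares slope
# indicates an effect decline over time).
years = [y for y, _ in effects]
sizes = [r for _, r in effects]
n = len(effects)
slope = (n * sum(y * r for y, r in effects) - sum(years) * sum(sizes)) / \
        (n * sum(y**2 for y in years) - sum(years)**2)
print(f"trend per year: {slope:+.4f}")

# Perspective 2: initial study vs. the later evidence it spawned.
initial = sizes[0]
later_mean = sum(sizes[1:]) / (n - 1)
print(f"initial effect {initial:.2f} vs. later mean {later_mean:.2f}")
```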
Affiliation(s)
- Jakob Pietschnig
- Department of Applied Psychology: Health, Development, Enhancement and Intervention, Faculty of Psychology, University of Vienna, Vienna, Austria