1. Bartoš F, Maier M, Quintana DS, Wagenmakers EJ. Adjusting for Publication Bias in JASP and R: Selection Models, PET-PEESE, and Robust Bayesian Meta-Analysis. ADVANCES IN METHODS AND PRACTICES IN PSYCHOLOGICAL SCIENCE 2022. [DOI: 10.1177/25152459221109259]
Abstract
Meta-analyses are essential for cumulative science, but their validity can be compromised by publication bias. To mitigate the impact of publication bias, one may apply publication-bias-adjustment techniques such as precision-effect test and precision-effect estimate with standard errors (PET-PEESE) and selection models. These methods, implemented in JASP and R, allow researchers without programming experience to conduct state-of-the-art publication-bias-adjusted meta-analysis. In this tutorial, we demonstrate how to conduct a publication-bias-adjusted meta-analysis in JASP and R and interpret the results. First, we explain two frequentist bias-correction methods: PET-PEESE and selection models. Second, we introduce robust Bayesian meta-analysis, a Bayesian approach that simultaneously considers both PET-PEESE and selection models. We illustrate the methodology on an example data set, provide an instructional video (https://bit.ly/pubbias) and an R-markdown script (https://osf.io/uhaew/), and discuss the interpretation of the results. Finally, we include concrete guidance on reporting the meta-analytic results in an academic article.
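Readers who want a feel for the workflow can approximate it with the metafor and RoBMA R packages. The sketch below is a minimal illustration on an invented data set, not the authors' own script (their R-markdown is linked above); the data frame `dat` and all numbers in it are hypothetical.

```r
# Minimal sketch of a publication-bias-adjusted meta-analysis in R.
# `dat` (yi = effect sizes, vi = sampling variances) is hypothetical.
library(metafor)  # PET-PEESE and selection models
library(RoBMA)    # robust Bayesian meta-analysis

dat <- data.frame(
  yi = c(0.42, 0.31, 0.55, 0.18, 0.26, 0.39),
  vi = c(0.040, 0.055, 0.070, 0.035, 0.050, 0.060)
)

# PET: meta-regression of effects on their standard errors; the intercept
# estimates the effect a study with SE = 0 would show.
pet <- rma(yi, vi, mods = ~ sqrt(vi), data = dat, method = "FE")

# PEESE: same idea with the sampling variance as moderator; conventionally
# used when the PET intercept differs significantly from zero.
peese <- rma(yi, vi, mods = ~ vi, data = dat, method = "FE")

# Selection model: a step weight function with a cutpoint at p = .025
# (one-sided), fitted on top of a random-effects model.
re  <- rma(yi, vi, data = dat)
sel <- selmodel(re, type = "stepfun", steps = 0.025)

# Robust Bayesian meta-analysis: model-averages across effect, heterogeneity,
# and publication-bias components (selection models and PET-PEESE).
fit <- RoBMA(d = dat$yi, se = sqrt(dat$vi), seed = 1)
summary(fit)
```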
Affiliation(s)
- František Bartoš
- Department of Psychological Methods, University of Amsterdam, Amsterdam, The Netherlands
- Institute of Computer Science, Czech Academy of Sciences, Prague, Czech Republic
- Maximilian Maier
- Department of Psychological Methods, University of Amsterdam, Amsterdam, The Netherlands
- Department of Experimental Psychology, University College London, London, England
- Daniel S. Quintana
- Department of Psychology, University of Oslo, Oslo, Norway
- NevSom, Department of Rare Disorders, Oslo University Hospital, Oslo, Norway
- Norwegian Centre for Mental Disorders Research (NORMENT), University of Oslo, Oslo, Norway
- KG Jebsen Centre for Neurodevelopmental Disorders, University of Oslo, Oslo, Norway
- Eric-Jan Wagenmakers
- Department of Psychological Methods, University of Amsterdam, Amsterdam, The Netherlands
2. Dang J, Barker P, Baumert A, Bentvelzen M, Berkman E, Buchholz N, Buczny J, Chen Z, De Cristofaro V, de Vries L, Dewitte S, Giacomantonio M, Gong R, Homan M, Imhoff R, Ismail I, Jia L, Kubiak T, Lange F, Li DY, Livingston J, Ludwig R, Panno A, Pearman J, Rassi N, Schiöth HB, Schmitt M, Sevincer AT, Shi J, Stamos A, Tan YC, Wenzel M, Zerhouni O, Zhang LW, Zhang YJ, Zinkernagel A. A Multilab Replication of the Ego Depletion Effect. SOCIAL PSYCHOLOGICAL AND PERSONALITY SCIENCE 2021; 12:14-24. [PMID: 34113424] [DOI: 10.1177/1948550619887702]
Abstract
There is an active debate regarding whether the ego depletion effect is real. A recent preregistered experiment with the Stroop task as the depleting task and the antisaccade task as the outcome task found a medium-level effect size. In the current research, we conducted a preregistered multilab replication of that experiment. Data from 12 labs across the globe (N = 1,775) revealed a small and significant ego depletion effect, d = 0.10. After excluding participants who might have responded randomly during the outcome task, the effect size increased to d = 0.16. By adding an informative, unbiased data point to the literature, our findings contribute to clarifying the existence, size, and generality of ego depletion.
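As an illustration of how such a multilab estimate is pooled, the R sketch below computes a standardized mean difference per lab and combines them in a random-effects model; all summary statistics are invented for illustration and do not come from this study.

```r
# Sketch: pooling per-lab effects with metafor. The per-lab means, SDs,
# and ns (depletion vs. control group) below are invented, not Dang et al.'s.
library(metafor)

labs <- data.frame(
  m1i = c(0.55, 0.49, 0.52, 0.47), sd1i = c(0.20, 0.22, 0.19, 0.21),
  n1i = c(72, 70, 75, 74),
  m2i = c(0.52, 0.48, 0.50, 0.46), sd2i = c(0.21, 0.20, 0.20, 0.22),
  n2i = c(73, 71, 74, 75)
)

# Standardized mean difference per lab (measure = "SMD" yields Hedges' g),
# then a random-effects pooling across labs.
es  <- escalc(measure = "SMD", m1i = m1i, sd1i = sd1i, n1i = n1i,
              m2i = m2i, sd2i = sd2i, n2i = n2i, data = labs)
res <- rma(yi, vi, data = es)
summary(res)
```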
Affiliation(s)
- Junhua Dang
- Department of Neuroscience, Uppsala University, Sweden
- Paul Barker
- Social Cognition Center Cologne, University of Cologne, Germany
- Anna Baumert
- Max Planck Institute for Research on Collective Goods, Bonn, Germany
- TUM School of Education, Munich, Germany
- Elliot Berkman
- Department of Psychology, University of Oregon, Eugene, OR, USA
- Nita Buchholz
- Department of Psychology, University of Koblenz-Landau, Mainz, Germany
- Jacek Buczny
- Department of Experimental and Applied Psychology, VU University Amsterdam, the Netherlands
- Zhansheng Chen
- Department of Psychology, The University of Hong Kong, Hong Kong
- Valeria De Cristofaro
- Department of Social and Developmental Psychology, University of Rome "Sapienza," Italy
- Lianne de Vries
- Department of Experimental and Applied Psychology, VU University Amsterdam, the Netherlands
- Mauro Giacomantonio
- Department of Social and Developmental Psychology, University of Rome "Sapienza," Italy
- Ran Gong
- Department of Psychology, Beijing Sport University, China
- Maaike Homan
- Amsterdam Institute for Social Science Research, University of Amsterdam, the Netherlands
- Roland Imhoff
- Social and Legal Psychology, Johannes Gutenberg University Mainz, Germany
- Ismaharif Ismail
- Department of Psychology, National University of Singapore, Singapore
- Lile Jia
- Department of Psychology, National University of Singapore, Singapore
- Thomas Kubiak
- Health Psychology, Johannes Gutenberg University Mainz, Germany
- Dan-Yang Li
- Department of Psychology, Beijing Sport University, China
- Rita Ludwig
- Department of Psychology, University of Oregon, Eugene, OR, USA
- Angelo Panno
- Department of Social and Developmental Psychology, University of Rome "Sapienza," Italy
- Joshua Pearman
- Department of Psychology, University of Oregon, Eugene, OR, USA
- Niklas Rassi
- Institute of Psychology, University of Hamburg, Germany
- Manfred Schmitt
- Department of Psychology, University of Koblenz-Landau, Mainz, Germany
- Jiaxin Shi
- Department of Psychology, The University of Hong Kong, Hong Kong
- Yia-Chin Tan
- Department of Psychology, National University of Singapore, Singapore
- Mario Wenzel
- Health Psychology, Johannes Gutenberg University Mainz, Germany
- Oulmann Zerhouni
- Laboratoire Parisien de Psychologie Sociale, University Paris Nanterre, France
- Li-Wei Zhang
- Department of Psychology, Beijing Sport University, China
- Yi-Jia Zhang
- Department of Psychology, Beijing Sport University, China
- Axel Zinkernagel
- Department of Psychology, University of Koblenz-Landau, Mainz, Germany
3. Hinne M, Gronau QF, van den Bergh D, Wagenmakers EJ. A Conceptual Introduction to Bayesian Model Averaging. ADVANCES IN METHODS AND PRACTICES IN PSYCHOLOGICAL SCIENCE 2020. [DOI: 10.1177/2515245919898657]
Abstract
Many statistical scenarios initially involve several candidate models that describe the data-generating process. Analysis often proceeds by first selecting the best model according to some criterion and then learning about the parameters of this selected model. Crucially, however, in this approach the parameter estimates are conditioned on the selected model, and any uncertainty about the model-selection process is ignored. An alternative is to learn the parameters for all candidate models and then combine the estimates according to the posterior probabilities of the associated models. This approach is known as Bayesian model averaging (BMA). BMA has several important advantages over all-or-none selection methods, but has been used only sparingly in the social sciences. In this conceptual introduction, we explain the principles of BMA, describe its advantages over all-or-none model selection, and showcase its utility in three examples: analysis of covariance, meta-analysis, and network analysis.
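To make the averaging step concrete, the base-R sketch below combines one regression coefficient across candidate models using the standard BIC approximation to posterior model probabilities; the data and candidate models are invented for illustration and this is not the authors' code.

```r
# Sketch: BIC-approximated Bayesian model averaging for one coefficient.
# Data and candidate models are invented for illustration.
set.seed(1)
n  <- 100
x1 <- rnorm(n)
x2 <- rnorm(n)
y  <- 0.5 * x1 + rnorm(n)

# Candidate models; all contain the focal predictor x1.
models <- list(lm(y ~ x1), lm(y ~ x1 + x2), lm(y ~ x1 * x2))

# Posterior model probabilities under equal prior odds, using the standard
# approximation p(M_k | data) proportional to exp(-BIC_k / 2).
bics <- sapply(models, BIC)
pmp  <- exp(-(bics - min(bics)) / 2)
pmp  <- pmp / sum(pmp)

# Model-averaged estimate: weight each model's x1 coefficient by its
# posterior probability instead of conditioning on a single "best" model.
b_x1 <- sapply(models, function(m) coef(m)["x1"])
sum(pmp * b_x1)
```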
Affiliation(s)
- Max Hinne
- Donders Institute for Brain, Cognition and Behaviour, Radboud University Nijmegen
4. Vadillo MA. Ego Depletion May Disappear by 2020. SOCIAL PSYCHOLOGY 2019.
Abstract
Ego depletion has been successfully replicated in hundreds of studies. Yet the most recent large-scale Registered Replication Reports (RRRs), comprising thousands of participants, have yielded disappointingly small effects, sometimes even failing to reach statistical significance. Although these results may seem surprising, in the present article I suggest that they are perfectly consistent with a long-term decline in the size of depletion effects that can be traced back to at least 10 years ago, well before any of the RRRs on ego depletion were conceived. The decline seems to be at least partly due to a parallel trend toward publishing better and less biased research.
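A decline of this kind is typically probed by meta-regressing effect sizes on publication year; the R sketch below shows the idea on invented data (this is not Vadillo's analysis code).

```r
# Sketch: meta-regression of effect sizes on publication year; a reliably
# negative `year` slope is consistent with a decline effect. `dat` is invented.
library(metafor)

dat <- data.frame(
  yi   = c(0.62, 0.55, 0.41, 0.33, 0.22, 0.15),
  vi   = c(0.050, 0.060, 0.045, 0.040, 0.050, 0.045),
  year = c(2004, 2007, 2010, 2012, 2015, 2018)
)

res <- rma(yi, vi, mods = ~ year, data = dat)
summary(res)  # inspect the `year` slope and its confidence interval
```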
Affiliation(s)
- Miguel A. Vadillo
- Departamento de Psicología Básica, Facultad de Psicología, Universidad Autónoma de Madrid, Spain
5. Carter EC, Schönbrodt FD, Gervais WM, Hilgard J. Correcting for Bias in Psychology: A Comparison of Meta-Analytic Methods. ADVANCES IN METHODS AND PRACTICES IN PSYCHOLOGICAL SCIENCE 2019. [DOI: 10.1177/2515245919847196]
Abstract
Publication bias and questionable research practices in primary research can lead to badly overestimated effects in meta-analysis. Methodologists have proposed a variety of statistical approaches to correct for such overestimation. However, it is not clear which methods work best for data typically seen in psychology. Here, we present a comprehensive simulation study in which we examined how some of the most promising meta-analytic methods perform on data that might realistically be produced by research in psychology. We simulated several levels of questionable research practices, publication bias, and heterogeneity, and used study sample sizes empirically derived from the literature. Our results clearly indicated that no single meta-analytic method consistently outperformed all the others. Therefore, we recommend that meta-analysts in psychology focus on sensitivity analyses—that is, report on a variety of methods, consider the conditions under which these methods fail (as indicated by simulation studies such as ours), and then report how conclusions might change depending on which conditions are most plausible. Moreover, given the dependence of meta-analytic methods on untestable assumptions, we strongly recommend that researchers in psychology continue their efforts to improve the primary literature and conduct large-scale, preregistered replications. We provide detailed results and simulation code at https://osf.io/rf3ys and interactive figures at http://www.shinyapps.org/apps/metaExplorer/.
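The recommended sensitivity analysis can be organized as in the R sketch below, which fits several of the compared estimators side by side on one invented data set; it mirrors the spirit of the recommendation, not the authors' simulation code (available at https://osf.io/rf3ys).

```r
# Sketch: fit several estimators to the same (hypothetical) data and report
# all of them, rather than trusting any single correction.
library(metafor)

dat <- data.frame(
  yi = c(0.45, 0.30, 0.52, 0.12, 0.38, 0.20, 0.41),
  vi = c(0.050, 0.060, 0.070, 0.030, 0.055, 0.040, 0.065)
)

re    <- rma(yi, vi, data = dat)                                    # random effects
tf    <- trimfill(re)                                               # trim-and-fill
pet   <- rma(yi, vi, mods = ~ sqrt(vi), data = dat, method = "FE")  # PET
peese <- rma(yi, vi, mods = ~ vi, data = dat, method = "FE")        # PEESE
sel   <- selmodel(re, type = "stepfun", steps = 0.025)              # selection model

# Collect the (corrected) mean estimates; conclusions that hold across all
# of these are the ones worth emphasizing.
round(c(RE = re$beta[1], TF = tf$beta[1], PET = pet$beta[1],
        PEESE = peese$beta[1], SEL = sel$beta[1]), 3)
```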
Affiliation(s)
- Evan C. Carter
- Human Research and Engineering Directorate, U.S. Army Research Laboratory, Aberdeen, Maryland