1
Piantadosi ST, Gallistel CR. Formalising the role of behaviour in neuroscience. Eur J Neurosci 2024. [PMID: 38858853 DOI: 10.1111/ejn.16372]
Abstract
We develop a mathematical approach to formally proving that certain neural computations and representations exist based on patterns observed in an organism's behaviour. To illustrate, we provide a simple set of conditions under which an ant's ability to determine how far it is from its nest would logically imply neural structures isomorphic to the natural numbers ℕ. We generalise these results to arbitrary behaviours and representations and show what mathematical characterisation of neural computation and representation is simplest while being maximally predictive of behaviour. We develop this framework in detail using a path integration example, where an organism's ability to search for its nest in the correct location implies representational structures isomorphic to two-dimensional coordinates under addition. We also study a system for processing aⁿbⁿ strings common in comparative work. Our approach provides an objective way to determine what theory of a physical system is best, addressing a fundamental challenge in neuroscientific inference. These results motivate considering which neurobiological structures have the requisite formal structure and are otherwise physically plausible given relevant physical considerations such as generalisability, information density, thermodynamic stability and energetic cost.
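The path-integration claim can be made concrete with a toy sketch (ours, not the authors' formal conditions): an agent that accumulates displacement vectors maintains a state isomorphic to two-dimensional coordinates under addition, and negating the accumulated sum gives the homing vector.

```python
import math

def integrate_path(steps):
    """Accumulate (dx, dy) displacement steps into a position vector."""
    x = y = 0.0
    for dx, dy in steps:
        x, y = x + dx, y + dy
    return x, y

def home_vector(steps):
    """Vector the forager must follow to return to the nest."""
    x, y = integrate_path(steps)
    return -x, -y

outbound = [(3.0, 0.0), (0.0, 4.0), (-1.0, 1.0)]
hx, hy = home_vector(outbound)
print((hx, hy), math.hypot(hx, hy))  # homing direction, and distance from nest
```

The point of the paper is the converse inference: correct homing behaviour licenses the conclusion that some state with exactly this additive structure exists in the organism.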
Affiliation(s)
- Steven T Piantadosi
- Department of Psychology, Department of Neuroscience, UC Berkeley, Berkeley, California, USA
2
Kvam PD, Irving LH, Sokratous K, Smith CT. Improving the reliability and validity of the IAT with a dynamic model driven by similarity. Behav Res Methods 2024; 56:2158-2193. [PMID: 37450219 DOI: 10.3758/s13428-023-02141-1]
Abstract
The Implicit Association Test (IAT), like many behavioral measures, seeks to quantify meaningful individual differences in cognitive processes that are difficult to assess with approaches like self-reports. However, much like other behavioral measures, many IATs appear to show low test-retest reliability and typical scoring methods fail to quantify all of the decision-making processes that generate the overt task performance. Here, we develop a new modeling approach for IATs based on the geometric similarity representation (GSR) model. This model leverages both response times and accuracy on IATs to make inferences about representational similarity between the stimuli and categories. The model disentangles processes related to response caution, stimulus encoding, similarities between concepts and categories, and response processes unrelated to the choice itself. This approach to analyzing IAT data illustrates that the unreliability in IATs is almost entirely attributable to the methods used to analyze data from the task: GSR model parameters show test-retest reliability around .80-.90, on par with reliable self-report measures. Furthermore, we demonstrate how model parameters result in greater validity compared to the IAT D-score, Quad model, and simple diffusion model contrasts, predicting outcomes related to intergroup contact and motivation. Finally, we present a simple point-and-click software tool for fitting the model, which uses a pre-trained neural network to estimate best-fit parameters of the GSR model. This approach allows easy and instantaneous fitting of IAT data with minimal demands on coding or technical expertise on the part of the user, making the new model accessible and effective.
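Operationally, the test-retest reliability figure quoted above is a correlation between parameter estimates obtained from two testing sessions; a minimal sketch with hypothetical numbers (not the study's data):

```python
import math

def pearson(x, y):
    """Pearson correlation; this is test-retest reliability when x and y
    are the same model parameter estimated in two separate sessions."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical similarity-parameter estimates for six participants.
session1 = [0.42, 0.55, 0.61, 0.38, 0.70, 0.49]
session2 = [0.45, 0.52, 0.64, 0.40, 0.66, 0.51]
print(round(pearson(session1, session2), 2))  # high reliability for these made-up values
```

The paper's claim is that correlating GSR parameters across sessions lands around .80-.90, whereas correlating conventional D-scores does not.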
Affiliation(s)
- Peter D Kvam
- Department of Psychology, University of Florida, Florida, USA.
- Louis H Irving
- Department of Psychology, University of Florida, Florida, USA
3
Nunez MD, Fernandez K, Srinivasan R, Vandekerckhove J. A tutorial on fitting joint models of M/EEG and behavior to understand cognition. Behav Res Methods 2024:10.3758/s13428-023-02331-x. [PMID: 38409458 DOI: 10.3758/s13428-023-02331-x]
Abstract
We present the motivation and practical steps necessary to find parameter estimates of joint models of behavior and neural electrophysiological data. This tutorial is written for researchers wishing to build joint models of human behavior and scalp or intracranial electroencephalographic (EEG) or magnetoencephalographic (MEG) data, and more specifically those who seek to understand human cognition. Although these techniques could easily be applied to animal models, the focus of this tutorial is on human participants. Joint modeling of M/EEG and behavior requires some knowledge of existing computational and cognitive theories, M/EEG artifact correction, M/EEG analysis techniques, cognitive modeling, and programming for statistical modeling implementation. This paper introduces these techniques as they apply to estimating parameters from neurocognitive models of M/EEG and human behavior, and shows how to evaluate model results and compare models. Because of our own research focus, the examples in this paper test specific hypotheses in human decision-making theory; however, most of the motivation and discussion applies across many modeling procedures and applications. We provide Python (and linked R) code examples in the tutorial and appendix. Readers are encouraged to try the exercises at the end of the document.
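A core ingredient of such joint models is a linking function that ties a single-trial neural measure to a cognitive-model parameter. The caricature below recovers assumed linking coefficients by least squares on synthetic data (hypothetical variable names and values; the tutorial's actual models are hierarchical Bayesian and far richer):

```python
import random

random.seed(1)
b0_true, b1_true = 0.5, 0.8   # assumed (made-up) linking coefficients
# e.g. a single-trial ERP amplitude, and a per-trial drift-rate-like parameter
erp = [random.gauss(0.0, 1.0) for _ in range(200)]
drift = [b0_true + b1_true * a + random.gauss(0.0, 0.1) for a in erp]

# Recover the linking coefficients by ordinary least squares.
n = len(erp)
mx, my = sum(erp) / n, sum(drift) / n
b1 = (sum((a - mx) * (d - my) for a, d in zip(erp, drift))
      / sum((a - mx) ** 2 for a in erp))
b0 = my - b1 * mx
print(round(b0, 2), round(b1, 2))  # close to the generating values 0.5 and 0.8
```

In the tutorial's setting the behavioral parameter is itself latent (estimated from choices and response times), which is what makes the joint, simultaneous fit worthwhile.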
Affiliation(s)
- Michael D Nunez
- Psychological Methods, University of Amsterdam, Amsterdam, The Netherlands.
- Kianté Fernandez
- Department of Psychology, University of California, Los Angeles, CA, USA
- Ramesh Srinivasan
- Department of Cognitive Sciences, University of California, Irvine, CA, USA
- Department of Biomedical Engineering, University of California, Irvine, CA, USA
- Institute of Mathematical Behavioral Sciences, University of California, Irvine, CA, USA
- Joachim Vandekerckhove
- Department of Cognitive Sciences, University of California, Irvine, CA, USA
- Institute of Mathematical Behavioral Sciences, University of California, Irvine, CA, USA
- Department of Statistics, University of California, Irvine, CA, USA
4
Piantadosi ST. The algorithmic origins of counting. Child Dev 2023; 94:1472-1490. [PMID: 37984061 DOI: 10.1111/cdev.14031]
Abstract
The study of how children learn numbers has yielded one of the most productive research programs in cognitive development, spanning empirical and computational methods, as well as nativist and empiricist philosophies. This paper provides a tutorial on how to think computationally about learning models in a domain like number, where learners take finite data and go far beyond what they directly observe or perceive. To illustrate, this paper then outlines a model which acquires a counting procedure using observations of sets and words, extending the proposal of Piantadosi et al. (2012). This new version of the model responds to several critiques of the original work and outlines an approach which is likely appropriate for acquiring further aspects of mathematics.
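The kind of counting procedure at issue can be caricatured in a few lines (a sketch of the general idea, not the paper's learned program): advance one number word per object until a single object remains, and the current word names the cardinality.

```python
# Toy recursive counting procedure: pair off one object with one number
# word at a time. Illustrative only; the paper's model *learns* such a
# procedure from observations of sets and words.
WORDS = ["one", "two", "three", "four", "five", "six", "seven"]

def count(objects, words=None):
    """Return the number word naming the cardinality of `objects`."""
    words = WORDS if words is None else words
    if len(objects) == 1:
        return words[0]
    return count(objects[1:], words[1:])

print(count(["a", "b", "c", "d"]))  # -> four
```

The interesting developmental question is how a learner arrives at a procedure with this recursive structure from finite data, which is what the model addresses.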
5
van Ravenzwaaij D, Bakker M, Heesen R, Romero F, van Dongen N, Crüwell S, Field SM, Held L, Munafò MR, Pittelkow MM, Tiokhin L, Traag VA, van den Akker OR, van ‘t Veer AE, Wagenmakers EJ. Perspectives on scientific error. R Soc Open Sci 2023; 10:230448. [PMID: 37476516 PMCID: PMC10354464 DOI: 10.1098/rsos.230448]
Abstract
Theoretical arguments and empirical investigations indicate that a high proportion of published findings do not replicate and are likely false. The current position paper provides a broad perspective on scientific error, which may lead to replication failures. This broad perspective focuses on reform history and on opportunities for future reform. We organize our perspective along four main themes: institutional reform, methodological reform, statistical reform and publishing reform. For each theme, we illustrate potential errors by narrating the story of a fictional researcher during the research cycle. We discuss future opportunities for reform. The resulting agenda provides a resource to usher in an era that is marked by a research culture that is less error-prone and a scientific publication landscape with fewer spurious findings.
Affiliation(s)
- D. van Ravenzwaaij
- Department of Psychology, University of Groningen, Grote Kruisstraat 2/1, Heymans Building, room 239, 9712 TS Groningen, The Netherlands
- M. Bakker
- Tilburg University, 5037 AB Tilburg, The Netherlands
- R. Heesen
- University of Western Australia, Perth, Western Australia 6009, Australia
- London School of Economics and Political Science, London WC2A 2AE, UK
- F. Romero
- Department of Psychology, University of Groningen, Grote Kruisstraat 2/1, Heymans Building, room 239, 9712 TS Groningen, The Netherlands
- N. van Dongen
- University of Amsterdam, 1012 WP Amsterdam, The Netherlands
- S. Crüwell
- Department of History and Philosophy of Science, University of Cambridge, Cambridge CB2 1TN, UK
- S. M. Field
- Centre for Science and Technology Studies (CWTS), Leiden University, 2311 EZ Leiden, The Netherlands
- L. Held
- University of Zurich, 8006 Zürich, Switzerland
- M. R. Munafò
- School of Psychological Science, University of Bristol, Bristol BS8 1QU, UK
- M. M. Pittelkow
- Department of Psychology, University of Groningen, Grote Kruisstraat 2/1, Heymans Building, room 239, 9712 TS Groningen, The Netherlands
- QUEST Center for Transforming Biomedical Research, Berlin Institute of Health, Charité—Universitätsmedizin, 10178 Berlin, Germany
- L. Tiokhin
- IG&H Consulting, 3528 AC Utrecht, The Netherlands
- V. A. Traag
- Centre for Science and Technology Studies (CWTS), Leiden University, 2311 EZ Leiden, The Netherlands
- O. R. van den Akker
- Tilburg University, 5037 AB Tilburg, The Netherlands
- QUEST Center for Transforming Biomedical Research, Berlin Institute of Health, Charité—Universitätsmedizin, 10178 Berlin, Germany
- A. E. van ‘t Veer
- Methodology and Statistics Unit, Institute of Psychology, Leiden University, 2333 AK Leiden, The Netherlands
6
El Amin M, Borders JC, Long HL, Keller MA, Kearney E. Open Science Practices in Communication Sciences and Disorders: A Survey. J Speech Lang Hear Res 2023; 66:1928-1947. [PMID: 36417765 PMCID: PMC10554559 DOI: 10.1044/2022_jslhr-22-00062]
Abstract
PURPOSE: Open science is a collection of practices that seek to improve the accessibility, transparency, and replicability of science. Although these practices have garnered interest in related fields, it remains unclear whether open science practices have been adopted in the field of communication sciences and disorders (CSD). This study aimed to survey the knowledge, implementation, and perceived benefits and barriers of open science practices in CSD.
METHOD: An online survey was disseminated to researchers in the United States actively engaged in CSD research. Four core open science practices were examined: preregistration, self-archiving, gold open access, and open data. Data were analyzed using descriptive statistics and regression models.
RESULTS: Two hundred twenty-two participants met the inclusion criteria. Most participants were doctoral students (38%) or assistant professors (24%) at R1 institutions (58%). Participants reported low knowledge of preregistration and gold open access. There was, however, a high level of desire to learn more about all practices. Implementation of open science practices was also low, most notably for preregistration, gold open access, and open data (< 25%). Predictors of knowledge and participation, as well as perceived barriers to implementation, are discussed.
CONCLUSION: Although participation in open science appears low in the field of CSD, participants expressed a strong desire to learn more in order to engage in these practices in the future. Supplemental Material and Open Science Form: https://doi.org/10.23641/asha.21569040.
Affiliation(s)
- Mariam El Amin
- Communication Sciences and Disorders, University of Georgia, Athens
- James C. Borders
- Department of Biobehavioral Sciences, Teachers College, Columbia University, New York, NY
- Elaine Kearney
- Department of Speech, Language and Hearing Sciences, Boston University, MA
7
Pownall M, Azevedo F, König LM, Slack HR, Evans TR, Flack Z, Grinschgl S, Elsherif MM, Gilligan-Lee KA, de Oliveira CMF, Gjoneska B, Kalandadze T, Button K, Ashcroft-Jones S, Terry J, Albayrak-Aydemir N, Děchtěrenko F, Alzahawi S, Baker BJ, Pittelkow MM, Riedl L, Schmidt K, Pennington CR, Shaw JJ, Lüke T, Makel MC, Hartmann H, Zaneva M, Walker D, Verheyen S, Cox D, Mattschey J, Gallagher-Mitchell T, Branney P, Weisberg Y, Izydorczak K, Al-Hoorie AH, Creaven AM, Stewart SLK, Krautter K, Matvienko-Sikar K, Westwood SJ, Arriaga P, Liu M, Baum MA, Wingen T, Ross RM, O'Mahony A, Bochynska A, Jamieson M, Tromp MV, Yeung SK, Vasilev MR, Gourdon-Kanhukamwe A, Micheli L, Konkol M, Moreau D, Bartlett JE, Clark K, Brekelmans G, Gkinopoulos T, Tyler SL, Röer JP, Ilchovska ZG, Madan CR, Robertson O, Iley BJ, Guay S, Sladekova M, Sadhwani S. Teaching open and reproducible scholarship: a critical review of the evidence base for current pedagogical methods and their outcomes. R Soc Open Sci 2023; 10:221255. [PMID: 37206965 PMCID: PMC10189598 DOI: 10.1098/rsos.221255]
Abstract
In recent years, the scientific community has called for improvements in the credibility, robustness and reproducibility of research, characterized by increased interest and promotion of open and transparent research practices. While progress has been positive, there is a lack of consideration about how this approach can be embedded into undergraduate and postgraduate research training. Specifically, a critical overview of the literature which investigates how integrating open and reproducible science may influence student outcomes is needed. In this paper, we provide the first critical review of literature surrounding the integration of open and reproducible scholarship into teaching and learning and its associated outcomes in students. Our review highlighted how embedding open and reproducible scholarship appears to be associated with (i) students' scientific literacies (i.e. students' understanding of open research, consumption of science and the development of transferable skills); (ii) student engagement (i.e. motivation and engagement with learning, collaboration and engagement in open research) and (iii) students' attitudes towards science (i.e. trust in science and confidence in research findings). However, our review also identified a need for more robust and rigorous methods within pedagogical research, including more interventional and experimental evaluations of teaching practice. We discuss implications for teaching and learning scholarship.
Affiliation(s)
- Flávio Azevedo
- Department of Psychology, University of Cambridge, CB2 3EB, UK
- Laura M. König
- Faculty of Life Sciences: Food, Nutrition and Health, University of Bayreuth, 95447 Bayreuth, Germany
- Hannah R. Slack
- School of Psychology, University of Nottingham, Nottingham NG7 2RD, UK
- Thomas Rhys Evans
- School of Human Sciences, University of Greenwich, London SE10 9LS, UK
- Centre for Workforce Development, Institute for Lifecourse Development, University of Greenwich, London SE10 9LS, UK
- Zoe Flack
- School of Humanities and Social Science, University of Brighton, BN2 0JY, UK
- Biljana Gjoneska
- Macedonian Academy of Sciences and Arts, North Macedonia
- Tamara Kalandadze
- Faculty of Teacher Education and Languages, Department of Education, ICT and Learning, Ostfold University College, 1757 Halden, Norway
- Sarah Ashcroft-Jones
- Department of Experimental Psychology, University of Oxford, Oxford OX1 4BH, UK
- Jenny Terry
- School of Psychology, University of Sussex, Brighton BN1 9RH, UK
- Nihan Albayrak-Aydemir
- School of Psychology and Counselling, the Open University, Milton Keynes MK7 6AA, UK
- Department of Psychological and Behavioural Science, London School of Economics and Political Science, UK
- Filip Děchtěrenko
- Department of Mathematics, College of Polytechnics Jihlava, 1556/16, 586 01, Czech Republic
- Bradley J. Baker
- Department of Sport and Recreation Management, Temple University, PA 19122, USA
- Merle-Marie Pittelkow
- Department of Psychology, University of Groningen, 9712 CP, Groningen, the Netherlands
- Lydia Riedl
- Department of Psychiatry and Psychotherapy, Philipps-University Marburg, D-35039 Marburg, Germany
- John J. Shaw
- Division of Psychology, De Montfort University, Leicester LE1 9BH, UK
- Timo Lüke
- Institute for Educational Research and Teacher Education, University of Graz, 8010 Graz, Austria
- Helena Hartmann
- Department for Cognition, Emotion, and Methods in Psychology, University of Vienna, Vienna 1010, Austria
- Mirela Zaneva
- Department of Experimental Psychology, University of Oxford, Oxford OX1 4BH, UK
- Daniel Walker
- Department of Psychology, Faculty of Management, Law and Social Sciences, University of Bradford, Bradford BD7 1DP, UK
- Steven Verheyen
- Department of Psychology, Education and Child Studies, Erasmus University Rotterdam, Rotterdam 3000, The Netherlands
- Daniel Cox
- Division of Neuroscience and Experimental Psychology, University of Manchester, Manchester M13 9PL, UK
- Jennifer Mattschey
- School of Psychology and Counselling, the Open University, Milton Keynes MK7 6AA, UK
- Peter Branney
- Department of Psychology, Faculty of Management, Law and Social Sciences, University of Bradford, Bradford BD7 1DP, UK
- Yanna Weisberg
- Department of Psychology, Linfield University, USA
- Kamil Izydorczak
- Faculty of Psychology in Wrocław, SWPS University of Social Sciences and Humanities, Wrocław, Poland
- Ali H. Al-Hoorie
- Jubail English Language and Preparatory Year Institute, Royal Commission for Jubail and Yanbu, Saudi Arabia
- Kai Krautter
- Department of Psychology, Saarland University, 66123 Saarbrücken, Germany
- Samuel J. Westwood
- Department of Psychology, School of Social Science, University of Westminster, London W1B 2HW, UK
- Patrícia Arriaga
- Iscte-University Institute of Lisbon, CIS-IUL, 1649-026 Lisboa, Portugal
- Meng Liu
- Faculty of Education, University of Cambridge, Cambridge CB2 1TN, UK
- Myriam A. Baum
- Department of Psychology, Saarland University, 66123 Saarbrücken, Germany
- Tobias Wingen
- Institute of General Practice and Family Medicine, University Hospital Bonn, University of Bonn, 53127 Bonn, Germany
- Robert M. Ross
- Department of Philosophy, Macquarie University, NSW 2109, Australia
- Aoife O'Mahony
- School of Psychology, Cardiff University, Cardiff CF10 3AT, UK
- Michelle Jamieson
- School of Social and Political Sciences, University of Glasgow, Glasgow G12 8QQ, UK
- Myrthe Vel Tromp
- Department of Psychology, Leiden University, 2311 EZ Leiden, The Netherlands
- Siu Kit Yeung
- Department of Psychology, The Chinese University of Hong Kong, Hong Kong SAR, People's Republic of China
- Martin R. Vasilev
- Department of Psychology, Bournemouth University, Poole BH12 5BB, UK
- Leticia Micheli
- Department of Psychology III, University of Würzburg, 97070 Würzburg, Germany
- Markus Konkol
- Faculty for Geo-Information Science and Earth Observation, University of Twente, 7522 NB, The Netherlands
- David Moreau
- School of Psychology, University of Auckland, Auckland 1142, New Zealand
- James E. Bartlett
- School of Psychology and Neuroscience, University of Glasgow, Glasgow G12 8QQ, UK
- Kait Clark
- Department of Social Sciences, University of the West of England, Bristol BS16 1QY, UK
- Gwen Brekelmans
- Department of Biological and Experimental Psychology, Queen Mary University of London, E1 4NS, UK
- Samantha L. Tyler
- Department of Neuroscience, Psychology and Behaviour, University of Leicester, UK
- Olly Robertson
- Departments of Psychiatry and Experimental Psychology, University of Oxford, UK
- School of Psychology, Keele University, Newcastle ST5 5BG, UK
- Bethan J. Iley
- School of Psychology, Queen's University Belfast, Belfast BT7 1NN, UK
- Samuel Guay
- Department of Psychology, University of Montreal, Canada
- Martina Sladekova
- School of Humanities and Social Science, University of Brighton, BN2 0JY, UK
- School of Psychology, University of Sussex, Brighton BN1 9RH, UK
- Shanu Sadhwani
- School of Humanities and Social Science, University of Brighton, BN2 0JY, UK
- FORRT
- Framework for Open and Reproducible Research Training
8
Schroeder PA, Artemenko C, Kosie JE, Cockx H, Stute K, Pereira J, Klein F, Mehler DMA. Using preregistration as a tool for transparent fNIRS study design. Neurophotonics 2023; 10:023515. [PMID: 36908680 PMCID: PMC9993433 DOI: 10.1117/1.nph.10.2.023515]
Abstract
Significance: The expansion of functional near-infrared spectroscopy (fNIRS) methodology and analysis tools gives rise to various design and analytical decisions that researchers have to make. Several recent efforts have developed guidelines for preprocessing, analyzing, and reporting practices. For the planning stage of fNIRS studies, similar guidance is desirable. Study preregistration helps researchers to transparently document study protocols before conducting the study, including materials, methods, and analyses, and thus enables others to verify, understand, and reproduce a study. Preregistration can therefore serve as a useful tool for transparent, careful, and comprehensive fNIRS study design.
Aim: We aim to create a guide on the design and analysis steps involved in fNIRS studies and to provide a preregistration template specified for fNIRS studies.
Approach: The presented preregistration guide has a strong focus on fNIRS-specific requirements, and the associated template provides examples based on continuous-wave (CW) fNIRS studies conducted in humans. These can, however, be extended to other types of fNIRS studies.
Results: On a step-by-step basis, we walk the fNIRS user through key methodological and analysis-related aspects central to a comprehensive fNIRS study design. These include items specific to the design of CW, task-based fNIRS studies, but also sections of general importance, including an in-depth elaboration on sample size planning.
Conclusions: Our guide introduces these open science tools to the fNIRS community, providing researchers with an overview of key design aspects and specification recommendations for comprehensive study planning. As such, it can be used as a template to preregister fNIRS studies or simply as a tool for transparent fNIRS study design.
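For the sample size planning the guide elaborates, the textbook normal-approximation formula for a two-group comparison is a convenient anchor (a generic illustration, not the paper's recommended procedure): n per group ≈ 2((z_{1-α/2} + z_{power}) / d)², with d the standardized effect size.

```python
# A priori per-group sample size for a two-sample comparison, using the
# standard normal approximation. Generic textbook formula, offered as an
# illustration of the kind of calculation a preregistration documents.
from math import ceil
from statistics import NormalDist

def n_per_group(d, alpha=0.05, power=0.80):
    """Approximate n per group to detect effect size d (Cohen's d)."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)
    z_b = NormalDist().inv_cdf(power)
    return ceil(2 * ((z_a + z_b) / d) ** 2)

print(n_per_group(0.5))  # medium effect: 63 per group
print(n_per_group(0.8))  # large effect: 25 per group
```

Preregistering the assumed effect size, alpha, and power alongside such a calculation is exactly the kind of transparency the template asks for.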
Affiliation(s)
- Philipp A. Schroeder
- University of Tuebingen, Department of Psychology, Faculty of Science, Tuebingen, Germany
- Christina Artemenko
- University of Tuebingen, Department of Psychology, Faculty of Science, Tuebingen, Germany
- Jessica E. Kosie
- Princeton University, Social and Natural Sciences, Department of Psychology, Princeton, New Jersey, United States
- Helena Cockx
- Radboud University, Donders Institute for Brain, Cognition and Behaviour, Biophysics Department, Faculty of Science, Nijmegen, The Netherlands
- Katharina Stute
- Chemnitz University of Technology, Institute of Human Movement Science and Health, Faculty of Behavioural and Social Sciences, Chemnitz, Germany
- João Pereira
- University of Coimbra, Coimbra Institute for Biomedical Imaging and Translational Research, Coimbra, Portugal
- Franziska Klein
- University of Oldenburg, Department of Psychology, Neurocognition and functional Neurorehabilitation Group, Oldenburg (Oldb), Germany
- RWTH Aachen University, Medical School, Department of Psychiatry, Psychotherapy and Psychosomatics, Aachen, Germany
- David M. A. Mehler
- RWTH Aachen University, Medical School, Department of Psychiatry, Psychotherapy and Psychosomatics, Aachen, Germany
- University of Münster, Institute for Translational Psychiatry, Medical School, Münster, Germany
9
Rauvola RS, Rudolph CW. Worker aging, control, and well-being: A specification curve analysis. Acta Psychol (Amst) 2023; 233:103833. [PMID: 36623471 DOI: 10.1016/j.actpsy.2023.103833]
Abstract
Among the many work (and life) characteristics of relevance to adult development and aging, various forms of control are some of the most extensively and diversely studied. Indeed, "control," whether objectively held (i.e., "actual" control), perceived, or enacted through self-regulation, is a concept central to our understanding of person-environment interactions, development, and well-being within and across life domains. However, variability in conceptualization and analysis in the literature on control presents challenges to integration. To partially address these gaps, the present study sought to explore the effects of conceptual and analytical specification decisions (e.g., construct types, time, covariates) on observed control-well-being relationships in a large, age-diverse, longitudinal sample (Midlife in the United States I, II, and III datasets), providing a specification curve analysis (SCA) tutorial and guidance in the process. Results suggest that construct types and operationalizations, particularly predictor variables, have bearing on observed results, with certain types of control serving as better predictors of various forms of well-being than others. These findings and identified gaps are summarized to provide direction for theoretical clarification and reconciliation in the control and lifespan development literatures, construct selection and operationalization in future aging and work research, and inclusive, well-specified interventions to improve employee well-being.
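Specification curve analysis reduces to a small loop: estimate the same relationship under every defensible combination of analysis decisions, then inspect the whole distribution of estimates rather than one cherry-picked model. A toy sketch with synthetic data and made-up decision points (the study itself varies construct operationalizations, covariates, and time):

```python
# Toy specification curve: one slope estimate per combination of
# analysis decisions. Synthetic data; the two "decisions" here are
# hypothetical stand-ins for the paper's specification choices.
import itertools
import random

random.seed(7)
n = 300
control = [random.gauss(0, 1) for _ in range(n)]
age = [random.uniform(25, 65) for _ in range(n)]
wellbeing = [0.4 * c + 0.01 * a + random.gauss(0, 1)
             for c, a in zip(control, age)]

def slope(x, y):
    """Simple-regression slope of y on x."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    return (sum((a - mx) * (b - my) for a, b in zip(x, y))
            / sum((a - mx) ** 2 for a in x))

# Two toy decisions: z-score the outcome or not; restrict to older workers or not.
curve = []
for zscore_y, older_only in itertools.product([False, True], repeat=2):
    idx = [i for i in range(n) if not older_only or age[i] >= 45]
    x = [control[i] for i in idx]
    y = [wellbeing[i] for i in idx]
    if zscore_y:
        my = sum(y) / len(y)
        sd = (sum((v - my) ** 2 for v in y) / len(y)) ** 0.5
        y = [(v - my) / sd for v in y]
    curve.append(slope(x, y))

print(sorted(curve))  # the "specification curve" of effect estimates
```

Real SCA adds inference over the curve (e.g., permutation tests); the point here is only the exhaustive loop over specifications.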
Affiliation(s)
- Cort W Rudolph
- Department of Psychology, Saint Louis University, Saint Louis, MO, USA
10
Buzbas EO, Devezer B, Baumgaertner B. The logical structure of experiments lays the foundation for a theory of reproducibility. R Soc Open Sci 2023; 10:221042. [PMID: 36938532 PMCID: PMC10014247 DOI: 10.1098/rsos.221042]
Abstract
The scientific reform movement has proposed openness as a potential remedy to the putative reproducibility or replication crisis. However, the conceptual relationship among openness, replication experiments and results reproducibility has been obscure. We analyse the logical structure of experiments, define the mathematical notion of idealized experiment and use this notion to advance a theory of reproducibility. Idealized experiments clearly delineate the concepts of replication and results reproducibility, and capture key differences with precision, allowing us to study the relationship among them. We show how results reproducibility varies as a function of the elements of an idealized experiment, the true data-generating mechanism, and the closeness of the replication experiment to an original experiment. We clarify how openness of experiments is related to designing informative replication experiments and to obtaining reproducible results. With formal backing and evidence, we argue that the current 'crisis' reflects inadequate attention to a theoretical understanding of results reproducibility.
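One narrow facet of results reproducibility, the chance that an exact replication repeats a "significant" outcome, already follows from statistical power alone; a toy Monte Carlo makes this visible (an illustration only, not the paper's idealized-experiment formalism):

```python
# Toy Monte Carlo: how often do an original experiment and an exact
# replication both reach two-sided significance? With power ~0.61 per
# experiment, the joint rate is roughly 0.61**2, well below intuition.
import math
import random

def one_experiment(delta, n, rng):
    """Two-sample z-test with known unit variance; True if |z| > 1.96."""
    a = [rng.gauss(0.0, 1.0) for _ in range(n)]
    b = [rng.gauss(delta, 1.0) for _ in range(n)]
    se = math.sqrt(2.0 / n)
    z = (sum(b) / n - sum(a) / n) / se
    return abs(z) > 1.96

def reproducibility(delta, n, reps, seed=0):
    """Estimated probability that original and replication are both significant."""
    rng = random.Random(seed)
    both = sum(one_experiment(delta, n, rng) and one_experiment(delta, n, rng)
               for _ in range(reps))
    return both / reps

print(reproducibility(delta=0.5, n=40, reps=2000))  # roughly 0.37
```

The paper's framework generalizes far beyond this: it varies which elements of the idealized experiment the replication holds fixed, not just the sampling noise.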
Affiliation(s)
- Erkan O. Buzbas
- Department of Mathematics and Statistical Science, University of Idaho, Moscow, ID 83844, USA
- Berna Devezer
- Department of Mathematics and Statistical Science, University of Idaho, Moscow, ID 83844, USA
- Department of Business, University of Idaho, Moscow, ID 83844, USA
- Bert Baumgaertner
- Department of Politics and Philosophy, University of Idaho, Moscow, ID 83844, USA
11
Tang B, Levine M, Adamek JH, Wodka EL, Caffo BS, Ewen JB. Evaluating causal psychological models: A study of language theories of autism using a large sample. Front Psychol 2023; 14:1060525. [PMID: 36910768 PMCID: PMC9998497 DOI: 10.3389/fpsyg.2023.1060525]
Abstract
We used a large convenience sample (n = 22,223) from the Simons Powering Autism Research (SPARK) dataset to evaluate causal, explanatory theories of core autism symptoms. In particular, the data items collected supported testing theories that posit altered language abilities as a cause of social withdrawal, as well as alternative theories that compete with these language theories. Our results using this large dataset converge with the evolution of the field in the decades since these theories were first proposed, namely supporting primary social withdrawal (in some cases of autism) as a cause of altered language development, rather than vice versa. To accomplish these empirical goals, we used a highly theory-constrained approach, one which differs from current data-driven modeling trends but is coherent with a very recent resurgence in theory-driven psychology. In addition to careful explication and formalization of theoretical accounts, we propose three principles for future work of this type: specification, quantification, and integration. Specification refers to constraining models with pre-existing data, from both outside and within autism research, with more elaborate models and more veridical measures, and with longitudinal data collection. Quantification refers to using continuous measures of both psychological causes and effects, as well as weighted graphs. This approach avoids "universality and uniqueness" tests that hold that a single cognitive difference could be responsible for a heterogeneous and complex behavioral phenotype. Integration of multiple explanatory paths within a single model helps the field examine multiple contributors to a single behavioral feature or to multiple behavioral features. It also allows integration of explanatory theories across multiple current-day diagnoses, as well as typical development.
Collapse
Affiliation(s)
- Bohao Tang
- Bloomberg School of Public Health, Johns Hopkins University, Baltimore, MD, United States
| | | | - Jack H Adamek
- Kennedy Krieger Institute, Baltimore, MD, United States
| | - Ericka L Wodka
- Kennedy Krieger Institute, Baltimore, MD, United States; School of Medicine, Johns Hopkins University, Baltimore, MD, United States
| | - Brian S Caffo
- Bloomberg School of Public Health, Johns Hopkins University, Baltimore, MD, United States
| | - Joshua B Ewen
- Kennedy Krieger Institute, Baltimore, MD, United States; School of Medicine, Johns Hopkins University, Baltimore, MD, United States; Neurology and Developmental Medicine, Kennedy Krieger Institute, Baltimore, MD, United States
| |
Collapse
|
12
|
Hardwicke TE, Wagenmakers EJ. Reducing bias, increasing transparency and calibrating confidence with preregistration. Nat Hum Behav 2023; 7:15-26. [PMID: 36707644 DOI: 10.1038/s41562-022-01497-2] [Citation(s) in RCA: 8] [Impact Index Per Article: 8.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/23/2021] [Accepted: 11/09/2022] [Indexed: 01/29/2023]
Abstract
Flexibility in the design, analysis and interpretation of scientific studies creates a multiplicity of possible research outcomes. Scientists are granted considerable latitude to selectively use and report the hypotheses, variables and analyses that create the most positive, coherent and attractive story while suppressing those that are negative or inconvenient. This creates a risk of bias that can lead to scientists fooling themselves and fooling others. Preregistration involves declaring a research plan (for example, hypotheses, design and statistical analyses) in a public registry before the research outcomes are known. Preregistration (1) reduces the risk of bias by encouraging outcome-independent decision-making and (2) increases transparency, enabling others to assess the risk of bias and calibrate their confidence in research outcomes. In this Perspective, we briefly review the historical evolution of preregistration in medicine, psychology and other domains, clarify its pragmatic functions, discuss relevant meta-research, and provide recommendations for scientists and journal editors.
Collapse
Affiliation(s)
- Tom E Hardwicke
- Department of Psychology, University of Amsterdam, Amsterdam, the Netherlands.
| | | |
Collapse
|
13
|
Barros JM, Widmer LA, Baillie M, Wandel S. Rethinking clinical study data: why we should respect analysis results as data. Sci Data 2022; 9:686. [PMID: 36357430 PMCID: PMC9649650 DOI: 10.1038/s41597-022-01789-2] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/25/2022] [Accepted: 10/18/2022] [Indexed: 11/12/2022] Open
Abstract
The development and approval of new treatments generates large volumes of results, such as summaries of efficacy and safety. However, it is commonly overlooked that analyzing clinical study data also produces data in the form of results. For example, descriptive statistics and model predictions are data. Although integrating and putting findings into context is a cornerstone of scientific work, analysis results are often neglected as a data source. Results end up stored as "data products" such as PDF documents that are not machine readable or amenable to future analyses. We propose a solution to "calculate once, use many times" by combining analysis results standards with a common data model. This analysis results data model re-frames the target of analyses from static representations of the results (e.g., tables and figures) to a data model with applications in various contexts, including knowledge discovery. Further, we provide a working proof of concept detailing how to approach standardization and construct a schema to store and query analysis results.
Collapse
Affiliation(s)
- Joana M Barros
- Analytics, Novartis Pharma AG, Basel, Switzerland.
- Department of Biometry, Idorsia Pharmaceuticals, Allschwil, Switzerland.
| | | | - Mark Baillie
- Analytics, Novartis Pharma AG, Basel, Switzerland.
| | - Simon Wandel
- Analytics, Novartis Pharma AG, Basel, Switzerland
| |
Collapse
|
14
|
Niso G, Botvinik-Nezer R, Appelhoff S, De La Vega A, Esteban O, Etzel JA, Finc K, Ganz M, Gau R, Halchenko YO, Herholz P, Karakuzu A, Keator DB, Markiewicz CJ, Maumet C, Pernet CR, Pestilli F, Queder N, Schmitt T, Sójka W, Wagner AS, Whitaker KJ, Rieger JW. Open and reproducible neuroimaging: From study inception to publication. Neuroimage 2022; 263:119623. [PMID: 36100172 PMCID: PMC10008521 DOI: 10.1016/j.neuroimage.2022.119623] [Citation(s) in RCA: 29] [Impact Index Per Article: 14.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/13/2022] [Revised: 08/17/2022] [Accepted: 09/09/2022] [Indexed: 10/31/2022] Open
Abstract
Empirical observations of how labs conduct research indicate that the adoption rate of open practices for transparent, reproducible, and collaborative science remains in its infancy. This is at odds with the overwhelming evidence for the necessity of these practices and their benefits for individual researchers, scientific progress, and society in general. To date, information required for implementing open science practices throughout the different steps of a research project is scattered among many different sources. Even researchers experienced in the topic find it hard to navigate the ecosystem of tools and to make sustainable choices. Here, we provide an integrated overview of community-developed resources that can support collaborative, open, reproducible, replicable, robust and generalizable neuroimaging throughout the entire research cycle from inception to publication and across different neuroimaging modalities. We review tools and practices supporting study inception and planning, data acquisition, research data management, data processing and analysis, and research dissemination. An online version of this resource can be found at https://oreoni.github.io. We believe it will prove helpful for researchers and institutions to make a successful and sustainable move towards open and reproducible science and to eventually take an active role in its future development.
Collapse
Affiliation(s)
- Guiomar Niso
- Psychological & Brain Sciences, Indiana University, Bloomington, IN, USA; Universidad Politecnica de Madrid, Madrid and CIBER-BBN, Spain; Instituto Cajal, CSIC, Madrid, Spain.
| | - Rotem Botvinik-Nezer
- Department of Psychological and Brain Sciences, Dartmouth College, Hanover, NH, USA.
| | - Stefan Appelhoff
- Center for Adaptive Rationality, Max Planck Institute for Human Development, Berlin, Germany
| | | | - Oscar Esteban
- Dept. of Radiology, Lausanne University Hospital and University of Lausanne, Lausanne, Switzerland; Department of Psychology, Stanford University, Stanford, CA, USA
| | - Joset A Etzel
- Department of Psychological and Brain Sciences, Washington University in St. Louis, St. Louis, MO, USA
| | - Karolina Finc
- Centre for Modern Interdisciplinary Technologies, Nicolaus Copernicus University, Toruń, Poland
| | - Melanie Ganz
- Neurobiology Research Unit, Rigshospitalet, Copenhagen, Denmark; Department of Computer Science, University of Copenhagen, Copenhagen, Denmark
| | - Rémi Gau
- Institute of Psychology, Université catholique de Louvain, Louvain la Neuve, Belgium
| | - Yaroslav O Halchenko
- Department of Psychological and Brain Sciences, Dartmouth College, Hanover, NH, USA
| | - Peer Herholz
- Montreal Neurological Institute-Hospital, McGill University, Montréal, Quebec, Canada
| | - Agah Karakuzu
- Biomedical Engineering Institute, Polytechnique Montréal, Montréal, Quebec, Canada; Montréal Heart Institute, Montréal, Quebec, Canada
| | - David B Keator
- Department of Psychiatry and Human Behavior, University of California, Irvine, CA, USA
| | | | - Camille Maumet
- Inria, Univ Rennes, CNRS, Inserm - IRISA UMR 6074, Empenn ERL U 1228, Rennes, France
| | - Cyril R Pernet
- Neurobiology Research Unit, Rigshospitalet, Copenhagen, Denmark
| | - Franco Pestilli
- Psychological & Brain Sciences, Indiana University, Bloomington, IN, USA; Department of Psychology, The University of Texas at Austin, Austin, TX, USA
| | - Nazek Queder
- Montreal Neurological Institute-Hospital, McGill University, Montréal, Quebec, Canada; Department of Neurobiology and Behavior, University of California, Irvine, CA, USA
| | - Tina Schmitt
- Neuroimaging Unit, Carl-von-Ossietzky Universität, Oldenburg, Germany
| | - Weronika Sójka
- Faculty of Philosophy and Social Sciences, Nicolaus Copernicus University, Toruń, Poland
| | - Adina S Wagner
- Institute for Neuroscience and Medicine, Research Centre Juelich, Germany
| | | | - Jochem W Rieger
- Neuroimaging Unit, Carl-von-Ossietzky Universität, Oldenburg, Germany; Department of Psychology, Carl-von-Ossietzky Universität, Oldenburg, Germany.
| |
Collapse
|
15
|
Rubin M, Donkin C. Exploratory hypothesis tests can be more compelling than confirmatory hypothesis tests. PHILOSOPHICAL PSYCHOLOGY 2022. [DOI: 10.1080/09515089.2022.2113771] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Indexed: 10/15/2022]
Affiliation(s)
- Mark Rubin
- Department of Psychology, Durham University, Durham, UK
| | - Chris Donkin
- Faculty of Psychology and Educational Sciences, Ludwig Maximilian University of Munich, Munich, Germany
| |
Collapse
|
16
|
Niso G, Krol LR, Combrisson E, Dubarry AS, Elliott MA, François C, Héjja-Brichard Y, Herbst SK, Jerbi K, Kovic V, Lehongre K, Luck SJ, Mercier M, Mosher JC, Pavlov YG, Puce A, Schettino A, Schön D, Sinnott-Armstrong W, Somon B, Šoškić A, Styles SJ, Tibon R, Vilas MG, van Vliet M, Chaumon M. Good scientific practice in EEG and MEG research: Progress and perspectives. Neuroimage 2022; 257:119056. [PMID: 35283287 PMCID: PMC11236277 DOI: 10.1016/j.neuroimage.2022.119056] [Citation(s) in RCA: 8] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/28/2021] [Revised: 02/25/2022] [Accepted: 03/01/2022] [Indexed: 11/22/2022] Open
Abstract
Good scientific practice (GSP) refers to both explicit and implicit rules, recommendations, and guidelines that help scientists to produce work that is of the highest quality at any given time, and to efficiently share that work with the community for further scrutiny or utilization. For experimental research using magneto- and electroencephalography (MEEG), GSP includes specific standards and guidelines for technical competence, which are periodically updated and adapted to new findings. However, GSP also needs to be regularly revisited in a broader light. At the LiveMEEG 2020 conference, a reflection on GSP was fostered that included explicitly documented guidelines and technical advances, but also emphasized intangible GSP: a general awareness of personal, organizational, and societal realities and how they can influence MEEG research. This article provides an extensive report on most of the LiveMEEG contributions and new literature, with the additional aim to synthesize ongoing cultural changes in GSP. It first covers GSP with respect to cognitive biases and logical fallacies, pre-registration as a tool to avoid those and other early pitfalls, and a number of resources to enable collaborative and reproducible research as a general approach to minimize misconceptions. Second, it covers GSP with respect to data acquisition, analysis, reporting, and sharing, including new tools and frameworks to support collaborative work. Finally, GSP is considered in light of ethical implications of MEEG research and the resulting responsibility that scientists have to engage with societal challenges. Considering among other things the benefits of peer review and open access at all stages, the need to coordinate larger international projects, the complexity of MEEG subject matter, and today's prioritization of fairness, privacy, and the environment, we find that current GSP tends to favor collective and cooperative work, for both scientific and societal reasons.
Collapse
Affiliation(s)
- Guiomar Niso
- Psychological & Brain Sciences, Indiana University, Bloomington, IN, USA; Universidad Politecnica de Madrid and CIBER-BBN, Madrid, Spain
| | - Laurens R Krol
- Neuroadaptive Human-Computer Interaction, Brandenburg University of Technology Cottbus-Senftenberg, Germany
| | - Etienne Combrisson
- Aix-Marseille University, Institut de Neurosciences de la Timone, France
| | | | | | | | - Yseult Héjja-Brichard
- Centre d'Ecologie Fonctionnelle et Evolutive, CNRS, EPHE, IRD, Université Montpellier, Montpellier, France
| | - Sophie K Herbst
- Cognitive Neuroimaging Unit, INSERM, CEA, CNRS, NeuroSpin center, Université Paris-Saclay, Gif/Yvette, France
| | - Karim Jerbi
- Cognitive and Computational Neuroscience Laboratory, Department of Psychology, University of Montreal, Montreal, QC, Canada; Mila - Quebec Artificial Intelligence Institute, Canada
| | - Vanja Kovic
- Faculty of Philosophy, Laboratory for neurocognition and applied cognition, University of Belgrade, Serbia
| | - Katia Lehongre
- Institut du Cerveau - Paris Brain Institute - ICM, Inserm U 1127, CNRS UMR 7225, APHP, Hôpital de la Pitié Salpêtrière, Sorbonne Université, Centre MEG-EEG, Centre de NeuroImagerie Recherche (CENIR), Paris, France
| | - Steven J Luck
- Center for Mind & Brain, University of California, Davis, CA, USA
| | - Manuel Mercier
- Aix Marseille Univ, Inserm, INS, Inst Neurosci Syst, Marseille, France
| | - John C Mosher
- McGovern Medical School, University of Texas Health Science Center at Houston, Houston, TX, USA
| | - Yuri G Pavlov
- University of Tuebingen, Germany; Ural Federal University, Yekaterinburg, Russia
| | - Aina Puce
- Psychological & Brain Sciences, Indiana University, Bloomington, IN, USA
| | - Antonio Schettino
- Erasmus University Rotterdam, Rotterdam, the Netherlands; Institute for Globally Distributed Open Research and Education (IGDORE), Sweden
| | - Daniele Schön
- Aix Marseille Univ, Inserm, INS, Inst Neurosci Syst, Marseille, France
| | | | | | - Anđela Šoškić
- Faculty of Philosophy, Laboratory for neurocognition and applied cognition, University of Belgrade, Serbia; Teacher Education Faculty, University of Belgrade, Serbia
| | - Suzy J Styles
- Psychology, Nanyang Technological University, Singapore; Singapore Institute for Clinical Sciences, A*STAR, Singapore
| | - Roni Tibon
- MRC Cognition and Brain Sciences Unit, University of Cambridge, Cambridge, UK; School of Psychology, University of Nottingham, Nottingham, UK
| | - Martina G Vilas
- Ernst Strüngmann Institute for Neuroscience, Frankfurt am Main, Germany
| | | | - Maximilien Chaumon
- Institut du Cerveau - Paris Brain Institute - ICM, Inserm U 1127, CNRS UMR 7225, APHP, Hôpital de la Pitié Salpêtrière, Sorbonne Université, Centre MEG-EEG, Centre de NeuroImagerie Recherche (CENIR), Paris, France.
| |
Collapse
|
17
|
Abstract
A well-formulated research question should incorporate the components of a ‘problem’, an ‘intervention’, a ‘control’, and an ‘outcome’—at least according to the PICO mnemonic. The utility of this format, however, has been said to be limited to clinical studies that pose ‘which’ questions demanding correlational study designs. In contrast, its suitability for descriptive approaches outside of clinical investigations has been doubted. This paper disagrees with the alleged limitations of PICO. Instead, it argues that the scheme can be used universally for every scientific endeavour in any discipline with all study designs. This argument draws from four abstract components common to all research, namely, a research object, a theory/method, a (null) hypothesis, and the goal of knowledge generation. Various examples of how highly heterogeneous studies from different disciplines can be grounded in the single scheme of PICO are offered. The finding implies that PICO is indeed a universal technique that can be used for teaching academic writing in any discipline, beyond clinical settings, regardless of the preferred study design.
Collapse
|
18
|
Baillie M, Moloney C, Mueller CP, Dorn J, Branson J, Ohlssen D. Good Data Science Practice: Moving Towards a Code of Practice for Drug Development. Stat Biopharm Res 2022. [DOI: 10.1080/19466315.2022.2063172] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/29/2022]
Affiliation(s)
- Mark Baillie
- Clinical Development & Analytics, Novartis Pharma AG, Basel, Switzerland
| | - Conor Moloney
- Clinical Development & Analytics, Novartis Pharma AG, Dublin, Ireland
| | | | - Jonas Dorn
- pRED Informatics, Roche, Basel, Switzerland
| | - Janice Branson
- Clinical Development & Analytics, Novartis Pharma AG, Basel, Switzerland
| | - David Ohlssen
- Clinical Development & Analytics, Novartis Pharma AG, East Hannover, New Jersey, USA
| |
Collapse
|
19
|
Abstract
The recent reform debates in psychological science, prompted by a widespread crisis of confidence, have exposed and destabilized the so-called myth of self-correction, that is, the problem that most scientists perceive their disciplines as self-correcting without engaging in actual practices that correct the scientific record. In this paper, building on the idea of self-correction as a myth, I propose another myth common to psychological science: the myth of self-organization. The myth of self-organization is the idea that scientific literature will organize itself into something the community adding to it would recognize as systematic knowledge, while the actual members of those communities do not engage in effective ways of organizing it. I argue for the existence of the myth of self-organization by taking a historical look at how the scientific literature was construed by psychologists during the 20th century. In my view, the literature, and behaviors of scientists related to it, becomes a social institution exerting influence over the science it belongs to. I conclude with a critical discussion of self-organization through the debates about preregistration and theory formalization in psychology’s reform movement.
Collapse
Affiliation(s)
- Ivan Flis
- Department of Psychology, Catholic University of Croatia, Zagreb, Croatia
| |
Collapse
|
20
|
Taper ML, Lele SR, Ponciano JM, Dennis B, Jerde CL. Assessing the Global and Local Uncertainty of Scientific Evidence in the Presence of Model Misspecification. Front Ecol Evol 2021. [DOI: 10.3389/fevo.2021.679155] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/24/2022] Open
Abstract
Scientists need to compare the support for models based on observed phenomena. The main goal of the evidential paradigm is to quantify the strength of evidence in the data for a reference model relative to an alternative model. This is done via an evidence function, such as ΔSIC, an estimator of the sample-size-scaled difference of divergences between the generating mechanism and the competing models. To use evidence, either for decision making or as a guide to the accumulation of knowledge, an understanding of the uncertainty in the evidence is needed. This uncertainty is well characterized by the standard statistical theory of estimation. Unfortunately, the standard theory breaks down if the models are misspecified, as is commonly the case in scientific studies. We develop non-parametric bootstrap methodologies for estimating the sampling distribution of the evidence estimator under model misspecification. This sampling distribution allows us to determine how secure we are in our evidential statement. We characterize this uncertainty in the strength of evidence with two different types of confidence intervals, which we term “global” and “local.” We discuss how evidence uncertainty can be used to improve scientific inference and illustrate this with a reanalysis of the model identification problem in a prominent landscape ecology study using structural equations.
Collapse
|
21
|
Haucke M, Hoekstra R, van Ravenzwaaij D. When numbers fail: do researchers agree on operationalization of published research? ROYAL SOCIETY OPEN SCIENCE 2021; 8:191354. [PMID: 34527263 PMCID: PMC8424321 DOI: 10.1098/rsos.191354] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Received: 08/05/2019] [Accepted: 08/04/2021] [Indexed: 06/13/2023]
Abstract
Current discussions on improving the reproducibility of science often revolve around statistical innovations. However, equally important for improving methodological rigour is a valid operationalization of phenomena. Operationalization is the process of translating theoretical constructs into measurable laboratory quantities. Thus, the validity of operationalization is central to the quality of empirical studies. But do differences in the validity of operationalization affect the way scientists evaluate scientific literature? To investigate this, we manipulated the strength of operationalization of three published studies and sent them to researchers via email. In the first task, researchers were presented with a summary of the Method and Result section from one of the studies and were asked to guess the hypothesis that was investigated via a multiple-choice questionnaire. In a second task, researchers were asked to rate the perceived quality of the study. Our results show that (1) researchers are better at inferring the underlying research question from empirical results if the operationalization is more valid, but (2) differences in validity are reflected only to some extent in judgements of the study's quality. These results combined give partial corroboration to the notion that researchers' evaluations of research results are not affected by operationalization validity.
Collapse
Affiliation(s)
- Matthias Haucke
- Clinical Psychology and Psychotherapy, Department of Education and Psychology, Freie Universität Berlin, Berlin, Germany
- Department of Psychometrics, University of Groningen, Groningen, The Netherlands
| | - Rink Hoekstra
- Department of Pedagogical and Educational Sciences, University of Groningen, Groningen, The Netherlands
| | - Don van Ravenzwaaij
- Department of Psychometrics, University of Groningen, Groningen, The Netherlands
| |
Collapse
|
22
|
van Rooij I, Baggio G. Theory Before the Test: How to Build High-Verisimilitude Explanatory Theories in Psychological Science. PERSPECTIVES ON PSYCHOLOGICAL SCIENCE 2021; 16:682-697. [PMID: 33404356 PMCID: PMC8273840 DOI: 10.1177/1745691620970604] [Citation(s) in RCA: 66] [Impact Index Per Article: 22.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/21/2023]
Abstract
Drawing on the philosophy of psychological explanation, we suggest that psychological science, by focusing on effects, may lose sight of its primary explananda: psychological capacities. We revisit Marr's levels-of-analysis framework, which has been remarkably productive and useful for cognitive psychological explanation. We discuss ways in which Marr's framework may be extended to other areas of psychology, such as social, developmental, and evolutionary psychology, bringing new benefits to these fields. We then show how theoretical analyses can endow a theory with minimal plausibility even before contact with empirical data: We call this the theoretical cycle. Finally, we explain how our proposal may contribute to addressing critical issues in psychological science, including how to leverage effects to understand capacities better.
Collapse
Affiliation(s)
- Iris van Rooij
- Donders Institute for Brain, Cognition and Behaviour, Radboud University
| | - Giosuè Baggio
- Department of Language and Literature, Norwegian University of Science and Technology
| |
Collapse
|
23
|
|