1
Porto de Oliveira JVM, de Oliveira Júnior ALF, de Freitas Martins LP, Dourado HN, Purificação IR, Kolias AG, Paiva WS, Solla DJF. Spin in traumatic brain injury literature: prevalence and associated factors. A systematic review. J Neurosurg 2024:1-8. [PMID: 38728757 DOI: 10.3171/2023.11.jns231822]
Abstract
OBJECTIVE Spin is characterized as a misinterpretation of results that, whether deliberate or unintentional, culminates in misleading conclusions and steers readers toward an excessively optimistic perspective of the data. The primary objective of this systematic review was to estimate the prevalence and nature of spin within the traumatic brain injury (TBI) literature. Additionally, the identification of associated factors is intended to provide guidance for future research practices. METHODS The Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) recommendations were followed. A search of the MEDLINE/PubMed database was conducted to identify English-language articles published between January 1960 and July 2020. Inclusion criteria encompassed randomized controlled trials (RCTs) that exclusively enrolled TBI patients, investigating various interventions, whether surgical or nonsurgical, and that were published in high-impact journals. Spin was defined as 1) a focus on statistically significant results not based on the primary outcome; 2) interpreting statistically nonsignificant results for a superiority analysis of the primary outcome; 3) claiming or emphasizing the beneficial effect of the treatment despite statistically nonsignificant results; 4) a conclusion focused on the per-protocol or as-treated analysis instead of the intention-to-treat (ITT) results; 5) incorrect statistical analysis; or 6) republication of a significant secondary analysis without proper acknowledgment of the primary outcome analysis result. Primary outcomes were those explicitly reported as such in the published article. Studies without a clear primary outcome were excluded. Study characteristics were described using traditional descriptive statistics, and an exploratory inferential analysis was performed to identify characteristics associated with spin. The studies' risk of bias was evaluated by the Cochrane Risk of Bias Tool.
RESULTS A total of 150 RCTs were included and 22% (n = 33) had spin, most commonly spin types 1 and 3. The overall risk of bias (p < 0.001), a neurosurgery department member as the first author (p = 0.009), absence of a statistician among authors (p = 0.042), and smaller sample sizes (p = 0.033) were associated with spin. CONCLUSIONS The prevalence of spin in the TBI literature is high, even at leading medical journals. Studies with higher risks of bias are more frequently associated with spin. Critical interpretation of results and authors' conclusions is advisable regardless of the study design and published journal.
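The headline 22% (33/150) prevalence invites a quick uncertainty check. As an illustrative aside (not part of the review itself; the `wilson_ci` helper name is our own), a Wilson score interval for that proportion can be sketched in a few lines:

```python
from math import sqrt

def wilson_ci(k, n, z=1.96):
    """95% Wilson score interval for a binomial proportion k/n."""
    p = k / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

# The review's spin prevalence: 33 of 150 trials
lo, hi = wilson_ci(33, 150)
```

For 33/150 this gives roughly 16% to 29%, so even the lower bound of the interval supports the conclusion that spin is common.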
Affiliation(s)
- Angelos G Kolias
- NIHR Global Health Research Group on Neurotrauma, University of Cambridge, United Kingdom
- Division of Neurosurgery, Department of Clinical Neurosciences, Addenbrooke's Hospital, University of Cambridge, United Kingdom
- Wellingson Silva Paiva
- NIHR Global Health Research Group on Neurotrauma, University of Cambridge, United Kingdom
- Division of Neurosurgery, Department of Neurology, Hospital das Clínicas of the Faculty of Medicine, University of São Paulo, Brazil
- Davi Jorge Fontoura Solla
- NIHR Global Health Research Group on Neurotrauma, University of Cambridge, United Kingdom
- Division of Neurosurgery, Department of Neurology, Hospital das Clínicas of the Faculty of Medicine, University of São Paulo, Brazil
2
Slaney KL, Graham ME, Dhillon RS, Hohn RE. Rhetoric of psychological measurement theory and practice. Front Psychol 2024; 15:1374330. [PMID: 38699572 PMCID: PMC11064813 DOI: 10.3389/fpsyg.2024.1374330]
Abstract
Metascience scholars have long been concerned with tracking the use of rhetorical language in scientific discourse, oftentimes to analyze the legitimacy and validity of scientific claim-making. Psychology, however, has only recently become the explicit target of such metascientific scholarship, much of which has been in response to the recent crises surrounding replicability of quantitative research findings and questionable research practices. The focus of this paper is on the rhetoric of psychological measurement and validity scholarship, in both the theoretical/methodological and the empirical literatures. We examine various discourse practices in published psychological measurement and validity literature, including: (a) clear instances of rhetoric (i.e., persuasion or performance); (b) common or rote expressions and tropes (e.g., perfunctory claims or declarations); (c) metaphors and other "literary" styles; and (d) ambiguous, confusing, or unjustifiable claims. The methodological approach we use is informed by a combination of conceptual analysis and exploratory grounded theory, the latter of which we used to identify relevant themes within the published psychological discourse. Examples are given both of constructive and useful discourse practices and of misleading and potentially harmful ones. Our objectives are both to contribute to the critical methodological literature on psychological measurement and to connect metascience in psychology to broader interdisciplinary examinations of science discourse.
3
Grimes DR. Region of Attainable Redaction, an extension of Ellipse of Insignificance analysis for gauging impacts of data redaction in dichotomous outcome trials. eLife 2024; 13:e93050. [PMID: 38284745 PMCID: PMC10871715 DOI: 10.7554/elife.93050]
Abstract
In biomedical science, it is a reality that many published results do not withstand deeper investigation, and there is growing concern over a replicability crisis in science. Recently, Ellipse of Insignificance (EOI) analysis was introduced as a tool to allow researchers to gauge the robustness of reported results in dichotomous outcome design trials, giving precise deterministic values for the degree of miscoding between events and non-events tolerable simultaneously in both control and experimental arms (Grimes, 2022). While this is useful for situations where potential miscoding might transpire, it does not account for situations where apparently significant findings might result from accidental or deliberate data redaction in either the control or experimental arms of an experiment, or from missing data or systematic redaction. To address these scenarios, we introduce Region of Attainable Redaction (ROAR), a tool that extends EOI analysis to account for situations of potential data redaction. This produces a bounded cubic curve rather than an ellipse, and we outline how this can be used to identify potential redaction through an approach analogous to EOI. Applications are illustrated, and source code, including a web-based implementation that performs EOI and ROAR analysis in tandem for dichotomous outcome trials, is provided.
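The redaction idea can be illustrated numerically, though the paper's actual EOI/ROAR computation is analytic and more general. The sketch below (hypothetical helper names, our own simplification) uses a pooled two-proportion z-test and simply scans how many redacted participants with events, restored to the experimental arm, would erase a significant result:

```python
from math import sqrt, erfc

def two_prop_p(e1, n1, e2, n2):
    """Two-sided p-value for a pooled two-proportion z-test."""
    p1, p2 = e1 / n1, e2 / n2
    pool = (e1 + e2) / (n1 + n2)
    se = sqrt(pool * (1 - pool) * (1 / n1 + 1 / n2))
    z = abs(p1 - p2) / se
    return erfc(z / sqrt(2))

def min_redactions(e1, n1, e2, n2, alpha=0.05, max_r=1000):
    """Smallest number of hypothetically redacted events in the
    experimental arm that would render the result non-significant."""
    for r in range(1, max_r + 1):
        if two_prop_p(e1, n1, e2 + r, n2 + r) >= alpha:
            return r
    return None
```

A result that flips with only a handful of redacted records is fragile in the sense ROAR formalises; the full method bounds the attainable-redaction region in both arms at once rather than scanning one direction.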
Affiliation(s)
- David Robert Grimes
- School of Medicine, Trinity College Dublin, Dublin, Ireland
- School of Physical Sciences, Dublin City University, Dublin, Ireland
4
Grimes DR. Is biomedical research self-correcting? Modelling insights on the persistence of spurious science. R Soc Open Sci 2024; 11:231056. [PMID: 38298396 PMCID: PMC10827424 DOI: 10.1098/rsos.231056]
Abstract
The reality that volumes of published biomedical research are not reproducible is an increasingly recognized problem. Spurious results reduce the trustworthiness of reported science and increase research waste. While science should be self-correcting from a philosophical perspective, that in isolation yields no information on the effort required to nullify suspect findings or the factors shaping how quickly science may be corrected. There is also a paucity of information on how perverse incentives in the publishing ecosystem, which favour novel positive findings over null results, shape the ability of published science to self-correct. Knowledge of the factors shaping self-correction of science remains obscure, limiting our ability to mitigate harms. This modelling study introduces a simple model to capture the dynamics of the publication ecosystem, exploring factors influencing research waste, trustworthiness, corrective effort and time to correction. Results from this work indicate that research waste and corrective effort are highly dependent on field-specific false positive rates and on the time delay before corrective results appear, during which spurious findings are propagated. The model also suggests conditions under which biomedical science is self-correcting and those under which publication of correctives alone cannot stem the propagation of untrustworthy results. Finally, this work models a variety of potential mitigation strategies, including researcher- and publisher-driven interventions.
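As a deliberately simplified toy (the paper's model tracks dynamics this sketch ignores, and every parameter name below is hypothetical), the basic persistence mechanism can be simulated: a spurious finding is corrected only if a replication is attempted and its null result actually gets published.

```python
import random

def simulate(n_findings=10_000, false_pos_rate=0.1,
             p_replicated=0.3, p_corrective_published=0.5, seed=1):
    """Toy publication-ecosystem model: count spurious findings and
    how many persist uncorrected."""
    rng = random.Random(seed)
    spurious = sum(rng.random() < false_pos_rate for _ in range(n_findings))
    corrected = sum(
        rng.random() < p_replicated and rng.random() < p_corrective_published
        for _ in range(spurious)
    )
    return spurious, spurious - corrected

spurious, persisting = simulate()
```

With the illustrative defaults, most spurious findings persist, which is the qualitative point: correction requires both replication effort and an outlet willing to publish null results.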
Affiliation(s)
- David Robert Grimes
- School of Medicine, Trinity College, Dublin, Ireland
- School of Physical Sciences, Dublin City University, Dublin, Ireland
5
Schneider J. Sorry we're open, come in we're closed: different profiles in the perceived applicability of open science practices to completed research projects. R Soc Open Sci 2024; 11:230595. [PMID: 38298393 PMCID: PMC10827419 DOI: 10.1098/rsos.230595]
Abstract
Open science is an increasingly important topic for research, politics and funding agencies. However, the discourse on open science is heavily influenced by certain research fields and paradigms, leading to the risk of generalizing what counts as openness to other research fields, regardless of its applicability. In our paper, we provide evidence that researchers perceive different profiles in the potential to apply open science practices to their projects, making a one-size-fits-all approach unsuitable. In a pilot study, we first systematized the breadth of open science practices. The subsequent survey study examined the perceived applicability of 13 open science practices across completed research projects in a broad variety of research disciplines. We were able to identify four different profiles in the perceived applicability of open science practices. For researchers conducting qualitative-empirical research projects, comprehensively implementing the breadth of open science practices tends not to be feasible. Further, research projects from some disciplines tended to fit a profile with little opportunity for public participation. Yet, disciplines and research paradigms appear not to be the key factors in predicting the perceived applicability of open science practices. Our findings underscore the case for considering project-related conditions when implementing open science practices. This has implications for the establishment of policies, guidelines and standards concerning open science.
Affiliation(s)
- Jürgen Schneider
- Department of Teacher and Teaching Quality, DIPF | Leibniz Institute for Research and Information in Education, Rostocker Straße 6, 60323 Frankfurt am Main, Germany
6
Watson NM, Thomas JD. Studying Adherence to Reporting Standards in Kinesiology: A Post-publication Peer Review Brief Report. Int J Exerc Sci 2024; 17:25-37. [PMID: 38666001 PMCID: PMC11042891]
Abstract
To demonstrate how post-publication peer reviews, using journal article reporting standards, could improve the design and write-up of kinesiology research, the authors performed a post-publication peer review on one systematic literature review published in 2020. Two raters (1st & 2nd authors) critically appraised the case article between April and May 2021. The latest Journal Article Reporting Standards by the American Psychological Association relevant to the review were used: i.e., Table 1 (quantitative research standards) and Table 9 (research synthesis standards). A standard fully met was deemed satisfactory. Per Krippendorff's alpha-coefficient, inter-rater agreement was moderate for Table 1 (k-alpha = .57, raw-agreement = 72.2%) and poor for Table 9 (k-alpha = .09, raw-agreement = 53.6%). A 100% consensus was reached on all discrepancies. Results suggest the case article's Abstract, Methods, and Discussion sections required clarification or more detail. Per Table 9 standards, four sections were largely incomplete: i.e., Abstract (100%-incomplete), Introduction (66%-incomplete), Methods (75%-incomplete), and Discussion (66%-incomplete). Case article strengths included tabular summary of studies analyzed in the systematic review and a cautionary comment about the review's generalizability. The article's write-up gave detail to help the reader understand the scope of the study and decisions made by the authors. However, adequate detail was not provided to assess the credibility of all claims made in the article. This could affect readers' ability to obtain critical and nuanced understanding of the article's topics. The results of this critique should encourage (continuing) education on journal article reporting standards for diverse stakeholders (e.g., authors, reviewers).
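For readers unfamiliar with the agreement statistic reported here, a minimal Krippendorff's alpha for two raters and nominal data (an illustrative implementation, not the authors' code) can be computed from the coincidence counts:

```python
from collections import Counter

def krippendorff_alpha_nominal(r1, r2):
    """Krippendorff's alpha for two raters, nominal data, no missing values."""
    # Coincidence counts: each unit contributes both ordered pairs.
    o = Counter()
    for a, b in zip(r1, r2):
        o[(a, b)] += 1
        o[(b, a)] += 1
    n_c = Counter()
    for (a, _), cnt in o.items():
        n_c[a] += cnt
    n = sum(n_c.values())
    # Observed vs. expected disagreement.
    d_o = sum(cnt for (a, b), cnt in o.items() if a != b) / n
    d_e = sum(n_c[a] * n_c[b] for a in n_c for b in n_c if a != b) / (n * (n - 1))
    return 1.0 if d_e == 0 else 1 - d_o / d_e
```

Perfect agreement yields alpha = 1, chance-level agreement about 0, and systematic disagreement negative values, which is why the Table 9 figure of .09 reads as essentially no agreement beyond chance.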
Affiliation(s)
- Nikki M Watson
- Department of Kinesiology and Public Health, California Polytechnic State University-San Luis Obispo, San Luis Obispo, CA, USA
- Jafrā D Thomas
- Department of Kinesiology and Public Health, California Polytechnic State University-San Luis Obispo, San Luis Obispo, CA, USA
7
Qian W, Zhang C, Piersiak HA, Humphreys KL, Mitchell C. Biomarker adoption in developmental science: A data-driven modelling of trends from 90 biomarkers across 20 years. Infant Child Dev 2024; 33:e2366. [PMID: 38389732 PMCID: PMC10882483 DOI: 10.1002/icd.2366]
Abstract
Developmental scientists have adopted numerous biomarkers in their research to better understand the biological underpinnings of development, environmental exposures, and variation in long-term health. Yet, adoption patterns merit investigation given the substantial resources used to collect, analyse, and train to use biomarkers in research with infants and children. We document trends in use of 90 biomarkers between 2000 and 2020 from approximately 430,000 publications indexed by the Web of Science. We provide a tool for researchers to examine each of these biomarkers individually using a data-driven approach to estimate the biomarker growth trajectory based on yearly publication number, publication growth rate, number of author affiliations, National Institutes of Health dedicated funding resources, journal impact factor, and years since the first publication. Results indicate that most biomarkers fit a "learning curve" trajectory (i.e., experience rapid growth followed by a plateau), though a small subset decline in use over time.
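The "learning curve" trajectory described here, rapid growth followed by a plateau, is commonly modelled with a logistic function. The sketch below only illustrates that functional form; the parameter values (carrying capacity `K`, growth rate `r`, midpoint year `t0`) are hypothetical, not estimates from the paper:

```python
from math import exp

def learning_curve(t, K=500, r=0.6, t0=2008):
    """Logistic 'learning curve': yearly publications rise quickly,
    then plateau near the carrying capacity K."""
    return K / (1 + exp(-r * (t - t0)))

# Few publications early, half of K at the midpoint year, plateau later.
early, mid, late = (learning_curve(t) for t in (2000, 2008, 2020))
```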
Affiliation(s)
- Chao Zhang
- Vanderbilt University, Nashville, Tennessee, USA
8
Dresler M. FENS-Kavli Network of Excellence: Postponed, non-competitive peer review for research funding. Eur J Neurosci 2023; 58:4441-4448. [PMID: 36085597 DOI: 10.1111/ejn.15818]
Abstract
Receiving research grants is among the highlights of an academic career, affirming previous accomplishments and enabling new research endeavours. Much of the process of acquiring research funding, however, is among the less favoured duties of many researchers: It is time consuming, often stressful and, in the majority of cases, unsuccessful. This resentment towards funding acquisition is backed up by empirical research: The current system to distribute research funding, via competitive calls for extensive research applications that undergo peer review, has repeatedly been shown to fail in its task to reliably rank proposals according to their merit, while at the same time being highly inefficient. The simplest, fairest and broadly supported alternative would be to distribute funding more equally across researchers, for example, by an increase of universities' base funding, thereby saving considerable time that can be spent on research instead. Here, I propose how to combine such a 'funding flat rate' model (or other efficient distribution strategies) with quality control through postponed, non-competitive peer review using open science practices.
Affiliation(s)
- Martin Dresler
- Donders Institute for Brain, Cognition and Behaviour, Nijmegen, The Netherlands
- Radboud University Medical Center, Nijmegen, The Netherlands
9
Eidels A. Prior beliefs and the interpretation of scientific results. R Soc Open Sci 2023; 10:231613. [PMID: 38126060 PMCID: PMC10731315 DOI: 10.1098/rsos.231613]
Abstract
How do prior beliefs affect the interpretation of scientific results? I discuss a hypothetical scenario where researchers publish results that could either support a theory they believe in, or refute that theory, and ask if the two instances carry the same weight. More colloquially, I ask if we should overweigh scientific results supporting a given theory and reported by a researcher, or a team, that initially did not support that theory. I illustrate the challenge using two examples from psychology: evidence accumulation models and extrasensory perception.
Affiliation(s)
- Ami Eidels
- School of Psychological Science, The University of Newcastle, Callaghan, New South Wales, Australia
10
Williams AJ, Botanov Y, Giovanetti AK, Perko VL, Sutherland CL, Youngren W, Sakaluk JK. A Metascientific Review of the Evidential Value of Acceptance and Commitment Therapy for Depression. Behav Ther 2023; 54:989-1005. [PMID: 37863589 DOI: 10.1016/j.beth.2022.06.004]
Abstract
In the past three-and-a-half decades, nearly 500 randomized controlled trials (RCTs) have examined Acceptance and Commitment Therapy (ACT) for a range of health problems, including depression. However, emerging concerns regarding the replicability of scientific findings across psychology and mental health treatment outcome research highlight a need to re-examine the strength of evidence for treatment efficacy. Therefore, we conducted a metascientific review of the evidential value of ACT in treating depression. Whereas reporting accuracy was generally high across all trials, we found important differences in evidential value metrics corresponding to the types of control conditions used. RCTs of ACT compared to weaker controls (e.g., no treatment, waitlist) were well-powered, with sample sizes appropriate for detecting plausible effect sizes. They typically yielded stronger Bayesian evidence for (and larger posterior estimates of) ACT efficacy, though there was some evidence of significance inflation among these effects. RCTs of ACT against stronger controls (e.g., other psychotherapies), meanwhile, were poorly powered, designed to detect implausibly large effect sizes, and yielded ambiguous, if not contradictory, Bayesian evidence and estimates of efficacy. Although our review supports a view of ACT as efficacious for treating depression compared to weaker controls, future RCTs must provide more transparent reporting with larger groups of participants to properly assess the difference between ACT and competitor treatments such as behavioral activation and other forms of cognitive behavioral therapy. Clinicians and health organizations should reassess the use of ACT for depression if costs and resources are higher than for other efficacious treatments. Clinical trials contributing effects to our synthesis can be found at https://osf.io/qky35.
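The power critique can be made concrete with the standard normal-approximation sample-size formula for a two-group comparison of means (a sketch; exact t-based calculations, as in Cohen's tables, give marginally larger numbers):

```python
from math import ceil
from statistics import NormalDist

def n_per_group(d, alpha=0.05, power=0.80):
    """Approximate per-group n to detect standardized effect size d
    in a two-sample comparison (normal approximation)."""
    z = NormalDist().inv_cdf
    return ceil(2 * (z(1 - alpha / 2) + z(power)) ** 2 / d ** 2)
```

Detecting an implausibly large d = 0.8 needs only about 25 participants per group, whereas a realistic between-therapy difference of d = 0.2 needs nearly 400 per group, which is why trials designed around large effects are so much smaller and, against plausible effects, underpowered.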
11
Boyce V, Mathur M, Frank MC. Eleven years of student replication projects provide evidence on the correlates of replicability in psychology. R Soc Open Sci 2023; 10:231240. [PMID: 38026006 PMCID: PMC10645069 DOI: 10.1098/rsos.231240]
Abstract
Cumulative scientific progress requires empirical results that are robust enough to support theory construction and extension. Yet in psychology, some prominent findings have failed to replicate, and large-scale studies suggest replicability issues are widespread. The identification of predictors of replication success is limited, however, by the difficulty of conducting large samples of independent replication experiments: most investigations reanalyse the same set of 170 replications. We introduce a new dataset of 176 replications from students in a graduate-level methods course. Replication results were judged to be successful in 49% of replications; of the 136 where effect sizes could be numerically compared, 46% had point estimates within the prediction interval of the original outcome (versus the expected 95%). Larger original effect sizes and within-participants designs were especially related to replication success. Our results indicate that, consistent with prior reports, the robustness of the psychology literature is low enough to limit cumulative progress by student investigators.
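One common formulation of the prediction interval mentioned here combines the original and replication standard errors (this is an illustrative sketch of that formulation, which may differ in detail from the authors' exact procedure):

```python
from math import sqrt

def prediction_interval(effect_orig, se_orig, se_rep, z=1.96):
    """95% prediction interval for a replication's effect estimate,
    given the original estimate and both standard errors."""
    half = z * sqrt(se_orig**2 + se_rep**2)
    return effect_orig - half, effect_orig + half

# A hypothetical original effect of 0.5 (SE 0.15) replicated with SE 0.10
lo, hi = prediction_interval(0.5, 0.15, 0.10)
```

Under this construction about 95% of replication estimates should land inside the interval if the original is unbiased, so the observed 46% is a large shortfall.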
Affiliation(s)
- Veronica Boyce
- Department of Psychology, Department of Medicine, Stanford University, Stanford, CA, USA
- Maya Mathur
- Quantitative Sciences Unit, Department of Medicine, Stanford University, Stanford, CA, USA
- Michael C. Frank
- Department of Psychology, Department of Medicine, Stanford University, Stanford, CA, USA
12
Pownall M, Talbot CV, Kilby L, Branney P. Opportunities, challenges and tensions: Open science through a lens of qualitative social psychology. Br J Soc Psychol 2023; 62:1581-1589. [PMID: 36718588 DOI: 10.1111/bjso.12628]
Abstract
In recent years, there has been a focus in social psychology on efforts to improve the robustness, rigour, transparency and openness of psychological research. This has led to a plethora of new tools, practices and initiatives that each aim to combat questionable research practices and improve the credibility of social psychological scholarship. However, the majority of these efforts derive from quantitative, deductive, hypothesis-testing methodologies, and there has been a notable lack of in-depth exploration about what the tools, practices and values may mean for research that uses qualitative methodologies. Here, we introduce a Special Section of BJSP: Open Science, Qualitative Methods and Social Psychology: Possibilities and Tensions. The authors critically discuss a range of issues, including authorship, data sharing and broader research practices. Taken together, these papers urge the discipline to carefully consider the ontological, epistemological and methodological underpinnings of efforts to improve psychological science, and advocate for a critical appreciation of how mainstream open science discourse may (or may not) be compatible with the goals of qualitative research.
Affiliation(s)
- Laura Kilby
- Department of Psychology, Sociology and Politics, Sheffield Hallam University, Sheffield, UK
- Peter Branney
- Department of Psychology, Faculty of Management, Law & Social Sciences, School of Social Sciences, University of Bradford, Bradford, UK
13
Thompson WH, Skau S. On the scope of scientific hypotheses. R Soc Open Sci 2023; 10:230607. [PMID: 37650069 PMCID: PMC10465209 DOI: 10.1098/rsos.230607]
Abstract
Hypotheses are frequently the starting point when undertaking the empirical portion of the scientific process. They state something that the scientific process will attempt to evaluate, corroborate, verify or falsify. Their purpose is to guide the types of data we collect, analyses we conduct, and inferences we would like to make. Over the last decade, metascience has advocated for including hypotheses in preregistrations or registered reports, but how to formulate these hypotheses has received less attention. Here, we argue that hypotheses can vary in specificity along at least three independent dimensions: the relationship, the variables, and the pipeline. Together, these dimensions form the scope of the hypothesis. We demonstrate how narrowing the scope of a hypothesis in any of these three ways reduces the hypothesis space and that this reduction is a type of novelty. Finally, we discuss how this formulation can guide researchers to choose the appropriate scope for their hypotheses, aiming for neither too broad nor too narrow a scope. This framework can assist hypothesis-makers by helping clarify what is being tested, chaining results to previously known findings, and demarcating what is explicitly tested in the hypothesis.
Affiliation(s)
- William Hedley Thompson
- Department of Applied Information Technology, University of Gothenburg, Gothenburg, Sweden
- Institute of Neuroscience and Physiology, Sahlgrenska Academy, University of Gothenburg, Gothenburg, Sweden
- Simon Skau
- Department of Pedagogical, Curricular and Professional Studies, Faculty of Education, University of Gothenburg, Gothenburg, Sweden
- Department of Clinical Neuroscience, Karolinska Institutet, Stockholm, Sweden
14
Clark CJ, Connor P, Isch C. Failing to replicate predicts citation declines in psychology. Proc Natl Acad Sci U S A 2023; 120:e2304862120. [PMID: 37428904 PMCID: PMC10629524 DOI: 10.1073/pnas.2304862120]
Abstract
With a sample of 228 psychology papers that failed to replicate, we tested whether the trajectory of citation patterns changes following the publication of a failure to replicate. Across models, we found consistent evidence that failing to replicate predicted lower future citations and that the size of this reduction increased over time. In a 14-y postpublication period, we estimated that the publication of a failed replication was associated with an average citation decline of 14% for original papers. These findings suggest that the publication of failed replications may contribute to a self-correcting science by decreasing scholars' reliance on unreplicable original findings.
Affiliation(s)
- Cory J. Clark
- The Wharton School, University of Pennsylvania, Philadelphia, PA 19104
- School of Arts and Sciences, University of Pennsylvania, Philadelphia, PA 19104
- Paul Connor
- School of Arts and Sciences, University of Pennsylvania, Philadelphia, PA 19104
- Calvin Isch
- The Wharton School, University of Pennsylvania, Philadelphia, PA 19104
15
Stafford T, Rombach I, Hind D, Mateen B, Woods HB, Dimario M, Wilsdon J. Where next for partial randomisation of research funding? The feasibility of RCTs and alternatives. Wellcome Open Res 2023; 8:309. [PMID: 37663796 PMCID: PMC10474338 DOI: 10.12688/wellcomeopenres.19565.1]
Abstract
We outline essential considerations for any study of partial randomisation of research funding, and consider scenarios in which randomised controlled trials (RCTs) would be feasible and appropriate. We highlight the interdependence of target outcomes, sample availability and statistical power for determining the cost and feasibility of a trial. For many choices of target outcome, RCTs may be less practical and more expensive than they at first appear (in large part due to issues pertaining to sample size and statistical power). As such, we briefly discuss alternatives to RCTs. It is worth noting that many of the considerations relevant to experiments on partial randomisation may also apply to other potential experiments on funding processes (as described in The Experimental Research Funder's Handbook, RoRI, June 2022).
Affiliation(s)
- Tom Stafford: The University of Sheffield, Sheffield, England, UK
- Ines Rombach: The University of Sheffield, Sheffield, England, UK
- Dan Hind: The University of Sheffield, Sheffield, England, UK

16
Huber C, Dreber A, Huber J, Johannesson M, Kirchler M, Weitzel U, Abellán M, Adayeva X, Ay FC, Barron K, Berry Z, Bönte W, Brütt K, Bulutay M, Campos-Mercade P, Cardella E, Claassen MA, Cornelissen G, Dawson IGJ, Delnoij J, Demiral EE, Dimant E, Doerflinger JT, Dold M, Emery C, Fiala L, Fiedler S, Freddi E, Fries T, Gasiorowska A, Glogowsky U, M Gorny P, Gretton JD, Grohmann A, Hafenbrädl S, Handgraaf M, Hanoch Y, Hart E, Hennig M, Hudja S, Hütter M, Hyndman K, Ioannidis K, Isler O, Jeworrek S, Jolles D, Juanchich M, Kc RP, Khadjavi M, Kugler T, Li S, Lucas B, Mak V, Mechtel M, Merkle C, Meyers EA, Mollerstrom J, Nesterov A, Neyse L, Nieken P, Nussberger AM, Palumbo H, Peters K, Pirrone A, Qin X, Rahal RM, Rau H, Rincke J, Ronzani P, Roth Y, Saral AS, Schmitz J, Schneider F, Schram A, Schudy S, Schweitzer ME, Schwieren C, Scopelliti I, Sirota M, Sonnemans J, Soraperra I, Spantig L, Steimanis I, Steinmetz J, Suetens S, Theodoropoulou A, Urbig D, Vorlaufer T, Waibel J, Woods D, Yakobi O, Yilmaz O, Zaleskiewicz T, Zeisberger S, Holzmeister F. Competition and moral behavior: A meta-analysis of forty-five crowd-sourced experimental designs. Proc Natl Acad Sci U S A 2023; 120:e2215572120. [PMID: 37252958 DOI: 10.1073/pnas.2215572120] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [Key Words] [Grants] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 06/01/2023] Open
Abstract
Does competition affect moral behavior? This fundamental question has been debated among leading scholars for centuries, and more recently, it has been tested in experimental studies yielding a body of rather inconclusive empirical evidence. A potential source of ambivalent empirical results on the same hypothesis is design heterogeneity-variation in true effect sizes across various reasonable experimental research protocols. To provide further evidence on whether competition affects moral behavior and to examine whether the generalizability of a single experimental study is jeopardized by design heterogeneity, we invited independent research teams to contribute experimental designs to a crowd-sourced project. In a large-scale online data collection, 18,123 experimental participants were randomly allocated to 45 randomly selected experimental designs out of 95 submitted designs. We find a small adverse effect of competition on moral behavior in a meta-analysis of the pooled data. The crowd-sourced design of our study allows for a clean identification and estimation of the variation in effect sizes above and beyond what could be expected due to sampling variance. We find substantial design heterogeneity-estimated to be about 1.6 times as large as the average standard error of effect size estimates of the 45 research designs-indicating that the informativeness and generalizability of results based on a single experimental design are limited. Drawing strong conclusions about the underlying hypotheses in the presence of substantive design heterogeneity requires moving toward much larger data collections on various experimental designs testing the same hypothesis.
Affiliation(s)
- Christoph Huber: Institute for Markets and Strategy, WU Vienna University of Economics and Business, Vienna, Austria
- Anna Dreber: Department of Economics, Stockholm School of Economics, Stockholm, Sweden; Department of Economics, University of Innsbruck, Innsbruck, Austria
- Jürgen Huber: Department of Banking and Finance, University of Innsbruck, Innsbruck, Austria
- Magnus Johannesson: Department of Economics, Stockholm School of Economics, Stockholm, Sweden
- Michael Kirchler: Department of Banking and Finance, University of Innsbruck, Innsbruck, Austria
- Utz Weitzel: Department of Finance, School of Business and Economics, Vrije Universiteit Amsterdam, Amsterdam, The Netherlands; Department of Economics and Business Economics, Nijmegen School of Management, Radboud University, Nijmegen, The Netherlands; Tinbergen Institute, Amsterdam, The Netherlands
- Miguel Abellán: School of Public Affairs, Leuphana University Lueneburg, Lueneburg, Germany
- Fehime Ceren Ay: Telenor Research, Telenor Group, Oslo, Norway; FAIR - The Choice Lab, Norwegian School of Economics, Bergen, Norway
- Kai Barron: WZB Berlin Social Science Center, Berlin, Germany
- Zachariah Berry: Department of Organizational Behavior, Industrial and Labor Relations School, Cornell University, Ithaca, NY
- Werner Bönte: Schumpeter School of Business and Economics, University of Wuppertal, Wuppertal, Germany; Institute for Development Strategies, Indiana University Bloomington, Bloomington, IN
- Katharina Brütt: Amsterdam School of Economics, University of Amsterdam, Amsterdam, The Netherlands
- Eric Cardella: Rawls College of Business, Texas Tech University, Lubbock, TX
- Gert Cornelissen: Department of Economics and Business, Universitat Pompeu Fabra, Barcelona, Spain; UPF Barcelona School of Management, Barcelona, Spain
- Ian G J Dawson: Centre for Risk Research, University of Southampton, Southampton, United Kingdom
- Joyce Delnoij: Section Economics, Wageningen University, Wageningen, The Netherlands
- Elif E Demiral: Department of Accounting, Finance and Economics, Austin Peay State University, Clarksville, TN; Women and Public Policy Program, Harvard University, Cambridge, MA
- Cécile Emery: University of Exeter Business School, Exeter, UK
- Lenka Fiala: Department of Economics, University of Bergen, Bergen, Norway
- Susann Fiedler: Institute for Cognition and Behavior, WU Vienna University of Economics and Business, Vienna, Austria
- Eleonora Freddi: Telenor Research, Telenor Group, Oslo, Norway; FAIR - The Choice Lab, Norwegian School of Economics, Bergen, Norway
- Tilman Fries: WZB Berlin Social Science Center, Berlin, Germany
- Agata Gasiorowska: Center for Research in Economic Behavior, Institute of Psychology, SWPS University of Social Sciences and Humanities, Wroclaw, Poland
- Ulrich Glogowsky: Department of Economics, Johannes Kepler University Linz, Linz, Austria
- Paul M Gorny: Department of Economics and Management, Karlsruhe Institute of Technology, Karlsruhe, Germany
- Antonia Grohmann: Department of Economics and Business Economics, Aarhus University, Denmark; Danish Finance Institute, Denmark
- Sebastian Hafenbrädl: Managing People in Organizations Department, IESE Business School, Barcelona, Spain
- Michel Handgraaf: Section Economics, Wageningen University, Wageningen, The Netherlands; AMS Institute, Amsterdam, The Netherlands
- Yaniv Hanoch: Centre for Risk Research, University of Southampton, Southampton, United Kingdom
- Einav Hart: School of Business, George Mason University, Fairfax, VA
- Max Hennig: Psychology Department, Eberhard Karls Universität Tübingen, Tübingen, Germany
- Stanton Hudja: Hankamer School of Business, Baylor University, Waco, TX
- Mandy Hütter: Psychology Department, Eberhard Karls Universität Tübingen, Tübingen, Germany
- Ozan Isler: School of Economics, University of Queensland, St Lucia, Australia
- Sabrina Jeworrek: Faculty of Economics and Management, Otto von Guericke University Magdeburg, Magdeburg, Germany; Halle Institute for Economic Research, Halle (Saale), Germany
- Daniel Jolles: Department of Psychology, University of Essex, Colchester, United Kingdom
- Marie Juanchich: Department of Psychology, University of Essex, Colchester, United Kingdom
- Menusch Khadjavi: Tinbergen Institute, Amsterdam, The Netherlands; Department of Spatial Economics, School of Business and Economics, Vrije Universiteit Amsterdam, The Netherlands; Kiel Institute for the World Economy, Kiel, Germany
- Tamar Kugler: Department of Management and Organizations, University of Arizona, Tucson, AZ
- Shuwen Li: Antai College of Economics and Management, Shanghai Jiao Tong University, Shanghai, China
- Brian Lucas: Department of Organizational Behavior, Industrial and Labor Relations School, Cornell University, Ithaca, NY
- Vincent Mak: Cambridge Judge Business School, Cambridge, United Kingdom
- Mario Mechtel: School of Public Affairs, Leuphana University Lueneburg, Lueneburg, Germany
- Christoph Merkle: Department of Economics and Business Economics, Aarhus University, Denmark; Danish Finance Institute, Denmark
- Johanna Mollerstrom: Interdisciplinary Center for Economic Science, George Mason University, Fairfax, VA; Research Institute for Industrial Economics (IFN), Stockholm, Sweden
- Levent Neyse: WZB Berlin Social Science Center, Berlin, Germany; DIW, Berlin, Germany
- Petra Nieken: Department of Economics and Management, Karlsruhe Institute of Technology, Karlsruhe, Germany; CESifo, Munich, Germany
- Anne-Marie Nussberger: Center for Humans and Machines, Max Planck Institute for Human Development, Berlin, Germany
- Helena Palumbo: Department of Economics and Business, Universitat Pompeu Fabra, Barcelona, Spain
- Kim Peters: University of Exeter Business School, Exeter, UK
- Angelo Pirrone: Centre for Philosophy of Natural and Social Science, London School of Economics and Political Science, London, United Kingdom
- Xiangdong Qin: Antai College of Economics and Management, Shanghai Jiao Tong University, Shanghai, China
- Rima Maria Rahal: Max Planck Institute for Research on Collective Goods, Bonn, Germany
- Holger Rau: University of Göttingen, Göttingen, Germany
- Johannes Rincke: Friedrich-Alexander-Universität Erlangen-Nürnberg, Nürnberg, Germany
- Piero Ronzani: International Security and Development Center, Berlin, Germany
- Jan Schmitz: Department of Economics and Business Economics, Nijmegen School of Management, Radboud University, Nijmegen, The Netherlands
- Arthur Schram: Amsterdam School of Economics, University of Amsterdam, Amsterdam, The Netherlands
- Simeon Schudy: CESifo, Munich, Germany; Department of Economics, LMU Munich, Munich, Germany
- Irene Scopelliti: Bayes Business School, City University of London, London, United Kingdom
- Miroslav Sirota: Department of Psychology, University of Essex, Colchester, United Kingdom
- Joep Sonnemans: CREED, University of Amsterdam, Amsterdam, The Netherlands
- Ivan Soraperra: CREED, University of Amsterdam, Amsterdam, The Netherlands
- Lisa Spantig: School of Business and Economics, RWTH Aachen University, Aachen, Germany; Department of Economics, University of Essex, Colchester, United Kingdom
- Ivo Steimanis: Working Group Sustainable Use of Natural Resources, University of Marburg, Germany
- Janina Steinmetz: Bayes Business School, City University of London, London, United Kingdom
- Sigrid Suetens: Department of Economics, Tilburg University, Tilburg, The Netherlands
- Diemo Urbig: Institute for Development Strategies, Indiana University Bloomington, Bloomington, IN; Institute of Business and Economics, Brandenburg University of Technology Cottbus-Senftenberg, Germany
- Tobias Vorlaufer: Institute of Environmental Systems Research and Faculty of Economics and Business Administration, Osnabruck University, Osnabruck, Germany
- Joschka Waibel: Department of Economics, University of Essex, Colchester, United Kingdom
- Daniel Woods: Department of Economics, University of Innsbruck, Innsbruck, Austria
- Ofir Yakobi: Department of Psychology, University of Waterloo, Waterloo, Canada
- Onurcan Yilmaz: Department of Psychology, Kadir Has University, Istanbul, Turkey
- Tomasz Zaleskiewicz: Center for Research in Economic Behavior, Institute of Psychology, SWPS University of Social Sciences and Humanities, Wroclaw, Poland
- Stefan Zeisberger: Department of Economics and Business Economics, Nijmegen School of Management, Radboud University, Nijmegen, The Netherlands; Department of Banking and Finance, University of Zürich, Zürich, Switzerland
- Felix Holzmeister: Department of Economics, University of Innsbruck, Innsbruck, Austria

17
Wintle BC, Smith ET, Bush M, Mody F, Wilkinson DP, Hanea AM, Marcoci A, Fraser H, Hemming V, Thorn FS, McBride MF, Gould E, Head A, Hamilton DG, Kambouris S, Rumpff L, Hoekstra R, Burgman MA, Fidler F. Predicting and reasoning about replicability using structured groups. R Soc Open Sci 2023; 10:221553. [PMID: 37293358 PMCID: PMC10245209 DOI: 10.1098/rsos.221553] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [Key Words] [Grants] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Received: 12/09/2022] [Accepted: 04/14/2023] [Indexed: 06/10/2023]
Abstract
This paper explores judgements about the replicability of social and behavioural sciences research and what drives those judgements. Using a mixed methods approach, it draws on qualitative and quantitative data elicited from groups using a structured approach called the IDEA protocol ('investigate', 'discuss', 'estimate' and 'aggregate'). Five groups of five people with relevant domain expertise evaluated 25 research claims that were subject to at least one replication study. Participants assessed the probability that each of the 25 research claims would replicate (i.e. that a replication study would find a statistically significant result in the same direction as the original study) and described the reasoning behind those judgements. We quantitatively analysed possible correlates of predictive accuracy, including self-rated expertise and updating of judgements after feedback and discussion. We qualitatively analysed the reasoning data to explore the cues, heuristics and patterns of reasoning used by participants. Participants achieved 84% classification accuracy in predicting replicability. Those who engaged in a greater breadth of reasoning provided more accurate replicability judgements. Some reasons were more commonly invoked by more accurate participants, such as 'effect size' and 'reputation' (e.g. of the field of research). There was also some evidence of a relationship between statistical literacy and accuracy.
Affiliation(s)
- Bonnie C. Wintle: MetaMelb Research Initiative, School of Ecosystem and Forest Sciences, University of Melbourne, Parkville 3010, Australia
- Eden T. Smith: MetaMelb Research Initiative, School of Historical and Philosophical Studies, University of Melbourne, Parkville 3010, Australia
- Martin Bush: MetaMelb Research Initiative, School of Historical and Philosophical Studies, University of Melbourne, Parkville 3010, Australia
- Fallon Mody: MetaMelb Research Initiative, School of Historical and Philosophical Studies, University of Melbourne, Parkville 3010, Australia
- David P. Wilkinson: MetaMelb Research Initiative, School of Ecosystem and Forest Sciences, University of Melbourne, Parkville 3010, Australia
- Anca M. Hanea: MetaMelb Research Initiative, School of Ecosystem and Forest Sciences, University of Melbourne, Parkville 3010, Australia; Centre of Excellence for Biosecurity Risk Analysis, School of BioSciences, University of Melbourne, Parkville 3010, Australia
- Alexandru Marcoci: Centre for the Study of Existential Risk, University of Cambridge, Cambridge, UK
- Hannah Fraser: MetaMelb Research Initiative, School of Ecosystem and Forest Sciences, University of Melbourne, Parkville 3010, Australia
- Victoria Hemming: Martin Conservation Decisions Lab, Department of Forest and Conservation Sciences, University of British Columbia, Vancouver, Canada
- Felix Singleton Thorn: School of Psychological Sciences, University of Melbourne, Parkville 3010, Australia
- Marissa F. McBride: MetaMelb Research Initiative, School of Ecosystem and Forest Sciences, University of Melbourne, Parkville 3010, Australia; Centre for Environmental Policy, Imperial College London, London, UK
- Elliot Gould: MetaMelb Research Initiative, School of Ecosystem and Forest Sciences, University of Melbourne, Parkville 3010, Australia
- Andrew Head: MetaMelb Research Initiative, School of Historical and Philosophical Studies, University of Melbourne, Parkville 3010, Australia
- Daniel G. Hamilton: MetaMelb Research Initiative, School of Historical and Philosophical Studies, University of Melbourne, Parkville 3010, Australia
- Steven Kambouris: MetaMelb Research Initiative, School of Ecosystem and Forest Sciences, University of Melbourne, Parkville 3010, Australia
- Libby Rumpff: MetaMelb Research Initiative, School of Ecosystem and Forest Sciences, University of Melbourne, Parkville 3010, Australia
- Rink Hoekstra: Department of Pedagogical and Educational Sciences, University of Groningen, Groningen, The Netherlands
- Mark A. Burgman: Centre for Environmental Policy, Imperial College London, London, UK
- Fiona Fidler: MetaMelb Research Initiative, School of Ecosystem and Forest Sciences, University of Melbourne, Parkville 3010, Australia; MetaMelb Research Initiative, School of Historical and Philosophical Studies, University of Melbourne, Parkville 3010, Australia

18
Clark CJ, Graso M, Redstone I, Tetlock PE. Harm Hypervigilance in Public Reactions to Scientific Evidence. Psychol Sci 2023:9567976231168777. [PMID: 37260038 DOI: 10.1177/09567976231168777] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [Key Words] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 06/02/2023] Open
Abstract
Two preregistered studies from two different platforms with representative U.S. adult samples (N = 1,865) tested the harm-hypervigilance hypothesis in risk assessments of controversial behavioral science. As expected, across six sets of scientific findings, people consistently overestimated others' harmful reactions (medium to large average effect sizes) and underestimated helpful ones, even when incentivized for accuracy. Additional analyses found that (a) harm overestimations were associated with support for censoring science, (b) people who were more offended by scientific findings reported greater difficulty understanding them, and (c) evidence was moderately consistent for an association between more conservative ideology and harm overestimations. These findings are particularly relevant because journals have begun evaluating potential downstream harms of scientific findings. We discuss implications of our work and invite scholars to develop rigorous tests of (a) the social pressures that lead science astray and (b) the actual costs and benefits of publishing or not publishing potentially controversial conclusions.
Affiliation(s)
- Cory J Clark: Department of Psychology, University of Pennsylvania; Management Department, The Wharton School, University of Pennsylvania
- Maja Graso: Department of Organizational Psychology, University of Groningen
- Ilana Redstone: Department of Sociology, University of Illinois Urbana-Champaign
- Philip E Tetlock: Department of Psychology, University of Pennsylvania; Management Department, The Wharton School, University of Pennsylvania

19
Buzbas EO, Devezer B, Baumgaertner B. The logical structure of experiments lays the foundation for a theory of reproducibility. R Soc Open Sci 2023; 10:221042. [PMID: 36938532 PMCID: PMC10014247 DOI: 10.1098/rsos.221042] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [Key Words] [Grants] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Received: 08/11/2022] [Accepted: 02/02/2023] [Indexed: 06/18/2023]
Abstract
The scientific reform movement has proposed openness as a potential remedy to the putative reproducibility or replication crisis. However, the conceptual relationship among openness, replication experiments and results reproducibility has been obscure. We analyse the logical structure of experiments, define the mathematical notion of idealized experiment and use this notion to advance a theory of reproducibility. Idealized experiments clearly delineate the concepts of replication and results reproducibility, and capture key differences with precision, allowing us to study the relationship among them. We show how results reproducibility varies as a function of the elements of an idealized experiment, the true data-generating mechanism, and the closeness of the replication experiment to an original experiment. We clarify how openness of experiments is related to designing informative replication experiments and to obtaining reproducible results. With formal backing and evidence, we argue that the current 'crisis' reflects inadequate attention to a theoretical understanding of results reproducibility.
Affiliation(s)
- Erkan O. Buzbas: Department of Mathematics and Statistical Science, University of Idaho, Moscow, ID 83844, USA
- Berna Devezer: Department of Mathematics and Statistical Science, University of Idaho, Moscow, ID 83844, USA; Department of Business, University of Idaho, Moscow, ID 83844, USA
- Bert Baumgaertner: Department of Politics and Philosophy, University of Idaho, Moscow, ID 83844, USA

20
Kummerfeld E, Jones GL. One data set, many analysts: Implications for practicing scientists. Front Psychol 2023; 14:1094150. [PMID: 36865366 PMCID: PMC9971968 DOI: 10.3389/fpsyg.2023.1094150] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/09/2022] [Accepted: 01/27/2023] [Indexed: 02/16/2023] Open
Abstract
Researchers routinely face choices throughout the data analysis process. It is often opaque to readers how these choices are made, how they affect the findings, and whether or not data analysis results are unduly influenced by subjective decisions. This concern is spurring numerous investigations into the variability of data analysis results. The findings demonstrate that different teams analyzing the same data may reach different conclusions. This is the "many-analysts" problem. Previous research on the many-analysts problem focused on demonstrating its existence, without identifying specific practices for solving it. We address this gap by identifying three pitfalls that have contributed to the variability observed in many-analysts publications and providing suggestions on how to avoid them.
Affiliation(s)
- Erich Kummerfeld: Institute for Health Informatics, University of Minnesota, Minneapolis, MN, United States
- Galin L. Jones: School of Statistics, University of Minnesota, Minneapolis, MN, United States

21
Kekecs Z, Palfi B, Szaszi B, Szecsi P, Zrubka M, Kovacs M, Bakos BE, Cousineau D, Tressoldi P, Schmidt K, Grassi M, Evans TR, Yamada Y, Miller JK, Liu H, Yonemitsu F, Dubrov D, Röer JP, Becker M, Schnepper R, Ariga A, Arriaga P, Oliveira R, Põldver N, Kreegipuu K, Hall B, Wiechert S, Verschuere B, Girán K, Aczel B. Raising the value of research studies in psychological science by increasing the credibility of research reports: the transparent Psi project. R Soc Open Sci 2023; 10:191375. [PMID: 36756055 PMCID: PMC9890107 DOI: 10.1098/rsos.191375] [Citation(s) in RCA: 3] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [Key Words] [Grants] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Received: 08/07/2019] [Accepted: 01/11/2023] [Indexed: 06/18/2023]
Abstract
The low reproducibility rate in social sciences has produced hesitation among researchers in accepting published findings at their face value. Despite the advent of initiatives to increase transparency in research reporting, the field is still lacking tools to verify the credibility of research reports. In the present paper, we describe methodologies that let researchers craft highly credible research and allow their peers to verify this credibility. We demonstrate the application of these methods in a multi-laboratory replication of Bem's Experiment 1 (Bem 2011 J. Pers. Soc. Psychol. 100, 407-425. (doi:10.1037/a0021524)) on extrasensory perception (ESP), which was co-designed by a consensus panel including both proponents and opponents of Bem's original hypothesis. In the study we applied direct data deposition in combination with born-open data and real-time research reports to extend transparency to protocol delivery and data collection. We also used piloting, checklists, laboratory logs and video-documented trial sessions to ascertain as-intended protocol delivery, and external research auditors to monitor research integrity. We found 49.89% successful guesses, while Bem reported 53.07% success rate, with the chance level being 50%. Thus, Bem's findings were not replicated in our study. In the paper, we discuss the implementation, feasibility and perceived usefulness of the credibility-enhancing methodologies used throughout the project.
Affiliation(s)
- Zoltan Kekecs: Institute of Psychology, ELTE, Eotvos Lorand University, Izabella u 46. 1064, Budapest, Hungary; Department of Psychology, Lund University, Box 213, Lund 221 00, Sweden
- Bence Palfi: Department of Surgery and Cancer, Imperial College London, London, UK
- Barnabas Szaszi: Institute of Psychology, ELTE, Eotvos Lorand University, Izabella u 46. 1064, Budapest, Hungary
- Peter Szecsi: Institute of Psychology, ELTE, Eotvos Lorand University, Izabella u 46. 1064, Budapest, Hungary
- Mark Zrubka: Department of Psychology, University of Amsterdam, P.O. Box 19268, 1000 GG Amsterdam, The Netherlands
- Marton Kovacs: Institute of Psychology, ELTE, Eotvos Lorand University, Izabella u 46. 1064, Budapest, Hungary
- Bence E. Bakos: Institute of Psychology, ELTE, Eotvos Lorand University, Izabella u 46. 1064, Budapest, Hungary
- Denis Cousineau: École de psychologie, University of Ottawa, 136, Jean-Jacques Lussier, Ontario, Canada, K1N 6N5
- Patrizio Tressoldi: Studium Patavinum, Università di Padova via Venezia 8, 35131 Padova, Italy
- Kathleen Schmidt: Department of Psychology, Ashland University, Ashland, OH 44805, USA; School of Psychological & Behavioral Sciences, Southern Illinois University, Carbondale, USA
- Massimo Grassi: Dipartimento di Psicologia Generale, Università di Padova via Venezia 8, 35131 Padova, Italy
- Yuki Yamada: Faculty of Arts and Science, Kyushu University, 744 Motooka, Nishi-ku, Fukuoka 819-0395, Japan
- Jeremy K. Miller: Department of Psychology, Willamette University, 900 State Street, Salem, OR 97301, USA
- Huanxu Liu: Graduate School of Human-Environment Studies, Kyushu University, 744 Motooka, Nishi-ku, Fukuoka 819-0395, Japan
- Fumiya Yonemitsu: Faculty of Letters, Chuo University, Hachioji, Tokyo 192-0393, Japan
- Dmitrii Dubrov: National Research University Higher School of Economics, Russian Federation
- Jan Philipp Röer: Department of Psychology and Psychotherapy, Witten/Herdecke University, Witten, Germany
- Marvin Becker: Department of Psychology and Psychotherapy, Witten/Herdecke University, Witten, Germany
- Roxane Schnepper: Department of Psychology and Psychotherapy, Witten/Herdecke University, Witten, Germany
- Atsunori Ariga: Faculty of Letters, Chuo University, Hachioji, Tokyo 192-0393, Japan
- Patrícia Arriaga: Iscte-University Institute of Lisbon, CIS_Iscte, Av. das Forças Armadas, 1649-026, Lisbon, Portugal
- Raquel Oliveira: Iscte-University Institute of Lisbon, CIS_Iscte, Av. das Forças Armadas, 1649-026, Lisbon, Portugal
- Nele Põldver: University of Tartu Institute of Psychology, Estonia
- Braeden Hall: School of Psychological & Behavioral Sciences, Southern Illinois University, Carbondale, USA
- Sera Wiechert: University of Amsterdam, Amsterdam, The Netherlands; Hebrew University of Jerusalem, Jerusalem, Israel
- Kyra Girán: Institute of Psychology, ELTE, Eotvos Lorand University, Izabella u 46. 1064, Budapest, Hungary
- Balazs Aczel: Institute of Psychology, ELTE, Eotvos Lorand University, Izabella u 46. 1064, Budapest, Hungary

22
Dresler M, Buddeberg E, Endesfelder U, Haaker J, Hof C, Kretschmer R, Pflüger D, Schmidt F. Effective or predatory funding? Evaluating the hidden costs of grant applications. Immunol Cell Biol 2023; 101:104-111. [PMID: 36214095 DOI: 10.1111/imcb.12592] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/19/2022] [Revised: 10/05/2022] [Accepted: 10/07/2022] [Indexed: 12/04/2022]
Abstract
Researchers are spending an increasing fraction of their time on applying for funding; however, the current funding system has considerable deficiencies in reliably evaluating the merit of research proposals, despite extensive efforts on the sides of applicants, grant reviewers and decision committees. For some funding schemes, the systemic costs of the application process as a whole can even outweigh the granted resources-a phenomenon that could be considered as predatory funding. We present five recommendations to remedy this unsatisfactory situation.
Affiliation(s)
- Martin Dresler: Radboud University Medical Center, Donders Institute for Brain, Cognition and Behavior, Nijmegen, The Netherlands
- Jan Haaker: University Medical Center Hamburg-Eppendorf, Hamburg, Germany
- Christian Hof: Terrestrial Ecology Research Group, TUM School of Life Sciences, Technical University of Munich, Freising, Germany
- Fabian Schmidt: Max Planck Institute for Astrophysics, Garching, Germany

23
Kohrt F, Smaldino PE, McElreath R, Schönbrodt F. Replication of the natural selection of bad science. R Soc Open Sci 2023; 10:221306. [PMID: 36844805 PMCID: PMC9943874 DOI: 10.1098/rsos.221306] [Citation(s) in RCA: 3] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [Key Words] [Grants] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Received: 10/12/2022] [Accepted: 01/20/2023] [Indexed: 06/18/2023]
Abstract
This study reports an independent replication of the findings presented by Smaldino and McElreath (Smaldino, McElreath 2016 R. Soc. Open Sci. 3, 160384 (doi:10.1098/rsos.160384)). The replication was successful with one exception. We find that selection acting on scientist's propensity for replication frequency caused a brief period of exuberant replication not observed in the original paper due to a coding error. This difference does not, however, change the authors' original conclusions. We call for more replication studies for simulations as unique contributions to scientific quality assurance.
24
Murphy G, Maher J, Ballantyne L, Barrett E, Cowman CS, Dawson CA, Huston C, Ryan KM, Greene CM. How do participants feel about the ethics of rich false memory studies? Memory 2023; 31:474-481. [PMID: 36689341 DOI: 10.1080/09658211.2023.2170417]
Abstract
Deception is often a necessity in rich false memory studies, but is this deception acceptable to participants? In the current study, we followed up with 175 participants who had taken part in a replication of the Lost in the Mall childhood false memory study (Loftus & Pickrell, 1995), as either a research subject or a familial informant. We found that both participants and informants were generally very positive about their experience, did not regret taking part, and found the deceptive methods acceptable. Importantly, the vast majority reported that they would still have taken part had they known the true objectives from the beginning. Participants also reported learning something interesting about memory and enjoying the nostalgia and family discussions that were prompted by the study. We would encourage other researchers to assess the ethical implications of false memory research paradigms and to incorporate the valuable feedback from participants and informants.
25
Gainsburg I, Pauer S, Abboub N, Aloyo ET, Mourrat JC, Cristia A. How Effective Altruism Can Help Psychologists Maximize Their Impact. Perspect Psychol Sci 2023; 18:239-253. [PMID: 35981321 DOI: 10.1177/17456916221079596]
Abstract
Although many psychologists are interested in making the world a better place through their work, they are often unable to have the impact that they would like. Here, we suggest that both individuals and psychology as a field can better improve human welfare by incorporating ideas from effective altruism, a growing movement whose members aim to do the most good by using science and reason to inform their efforts. In this article, we first briefly introduce effective altruism and review important principles that can be applied to how psychologists approach their work, such as the importance, tractability, and neglectedness framework. We then review how effective altruism can inform individual psychologists' choices. Finally, we close with a discussion of ideas for how psychology, as a field, can increase its positive impact. By applying insights from effective altruism to psychological science, we aim to integrate a new theoretical framework into psychological science, stimulate new areas of research, start a discussion on how psychology can maximize its impact, and inspire the psychology community to do the most good.
26
Breznau N, Rinke EM, Wuttke A, Nguyen HHV, Adem M, Adriaans J, Alvarez-Benjumea A, Andersen HK, Auer D, Azevedo F, Bahnsen O, Balzer D, Bauer G, Bauer PC, Baumann M, Baute S, Benoit V, Bernauer J, Berning C, Berthold A, Bethke FS, Biegert T, Blinzler K, Blumenberg JN, Bobzien L, Bohman A, Bol T, Bostic A, Brzozowska Z, Burgdorf K, Burger K, Busch KB, Carlos-Castillo J, Chan N, Christmann P, Connelly R, Czymara CS, Damian E, Ecker A, Edelmann A, Eger MA, Ellerbrock S, Forke A, Forster A, Gaasendam C, Gavras K, Gayle V, Gessler T, Gnambs T, Godefroidt A, Grömping M, Groß M, Gruber S, Gummer T, Hadjar A, Heisig JP, Hellmeier S, Heyne S, Hirsch M, Hjerm M, Hochman O, Hövermann A, Hunger S, Hunkler C, Huth N, Ignácz ZS, Jacobs L, Jacobsen J, Jaeger B, Jungkunz S, Jungmann N, Kauff M, Kleinert M, Klinger J, Kolb JP, Kołczyńska M, Kuk J, Kunißen K, Kurti Sinatra D, Langenkamp A, Lersch PM, Löbel LM, Lutscher P, Mader M, Madia JE, Malancu N, Maldonado L, Marahrens H, Martin N, Martinez P, Mayerl J, Mayorga OJ, McManus P, McWagner K, Meeusen C, Meierrieks D, Mellon J, Merhout F, Merk S, Meyer D, Micheli L, Mijs J, Moya C, Neunhoeffer M, Nüst D, Nygård O, Ochsenfeld F, Otte G, Pechenkina AO, Prosser C, Raes L, Ralston K, Ramos MR, Roets A, Rogers J, Ropers G, Samuel R, Sand G, Schachter A, Schaeffer M, Schieferdecker D, Schlueter E, Schmidt R, Schmidt KM, Schmidt-Catran A, Schmiedeberg C, Schneider J, Schoonvelde M, Schulte-Cloos J, Schumann S, Schunck R, Schupp J, Seuring J, Silber H, Sleegers W, Sonntag N, Staudt A, Steiber N, Steiner N, Sternberg S, Stiers D, Stojmenovska D, Storz N, Striessnig E, Stroppe AK, Teltemann J, Tibajev A, Tung B, Vagni G, Van Assche J, van der Linden M, van der Noll J, Van Hootegem A, Vogtenhuber S, Voicu B, Wagemans F, Wehl N, Werner H, Wiernik BM, Winter F, Wolf C, Yamada Y, Zhang N, Ziller C, Zins S, Żółtak T. Observing many researchers using the same data and hypothesis reveals a hidden universe of uncertainty. 
Proc Natl Acad Sci U S A 2022; 119:e2203150119. [PMID: 36306328 PMCID: PMC9636921 DOI: 10.1073/pnas.2203150119]
Abstract
This study explores how researchers' analytical choices affect the reliability of scientific findings. Most discussions of reliability problems in science focus on systematic biases. We broaden the lens to emphasize the idiosyncrasy of conscious and unconscious decisions that researchers make during data analysis. We coordinated 161 researchers in 73 research teams and observed their research decisions as they used the same data to independently test the same prominent social science hypothesis: that greater immigration reduces support for social policies among the public. In this typical case of social science research, research teams reported both widely diverging numerical findings and substantive conclusions despite identical start conditions. Researchers' expertise, prior beliefs, and expectations barely predict the wide variation in research outcomes. More than 95% of the total variance in numerical results remains unexplained even after qualitative coding of all identifiable decisions in each team's workflow. This reveals a universe of uncertainty that remains hidden when considering a single study in isolation. The idiosyncratic nature of how researchers' results and conclusions varied is a previously underappreciated explanation for why many scientific hypotheses remain contested. These results call for greater epistemic humility and clarity in reporting scientific findings.
27
Forscher PS, Wagenmakers EJ, Coles NA, Silan MA, Dutra N, Basnight-Brown D, IJzerman H. The Benefits, Barriers, and Risks of Big-Team Science. Perspect Psychol Sci 2022; 18:607-623. [PMID: 36190899 DOI: 10.1177/17456916221082970]
Abstract
Progress in psychology has been frustrated by challenges concerning replicability, generalizability, strategy selection, inferential reproducibility, and computational reproducibility. Although often discussed separately, these five challenges may share a common cause: insufficient investment of intellectual and nonintellectual resources into the typical psychology study. We suggest that the emerging emphasis on big-team science can help address these challenges by allowing researchers to pool their resources together to increase the amount available for a single study. However, the current incentives, infrastructure, and institutions in academic science have all developed under the assumption that science is conducted by solo principal investigators and their dependent trainees, an assumption that creates barriers to sustainable big-team science. We also anticipate that big-team science carries unique risks, such as the potential for big-team-science organizations to be co-opted by unaccountable leaders, become overly conservative, and make mistakes at a grand scale. Big-team-science organizations must also acquire personnel who are properly compensated and have clear roles. Not doing so raises risks related to mismanagement and a lack of financial sustainability. If researchers can manage its unique barriers and risks, big-team science has the potential to spur great progress in psychology and beyond.
28
Mazor M, Brown S, Ciaunica A, Demertzi A, Fahrenfort J, Faivre N, Francken JC, Lamy D, Lenggenhager B, Moutoussis M, Nizzi MC, Salomon R, Soto D, Stein T, Lubianiker N. The Scientific Study of Consciousness Cannot and Should Not Be Morally Neutral. Perspect Psychol Sci 2022; 18:535-543. [PMID: 36170496 DOI: 10.1177/17456916221110222]
Abstract
A target question for the scientific study of consciousness is how dimensions of consciousness, such as the ability to feel pain and pleasure or reflect on one's own experience, vary in different states and animal species. Considering the tight link between consciousness and moral status, answers to these questions have implications for law and ethics. Here we point out that given this link, the scientific community studying consciousness may face implicit pressure to carry out certain research programs or interpret results in ways that justify current norms rather than challenge them. We show that because consciousness largely determines moral status, the use of nonhuman animals in the scientific study of consciousness introduces a direct conflict between scientific relevance and ethics-the more scientifically valuable an animal model is for studying consciousness, the more difficult it becomes to ethically justify compromises to its well-being for consciousness research. Finally, in light of these considerations, we call for a discussion of the immediate ethical corollaries of the body of knowledge that has accumulated and for a more explicit consideration of the role of ideology and ethics in the scientific study of consciousness.
Affiliation(s)
- Matan Mazor: Department of Psychological Sciences, Birkbeck, University of London; Wellcome Centre for Human Neuroimaging, Institute of Neurology, University College London
- Simon Brown: Department of Philosophy, Johns Hopkins University
- Anna Ciaunica: Centre for Philosophy of Science, University of Lisbon
- Athena Demertzi: Physiology of Cognition, GIGA Consciousness Research Unit, Université de Liège; Fund for Scientific Research, Bruxelles, Belgium
- Johannes Fahrenfort: Department of Psychology, University of Amsterdam; Department of Experimental and Applied Psychology, Vrije Universiteit
- Nathan Faivre: Centre for Neuroprosthetics and Brain Mind Institute, Faculty of Life Sciences, Swiss Federal Institute of Technology; University Grenoble Alpes, University Savoie Mont Blanc, CNRS, LPNC
- Jolien C Francken: Faculty of Philosophy, Theology and Religious Studies, Radboud University
- Dominique Lamy: Sagol School of Neuroscience, Tel Aviv University, Tel-Aviv, Israel; School of Psychological Sciences, Tel Aviv University
- Michael Moutoussis: Wellcome Centre for Human Neuroimaging, Institute of Neurology, University College London; Max Planck-University College London Centre for Computational Psychiatry and Ageing Research, University College London
- Marie-Christine Nizzi: Semel Institute for Neuroscience and Human Behavior, Department of Psychiatry & Biobehavioral Sciences, David Geffen School of Medicine, University of California Los Angeles; Cognitive Science Program, Dartmouth College; Institute for Interdisciplinary Brain and Behavioral Sciences, Chapman University
- Roy Salomon: Gonda Multidisciplinary Brain Research Centre, Bar-Ilan University
- David Soto: Basque Centre on Cognition, Brain and Language, San Sebastian, Spain; Ikerbasque, Basque Foundation for Science, Bilbao, Spain
- Timo Stein: Department of Psychology, University of Amsterdam
- Nitzan Lubianiker: School of Psychological Sciences, Tel Aviv University; Sagol Brain Institute, Tel-Aviv Medical Centre, Tel Aviv, Israel
29
Abstract
In the United States, White samples are often portrayed as if their racial identities were inconsequential to their thoughts, feelings, and behaviors, and research findings derived from White samples are often portrayed as if they were generalizable to all humans. We argue that these and other practices are rooted in a "White = neutral" framework (i.e., the conceptualization of White samples as nonracial). First, we review existing data and present some new data to highlight the scope of the White = neutral framework. Second, we integrate research from across psychological science to argue that the continued use of the White = neutral framework will prevent psychology from becoming a truly objective and inclusive science for at least three reasons: (a) Research with White samples will be valued over research with samples of color, (b) norms that maintain White neutrality will remain unchallenged, and (c) the role of White identity in psychological processes will remain underspecified and underexamined. Third, we provide recommendations for how to move beyond the White = neutral framework in hopes of encouraging all psychological scientists to move toward a White ≠ neutral framework in which all samples are identified for the unique and diverse perspectives that they bring to the world.
Affiliation(s)
- Steven Othello Roberts: Department of Psychology, Center for Comparative Studies in Race and Ethnicity, Stanford University
- Elizabeth Mortenson: Department of Psychology, Center for Comparative Studies in Race and Ethnicity, Stanford University
30
Sosa DN, Altman RB. Contexts and contradictions: a roadmap for computational drug repurposing with knowledge inference. Brief Bioinform 2022; 23:bbac268. [PMID: 35817308] [PMCID: PMC9294417] [DOI: 10.1093/bib/bbac268]
Abstract
The cost of drug development continues to rise and may be prohibitive in cases of unmet clinical need, particularly for rare diseases. Artificial intelligence-based methods are promising in their potential to discover new treatment options. The task of drug repurposing hypothesis generation is well-posed as a link prediction problem in a knowledge graph (KG) of interacting drugs, proteins, genes, and disease phenotypes. KGs derived from biomedical literature are semantically rich and up-to-date representations of scientific knowledge. Inference methods on scientific KGs can be confounded by unspecified contexts and contradictions. Extracting context enables incorporation of relevant pharmacokinetic and pharmacodynamic detail, such as tissue specificity of interactions. Contradictions in biomedical KGs may arise when contexts are omitted or due to contradicting research claims. In this review, we describe challenges to creating literature-scale representations of pharmacological knowledge and survey current approaches toward incorporating context and resolving contradictions.
Affiliation(s)
- Daniel N Sosa: Department of Biomedical Data Science, Stanford University, 443 Via Ortega, 94305, California, USA
- Russ B Altman: Department of Biological Engineering; Department of Genetics; Department of Biomedical Data Science, Stanford University, 443 Via Ortega, 94305, California, USA
31
Ellis RJ. Questionable Research Practices, Low Statistical Power, and Other Obstacles to Replicability: Why Preclinical Neuroscience Research Would Benefit from Registered Reports. eNeuro 2022; 9:ENEURO.0017-22.2022. [PMID: 35922130] [DOI: 10.1523/ENEURO.0017-22.2022]
Abstract
Replicability, the degree to which a previous scientific finding can be repeated in a distinct set of data, has been considered an integral component of institutionalized scientific practice since its inception several hundred years ago. In the past decade, large-scale replication studies have demonstrated that replicability is far from favorable, across multiple scientific fields. Here, I evaluate this literature and describe contributing factors including the prevalence of questionable research practices (QRPs), misunderstanding of p-values, and low statistical power. I subsequently discuss how these issues manifest specifically in preclinical neuroscience research. I conclude that these problems are multifaceted and difficult to solve, relying on the actions of early and late career researchers, funding sources, academic publishers, and others. I assert that any viable solution to the problem of substandard replicability must include changing academic incentives, with adoption of registered reports being the most immediately impactful and pragmatic strategy. For animal research in particular, comprehensive reporting guidelines that document potential sources of sensitivity for experimental outcomes are an essential addition.
32
Roche DG, Berberi I, Dhane F, Lauzon F, Soeharjono S, Dakin R, Binning SA. Slow improvement to the archiving quality of open datasets shared by researchers in ecology and evolution. Proc Biol Sci 2022; 289:20212780. [PMID: 35582791] [PMCID: PMC9114975] [DOI: 10.1098/rspb.2021.2780]
Abstract
Many leading journals in ecology and evolution now mandate open data upon publication. Yet, there is very little oversight to ensure the completeness and reusability of archived datasets, and we currently have a poor understanding of the factors associated with high-quality data sharing. We assessed 362 open datasets linked to first- or senior-authored papers published by 100 principal investigators (PIs) in the fields of ecology and evolution over a period of 7 years to identify predictors of data completeness and reusability (data archiving quality). Datasets scored low on these metrics: 56.4% were complete and 45.9% were reusable. Data reusability, but not completeness, was slightly higher for more recently archived datasets and PIs with less seniority. Journal open data policy, PI gender and PI corresponding author status were unrelated to data archiving quality. However, PI identity explained a large proportion of the variance in data completeness (27.8%) and reusability (22.0%), indicating consistent inter-individual differences in data sharing practices by PIs across time and contexts. Several PIs consistently shared data of either high or low archiving quality, but most PIs were inconsistent in how well they shared. One explanation for the high intra-individual variation we observed is that PIs often conduct research through students and postdoctoral researchers, who may be responsible for the data collection, curation and archiving. Levels of data literacy vary among trainees and PIs may not regularly perform quality control over archived files. Our findings suggest that research data management training and culture within a PI's group are likely to be more important determinants of data archiving quality than other factors such as a journal's open data policy. Greater incentives and training for individual researchers at all career stages could improve data sharing practices and enhance data transparency and reusability.
Affiliation(s)
- Dominique G. Roche: Department of Biology, Carleton University, 1125 Colonel By Drive, Ottawa, Ontario, Canada K1S 5B6; Département de sciences biologiques, Université de Montréal, Montréal, Canada H3C 3J7; Institut de Biologie, Université de Neuchâtel, Neuchâtel 2000, Switzerland
- Ilias Berberi: Department of Biology, Carleton University, 1125 Colonel By Drive, Ottawa, Ontario, Canada K1S 5B6
- Fares Dhane: Département de sciences biologiques, Université de Montréal, Montréal, Canada H3C 3J7
- Félix Lauzon: Département de sciences biologiques, Université de Montréal, Montréal, Canada H3C 3J7; Department of Biology, McGill University, Montréal, Canada H3A 1B1
- Sandrine Soeharjono: Département de sciences biologiques, Université de Montréal, Montréal, Canada H3C 3J7
- Roslyn Dakin: Department of Biology, Carleton University, 1125 Colonel By Drive, Ottawa, Ontario, Canada K1S 5B6
- Sandra A. Binning: Département de sciences biologiques, Université de Montréal, Montréal, Canada H3C 3J7
33
Paret C, Unverhau N, Feingold F, Poldrack RA, Stirner M, Schmahl C, Sicorello M. Survey on Open Science Practices in Functional Neuroimaging. Neuroimage 2022; 257:119306. [PMID: 35595201] [DOI: 10.1016/j.neuroimage.2022.119306]
Abstract
Replicability and reproducibility of scientific findings are paramount for sustainable progress in neuroscience. Preregistration of the hypotheses and methods of an empirical study before analysis, the sharing of primary research data, and compliance with data standards such as the Brain Imaging Data Structure (BIDS) are considered effective practices to secure progress and to substantiate the quality of research. We investigated the current level of adoption of open science practices in neuroimaging and the difficulties that prevent researchers from using them. Email invitations to participate in the survey were sent to addresses received through a PubMed search of human functional magnetic resonance imaging studies that were published between 2010 and 2020. A total of 283 persons completed the questionnaire. Although half of the participants were experienced with preregistration, the willingness to preregister studies in the future was modest. The majority of participants had experience with the sharing of primary neuroimaging data. Most of the participants were interested in implementing a standardized data structure such as BIDS in their labs. Based on demographic variables, we compared participants on seven subscales, which had been generated through factor analysis. Exploratory analyses found that experienced researchers at lower career levels had higher fear of being transparent and researchers with residence in the EU had a higher need for data governance. Additionally, researchers at medical faculties, as compared to other university faculties, reported a more unsupportive supervisor with regard to open science practices and a higher need for data governance. The results suggest growing adoption of open science practices but also highlight a number of important impediments.
Affiliation(s)
- Christian Paret: Department of Psychosomatic Medicine and Psychotherapy, Central Institute of Mental Health Mannheim, Medical Faculty Mannheim / Heidelberg University, Germany; Sagol Brain Institute, Wohl Institute for Advanced Imaging, Tel-Aviv Sourasky Medical Center and School of Psychological Sciences, Tel-Aviv University, Israel
- Nike Unverhau: Department of Psychosomatic Medicine and Psychotherapy, Central Institute of Mental Health Mannheim, Medical Faculty Mannheim / Heidelberg University, Germany
- Madita Stirner: Department of Psychosomatic Medicine and Psychotherapy, Central Institute of Mental Health Mannheim, Medical Faculty Mannheim / Heidelberg University, Germany
- Christian Schmahl: Department of Psychosomatic Medicine and Psychotherapy, Central Institute of Mental Health Mannheim, Medical Faculty Mannheim / Heidelberg University, Germany
- Maurizio Sicorello: Department of Psychosomatic Medicine and Psychotherapy, Central Institute of Mental Health Mannheim, Medical Faculty Mannheim / Heidelberg University, Germany
35
Donegan KR, Gillan CM. New principles and new paths needed for online research in mental health: Commentary on Burnette et al. (2021). Int J Eat Disord 2022; 55:278-281. [PMID: 35005784] [PMCID: PMC9306503] [DOI: 10.1002/eat.23670]
Abstract
Online methods have become a powerful research tool, allowing us to conduct well-powered studies, to explore and replicate effects, and to recruit often rare and diverse samples. However, concerns about the validity and reliability of the data collected from some platforms have reached a crescendo. In this issue, Burnette et al. (2021) describe how commonly employed protective measures such as captchas, response consistency requirements, and attention checks may no longer be sufficient to ensure high-quality data in survey-based studies on Amazon's Mechanical Turk. We echo and elaborate on these concerns, but believe that although imperfect, online research will continue to be incredibly important in driving progress in mental health science. Not all platforms or populations are well suited to every research question and so we posit that the future of online research will be much more varied, and in no small part supported by citizen scientists and those with lived experience. Whatever the medium, researchers cannot stand still; we must continuously reflect and adapt to technological advances, demographics, and motivational shifts of our participants. Online research is difficult but worthwhile.
Affiliation(s)
- Kelly R Donegan: School of Psychology, Trinity College Dublin, Dublin, Ireland; Trinity College Institute of Neuroscience, Trinity College Dublin, Dublin, Ireland
- Claire M Gillan: School of Psychology, Trinity College Dublin, Dublin, Ireland; Trinity College Institute of Neuroscience, Trinity College Dublin, Dublin, Ireland; Global Brain Health Institute, Trinity College Dublin, Dublin, Ireland
36
Lamers WS, Boyack K, Larivière V, Sugimoto CR, van Eck NJ, Waltman L, Murray D. Investigating disagreement in the scientific literature. eLife 2021; 10:e72737. [PMID: 34951588] [PMCID: PMC8709576] [DOI: 10.7554/elife.72737]
Abstract
Disagreement is essential to scientific progress but the extent of disagreement in science, its evolution over time, and the fields in which it happens remain poorly understood. Here we report the development of an approach based on cue phrases that can identify instances of disagreement in scientific articles. These instances are sentences in an article that cite other articles. Applying this approach to a collection of more than four million English-language articles published between 2000 and 2015, we determine the level of disagreement in five broad fields within the scientific literature (biomedical and health sciences; life and earth sciences; mathematics and computer science; physical sciences and engineering; and social sciences and humanities) and 817 meso-level fields. Overall, the level of disagreement is highest in the social sciences and humanities, and lowest in mathematics and computer science. However, there is considerable heterogeneity across the meso-level fields, revealing the importance of local disciplinary cultures and the epistemic characteristics of disagreement. Analysis at the level of individual articles reveals notable episodes of disagreement in science, and illustrates how methodological artifacts can confound analyses of scientific texts.
Affiliation(s)
- Wout S Lamers: Centre for Science and Technology Studies, Leiden University, Leiden, Netherlands
- Kevin Boyack: SciTech Strategies, Inc, Albuquerque, United States
- Vincent Larivière: École de bibliothéconomie et des sciences de l'information, Université de Montréal, Montreal, Canada
- Cassidy R Sugimoto: School of Public Policy, Georgia Institute of Technology, Atlanta, United States
- Nees Jan van Eck: Centre for Science and Technology Studies, Leiden University, Leiden, Netherlands
- Ludo Waltman: Centre for Science and Technology Studies, Leiden University, Leiden, Netherlands
- Dakota Murray: School of Informatics, Computing, and Engineering, Indiana University, Bloomington, United States
37
Aczel B, Szaszi B, Nilsonne G, van den Akker OR, Albers CJ, van Assen MALM, Bastiaansen JA, Benjamin D, Boehm U, Botvinik-Nezer R, Bringmann LF, Busch NA, Caruyer E, Cataldo AM, Cowan N, Delios A, van Dongen NNN, Donkin C, van Doorn JB, Dreber A, Dutilh G, Egan GF, Gernsbacher MA, Hoekstra R, Hoffmann S, Holzmeister F, Huber J, Johannesson M, Jonas KJ, Kindel AT, Kirchler M, Kunkels YK, Lindsay DS, Mangin JF, Matzke D, Munafò MR, Newell BR, Nosek BA, Poldrack RA, van Ravenzwaaij D, Rieskamp J, Salganik MJ, Sarafoglou A, Schonberg T, Schweinsberg M, Shanks D, Silberzahn R, Simons DJ, Spellman BA, St-Jean S, Starns JJ, Uhlmann EL, Wicherts J, Wagenmakers EJ. Consensus-based guidance for conducting and reporting multi-analyst studies. eLife 2021; 10:e72185. [PMID: 34751133] [PMCID: PMC8626083] [DOI: 10.7554/elife.72185]
Abstract
Any large dataset can be analyzed in a number of ways, and it is possible that the use of different analysis strategies will lead to different results and conclusions. One way to assess whether the results obtained depend on the analysis strategy chosen is to employ multiple analysts and leave each of them free to follow their own approach. Here, we present consensus-based guidance for conducting and reporting such multi-analyst studies, and we discuss how broader adoption of the multi-analyst approach has the potential to strengthen the robustness of results and conclusions obtained from analyses of datasets in basic and applied research.
Affiliation(s)
- Gustav Nilsonne: Karolinska Institutet, Stockholm, Sweden; Stockholm University, Stockholm, Sweden
- Jojanneke A Bastiaansen: University Medical Center Groningen, University of Groningen, Groningen, Netherlands; Friesland Mental Health Care Services, Leeuwarden, Netherlands
- Daniel Benjamin: University of California Los Angeles, Los Angeles, United States; National Bureau of Economic Research, Cambridge, United States
- Udo Boehm: University of Amsterdam, Amsterdam, Netherlands
- Andrea M Cataldo: McLean Hospital, Belmont, United States; Harvard Medical School, Boston, United States
- Anna Dreber: Stockholm School of Economics, Stockholm, Sweden; University of Innsbruck, Innsbruck, Austria
- Yoram K Kunkels: University Medical Center Groningen, University of Groningen, Groningen, Netherlands
- Brian A Nosek: Center for Open Science, Charlottesville, United States; University of Virginia, Charlottesville, United States
- Samuel St-Jean: University of Alberta, Edmonton, Canada; Lund University, Lund, Sweden
38
Rosenfeld DL, Balcetis E, Bastian B, Berkman ET, Bosson JK, Brannon TN, Burrow AL, Cameron CD, Chen S, Cook JE, Crandall C, Davidai S, Dhont K, Eastwick PW, Gaither SE, Gangestad SW, Gilovich T, Gray K, Haines EL, Haselton MG, Haslam N, Hodson G, Hogg MA, Hornsey MJ, Huo YJ, Joel S, Kachanoff FJ, Kraft-Todd G, Leary MR, Ledgerwood A, Lee RT, Loughnan S, MacInnis CC, Mann T, Murray DR, Parkinson C, Pérez EO, Pyszczynski T, Ratner K, Rothgerber H, Rounds JD, Schaller M, Silver RC, Spellman BA, Strohminger N, Swim JK, Thoemmes F, Urganci B, Vandello JA, Volz S, Zayas V, Tomiyama AJ. Psychological Science in the Wake of COVID-19: Social, Methodological, and Metascientific Considerations. Perspect Psychol Sci 2021; 17:311-333. [PMID: 34597198] [DOI: 10.1177/1745691621999374]
Abstract
The COVID-19 pandemic has extensively changed the state of psychological science from what research questions psychologists can ask to which methodologies psychologists can use to investigate them. In this article, we offer a perspective on how to optimize new research in the pandemic's wake. Because this pandemic is inherently a social phenomenon-an event that hinges on human-to-human contact-we focus on socially relevant subfields of psychology. We highlight specific psychological phenomena that have likely shifted as a result of the pandemic and discuss theoretical, methodological, and practical considerations of conducting research on these phenomena. After this discussion, we evaluate metascientific issues that have been amplified by the pandemic. We aim to demonstrate how theoretically grounded views on the COVID-19 pandemic can help make psychological science stronger-not weaker-in its wake.
Affiliation(s)
- Brock Bastian: Melbourne School of Psychological Sciences, University of Melbourne
- Elliot T Berkman: Department of Psychology, University of Oregon; Center for Translational Neuroscience, University of Oregon
- C Daryl Cameron: Department of Psychology, The Pennsylvania State University; Rock Ethics Institute, The Pennsylvania State University
- Serena Chen: Department of Psychology, University of California, Berkeley
- Kurt Gray: Department of Psychology and Neuroscience, University of North Carolina, Chapel Hill
- Martie G Haselton: Department of Psychology, University of California, Los Angeles; Department of Communication, University of California, Los Angeles; Institute for Society and Genetics, University of California, Los Angeles
- Nick Haslam: Melbourne School of Psychological Sciences, University of Melbourne
- Yuen J Huo: Department of Psychology, University of California, Los Angeles
- Frank J Kachanoff: Department of Psychology and Neuroscience, University of North Carolina, Chapel Hill
- Mark R Leary: Department of Psychology and Neuroscience, Duke University
- Randy T Lee: Department of Psychology, Cornell University
- Steve Loughnan: School of Philosophy, Psychology, and Language Sciences, The University of Edinburgh
- Traci Mann: Department of Psychology, University of Minnesota
- Efrén O Pérez: Department of Psychology, University of California, Los Angeles; Department of Political Science, University of California, Los Angeles
- Tom Pyszczynski: Department of Psychology, University of Colorado at Colorado Springs
- Mark Schaller: Department of Psychology, University of British Columbia
- Roxane Cohen Silver: Department of Psychological Science, University of California, Irvine; Department of Medicine, University of California, Irvine; Program in Public Health, University of California, Irvine
- Nina Strohminger: Department of Legal Studies and Business Ethics, Wharton School of Business, University of Pennsylvania; Department of Psychology, University of Pennsylvania
- Janet K Swim: Department of Psychology, The Pennsylvania State University
- Felix Thoemmes: Department of Human Development, Cornell University; Department of Psychology, Cornell University
- Sarah Volz: Department of Psychology, University of Minnesota
39
Kominsky JF, Begus K, Bass I, Colantonio J, Leonard JA, Mackey AP, Bonawitz E. Organizing the Methodological Toolbox: Lessons Learned From Implementing Developmental Methods Online. Front Psychol 2021; 12:702710. [PMID: 34589023] [PMCID: PMC8473607] [DOI: 10.3389/fpsyg.2021.702710]
Abstract
Adapting studies typically run in the lab, preschool, or museum to online data collection presents a variety of challenges. The solutions to those challenges depend heavily on the specific questions pursued, the methods used, and the constraints imposed by available technology. We present a partial sample of solutions, discussing approaches we have developed for adapting studies targeting a range of different developmental populations, from infants to school-aged children, and utilizing various online methods such as high-framerate video presentation, having participants interact with a display on their own computer, having the experimenter interact with both the participant and an actor, recording free-play with physical objects, recording infant looking times both offline and live, and more. We also raise issues and solutions regarding recruitment and representativeness in online samples. By identifying the concrete needs of a given approach, tools that meet each of those individual needs, and interfaces between those tools, we have been able to implement many (but not all) of our studies using online data collection during the COVID-19 pandemic. This systematic review aligning available tools and approaches with different methods can inform the design of future studies, in and outside of the lab.
Affiliation(s)
- Jonathan F. Kominsky: Graduate School of Education, Harvard University, Cambridge, MA, United States; Department of Psychology, Rutgers University, Newark, NJ, United States
- Katarina Begus: Graduate School of Education, Harvard University, Cambridge, MA, United States; Department of Psychology, Rutgers University, Newark, NJ, United States
- Ilona Bass: Graduate School of Education, Harvard University, Cambridge, MA, United States; Department of Psychology, Rutgers University, Newark, NJ, United States; Department of Psychology, Harvard University, Cambridge, MA, United States
- Joseph Colantonio: Department of Psychology, Rutgers University, Newark, NJ, United States
- Julia A. Leonard: Department of Psychology, University of Pennsylvania, Philadelphia, PA, United States; Department of Psychology, Yale University, New Haven, CT, United States
- Allyson P. Mackey: Department of Psychology, University of Pennsylvania, Philadelphia, PA, United States
- Elizabeth Bonawitz: Graduate School of Education, Harvard University, Cambridge, MA, United States
40
Abstract
A series of failed replications and frauds have raised questions regarding self-correction in science. Metascientific activists have advocated policies that incentivize replications and make them more diagnostically potent. We argue that current debates, as well as research in science and technology studies, have paid little heed to a key dimension of replication practice. Although it sometimes serves a diagnostic function, replication is commonly motivated by a practical desire to extend research interests. The resulting replication, which we label 'integrative', is characterized by a pragmatic flexibility toward protocols. The goal is to appropriate what is useful, not test for truth. Within many experimental cultures, however, integrative replications can produce replications of ambiguous diagnostic power. Based on interviews with 60 members of the Board of Reviewing Editors for the journal Science, we show how the interplay between the diagnostic and integrative motives for replication differs between fields and produces different cultures of replication. We offer six theses that aim to put science and technology studies and science activism into dialog to show why effective reforms will need to confront issues of disciplinary difference.
Affiliation(s)
- David Peterson
- University of California Los Angeles, Los Angeles, CA, USA
- Aaron Panofsky
- University of California Los Angeles, Los Angeles, CA, USA
41
Abstract
The reliance in psychology on verbal definitions means that psychological research is unusually moored to how humans think and communicate about categories. Psychological concepts (e.g., intelligence, attention) are easily assumed to represent objective, definable categories with an underlying essence. Like the “vital forces” previously thought to animate life, these assumed essences can create an illusion of understanding. By synthesizing a wide range of research lines from cognitive, clinical, and biological psychology and neuroscience, we describe a pervasive tendency across psychological science to assume that essences explain phenomena. Labeling a complex phenomenon can appear as theoretical progress before there is sufficient evidence that the described category has a definable essence or known boundary conditions. Category labels can further undermine progress by masking contingent and contextual relationships and obscuring the need to specify mechanisms. Finally, we highlight examples of promising methods that circumvent the lure of essences and suggest four concrete strategies for identifying and avoiding essentialist intuitions in theory development.
Affiliation(s)
- C Brick
- Department of Psychology, University of Amsterdam
- Department of Psychology, University of Cambridge
- B Hood
- School of Psychological Science, University of Bristol
- V Ekroll
- Department of Psychosocial Science, University of Bergen
- L de-Wit
- Department of Psychology, University of Cambridge
42
Abstract
In the face of unreplicable results, statistical anomalies, and outright fraud, introspection and changes in the psychological sciences have taken root. Vibrant reform and metascience movements have emerged. These are exciting developments and may point toward practical improvements in the future. Yet there is nothing so practical as good theory. This article outlines aspects of reform and metascience in psychology that are ripe for an injection of theory, including a lot of excellent and overlooked theoretical work from different disciplines. I review established frameworks that model the process of scientific discovery, the types of scientific networks that we ought to aspire to, and the processes by which problematic norms and institutions might evolve, focusing especially on modeling from the philosophy of science and cultural evolution. We have unwittingly evolved a toxic scientific ecosystem; existing interdisciplinary theory may help us intelligently design a better one.
Affiliation(s)
- Will M. Gervais
- Centre for Culture and Evolution, Department of Psychology, Brunel University London
43
Latinjak AT, Hatzigeorgiadis A. The Knowledge Map of Sport and Exercise Psychology: An Integrative Perspective. Front Psychol 2021; 12:661824. [PMID: 34220635 PMCID: PMC8242169 DOI: 10.3389/fpsyg.2021.661824] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/31/2021] [Accepted: 05/21/2021] [Indexed: 11/13/2022] Open
Abstract
The present work contains a personal perspective on what sport and exercise psychology (SEP) is today. It is a global synthesis of research about psychological aspects related to the context and practice of sport and exercise. The intended impact was to positively influence teaching SEP to students, to promote interdisciplinary research and practice, and to assist the development of SEP as an applied science by helping experts develop a more holistic view of the field. Over 650 theoretical and review articles about psychological concepts in connection to sport and exercise were read in the process of creating a conceptual model that reflects the essence of SEP and leads to a conceptualization of SEP based on research topics. The result was a knowledge map of SEP made up of four main research clusters: biopsychological descriptors, external variables, psychological skills, and applied SEP practice. In terms of interdisciplinarity, the present perspective on SEP suggests that sport and exercise can be used as a research paradigm or natural laboratory to study psychological aspects relevant to various scientific fields, and that sport and exercise can be used as a therapeutic framework in response to challenges that researchers and practitioners in these fields are typically addressing.
Affiliation(s)
- Alexander T. Latinjak
- School of Social Sciences and Humanities, University of Suffolk, Ipswich, United Kingdom
- Escola Universitària de la Salut i de l’Esport (EUSES), Universitat de Girona, Salt, Spain
- Antonis Hatzigeorgiadis
- Department of Physical Education and Sport Science, University of Thessaly, Trikala, Greece
44
Klau S, Hoffmann S, Patel CJ, Ioannidis JP, Boulesteix AL. Examining the robustness of observational associations to model, measurement and sampling uncertainty with the vibration of effects framework. Int J Epidemiol 2021; 50:266-278. [PMID: 33147614 PMCID: PMC7938511 DOI: 10.1093/ije/dyaa164] [Citation(s) in RCA: 12] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 08/03/2020] [Indexed: 02/05/2023] Open
Abstract
BACKGROUND The results of studies on observational associations may vary depending on the study design and analysis choices as well as due to measurement error. It is important to understand the relative contribution of different factors towards generating variable results, including low sample sizes, researchers' flexibility in model choices, and measurement error in variables of interest and adjustment variables. METHODS We define sampling, model and measurement uncertainty, and extend the concept of vibration of effects in order to study these three types of uncertainty in a common framework. In a practical application, we examine these types of uncertainty in a Cox model using data from the National Health and Nutrition Examination Survey. In addition, we analyse the behaviour of sampling, model and measurement uncertainty for varying sample sizes in a simulation study. RESULTS All types of uncertainty are associated with a potentially large variability in effect estimates. Measurement error in the variable of interest attenuates the true effect in most cases, but can occasionally lead to overestimation. When we consider measurement error in both the variable of interest and adjustment variables, the vibration of effects is even less predictable as both systematic under- and over-estimation of the true effect can be observed. The results on simulated data show that measurement and model vibration remain non-negligible even for large sample sizes. CONCLUSION Sampling, model and measurement uncertainty can have important consequences for the stability of observational associations. We recommend systematically studying and reporting these types of uncertainty, and comparing them in a common framework.
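The model-uncertainty component described in this abstract can be sketched in a few lines: re-estimate the same association under every combination of candidate adjustment variables and summarize the spread of the effect estimates. This is a hedged illustration only, using synthetic data and a plain least-squares model rather than the paper's Cox model and NHANES data; all variable names (`x`, `y`, `z1`-`z3`) are invented.

```python
# Minimal sketch of "model vibration": fit the same exposure-outcome
# association under all 2^3 combinations of adjustment variables and
# report the range of the exposure coefficient. Synthetic data only.
from itertools import combinations
import numpy as np

rng = np.random.default_rng(0)
n = 500
z1, z2, z3 = rng.normal(size=(3, n))                     # candidate adjustment variables
x = 0.5 * z1 + rng.normal(size=n)                        # exposure, confounded by z1
y = 0.3 * x + 0.8 * z1 + 0.2 * z2 + rng.normal(size=n)   # outcome

adjusters = {"z1": z1, "z2": z2, "z3": z3}
effects = []
for k in range(len(adjusters) + 1):
    for subset in combinations(adjusters, k):
        cols = [np.ones(n), x] + [adjusters[name] for name in subset]
        X = np.column_stack(cols)
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        effects.append(beta[1])                          # coefficient on the exposure

effects = np.array(effects)
print(f"{len(effects)} model specifications")
print(f"exposure effect ranges from {effects.min():.2f} to {effects.max():.2f}")
```

Because `z1` confounds the association, specifications that omit it overestimate the effect, so even this toy example shows a non-trivial vibration across model choices.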
Affiliation(s)
- Simon Klau
- Institute for Medical Information Processing, Biometry, and Epidemiology, Ludwig-Maximilians-Universität München, Munich, Germany
- Leibniz Institute for Prevention Research and Epidemiology-BIPS, Bremen, Germany
- Sabine Hoffmann
- Institute for Medical Information Processing, Biometry, and Epidemiology, Ludwig-Maximilians-Universität München, Munich, Germany
- LMU Open Science Center, Ludwig-Maximilians-Universität München, Munich, Germany
- Chirag J Patel
- Department of Biomedical Informatics, Harvard Medical School, Boston, MA, USA
- John P A Ioannidis
- Meta-Research Innovation Center at Stanford (METRICS), Stanford University, Stanford, CA, USA
- Department of Epidemiology and Population Health, Stanford University School of Medicine, Stanford, CA, USA
- Department of Biomedical Data Science, Stanford University School of Medicine, Stanford, CA, USA
- Department of Statistics, Stanford University School of Humanities and Sciences, Stanford, CA, USA
- Department of Medicine, Stanford University School of Medicine, Stanford, CA, USA
- Anne-Laure Boulesteix
- Institute for Medical Information Processing, Biometry, and Epidemiology, Ludwig-Maximilians-Universität München, Munich, Germany
- LMU Open Science Center, Ludwig-Maximilians-Universität München, Munich, Germany
45
Abstract
The scientific enterprise has long been based on the presumption of replication, although scientists have recently become aware of various corruptions of the enterprise that have hurt replicability. In this article, we begin by considering three illustrations of research paradigms that have all been subject to intense scrutiny through replications and theoretical concerns. The three paradigms are one for which the corpus of research points to a real finding, one for which the corpus of research points to a significantly attenuated effect, and one for which the debate is ongoing. We then discuss what scientists can learn, and how science can be improved, through replications more generally. From there, we discuss what we believe needs to be done to improve scientific inquiry with regard to replication moving forward. Finally, we conclude by providing readers with several different approaches to replication and how these approaches progress science. The approaches discussed include multilab replications of many effects, multilab replications of specific effects, adversarial collaborations, and stand-alone replications.
Affiliation(s)
- John E Edlund
- Department of Psychology, Rochester Institute of Technology
- Kelly Cuccolo
- Department of Psychology, University of North Dakota
- Martha S Zlokovich
- Psi Chi, the International Honor Society in Psychology, Chattanooga, Tennessee
46
Rohrer JM, Tierney W, Uhlmann EL, DeBruine LM, Heyman T, Jones B, Schmukle SC, Silberzahn R, Willén RM, Carlsson R, Lucas RE, Strand J, Vazire S, Witt JK, Zentall TR, Chabris CF, Yarkoni T. Putting the Self in Self-Correction: Findings From the Loss-of-Confidence Project. Perspect Psychol Sci 2021; 16:1255-1269. [PMID: 33645334 PMCID: PMC8564260 DOI: 10.1177/1745691620964106] [Citation(s) in RCA: 14] [Impact Index Per Article: 4.7] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/04/2023]
Abstract
Science is often perceived to be a self-correcting enterprise. In principle, the assessment of scientific claims is supposed to proceed in a cumulative fashion, with the reigning theories of the day progressively approximating truth more accurately over time. In practice, however, cumulative self-correction tends to proceed less efficiently than one might naively suppose. Far from evaluating new evidence dispassionately and infallibly, individual scientists often cling stubbornly to prior findings. Here we explore the dynamics of scientific self-correction at an individual rather than collective level. In 13 written statements, researchers from diverse branches of psychology share why and how they have lost confidence in one of their own published findings. We qualitatively characterize these disclosures and explore their implications. A cross-disciplinary survey suggests that such loss-of-confidence sentiments are surprisingly common among members of the broader scientific population yet rarely become part of the public record. We argue that removing barriers to self-correction at the individual level is imperative if the scientific community as a whole is to achieve the ideal of efficient self-correction.
Affiliation(s)
- Julia M Rohrer
- International Max Planck Research School on the Life Course, Max Planck Institute for Human Development, Berlin
- Department of Psychology, University of Leipzig
- Warren Tierney
- Department of Organizational Behavior, INSEAD, Singapore
- Eric L Uhlmann
- Department of Organizational Behavior, INSEAD, Singapore
- Lisa M DeBruine
- Institute of Neuroscience and Psychology, University of Glasgow
- Tom Heyman
- Laboratory of Experimental Psychology, KU Leuven
- Institute of Psychology, Leiden University
- Benedict Jones
- Institute of Neuroscience and Psychology, University of Glasgow
- Rebecca M Willén
- Institute for Globally Distributed Open Research and Education (IGDORE)
- Simine Vazire
- Melbourne School of Psychological Sciences, University of Melbourne
- Christopher F Chabris
- Autism and Developmental Medicine Institute, Geisinger Health System, Danville, Pennsylvania
- Tal Yarkoni
- Department of Psychology, University of Texas at Austin
47
Premachandra B, Lewis NA. Do We Report the Information That Is Necessary to Give Psychology Away? A Scoping Review of the Psychological Intervention Literature 2000-2018. Perspect Psychol Sci 2021; 17:226-238. [PMID: 33651952 DOI: 10.1177/1745691620974774] [Citation(s) in RCA: 12] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/15/2022]
Abstract
Psychologists are spending a considerable amount of time researching and developing interventions in hopes that their efforts can help to tackle some of society's pressing problems. Unfortunately, those hopes are often not realized: many interventions are developed and reported in journals but do not make their way into the broader world they were designed to change. One potential reason for this is that there may be a gap between the information reported in articles and the information others, such as practitioners, need to implement the findings. We explored this possibility in the current article. We conducted a scoping review to assess the extent to which the information needed for implementation is reported in psychological intervention articles. Results suggest psychological intervention articles report, at most, 64% of the information needed to implement interventions. We discuss the implications of this both for psychological theories and for applying them in the world.
Affiliation(s)
- Neil A Lewis
- Department of Communication, Cornell University
- Division of General Internal Medicine, Department of Medicine, Weill Cornell Medical College
48
Abstract
The field of psychology has a long history of encouraging researchers to disseminate their findings to the broader public. This trend has continued in recent decades in part because of professional psychology organizations reissuing calls to "give psychology away." This recent wave of calls to give psychology away is different because it has been occurring alongside another movement in the field: the credibility revolution, in which psychology has been reckoning with metascientific questions about what exactly psychologists know. This creates a dilemma for the modern psychologist: How is one to "give psychology away" if one is unsure about what is known or what one has to give? In the current article, we discuss strategies for navigating this tension by drawing on insights from the interdisciplinary fields of science communication and persuasion and social influence.
Affiliation(s)
- Neil A Lewis
- Department of Communication, Cornell University and Division of General Internal Medicine, Weill Cornell Medical College
- Jonathan Wai
- Department of Education Reform and Department of Psychology, College of Education and Health Professions, University of Arkansas
49
Wang H, Radomska HS, Phelps MA. Replication Study: Coding-independent regulation of the tumor suppressor PTEN by competing endogenous mRNAs. eLife 2020; 9:56651. [PMID: 33073769 DOI: 10.7554/elife.56651] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/27/2020] [Accepted: 09/18/2020] [Indexed: 12/19/2022] Open
Abstract
As part of the Reproducibility Project: Cancer Biology, we published a Registered Report (Phelps et al., 2016) that described how we intended to replicate selected experiments from the paper 'Coding-independent regulation of the tumor suppressor PTEN by competing endogenous mRNAs' (Tay et al., 2011). Here, we report the results. We found depletion of putative PTEN competing endogenous mRNAs (ceRNAs) in DU145 cells did not impact PTEN 3'UTR regulation using a reporter, while the original study reported decreased activity when SERINC1, VAPA, and CNOT6L were depleted (Figure 3C; Tay et al., 2011). Using the same reporter, we found decreased activity when ceRNA 3'UTRs were overexpressed, while the original study reported increased activity (Figure 3D; Tay et al., 2011). In HCT116 cells, ceRNA depletion resulted in decreased PTEN protein levels, a result similar to the findings reported in the original study (Figure 3G,H; Tay et al., 2011); however, while the original study reported an attenuated ceRNA effect in microRNA deficient (DicerEx5) HCT116 cells, we observed increased PTEN protein levels. Further, we found depletion of the ceRNAs VAPA or CNOT6L did not statistically impact DU145, wild-type HCT116, or DicerEx5 HCT116 cell proliferation. The original study reported increased DU145 and wild-type HCT116 cell proliferation when these ceRNAs were depleted, which was attenuated in the DicerEx5 HCT116 cells (Figure 5B; Tay et al., 2011). Differences between the original study and this replication attempt, such as variance between biological repeats, are factors that might have influenced the results. Finally, we report meta-analyses for each result.
Affiliation(s)
- Hongyan Wang
- Pharmacoanalytic Shared Resource (PhASR), Comprehensive Cancer Center, The Ohio State University, Columbus, United States
- Hanna S Radomska
- Pharmacoanalytic Shared Resource (PhASR), Comprehensive Cancer Center, The Ohio State University, Columbus, United States
- Mitch A Phelps
- Pharmacoanalytic Shared Resource (PhASR), Comprehensive Cancer Center, The Ohio State University, Columbus, United States
-
- Science Exchange, Palo Alto, United States
- Center for Open Science, Charlottesville, United States
50
Heyman T, Maerten AS. Correction notices in psychology: impactful or inconsequential? R Soc Open Sci 2020; 7:200834. [PMID: 33204456 PMCID: PMC7657932 DOI: 10.1098/rsos.200834] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [Key Words] [Grants] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Received: 05/13/2020] [Accepted: 09/03/2020] [Indexed: 06/11/2023]
Abstract
Science is self-correcting, or so the adage goes, but to what extent is that indeed the case? Answering this question requires careful consideration of the various approaches to achieve the collective goal of self-correction. One of the most straightforward mechanisms is individual self-correction: researchers rectifying their own mistakes by publishing a correction notice. Although it offers an efficient route to correcting the scientific record, it has received little to no attention from a metascientific point of view. We aim to fill this void by analysing the content of correction notices published from 2010 until 2018 in the three psychology journals featuring the highest number of corrections over that timespan based on the Scopus database (i.e. Psychological Science with N = 58, Frontiers in Psychology with N = 99 and Journal of Affective Disorders with N = 57). More concretely, we examined which aspects of the original papers were affected (e.g. hypotheses, data analyses, metadata such as author order, affiliations, funding information, etc.) as well as the perceived implications for the papers' main findings. Our exploratory analyses showed that many corrections involved inconsequential errors. Furthermore, authors rarely revised their conclusions, even though several corrections concerned changes to the results. We conclude with a discussion of current policies, and suggest ways to improve upon the present situation by (i) preventing mistakes, and (ii) transparently rectifying those mistakes that do find their way into the literature.
Affiliation(s)
- Tom Heyman
- KU Leuven, Leuven, Belgium
- Leiden University, Leiden, Netherlands