1
Feuerstahler L. Scale Type Revisited: Some Misconceptions, Misinterpretations, and Recommendations. Psych 2023. DOI: 10.3390/psych5020018
Abstract
Stevens’s classification of scales into nominal, ordinal, interval, and ratio types is among the most controversial yet resilient ideas in psychological and educational measurement. In this essay, I challenge the notion that scale type is essential for the development of measures in these fields. I highlight how the concept of scale type, and of interval-level measurement in particular, is variously interpreted by many researchers. These (often unstated) differences in perspectives lead to confusion about what evidence is appropriate to demonstrate interval-level measurement, as well as the implications of scale type for research in practice. I then borrow from contemporary ideas in the philosophy of measurement to demonstrate that scale type can only be established in the context of well-developed theory and through experimentation. I conclude that current notions of scale type are of limited use, and that scale type ought to occupy a lesser role in psychometric discourse and pedagogy.
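As background to the abstract above (not part of it), Stevens's four types are standardly characterised by their admissible transformations, i.e. the functions f under which scale values may be replaced without loss of empirical information:

```latex
% Admissible transformations f for Stevens's scale types
\begin{align*}
\text{nominal:}  &\; f \text{ any one-to-one mapping} \\
\text{ordinal:}  &\; f \text{ any strictly increasing function} \\
\text{interval:} &\; f(x) = ax + b, \quad a > 0 \quad \text{(positive affine)} \\
\text{ratio:}    &\; f(x) = ax, \quad a > 0 \quad \text{(similarity)}
\end{align*}
```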
2
Bringmann LF, Elmer T, Eronen MI. Back to Basics: The Importance of Conceptual Clarification in Psychological Science. Curr Dir Psychol Sci 2022. DOI: 10.1177/09637214221096485
Abstract
Although the lack of conceptual clarity has been observed to be a widespread and fundamental problem in psychology, conceptual clarification plays a mostly marginal role in psychological research. In this article, we argue that better conceptualization of psychological phenomena is needed to move psychology forward as a science. We first show how conceptual unclarity seeps through all aspects of psychological research, from everyday concepts to statistical measures. We then turn to recommendations on how to improve conceptual clarity in psychology, emphasizing the importance of seeing research as an iterative process in which it is necessary to revisit the phenomena that are the foundations of theories and models, as well as how they are conceptualized and measured.
Affiliation(s)
- Laura F. Bringmann
- Department of Psychometrics and Statistics, Faculty of Social and Behavioural Sciences, University of Groningen
- Interdisciplinary Center Psychopathology and Emotion regulation (ICPE), Department of Psychiatry, University Medical Center Groningen, University of Groningen
- Timon Elmer
- Department of Psychometrics and Statistics, Faculty of Social and Behavioural Sciences, University of Groningen
3
Abstract
Meehl argued in 1978 that theories in psychology come and go, with little cumulative progress. We believe that this assessment still holds, as also evidenced by increasingly common claims that psychology is facing a “theory crisis” and that psychologists should invest more in theory building. In this article, we argue that the root cause of the theory crisis is that developing good psychological theories is extremely difficult and that understanding the reasons why it is so difficult is crucial for moving forward in the theory crisis. We discuss three key reasons based on philosophy of science for why developing good psychological theories is so hard: the relative lack of robust phenomena that impose constraints on possible theories, problems of validity of psychological constructs, and obstacles to discovering causal relationships between psychological variables. We conclude with recommendations on how to move past the theory crisis.
4
Sanbonmatsu DM, Cooley EH, Butner JE. The Impact of Complexity on Methods and Findings in Psychological Science. Front Psychol 2021; 11:580111. PMID: 33551904; PMCID: PMC7859482; DOI: 10.3389/fpsyg.2020.580111
Abstract
The study of human behavior is severely hampered by logistical problems, ethical and legal constraints, and funding shortfalls. However, the biggest difficulty of conducting social and behavioral research is the extraordinary complexity of the study phenomena. In this article, we review the impact of complexity on research design, hypothesis testing, measurement, data analyses, reproducibility, and the communication of findings in psychological science. The systematic investigation of the world often requires different approaches because of the variability in complexity. Confirmatory testing, multi-factorial designs, survey methods, large samples, and modeling are frequently needed to study complex social and behavioral topics. Complexity impedes the measurement of general constructs, the reproducibility of results and scientific reporting, and the general rigor of research. Many of the benchmarks established by classic work in physical science are not attainable in studies of more complex phenomena. Consequently, the standards used to evaluate scientific research should be tethered to the complexity of the study topic.
Affiliation(s)
- David M Sanbonmatsu
- Department of Psychology, University of Utah, Salt Lake City, UT, United States
- Emily H Cooley
- Department of Psychology, University of Utah, Salt Lake City, UT, United States
- Jonathan E Butner
- Department of Psychology, University of Utah, Salt Lake City, UT, United States
6
Eronen MI. Psychopathology and Truth: A Defense of Realism. J Med Philos 2019; 44:507-520. PMID: 31356663; DOI: 10.1093/jmp/jhz009
Abstract
Recently Kenneth Kendler and Peter Zachar have raised doubts about the correspondence theory of truth and scientific realism in psychopathology. They argue that coherentist or pragmatist approaches to truth are better suited for understanding the reality of psychiatric disorders. In this article, I show that rejecting realism based on the correspondence theory is deeply problematic: It makes psychopathology categorically different from other sciences, and results in an implausible view of scientific discovery and progress. As an alternative, I suggest a robustness-based approach that can accommodate the significance of coherence and pragmatic factors without rejecting scientific realism and the correspondence theory of truth.
7
Heino MTJ, Knittle K, Fried E, Sund R, Haukkala A, Borodulin K, Uutela A, Araujo-Soares V, Vasankari T, Hankonen N. Visualisation and network analysis of physical activity and its determinants: Demonstrating opportunities in analysing baseline associations in the Let's Move It trial. Health Psychol Behav Med 2019; 7:269-289. PMID: 34040851; PMCID: PMC8114395; DOI: 10.1080/21642850.2019.1646136
Abstract
Background: Visualisations and readily-accessible web-based supplementary files can improve data reporting and transparency. In this paper, we make use of recent developments in software and psychological network analysis to describe the baseline cohort of a trial testing the Let's Move It intervention, which aimed to increase physical activity (PA) and reduce sedentary behaviours (SB) among vocational school students. Methods: At baseline, 1166 adolescents, distributed across six school clusters and four educational tracks, completed measures of PA and SB, theoretical predictors of these behaviours, and body composition. Within a comprehensive website supplement, which includes all code and analyses, data were tabulated and visualised, and network analyses explored relations between predictor variables and outcomes. Results: Average daily moderate-to-vigorous PA was 65 min (CI95: 57-73 min), and SB 8 h 44 min (CI95: 8 h 04 min-9 h 24 min), with 25.8 (CI95: 23.5-28.0) interruptions to sitting. Cluster randomisation appeared to result in balanced distributions for baseline characteristics between intervention and control groups, but differences emerged across the four educational tracks. Self-reported behaviour change technique (BCT) use was low for many but not all techniques. A network analysis revealed direct relationships between PA and behavioural experiments, planning and autonomous motivation, and several BCTs were connected to PA via autonomous motivation. Visualisation uncovered a case of Simpson's paradox. Conclusions: Data-visualisation and data-exploration techniques (e.g. network analysis) can help reveal the dynamics involved in complex multi-causal systems - a challenging task with traditional data presentations. The benefits of presenting complex data visually should encourage researchers to publish extensive analyses and descriptions as website supplements, which would increase the speed and quality of scientific communication, as well as help to address the crisis of reduced confidence in research findings. We hope that this example will serve as a template for other investigators to improve upon in the future.
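Psychological network analyses of the kind described above are commonly operationalised as partial-correlation (Gaussian graphical model) networks. The sketch below is illustrative only, not the authors' code: it simulates survey-like data for three hypothetical variables and derives network edge weights from the precision matrix.

```python
import numpy as np

# Simulate a chain: motivation -> planning -> activity
# (variable names and effect sizes are hypothetical).
rng = np.random.default_rng(0)
n = 500  # simulated respondents
motivation = rng.normal(size=n)
planning = 0.6 * motivation + rng.normal(size=n)
activity = 0.5 * planning + rng.normal(size=n)
data = np.column_stack([motivation, planning, activity])

# Partial correlations come from the inverse covariance (precision)
# matrix P:  r_ij = -P_ij / sqrt(P_ii * P_jj)
precision = np.linalg.inv(np.cov(data, rowvar=False))
d = np.sqrt(np.diag(precision))
partial = -precision / np.outer(d, d)
np.fill_diagonal(partial, 1.0)

# Network edges are the off-diagonal partial correlations. With the
# chain structure above, the motivation-activity edge should be near
# zero once planning is controlled for (an indirect, mediated path).
print(np.round(partial, 2))
```

In such a network, a near-zero motivation-activity edge alongside strong motivation-planning and planning-activity edges is what an indirect path through an intermediate variable looks like, analogous to the BCT-to-PA paths via autonomous motivation reported above.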
Affiliation(s)
- Keegan Knittle
- Faculty of Social Sciences, University of Helsinki, Helsinki, Finland
- Eiko Fried
- Department of Clinical Psychology, Leiden University, Leiden, The Netherlands
- Reijo Sund
- Faculty of Health Sciences, University of Eastern Finland, Kuopio, Finland
- Ari Haukkala
- Faculty of Social Sciences, University of Helsinki, Helsinki, Finland
- Katja Borodulin
- Faculty of Social Sciences, University of Helsinki, Helsinki, Finland
- Antti Uutela
- Faculty of Social Sciences, University of Helsinki, Helsinki, Finland
- Vera Araujo-Soares
- Institute of Health and Society, Medical Faculty, Newcastle University, Newcastle upon Tyne, UK
- Nelli Hankonen
- Faculty of Social Sciences, University of Helsinki, Helsinki, Finland
8
Uher J. Quantitative Data From Rating Scales: An Epistemological and Methodological Enquiry. Front Psychol 2018; 9:2599. PMID: 30622493; PMCID: PMC6308206; DOI: 10.3389/fpsyg.2018.02599
Abstract
Rating scales are popular methods for generating quantitative data directly by persons rather than automated technologies. But scholars increasingly challenge their foundations. This article contributes epistemological and methodological analyses of the processes involved in person-generated quantification. They are crucial for measurement because data analyses can reveal information about study phenomena only if relevant properties were encoded systematically in the data. The Transdisciplinary Philosophy-of-Science Paradigm for Research on Individuals (TPS-Paradigm) is applied to explore psychological and social-science concepts of measurement and quantification, including representational measurement theory, psychometric theories and their precursors in psychophysics. These are compared to theories from metrology specifying object-dependence of measurement processes and subject-independence of outcomes as key criteria, which allow tracing data to the instances measured and the ways they were quantified. Separate histories notwithstanding, the article's basic premise is that general principles of scientific measurement and quantification should apply to all sciences. It elaborates principles by which these metrological criteria can be implemented also in psychology and social sciences, while considering their research objects' peculiarities. Application of these principles is illustrated by quantifications of individual-specific behaviors ('personality'). The demands rating methods impose on data-generating persons are deconstructed and compared with the demands involved in other quantitative methods (e.g., ethological observations). These analyses highlight problematic requirements for raters. Rating methods sufficiently specify neither the empirical study phenomena nor the symbolic systems used as data nor rules of assignment between them. Instead, pronounced individual differences in raters' interpretation and use of items and scales indicate considerable subjectivity in data generation. Together with recoding scale categories into numbers, this introduces a twofold break in the traceability of rating data, compromising interpretability of findings. These insights question common reliability and validity concepts for ratings and provide novel explanations for replicability problems. Specifically, rating methods standardize only data formats but not the actual data generation. Measurement requires data generation processes to be adapted to the study phenomena's properties and the measurement-executing persons' abilities and interpretations, rather than to numerical outcome formats facilitating statistical analyses. Researchers must finally investigate how people actually generate ratings to specify the representational systems underlying rating data.
Affiliation(s)
- Jana Uher
- London School of Economics and Political Science, London, United Kingdom
9
Abstract
According to classical measurement theory, fundamental measurement necessarily requires the operation of concatenation qua physical addition. Quantities which do not allow this operation are measurable only indirectly by means of derived measurement. Since only extensive quantities sustain the operation of physical addition, measurement in psychology has been considered problematic. In contrast, the theory of conjoint measurement, as developed in representational measurement theory, proposes that the operation of ordering is sufficient for establishing fundamental measurement. The validity of this view is questioned. The misconception about the advantages of conjoint measurement, it is argued, results from the failure to notice that magnitudes of derived quantities cannot be determined directly, i.e., without the help of associated quantitative indicators. This takes away the advantages conjoint measurement has over derived measurement, making it practically useless.
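The additive conjoint representation at issue can be stated compactly: given a weak order on A × B satisfying the conjoint axioms (independence, double cancellation, solvability, and an Archimedean condition), there exist real-valued functions φ_A and φ_B, unique up to positive affine transformation with a common unit, such that

```latex
(a, b) \succsim (a', b')
\iff
\phi_A(a) + \phi_B(b) \;\ge\; \phi_A(a') + \phi_B(b')
```

The abstract's argument concerns whether this ordinal route to interval scales is usable in practice.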
10
Guyon H, Kop JL, Juhel J, Falissard B. Measurement, ontology, and epistemology: Psychology needs pragmatism-realism. Theory Psychol 2018. DOI: 10.1177/0959354318761606
Abstract
Measurement in psychology is at the heart of a major debate in the academic literature. We aim to contribute to a critical discussion of this issue by repositioning the object of this type of measurement, namely a mental attribute as measured by mental tests. Mental attributes should be considered not as true objects independent of the knower, but as emergent properties of a person dependent on the social context. On the basis of this clarified ontology, we consider that an empirical approach to measuring a mental attribute is possible. This approach must be resolutely pragmatist and realist. In practical terms, this means that a test needs to be renegotiated relative to the context, and that the validation of quantitative measures requires verification of a certain number of criteria. Consequently, our work critically explores measures as they are usually implemented in the area of psychometrics.
11
Kougiali ZG, Fasulo A, Needs A, Van Laar D. Planting the seeds of change: Directionality in the narrative construction of recovery from addiction. Psychol Health 2017; 32:639-664. PMID: 28276737; DOI: 10.1080/08870446.2017.1293053
Abstract
OBJECTIVE The dominant theoretical perspective that guides treatment evaluations in addiction assumes linearity in the relationship between treatment and outcomes, viewing behaviour change as a 'before and after' event. In this study we aim to examine how the direction of the trajectory from addiction to recovery is constructed in the personal narratives of active and recovering users. DESIGN 21 life stories from individuals at different stages of recovery and active use were collected and analysed following the principles of narrative analysis. RESULTS Personal trajectories were constructed as discontinuous, non-linear and long-lasting patterns of repeated and interchangeable episodes of relapse and abstinence. Relapse was described as an integral part of a learning process through which knowledge leading to recovery was gradually obtained. CONCLUSION The findings show that long-term recovery is represented as being preceded by periods of discontinuity before change is stabilised. Such periods are presented as lasting longer than most short-term pre-post intervention designs can capture, suggesting the need to rethink how change is defined and measured.
Affiliation(s)
- Zetta G Kougiali
- School of Psychology, University of East London, Stratford Campus, London, UK
- Alessandra Fasulo
- Department of Psychology, University of Portsmouth, Portsmouth, UK
- Adrian Needs
- Department of Psychology, University of Portsmouth, Portsmouth, UK
- Darren Van Laar
- Department of Psychology, University of Portsmouth, Portsmouth, UK
12
Linkov V. Psychology is not primarily Empirical Science: A Comparison of Cultures in the Lexical Hypothesis Tradition as a Failure of Introspection. Integr Psychol Behav Sci 2016; 51:285-302. PMID: 28035626; DOI: 10.1007/s12124-016-9375-1
Abstract
A large part of psychology has become an empirical science that assumes there might exist one set of research methods suitable for psychological research in all human cultures. Research questions, methods, and theories formulated from one cultural perspective are not thoroughly introspectively examined when used in another cultural environment. This leads to research that answers questions that are not meaningful in such environments. Research from the lexical hypothesis tradition is given as an example. The original English-language research assumed that the lexicon was sufficient to represent language structures for the purpose of examining how language reflects personality; however, some languages may use specific grammatical structures to reflect personality, so the lexicon alone cannot adequately represent these languages. Despite this, researchers still follow the research method developed for the English language. The Czech and Korean languages are examples of this approach. A solution to this problem is the thorough use of introspection during the formulation of research questions.
Affiliation(s)
- Václav Linkov
- CDV - Transport Research Centre, Líšeňská 33a, 636 00, Brno, Czech Republic.