Stanton K, Watts AL, Levin-Aspenson HF, Carpenter RW, Emery NN, Zimmerman M. Focusing Narrowly on Model Fit in Factor Analysis Can Mask Construct Heterogeneity and Model Misspecification: Applied Demonstrations across Sample and Assessment Types. J Pers Assess 2023;105:1-13. [PMID: 35286224] [DOI: 10.1080/00223891.2022.2047060]
[Received: 08/05/2021] [Accepted: 02/09/2022]
Abstract
This study builds on research indicating that focusing narrowly on model fit when evaluating factor analytic models can lead to problematic inferences about the nature of item sets, as well as about how models should be applied to inform measure development and validation. To advance research in this area, we present concrete examples relevant to researchers in clinical, personality, and related subfields that highlight two specific scenarios in which overreliance on model fit may be problematic. Specifically, we present data analytic examples showing that focusing narrowly on model fit may lead to (a) incorrect conclusions that heterogeneous item sets reflect narrower homogeneous constructs and (b) the retention of potentially problematic items when developing assessment measures. We use both interview data from adult outpatients (N = 2,149) and self-report data from adults recruited online (N = 547) to demonstrate the importance of these issues across sample types and assessment methods. Following demonstrations with these data, we recommend that other model characteristics (e.g., factor loading patterns, the content and nature of factor indicators) be considered alongside the information provided by model fit indices when evaluating factor analytic models.