DeCarlo M, Bean G. Assessing Measurement Invariance in ASWB Exams: Regulatory Research Proposal to Advance Equity. Journal of Evidence-Based Social Work. 2024;21:214-235. [PMID: 38345106] [DOI: 10.1080/26408066.2024.2308814]
Abstract
PURPOSE
Social workers from minoritized racial, ethnic, linguistic, and age groups are far less likely to pass licensing examinations required to practice. Using a simulated data set, our study investigates measurement equivalence, or invariance, of social work licensing exams.
MATERIALS
For this analysis, we simulated responses to 15 multiple-choice questions, scored as either correct or incorrect, and fit a two-parameter logistic (2PL) model to the response data using the R mirt package. We generated the data so that five items demonstrated differential item functioning (DIF) and calculated their impact on the test characteristic curve and the item characteristic curves.
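The abstract describes the simulation in R's mirt package. As a rough Python analogue of that data-generating step (the item parameters, group sizes, and 0.4 difficulty shift below are illustrative assumptions, not the authors' values), one might write:

```python
# Sketch of a 2PL simulation with uniform DIF on five items.
# All numbers are assumptions for illustration; the paper's actual
# simulation was done with the R `mirt` package.
import numpy as np

rng = np.random.default_rng(42)

N_ITEMS, N_PER_GROUP, N_DIF = 15, 1000, 5

# 2PL item parameters: discrimination a, difficulty b.
a = rng.uniform(0.8, 2.0, N_ITEMS)
b = rng.normal(0.0, 1.0, N_ITEMS)

# Uniform DIF: the first five items are harder for the focal group.
dif_shift = np.zeros(N_ITEMS)
dif_shift[:N_DIF] = 0.4  # assumed shift in difficulty

def p_correct(theta, a, b):
    """2PL item response function: P(correct | theta) per person x item."""
    return 1.0 / (1.0 + np.exp(-a[None, :] * (theta[:, None] - b[None, :])))

# Both groups drawn from the same ability distribution, so any score
# gap reflects DIF rather than a true ability difference.
theta_ref = rng.normal(0.0, 1.0, N_PER_GROUP)
theta_foc = rng.normal(0.0, 1.0, N_PER_GROUP)

resp_ref = rng.random((N_PER_GROUP, N_ITEMS)) < p_correct(theta_ref, a, b)
resp_foc = rng.random((N_PER_GROUP, N_ITEMS)) < p_correct(theta_foc, a, b + dif_shift)

print(resp_ref.mean(axis=0).round(2))  # proportion correct per item, reference group
```

The resulting 0/1 response matrices are what an IRT package such as mirt would then calibrate to estimate item parameters per group.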
RESULTS
Small amounts of differential item functioning accumulated into differential test functioning, though the overall effect size was small. This result is one potential outcome of an analysis of ASWB exams.
DISCUSSION
Most studies evaluating test characteristic curves demonstrate small effect sizes. Examining the test characteristic curve and the test information curve will help investigate content-irrelevant sources of variance in the exams, including unfairness, unreliability, and invalid pass scores.
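One way to see small item-level DIF accumulating into differential test functioning is to compare the two groups' test characteristic curves (TCCs) directly. This numpy sketch uses assumed item parameters and a simple signed-area summary; it illustrates the general technique, not the authors' specific analysis:

```python
# Compare test characteristic curves between a reference and a focal
# group. Item parameters and the 0.4 DIF shift are illustrative
# assumptions, not values from the paper.
import numpy as np

a = np.full(15, 1.2)                 # assumed common discrimination
b_ref = np.linspace(-1.5, 1.5, 15)   # reference-group difficulties
b_foc = b_ref.copy()
b_foc[:5] += 0.4                     # five items with uniform DIF

theta = np.linspace(-4.0, 4.0, 401)  # ability grid

def tcc(theta, a, b):
    """Test characteristic curve: expected total score at each theta."""
    p = 1.0 / (1.0 + np.exp(-a[None, :] * (theta[:, None] - b[None, :])))
    return p.sum(axis=1)

# Pointwise gap in expected total score between the groups.
diff = tcc(theta, a, b_ref) - tcc(theta, a, b_foc)

# Signed area between the curves over the theta grid (simple
# Riemann sum), one common summary of differential test functioning.
sdtf = diff.sum() * (theta[1] - theta[0])

print(f"max TCC gap: {diff.max():.2f} points; signed area: {sdtf:.2f}")
```

Under these assumed parameters the maximum expected-score gap is well under one point on a 15-item test, consistent with the abstract's claim that small per-item DIF can sum to a measurable but small test-level effect.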
CONCLUSION
Differential test functioning is a core part of measurement invariance studies. Psychometric standards require test developers to assess measurement invariance at both the item and test levels to defend against claims of bias.