1. Brown MI, Heck PR, Chabris CF. The Social Shapes Test as a Self-Administered, Online Measure of Social Intelligence: Two Studies with Typically Developing Adults and Adults with Autism Spectrum Disorder. J Autism Dev Disord 2024; 54:1804-1819. PMID: 36757539; PMCID: PMC9909157; DOI: 10.1007/s10803-023-05901-2.
Abstract
The Social Shapes Test (SST) is a measure of social intelligence that does not use human faces or rely on extensive verbal ability. The SST has shown promising validity among adults without autism spectrum disorder (ASD), but it is uncertain whether it is suitable for adults with ASD. We find measurement invariance between adults with ASD (n = 229) and adults without ASD (n = 1,049) on the 23-item SST. We also find that adults without ASD score higher on the SST than adults with ASD (d = 0.21). We also provide two 14-item versions, which demonstrated good parallel test-retest reliability and are positively related to scores on the Frith-Happé task. The SST is suitable for remote, online research studies.
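The group difference reported here (d = 0.21) is a Cohen's d standardized mean difference. As an illustration only (hypothetical scores, not the study's data), the pooled-SD formulation can be sketched as:

```python
from statistics import mean, stdev

def cohens_d(group_a, group_b):
    """Standardized mean difference (Cohen's d) using the pooled SD."""
    na, nb = len(group_a), len(group_b)
    va, vb = stdev(group_a) ** 2, stdev(group_b) ** 2
    # Pool the two sample variances, weighting by degrees of freedom.
    pooled_sd = (((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)) ** 0.5
    return (mean(group_a) - mean(group_b)) / pooled_sd

# Hypothetical SST sum scores for two groups:
d = cohens_d([18, 16, 20, 15, 17], [16, 15, 18, 14, 16])
```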
Affiliation(s)
- Matt I Brown
- Geisinger Health System, Lewisburg, PA, USA.
- Human Resources Research Organization, 66 Canal Center Plaza, Suite 700, 22314, Alexandria, VA, USA.
2. Martin-Kowal J, Wiernik B, Carretta TR, Coovert MD. Development of a serious gaming approach for cyber aptitude assessment. Military Psychology 2024; 36:3-15. PMID: 38193874; PMCID: PMC10802808; DOI: 10.1080/08995605.2021.1984740.
Abstract
Numerous traditional assessments have been developed to determine the suitability of US military recruits for cyber careers. Cyber career field managers expressed concern that there may be well-qualified candidates who lack cyber knowledge and therefore are not identified by knowledge-based tests. Technological advances such as serious gaming may provide opportunities to assess constructs that traditional methods do not effectively measure. The purpose of this effort was to identify potential gains in validity, beyond traditional methods, that could be achieved through the use of serious games for several cyber jobs (both enlisted and officer positions). In this phase of the research, an extensive literature review of military and civilian assessments targeted cyber occupations. Military subject matter experts in these career fields then provided input and guidance (e.g., focus on aptitudes and traits, as knowledge and skill are rapidly outdated). A gap analysis across all measures of these constructs identified a short list of candidates for measurement in a serious game. A survey was conducted among 800 airmen in the 1N4X1A, 3D1X2, and 17DEX/SX career fields; responses from 290 airmen identified six constructs as the focus for serious game assessment. The game was developed, and the constructs were validated on a sample chosen to model Air Force enlisted recruits. Additional psychometric data from enlistees and cyber trainees will be gathered once COVID-19 restrictions are lifted.
Affiliation(s)
- Jaclyn Martin-Kowal
- MDC & Associates, Tampa, Florida, USA
- Personnel Decisions Research Institute, Arlington, Virginia, USA
- Brenton Wiernik
- MDC & Associates, Tampa, Florida, USA
- Department of Psychology, University of South Florida, Tampa, Florida, USA
- Michael D. Coovert
- MDC & Associates, Tampa, Florida, USA
- Department of Psychology, University of South Florida, Tampa, Florida, USA
3. Tuerk C, Saha T, Bouchard MF, Booij L. Computerized Cognitive Test Batteries for Children and Adolescents-A Scoping Review of Tools for Lab- and Web-Based Settings From 2000 to 2021. Arch Clin Neuropsychol 2023; 38:1683-1710. PMID: 37259540; PMCID: PMC10681451; DOI: 10.1093/arclin/acad039.
Abstract
OBJECTIVE Cognitive functioning is essential to well-being. Since cognitive difficulties are common in many disorders, their early identification is critical, notably during childhood and adolescence. This scoping review aims to provide a comprehensive overview of the literature on computerized cognitive test batteries (CCTB) developed and used with children and adolescents over the past 22 years and to evaluate their psychometric properties. METHOD Among 3192 records identified from three databases (PubMed, PsycNET, and Web of Science) between 2000 and 2021, 564 peer-reviewed articles on children and adolescents aged 3 to 18 years met the inclusion criteria. Twenty main CCTBs were identified and further reviewed following PRISMA guidelines. Relevant study details (sample information, topic, location, setting, norms, and psychometrics) were extracted, as well as administration and instrument characteristics for the main CCTBs. RESULTS Findings suggest that CCTB use varies by age, location, and topic, with eight tools accounting for 85% of studies and the Cambridge Neuropsychological Test Automated Battery (CANTAB) being the most frequently used. Few instruments have been applied in web-based settings or include social cognition tasks. Only 13% of studies reported psychometric properties. CONCLUSIONS Over the past two decades, a large number of computerized cognitive batteries have been developed. Among these, more validation studies are needed, particularly across diverse cultural contexts. This review offers a comprehensive synthesis of CCTBs to help both researchers and clinicians conduct cognitive assessments of children in either lab- or web-based settings.
Affiliation(s)
- Carola Tuerk
- Department of Psychology, Concordia University, 7141 Sherbrooke Street West, Montreal, QC H4B 1R6, Canada
- Sainte-Justine Hospital Research Center, 3175 Côte-Sainte-Catherine Road, Montreal, QC H3T 1C5, Canada
- Trisha Saha
- Department of Environmental and Occupational Health, School of Public Health, University of Montreal, 7101 Park Avenue, Montreal, QC H3N 1X9, Canada
- Maryse F Bouchard
- Sainte-Justine Hospital Research Center, 3175 Côte-Sainte-Catherine Road, Montreal, QC H3T 1C5, Canada
- Department of Environmental and Occupational Health, School of Public Health, University of Montreal, 7101 Park Avenue, Montreal, QC H3N 1X9, Canada
- Institut National de la Recherche Scientifique, 531 des Prairies Blvd, Laval, QC H7V 1B7, Canada
- Linda Booij
- Department of Psychology, Concordia University, 7141 Sherbrooke Street West, Montreal, QC H4B 1R6, Canada
- Sainte-Justine Hospital Research Center, 3175 Côte-Sainte-Catherine Road, Montreal, QC H3T 1C5, Canada
- Department of Psychiatry and Addictology, University of Montreal, 2900 Boulevard Edouard Montpetit, Montreal, QC H3T 1J4, Canada
- Department of Psychiatry, McGill University, 1033 Pine Avenue West, Montreal, Quebec H3A 1A1, Canada
- Research Centre, Douglas Mental Health University Institute, 6875 Boulevard LaSalle, Verdun, QC H4H 1R3, Canada
4. Leutner F, Codreanu SC, Brink S, Bitsakis T. Game based assessments of cognitive ability in recruitment: Validity, fairness and test-taking experience. Front Psychol 2023; 13:942662. PMID: 36743642; PMCID: PMC9891208; DOI: 10.3389/fpsyg.2022.942662.
Abstract
Gamification and machine learning are emergent technologies in recruitment, promising to improve the user experience and fairness of assessments. We test this by validating a game based assessment of cognitive ability with a machine learning based scoring algorithm optimised for validity and fairness. We use applied data from 11,574 assessment completions. The assessment has convergent validity (r = 0.5) and test-retest reliability (r = 0.68). It maintains fairness in a separate sample of 3,107 job applicants, showing that fairness-optimised machine learning can improve outcome parity issues with cognitive ability tests in recruitment settings. We show that there are no significant gender differences in test taking anxiety resulting from the games, and that anxiety does not directly predict game performance, supporting the notion that game based assessments help with test taking anxiety. Interactions between anxiety, gender and performance are explored. Feedback from 4,778 job applicants reveals a Net Promoter Score of 58, indicating that more applicants support than dislike the assessment and that games deliver a positive applicant experience in practice. Satisfaction with the format is high, but applicants raise face validity concerns over the abstract games. We encourage the use of gamification and machine learning to improve the fairness and user experience of psychometric tests.
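The Net Promoter Score of 58 follows the standard NPS arithmetic: the percentage of promoters (ratings 9-10) minus the percentage of detractors (ratings 0-6). A minimal sketch with hypothetical ratings:

```python
def net_promoter_score(ratings):
    """NPS on a 0-10 scale: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return round(100 * (promoters - detractors) / len(ratings))

# Hypothetical applicant feedback ratings:
score = net_promoter_score([10, 9, 9, 10, 8, 7, 10, 3, 9, 10])
```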
Affiliation(s)
- Franziska Leutner
- Institute of Management Studies, Goldsmiths, University of London, London, United Kingdom
- HireVue, Inc., Salt Lake City, UT, United States
5. Ramos-Villagrasa PJ, Fernández-del-Río E, Castro Á. Game-related assessments for personnel selection: A systematic review. Front Psychol 2022; 13:952002. PMID: 36248590; PMCID: PMC9554090; DOI: 10.3389/fpsyg.2022.952002.
Abstract
Industrial development in recent decades has led to the use of information and communication technologies (ICT) to support personnel selection processes. One of the most notable examples is game-related assessments (GRA), claimed to be as accurate as conventional tests while generating better applicant reactions and reducing the likelihood of adverse impact and faking. However, such claims still lack scientific support. Given practitioners' increasing use of GRA, this article reviews the scientific literature on gamification applied to personnel selection to determine whether the current state of the art supports their use in professional practice and to identify specific aspects on which future research should focus. Following the PRISMA model, a search was carried out in the Web of Science and Scopus databases, identifying 34 valid articles, of which 85.3% are empirical studies that analyze five areas: (1) validity; (2) applicant reactions; (3) design of GRA; (4) personal characteristics and GRA; and (5) adverse impact and faking. Together, these studies show that GRA can be used in personnel selection, but that the supposed advantages of GRA over conventional tests are fewer than imagined. The results also suggest several aspects on which research should focus (e.g., construct validity, differences depending on the type of game, prediction of different job performance dimensions), which could help define the situations in which the use of GRA may be recommended.
Affiliation(s)
- Pedro J. Ramos-Villagrasa
- Department of Psychology and Sociology, Universidad de Zaragoza, Zaragoza, Spain
- Ángel Castro
- Department of Psychology and Sociology, Universidad de Zaragoza, Teruel, Spain
6. Nye CD. Assessing Interests in the Twenty-First-Century Workforce: Building on a Century of Interest Measurement. Annual Review of Organizational Psychology and Organizational Behavior 2022. DOI: 10.1146/annurev-orgpsych-012420-083120.
Abstract
Recent research has re-emphasized the importance of vocational interests for understanding workplace attitudes and behavior. As a result, there is a renewed interest in the assessment of vocational interests in organizations. Numerous interest assessments have been developed over the past century, and they are now administered to millions of people throughout the world. Nevertheless, there is still work to be done, particularly as interest assessments are increasingly being used in organizational settings. This article reviews developments in interest assessments and discusses the implications of their use for both research and practice. It discusses the advantages and disadvantages of examining vocational interests in organizational contexts and proposes future research directions.
Affiliation(s)
- Christopher D. Nye
- Department of Psychology, College of Social Science, Michigan State University, East Lansing, Michigan, USA
7. Koch M, Becker N, Spinath FM, Greiff S. Assessing intelligence without intelligence tests. Future perspectives. Intelligence 2021. DOI: 10.1016/j.intell.2021.101596.
8. al‐Qallawi S, Raghavan M. A review of online reactions to game‐based assessment mobile applications. International Journal of Selection and Assessment 2021. DOI: 10.1111/ijsa.12346.
Affiliation(s)
- Sherif al‐Qallawi
- School of Psychology, Florida Institute of Technology, Melbourne, Florida, USA
- Mukhunth Raghavan
- Department of Psychology, University of South Florida, Tampa, Florida, USA
9. Potočnik K, Anderson NR, Born M, Kleinmann M, Nikolaou I. Paving the way for research in recruitment and selection: recent developments, challenges and future opportunities. European Journal of Work and Organizational Psychology 2021. DOI: 10.1080/1359432x.2021.1904898.
Affiliation(s)
- Marise Born
- School of Social and Behavioural Sciences, Erasmus University Rotterdam, Rotterdam, The Netherlands
- Martin Kleinmann
- Department of Psychology, Work and Organisational Psychology, University of Zürich, Switzerland
- Ioannis Nikolaou
- Department of Management Science and Technology, Athens University of Economics and Business, Athens, Greece
10. Landers RN, Marin S. Theory and Technology in Organizational Psychology: A Review of Technology Integration Paradigms and Their Effects on the Validity of Theory. Annual Review of Organizational Psychology and Organizational Behavior 2021. DOI: 10.1146/annurev-orgpsych-012420-060843.
Abstract
Despite the centrality of technology to understanding how humans in organizations think, feel, and behave, researchers in organizational psychology and organizational behavior even now often avoid theorizing about it. In our review, we identify four major paradigmatic approaches in theoretical approaches to technology, which typically occur in sequence: technology-as-context, technology-as-causal, technology-as-instrumental, and technology-as-designed. Each paradigm describes a typically implicit philosophical orientation toward technology as demonstrated through choices about theory development and research design. Of these approaches, one is unnecessarily limited and two are harmful oversimplifications that we contend have systematically weakened the quality of theory across our discipline. As such, we argue that to avoid creating impractical and even inaccurate theory, researchers must explicitly model technology design. To facilitate this shift, we define technology, present our paradigmatic framework, explain the framework's importance, and provide recommendations across five key domains: personnel selection, training and development, performance management and motivation, groups and teams, and leadership.
Affiliation(s)
- Richard N. Landers
- Department of Psychology, College of Liberal Arts, University of Minnesota, Minneapolis, Minnesota 55455, USA
- Sebastian Marin
- Department of Psychology, College of Liberal Arts, University of Minnesota, Minneapolis, Minnesota 55455, USA
11. Woods SA, Ahmed S, Nikolaou I, Costa AC, Anderson NR. Personnel selection in the digital age: a review of validity and applicant reactions, and future research challenges. European Journal of Work and Organizational Psychology 2019. DOI: 10.1080/1359432x.2019.1681401.
Affiliation(s)
- Sara Ahmed
- Surrey Business School, University of Surrey, Guildford, UK
- Ioannis Nikolaou
- Department of Management Science & Technology, Athens University of Economics & Business, Athens, Greece
- Neil R. Anderson
- Bradford School of Management, University of Bradford, Bradford, UK
12. Ryan AM, Derous E. The Unrealized Potential of Technology in Selection Assessment. Revista de Psicología del Trabajo y de las Organizaciones 2019. DOI: 10.5093/jwop2019a10.
13. Hermes M, Albers F, Böhnke JR, Huelmann G, Maier J, Stelling D. Measurement and structural invariance of cognitive ability tests after computer-based training. Computers in Human Behavior 2019. DOI: 10.1016/j.chb.2018.11.040.
14. Aguado D, Vidal A, Olea J, Ponsoda V, Barrada JR, Abad FJ. Cheating on Unproctored Internet Test Applications: An Analysis of a Verification Test in a Real Personnel Selection Context. The Spanish Journal of Psychology 2018; 21:E62. PMID: 30501646; DOI: 10.1017/sjp.2018.50.
Abstract
This study analyses the extent to which cheating occurs in a real selection setting. A two-stage, unproctored and proctored, test administration was considered. Test score inconsistencies were detected by applying a verification test (the Guo and Drasgow Z-test). An initial simulation study showed that the Z-test has adequate Type I error and power rates in the specific selection settings explored. A second study applied the Z-test verification procedure to a sample of 954 employment candidates. Additional external evidence based on item response times for the verification items was gathered. The results revealed good performance of the Z-test statistic and a relatively low, but non-negligible, number of suspected cheaters who showed inflated ability estimates. The study with real data provided additional information on the presence of suspected cheating in unproctored applications and on the viability of using item response times as additional evidence of cheating. In the verification test, suspected cheaters spent 5.78 seconds per item more than expected given the item difficulty and their assumed ability from the unproctored stage. The percentage of suspected cheaters in the empirical study was estimated at 13.84%. In summary, the study provides evidence of the usefulness of the Z-test for detecting cheating in a specific setting, in which a computerized adaptive test of English grammar knowledge was used for personnel selection.
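The response-time evidence above rests on comparing observed item times with the time expected given item difficulty and the examinee's assumed ability. As a simplified illustration (not the Guo and Drasgow Z-test itself, and with hypothetical residuals), examinees whose mean time residual is unusually large could be flagged like this:

```python
from statistics import mean, stdev

def flag_slow_responders(rt_residuals, z_cut=2.0):
    """Flag examinees whose mean response-time residual (observed minus
    expected seconds per item) is unusually large relative to the group.
    Illustrative only; the study used the Guo and Drasgow Z-test."""
    mu, sd = mean(rt_residuals), stdev(rt_residuals)
    return [i for i, r in enumerate(rt_residuals) if (r - mu) / sd > z_cut]

# Hypothetical per-examinee mean residuals (seconds per item):
flagged = flag_slow_responders([0.1, -0.2, 0.0, 0.3, 6.0, -0.1, 0.2, 0.1, -0.3, 0.0])
```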
15. Leutner F, Chamorro-Premuzic T. Stronger Together: Personality, Intelligence and the Assessment of Career Potential. J Intell 2018; 6:49. PMID: 31162476; PMCID: PMC6480750; DOI: 10.3390/jintelligence6040049.
Abstract
Personality and intelligence have a long history in applied psychology, with research dating back more than 100 years. In line with this, early developments in industrial-organizational psychology were largely founded on the predictive power of personality and intelligence measures vis-à-vis career-related outcomes. However, despite a wealth of evidence in support of their utility, the concepts, theories, and measures of personality and intelligence are still widely underutilized in organizations, even when these express a commitment to making data-driven decisions about employees and leaders. This paper discusses the value of personality and intelligence for understanding individual differences in career potential, and how to increase the adoption of theories and tools for evaluating personality and intelligence in real-world organizational contexts. Although personality and intelligence are distinct constructs, the assessment of career potential is incomplete without both.
Affiliation(s)
- Franziska Leutner
- Department of Psychology and Language Science, University College London, 26 Bedford Way, London WC1H0AP, UK.
| | - Tomas Chamorro-Premuzic
- Department of Psychology and Language Science, University College London, 26 Bedford Way, London WC1H0AP, UK.
16. Ryan AM, Reeder MC, Golubovich J, Grand J, Inceoglu I, Bartram D, Derous E, Nikolaou I, Yao X. Culture and Testing Practices: Is the World Flat? Applied Psychology: An International Review 2017. DOI: 10.1111/apps.12095.
Affiliation(s)
- Dave Bartram
- CEB's SHL Talent Measurement Solutions, and University of Pretoria, South Africa
- Xiang Yao
- School of Psychological and Cognitive Sciences and Beijing Key Laboratory of Behavior and Mental Health, Peking University, China
17. Brown MI, Grossenbacher MA. Can you test me now? Equivalence of GMA tests on mobile and non-mobile devices. International Journal of Selection and Assessment 2017. DOI: 10.1111/ijsa.12160.
18. Cascio WF, Montealegre R. How Technology Is Changing Work and Organizations. Annual Review of Organizational Psychology and Organizational Behavior 2016. DOI: 10.1146/annurev-orgpsych-041015-062352.
Affiliation(s)
- Wayne F. Cascio
- The Business School, University of Colorado Denver, Denver, Colorado 80217, USA
- Ramiro Montealegre
- Leeds School of Business, University of Colorado Boulder, Boulder, Colorado 80309, USA
19. Gnambs T, Kaspar K. Socially Desirable Responding in Web-Based Questionnaires: A Meta-Analytic Review of the Candor Hypothesis. Assessment 2016; 24:746-762. PMID: 26739360; DOI: 10.1177/1073191115624547.
Abstract
Unproctored, web-based assessments supposedly reduce social desirability distortions in self-report questionnaires because of an increased sense of privacy among participants. Three random-effects meta-analyses focusing either on social desirability (k = 30, total N = 3,746), the Big Five of personality (k = 66, total N = 2,951), or psychopathology (k = 96, total N = 16,034) compared social desirability distortions of self-reports across computerized and paper-and-pencil administration modes. Overall, a near-zero effect, Δ = 0.01, was obtained that did not indicate less socially desirable responding in computerized assessments. Moreover, moderator analyses did not identify differential effects for proctored and unproctored procedures. Thus, paper-and-pencil and computerized administrations of self-report scales yield comparable mean scores. Unproctored web-based surveys do not offer an advantage with regard to socially desirable responding in self-report questionnaires.
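The pooled effect reported above (Δ = 0.01) comes from random-effects meta-analysis. A minimal sketch of one standard pooling procedure, DerSimonian-Laird (hypothetical per-study effects and variances; the review's exact estimator is not specified here):

```python
def random_effects_pool(effects, variances):
    """DerSimonian-Laird random-effects pooled effect size (sketch)."""
    k = len(effects)
    w = [1.0 / v for v in variances]            # inverse-variance weights
    fixed = sum(wi * y for wi, y in zip(w, effects)) / sum(w)
    # Cochran's Q heterogeneity statistic around the fixed-effect mean.
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)          # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]
    return sum(wi * y for wi, y in zip(w_star, effects)) / sum(w_star)

# Hypothetical study-level standardized differences and variances:
delta = random_effects_pool([0.05, -0.02, 0.01], [0.02, 0.03, 0.01])
```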
Affiliation(s)
- Timo Gnambs
- Leibniz Institute for Educational Trajectories, Bamberg, Germany
- Kai Kaspar
- University of Cologne, Cologne, Germany