1.
Harris C, Tang Y, Birnbaum E, Cherian C, Mendhe D, Chen MH. Digital Neuropsychology beyond Computerized Cognitive Assessment: Applications of Novel Digital Technologies. Arch Clin Neuropsychol 2024; 39:290-304. [PMID: 38520381] [DOI: 10.1093/arclin/acae016]
Abstract
Compared with other health disciplines, there is a stagnation in technological innovation in the field of clinical neuropsychology. Traditional paper-and-pencil tests have a number of shortcomings, such as low-frequency data collection and limitations in ecological validity. While computerized cognitive assessment may help overcome some of these issues, current computerized paradigms do not address the majority of these limitations. In this paper, we review recent literature on the applications of novel digital health approaches, including ecological momentary assessment, smartphone-based assessment and sensors, wearable devices, passive driving sensors, smart homes, voice biomarkers, and electronic health record mining, in neurological populations. We describe how each digital tool may be applied to neurologic care and overcome limitations of traditional neuropsychological assessment. Ethical considerations, limitations of current research, as well as our proposed future of neuropsychological practice are also discussed.
Affiliation(s)
- Che Harris
  - Institute for Health, Health Care Policy and Aging Research, Rutgers University, New Brunswick, NJ, USA
  - Department of Neurology, Robert Wood Johnson Medical School, Rutgers University, New Brunswick, NJ, USA
- Yingfei Tang
  - Institute for Health, Health Care Policy and Aging Research, Rutgers University, New Brunswick, NJ, USA
  - Department of Neurology, Robert Wood Johnson Medical School, Rutgers University, New Brunswick, NJ, USA
- Eliana Birnbaum
  - Institute for Health, Health Care Policy and Aging Research, Rutgers University, New Brunswick, NJ, USA
- Christine Cherian
  - Institute for Health, Health Care Policy and Aging Research, Rutgers University, New Brunswick, NJ, USA
- Dinesh Mendhe
  - Institute for Health, Health Care Policy and Aging Research, Rutgers University, New Brunswick, NJ, USA
- Michelle H Chen
  - Institute for Health, Health Care Policy and Aging Research, Rutgers University, New Brunswick, NJ, USA
  - Department of Neurology, Robert Wood Johnson Medical School, Rutgers University, New Brunswick, NJ, USA
2.
Binoy S, Lithwick Algon A, Ben Adiva Y, Montaser-Kouhsari L, Saban W. Online cognitive testing in Parkinson's disease: advantages and challenges. Front Neurol 2024; 15:1363513. [PMID: 38651103] [PMCID: PMC11034553] [DOI: 10.3389/fneur.2024.1363513]
Abstract
Parkinson's disease (PD) is primarily characterized by motor symptoms. Yet, many people with PD experience cognitive decline, which often goes unnoticed by clinicians although it may have a significant impact on quality of life. For over half a century, traditional in-person PD cognitive assessment has lacked accessibility, scalability, and specificity due to its inherent limitations. In this review, we propose that novel methods of online cognitive assessment could potentially address these limitations. We first outline the challenges of traditional in-person cognitive testing in PD. We then summarize the existing literature on online cognitive testing in PD. Finally, we explore the advantages, but also the limitations, of three major processes involved in online PD cognitive testing: recruitment and sampling methods, measurement and participation, and disease monitoring and management. Taking the limitations into account, we aim to highlight the potential of online cognitive testing as a more accessible and efficient approach to cognitive testing in PD.
Affiliation(s)
- Sharon Binoy
  - Loyola Stritch School of Medicine, Maywood, IL, United States
  - Center for Accessible Neuropsychology and Sagol School of Neuroscience, Tel Aviv University, Tel Aviv, Israel
  - Department of Occupational Therapy, Faculty of Medical & Health Sciences, Tel Aviv University, Tel Aviv, Israel
- Avigail Lithwick Algon
  - Center for Accessible Neuropsychology and Sagol School of Neuroscience, Tel Aviv University, Tel Aviv, Israel
  - Department of Occupational Therapy, Faculty of Medical & Health Sciences, Tel Aviv University, Tel Aviv, Israel
- Yoad Ben Adiva
  - Center for Accessible Neuropsychology and Sagol School of Neuroscience, Tel Aviv University, Tel Aviv, Israel
  - Department of Occupational Therapy, Faculty of Medical & Health Sciences, Tel Aviv University, Tel Aviv, Israel
- Leila Montaser-Kouhsari
  - Department of Neurology, Brigham and Women's Hospital, Harvard University, Boston, MA, United States
- William Saban
  - Center for Accessible Neuropsychology and Sagol School of Neuroscience, Tel Aviv University, Tel Aviv, Israel
  - Department of Occupational Therapy, Faculty of Medical & Health Sciences, Tel Aviv University, Tel Aviv, Israel
3.
Klaming L, Spaltman M, Vermeent S, van Elswijk G, Miller JB, Schmand B. Test-retest reliability and reliable change index of the Philips IntelliSpace Cognition digital test battery. Clin Neuropsychol 2024:1-19. [PMID: 38360593] [DOI: 10.1080/13854046.2024.2315747]
Abstract
OBJECTIVE This article provides the test-retest reliability and Reliable Change Indices (RCIs) of the Philips IntelliSpace Cognition (ISC) platform, which contains digitized versions of well-established neuropsychological tests. METHOD 147 participants (ages 19 to 88) completed a digital cognitive test battery on the ISC platform or paper-pencil versions of the same test battery during two separate visits. Intraclass correlation coefficients (ICC) were calculated separately for the ISC and analog test versions to compare reliabilities between administration modalities. RCIs were calculated for the digital tests using the practice-adjusted RCI and standardized regression-based (SRB) method. RESULTS Test-retest reliabilities for the ISC tests ranged from moderate to excellent and were comparable to the test-retest reliabilities for the paper-pencil tests. Baseline test performance, retest interval, age, and education predicted test performance at visit 2 with baseline test performance being the strongest predictor for all outcome measures. For most outcome measures, both methods for the calculation of RCIs show agreement on whether or not a reliable change was observed. CONCLUSIONS RCIs for the digital tests enable clinicians to determine whether a measured change between assessments is due to real improvement or decline. Together, this contributes to the growing evidence for the clinical utility of the ISC platform.
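The two change metrics named in this abstract are standard and straightforward to sketch. The snippet below is a generic illustration of a practice-adjusted RCI and an SRB-style change score, not the article's actual implementation; all numeric inputs are invented.

```python
import math

def practice_adjusted_rci(x1, x2, mean_change, sd_baseline, r_xx):
    """Practice-adjusted Reliable Change Index.

    x1, x2       -- one examinee's scores at visits 1 and 2
    mean_change  -- mean practice effect (visit 2 minus visit 1) in the norm sample
    sd_baseline  -- SD of baseline scores in the norm sample
    r_xx         -- test-retest reliability (e.g. an ICC)
    """
    sem = sd_baseline * math.sqrt(1.0 - r_xx)   # standard error of measurement
    se_diff = math.sqrt(2.0) * sem              # standard error of the difference
    return (x2 - x1 - mean_change) / se_diff

def srb_change_z(x2, predicted_x2, se_estimate):
    """Standardized regression-based (SRB) change score: the observed retest
    score versus the score predicted from baseline performance and covariates."""
    return (x2 - predicted_x2) / se_estimate

# |z| > 1.645 (a 90% interval) is a common cut-off for calling a change reliable.
rci_example = practice_adjusted_rci(x1=50, x2=45, mean_change=2.0,
                                    sd_baseline=10.0, r_xx=0.84)
```

In the SRB variant, the prediction model can include baseline score, retest interval, age, and education, which matches the predictors the article reports.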
Affiliation(s)
- Laura Klaming
  - Digital Cognitive Diagnostics, Philips Healthcare, Eindhoven, The Netherlands
- Mandy Spaltman
  - Digital Cognitive Diagnostics, Philips Healthcare, Eindhoven, The Netherlands
- Stefan Vermeent
  - Digital Cognitive Diagnostics, Philips Healthcare, Eindhoven, The Netherlands
- Gijs van Elswijk
  - Digital Cognitive Diagnostics, Philips Healthcare, Eindhoven, The Netherlands
- Justin B Miller
  - Cleveland Clinic Lou Ruvo Center for Brain Health, Las Vegas, NV, USA
- Ben Schmand
  - Digital Cognitive Diagnostics, Philips Healthcare, Eindhoven, The Netherlands
4.
Laera G, Hering A, Kliegel M. Assessing time-based prospective memory online: A comparison study between laboratory-based and web-based testing. Q J Exp Psychol (Hove) 2024:17470218231220578. [PMID: 38053325] [DOI: 10.1177/17470218231220578]
Abstract
Prospective memory (PM, i.e., the ability to remember and perform future intentions) is assessed mainly within laboratory settings; however, in the last two decades, several studies have started testing PM online. Most of these studies focused on event-based PM (EBPM), and only a few assessed time-based PM (TBPM), possibly because timekeeping is difficult to control or standardise without experimental oversight. Thus, it is still unclear whether time monitoring patterns in online studies replicate the typical patterns obtained in laboratory tasks. In this study, we therefore aimed to investigate whether the behavioural outcome measures obtained from the traditional TBPM paradigm in the laboratory (accuracy and time monitoring) are comparable with those from an online version in a sample of 101 younger adults. Results showed no significant difference in TBPM performance between the laboratory and online settings, and no difference in time monitoring. However, participants were somewhat faster and more accurate at the ongoing task during the laboratory assessment, although those differences were not related to holding an intention in mind. The findings suggest that, although participants seemed generally more distracted when tested remotely, online assessment yielded results similar to the laboratory assessment in key temporal characteristics and behavioural performance. The results are discussed in terms of possible conceptual and methodological implications for online testing.
Affiliation(s)
- Gianvito Laera
  - Cognitive Aging Lab (CAL), Faculty of Psychology and Educational Sciences, University of Geneva, Geneva, Switzerland
  - Centre for the Interdisciplinary Study of Gerontology and Vulnerability, University of Geneva, Geneva, Switzerland
  - LIVES-Overcoming Vulnerability: Life Course Perspectives, Swiss National Centre of Competence in Research, Lausanne, Switzerland
- Alexandra Hering
  - Cognitive Aging Lab (CAL), Faculty of Psychology and Educational Sciences, University of Geneva, Geneva, Switzerland
  - Centre for the Interdisciplinary Study of Gerontology and Vulnerability, University of Geneva, Geneva, Switzerland
  - Department of Developmental Psychology, Tilburg School for Social and Behavioral Sciences, Tilburg University, Tilburg, The Netherlands
- Matthias Kliegel
  - Cognitive Aging Lab (CAL), Faculty of Psychology and Educational Sciences, University of Geneva, Geneva, Switzerland
  - Centre for the Interdisciplinary Study of Gerontology and Vulnerability, University of Geneva, Geneva, Switzerland
  - LIVES-Overcoming Vulnerability: Life Course Perspectives, Swiss National Centre of Competence in Research, Lausanne, Switzerland
5.
Alegret M, García-Gutiérrez F, Muñoz N, Espinosa A, Ortega G, Lleonart N, Rodríguez I, Rosende-Roca M, Pytel V, Cantero-Fortiz Y, Rentz DM, Marquié M, Valero S, Ruiz A, Butler C, Boada M. FACEmemory®, an Innovative Online Platform for Episodic Memory Pre-Screening: Findings from the First 3,000 Participants. J Alzheimers Dis 2024; 97:1173-1187. [PMID: 38217602] [DOI: 10.3233/jad-230983]
Abstract
BACKGROUND The FACEmemory® online platform comprises a complex memory test and sociodemographic, medical, and family questions. This is the first study of a completely self-administered memory test with voice recognition, pre-tested in a memory clinic, sensitive to Alzheimer's disease, using information and communication technologies, and offered freely worldwide. OBJECTIVE To investigate the demographic and clinical variables associated with the total FACEmemory score, and to identify distinct patterns of memory performance on FACEmemory. METHODS Data from the first 3,000 subjects who completed the FACEmemory test were analyzed. Descriptive analyses were applied to demographic, FACEmemory, and medical and family variables; t-test and chi-square analyses were used to compare participants with preserved versus impaired performance on FACEmemory (cut-off = 32); multiple linear regression was used to identify variables that modulate FACEmemory performance; and machine learning techniques were applied to identify different memory patterns. RESULTS Participants had a mean age of 50.57 years and 13.65 years of schooling; 64.07% were women, and 82.10% reported memory complaints with worries. The group with impaired FACEmemory performance (20.40%) was older, had less schooling, and had a higher prevalence of hypertension, diabetes, dyslipidemia, and family history of neurodegenerative disease than the group with preserved performance. Age, schooling, sex, country, and completion of the medical and family history questionnaire were associated with the FACEmemory score. Finally, machine learning techniques identified four patterns of FACEmemory performance: normal, dysexecutive, storage, and completely impaired. CONCLUSIONS FACEmemory is a promising tool for assessing memory in people with subjective memory complaints and for raising awareness about cognitive decline in the community.
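The pattern-finding step described in this abstract can be illustrated with a minimal k-means clustering sketch. Everything below is hypothetical: the two summary features, the four group centres, and the data are invented stand-ins for the normal, dysexecutive, storage, and completely impaired profiles, and this is not the FACEmemory pipeline.

```python
import numpy as np

def kmeans(X, k, init_idx, iters=50):
    """Minimal Lloyd's k-means; returns (centroids, labels)."""
    centroids = X[init_idx].astype(float)   # seed one centroid per given index
    for _ in range(iters):
        # distance of every point to every centroid, then nearest assignment
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return centroids, labels

# Hypothetical two-feature summary per participant (e.g. z-scored learning-trial
# total vs. delayed recall), with four well-separated synthetic groups.
rng = np.random.default_rng(2)
centres = [(-2.0, -2.0), (-2.0, 1.0), (1.0, -2.0), (1.0, 1.0)]
X = np.vstack([rng.normal(loc=c, scale=0.3, size=(50, 2)) for c in centres])
centroids, labels = kmeans(X, k=4, init_idx=[0, 50, 100, 150])
```

With one seed point taken from each synthetic group, the assignment recovers the four planted performance patterns.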
Affiliation(s)
- Montserrat Alegret
  - Ace Alzheimer Center Barcelona, Universitat Internacional de Catalunya, Barcelona, Spain
  - Networking Research Center on Neurodegenerative Diseases (CIBERNED), Instituto de Salud Carlos III, Madrid, Spain
- Fernando García-Gutiérrez
- Nathalia Muñoz
  - Ace Alzheimer Center Barcelona, Universitat Internacional de Catalunya, Barcelona, Spain
- Ana Espinosa
  - Ace Alzheimer Center Barcelona, Universitat Internacional de Catalunya, Barcelona, Spain
  - Networking Research Center on Neurodegenerative Diseases (CIBERNED), Instituto de Salud Carlos III, Madrid, Spain
- Gemma Ortega
  - Ace Alzheimer Center Barcelona, Universitat Internacional de Catalunya, Barcelona, Spain
  - Networking Research Center on Neurodegenerative Diseases (CIBERNED), Instituto de Salud Carlos III, Madrid, Spain
- Núria Lleonart
  - Ace Alzheimer Center Barcelona, Universitat Internacional de Catalunya, Barcelona, Spain
- Isabel Rodríguez
  - Ace Alzheimer Center Barcelona, Universitat Internacional de Catalunya, Barcelona, Spain
- Maitee Rosende-Roca
  - Ace Alzheimer Center Barcelona, Universitat Internacional de Catalunya, Barcelona, Spain
- Vanesa Pytel
  - Ace Alzheimer Center Barcelona, Universitat Internacional de Catalunya, Barcelona, Spain
- Yahveth Cantero-Fortiz
  - Ace Alzheimer Center Barcelona, Universitat Internacional de Catalunya, Barcelona, Spain
- Dorene M Rentz
  - Department of Neurology, Center for Alzheimer Research and Treatment, Brigham and Women's Hospital, Boston, MA, USA
  - Department of Neurology, Massachusetts General Hospital, Boston, MA, USA
- Marta Marquié
  - Ace Alzheimer Center Barcelona, Universitat Internacional de Catalunya, Barcelona, Spain
  - Networking Research Center on Neurodegenerative Diseases (CIBERNED), Instituto de Salud Carlos III, Madrid, Spain
- Sergi Valero
  - Ace Alzheimer Center Barcelona, Universitat Internacional de Catalunya, Barcelona, Spain
  - Networking Research Center on Neurodegenerative Diseases (CIBERNED), Instituto de Salud Carlos III, Madrid, Spain
- Agustín Ruiz
  - Ace Alzheimer Center Barcelona, Universitat Internacional de Catalunya, Barcelona, Spain
  - Networking Research Center on Neurodegenerative Diseases (CIBERNED), Instituto de Salud Carlos III, Madrid, Spain
- Christopher Butler
  - Ace Alzheimer Center Barcelona, Universitat Internacional de Catalunya, Barcelona, Spain
  - Department of Brain Sciences, Imperial College London, London, UK
- Mercè Boada
  - Ace Alzheimer Center Barcelona, Universitat Internacional de Catalunya, Barcelona, Spain
  - Networking Research Center on Neurodegenerative Diseases (CIBERNED), Instituto de Salud Carlos III, Madrid, Spain
6.
Tuerk C, Saha T, Bouchard MF, Booij L. Computerized Cognitive Test Batteries for Children and Adolescents-A Scoping Review of Tools For Lab- and Web-Based Settings From 2000 to 2021. Arch Clin Neuropsychol 2023; 38:1683-1710. [PMID: 37259540] [PMCID: PMC10681451] [DOI: 10.1093/arclin/acad039]
Abstract
OBJECTIVE Cognitive functioning is essential to well-being. Since cognitive difficulties are common in many disorders, their early identification is critical, notably during childhood and adolescence. This scoping review aims to provide a comprehensive literature overview of computerized cognitive test batteries (CCTB) that have been developed and used in children and adolescents over the past 22 years and to evaluate their psychometric properties. METHOD Among 3192 records identified from three databases (PubMed, PsycNET, and Web of Science) between 2000 and 2021, 564 peer-reviewed articles on children and adolescents aged 3 to 18 years met inclusion criteria. Twenty main CCTBs were identified and further reviewed following PRISMA guidelines. Relevant study details (sample information, topic, location, setting, norms, and psychometrics) were extracted, as well as administration and instrument characteristics for the main CCTBs. RESULTS Findings suggest that CCTB use varies according to age, location, and topic, with eight tools accounting for 85% of studies, and the Cambridge Neuropsychological Test Automated Battery (CANTAB) being most frequently used. Few instruments were applied in web-based settings or include social cognition tasks. Only 13% of studies reported psychometric properties. CONCLUSIONS Over the past two decades, a high number of computerized cognitive batteries have been developed. Among these, more validation studies are needed, particularly across diverse cultural contexts. This review offers a comprehensive synthesis of CCTBs to aid researchers and clinicians in conducting cognitive assessments with children in either lab- or web-based settings.
Affiliation(s)
- Carola Tuerk
  - Department of Psychology, Concordia University, 7141 Sherbrooke Street West, Montreal, QC H4B 1R6, Canada
  - Sainte-Justine Hospital Research Center, 3175 Côte-Sainte-Catherine Road, Montreal, QC H3T 1C5, Canada
- Trisha Saha
  - Department of Environmental and Occupational Health, School of Public Health, University of Montreal, 7101 Park Avenue, Montreal, QC H3N 1X9, Canada
- Maryse F Bouchard
  - Sainte-Justine Hospital Research Center, 3175 Côte-Sainte-Catherine Road, Montreal, QC H3T 1C5, Canada
  - Department of Environmental and Occupational Health, School of Public Health, University of Montreal, 7101 Park Avenue, Montreal, QC H3N 1X9, Canada
  - Institut National de la Recherche Scientifique, 531 des Prairies Blvd, Laval, QC H7V 1B7, Canada
- Linda Booij
  - Department of Psychology, Concordia University, 7141 Sherbrooke Street West, Montreal, QC H4B 1R6, Canada
  - Sainte-Justine Hospital Research Center, 3175 Côte-Sainte-Catherine Road, Montreal, QC H3T 1C5, Canada
  - Department of Psychiatry and Addictology, University of Montreal, 2900 Boulevard Edouard Montpetit, Montreal, QC H3T 1J4, Canada
  - Department of Psychiatry, McGill University, 1033 Pine Avenue West, Montreal, Quebec H3A 1A1, Canada
  - Research Centre, Douglas Mental Health University Institute, 6875 Boulevard LaSalle, Verdun, QC H4H 1R3, Canada
7.
Lynham AJ, Jones IR, Walters JTR. Cardiff Online Cognitive Assessment in a National Sample: Cross-Sectional Web-Based Study. J Med Internet Res 2023; 25:e46675. [PMID: 37703073] [PMCID: PMC10534289] [DOI: 10.2196/46675]
Abstract
BACKGROUND Psychiatric disorders are associated with cognitive impairment. We have developed a web-based, 9-task cognitive battery to measure the core domains affected in people with psychiatric disorders. To date, this assessment has been used to collect data on a clinical sample of participants with psychiatric disorders. OBJECTIVE The aims of this study were (1) to establish a briefer version of the battery (called the Cardiff Online Cognitive Assessment [CONCA]) that can give a valid measure of cognitive ability ("g") and (2) to collect normative data and demonstrate CONCA's application in a healthy population sample. METHODS Based on 6 criteria and data from our previous study, we selected 5 out of the original 9 tasks to include in CONCA. These included 3 core tasks that were sufficient to derive a measure of "g" and 2 optional tasks. Participants from a web-based national cohort study (HealthWise Wales) were invited to complete CONCA. Completion rates, sample characteristics, performance distributions, and associations between cognitive performance and demographic characteristics and mental health measures were examined. RESULTS A total of 3679 participants completed at least one CONCA task, of which 3135 completed all 3 core CONCA tasks. Performance on CONCA was associated with age (B=-0.05, SE 0.002; P<.001), device (tablet computer: B=-0.26, SE 0.05; P<.001; smartphone: B=-0.46, SE 0.05; P<.001), education (degree: B=1.68, SE 0.14; P<.001), depression symptoms (B=-0.04, SE 0.01; P<.001), and anxiety symptoms (B=-0.04, SE 0.01; P<.001). CONCLUSIONS CONCA provides a valid measure of "g," which can be derived using as few as 3 tasks that take no more than 15 minutes. Performance on CONCA showed associations with demographic characteristics in the expected direction and was associated with current depression and anxiety symptoms. The effect of device on cognitive performance is an important consideration for research using web-based assessments.
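A single ability score of the kind described here is commonly derived as the first principal component of the z-scored task scores. The sketch below illustrates that idea on synthetic data; the three simulated "tasks", their loadings, and the sample are invented, not CONCA's actual tasks or scoring model.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
g_true = rng.normal(size=n)                 # latent general ability
# Three synthetic task scores, each loading 0.7 on g plus unique noise
tasks = np.column_stack([0.7 * g_true + rng.normal(scale=0.5, size=n)
                         for _ in range(3)])
# z-score each task, then take the first principal component as the "g" estimate
z = (tasks - tasks.mean(axis=0)) / tasks.std(axis=0)
_, s, vt = np.linalg.svd(z, full_matrices=False)
g_hat = z @ vt[0]                           # first principal component score
explained = s[0] ** 2 / np.sum(s ** 2)      # share of variance captured by "g"
```

With correlated tasks, the first component absorbs the bulk of the shared variance, which is why a handful of core tasks can suffice for a usable "g" estimate.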
Affiliation(s)
- Amy Joanne Lynham
  - Division of Psychological Medicine, School of Medicine, Cardiff University, Cardiff, United Kingdom
- Ian R Jones
  - Division of Psychological Medicine, School of Medicine, Cardiff University, Cardiff, United Kingdom
- James T R Walters
  - Division of Psychological Medicine, School of Medicine, Cardiff University, Cardiff, United Kingdom
8.
Hosseini M, Borhani-Haghighi A, Petramfar P, Foroughi AA, Ostovan VR, Nami M. Evaluating cognitive impairment in the early stages of Parkinson's disease using the Cambridge brain sciences-cognitive platform. Clin Neurol Neurosurg 2023; 232:107866. [PMID: 37413872] [DOI: 10.1016/j.clineuro.2023.107866]
Abstract
BACKGROUND Non-motor symptoms (NMS) such as cognitive impairment are among the common presentations in patients with Parkinson's disease (PD). In parallel with motor symptoms, these impediments can affect PD patients' quality of life. However, cognitive impairment has received less attention in early PD, and the relationship between olfactory symptoms and cognitive impairment in early PD is unclear. Considering the importance of accurate and timely assessment of cognitive function in PD patients using readily available, validated tests, this study employed the Cambridge Brain Sciences-Cognitive Platform (CBS-CP) as a computer-based tool to assess cognitive presentations in early PD patients. METHODS Thirty-four eligible males and females were assigned to PD and healthy control (HC) groups. Cognitive performance was assessed using the CBS-CP and the Mini-Mental State Examination (MMSE), and olfactory function was measured with the standardized olfactory Quick Smell test (QST). RESULTS PD patients performed more poorly than HCs on all CBS-CP tasks spanning the short-term memory, attention, and reasoning domains, whereas verbal domain task scores showed no significant difference between groups. MMSE scores in the PD group were in the normal range (mean = 26.96), although they differed significantly between the PD and HC groups (P < 0.001). Our results revealed no correlation between cognitive impairment and olfactory function in PD patients. CONCLUSION Given the widely studied features of the CBS-CP and its reliability across published evidence, it appears to be a suitable instrument for evaluating cognitive impairment in early PD with normal MMSE scores. Cognitive and olfactory impairments appear to be independent in early PD. DATA AVAILABILITY STATEMENT The datasets generated during the current study are available from the corresponding author upon reasonable request.
Affiliation(s)
- Maryam Hosseini
  - Department of Neuroscience, School of Advanced Medical Sciences and Technologies, Shiraz University of Medical Sciences, Shiraz, Iran
  - DANA Brain Health Institute, Iranian Neuroscience Society-Fars Branch, Shiraz, Iran
- Afshin Borhani-Haghighi
- Peyman Petramfar
  - Clinical Neurology Research Center, Shiraz University of Medical Sciences, Shiraz, Iran
- Amin Abolhasani Foroughi
  - Epilepsy Research Center, Shiraz University of Medical Sciences, Shiraz, Iran
  - Medical Imaging Research Center, Shiraz University of Medical Sciences, Shiraz, Iran
- Vahid Reza Ostovan
  - Clinical Neurology Research Center, Shiraz University of Medical Sciences, Shiraz, Iran
  - Department of Neurology, Shiraz University of Medical Sciences, Shiraz, Iran
- Mohammad Nami
  - DANA Brain Health Institute, Iranian Neuroscience Society-Fars Branch, Shiraz, Iran
  - Cognitive Neuropsychology Unit, Department of Social Sciences, Canadian University Dubai, Dubai, United Arab Emirates
9.
Del Giovane M, Trender WR, Bălăeţ M, Mallas EJ, Jolly AE, Bourke NJ, Zimmermann K, Graham NS, Lai H, Losty EJ, Oiarbide GA, Hellyer PJ, Faiman I, Daniels SJ, Batey P, Harrison M, Giunchiglia V, Kolanko MA, David MC, Li LM, Demarchi C, Friedland D, Sharp DJ, Hampshire A. Computerised cognitive assessment in patients with traumatic brain injury: an observational study of feasibility and sensitivity relative to established clinical scales. EClinicalMedicine 2023; 59:101980. [PMID: 37152359] [PMCID: PMC10154960] [DOI: 10.1016/j.eclinm.2023.101980]
Abstract
Background Online technology could potentially revolutionise how patients are cognitively assessed and monitored. However, it remains unclear whether assessments conducted remotely can match established pen-and-paper neuropsychological tests in terms of sensitivity and specificity. Methods This observational study aimed to optimise an online cognitive assessment for use in traumatic brain injury (TBI) clinics. The tertiary referral clinic in which this tool has been clinically implemented typically sees patients a minimum of 6 months post-injury in the chronic phase. Between March and August 2019, we conducted a cross-group, cross-device and factor analyses at the St. Mary's Hospital TBI clinic and major trauma wards at Imperial College NHS trust and St. George's Hospital in London (UK), to identify a battery of tasks that assess aspects of cognition affected by TBI. Between September 2019 and February 2020, we evaluated the online battery against standard face-to-face neuropsychological tests at the Imperial College London research centre. Canonical Correlation Analysis (CCA) determined the shared variance between the online battery and standard neuropsychological tests. Finally, between October 2020 and December 2021, the tests were integrated into a framework that automatically generates a results report where patients' performance is compared to a large normative dataset. We piloted this as a practical tool to be used under supervised and unsupervised conditions at the St. Mary's Hospital TBI clinic in London (UK). Findings The online assessment discriminated processing-speed, visual-attention, working-memory, and executive-function deficits in TBI. CCA identified two significant modes indicating shared variance with standard neuropsychological tests (r = 0.86, p < 0.001 and r = 0.81, p = 0.02). Sensitivity to cognitive deficits after TBI was evident in the TBI clinic setting under supervised and unsupervised conditions (F (15,555) = 3.99; p < 0.001). 
Interpretation Online cognitive assessment of TBI patients is feasible, sensitive, and efficient. When combined with normative sociodemographic models and autogenerated reports, it has the potential to transform cognitive assessment in the healthcare setting. Funding This work was funded by a National Institute for Health Research (NIHR) Invention for Innovation (i4i) grant awarded to DJS and AH (II-LB-0715-20006).
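Canonical correlation analysis of the kind reported above reduces, numerically, to an SVD once each score block is orthonormalised. Below is a minimal sketch on simulated data; the "online battery" and "standard test" matrices share an invented latent factor, so the resulting canonical correlations are illustrative only and unrelated to the article's values.

```python
import numpy as np

def canonical_correlations(X, Y):
    """Canonical correlations between two data blocks via the QR + SVD route:
    center each block, orthonormalise with thin QR, then the singular values
    of Qx.T @ Qy are the canonical correlations (sorted descending)."""
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    Qx, _ = np.linalg.qr(Xc)
    Qy, _ = np.linalg.qr(Yc)
    corrs = np.linalg.svd(Qx.T @ Qy, compute_uv=False)
    return np.clip(corrs, 0.0, 1.0)

# Simulated scores: both batteries load on one shared latent ability
rng = np.random.default_rng(1)
n = 300
latent = rng.normal(size=n)
X = np.column_stack([latent + rng.normal(scale=0.5, size=n) for _ in range(4)])
Y = np.column_stack([latent + rng.normal(scale=0.5, size=n) for _ in range(3)])
r = canonical_correlations(X, Y)   # r[0] reflects the shared latent factor
```

Because the two blocks share a single latent factor, only the first canonical correlation is large here; the article's two significant modes would correspond to two such shared dimensions.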
Affiliation(s)
- Martina Del Giovane
  - UK Dementia Research Institute, Care Research & Technology Centre, Imperial College and the University of Surrey, 9th Floor, Sir Michael Uren Hub, 86 Wood Ln, W12 0BZ, London, United Kingdom
  - Department of Brain Sciences, Imperial College London, Burlington Danes, The Hammersmith Hospital, Du Cane Road, W12 0NN, London, United Kingdom
- William R. Trender
  - Department of Brain Sciences, Imperial College London, Burlington Danes, The Hammersmith Hospital, Du Cane Road, W12 0NN, London, United Kingdom
- Maria Bălăeţ
  - Department of Brain Sciences, Imperial College London, Burlington Danes, The Hammersmith Hospital, Du Cane Road, W12 0NN, London, United Kingdom
- Emma-Jane Mallas
  - UK Dementia Research Institute, Care Research & Technology Centre, Imperial College and the University of Surrey, 9th Floor, Sir Michael Uren Hub, 86 Wood Ln, W12 0BZ, London, United Kingdom
  - Department of Brain Sciences, Imperial College London, Burlington Danes, The Hammersmith Hospital, Du Cane Road, W12 0NN, London, United Kingdom
- Amy E. Jolly
  - UK Dementia Research Institute, Care Research & Technology Centre, Imperial College and the University of Surrey, 9th Floor, Sir Michael Uren Hub, 86 Wood Ln, W12 0BZ, London, United Kingdom
  - Department of Brain Sciences, Imperial College London, Burlington Danes, The Hammersmith Hospital, Du Cane Road, W12 0NN, London, United Kingdom
  - Department of Neuroinflammation, UCL Queen Square Institute of Neurology, Faculty of Brain Sciences, Queen Square, WC1N 3BG, London, United Kingdom
- Niall J. Bourke
  - UK Dementia Research Institute, Care Research & Technology Centre, Imperial College and the University of Surrey, 9th Floor, Sir Michael Uren Hub, 86 Wood Ln, W12 0BZ, London, United Kingdom
  - Department of Brain Sciences, Imperial College London, Burlington Danes, The Hammersmith Hospital, Du Cane Road, W12 0NN, London, United Kingdom
  - Department of Neuroimaging, Institute of Psychiatry, Psychology and Neuroscience, King’s College London, 16 De Crespigny Park, SE5 8AB, London, United Kingdom
- Karl Zimmermann
  - UK Dementia Research Institute, Care Research & Technology Centre, Imperial College and the University of Surrey, 9th Floor, Sir Michael Uren Hub, 86 Wood Ln, W12 0BZ, London, United Kingdom
  - Department of Brain Sciences, Imperial College London, Burlington Danes, The Hammersmith Hospital, Du Cane Road, W12 0NN, London, United Kingdom
- Neil S.N. Graham
  - UK Dementia Research Institute, Care Research & Technology Centre, Imperial College and the University of Surrey, 9th Floor, Sir Michael Uren Hub, 86 Wood Ln, W12 0BZ, London, United Kingdom
  - Department of Brain Sciences, Imperial College London, Burlington Danes, The Hammersmith Hospital, Du Cane Road, W12 0NN, London, United Kingdom
- Helen Lai
  - UK Dementia Research Institute, Care Research & Technology Centre, Imperial College and the University of Surrey, 9th Floor, Sir Michael Uren Hub, 86 Wood Ln, W12 0BZ, London, United Kingdom
  - Department of Brain Sciences, Imperial College London, Burlington Danes, The Hammersmith Hospital, Du Cane Road, W12 0NN, London, United Kingdom
- Ethan J.F. Losty
  - Department of Brain Sciences, Imperial College London, Burlington Danes, The Hammersmith Hospital, Du Cane Road, W12 0NN, London, United Kingdom
- Garazi Araña Oiarbide
  - Department of Brain Sciences, Imperial College London, Burlington Danes, The Hammersmith Hospital, Du Cane Road, W12 0NN, London, United Kingdom
- Peter J. Hellyer
  - Institute of Psychiatry, Psychology and Neuroscience, King’s College London, 16 De Crespigny Park, SE5 8AF, London, United Kingdom
- Irene Faiman
  - Clinical Neuropsychology Service, St George's University Hospital NHS Foundation Trust, Blackshaw Road, SW17 0QT, London, United Kingdom
| | - Sarah J.C. Daniels
- UK Dementia Research Institute, Care Research & Technology Centre, Imperial College and the University of Surrey, 9th Floor, Sir Michael Uren Hub, 86 Wood Ln, W12 0BZ, London, United Kingdom
- Department of Brain Sciences, Imperial College London, London, United Kingdom. Burlington Danes, The Hammersmith Hospital, Du Cane Road, W12 0NN, London, United Kingdom
| | - Philippa Batey
- UK Dementia Research Institute, Care Research & Technology Centre, Imperial College and the University of Surrey, 9th Floor, Sir Michael Uren Hub, 86 Wood Ln, W12 0BZ, London, United Kingdom
- The Helix Centre, Imperial College London, and the Royal College of Arts, St. Mary’s Hospital, 3rd Floor Paterson Building, 20 South Wharf Road, Paddington, W2 1PE, London, United Kingdom
| | - Matthew Harrison
- UK Dementia Research Institute, Care Research & Technology Centre, Imperial College and the University of Surrey, 9th Floor, Sir Michael Uren Hub, 86 Wood Ln, W12 0BZ, London, United Kingdom
- The Helix Centre, Imperial College London, and the Royal College of Arts, St. Mary’s Hospital, 3rd Floor Paterson Building, 20 South Wharf Road, Paddington, W2 1PE, London, United Kingdom
| | - Valentina Giunchiglia
- Department of Brain Sciences, Imperial College London, London, United Kingdom. Burlington Danes, The Hammersmith Hospital, Du Cane Road, W12 0NN, London, United Kingdom
| | - Magdalena A. Kolanko
- UK Dementia Research Institute, Care Research & Technology Centre, Imperial College and the University of Surrey, 9th Floor, Sir Michael Uren Hub, 86 Wood Ln, W12 0BZ, London, United Kingdom
- Department of Brain Sciences, Imperial College London, London, United Kingdom. Burlington Danes, The Hammersmith Hospital, Du Cane Road, W12 0NN, London, United Kingdom
| | - Michael C.B. David
- UK Dementia Research Institute, Care Research & Technology Centre, Imperial College and the University of Surrey, 9th Floor, Sir Michael Uren Hub, 86 Wood Ln, W12 0BZ, London, United Kingdom
- Department of Brain Sciences, Imperial College London, London, United Kingdom. Burlington Danes, The Hammersmith Hospital, Du Cane Road, W12 0NN, London, United Kingdom
| | - Lucia M. Li
- UK Dementia Research Institute, Care Research & Technology Centre, Imperial College and the University of Surrey, 9th Floor, Sir Michael Uren Hub, 86 Wood Ln, W12 0BZ, London, United Kingdom
- Department of Brain Sciences, Imperial College London, London, United Kingdom. Burlington Danes, The Hammersmith Hospital, Du Cane Road, W12 0NN, London, United Kingdom
| | - Célia Demarchi
- UK Dementia Research Institute, Care Research & Technology Centre, Imperial College and the University of Surrey, 9th Floor, Sir Michael Uren Hub, 86 Wood Ln, W12 0BZ, London, United Kingdom
- Department of Brain Sciences, Imperial College London, London, United Kingdom. Burlington Danes, The Hammersmith Hospital, Du Cane Road, W12 0NN, London, United Kingdom
| | - Daniel Friedland
- UK Dementia Research Institute, Care Research & Technology Centre, Imperial College and the University of Surrey, 9th Floor, Sir Michael Uren Hub, 86 Wood Ln, W12 0BZ, London, United Kingdom
- Department of Brain Sciences, Imperial College London, London, United Kingdom. Burlington Danes, The Hammersmith Hospital, Du Cane Road, W12 0NN, London, United Kingdom
| | - David J. Sharp
- UK Dementia Research Institute, Care Research & Technology Centre, Imperial College and the University of Surrey, 9th Floor, Sir Michael Uren Hub, 86 Wood Ln, W12 0BZ, London, United Kingdom
- Department of Brain Sciences, Imperial College London, London, United Kingdom. Burlington Danes, The Hammersmith Hospital, Du Cane Road, W12 0NN, London, United Kingdom
| | - Adam Hampshire
- Department of Brain Sciences, Imperial College London, London, United Kingdom. Burlington Danes, The Hammersmith Hospital, Du Cane Road, W12 0NN, London, United Kingdom
| |
10
Rizzi E, Vezzoli M, Pegoraro S, Facchin A, Strina V, Daini R. Teleneuropsychology: normative data for the assessment of memory in online settings. Neurol Sci 2023; 44:529-538. [PMID: 36197578 PMCID: PMC9533275 DOI: 10.1007/s10072-022-06426-9]
Abstract
INTRODUCTION The COVID-19 pandemic has forced significant changes in clinical practice. Psychologists and neuropsychologists had to modify their settings to assess patients' abilities, switching from an in-person modality to a remote setting that relies on video-calling platforms. This change created a need for new normative data tailored to remote settings. AIM AND METHODS The study aimed to develop normative data for the online administration of neuropsychological memory tests and to compare them with published norms obtained in standard settings. Two hundred and four healthy Italian volunteers performed three verbal memory tests through the Google Meet platform: the Digit Span (Backward and Forward), the Rey Auditory Verbal Learning Test, and the Verbal Paired-Associate Learning Test. RESULTS This research provides specific norms that account for the influence of demographic characteristics. Comparison with published norms shows medium to high agreement between the two sets of norms. The present study provides a reference for the clinical use of neuropsychological instruments to assess verbal memory in a remote setting and offers specific recommendations.
Affiliation(s)
- Ezia Rizzi
- Department of Social and Human Science, University of Salento, Studium 2000, Via di Valesio, 73100 Lecce, Italy
- Department of Psychology, University of Milano-Bicocca, Milan, Italy
- Michela Vezzoli
- Department of Psychology, University of Milano-Bicocca, Milan, Italy
- Sara Pegoraro
- Department of Psychology, University of Milano-Bicocca, Milan, Italy
- Alessio Facchin
- Department of Psychology, University of Milano-Bicocca, Milan, Italy
- Veronica Strina
- Department of Psychology, University of Milano-Bicocca, Milan, Italy
- Roberta Daini
- Department of Psychology, University of Milano-Bicocca, Milan, Italy
11
Ashford MT, Eichenbaum J, Jin C, Neuhaus J, Aaronson A, Ulbricht A, Camacho MR, Fockler J, Flenniken D, Truran D, Mackin RS, Maruff P, Weiner MW, Nosheny RL. Associations between Participant Characteristics and Participant Feedback about an Unsupervised Online Cognitive Assessment in a Research Registry. J Prev Alzheimers Dis 2023; 10:607-614. [PMID: 37357303 PMCID: PMC10126538 DOI: 10.14283/jpad.2023.40]
Abstract
BACKGROUND This study aims to understand whether and how participant characteristics (age, gender, education, ethnocultural identity) are related to their feedback about taking a remote, unsupervised, online cognitive assessment. METHODS The Brain Health Registry is a public online registry which includes cognitive assessments. Multivariable ordinal regressions assessed associations between participant characteristics and feedback responses of older (55+) participants (N=11,553) regarding their Cogstate Brief Battery assessment experience. RESULTS Higher age, secondary education or less, Latino identity, and female gender were associated with a poorer assessment experience; higher age and a non-White identity were associated with experiencing the assessment instructions as less clear; and higher age, non-White identity, and secondary education or less were associated with rating additional human support with the assessment as more useful. DISCUSSION Our findings highlight the importance of improving the design and instructions of unsupervised, remote, online cognitive assessments to better suit the needs of diverse communities.
Affiliation(s)
- M T Ashford
- Miriam Ashford, NCIRE - Northern California Institute for Research and Education, 4150 Clement Street, San Francisco, CA 94121, USA. Phone: 650-208-9267
12
Poulton A, Chen LPE, Dali G, Fox M, Hester R. Web-Based Independent Versus Laboratory-Based Stop-Signal Task Performance: Within-Subjects Counterbalanced Comparison Study. J Med Internet Res 2022; 24:e32922. [PMID: 35635745 PMCID: PMC9153905 DOI: 10.2196/32922]
Abstract
Background
Considered a facet of behavioral impulsivity, response inhibition facilitates adaptive and goal-directed behavior. It is often assessed using the Stop-Signal Task (SST), which is typically presented on stand-alone computers under controlled laboratory conditions. Sample size may consequently be constrained by cost or time, and sample diversity limited to those willing or able to attend the laboratory. Statistical power and the generalizability of results may, in turn, be affected. Such limitations may potentially be overcome through web-based testing.
Objective
The aim of this study was to investigate if there were differences between variables derived from a web-based SST when it was undertaken independently—that is, outside the laboratory, on any computer, and in the absence of researchers—versus when it was performed under laboratory conditions.
Methods
We programmed a web-based SST in HTML and JavaScript and employed a counterbalanced design. A total of 166 individuals (mean age 19.72, SD 1.85, range 18-36 years; 146/166, 88% female) were recruited. Of them, 79 undertook the independent task prior to visiting the laboratory and 78 completed the independent task following their laboratory visit. The average time between SST testing was 3.72 (SD 2.86) days. Dependent samples and Bayesian paired samples t tests were used to examine differences between laboratory-based and independent SST variables. Correlational analyses were conducted on stop-signal reaction times (SSRT).
Results
After exclusions, 123 participants (mean age 19.73, SD 1.97 years) completed the SST both in the laboratory and independently. While participants were less accurate on go trials and exhibited reduced inhibitory control when undertaking the independent—compared to the laboratory-based—SST, there was a positive association between the SSRT of each condition (r=.48; P<.001; 95% CI 0.33-0.61).
Conclusions
Findings suggest a web-based SST, which participants undertake on any computer, at any location, and in the absence of the researcher, is a suitable measure of response inhibition.
Affiliation(s)
- Antoinette Poulton
- Melbourne School of Psychological Sciences, University of Melbourne, Parkville, Australia
- Li Peng Evelyn Chen
- Melbourne School of Psychological Sciences, University of Melbourne, Parkville, Australia
- Gezelle Dali
- Melbourne School of Psychological Sciences, University of Melbourne, Parkville, Australia
- Michael Fox
- Melbourne School of Psychological Sciences, University of Melbourne, Parkville, Australia
- Robert Hester
- Melbourne School of Psychological Sciences, University of Melbourne, Parkville, Australia
13
Netzel L, Moran R, Hopfe D, Salvatore AP, Brown W, Murray NG. Test-Retest Reliability of Remote ImPACT Administration. Arch Clin Neuropsychol 2022; 37:449-456. [PMID: 34272867 DOI: 10.1093/arclin/acab055]
Abstract
OBJECTIVE To evaluate the performance and test-retest reliability obtained when administering a computerized baseline neurocognitive exam to NCAA Division I student-athletes in a controlled laboratory setting versus an uncontrolled remote location. METHOD A sample of 129 (female = 100) Division I student-athletes completed Immediate Post-Concussion Assessment and Cognitive Testing (ImPACT) pre-season assessments for two distinct sports seasons: one in a controlled laboratory environment and one in an uncontrolled remote environment. Depending on the environment, participants were given verbal (controlled) or written (uncontrolled) guidelines for taking the test. RESULTS Multivariate repeated-measures ANOVAs determined that there were no within-subject differences between testing environments on ImPACT composite scores or the cognitive efficiency index (CEI). Chi-square tests found no significant differences between environments in impulse control or in the number of invalid test scores, as determined by ImPACT. Intraclass correlations showed that test-retest reliability of the ImPACT subtest scores across testing environments ranged from moderate (verbal memory composite, r = 0.46; visual memory composite, r = 0.64; reaction time, r = 0.61; impulse control, r = 0.52; CEI, r = 0.61) to good (visual motor composite, r = 0.77). CONCLUSIONS Results indicate that ImPACT is reliable between controlled and uncontrolled testing environments. This suggests that ImPACT can be administered in a remote environment, provided testing instructions are adhered to, for example when social distancing or isolation policies are in effect.
Affiliation(s)
- Lauren Netzel
- Neuromechanics Laboratory, School of Community Health Sciences, University of Nevada, Reno, Nevada, 89557, USA
- Ryan Moran
- Athletic Training Research Laboratory, College of Human Environmental Sciences, The University of Alabama, Tuscaloosa, Alabama, 35487, USA
- Dustin Hopfe
- Neuromechanics Laboratory, School of Community Health Sciences, University of Nevada, Reno, Nevada, 89557, USA
- Anthony P Salvatore
- Department of Communicative Disorders, The University of Louisiana, Lafayette, Louisiana, 70504, USA
- Warren Brown
- Department of Communicative Disorders, The University of Louisiana, Lafayette, Louisiana, 70504, USA
- Nicholas G Murray
- Neuromechanics Laboratory, School of Community Health Sciences, University of Nevada, Reno, Nevada, 89557, USA
14
Lynham AJ, Jones IR, Walters JTR. Web-Based Cognitive Testing in Psychiatric Research: Validation and Usability Study. J Med Internet Res 2022; 24:e28233. [PMID: 35142640 PMCID: PMC8874806 DOI: 10.2196/28233]
Abstract
Background Cognitive impairments are features of many psychiatric disorders and affect functioning. A barrier to cognitive research on psychiatric disorders is the lack of large cross-disorder data sets, partly because the collection of cognitive data can be logistically challenging and expensive. Web-based collection may be an alternative; however, little is known about who does and does not complete web-based cognitive assessments for psychiatric research. Objective The aims of this study are to develop a web-based cognitive battery for use in psychiatric research, validate the battery against the Measurement and Treatment Research to Improve Cognition in Schizophrenia (MATRICS) Consensus Cognitive Battery, and compare the characteristics of the participants who chose to take part with those of the individuals who did not participate. Methods Tasks were developed by The Many Brains Project and selected to measure the domains specified by the MATRICS initiative. We undertook a cross-validation study of 65 participants with schizophrenia, bipolar disorder, depression, or no history of psychiatric disorders to compare the web-based tasks with the MATRICS Consensus Cognitive Battery. Following validation, we invited participants from 2 large ongoing genetic studies, which recruited participants with psychiatric disorders, to complete the battery, and evaluated the demographic and clinical characteristics of those who took part. Results Correlations between web-based and MATRICS tasks ranged between 0.26 and 0.73. Of the 961 participants, 887 (92.3%) completed at least one web-based task, and 644 (67%) completed all tasks, indicating adequate completion rates.
Predictors of web-based participation included being female (odds ratio [OR] 1.3, 95% CI 1.07-1.58), ethnicity other than White European (OR 0.66, 95% CI 0.46-0.96), higher levels of education (OR 1.19, 95% CI 1.11-1.29), diagnosis of an eating disorder (OR 2.17, 95% CI 1.17-4) or depression and anxiety (OR 5.12, 95% CI 3.38-7.83), and absence of a diagnosis of schizophrenia (OR 0.59, 95% CI 0.35-0.94). Lower performance on the battery was associated with poorer functioning (B=−1.76, SE 0.26; P<.001). Conclusions Our findings offer valuable insights into the advantages and disadvantages of testing cognitive function remotely for mental health research.
Affiliation(s)
- Amy Joanne Lynham
- Medical Research Council Centre for Neuropsychiatric Genetics and Genomics, Division of Psychiatry and Clinical Neurosciences, School of Medicine, Cardiff University, Cardiff, United Kingdom
- Ian R Jones
- Medical Research Council Centre for Neuropsychiatric Genetics and Genomics, Division of Psychiatry and Clinical Neurosciences, School of Medicine, Cardiff University, Cardiff, United Kingdom
- James T R Walters
- Medical Research Council Centre for Neuropsychiatric Genetics and Genomics, Division of Psychiatry and Clinical Neurosciences, School of Medicine, Cardiff University, Cardiff, United Kingdom
15
Van Patten R. Introduction to the Special Issue - Neuropsychology from a distance: Psychometric properties and clinical utility of remote neurocognitive tests. J Clin Exp Neuropsychol 2022; 43:767-773. [PMID: 35133240 DOI: 10.1080/13803395.2021.2021645]
Affiliation(s)
- Ryan Van Patten
- Department of Psychiatry and Human Behavior, Alpert Medical School of Brown University, Providence, RI, USA
- Providence VA Medical Center, Providence, RI, USA
16
Leong V, Raheel K, Sim JY, Kacker K, Karlaftis VM, Vassiliu C, Kalaivanan K, Chen SHA, Robbins TW, Sahakian BJ, Kourtzi Z. A New Remote Guided Method for Supervised Web-Based Cognitive Testing to Ensure High-Quality Data: Development and Usability Study. J Med Internet Res 2022; 24:e28368. [PMID: 34989691 PMCID: PMC8778570 DOI: 10.2196/28368]
Abstract
Background The global COVID-19 pandemic has triggered a fundamental reexamination of how human psychological research can be conducted safely and robustly in a new era of digital working and physical distancing. Online web-based testing has risen to the forefront as a promising solution for the rapid mass collection of cognitive data without requiring human contact. However, a long-standing debate exists over the data quality and validity of web-based studies. This study examines the opportunities and challenges afforded by the societal shift toward web-based testing and highlights an urgent need to establish a standard data quality assurance framework for online studies. Objective This study aims to develop and validate a new supervised online testing methodology, remote guided testing (RGT). Methods A total of 85 healthy young adults were tested on 10 cognitive tasks assessing executive functioning (flexibility, memory, and inhibition) and learning. Tasks were administered either face-to-face in the laboratory (n=41) or online using remote guided testing (n=44) and delivered using identical web-based platforms (Cambridge Neuropsychological Test Automated Battery, Inquisit, and i-ABC). Data quality was assessed using detailed trial-level measures (missed trials, outlying and excluded responses, and response times) and overall task performance measures. Results The results indicated that, across all data quality and performance measures, RGT data were statistically equivalent to in-person data collected in the laboratory (P>.40 for all comparisons). Moreover, RGT participants outperformed the laboratory group on measured verbal intelligence (P<.001), which could reflect test environment differences, including possible effects of mask-wearing on communication.
Conclusions These data suggest that the RGT methodology could help ameliorate concerns regarding online data quality—particularly for studies involving high-risk or rare cohorts—and offer an alternative for collecting high-quality human cognitive data without requiring in-person physical attendance.
Affiliation(s)
- Victoria Leong
- Psychology, School of Social Sciences, Nanyang Technological University, Singapore, Singapore
- Centre for Research and Development in Learning, Nanyang Technological University, Singapore, Singapore
- Lee Kong Chian School of Medicine, Nanyang Technological University, Singapore, Singapore
- Kausar Raheel
- Psychology, School of Social Sciences, Nanyang Technological University, Singapore, Singapore
- Jia Yi Sim
- Psychology, School of Social Sciences, Nanyang Technological University, Singapore, Singapore
- Kriti Kacker
- Psychology, School of Social Sciences, Nanyang Technological University, Singapore, Singapore
- Vasilis M Karlaftis
- Department of Psychology, University of Cambridge, Cambridge, United Kingdom
- Chrysoula Vassiliu
- Faculty of Modern and Medieval Languages and Linguistics, University of Cambridge, Cambridge, United Kingdom
- Kastoori Kalaivanan
- Centre for Research and Development in Learning, Nanyang Technological University, Singapore, Singapore
- S H Annabel Chen
- Psychology, School of Social Sciences, Nanyang Technological University, Singapore, Singapore
- Centre for Research and Development in Learning, Nanyang Technological University, Singapore, Singapore
- Lee Kong Chian School of Medicine, Nanyang Technological University, Singapore, Singapore
- National Institute of Education, Nanyang Technological University, Singapore, Singapore
- Trevor W Robbins
- Department of Psychology, University of Cambridge, Cambridge, United Kingdom
- Behavioural and Clinical Neuroscience Institute, University of Cambridge, Cambridge, United Kingdom
- Barbara J Sahakian
- Behavioural and Clinical Neuroscience Institute, University of Cambridge, Cambridge, United Kingdom
- Department of Psychiatry, University of Cambridge, Cambridge, United Kingdom
- Zoe Kourtzi
- Department of Psychology, University of Cambridge, Cambridge, United Kingdom
17
Li W, Yue L, Xiao S. Subjective cognitive decline is associated with a higher risk of objective cognitive decline: A cross-sectional and longitudinal study. Front Psychiatry 2022; 13:950270. [PMID: 36245867 PMCID: PMC9558822 DOI: 10.3389/fpsyt.2022.950270]
Abstract
BACKGROUND Subjective cognitive decline (SCD) is considered an independent risk factor for objective cognitive impairment, such as dementia and mild cognitive impairment (MCI), but the mechanism is unclear. METHODS The current study consisted of two parts, the first of which included 1,010 older adults with SCD and 535 normal controls and was followed for 1 year. The second, cross-sectional study included 94 older adults with SCD and 64 healthy controls. Unlike the first cohort, subjects in the second study underwent magnetic resonance imaging and completed more detailed neuropsychological tests, including the Mini-Mental State Examination (MMSE), Montreal Cognitive Assessment (MoCA), Digit Span, Auditory Verbal Learning Test (AVLT), Associative Learning Test (ALT), Verbal Fluency (VF), Wechsler's filling, and Wechsler's building blocks. RESULTS In cohort 1, we found that older adults with SCD had a higher risk of objective cognitive impairment than normal controls (χ2 = 20.354, p = 0.002), and Cox regression analysis likewise suggested that SCD was a risk factor for objective cognitive decline (p < 0.001, HR = 2.608, 95%CI: 2.213-3.075). In study 2, older adults with SCD scored significantly lower than normal controls on the MoCA, digit span, verbal fluency, and Wechsler's filling, whereas their cortical thickness of the rostral middle frontal gyrus (RMFG) was significantly greater than that of normal controls (p < 0.05). CONCLUSIONS SCD involves impairment across multiple cognitive domains and is associated with a higher risk of objective cognitive impairment. Moreover, increased cortical thickness of the left rostral middle frontal gyrus (RMFG) might be an important mechanism of cognitive decline in SCD.
Affiliation(s)
- Wei Li
- Department of Geriatric Psychiatry, Shanghai Mental Health Center, Shanghai Jiao Tong University School of Medicine, Shanghai, China
- Alzheimer's Disease and Related Disorders Center, Shanghai Jiao Tong University, Shanghai, China
- Ling Yue
- Department of Geriatric Psychiatry, Shanghai Mental Health Center, Shanghai Jiao Tong University School of Medicine, Shanghai, China
- Alzheimer's Disease and Related Disorders Center, Shanghai Jiao Tong University, Shanghai, China
- Shifu Xiao
- Department of Geriatric Psychiatry, Shanghai Mental Health Center, Shanghai Jiao Tong University School of Medicine, Shanghai, China
- Alzheimer's Disease and Related Disorders Center, Shanghai Jiao Tong University, Shanghai, China
18
Singh S, Strong RW, Jung L, Li FH, Grinspoon L, Scheuer LS, Passell EJ, Martini P, Chaytor N, Soble JR, Germine L. The TestMyBrain Digital Neuropsychology Toolkit: Development and Psychometric Characteristics. J Clin Exp Neuropsychol 2021; 43:786-795. [PMID: 34907842 DOI: 10.1080/13803395.2021.2002269]
Abstract
INTRODUCTION To allow continued administration of neuropsychological evaluations remotely during the pandemic, tests from the not-for-profit platform TestMyBrain.org (TMB) were used to develop the TMB Digital Neuropsychology Toolkit (DNT). This study details the infrastructure and development of the DNT as well as its psychometric characteristics. METHOD The DNT was primarily distributed for clinical use, with 72.8% of individuals requesting access for clinical purposes. To assess the reliability and validity of the DNT, anonymous data from DNT test administrations were analyzed and compared to a large, non-clinical normative sample from TMB. RESULTS DNT test scores showed acceptable to very good split-half reliability (.68-.99). Factor analysis revealed three latent factors, corresponding to processing speed, working memory, and a broader general cognitive ability factor that included perceptual reasoning and episodic memory. Average test scores were slightly poorer for the DNT sample than for the TMB comparison sample, as expected given the clinical use of the DNT. CONCLUSIONS Initial estimates of reliability and validity of DNT tests support their use as digital measures of neuropsychological functioning. Tests within cognitive domains correlated highly with each other and demonstrated good reliability and validity. Future work will seek to validate DNT tests in specific clinical populations and determine best practices for using DNT outcome measures to assess engagement and psychological symptomatology.
Affiliation(s)
- Shifali Singh
- Institute for Technology in Psychiatry, McLean Hospital, Belmont, MA, USA
- Department of Psychiatry, Harvard Medical School, Boston, MA, USA
- Roger W Strong
- Institute for Technology in Psychiatry, McLean Hospital, Belmont, MA, USA
- Department of Psychiatry, Harvard Medical School, Boston, MA, USA
- Laneé Jung
- Institute for Technology in Psychiatry, McLean Hospital, Belmont, MA, USA
- Department of Psychiatry, Harvard Medical School, Boston, MA, USA
- Frances Haofei Li
- Institute for Technology in Psychiatry, McLean Hospital, Belmont, MA, USA
- Department of Psychiatry, Harvard Medical School, Boston, MA, USA
- Liz Grinspoon
- Institute for Technology in Psychiatry, McLean Hospital, Belmont, MA, USA
- Department of Psychiatry, Harvard Medical School, Boston, MA, USA
- Luke S Scheuer
- Institute for Technology in Psychiatry, McLean Hospital, Belmont, MA, USA
- Department of Psychiatry, Harvard Medical School, Boston, MA, USA
- Eliza J Passell
- Institute for Technology in Psychiatry, McLean Hospital, Belmont, MA, USA
- Department of Psychiatry, Harvard Medical School, Boston, MA, USA
- Paolo Martini
- Institute for Technology in Psychiatry, McLean Hospital, Belmont, MA, USA
- Department of Psychiatry, Harvard Medical School, Boston, MA, USA
- Naomi Chaytor
- Elson S. Floyd College of Medicine, Washington State University, Spokane, WA, USA
- Jason R Soble
- Department of Psychiatry, University of Illinois College of Medicine, Chicago, IL, USA
- Department of Neurology, University of Illinois College of Medicine, Chicago, IL, USA
- Laura Germine
- Institute for Technology in Psychiatry, McLean Hospital, Belmont, MA, USA
- Department of Psychiatry, Harvard Medical School, Boston, MA, USA
Collapse
|
19
|
Jóhannsdóttir KR, Ferretti D, Árnadóttir BS, Jónsdóttir MK. Objective Measures of Cognitive Performance in Sleep Disorder Research. Sleep Med Clin 2021; 16:575-593. [PMID: 34711383 DOI: 10.1016/j.jsmc.2021.08.002] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/25/2022]
Abstract
Neurocognitive tests offer objective and reliable assessment of patients' status and progress. However, there is no consensus on how to use neurocognitive assessment in sleep disorder research. Effective use of neurocognitive assessment must be based on standardized practices and have a firm theoretical basis. The aim of this review is to offer an overview of how different tests have been used in the field, map each test onto a corresponding cognitive domain, and propose how to move forward with a suggested battery of tests covering all major cognitive domains.
Affiliation(s)
- Kamilla Rún Jóhannsdóttir
- Department of Psychology, Reykjavik University, Menntavegur 1, Reykjavik 102, Iceland; Reykjavik University Sleep Institute, School of Technology, Reykjavik University, Menntavegur 1, Reykjavik 102, Iceland
- Dimitri Ferretti
- Reykjavik University Sleep Institute, School of Technology, Reykjavik University, Menntavegur 1, Reykjavik 102, Iceland
- Birta Sóley Árnadóttir
- Department of Psychology, Reykjavik University, Menntavegur 1, Reykjavik 102, Iceland; Reykjavik University Sleep Institute, School of Technology, Reykjavik University, Menntavegur 1, Reykjavik 102, Iceland
- María Kristín Jónsdóttir
- Department of Psychology, Reykjavik University, Menntavegur 1, Reykjavik 102, Iceland; Reykjavik University Sleep Institute, School of Technology, Reykjavik University, Menntavegur 1, Reykjavik 102, Iceland; Landspitali University Hospital, Reykjavik, Iceland

20
Zuber S, Haas M, Framorando D, Ballhausen N, Gillioz E, Künzi M, Kliegel M. The Geneva Space Cruiser: a fully self-administered online tool to assess prospective memory across the adult lifespan. Memory 2021; 30:117-132. [PMID: 34699342 DOI: 10.1080/09658211.2021.1995435] [Citation(s) in RCA: 6] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/20/2022]
Abstract
The current study aimed to examine whether the Geneva Space Cruiser - a new online adaptation of the Cruiser - represents a valid, reliable and useful tool to assess prospective memory (PM) across the adult lifespan via fully self-administered online testing. To this end, an adult lifespan sample of 252 adults (19-86 years old) performed the Geneva Space Cruiser both in the laboratory and online, at home, and also performed a more traditional laboratory PM task. A second sample of 224 young adults (19-35 years old) participated in a test-retest online assessment of the Geneva Space Cruiser. Bayesian analyses showed that the Geneva Space Cruiser yielded similar results when administered in the laboratory versus online, both in terms of data distribution and key outcome measures (i.e., PM performance and monitoring). Results further showed very good test-retest reliability and acceptable construct validity. Finally, the online tool was sensitive in detecting age differences similar to those typically observed in laboratory studies. Together, our findings suggest that the Geneva Space Cruiser represents a rather valid, moderately to highly reliable, and generally useful tool to assess PM in online testing across wide ranges of the adult lifespan, with certain limitations for the oldest participants and for women.
Affiliation(s)
- S Zuber
- Centre for the Interdisciplinary Study of Gerontology and Vulnerability, University of Geneva, Geneva, Switzerland; Swiss National Centre of Competences in Research LIVES-Overcoming vulnerability: life course perspectives, Lausanne and Geneva, Switzerland; Department of Psychology, University of Geneva, Geneva, Switzerland
- M Haas
- Centre for the Interdisciplinary Study of Gerontology and Vulnerability, University of Geneva, Geneva, Switzerland; Department of Psychology, University of Geneva, Geneva, Switzerland
- D Framorando
- School of Psychology, The University of Queensland, Brisbane, Australia
- N Ballhausen
- Centre for the Interdisciplinary Study of Gerontology and Vulnerability, University of Geneva, Geneva, Switzerland; Department of Developmental Psychology, Tilburg University, Tilburg, The Netherlands
- E Gillioz
- Department of Psychology, University of Geneva, Geneva, Switzerland
- M Künzi
- Centre for the Interdisciplinary Study of Gerontology and Vulnerability, University of Geneva, Geneva, Switzerland; Swiss National Centre of Competences in Research LIVES-Overcoming vulnerability: life course perspectives, Lausanne and Geneva, Switzerland; Department of Psychology, University of Geneva, Geneva, Switzerland
- M Kliegel
- Centre for the Interdisciplinary Study of Gerontology and Vulnerability, University of Geneva, Geneva, Switzerland; Swiss National Centre of Competences in Research LIVES-Overcoming vulnerability: life course perspectives, Lausanne and Geneva, Switzerland; Department of Psychology, University of Geneva, Geneva, Switzerland

21
Segmentation of Prefrontal Lobe Based on Improved Clustering Algorithm in Patients with Diabetes. COMPUTATIONAL AND MATHEMATICAL METHODS IN MEDICINE 2021; 2021:8129044. [PMID: 34659449 PMCID: PMC8516534 DOI: 10.1155/2021/8129044] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 08/02/2021] [Revised: 09/12/2021] [Accepted: 09/14/2021] [Indexed: 11/18/2022]
Abstract
Diabetics are prone to postoperative cognitive dysfunction (POCD). Its occurrence may be related to damage to the prefrontal lobe. In this study, the prefrontal lobe was segmented based on an improved clustering algorithm in patients with diabetes, to evaluate the relationship between prefrontal lobe volume and POCD. A total of 48 diabetics who underwent selective noncardiac surgery were selected. Preoperative magnetic resonance imaging (MRI) images of the patients were segmented based on the improved clustering algorithm, and their prefrontal volume was measured. The correlation between the volume of the prefrontal lobe and Z-score or blood glucose was analyzed. Qualitative analysis showed that gray matter, white matter, and cerebrospinal fluid segmented by the improved clustering algorithm were easy to distinguish. Quantitative evaluation showed that the proposed segmentation algorithm obtained the optimal Jaccard coefficient and the least average segmentation time. There was a negative correlation between the volume of the prefrontal lobe and the Z-score. The cut-off value of prefrontal lobe volume for predicting POCD was <179.8, with high specificity. There was also a negative correlation between blood glucose and volume of the prefrontal lobe. From these results, we concluded that segmentation of the prefrontal lobe based on an improved clustering algorithm before operation may predict the occurrence of POCD in diabetics.
22
Yaneva A, Massaldjieva R, Mateva N. Initial Adaptation of the General Cognitive Assessment Battery by Cognifit™ for Bulgarian Older Adults. Exp Aging Res 2021; 48:336-350. [PMID: 34605370 DOI: 10.1080/0361073x.2021.1981096] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/15/2022]
Abstract
BACKGROUND Online neuropsychological assessment batteries may facilitate the screening of cognitive functions in older adults and could be useful for early diagnosis and detection of cognitive impairments. OBJECTIVE The primary aim of this study was to assess the psychometric qualities of an online multi-domain cognitive assessment battery (the General Cognitive Assessment Battery (GCAB) by Cognifit™) applied for the first time in Bulgaria. METHODS A total of 20 healthy older adults (6 male and 14 female, aged 60-82) completed the GCAB as well as the Mini-Mental State Examination (MMSE) and the Consortium to Establish a Registry for Alzheimer's Disease (CERAD) neuropsychological battery. Descriptive statistics were used to describe the demographic characteristics of the sample and the scores on the GCAB and the CERAD battery. The internal consistency of the GCAB was evaluated using item analysis and measured with Cronbach's alpha. The concurrent validity of the GCAB was assessed with respect to the CERAD using Spearman's r after verifying the linear relationship between the GCAB and CERAD scores. RESULTS The GCAB showed good concurrent validity when compared with the corresponding CERAD tests. The correlation coefficients ranged from 0.47 for short-term auditory memory to 0.67 for working memory. We found very good reliability of the GCAB, with the intraclass correlation coefficient higher than 0.8 for all cognitive domains. There were no significant correlations between MMSE and GCAB scores. CONCLUSION The GCAB was found to be valid for the cognitive screening of healthy Bulgarian older adults and may provide an adequate assessment of their cognitive status. It showed good concurrent validity when compared with the CERAD battery, measuring similar cognitive constructs. Further work is necessary to explore its validity and reliability.
Affiliation(s)
- Antonia Yaneva
- Department of Medical Informatics, Biostatistics and eLearning, Medical University of Plovdiv, Plovdiv, Bulgaria
- Radka Massaldjieva
- Department of Healthcare Management, Medical University of Plovdiv, Plovdiv, Bulgaria
- Nonka Mateva
- Department of Medical Informatics, Biostatistics and eLearning, Medical University of Plovdiv, Plovdiv, Bulgaria

23
LaPlume AA, Paterson TSE, Gardner S, Stokes KA, Freedman M, Levine B, Troyer AK, Anderson ND. Interindividual and intraindividual variability in amnestic mild cognitive impairment (aMCI) measured with an online cognitive assessment. J Clin Exp Neuropsychol 2021; 43:796-812. [PMID: 34556008 DOI: 10.1080/13803395.2021.1982867] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/20/2022]
Abstract
INTRODUCTION Mean cognitive performance is worse in amnestic mild cognitive impairment (aMCI) than in control groups. However, studies of variability in cognitive performance in aMCI have yielded inconclusive results, with many differences in variability measures and samples from one study to another. METHODS We examined variability in aMCI using an existing older adult sample (n = 91; 51 with aMCI, 40 with normal cognition for age), measured with an online self-administered computerized cognitive assessment (Cogniciti's Brain Health Assessment). Our methodology extended past findings by using pure measures of variability (controlling for confounding effects of group performance or practice) and a clinically representative aMCI sample (reflecting the continuum of cognitive performance between normal cognition and aMCI). RESULTS Between-group t-tests showed significantly greater between-person variability (interindividual variability, or diversity) in overall cognitive performance in aMCI than in controls, although the effect size was small to moderate, d = 0.44. No significant group differences were found in within-person variability (intraindividual variability) across cognitive tasks (dispersion) or across trials of a response time task (inconsistency), which may be because our sample spanned the continuum of cognitive performance. Exploratory correlation analyses showed that a worse overall score was associated with greater inter- and intraindividual variability, and that the variability measures were correlated with each other, indicating that people with worse cognitive performance were more variable. DISCUSSION The current study demonstrates that self-administered online tests can be used to remotely assess different types of variability in people at risk of Alzheimer's disease. Our findings show a small but significant increase in interindividual variability in people with aMCI. This diversity is treated as "noise" in standard assessments of mean performance, but offers an interesting and cognitively informative "signal" in itself.
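The three variability constructs contrasted in this abstract (diversity, dispersion, inconsistency) can be sketched on hypothetical z-scored data. This is a generic illustration of the definitions, not the study's analysis code:

```python
import numpy as np

def diversity(task_z):
    """Interindividual variability (diversity): spread of people's mean
    scores across the group. task_z: people x tasks, z-scored."""
    return np.std(task_z.mean(axis=1), ddof=1)

def dispersion(task_z):
    """Intraindividual variability across tasks (dispersion): each
    person's SD over their own task profile."""
    return np.std(task_z, axis=1, ddof=1)

def inconsistency(rt_trials):
    """Intraindividual variability across trials of one response-time
    task (inconsistency): each person's SD over their own trial RTs."""
    return np.std(rt_trials, axis=1, ddof=1)
```

Diversity is one number per group, while dispersion and inconsistency are one number per person, which is why group comparisons of the latter two use the per-person values.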
Affiliation(s)
- Annalise A LaPlume
- Rotman Research Institute, Baycrest (Fully Affiliated with the University of Toronto), Toronto, Canada
- Theone S E Paterson
- Department of Psychology, University of Victoria, Victoria, Canada; Neuropsychology and Cognitive Health Program, Baycrest, Toronto, Canada
- Sandra Gardner
- Rotman Research Institute, Baycrest (Fully Affiliated with the University of Toronto), Toronto, Canada; Biostatistics Division, Dalla Lana School of Public Health, University of Toronto, Toronto, Canada
- Kathryn A Stokes
- Neuropsychology and Cognitive Health Program, Baycrest, Toronto, Canada
- Morris Freedman
- Rotman Research Institute, Baycrest (Fully Affiliated with the University of Toronto), Toronto, Canada; Division of Neurology, Baycrest, Toronto, Canada; Department of Medicine, Division of Neurology, Mt. Sinai Hospital, Toronto, ON, Canada; Department of Medicine (Neurology), University of Toronto, Toronto, Canada
- Brian Levine
- Rotman Research Institute, Baycrest (Fully Affiliated with the University of Toronto), Toronto, Canada; Department of Medicine (Neurology), University of Toronto, Toronto, Canada; Department of Psychology, University of Toronto, Toronto, Canada
- Angela K Troyer
- Neuropsychology and Cognitive Health Program, Baycrest, Toronto, Canada; Department of Psychology, University of Toronto, Toronto, Canada
- Nicole D Anderson
- Rotman Research Institute, Baycrest (Fully Affiliated with the University of Toronto), Toronto, Canada; Department of Psychology, University of Toronto, Toronto, Canada; Department of Psychiatry, University of Toronto, Toronto, Canada

24
Visser LN, Dubbelman MA, Verrijp M, Wanders L, Pelt S, Zwan MD, Thijssen DH, Wouters H, Sikkes SA, van Hout HP, van der Flier WM. The Cognitive Online Self-Test Amsterdam (COST-A): Establishing norm scores in a community-dwelling population. ALZHEIMER'S & DEMENTIA (AMSTERDAM, NETHERLANDS) 2021; 13:e12234. [PMID: 34541288 PMCID: PMC8438682 DOI: 10.1002/dad2.12234] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 06/25/2021] [Accepted: 07/01/2021] [Indexed: 12/27/2022]
Abstract
BACKGROUND Heightened public awareness about Alzheimer's disease and dementia increases the need for at-home cognitive self-testing. We offered the Cognitive Online Self-Test Amsterdam (COST-A) to independent groups of cognitively normal adults and investigated the robustness of a norm-score formula and cutoff. METHODS Three thousand eighty-eight participants (mean age ± standard deviation = 61 ± 12 years, 70% female) completed COST-A and evaluated it. Demographically adjusted norm scores were computed as the difference between actual COST-A scores and those expected based on age, gender, and education. We applied the resulting norm-score formula to two independent cohorts. RESULTS Participants evaluated COST-A to be of adequate difficulty and duration. Our norm-score formula was shown to be robust: ≈8% of participants in two cognitively normal cohorts had abnormal scores. A cutoff of -1.5 standard deviations proved optimal for distinguishing normal from impaired cognition. CONCLUSION With robust norm scores, COST-A is a promising new tool for research and clinical practice, providing low-cost and minimally invasive remote assessment of cognitive functioning.
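The demographically adjusted norm-score logic this abstract describes can be sketched as a regression-based adjustment with a -1.5 SD cutoff. The regression form and variable coding here are assumptions for illustration; the published COST-A formula is not reproduced:

```python
import numpy as np

def norm_scores(actual, age, gender, education):
    """Demographically adjusted norm scores, sketched: regress test
    scores on age, gender, and education in a normative sample,
    standardize the residuals (actual minus expected), and flag
    scores more than 1.5 SD below expectation. Illustrative only."""
    X = np.column_stack([np.ones(len(age)), age, gender, education])
    beta, *_ = np.linalg.lstsq(X, actual, rcond=None)   # expected-score model
    z = actual - X @ beta                               # actual minus expected
    z = z / z.std(ddof=1)                               # standardize residuals
    return z, z < -1.5                                  # -1.5 SD cutoff
```

In a cognitively normal sample, roughly 7% of standardized residuals fall below -1.5 SD, in line with the ≈8% abnormal rate reported above.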
Affiliation(s)
- Leonie N.C. Visser
- Alzheimer Center Amsterdam, Department of Neurology, Amsterdam Neuroscience, Amsterdam UMC, VU University Medical Center, Amsterdam, the Netherlands
- Division of Clinical Geriatrics, Center for Alzheimer Research, Department of Neurobiology, Care Sciences and Society, Karolinska Institutet, Stockholm, Sweden
- Mark A. Dubbelman
- Alzheimer Center Amsterdam, Department of Neurology, Amsterdam Neuroscience, Amsterdam UMC, VU University Medical Center, Amsterdam, the Netherlands
- Merike Verrijp
- Alzheimer Center Amsterdam, Department of Neurology, Amsterdam Neuroscience, Amsterdam UMC, VU University Medical Center, Amsterdam, the Netherlands
- Lisa Wanders
- Radboud Institute for Health Sciences, Department of Physiology, Radboud University Medical Center, Nijmegen, The Netherlands
- Top Institute Food and Nutrition, Wageningen, The Netherlands
- Sophie Pelt
- Alzheimer Center Amsterdam, Department of Neurology, Amsterdam Neuroscience, Amsterdam UMC, VU University Medical Center, Amsterdam, the Netherlands
- Marissa D. Zwan
- Alzheimer Center Amsterdam, Department of Neurology, Amsterdam Neuroscience, Amsterdam UMC, VU University Medical Center, Amsterdam, the Netherlands
- Dick H.J. Thijssen
- Radboud Institute for Health Sciences, Department of Physiology, Radboud University Medical Center, Nijmegen, The Netherlands
- Hans Wouters
- General Practitioners Research Institute, Groningen, The Netherlands
- Sietske A.M. Sikkes
- Alzheimer Center Amsterdam, Department of Neurology, Amsterdam Neuroscience, Amsterdam UMC, VU University Medical Center, Amsterdam, the Netherlands
- Faculty of Behavioural and Movement Sciences, Clinical Developmental Psychology & Clinical Neuropsychology, Vrije Universiteit Amsterdam, Amsterdam, The Netherlands
- Hein P.J. van Hout
- Department of General Practice and Medicine for Older Persons, Amsterdam Institute for Public Health Care Research, Vrije Universiteit Amsterdam, Amsterdam, The Netherlands
- Wiesje M. van der Flier
- Alzheimer Center Amsterdam, Department of Neurology, Amsterdam Neuroscience, Amsterdam UMC, VU University Medical Center, Amsterdam, the Netherlands
- Department of Epidemiology and Biostatistics, Amsterdam UMC, Amsterdam, The Netherlands

25
Mackin RS, Rhodes E, Insel PS, Nosheny R, Finley S, Ashford M, Camacho MR, Truran D, Mosca K, Seabrook G, Morrison R, Narayan VA, Weiner M. Reliability and Validity of a Home-Based Self-Administered Computerized Test of Learning and Memory Using Speech Recognition. AGING NEUROPSYCHOLOGY AND COGNITION 2021; 29:867-881. [PMID: 34139954 PMCID: PMC10081827 DOI: 10.1080/13825585.2021.1927961] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 10/21/2022]
Abstract
INTRODUCTION The objective of this study was to evaluate the reliability and validity of the ReVeRe™ word list recall test (RWLRT), which uses speech recognition, when administered remotely and unsupervised. METHODS Prospective cohort study. Participants included 249 cognitively intact, community-dwelling older adults. Measures included clinician-administered neuropsychological assessments at baseline and unsupervised, remotely administered tests of cognition at six time points over six months. RESULTS The RWLRT showed acceptable validity. Reliability coefficients varied across time points, with poor reliability between times 1 and 2 and fair-to-good reliability across the remaining five testing sessions. Practice effects were observed with repeated administration, as expected. DISCUSSION Unsupervised computerized tests of cognition, particularly word list learning and memory tests that use speech recognition, have significant potential for large-scale early detection and long-term tracking of cognitive decline due to Alzheimer's disease (AD).
Affiliation(s)
- R Scott Mackin
- Department of Psychiatry and Behavioral Sciences, University of California, San Francisco, USA; Center for Imaging of Neurodegenerative Diseases (CIND), San Francisco Veterans Affairs Medical Center, USA
- Emma Rhodes
- Center for Imaging of Neurodegenerative Diseases (CIND), San Francisco Veterans Affairs Medical Center, USA; Mental Illness Research Education and Clinical Centers, Veterans Administration Medical Center, San Francisco, CA, USA
- Philip S Insel
- Department of Psychiatry and Behavioral Sciences, University of California, San Francisco, USA
- Rachel Nosheny
- Department of Psychiatry and Behavioral Sciences, University of California, San Francisco, USA; Center for Imaging of Neurodegenerative Diseases (CIND), San Francisco Veterans Affairs Medical Center, USA
- Shannon Finley
- Center for Imaging of Neurodegenerative Diseases (CIND), San Francisco Veterans Affairs Medical Center, USA
- Miriam Ashford
- Center for Imaging of Neurodegenerative Diseases (CIND), San Francisco Veterans Affairs Medical Center, USA
- Monica R Camacho
- Center for Imaging of Neurodegenerative Diseases (CIND), San Francisco Veterans Affairs Medical Center, USA
- Diana Truran
- Center for Imaging of Neurodegenerative Diseases (CIND), San Francisco Veterans Affairs Medical Center, USA
- Michael Weiner
- Department of Psychiatry and Behavioral Sciences, University of California, San Francisco, USA; Center for Imaging of Neurodegenerative Diseases (CIND), San Francisco Veterans Affairs Medical Center, USA; Department of Radiology, University of California, San Francisco, USA

26
Tsoy E, Zygouris S, Possin KL. Current State of Self-Administered Brief Computerized Cognitive Assessments for Detection of Cognitive Disorders in Older Adults: A Systematic Review. JPAD-JOURNAL OF PREVENTION OF ALZHEIMERS DISEASE 2021; 8:267-276. [PMID: 34101783 PMCID: PMC7987552 DOI: 10.14283/jpad.2021.11] [Citation(s) in RCA: 15] [Impact Index Per Article: 5.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Indexed: 01/28/2023]
Abstract
Early diagnosis of cognitive disorders in older adults is a major healthcare priority with benefits to patients, families, and health systems. Rapid advances in digital technology offer potential for developing innovative diagnostic pathways to support early diagnosis. Brief self-administered computerized cognitive tools in particular hold promise for clinical implementation by minimizing demands on staff time. In this study, we conducted a systematic review of self-administered computerized cognitive assessment measures designed for the detection of cognitive impairment in older adults. Studies were identified via a systematic search of published peer-reviewed literature across major scientific databases. All studies reporting on psychometric validation of brief (≤30 minutes) self-administered computerized measures for detection of mild cognitive impairment (MCI) and all-cause dementia in older adults were included. Seventeen studies reporting on 10 cognitive tools met inclusion criteria and were subjected to systematic review. There was substantial variability in the characteristics of validation samples and in reliability and validity estimates. Only 2 measures evaluated feasibility and usability in the intended clinical settings. Similar to past reviews, we found variability across measures with regard to psychometric rigor and potential for wide-scale applicability in clinical settings. Despite the promise that self-administered cognitive tests hold for clinical implementation, important gaps remain in the scientific rigor of development, validation, and feasibility studies of these measures. Developments in technology and biomarker studies provide potential avenues for future work on the use of digital technology in clinical care.
Affiliation(s)
- E Tsoy
- Katherine L. Possin, PhD, Associate Professor in Residence, Department of Neurology, University of California San Francisco, Memory and Aging Center, Box 1207, 675 Nelson Rising Lane, Suite 190, San Francisco, CA 94158, Tel: 415-476-1889, E-mail:

27
Noguchi-Shinohara M, Domoto C, Yoshida T, Niwa K, Yuki-Nozaki S, Samuraki-Yokohama M, Sakai K, Hamaguchi T, Ono K, Iwasa K, Matsunari I, Komai K, Nakamura H, Yamada M. A new computerized assessment battery for cognition (C-ABC) to detect mild cognitive impairment and dementia around 5 min. PLoS One 2020; 15:e0243469. [PMID: 33306697 PMCID: PMC7732101 DOI: 10.1371/journal.pone.0243469] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/20/2020] [Accepted: 11/21/2020] [Indexed: 11/24/2022] Open
Abstract
This study aimed to develop a new computerized assessment battery for cognition (C-ABC) to detect mild cognitive impairment (MCI) and dementia. We administered the C-ABC to subjects with dementia (n = 422), MCI (n = 145), and normal cognition (NC; n = 574), and analyzed the data by age stratum (50s, 60s, and 70-85 years). To distinguish MCI from NC, the C-ABC total combined score, which was calculated by dividing the C-ABC total score by the C-ABC required time, revealed the best areas under the curve (AUCs), at 0.838 and 0.735 in the 50s and 60s age groups, respectively; notably, this entire procedure took approximately 5 min. To distinguish dementia from NC and MCI, partial items of the C-ABC (the items 3 + 6 combined score) revealed the best AUCs, at 0.910, 0.874, and 0.882 in the 50s, 60s, and 70-85 age groups, respectively. Furthermore, the items 3 + 6 combined score yielded the best AUC, 0.794, in the 70-85 age group for distinguishing MCI from NC; this procedure took around 2 min. Hence, this study suggests that the C-ABC could be a useful tool for detecting dementia or MCI in a short time.
Affiliation(s)
- Moeko Noguchi-Shinohara
- Department of Neurology and Neurobiology of Aging, Kanazawa University Graduate School of Medical Sciences, Kanazawa University, Kanazawa, Japan; Department of Preemptive Medicine for Dementia, Kanazawa University Graduate School of Medical Sciences, Kanazawa University, Kanazawa, Japan
- Chiaki Domoto
- Department of Neurology and Neurobiology of Aging, Kanazawa University Graduate School of Medical Sciences, Kanazawa University, Kanazawa, Japan
- Taketoshi Yoshida
- Department of Knowledge Science of Japan Advanced Institute of Science and Technology (JAIST), Nomi, Ishikawa, Japan
- Kozue Niwa
- Department of Neurology and Neurobiology of Aging, Kanazawa University Graduate School of Medical Sciences, Kanazawa University, Kanazawa, Japan
- Sohshi Yuki-Nozaki
- Department of Neurology and Neurobiology of Aging, Kanazawa University Graduate School of Medical Sciences, Kanazawa University, Kanazawa, Japan
- Miharu Samuraki-Yokohama
- Department of Neurology and Neurobiology of Aging, Kanazawa University Graduate School of Medical Sciences, Kanazawa University, Kanazawa, Japan
- Kenji Sakai
- Department of Neurology and Neurobiology of Aging, Kanazawa University Graduate School of Medical Sciences, Kanazawa University, Kanazawa, Japan
- Tsuyoshi Hamaguchi
- Department of Neurology and Neurobiology of Aging, Kanazawa University Graduate School of Medical Sciences, Kanazawa University, Kanazawa, Japan
- Kenjiro Ono
- Department of Neurology and Neurobiology of Aging, Kanazawa University Graduate School of Medical Sciences, Kanazawa University, Kanazawa, Japan; Division of Neurology, Department of Internal Medicine, Showa University School of Medicine, Tokyo, Japan
- Kazuo Iwasa
- Department of Neurology and Neurobiology of Aging, Kanazawa University Graduate School of Medical Sciences, Kanazawa University, Kanazawa, Japan; Department of Health and Medical Sciences, Ishikawa Prefectural Nursing University, Kahoku, Ishikawa, Japan
- Ichiro Matsunari
- Division of Nuclear Medicine, Department of Radiology, Saitama Medical University Hospital, Saitama, Japan
- Kiyonobu Komai
- Department of Neurology, Hokuriku Brain and Neuromuscular Disease Center, National Hospital Organization Iou National Hospital, Kanazawa, Japan
- Hiroyuki Nakamura
- Department of Public Health, Kanazawa University Graduate School of Advanced Preventive Medical Sciences, Kanazawa, Japan; Kanazawa University Advanced Preventive Medical Sciences Research Center, Kanazawa, Japan; Department of Environmental and Preventive Medicine, Kanazawa University Graduate School of Medical Sciences, Kanazawa University, Kanazawa, Japan
- Masahito Yamada
- Department of Neurology and Neurobiology of Aging, Kanazawa University Graduate School of Medical Sciences, Kanazawa University, Kanazawa, Japan

28
Backx R, Skirrow C, Dente P, Barnett JH, Cormack FK. Comparing Web-Based and Lab-Based Cognitive Assessment Using the Cambridge Neuropsychological Test Automated Battery: A Within-Subjects Counterbalanced Study. J Med Internet Res 2020; 22:e16792. [PMID: 32749999 PMCID: PMC7435628 DOI: 10.2196/16792] [Citation(s) in RCA: 53] [Impact Index Per Article: 13.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/29/2019] [Revised: 05/15/2020] [Accepted: 06/11/2020] [Indexed: 12/22/2022] Open
Abstract
Background Computerized assessments are already used to derive accurate and reliable measures of cognitive function. Web-based cognitive assessment could improve the accessibility and flexibility of research and clinical assessment, widen participation, and promote research recruitment while simultaneously reducing costs. However, differences in context may influence task performance. Objective This study aims to determine the comparability of an unsupervised, web-based administration of the Cambridge Neuropsychological Test Automated Battery (CANTAB) against a typical in-person lab-based assessment, using a within-subjects counterbalanced design. The study aims to test (1) reliability, quantifying the relationship between measurements across settings using correlational approaches; (2) equivalence, the extent to which test results in different settings produce similar overall results; and (3) agreement, by quantifying acceptable limits to bias and differences between measurement environments. Methods A total of 51 healthy adults (32 women and 19 men; mean age 36.8, SD 15.6 years) completed 2 testing sessions an average of 1 week apart (SD 4.5 days). Assessments included equivalent tests of emotion recognition (emotion recognition task [ERT]), visual recognition (pattern recognition memory [PRM]), episodic memory (paired associate learning [PAL]), working memory and spatial planning (spatial working memory [SWM] and One Touch Stockings of Cambridge), and sustained attention (rapid visual information processing [RVP]). Participants were randomly allocated to one of the two groups, either assessed in-person in the laboratory first (n=33) or with unsupervised web-based assessments on their personal computing systems first (n=18). Performance indices (errors, correct trials, and response sensitivity) and median reaction times were extracted.
Intraclass and bivariate correlations examined intersetting reliability, linear mixed models and Bayesian paired sample t tests tested for equivalence, and Bland-Altman plots examined agreement. Results Intraclass correlation (ICC) coefficients ranged from ρ=0.23-0.67, with high correlations in 3 performance indices (from PAL, SWM, and RVP tasks; ρ≥0.60). High ICC values were also seen for reaction time measures from 2 tasks (PRM and ERT tasks; ρ≥0.60). However, reaction times were slower during web-based assessments, which undermined both equivalence and agreement for reaction time measures. Performance indices did not differ between assessment settings and generally showed satisfactory agreement. Conclusions Our findings support the comparability of CANTAB performance indices (errors, correct trials, and response sensitivity) in unsupervised, web-based assessments with in-person and laboratory tests. Reaction times are not as easily translatable from in-person to web-based testing, likely due to variations in computer hardware. The results underline the importance of examining more than one index to ascertain comparability, as high correlations can present in the context of systematic differences, which are a product of differences between measurement environments. Further work is now needed to examine web-based assessments in clinical populations and in larger samples to improve sensitivity for detecting subtler differences between test settings.
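The reliability and agreement statistics named in this abstract (intraclass correlation; Bland-Altman bias and limits of agreement) follow standard formulas. A minimal sketch with toy data, not the study's measurements; the function names are illustrative:

```python
import numpy as np

def bland_altman_limits(lab, web):
    """Mean bias and 95% limits of agreement between two settings."""
    lab, web = np.asarray(lab, float), np.asarray(web, float)
    diff = web - lab                      # web-based minus lab-based scores
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

def icc_2_1(lab, web):
    """Two-way random, absolute-agreement ICC(2,1) for two settings,
    via the standard Shrout-Fleiss mean-squares decomposition."""
    x = np.column_stack([lab, web]).astype(float)
    n, k = x.shape
    grand = x.mean()
    ms_rows = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)
    ms_cols = n * ((x.mean(axis=0) - grand) ** 2).sum() / (k - 1)
    resid = x - x.mean(axis=1, keepdims=True) - x.mean(axis=0) + grand
    ms_err = (resid ** 2).sum() / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n
    )
```

A constant offset between settings (such as the slower web-based reaction times reported above) leaves the correlation high while shifting the Bland-Altman bias away from zero, which is exactly why the authors examine both.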
Affiliation(s)
- Rosa Backx
- Cambridge Cognition Ltd, Cambridge, United Kingdom
- Caroline Skirrow
- Cambridge Cognition Ltd, Cambridge, United Kingdom; School of Psychological Science, University of Bristol, Bristol, United Kingdom
- Jennifer H Barnett
- Cambridge Cognition Ltd, Cambridge, United Kingdom; Department of Psychiatry, University of Cambridge, Cambridge, United Kingdom
29
Perin S, Buckley RF, Pase MP, Yassi N, Lavale A, Wilson PH, Schembri A, Maruff P, Lim YY. Unsupervised assessment of cognition in the Healthy Brain Project: Implications for web-based registries of individuals at risk for Alzheimer's disease. Alzheimers Dement (N Y) 2020; 6:e12043. [PMID: 32607409] [PMCID: PMC7317647] [DOI: 10.1002/trc2.12043]
Abstract
INTRODUCTION: Web-based platforms are used increasingly to assess cognitive function in unsupervised settings. The utility of cognitive data arising from unsupervised assessments remains unclear. We examined the acceptability, usability, and validity of unsupervised cognitive testing in middle-aged adults enrolled in the Healthy Brain Project.
METHODS: A total of 1594 participants completed unsupervised assessments of the Cogstate Brief Battery. Acceptability was defined by the amount of missing data, and usability by examining errors in test performance and the time taken to read task instructions and complete tests (learnability).
RESULTS: Overall, we observed high acceptability (98% complete data) and high usability (95% met criteria for low error rates and high learnability). Test validity was confirmed by observation of the expected inverse relationships between performance and increasing test difficulty and age.
CONCLUSION: Consideration of test design, paired with acceptability and usability criteria, can provide valid indices of cognition in the unsupervised settings used to develop registries of individuals at risk for Alzheimer's disease.
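The acceptability and usability criteria described here (share of complete records; per-participant error-rate and instruction-reading-time thresholds) can be expressed as a simple data screen. The field names and thresholds below are illustrative, not those used in the Healthy Brain Project:

```python
def screen_sessions(sessions, max_error_rate=0.2, max_read_secs=120):
    """Return (acceptability, usability) for a set of test sessions.

    acceptability: fraction of sessions with complete score data.
    usability: fraction of complete sessions meeting the error-rate
    and learnability (instruction reading time) criteria.
    """
    complete = [s for s in sessions if s.get("score") is not None]
    acceptability = len(complete) / len(sessions)
    usable = [
        s for s in complete
        if s["error_rate"] <= max_error_rate and s["read_secs"] <= max_read_secs
    ]
    usability = len(usable) / len(complete) if complete else 0.0
    return acceptability, usability
```

Applied to the study's numbers, such a screen would report acceptability of 0.98 and usability of 0.95 over the 1594 unsupervised sessions.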
Affiliation(s)
- Stephanie Perin
- Melbourne Dementia Research Centre, Florey Institute of Neuroscience and Mental Health and the University of Melbourne, Parkville, Victoria, Australia
- School of Psychological Sciences, Turner Institute for Brain and Mental Health, Monash University, Clayton, Victoria, Australia
- School of Psychology, Faculty of Health Sciences, Australian Catholic University, Melbourne, Victoria, Australia
- Rachel F. Buckley
- Melbourne Dementia Research Centre, Florey Institute of Neuroscience and Mental Health and the University of Melbourne, Parkville, Victoria, Australia
- Melbourne School of Psychological Sciences, University of Melbourne, Parkville, Victoria, Australia
- Department of Neurology, Massachusetts General Hospital and Harvard Medical School, Boston, Massachusetts, USA
- Center for Alzheimer Research and Treatment, Department of Neurology, Brigham and Women's Hospital, Boston, Massachusetts, USA
- Matthew P. Pase
- Melbourne Dementia Research Centre, Florey Institute of Neuroscience and Mental Health and the University of Melbourne, Parkville, Victoria, Australia
- School of Psychological Sciences, Turner Institute for Brain and Mental Health, Monash University, Clayton, Victoria, Australia
- Harvard T.H. Chan School of Public Health, Boston, Massachusetts, USA
- Nawaf Yassi
- Melbourne Dementia Research Centre, Florey Institute of Neuroscience and Mental Health and the University of Melbourne, Parkville, Victoria, Australia
- Department of Medicine and Neurology, Melbourne Brain Centre at The Royal Melbourne Hospital, University of Melbourne, Parkville, Victoria, Australia
- Population Health and Immunity Division, The Walter and Eliza Hall Institute of Medical Research, Parkville, Victoria, Australia
- Alexandra Lavale
- Melbourne Dementia Research Centre, Florey Institute of Neuroscience and Mental Health and the University of Melbourne, Parkville, Victoria, Australia
- School of Psychological Sciences, Turner Institute for Brain and Mental Health, Monash University, Clayton, Victoria, Australia
- Peter H. Wilson
- School of Psychology, Faculty of Health Sciences, Australian Catholic University, Melbourne, Victoria, Australia
- Paul Maruff
- Melbourne Dementia Research Centre, Florey Institute of Neuroscience and Mental Health and the University of Melbourne, Parkville, Victoria, Australia
- Cogstate Ltd, Melbourne, Victoria, Australia
- Yen Ying Lim
- Melbourne Dementia Research Centre, Florey Institute of Neuroscience and Mental Health and the University of Melbourne, Parkville, Victoria, Australia
- School of Psychological Sciences, Turner Institute for Brain and Mental Health, Monash University, Clayton, Victoria, Australia
30
Vermeent S, Dotsch R, Schmand B, Klaming L, Miller JB, van Elswijk G. Evidence of Validity for a Newly Developed Digital Cognitive Test Battery. Front Psychol 2020; 11:770. [PMID: 32390918] [PMCID: PMC7194127] [DOI: 10.3389/fpsyg.2020.00770]
Abstract
Clinical practice still relies heavily on traditional paper-and-pencil testing to assess a patient’s cognitive functions. Digital technology has the potential to be an efficient and powerful alternative, but for many of the existing digital tests and test batteries the psychometric properties have not been properly established. We validated a newly developed digital test battery consisting of digitized versions of conventional neuropsychological tests. Two confirmatory factor analysis models were specified: a model based on traditional neuropsychological theory and expert consensus and one based on the Cattell-Horn-Carroll (CHC) taxonomy. For both models, the outcome measures of the digital tests loaded on the cognitive domains in the same way as established in the neuropsychological literature. Interestingly, no clear distinction could be made between the CHC model and traditional neuropsychological model in terms of model fit. Taken together, these findings provide preliminary evidence for the structural validity of the digital cognitive test battery.
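Competing factor models of the kind compared here are typically judged on fit statistics such as the RMSEA, computed from each fitted model's chi-square and degrees of freedom. A worked sketch of that standard formula; the numbers are made up, not the study's:

```python
import math

def rmsea(chi_square, df, n_obs):
    """Root mean square error of approximation for a fitted CFA model.

    Values near 0 indicate close fit; chi_square <= df yields exactly 0.
    """
    return math.sqrt(max(chi_square - df, 0.0) / (df * (n_obs - 1)))

# Two models with nearly identical chi-squares, as in the paper's
# comparison, produce RMSEA values too close to distinguish:
fit_traditional = rmsea(chi_square=120.0, df=80, n_obs=401)
fit_chc = rmsea(chi_square=121.0, df=80, n_obs=401)
```

When the traditional neuropsychological model and the CHC model fit this similarly, fit indices alone cannot adjudicate between them, which is the situation the abstract reports.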
Affiliation(s)
- Stefan Vermeent
- Digital Cognitive Diagnostics, Philips Healthcare, Eindhoven, Netherlands
- Ron Dotsch
- Department of Brain, Behavior and Cognition, Philips Research, Eindhoven, Netherlands
- Ben Schmand
- Digital Cognitive Diagnostics, Philips Healthcare, Eindhoven, Netherlands
- Laura Klaming
- Department of Brain, Behavior and Cognition, Philips Research, Eindhoven, Netherlands
- Justin B Miller
- Cleveland Clinic Lou Ruvo Center for Brain Health, Las Vegas, NV, United States
- Gijs van Elswijk
- Digital Cognitive Diagnostics, Philips Healthcare, Eindhoven, Netherlands
31
Noll KR, Bradshaw ME, Parsons MW, Dawson EL, Rexer J, Wefel JS. Monitoring of Neurocognitive Function in the Care of Patients with Brain Tumors. Curr Treat Options Neurol 2019; 21:33. [PMID: 31250277] [DOI: 10.1007/s11940-019-0573-2]
Abstract
PURPOSE OF REVIEW: A detailed characterization of the nature of neurocognitive impairment in patients with brain tumors is provided, as well as considerations for clinical practice regarding neuropsychological assessment throughout the disease course.
RECENT FINDINGS: Neurocognitive impairment is common in patients with brain tumors and may result from the tumor itself, as a consequence of treatment (including surgery, chemotherapy, and radiation), or in association with supportive care medications (e.g., anticonvulsant and pain medications). Serial surveillance of neurocognitive functioning in this population can facilitate medical decision-making and inform recommendations to improve patient daily functioning and quality of life. Neuropsychological assessment is increasingly recognized as a critical component of the multidisciplinary care of patients with brain tumors and has already had practice-changing effects. Further understanding of genetic risk factors for neurocognitive decline, along with the development of novel assessment and intervention strategies, may further enhance functioning and general well-being in this patient population.
Affiliation(s)
- Kyle R Noll
- Section of Neuropsychology, Department of Neuro-Oncology, The University of Texas MD Anderson Cancer Center, 1515 Holcombe Boulevard, Unit 431, Houston, TX, 77030, USA
- Mariana E Bradshaw
- Section of Neuropsychology, Department of Neuro-Oncology, The University of Texas MD Anderson Cancer Center, 1515 Holcombe Boulevard, Unit 431, Houston, TX, 77030, USA
- Michael W Parsons
- Department of Neuro-Oncology, Psychology Assessment Center, Massachusetts General Hospital, Boston, MA, 02114, USA
- Erica L Dawson
- Department of Psychiatry and Behavioral Health, The Ohio State University, Columbus, OH, 43210, USA
- Jennie Rexer
- Section of Neuropsychology, Department of Neuro-Oncology, The University of Texas MD Anderson Cancer Center, 1515 Holcombe Boulevard, Unit 431, Houston, TX, 77030, USA
- Jeffrey S Wefel
- Section of Neuropsychology, Department of Neuro-Oncology, The University of Texas MD Anderson Cancer Center, 1515 Holcombe Boulevard, Unit 431, Houston, TX, 77030, USA; Department of Radiation Oncology, The University of Texas MD Anderson Cancer Center, Houston, TX, 77030, USA
32
Witlox L, Schagen SB, de Ruiter MB, Geerlings MI, Peeters PHM, Koevoets EW, van der Wall E, Stuiver M, Sonke G, Velthuis MJ, van der Palen JAM, Jobsen JJ, May AM, Monninkhof EM. Effect of physical exercise on cognitive function and brain measures after chemotherapy in patients with breast cancer (PAM study): protocol of a randomised controlled trial. BMJ Open 2019; 9:e028117. [PMID: 31227537] [PMCID: PMC6597001] [DOI: 10.1136/bmjopen-2018-028117]
Abstract
INTRODUCTION: After treatment with chemotherapy, many patients with breast cancer experience cognitive problems. While few interventions are available to improve cognitive functioning, physical exercise has shown positive effects in healthy older adults and people with mild cognitive impairment. The Physical Activity and Memory (PAM) study aims to investigate the effect of physical exercise on cognitive functioning and brain measures in chemotherapy-exposed patients with breast cancer with cognitive problems.
METHODS AND ANALYSIS: One hundred and eighty patients with breast cancer who have cognitive problems 2-4 years after diagnosis are randomised (1:1) into an exercise intervention or a control group. The 6-month exercise intervention consists of twice-weekly 1-hour aerobic and strength exercise sessions supervised by a physiotherapist and twice-weekly 1-hour Nordic or power walking. The control group is asked to maintain its habitual activity pattern for 6 months. The primary outcome (verbal learning) is measured at baseline and 6 months. Further measurements include online neuropsychological tests, self-reported cognitive complaints, 3-Tesla brain MRI, patient-reported outcomes (quality of life, fatigue, depression, anxiety, work performance), blood sampling, and physical fitness. The MRI scans and blood sampling will be used to gain insight into underlying mechanisms. At 18 months, the online neuropsychological tests, self-reported cognitive complaints, and patient-reported outcomes will be repeated.
ETHICS AND DISSEMINATION: Study results may impact usual care if physical exercise is found to improve cognitive functioning for breast cancer survivors.
TRIAL REGISTRATION NUMBER: NTR6104.
Affiliation(s)
- Lenja Witlox
- Julius Center for Health Sciences and Primary Care, University Medical Center Utrecht, Utrecht, The Netherlands
- Sanne B Schagen
- Division of Psychosocial Research and Epidemiology, Netherlands Cancer Institute, Amsterdam, The Netherlands
- Michiel B de Ruiter
- Division of Psychosocial Research and Epidemiology, Netherlands Cancer Institute, Amsterdam, The Netherlands
- Mirjam I Geerlings
- Julius Center for Health Sciences and Primary Care, University Medical Center Utrecht, Utrecht, The Netherlands
- Petra H M Peeters
- Julius Center for Health Sciences and Primary Care, University Medical Center Utrecht, Utrecht, The Netherlands
- Emmie W Koevoets
- Julius Center for Health Sciences and Primary Care, University Medical Center Utrecht, Utrecht, The Netherlands
- Division of Psychosocial Research and Epidemiology, Netherlands Cancer Institute, Amsterdam, The Netherlands
- Elsken van der Wall
- Department of Medical Oncology, University Medical Center Utrecht, Utrecht, The Netherlands
- Martijn Stuiver
- Center for Quality of Life, Netherlands Cancer Institute, Amsterdam, The Netherlands
- ACHIEVE Center of Applied Research, Faculty of Health, University of Applied Sciences, Amsterdam, The Netherlands
- Gabe Sonke
- Center for Quality of Life, Netherlands Cancer Institute, Amsterdam, The Netherlands
- Miranda J Velthuis
- Netherlands Comprehensive Cancer Organisation (IKNL), Utrecht, The Netherlands
- Job A M van der Palen
- Medical School Twente, Medisch Spectrum Twente, Enschede, The Netherlands
- Department of Research Methodology, Measurement, Universiteit Twente, Enschede, The Netherlands
- Jan J Jobsen
- Medical School Twente, Medisch Spectrum Twente, Enschede, The Netherlands
- Anne M May
- Julius Center for Health Sciences and Primary Care, University Medical Center Utrecht, Utrecht, The Netherlands
- E M Monninkhof
- Julius Center for Health Sciences and Primary Care, University Medical Center Utrecht, Utrecht, The Netherlands
33
Bott NT, Madero EN, Glenn JM, Lange AR, Anderson JJ, Newton DO, Brennan AH, Buffalo EA, Rentz DM, Zola SM. Device-Embedded Cameras for Eye Tracking-Based Cognitive Assessment: Implications for Teleneuropsychology. Telemed J E Health 2019; 26:477-481. [PMID: 31161968] [DOI: 10.1089/tmj.2019.0039]
Abstract
Introduction: Widespread screening for cognitive decline is an important challenge to address as the aging population grows, but there is currently a shortage of clinical infrastructure to meet the demand for in-person evaluation. Remotely delivered assessments that utilize eye-tracking data from webcams, such as visual paired comparison (VPC) tasks, could increase access to remote, asynchronous neuropsychological screening for cognitive decline, but further validation against clinical-grade eye trackers is required.
Methods: To demonstrate equivalence between a novel automated scoring system for eye-tracking metrics acquired through a laptop-embedded camera and a gold-standard eye tracker, we analyzed VPC data from 18 subjects aged 50+ with normal cognitive function across three visits. The eye-tracker data were scored by the manufacturer's software, and the webcam data were scored by a novel algorithm.
Results: Automated scoring of webcam-based VPC data revealed strong correlations with the clinical-grade eye-tracking camera. Correlation of mean VPC performance across all time points was robust: r = 0.95 (T1 r = 0.97; T2 r = 0.88; T3 r = 0.97; all p < 0.001). Correlation of per-trial performance across time points was also robust: r = 0.88 (T1 r = 0.85; T2 r = 0.89; T3 r = 0.92; all p < 0.001). Mean differences between performance data acquired by each device were 0.00.
Conclusion: These results suggest that device-embedded cameras are a valid and scalable alternative to traditional laboratory-based equipment for gaze-based tasks measuring cognitive function. The validation of this technique represents an important technical advance for the field of teleneuropsychology.
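VPC tasks are conventionally scored as a novelty preference: the share of on-image looking time spent on the novel rather than the familiar image. A minimal scoring sketch over labeled gaze samples; the labels and data are hypothetical, not the algorithm used by either device in the study:

```python
def novelty_preference(gaze_labels):
    """Fraction of on-image gaze samples directed at the novel image.

    gaze_labels: one label per gaze sample, 'novel', 'familiar', or
    'off' (off-screen or untracked). Returns None if the trial had no
    on-image samples.
    """
    novel = sum(1 for g in gaze_labels if g == "novel")
    familiar = sum(1 for g in gaze_labels if g == "familiar")
    on_image = novel + familiar
    return novel / on_image if on_image else None
```

Per-trial scores of this kind are what the webcam-based and eye-tracker-based pipelines were correlated on in the results above.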
Affiliation(s)
- Nicholas T Bott
- Neurotrack Technologies, Inc., Redwood City, California; Department of Medicine, School of Medicine, Stanford University, Stanford, California; PGSP-Stanford PsyD Consortium, Department of Clinical Psychology, Palo Alto University, Palo Alto, California
- Alex R Lange
- Neurotrack Technologies, Inc., Redwood City, California
- Doug O Newton
- Neurotrack Technologies, Inc., Redwood City, California
- Elizabeth A Buffalo
- Neurotrack Technologies, Inc., Redwood City, California; Department of Physiology and Biophysics, University of Washington, Seattle, Washington
- Dorene M Rentz
- Neurotrack Technologies, Inc., Redwood City, California; Department of Neurology, Massachusetts General Hospital, Boston, Massachusetts; Department of Neurology, Brigham and Women's Hospital, Boston, Massachusetts
- Stuart M Zola
- Neurotrack Technologies, Inc., Redwood City, California; Department of Psychiatry and Behavioral Sciences, Emory University, Atlanta, Georgia
34
Schmand B. Why are neuropsychologists so reluctant to embrace modern assessment techniques? Clin Neuropsychol 2019; 33:209-219. [DOI: 10.1080/13854046.2018.1523468]
Affiliation(s)
- Ben Schmand
- Department of Psychology, University of Amsterdam, Amsterdam, The Netherlands