1. Zhang G, Garrett DR, Simmons AM, Kiat JE, Luck SJ. Evaluating the effectiveness of artifact correction and rejection in event-related potential research. Psychophysiology 2024; 61:e14511. PMID: 38165059; PMCID: PMC11021170; DOI: 10.1111/psyp.14511.
Abstract
Eyeblinks and other large artifacts can create two major problems in event-related potential (ERP) research, namely confounds and increased noise. Here, we developed a method for assessing the effectiveness of artifact correction and rejection methods in minimizing these two problems. We then used this method to assess a common artifact minimization approach, in which independent component analysis (ICA) is used to correct ocular artifacts, and artifact rejection is used to reject trials with extreme values resulting from other sources (e.g., movement artifacts). This approach was applied to data from five common ERP components (P3b, N400, N170, mismatch negativity, and error-related negativity). Four common scoring methods (mean amplitude, peak amplitude, peak latency, and 50% area latency) were examined for each component. We found that eyeblinks differed systematically across experimental conditions for several of the components. We also found that artifact correction was reasonably effective at minimizing these confounds, although it did not usually eliminate them completely. In addition, we found that the rejection of trials with extreme voltage values was effective at reducing noise, with the benefits of eliminating these trials outweighing the reduced number of trials available for averaging. For researchers who are analyzing similar ERP components and participant populations, this combination of artifact correction and rejection approaches should minimize artifact-related confounds and lead to improved data quality. Researchers who are analyzing other components or participant populations can use the method developed in this study to determine which artifact minimization approaches are effective in their data.
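The "rejection of trials with extreme voltage values" step described in the abstract can be sketched in a few lines. This is a generic illustration, not the authors' exact pipeline: the 100 µV peak-to-peak threshold, the array layout, and the function name are assumptions for the example.

```python
import numpy as np

def peak_to_peak_rejection(epochs, threshold_uv=100.0):
    """Return a boolean keep-mask over trials: a trial is rejected if the
    peak-to-peak amplitude on any channel exceeds the threshold.

    epochs: array of shape (n_trials, n_channels, n_samples), in microvolts.
    """
    ptp = epochs.max(axis=-1) - epochs.min(axis=-1)  # (n_trials, n_channels)
    return (ptp <= threshold_uv).all(axis=-1)        # keep only fully clean trials

# Example: 3 trials, 2 channels, 100 samples; trial 1 contains a large spike.
rng = np.random.default_rng(0)
epochs = rng.normal(0.0, 5.0, size=(3, 2, 100))
epochs[1, 0, 50] = 250.0  # simulated movement artifact
keep = peak_to_peak_rejection(epochs, threshold_uv=100.0)
print(keep)  # trial 1 is rejected, trials 0 and 2 are kept
```

Averaging only the `keep`-flagged trials then trades a smaller trial count for lower noise, which is the tradeoff the study quantifies.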
Affiliation(s)
- Guanghui Zhang: Center for Mind & Brain, University of California-Davis, Davis, California, USA
- David R Garrett: Center for Mind & Brain, University of California-Davis, Davis, California, USA
- Aaron M Simmons: Center for Mind & Brain, University of California-Davis, Davis, California, USA
- John E Kiat: Center for Mind & Brain, University of California-Davis, Davis, California, USA
- Steven J Luck: Center for Mind & Brain, University of California-Davis, Davis, California, USA
2. Zhang G, Garrett DR, Simmons AM, Kiat JE, Luck SJ. Evaluating the effectiveness of artifact correction and rejection in event-related potential research. bioRxiv [Preprint] 2023:2023.09.16.558075. PMID: 37745415; PMCID: PMC10516012; DOI: 10.1101/2023.09.16.558075. (Preprint of the Psychophysiology article in entry 1.)
3. Nebe S, Reutter M, Baker DH, Bölte J, Domes G, Gamer M, Gärtner A, Gießing C, Gurr C, Hilger K, Jawinski P, Kulke L, Lischke A, Markett S, Meier M, Merz CJ, Popov T, Puhlmann LMC, Quintana DS, Schäfer T, Schubert AL, Sperl MFJ, Vehlen A, Lonsdorf TB, Feld GB. Enhancing precision in human neuroscience. eLife 2023; 12:e85980. PMID: 37555830; PMCID: PMC10411974; DOI: 10.7554/eLife.85980.
Abstract
Human neuroscience has always been pushing the boundary of what is measurable. During the last decade, concerns about statistical power and replicability - in science in general, but also specifically in human neuroscience - have fueled an extensive debate. One important insight from this discourse is the need for larger samples, which naturally increases statistical power. An alternative is to increase the precision of measurements, which is the focus of this review. This option is often overlooked, even though statistical power benefits from increasing precision as much as from increasing sample size. Nonetheless, precision has always been at the heart of good scientific practice in human neuroscience, with researchers relying on lab traditions or rules of thumb to ensure sufficient precision for their studies. In this review, we encourage a more systematic approach to precision. We start by introducing measurement precision and its importance for well-powered studies in human neuroscience. Then, determinants for precision in a range of neuroscientific methods (MRI, M/EEG, EDA, Eye-Tracking, and Endocrinology) are elaborated. We end by discussing how a more systematic evaluation of precision and the application of respective insights can lead to an increase in reproducibility in human neuroscience.
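The abstract's point that power benefits from precision as much as from sample size can be illustrated with a simple variance model. The model below is an assumption chosen for the example (each subject's score is the mean of several noisy trials, with independent between-subject and trial-level noise), not the review's own formalization:

```python
import math

def se_group_mean(sd_between, sd_trial, n_subjects, n_trials):
    """Standard error of a group-mean score when each subject's score is
    the average of n_trials noisy measurements (assumed variance model:
    between-subject variance plus trial-noise variance / n_trials)."""
    var_subject = sd_between**2 + sd_trial**2 / n_trials
    return math.sqrt(var_subject / n_subjects)

# Hypothetical numbers: doubling subjects vs. quadrupling trials per subject.
base    = se_group_mean(sd_between=2.0, sd_trial=10.0, n_subjects=20, n_trials=10)
more_n  = se_group_mean(sd_between=2.0, sd_trial=10.0, n_subjects=40, n_trials=10)
precise = se_group_mean(sd_between=2.0, sd_trial=10.0, n_subjects=20, n_trials=40)
print(base, more_n, precise)  # both strategies shrink the SE relative to baseline
```

When trial noise dominates (as in these hypothetical numbers), improving measurement precision can shrink the standard error about as much as recruiting more subjects, which is the review's central argument.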
Affiliation(s)
- Stephan Nebe: Zurich Center for Neuroeconomics, Department of Economics, University of Zurich, Zurich, Switzerland
- Mario Reutter: Department of Psychology, Julius-Maximilians-University, Würzburg, Germany
- Daniel H Baker: Department of Psychology and York Biomedical Research Institute, University of York, York, United Kingdom
- Jens Bölte: Institute for Psychology, University of Münster, Otto-Creuzfeldt Center for Cognitive and Behavioral Neuroscience, Münster, Germany
- Gregor Domes: Department of Biological and Clinical Psychology, University of Trier, Trier, Germany; Institute for Cognitive and Affective Neuroscience, Trier, Germany
- Matthias Gamer: Department of Psychology, Julius-Maximilians-University, Würzburg, Germany
- Anne Gärtner: Faculty of Psychology, Technische Universität Dresden, Dresden, Germany
- Carsten Gießing: Biological Psychology, Department of Psychology, School of Medicine and Health Sciences, Carl von Ossietzky University of Oldenburg, Oldenburg, Germany
- Caroline Gurr: Department of Child and Adolescent Psychiatry, Psychosomatics and Psychotherapy, University Hospital, Goethe University, Frankfurt, Germany; Brain Imaging Center, Goethe University, Frankfurt, Germany
- Kirsten Hilger: Department of Psychology, Julius-Maximilians-University, Würzburg, Germany; Department of Psychology, Psychological Diagnostics and Intervention, Catholic University of Eichstätt-Ingolstadt, Eichstätt, Germany
- Philippe Jawinski: Department of Psychology, Humboldt-Universität zu Berlin, Berlin, Germany
- Louisa Kulke: Department of Developmental with Educational Psychology, University of Bremen, Bremen, Germany
- Alexander Lischke: Department of Psychology, Medical School Hamburg, Hamburg, Germany; Institute of Clinical Psychology and Psychotherapy, Medical School Hamburg, Hamburg, Germany
- Sebastian Markett: Department of Psychology, Humboldt-Universität zu Berlin, Berlin, Germany
- Maria Meier: Department of Psychology, University of Konstanz, Konstanz, Germany; University Psychiatric Hospitals, Child and Adolescent Psychiatric Research Department (UPKKJ), University of Basel, Basel, Switzerland
- Christian J Merz: Department of Cognitive Psychology, Institute of Cognitive Neuroscience, Faculty of Psychology, Ruhr University Bochum, Bochum, Germany
- Tzvetan Popov: Department of Psychology, Methods of Plasticity Research, University of Zurich, Zurich, Switzerland
- Lara MC Puhlmann: Leibniz Institute for Resilience Research, Mainz, Germany; Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
- Daniel S Quintana: Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany; NevSom, Department of Rare Disorders & Disabilities, Oslo University Hospital, Oslo, Norway; KG Jebsen Centre for Neurodevelopmental Disorders, University of Oslo, Oslo, Norway; Norwegian Centre for Mental Disorders Research (NORMENT), University of Oslo, Oslo, Norway
- Tim Schäfer: Department of Child and Adolescent Psychiatry, Psychosomatics and Psychotherapy, University Hospital, Goethe University, Frankfurt, Germany; Brain Imaging Center, Goethe University, Frankfurt, Germany
- Matthias FJ Sperl: Department of Clinical Psychology and Psychotherapy, University of Giessen, Giessen, Germany; Center for Mind, Brain and Behavior, Universities of Marburg and Giessen, Giessen, Germany
- Antonia Vehlen: Department of Biological and Clinical Psychology, University of Trier, Trier, Germany
- Tina B Lonsdorf: Department of Systems Neuroscience, University Medical Center Hamburg-Eppendorf, Hamburg, Germany; Department of Psychology, Biological Psychology and Cognitive Neuroscience, University of Bielefeld, Bielefeld, Germany
- Gordon B Feld: Department of Clinical Psychology, Central Institute of Mental Health, Medical Faculty Mannheim, Heidelberg University, Mannheim, Germany; Department of Psychology, Heidelberg University, Heidelberg, Germany; Department of Addiction Behavior and Addiction Medicine, Central Institute of Mental Health, Medical Faculty Mannheim, Heidelberg University, Mannheim, Germany; Department of Psychiatry and Psychotherapy, Central Institute of Mental Health, Medical Faculty Mannheim, Heidelberg University, Mannheim, Germany
4. Zhang G, Garrett DR, Luck SJ. Optimal Filters for ERP Research II: Recommended Settings for Seven Common ERP Components. bioRxiv [Preprint] 2023:2023.06.13.544794. PMID: 37397984; PMCID: PMC10312706; DOI: 10.1101/2023.06.13.544794.
Abstract
In research with event-related potentials (ERPs), aggressive filters can substantially improve the signal-to-noise ratio and maximize statistical power, but they can also produce significant waveform distortion. Although this tradeoff has been well documented, the field lacks recommendations for filter cutoffs that quantitatively address both of these competing considerations. To fill this gap, we quantified the effects of a broad range of low-pass filter and high-pass filter cutoffs for seven common ERP components (P3b, N400, N170, N2pc, mismatch negativity, error-related negativity, and lateralized readiness potential) recorded from a set of neurotypical young adults. We also examined four common scoring methods (mean amplitude, peak amplitude, peak latency, and 50% area latency). For each combination of component and scoring method, we quantified the effects of filtering on data quality (noise level and signal-to-noise ratio) and waveform distortion. This led to recommendations for optimal low-pass and high-pass filter cutoffs. We repeated the analyses after adding artificial noise to provide recommendations for datasets with moderately greater noise levels. For researchers who are analyzing data with similar ERP components, noise levels, and participant populations, using the recommended filter settings should lead to improved data quality and statistical power without creating problematic waveform distortion.
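As a rough illustration of the low-pass filtering the abstract discusses, here is a crude FFT-based low-pass. This brick-wall filter is an assumption made for brevity: the paper's recommendations concern cutoff frequencies for proper filters with smooth roll-offs (which limit the waveform distortion the abstract warns about), not this implementation.

```python
import numpy as np

def fft_lowpass(signal, fs, cutoff_hz):
    """Crude FFT-based low-pass: zero every frequency bin above the cutoff.
    Illustration only; real ERP pipelines use smooth roll-off filters."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum[freqs > cutoff_hz] = 0.0
    return np.fft.irfft(spectrum, n=len(signal))

fs = 500.0                                # hypothetical sampling rate in Hz
t = np.arange(0, 1, 1 / fs)
slow = np.sin(2 * np.pi * 5 * t)          # 5 Hz "ERP-like" component
noise = 0.5 * np.sin(2 * np.pi * 60 * t)  # 60 Hz line noise
filtered = fft_lowpass(slow + noise, fs, cutoff_hz=30.0)
# The 60 Hz component is removed; the 5 Hz component survives.
```

A 30 Hz cutoff removes the line noise while leaving this slow component intact; the paper's contribution is quantifying, per component and scoring method, where such cutoffs start distorting the ERP waveform itself.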
Affiliation(s)
- Guanghui Zhang: Center for Mind and Brain, University of California-Davis, Davis, California 95618, USA
- David R. Garrett: Center for Mind and Brain, University of California-Davis, Davis, California 95618, USA
- Steven J. Luck: Center for Mind and Brain, University of California-Davis, Davis, California 95618, USA