1
Marsicano G, Bertini C, Ronconi L. Decoding cognition in neurodevelopmental, psychiatric and neurological conditions with multivariate pattern analysis of EEG data. Neurosci Biobehav Rev 2024; 164:105795. PMID: 38977116. DOI: 10.1016/j.neubiorev.2024.105795.
Abstract
Multivariate pattern analysis (MVPA) of electroencephalographic (EEG) data represents a revolutionary approach to investigating how the brain encodes information. By considering complex interactions among spatio-temporal features at the individual level, MVPA overcomes the limitations of univariate techniques, which often fail to account for substantial inter- and intra-individual neural variability. This is particularly relevant when studying clinical populations, and MVPA of EEG data has therefore recently been adopted as a tool to study cognition in brain disorders. Here, we review the insights offered by this methodology into anomalous patterns of neural activity in conditions such as autism, ADHD, schizophrenia, dyslexia, and neurological and neurodegenerative disorders, across different cognitive domains (perception, attention, memory, consciousness). Despite potential drawbacks that should be carefully addressed, these studies reveal a distinctive sensitivity of MVPA in unveiling dysfunctional and compensatory neurocognitive dynamics of information processing, which often remain undetected by traditional univariate approaches. This higher sensitivity in characterizing individual neurocognitive profiles can provide unique opportunities to optimise assessment and promote personalised interventions.
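The decoding logic behind MVPA can be illustrated with a toy example (not taken from the review): a classifier is trained on multi-channel activity patterns and tested on held-out trials, so that above-chance accuracy indicates the patterns carry condition information. The nearest-centroid classifier and the synthetic "EEG" data below are illustrative stand-ins for the classifiers and recordings used in the reviewed studies.

```python
import numpy as np

rng = np.random.default_rng(0)

def nearest_centroid_decode(X_train, y_train, X_test):
    """Assign each test pattern to the class with the closest mean training
    pattern -- a minimal stand-in for the classifiers used in MVPA."""
    centroids = {c: X_train[y_train == c].mean(axis=0) for c in np.unique(y_train)}
    return np.array([min(centroids, key=lambda c: np.linalg.norm(x - centroids[c]))
                     for x in X_test])

# synthetic patterns: 64 "channels", two conditions with a small mean shift
n, ch = 100, 64
X0 = rng.normal(0.0, 1.0, (n, ch))          # condition 0 trials
X1 = rng.normal(0.6, 1.0, (n, ch))          # condition 1 trials
X = np.vstack([X0, X1])
y = np.repeat([0, 1], n)

# hold out the last 20 trials of each condition for testing
train = np.r_[0:80, 100:180]
test = np.r_[80:100, 180:200]
pred = nearest_centroid_decode(X[train], y[train], X[test])
accuracy = (pred == y[test]).mean()          # well above the 0.5 chance level
```

In real EEG work the classifier is usually cross-validated per participant and per time point, but the principle is the same: decodability of the multivariate pattern is the measure of interest.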
Affiliation(s)
- Gianluca Marsicano: Department of Psychology, University of Bologna, Viale Berti Pichat 5, Bologna 40121, Italy; Centre for Studies and Research in Cognitive Neuroscience, University of Bologna, Via Rasi e Spinelli 176, Cesena 47023, Italy.
- Caterina Bertini: Department of Psychology, University of Bologna, Viale Berti Pichat 5, Bologna 40121, Italy; Centre for Studies and Research in Cognitive Neuroscience, University of Bologna, Via Rasi e Spinelli 176, Cesena 47023, Italy.
- Luca Ronconi: School of Psychology, Vita-Salute San Raffaele University, Milan, Italy; Division of Neuroscience, IRCCS San Raffaele Scientific Institute, Milan, Italy.
2
Torres NL, Castro SL, Silva S. Visual movement impairs duration discrimination at short intervals. Q J Exp Psychol (Hove) 2024; 77:57-69. PMID: 36717537. PMCID: PMC10712207. DOI: 10.1177/17470218231156542.
Abstract
The classic advantage of audition over vision in time processing has recently been challenged by studies using continuously moving visual stimuli such as bouncing balls. Bouncing balls drive beat-based synchronisation better than static visual stimuli (flashes) and as efficiently as auditory ones (beeps). How bouncing balls modulate performance in duration perception is, however, unknown. Our previous study addressing this was inconclusive: there were no differences among bouncing balls, flashes, and beeps, but this may have been because the intervals were too long to allow sensitivity to modality (visual vs. auditory). In this study, we first conducted an experiment to determine whether shorter intervals elicit cross-stimulus differences. We found that short (mean 157 ms) but not medium (326 ms) intervals made duration perception worse for bouncing balls compared with flashes and beeps. In a second experiment, we investigated whether the lower efficiency of bouncing balls was due to experimental confounds, lack of realism, or movement. We ruled out the experimental confounds and found support for the hypothesis that visual movement, be it continuous or discontinuous, impairs duration perception at short interval lengths. Therefore, unlike beat-based synchronisation, duration perception does not benefit from continuous visual movement, which may even have a detrimental effect at short intervals.
Affiliation(s)
- Nathércia L Torres: Center for Psychology at the University of Porto (CPUP), Faculty of Psychology and Educational Sciences, University of Porto, Porto, Portugal
- São Luís Castro: Center for Psychology at the University of Porto (CPUP), Faculty of Psychology and Educational Sciences, University of Porto, Porto, Portugal
- Susana Silva: Center for Psychology at the University of Porto (CPUP), Faculty of Psychology and Educational Sciences, University of Porto, Porto, Portugal
3
He X, Ke Z, Wu Z, Chen L, Yue Z. The speed and temporal frequency of visual apparent motion modulate auditory duration perception. Sci Rep 2023; 13:11281. PMID: 37438383. PMCID: PMC10338538. DOI: 10.1038/s41598-023-38183-w. Open access.
Abstract
In the present study, we investigated how the perception of auditory duration can be modulated by a task-irrelevant, concurrent visual apparent motion induced by visual bars alternating between the left and right sides. We also examined the influence of the speed and temporal frequency of the visual apparent motion on the perception of auditory duration. In each trial, standard visual stimuli (two vertical bars) were presented sequentially, except that visual apparent motion was included in the fourth stimulus. A tone was presented simultaneously with each visual stimulus, with the fourth tone varying in duration. Participants judged whether the fourth tone lasted longer than the other tones. In Experiment 1, the speed of visual apparent motion (Fast vs. Slow) was manipulated by changing the interval between the two bars. The mean point of subjective equality (PSE) in the Slow apparent motion condition was larger than that in the Static condition. Moreover, participants tended to overestimate the duration only in the Static condition (a time dilation effect), which disappeared under the apparent motion conditions. In Experiment 2, in addition to speed, we controlled the temporal frequency of apparent motion by manipulating the number of bars, generating four conditions (Physical-fast, Perceived-fast, Perceived-slow, and Static). The mean PSE was significantly smaller in the Physical-fast condition than in the Static and Perceived-slow conditions. Moreover, we found a time compression effect in the Perceived-slow and Static conditions but not in the Perceived-fast and Physical-fast conditions. These results suggest that auditory duration can be modulated by concurrent, contextual visual apparent motion, and that both the speed and temporal frequency of the task-irrelevant visual apparent motion contribute to biases in perceiving auditory duration.
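As a point of reference for how a PSE like the ones compared above is obtained, here is a minimal sketch (with made-up numbers, not the authors' data or analysis) that estimates the PSE as the comparison duration at which "longer" responses cross 50%:

```python
def estimate_pse(durations, p_longer):
    """Linearly interpolate the duration at which the proportion of 'longer'
    responses crosses 50% -- a simple non-parametric PSE estimate."""
    points = list(zip(durations, p_longer))
    for (d0, p0), (d1, p1) in zip(points, points[1:]):
        if p0 <= 0.5 <= p1:
            # linear interpolation between the two bracketing points
            return d0 + (0.5 - p0) * (d1 - d0) / (p1 - p0)
    raise ValueError("psychometric function does not cross 50%")

# hypothetical comparison-tone durations (ms) and 'longer' judgment rates
durations = [400, 450, 500, 550, 600]
p_longer = [0.05, 0.20, 0.45, 0.80, 0.95]
pse = estimate_pse(durations, p_longer)   # between 500 and 550 ms here
```

In practice a psychometric function (e.g., cumulative Gaussian or logistic) is usually fitted instead of interpolating, but the PSE has the same meaning: the physical duration judged subjectively equal to the standard.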
Affiliation(s)
- Xiang He: Department of Psychology, Sun Yat-Sen University, Guangzhou 510006, China
- Zijun Ke: Department of Psychology, Sun Yat-Sen University, Guangzhou 510006, China
- Zehua Wu: Department of Psychology, Sun Yat-Sen University, Guangzhou 510006, China
- Lihan Chen: School of Psychological and Cognitive Sciences and Beijing Key Laboratory of Behavior and Mental Health, Peking University, Beijing 100871, China
- Zhenzhu Yue: Department of Psychology, Sun Yat-Sen University, Guangzhou 510006, China
4
Feng Z, Zhu S, Duan J, Lu Y, Li L. Cross-modality effect in implicit learning of temporal sequence. Curr Psychol 2023. DOI: 10.1007/s12144-022-04228-y.
5
Zhang H, Yang S, Qiao Y, Ge Q, Tang Y, Northoff G, Zang Y. Default mode network mediates low-frequency fluctuations in brain activity and behavior during sustained attention. Hum Brain Mapp 2022; 43:5478-5489. PMID: 35903957. PMCID: PMC9704793. DOI: 10.1002/hbm.26024. Open access.
Abstract
The low-frequency (<0.1 Hz) fluctuation in sustained attention attracts considerable interest in cognitive neuroscience and clinical research because it leads to cognitive and behavioral lapses. What is the source of this spontaneous fluctuation in neural activity, and how does the neural fluctuation relate to behavioral fluctuation? Here, we address these questions by collecting and analyzing two independent fMRI and behavior datasets. We show that the neural (fMRI) fluctuation in a key brain network, the default-mode network (DMN), mediates behavioral (reaction time) fluctuation during sustained attention. The DMN shows an increased amplitude of fluctuation, which correlates with the behavioral fluctuation in a similar frequency range (0.01-0.1 Hz) but not in the lower (<0.01 Hz) or higher (>0.1 Hz) ranges. This was observed during both auditory and visual sustained attention and was replicable across the independent datasets. These results provide novel insight into the neural source of attention fluctuation and extend the earlier view that the DMN is simply deactivated during cognitive tasks. More generally, our findings highlight the temporal dynamics of the brain-behavior relationship.
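The band-limited amplitude measure central to analyses like the one above can be sketched in a few lines (a hypothetical illustration, not the authors' pipeline): isolate the 0.01-0.1 Hz component of a signal and take its standard deviation as the fluctuation amplitude. The sampling rate and synthetic signal below are made up.

```python
import numpy as np

def band_amplitude(signal, fs, lo=0.01, hi=0.1):
    """Amplitude (std) of the component of `signal` within [lo, hi] Hz,
    isolated with a simple FFT mask."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum = np.fft.rfft(signal)
    spectrum[(freqs < lo) | (freqs > hi)] = 0.0   # zero out-of-band bins
    return np.fft.irfft(spectrum, n=len(signal)).std()

fs = 0.5                         # one sample every 2 s (a typical fMRI TR)
t = np.arange(0, 600, 1 / fs)    # ten minutes of data
slow = np.sin(2 * np.pi * 0.05 * t)       # in-band (0.05 Hz) fluctuation
fast = 0.5 * np.sin(2 * np.pi * 0.2 * t)  # out-of-band component
amp = band_amplitude(slow + fast, fs)     # recovers the slow component only
```

The same amplitude can be computed per voxel (as in amplitude-of-low-frequency-fluctuation analyses) or on a reaction-time series, and the two then correlated across time windows or runs.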
Affiliation(s)
- Hang Zhang: Centre for Cognition and Brain Disorders, The Affiliated Hospital of Hangzhou Normal University, Hangzhou, Zhejiang, China; Institute of Psychological Science, Hangzhou Normal University, Hangzhou, Zhejiang, China; Zhejiang Key Laboratory for Research in Assessment of Cognitive Impairment, Hangzhou, Zhejiang, China
- Shi-You Yang: Centre for Cognition and Brain Disorders, The Affiliated Hospital of Hangzhou Normal University, Hangzhou, Zhejiang, China; Institute of Psychological Science, Hangzhou Normal University, Hangzhou, Zhejiang, China; Zhejiang Key Laboratory for Research in Assessment of Cognitive Impairment, Hangzhou, Zhejiang, China
- Yang Qiao: Centre for Cognition and Brain Disorders, The Affiliated Hospital of Hangzhou Normal University, Hangzhou, Zhejiang, China; Institute of Psychological Science, Hangzhou Normal University, Hangzhou, Zhejiang, China; Zhejiang Key Laboratory for Research in Assessment of Cognitive Impairment, Hangzhou, Zhejiang, China
- Qiu Ge: Centre for Cognition and Brain Disorders, The Affiliated Hospital of Hangzhou Normal University, Hangzhou, Zhejiang, China; Institute of Psychological Science, Hangzhou Normal University, Hangzhou, Zhejiang, China; Zhejiang Key Laboratory for Research in Assessment of Cognitive Impairment, Hangzhou, Zhejiang, China
- Yi-Yuan Tang: College of Health Solutions, Arizona State University, Tempe, Arizona, USA
- Georg Northoff: Centre for Cognition and Brain Disorders, The Affiliated Hospital of Hangzhou Normal University, Hangzhou, Zhejiang, China; Institute of Mental Health Research, University of Ottawa, Ottawa, Canada
- Yu-Feng Zang: Centre for Cognition and Brain Disorders, The Affiliated Hospital of Hangzhou Normal University, Hangzhou, Zhejiang, China; Institute of Psychological Science, Hangzhou Normal University, Hangzhou, Zhejiang, China; Zhejiang Key Laboratory for Research in Assessment of Cognitive Impairment, Hangzhou, Zhejiang, China
6
Searching for individual multi-sensory fingerprints and their links with adiposity – New insights from meta-analyses and empirical data. Food Qual Prefer 2022. DOI: 10.1016/j.foodqual.2022.104574.
7
A supramodal and conceptual representation of subsecond time revealed with perceptual learning of temporal interval discrimination. Sci Rep 2022; 12:10668. PMID: 35739220. PMCID: PMC9226181. DOI: 10.1038/s41598-022-14698-6. Open access.
Abstract
Subsecond time perception has frequently been attributed to modality-specific timing mechanisms, which would predict no cross-modal transfer of temporal perceptual learning. Indeed, perceptual learning of temporal interval discrimination (TID) reportedly shows either no cross-modal transfer, or asymmetric transfer from audition to vision but not vice versa. Here, however, we demonstrate complete cross-modal transfer of auditory and visual TID learning using a double-training paradigm. Specifically, visual TID learning transfers to and optimizes auditory TID when participants also receive exposure to the auditory temporal interval by practicing a functionally orthogonal near-threshold tone-frequency discrimination task at the same trained interval. Auditory TID learning likewise transfers to and optimizes visual TID with additional practice of an orthogonal near-threshold visual contrast discrimination task at the same trained interval. Practicing these functionally orthogonal tasks per se has no impact on TID thresholds. We interpret the transfer results as indications of a supramodal representation of subsecond time. Moreover, because TID learning shows complete transfer between modalities with vastly different temporal precisions, this subsecond time representation must be conceptual. Double training may refine this supramodal and conceptual representation of subsecond time and connect it to a new sense to improve time perception.
8
van Ackooij M, Paul JM, van der Zwaag W, van der Stoep N, Harvey BM. Auditory timing-tuned neural responses in the human auditory cortices. Neuroimage 2022; 258:119366. PMID: 35690255. DOI: 10.1016/j.neuroimage.2022.119366. Open access.
Abstract
Perception of sub-second auditory event timing supports multisensory integration, and speech and music perception and production. Neural populations tuned for the timing (duration and rate) of visual events were recently described in several human extrastriate visual areas. Here we ask whether the brain also contains neural populations tuned for auditory event timing, and whether these are shared with visual timing. Using 7T fMRI, we measured responses to white noise bursts of changing duration and rate. We analyzed these responses using neural response models describing different parametric relationships between event timing and neural response amplitude. This revealed auditory timing-tuned responses in the primary auditory cortex, and auditory association areas of the belt, parabelt and premotor cortex. While these areas also showed tonotopic tuning for auditory pitch, pitch and timing preferences were not consistently correlated. Auditory timing-tuned response functions differed between these areas, though without clear hierarchical integration of responses. The similarity of auditory and visual timing tuned responses, together with the lack of overlap between the areas showing these responses for each modality, suggests modality-specific responses to event timing are computed similarly but from different sensory inputs, and then transformed differently to suit the needs of each modality.
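The timing-tuned response functions described above can be pictured with a toy model (a hypothetical sketch, not the authors' fitted model): response amplitude falls off as a Gaussian function of event duration around a preferred timing, so the population responds maximally to events near its preferred duration. All parameter values below are made up.

```python
import math

def tuned_response(duration, preferred, width, amplitude=1.0):
    """Predicted response of a neural population tuned to `preferred`
    event duration (all values in seconds)."""
    return amplitude * math.exp(-0.5 * ((duration - preferred) / width) ** 2)

# a population preferring 400 ms events responds most to 400 ms stimuli
r_pref = tuned_response(0.4, preferred=0.4, width=0.15)
r_other = tuned_response(0.8, preferred=0.4, width=0.15)   # much weaker
```

Fitting such a model per recording site to responses across many presented durations, and keeping sites where it explains the response well, is the general logic behind mapping timing preferences across the cortical surface.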
Affiliation(s)
- Martijn van Ackooij: Experimental Psychology, Helmholtz Institute, Utrecht University, Heidelberglaan 1, Utrecht 3584 CS, the Netherlands
- Jacob M Paul: Experimental Psychology, Helmholtz Institute, Utrecht University, Heidelberglaan 1, Utrecht 3584 CS, the Netherlands; Melbourne School of Psychological Sciences, University of Melbourne, Redmond Barry Building, Parkville 3010, Victoria, Australia
- Nathan van der Stoep: Experimental Psychology, Helmholtz Institute, Utrecht University, Heidelberglaan 1, Utrecht 3584 CS, the Netherlands
- Ben M Harvey: Experimental Psychology, Helmholtz Institute, Utrecht University, Heidelberglaan 1, Utrecht 3584 CS, the Netherlands
9
Espinoza-Monroy M, de Lafuente V. Discrimination of Regular and Irregular Rhythms Explained by a Time Difference Accumulation Model. Neuroscience 2021; 459:16-26. PMID: 33549694. DOI: 10.1016/j.neuroscience.2021.01.035.
Abstract
Perceiving the temporal regularity in a sequence of repetitive sensory events facilitates the preparation and execution of behaviors with tight temporal constraints. How we estimate temporal regularity from repeating patterns of sensory stimuli is not completely understood. We developed a decision-making task in which participants had to decide whether a train of visual, auditory, or tactile pulses had a regular or an irregular temporal pattern. We tested the hypothesis that subjects categorize stimuli as irregular by accumulating the time differences between the predicted and observed onsets of the sensory pulses defining a temporal rhythm. The results suggest that, instead of waiting for a single large temporal deviation, participants accumulate timing-error signals and judge a pattern as irregular when the amount of evidence reaches a decision threshold. Model fits of bounded integration showed that this accumulation occurs with negligible leak of evidence. Consistent with previous findings, participants performed better when evaluating the regularity of auditory pulses than of visual or tactile stimuli. Our results suggest that temporal regularity is estimated by comparing expected and measured pulse onset times, and that each prediction error is accumulated towards a threshold to generate a behavioral choice.
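The accumulation-to-bound idea described above can be sketched in a few lines (a hedged illustration, not the authors' code): absolute errors between predicted and observed pulse onsets are summed, with an optional leak, and the pattern is judged irregular once the evidence crosses a bound. The onset times, bound, and leak values below are made up.

```python
def judge_irregular(onsets, period, bound, leak=0.0):
    """Return True if accumulated timing errors reach `bound` (seconds)."""
    evidence = 0.0
    for i, onset in enumerate(onsets):
        predicted = i * period                 # onset expected from a regular rhythm
        evidence = (1.0 - leak) * evidence + abs(onset - predicted)
        if evidence >= bound:
            return True                        # decision threshold crossed
    return False

regular = [i * 0.5 for i in range(8)]                      # perfect 2 Hz train
jittered = [0.0, 0.55, 0.95, 1.60, 1.90, 2.60, 2.95, 3.60] # small timing jitter
is_reg_irregular = judge_irregular(regular, 0.5, bound=0.2)   # never crosses
is_jit_irregular = judge_irregular(jittered, 0.5, bound=0.2)  # crosses bound
```

Note how no single deviation in the jittered train is large; only their accumulation crosses the bound, which is the key contrast with a single-deviation detection rule.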
Affiliation(s)
- Marisol Espinoza-Monroy: Instituto de Neurobiología, Universidad Nacional Autónoma de México, Querétaro, QRO 76230, Mexico
- Victor de Lafuente: Instituto de Neurobiología, Universidad Nacional Autónoma de México, Querétaro, QRO 76230, Mexico
10
Spatiotemporal perturbations in paced finger tapping suggest a common mechanism for the processing of time errors. Sci Rep 2019; 9:17814. PMID: 31780695. PMCID: PMC6882783. DOI: 10.1038/s41598-019-54133-x. Open access.
Abstract
Paced finger tapping is a sensorimotor synchronization task in which a subject keeps pace with a metronome while the time differences (asynchronies) between each stimulus and the corresponding response are recorded. A usual way to study the underlying error-correction mechanism is to apply unexpected temporal perturbations to the stimulus sequence. An overlooked issue is that at the moment of a temporal perturbation two things change: the stimulus period (a parameter) and the asynchrony (a variable). In terms of experimental manipulation, it would be desirable to have separate, independent control of parameter and variable values. In this work we performed paced finger tapping experiments combining simple temporal perturbations (tempo step changes) and spatial perturbations with a temporal effect (a raised or lowered point of contact). In this way we decoupled the parameter-variable confound, performing novel perturbations in which either the parameter or the variable changes. Our results show nonlinear features, such as asymmetry, and are compatible with a common error-correction mechanism for all types of asynchronies. We suggest taking this confound into account when analyzing perturbations of any kind, not only in finger tapping tasks but also in other areas of sensorimotor synchronization, such as music performance experiments and paced walking in gait coordination studies.
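The distinction between the period as a parameter and the asynchrony as a variable can be made concrete with the standard linear phase-correction model common in the tapping literature (a generic sketch, not the authors' model): each produced interval copies the previous stimulus period minus a fraction alpha of the last asynchrony, so a tempo step change injects an asynchrony that then decays geometrically. The periods and alpha below are made up.

```python
def simulate_asynchronies(stim_periods, alpha, e0=0.0):
    """Asynchronies e[n] (ms) under linear phase correction."""
    e = [e0]
    for prev_t, next_t in zip(stim_periods, stim_periods[1:]):
        produced = prev_t - alpha * e[-1]      # inter-tap interval produced
        e.append(e[-1] + produced - next_t)    # new asynchrony
    return e

# metronome at 500 ms with a tempo step change to 550 ms at the 6th stimulus
periods = [500] * 5 + [550] * 10
asyn = simulate_asynchronies(periods, alpha=0.5)
# the step injects a -50 ms asynchrony, which then shrinks by half each tap
```

In this framing, a tempo step perturbs both the parameter (the period) and the variable (the asynchrony) at once, which is exactly the confound the study above sets out to decouple with spatial perturbations.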