1
Han Y, Du L, Huang Q, Cui D, Li Y. Enhancing specialization of attention-related EEG power and phase synchronism brain patterns by meditation. Cereb Cortex 2024; 34:bhae288. PMID: 39024158. DOI: 10.1093/cercor/bhae288.
Abstract
Meditation, a form of mental training that aims to improve the ability to regulate one's cognition, has been widely applied in clinical medicine. However, the mechanism by which meditation affects brain activity is still unclear. To explore this question, electroencephalogram data were recorded in 20 long-term meditators and 20 nonmeditators during 2 high-level cognitive tasks (meditation and mental calculation) and a relaxed resting state (control). Then, the power spectral density and phase synchronization of the electroencephalogram were extracted and compared between these 2 groups. In addition, machine learning was used to discriminate the states within each group. We found that the meditation group showed significantly higher classification accuracy and calculation efficiency than the control group. During the calculation task, both the power and the global phase synchronism of the gamma response decreased in meditators relative to their relaxation state; no such change was observed in the control group. A potential explanation for our observations is that meditation improved the flexibility of the brain through neuroplastic mechanisms. In conclusion, we provide robust evidence that long-term meditation experience can produce detectable neurophysiological changes in brain activity, possibly enhancing functional segregation and/or specialization in the brain.
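The band-limited power feature at the heart of this comparison can be illustrated with a minimal sketch. The sampling rate, band edges, and synthetic signal below are illustrative assumptions, not the authors' actual pipeline.

```python
# Hedged sketch: band-power extraction of the kind the abstract describes
# (power spectral density summarized per frequency band).
import numpy as np
from scipy.signal import welch

def band_power(eeg, fs, band):
    """Mean PSD of a 1-D EEG trace within a frequency band (Welch estimate)."""
    freqs, psd = welch(eeg, fs=fs, nperseg=fs * 2)
    lo, hi = band
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[mask].mean()

rng = np.random.default_rng(0)
fs = 250  # Hz, an assumed sampling rate
t = np.arange(fs * 10) / fs
# Synthetic trace: a 40 Hz (gamma-band) oscillation plus noise
eeg = np.sin(2 * np.pi * 40 * t) + 0.5 * rng.standard_normal(t.size)

gamma = band_power(eeg, fs, (30, 45))
alpha = band_power(eeg, fs, (8, 12))
print(gamma > alpha)  # gamma dominates this synthetic trace
```

Comparing such per-band values between conditions (and feeding them to a classifier) is the generic shape of the analysis; the study's exact bands, electrodes, and classifier are not reproduced here.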
Affiliation(s)
- Yupeng Han
- School of Automation Science and Engineering, South China University of Technology, Wushan Road 381, Guangzhou 510641, China
- Research Center for Brain-Computer Interfaces, Pazhou Laboratory, Qiaotou Street 248, Guangzhou 510665, China
- Lizhao Du
- Shanghai Med-X Engineering Research Center, School of Biomedical Engineering, Shanghai Jiao Tong University, Huashan Road 1954, Shanghai, 200030, China
- Shanghai Mental Health Center, Shanghai Jiao Tong University School of Medicine, Humin Road 3210, Shanghai 201108, China
- Shanghai Key Laboratory of Psychotic Disorders, Humin Road 3210, Shanghai 201108, China
- Qiyun Huang
- Research Center for Brain-Computer Interfaces, Pazhou Laboratory, Qiaotou Street 248, Guangzhou 510665, China
- Donghong Cui
- Shanghai Mental Health Center, Shanghai Jiao Tong University School of Medicine, Humin Road 3210, Shanghai 201108, China
- Shanghai Key Laboratory of Psychotic Disorders, Humin Road 3210, Shanghai 201108, China
- Brain Science and Technology Research Center, Shanghai Jiao Tong University, Huanshan Road 1954, Shanghai 200030, China
- Yuanqing Li
- School of Automation Science and Engineering, South China University of Technology, Wushan Road 381, Guangzhou 510641, China
- Research Center for Brain-Computer Interfaces, Pazhou Laboratory, Qiaotou Street 248, Guangzhou 510665, China
2
Berthault E, Chen S, Falk S, Morillon B, Schön D. Auditory and motor priming of metric structure improves understanding of degraded speech. Cognition 2024; 248:105793. PMID: 38636164. DOI: 10.1016/j.cognition.2024.105793.
Abstract
Speech comprehension is enhanced when preceded (or accompanied) by a congruent rhythmic prime reflecting the metrical sentence structure. Although these phenomena have been described for auditory and motor primes separately, their respective and synergistic contributions have not been addressed. In this experiment, participants performed a speech comprehension task on degraded speech signals that were preceded by a rhythmic prime that could be auditory, motor, or audiomotor. Both auditory and audiomotor rhythmic primes facilitated speech comprehension speed. While the presence of a purely motor prime (unpaced tapping) did not globally benefit speech comprehension, comprehension accuracy scaled with the regularity of motor tapping. In order to investigate inter-individual variability, participants also performed a Spontaneous Speech Synchronization test. The strength of the estimated perception-production coupling correlated positively with overall speech comprehension scores. These findings are discussed in the framework of the dynamic attending and active sensing theories.
Affiliation(s)
- Emma Berthault
- Aix Marseille Université, INSERM, INS, Institut de Neurosciences des Systèmes, Marseille, France.
- Sophie Chen
- Aix Marseille Université, INSERM, INS, Institut de Neurosciences des Systèmes, Marseille, France.
- Simone Falk
- Department of Linguistics and Translation, University of Montreal, Canada; International Laboratory for Brain, Music and Sound Research, Montreal, Canada.
- Benjamin Morillon
- Aix Marseille Université, INSERM, INS, Institut de Neurosciences des Systèmes, Marseille, France.
- Daniele Schön
- Aix Marseille Université, INSERM, INS, Institut de Neurosciences des Systèmes, Marseille, France.
3
Kaya E, Kotz SA, Henry MJ. A novel method for estimating properties of attentional oscillators reveals an age-related decline in flexibility. eLife 2024; 12:RP90735. PMID: 38904659. PMCID: PMC11192533. DOI: 10.7554/elife.90735.
Abstract
Dynamic attending theory proposes that the ability to track temporal cues in the auditory environment is governed by entrainment, the synchronization between internal oscillations and regularities in external auditory signals. Here, we focused on two key properties of internal oscillators: their preferred rate, the default rate in the absence of any input; and their flexibility, how they adapt to changes in rhythmic context. We developed methods to estimate oscillator properties (Experiment 1) and compared the estimates across tasks and individuals (Experiment 2). Preferred rates, estimated as the stimulus rates with peak performance, showed a harmonic relationship across measurements and were correlated with individuals' spontaneous motor tempo. Estimates from motor tasks were slower than those from the perceptual task, and the degree of slowing was consistent for each individual. Task performance decreased with trial-to-trial changes in stimulus rate, and responses on individual trials were biased toward the preceding trial's stimulus properties. Flexibility, quantified as an individual's ability to adapt to faster-than-previous rates, decreased with age. These findings show domain-specific rate preferences for the assumed oscillatory system underlying rhythm perception and production, and that this system loses its ability to flexibly adapt to changes in the external rhythmic context during aging.
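The core estimator described above, a preferred rate taken as the stimulus rate with peak performance, can be sketched in a few lines. The rates and performance values below are invented for illustration; the authors' actual estimation procedure may differ.

```python
# Hedged sketch: estimate a "preferred rate" as the vertex of a quadratic
# fit to performance measured across stimulus rates.
import numpy as np

rates = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0])       # Hz, assumed test rates
perf = np.array([0.55, 0.70, 0.82, 0.78, 0.65, 0.52])  # accuracy per rate (invented)

# Fit an inverted parabola and take its vertex as the preferred rate
b2, b1, b0 = np.polyfit(rates, perf, 2)
preferred = -b1 / (2 * b2)
print(preferred)
```

Repeating this per task and per participant would yield the kind of per-individual estimates the study compares across perceptual and motor measurements.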
Affiliation(s)
- Ece Kaya
- Max Planck Institute for Empirical Aesthetics, Frankfurt, Germany
- Maastricht University, Maastricht, Netherlands
- Sonja A Kotz
- Maastricht University, Maastricht, Netherlands
- Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
- Molly J Henry
- Max Planck Institute for Empirical Aesthetics, Frankfurt, Germany
- Toronto Metropolitan University, Toronto, Canada
4
Yamazaki R, Ushiyama J. Head movements induced by voluntary neck flexion stabilize sensorimotor synchronization of the finger to syncopated auditory rhythms. Front Psychol 2024; 15:1335050. PMID: 38903467. PMCID: PMC11188995. DOI: 10.3389/fpsyg.2024.1335050.
Abstract
Head movements that are synchronized with musical rhythms often emerge during musical activities, such as hip hop dance. Although such movements are known to affect the meter and pulse perception of complex auditory rhythms, no studies have investigated their contribution to the performance of sensorimotor synchronization (SMS). In the present study, participants listened to syncopated auditory rhythms and flexed their dominant hand index finger in time with the perceived pulses (4/4 meters). In the first experiment (Exp. 1), the participants moved their heads via voluntary neck flexion to the pulses in parallel with finger SMS (Nodding condition, ND). This performance was compared with finger SMS without nodding (Without Nodding condition, WN). In the second experiment (Exp. 2), we investigated the specificity of the effect of head SMS on finger SMS confirmed in Exp. 1 by asking participants to flex their bilateral index fingers to the pulses (Bimanual condition, BM). We compared the performance of dominant hand finger SMS between the BM and ND conditions. In Exp. 1, we found that dominant hand finger SMS was significantly more stable (smaller standard deviation of asynchrony) in the ND versus WN condition (p < 0.001). In Exp. 2, dominant hand finger SMS was significantly more accurate (smaller absolute value of asynchrony) in the ND versus BM condition (p = 0.037). In addition, the stability of dominant hand finger SMS was significantly correlated with the index of phase locking between the pulses and head SMS across participants in the ND condition (r = -0.85, p < 0.001). In contrast, the stability of dominant hand finger SMS was not significantly correlated with the index of phase locking between pulses and non-dominant hand finger SMS in the BM condition (r = -0.25, p = 0.86 after multiple comparison correction). 
These findings suggest that SMS modulation depends on the motor effectors simultaneously involved in synchronization: simultaneous head SMS stabilizes the timing of dominant hand finger SMS, while simultaneous non-dominant hand finger SMS deteriorates the timing accuracy of dominant hand finger SMS. The present study emphasizes the unique and crucial role of head movements in rhythmic behavior.
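The phase-locking index that this correlation rests on can be illustrated with a minimal sketch. The circular-mean formulation below is a standard choice; the pulse period and tap times are illustrative assumptions, and the authors' exact index may differ.

```python
# Hedged sketch: phase locking between an isochronous pulse and event
# (tap or nod) times, as the mean resultant vector length of event phases.
import numpy as np

def phase_locking(pulse_period, event_times):
    """|mean resultant vector| of event phases relative to an isochronous pulse."""
    phases = 2 * np.pi * (np.asarray(event_times) % pulse_period) / pulse_period
    return np.abs(np.exp(1j * phases).mean())

period = 0.5  # s, a 120-bpm pulse (assumed)
tight = np.arange(20) * period + 0.01  # taps hugging the beat
loose = np.arange(20) * period + np.random.default_rng(1).uniform(0, period, 20)

print(phase_locking(period, tight) > phase_locking(period, loose))
```

A value near 1 indicates tightly phase-locked movement; correlating this index with the standard deviation of finger asynchrony across participants mirrors the analysis reported above.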
Affiliation(s)
- Ryoichiro Yamazaki
- Graduate School of Media and Governance, Keio University, Fujisawa, Japan
- Junichi Ushiyama
- Faculty of Environment and Information Studies, Keio University, Fujisawa, Japan
- Department of Rehabilitation Medicine, Keio University School of Medicine, Tokyo, Japan
5
Mai U, Chu G, Raphael BJ. Maximum Likelihood Inference of Time-scaled Cell Lineage Trees with Mixed-type Missing Data. bioRxiv 2024:2024.03.05.583638. PMID: 38496496. PMCID: PMC10942411. DOI: 10.1101/2024.03.05.583638.
Abstract
Recent dynamic lineage tracing technologies combine CRISPR-based genome editing with single-cell sequencing to track cell divisions during development. A key computational problem in dynamic lineage tracing is to infer a cell lineage tree from the measured CRISPR-induced mutations. Three features of dynamic lineage tracing data distinguish this problem from standard phylogenetic tree inference. First, the CRISPR-editing process modifies a genomic location exactly once. This non-modifiable property is not well described by the time-reversible models commonly used in phylogenetics. Second, as a consequence of non-modifiability, the number of mutations per time unit decreases over time. Third, CRISPR-based genome editing and single-cell sequencing result in high rates of both heritable and non-heritable (dropout) missing data. To model these features, we introduce the Probabilistic Mixed-type Missing (PMM) model. We describe an algorithm, LAML (Lineage Analysis via Maximum Likelihood), to search for the maximum likelihood (ML) tree under the PMM model. LAML combines an Expectation Maximization (EM) algorithm with a heuristic tree search to jointly estimate tree topology, branch lengths, and missing data parameters. We derive a closed-form solution for the M-step in the case of no heritable missing data, and a block coordinate ascent approach in the general case which is more efficient than the standard General Time Reversible (GTR) phylogenetic model. On simulated data, LAML infers more accurate tree topologies and branch lengths than existing methods, with greater advantages on datasets with higher ratios of heritable to non-heritable missing data. We show that LAML provides unbiased time-scaled estimates of branch lengths.
In contrast, we demonstrate that maximum parsimony methods for lineage tracing data not only underestimate branch lengths, but also yield branch lengths which are not proportional to time, due to the nonlinear decay in the number of mutations on branches further from the root. On lineage tracing data from a mouse model of lung adenocarcinoma, we show that LAML infers phylogenetic distances that are more concordant with gene expression data compared to distances derived from maximum parsimony. The LAML tree topology is more plausible than existing published trees, with fewer total cell migrations between distant metastases and fewer reseeding events where cells migrate back to the primary tumor. Crucially, we identify three distinct time epochs of metastasis progression, which includes a burst of metastasis events to various anatomical sites during a single month.
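The non-modifiability argument above, that new mutations decay nonlinearly over time, which biases parsimony branch lengths, can be demonstrated with a toy simulation. The site count and per-step edit rate are arbitrary illustrative assumptions, not parameters from the paper.

```python
# Hedged sketch: each site can be edited at most once, so the number of NEW
# mutations per unit time decays as the pool of unmutated sites is used up.
import numpy as np

rng = np.random.default_rng(0)
n_sites, rate, n_steps = 2000, 0.05, 40  # per-site edit probability per step

unmutated = np.ones(n_sites, dtype=bool)
new_per_step = []
for _ in range(n_steps):
    hits = unmutated & (rng.random(n_sites) < rate)
    new_per_step.append(int(hits.sum()))
    unmutated &= ~hits

# Early epochs accrue far more new edits than late ones (exponential decay),
# which is why counting mutations (parsimony) yields branch lengths that are
# not proportional to time far from the root.
print(new_per_step[0], new_per_step[-1])
```

A time-scaled likelihood model such as the PMM accounts for this saturation explicitly, which is the intuition behind LAML's unbiased branch-length estimates.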
Affiliation(s)
- Benjamin J. Raphael
- Department of Computer Science, Princeton University, Princeton, NJ 08544, USA
6
Zalta A, Large EW, Schön D, Morillon B. Neural dynamics of predictive timing and motor engagement in music listening. Sci Adv 2024; 10:eadi2525. PMID: 38446888. PMCID: PMC10917349. DOI: 10.1126/sciadv.adi2525.
Abstract
Why do humans spontaneously dance to music? To test the hypothesis that motor dynamics reflect predictive timing during music listening, we created melodies with varying degrees of rhythmic predictability (syncopation) and asked participants to rate their wanting-to-move (groove) experience. The degree of syncopation and groove ratings are quadratically correlated. Magnetoencephalography data showed that, while auditory regions track the rhythm of melodies, beat-related 2-hertz activity and neural dynamics at delta (1.4 hertz) and beta (20 to 30 hertz) rates in the dorsal auditory pathway code for the experience of groove. Critically, the left sensorimotor cortex coordinates these groove-related delta and beta activities. These findings align with the predictions of a neurodynamic model, suggesting that oscillatory motor engagement during music listening reflects predictive timing and is effected by the interaction of neural dynamics along the dorsal auditory pathway.
Affiliation(s)
- Arnaud Zalta
- Aix Marseille Université, Inserm, INS, Institut de Neurosciences des Systèmes, Marseille, France
- APHM, INSERM, Inst Neurosci Syst, Service de Pharmacologie Clinique et Pharmacovigilance, Aix Marseille Université, Marseille, France
- Edward W. Large
- Department of Psychological Sciences, Ecological Psychology Division, University of Connecticut, Storrs, CT, USA
- Department of Physics, University of Connecticut, Storrs, CT, USA
- Daniele Schön
- Aix Marseille Université, Inserm, INS, Institut de Neurosciences des Systèmes, Marseille, France
- Benjamin Morillon
- Aix Marseille Université, Inserm, INS, Institut de Neurosciences des Systèmes, Marseille, France
7
Naghibi N, Jahangiri N, Khosrowabadi R, Eickhoff CR, Eickhoff SB, Coull JT, Tahmasian M. Embodying Time in the Brain: A Multi-Dimensional Neuroimaging Meta-Analysis of 95 Duration Processing Studies. Neuropsychol Rev 2024; 34:277-298. PMID: 36857010. PMCID: PMC10920454. DOI: 10.1007/s11065-023-09588-1.
Abstract
Time is an omnipresent aspect of almost everything we experience internally or in the external world. The experience of time occurs through such an extensive set of contextual factors that, after decades of research, a unified understanding of its neural substrates is still elusive. In this study, following recent best-practice guidelines, we conducted a coordinate-based meta-analysis of 95 carefully selected neuroimaging papers on duration processing. We categorized the included papers into 14 classes of temporal features according to six categorical dimensions. Then, using the activation likelihood estimation (ALE) technique, we investigated the convergent activation patterns of each class with a cluster-level family-wise error correction at p < 0.05. The regions most consistently activated across the various timing contexts were the pre-SMA and bilateral insula, consistent with an embodied theory of timing in which abstract representations of duration are rooted in sensorimotor and interoceptive experience, respectively. Moreover, class-specific patterns of activation could be roughly divided according to whether participants were timing auditory sequential stimuli, which additionally activated the dorsal striatum and SMA-proper, or visual single-interval stimuli, which additionally activated the right middle frontal and inferior parietal cortices. We conclude that temporal cognition is so entangled with our everyday experience that timing stereotypically common combinations of stimulus characteristics reactivates the sensorimotor systems with which they were first experienced.
Affiliation(s)
- Narges Naghibi
- Institute for Cognitive and Brain Sciences, Shahid Beheshti University, Tehran, Iran
- Nadia Jahangiri
- Faculty of Psychology & Education, Allameh Tabataba'i University, Tehran, Iran
- Reza Khosrowabadi
- Institute for Cognitive and Brain Sciences, Shahid Beheshti University, Tehran, Iran
- Claudia R Eickhoff
- Institute of Neuroscience and Medicine Research, Structural and functional organisation of the brain (INM-1), Jülich Research Center, Jülich, Germany
- Institute of Clinical Neuroscience and Medical Psychology, Medical Faculty, Heinrich Heine University, Düsseldorf, Germany
- Simon B Eickhoff
- Institute of Neuroscience and Medicine Research, Brain and Behaviour (INM-7), Jülich Research Center, Wilhelm-Johnen-Straße, Jülich, Germany
- Institute for Systems Neuroscience, Medical Faculty, Heinrich-Heine University, Düsseldorf, Germany
- Jennifer T Coull
- Laboratoire de Neurosciences Cognitives (UMR 7291), Aix-Marseille Université & CNRS, Marseille, France
- Masoud Tahmasian
- Institute of Neuroscience and Medicine Research, Brain and Behaviour (INM-7), Jülich Research Center, Wilhelm-Johnen-Straße, Jülich, Germany.
- Institute for Systems Neuroscience, Medical Faculty, Heinrich-Heine University, Düsseldorf, Germany.
8
Marchetti G. The self and conscious experience. Front Psychol 2024; 15:1340943. PMID: 38333065. PMCID: PMC10851942. DOI: 10.3389/fpsyg.2024.1340943.
Abstract
The primary determinant of the self (S) is the conscious experience (CE) we have of it. Therefore, it does not come as a surprise that empirical research on S mainly resorts to the CE (or lack of CE) that subjects have of their S. What comes as a surprise is that empirical research on S does not tackle the problem of how CE contributes to building S. Empirical research investigates how S either biases the cognitive processing of stimuli or is altered through a wide range of means (meditation, hypnosis, etc.). In either case, even for different reasons, considerations of how CE contributes to building S are left unspecified in empirical research. This article analyzes these reasons and proposes a theoretical model of how CE contributes to building S. According to the proposed model, the phenomenal aspect of consciousness is produced by the modulation, engendered by attentional activity, of the energy level of the neural substrate (that is, the organ of attention) that underpins attentional activity. The phenomenal aspect of consciousness supplies the agent with a sense of S and informs the agent on how its S is affected by the agent's own operations. The phenomenal aspect of consciousness performs its functions through its five main dimensions: qualitative, quantitative, hedonic, temporal, and spatial. Each dimension of the phenomenal aspect of consciousness can be explained by a specific aspect of the modulation of the energy level of the organ of attention. Among other advantages, the model explains the various forms of S as outcomes resulting from the operations of a single mechanism and provides a unifying framework for empirical research on the neural underpinnings of S.
Affiliation(s)
- Giorgio Marchetti
- Mind, Consciousness and Language Research Center, Alano di Piave, Italy
9
Bonnet P, Bonnefond M, Kösem A. What is a Rhythm for the Brain? The Impact of Contextual Temporal Variability on Auditory Perception. J Cogn 2024; 7:15. PMID: 38250558. PMCID: PMC10798173. DOI: 10.5334/joc.344.
Abstract
Temporal predictions can be formed and impact perception when sensory timing is fully predictable: for instance, the discrimination of a target sound is enhanced if it is presented on the beat of an isochronous rhythm. However, natural sensory stimuli, like speech or music, are not entirely predictable, but still possess statistical temporal regularities. We investigated whether temporal expectations can be formed in non-fully predictable contexts, and how the temporal variability of sensory contexts affects auditory perception. Specifically, we asked how "rhythmic" an auditory stimulation needs to be in order to observe temporal prediction effects on auditory discrimination performance. In this behavioral auditory oddball experiment, participants listened to sound sequences in which the temporal interval between sounds was drawn from Gaussian distributions with distinct standard deviations. Participants were asked to discriminate sounds with a deviant pitch in the sequences. Auditory discrimination performance, as measured by deviant sound discrimination accuracy and response times, progressively declined as the temporal variability of the sound sequence increased. Moreover, both global and local temporal statistics impacted auditory perception, suggesting that temporal statistics are promptly integrated to optimize perception. Altogether, these results suggest that temporal predictions can be set up quickly based on the temporal statistics of past sensory events and are robust to a certain amount of temporal variability. Therefore, temporal predictions can be built on sensory stimulations that are neither purely periodic nor temporally deterministic.
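The stimulus construction described above, onset sequences whose inter-onset intervals come from Gaussian distributions with distinct standard deviations, can be sketched directly. The mean interval, standard deviations, and floor value below are illustrative assumptions, not the study's parameters.

```python
# Hedged sketch: generate sound-onset times with Gaussian inter-onset
# intervals; the SD controls how "rhythmic" the sequence is.
import numpy as np

def make_sequence(n_sounds, mean_ioi, sd, rng, min_ioi=0.1):
    """Onset times (s) with Gaussian inter-onset intervals, floored at min_ioi."""
    iois = np.maximum(rng.normal(mean_ioi, sd, n_sounds - 1), min_ioi)
    return np.concatenate([[0.0], np.cumsum(iois)])

rng = np.random.default_rng(0)
isochronous = make_sequence(50, 0.6, 0.0, rng)   # fully predictable
jittered = make_sequence(50, 0.6, 0.12, rng)     # temporally variable

# The empirical IOI standard deviation indexes the sequence's variability
print(np.diff(isochronous).std(), np.diff(jittered).std())
```

Inserting pitch deviants at random positions in such sequences, and comparing discrimination across SD conditions, reproduces the logic of the experiment.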
Affiliation(s)
- Pierre Bonnet
- Lyon Neuroscience Research Center (CRNL), Computation, Cognition and Neurophysiology team (Cophy), Inserm U1028, Université Claude Bernard Lyon1, CNRS UMR 5292, 69000 Lyon, France
- Mathilde Bonnefond
- Lyon Neuroscience Research Center (CRNL), Computation, Cognition and Neurophysiology team (Cophy), Inserm U1028, Université Claude Bernard Lyon1, CNRS UMR 5292, 69000 Lyon, France
- Anne Kösem
- Lyon Neuroscience Research Center (CRNL), Computation, Cognition and Neurophysiology team (Cophy), Inserm U1028, Université Claude Bernard Lyon1, CNRS UMR 5292, 69000 Lyon, France
10
Coull JT, Korolczuk I, Morillon B. The Motor of Time: Coupling Action to Temporally Predictable Events Heightens Perception. Adv Exp Med Biol 2024; 1455:199-213. PMID: 38918353. DOI: 10.1007/978-3-031-60183-5_11.
Abstract
Timing and motor function share neural circuits and dynamics, which underpin their close and synergistic relationship. For instance, the temporal predictability of a sensory event optimizes motor responses to that event. Knowing when an event is likely to occur lowers response thresholds, leading to faster and more efficient motor behavior, though in situations of response conflict it can induce impulsive and inappropriate responding. In turn, through a process of active sensing, coupling action to temporally predictable sensory input enhances perceptual processing. Action not only hones perception of the event's onset or duration, but also boosts sensory processing of its non-temporal features such as pitch or shape. The effects of temporal predictability on motor behavior and sensory processing involve motor and left parietal cortices and are mediated by changes in delta and beta oscillations in motor areas of the brain.
Affiliation(s)
- Jennifer T Coull
- Centre for Research in Psychology and Neuroscience (UMR 7077), Aix-Marseille Université & CNRS, Marseille, France.
- Inga Korolczuk
- Department of Pathophysiology, Medical University of Lublin, Lublin, Poland
- Benjamin Morillon
- Aix Marseille Université, INSERM, INS, Institut de Neurosciences des Systèmes, Marseille, France
11
Nguyen T, Reisner S, Lueger A, Wass SV, Hoehl S, Markova G. Sing to me, baby: Infants show neural tracking and rhythmic movements to live and dynamic maternal singing. Dev Cogn Neurosci 2023; 64:101313. PMID: 37879243. PMCID: PMC10618693. DOI: 10.1016/j.dcn.2023.101313.
Abstract
Infant-directed singing has unique acoustic characteristics that may allow even very young infants to respond to the rhythms carried through the caregiver's voice. The goal of this study was to examine neural and movement responses to live and dynamic maternal singing in 7-month-old infants and their relation to linguistic development. In total, 60 mother-infant dyads were observed during two singing conditions (playsong and lullaby). In Study 1 (n = 30), we measured infant EEG and used an encoding approach utilizing ridge regressions to measure neural tracking. In Study 2 (n = 40), we coded infant rhythmic movements. In both studies, we assessed children's vocabulary when they were 20 months old. In Study 1, we found above-threshold neural tracking of maternal singing, with superior tracking of lullabies compared with playsongs. We also found that the acoustic features of infant-directed singing modulated tracking. In Study 2, infants showed more rhythmic movement to playsongs than lullabies. Importantly, neural coordination (Study 1) and rhythmic movement (Study 2) to playsongs were positively related to infants' expressive vocabulary at 20 months. These results highlight the importance of infants' brain and movement coordination to their caregiver's musical presentations, potentially as a function of musical variability.
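The ridge-regression encoding approach mentioned above can be sketched in its generic form: predict a neural response from time-lagged stimulus features and quantify tracking as the fit of the prediction. The lag count, regularisation strength, and synthetic data below are assumptions for illustration, not the study's settings.

```python
# Hedged sketch of an encoding model: closed-form ridge regression from
# time-lagged stimulus features (e.g. a singing envelope) to an EEG channel.
import numpy as np

def lagged_design(stim, n_lags):
    """Stack time-lagged copies of the stimulus as regressors."""
    X = np.zeros((stim.size, n_lags))
    for k in range(n_lags):
        X[k:, k] = stim[: stim.size - k]
    return X

def ridge_fit(X, y, alpha):
    """Closed-form ridge solution w = (X'X + aI)^-1 X'y."""
    return np.linalg.solve(X.T @ X + alpha * np.eye(X.shape[1]), X.T @ y)

rng = np.random.default_rng(0)
stim = rng.standard_normal(2000)            # stand-in stimulus feature
true_trf = np.array([0.0, 0.8, 0.4, 0.1])   # a short synthetic "response"
eeg = lagged_design(stim, 4) @ true_trf + 0.3 * rng.standard_normal(2000)

w = ridge_fit(lagged_design(stim, 4), eeg, alpha=1.0)
r = np.corrcoef(lagged_design(stim, 4) @ w, eeg)[0, 1]  # tracking strength
print(r)
```

In practice such models are fit with cross-validated regularisation and richer acoustic features; the correlation between predicted and observed EEG is the "neural tracking" measure compared between lullabies and playsongs.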
Affiliation(s)
- Trinh Nguyen
- Faculty of Psychology, University of Vienna, Liebiggasse 5, 1010 Vienna, Austria; Neuroscience of Perception and Action Lab, Italian Institute of Technology, Viale Regina Elena 291, 00161 Rome, Italy.
- Susanne Reisner
- Faculty of Psychology, University of Vienna, Liebiggasse 5, 1010 Vienna, Austria
- Anja Lueger
- Faculty of Psychology, University of Vienna, Liebiggasse 5, 1010 Vienna, Austria
- Samuel V Wass
- Department of Psychology, University of East London, University Way, London E16 2RD, United Kingdom
- Stefanie Hoehl
- Faculty of Psychology, University of Vienna, Liebiggasse 5, 1010 Vienna, Austria
- Gabriela Markova
- Faculty of Psychology, University of Vienna, Liebiggasse 5, 1010 Vienna, Austria; Institute for Early Life Care, Paracelsus Medical University, Strubergasse 13, 5020 Salzburg, Austria.
12
Meng J, Zhao Y, Wang K, Sun J, Yi W, Xu F, Xu M, Ming D. Rhythmic temporal prediction enhances neural representations of movement intention for brain-computer interface. J Neural Eng 2023; 20:066004. PMID: 37875107. DOI: 10.1088/1741-2552/ad0650.
Abstract
Objective. Detecting movement intention is a typical use of brain-computer interfaces (BCIs). However, as an endogenous electroencephalography (EEG) feature, the neural representation of movement is insufficient for improving motor-based BCIs. This study aimed to develop a new movement-augmentation BCI encoding paradigm by incorporating the cognitive function of rhythmic temporal prediction, and to test the feasibility of this new paradigm in optimizing the detection of movement intention. Methods. A visual-motion synchronization task was designed with two movement intentions (left vs. right) and three rhythmic temporal prediction conditions (1000 ms vs. 1500 ms vs. no temporal prediction). Behavioural and EEG data from 24 healthy participants were recorded. Event-related potentials (ERPs), event-related spectral perturbations induced by left- and right-finger movements, the common spatial pattern (CSP) with a support vector machine, and a Riemannian tangent space algorithm with logistic regression were used and compared across the three temporal prediction conditions, aiming to test the impact of temporal prediction on movement detection. Results. Behavioural results showed significantly smaller deviation times for the 1000 ms and 1500 ms conditions. ERP analyses revealed that the 1000 ms and 1500 ms conditions led to rhythmic oscillations with a time lag in areas contralateral and ipsilateral to the movement. Compared with no temporal prediction, the 1000 ms condition exhibited greater beta event-related desynchronization (ERD) lateralization in the motor area (P < 0.001) and larger beta ERD in the frontal area (P < 0.001). The 1000 ms condition achieved an average left-right decoding accuracy of 89.71% using CSP and 97.30% using the Riemannian tangent space algorithm, both significantly higher than with no temporal prediction. Moreover, movement and temporal information could be decoded simultaneously, achieving 88.51% four-class accuracy. Significance. The results not only confirm the effectiveness of rhythmic temporal prediction in enhancing the detection ability of motor-based BCIs, but also highlight the dual encoding of movement and temporal information within a single BCI paradigm, which promises to expand the range of intentions that can be decoded by BCIs.
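One of the decoding pipelines named in this abstract, common spatial patterns, can be sketched as a generalized eigendecomposition of class covariance matrices. The data below are synthetic; the authors' preprocessing and classifier settings are not reproduced here.

```python
# Hedged sketch of CSP: find spatial filters that maximise the variance of
# one class relative to the pooled variance of both classes.
import numpy as np
from scipy.linalg import eigh

def csp_filters(cov_a, cov_b):
    """Spatial filters maximising variance for class A relative to A+B."""
    vals, vecs = eigh(cov_a, cov_a + cov_b)  # generalized eigendecomposition
    return vecs[:, np.argsort(vals)[::-1]]   # best-for-class-A filters first

rng = np.random.default_rng(0)
n_ch, n_t = 4, 1000
# Class A has extra variance on channel 0, class B on channel 3
a = rng.standard_normal((n_ch, n_t)); a[0] *= 3.0
b = rng.standard_normal((n_ch, n_t)); b[3] *= 3.0

W = csp_filters(np.cov(a), np.cov(b))
proj_a = W.T @ a
# The first CSP component captures the class-A-dominant source
print(np.var(proj_a[0]) > np.var(proj_a[-1]))
```

Log-variances of the first and last few CSP components are the usual features fed to a linear classifier such as an SVM, which matches the CSP branch of the study's comparison.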
Collapse
Affiliation(s)
- Jiayuan Meng
- The Academy of Medical Engineering and Translational Medicine, Tianjin University, Tianjin 300072, People's Republic of China
- Haihe Laboratory of Brain-computer Interaction and Human-machine Integration, Tianjin 300392, People's Republic of China
| | - Yingru Zhao
- The Academy of Medical Engineering and Translational Medicine, Tianjin University, Tianjin 300072, People's Republic of China
| | - Kun Wang
- The Academy of Medical Engineering and Translational Medicine, Tianjin University, Tianjin 300072, People's Republic of China
- Haihe Laboratory of Brain-computer Interaction and Human-machine Integration, Tianjin 300392, People's Republic of China
| | - Jinsong Sun
- The Academy of Medical Engineering and Translational Medicine, Tianjin University, Tianjin 300072, People's Republic of China
| | - Weibo Yi
- Beijing Machine and Equipment Institute, Beijing, People's Republic of China
| | - Fangzhou Xu
- International School for Optoelectronic Engineering, Qilu University of Technology (Shandong Academy of Sciences), Jinan, People's Republic of China
| | - Minpeng Xu
- The Academy of Medical Engineering and Translational Medicine, Tianjin University, Tianjin 300072, People's Republic of China
- Haihe Laboratory of Brain-computer Interaction and Human-machine Integration, Tianjin 300392, People's Republic of China
- International School for Optoelectronic Engineering, Qilu University of Technology (Shandong Academy of Sciences), Jinan, People's Republic of China
| | - Dong Ming
- The Academy of Medical Engineering and Translational Medicine, Tianjin University, Tianjin 300072, People's Republic of China
- Haihe Laboratory of Brain-computer Interaction and Human-machine Integration, Tianjin 300392, People's Republic of China
| |
Collapse
|
13
|
L'Hermite S, Zoefel B. Rhythmic Entrainment Echoes in Auditory Perception. J Neurosci 2023; 43:6667-6678. [PMID: 37604689 PMCID: PMC10538584 DOI: 10.1523/jneurosci.0051-23.2023] [Citation(s) in RCA: 1] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/10/2023] [Revised: 03/10/2023] [Accepted: 03/20/2023] [Indexed: 08/23/2023] Open
Abstract
Rhythmic entrainment echoes, rhythmic brain responses that outlast rhythmic stimulation, can demonstrate endogenous neural oscillations entrained by the stimulus rhythm. Here, we tested for such echoes in auditory perception. Participants detected a pure tone target, presented at a variable delay after another pure tone that was rhythmically modulated in amplitude. In four experiments involving 154 human (female and male) participants, we tested (1) which stimulus rate produces the strongest entrainment echo and, inspired by the tonotopic organization of the auditory system and findings in nonhuman primates, (2) whether these echoes are organized according to sound frequency. We found the strongest entrainment echoes after 6 and 8 Hz stimulation. The best moments for target detection (in phase or antiphase with the preceding rhythm) depended on whether the sound frequencies of the entraining and target stimuli matched, in line with a tonotopic organization. However, for the same experimental condition, the best moments were not always consistent across experiments. We provide a speculative explanation for these differences, based on the notion that neural entrainment and repetition-related adaptation might exert competing, opposite influences on perception. Together, we find rhythmic echoes in auditory perception that seem more complex than those predicted by initial theories of neural entrainment. SIGNIFICANCE STATEMENT: Rhythmic entrainment echoes are rhythmic brain responses that are produced by a rhythmic stimulus and persist after its offset. These echoes play an important role in the identification of endogenous brain oscillations entrained by rhythmic stimulation, and give us insight into whether and how participants predict the timing of events. In four independent experiments involving >150 participants, we examined entrainment echoes in auditory perception.
We found that entrainment echoes have a preferred rate (between 6 and 8 Hz) and seem to follow the tonotopic organization of the auditory system. Although speculative, we also found evidence that several, potentially competing processes might interact to produce such echoes, a notion that might need to be considered for future experimental design.
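Behavioural entrainment echoes of this kind are typically quantified by spectral analysis of the detection time course across target delays. A minimal sketch of that idea on synthetic detection rates (the 8 Hz echo rate is taken from the abstract; every other number is illustrative, and this is not the authors' analysis pipeline):

```python
import numpy as np

rng = np.random.default_rng(1)
fs = 40.0                                  # one probed delay every 25 ms
delays = np.arange(0, 1.0, 1 / fs)         # 40 delay bins over 1 s
f_echo = 8.0                               # echo rate reported in the abstract

# toy detection-rate profile: slow drift + 8 Hz echo + measurement noise
rate = (0.7 + 0.05 * delays
        + 0.10 * np.cos(2 * np.pi * f_echo * delays)
        + 0.01 * rng.standard_normal(delays.size))

# remove the slow trend, then look for a spectral peak across delays
trend = np.polyval(np.polyfit(delays, rate, 1), delays)
spec = np.abs(np.fft.rfft((rate - trend) * np.hanning(delays.size)))
freqs = np.fft.rfftfreq(delays.size, 1 / fs)
peak = freqs[np.argmax(spec[1:]) + 1]      # skip the DC bin
print(f"strongest behavioural oscillation at ~{peak:.1f} Hz")
```

Detrending before the FFT matters: slow performance drifts would otherwise dominate the low-frequency bins and mask the echo.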
Collapse
Affiliation(s)
| | - Benedikt Zoefel
- Université de Toulouse III-Paul Sabatier, 31062 Toulouse, France
- Centre National de la Recherche Scientifique, Centre de Recherche Cerveau et Cognition, Centre Hospitalier Universitaire Purpan, 31052 Toulouse, France
| |
Collapse
|
14
|
Rosso M, Moens B, Leman M, Moumdjian L. Neural entrainment underpins sensorimotor synchronization to dynamic rhythmic stimuli. Neuroimage 2023; 277:120226. [PMID: 37321359 DOI: 10.1016/j.neuroimage.2023.120226] [Citation(s) in RCA: 3] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/25/2023] [Revised: 05/02/2023] [Accepted: 06/12/2023] [Indexed: 06/17/2023] Open
Abstract
Neural entrainment, defined as unidirectional synchronization of neural oscillations to an external rhythmic stimulus, is a topic of major interest in neuroscience. Despite broad scientific consensus on its existence, on its pivotal role in sensory and motor processes, and on its fundamental definition, empirical research struggles to quantify it with non-invasive electrophysiology. To date, broadly adopted state-of-the-art methods still fail to capture the dynamics underlying the phenomenon. Here, we present event-related frequency adjustment (ERFA) as a methodological framework to induce and measure neural entrainment in human participants, optimized for multivariate EEG datasets. By applying dynamic phase and tempo perturbations to isochronous auditory metronomes during a finger-tapping task, we analyzed adaptive changes in the instantaneous frequency of entrained oscillatory components during error correction. Spatial filter design allowed us to untangle, from the multivariate EEG signal, perceptual and sensorimotor oscillatory components attuned to the stimulation frequency. Both components dynamically adjusted their frequency in response to perturbations, tracking the stimulus dynamics by slowing down and speeding up the oscillation over time. Source separation revealed that sensorimotor processing enhanced the entrained response, supporting the notion that active engagement of the motor system plays a critical role in processing rhythmic stimuli. For phase shifts, motor engagement was a necessary condition for observing any response, whereas sustained tempo changes induced frequency adjustment even in the perceptual oscillatory component. Although the magnitude of the perturbations was matched across positive and negative directions, we observed a general bias in the frequency adjustments towards positive changes, which points to intrinsic dynamics constraining neural entrainment.
We conclude that our findings provide compelling evidence for neural entrainment as a mechanism underlying overt sensorimotor synchronization, and highlight that our methodology offers a paradigm and a measure for quantifying its oscillatory dynamics with non-invasive electrophysiology, rigorously informed by the fundamental definition of entrainment.
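The central quantity in ERFA, the instantaneous frequency of an entrained component, is conventionally derived from the Hilbert-transform phase. A toy sketch of that step alone (a synthetic component whose tempo slows midway; no spatial filtering or real EEG, so this is only the measurement idea, not the authors' full pipeline):

```python
import numpy as np
from scipy.signal import hilbert

fs = 250.0
t = np.arange(0, 10, 1 / fs)
# synthetic entrained component: tempo slows from 2.0 Hz to 1.8 Hz at t = 5 s
f_true = np.where(t < 5, 2.0, 1.8)
x = np.cos(2 * np.pi * np.cumsum(f_true) / fs)

# instantaneous frequency = time derivative of the Hilbert phase
phase = np.unwrap(np.angle(hilbert(x)))
f_inst = np.diff(phase) * fs / (2 * np.pi)

early = f_inst[(t[:-1] > 1) & (t[:-1] < 4)].mean()
late = f_inst[(t[:-1] > 6) & (t[:-1] < 9)].mean()
print(f"estimated tempo: early ~{early:.2f} Hz, late ~{late:.2f} Hz")
```

On real EEG the signal would first be band-passed around the stimulation frequency, since the Hilbert phase is only meaningful for narrowband components.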
Collapse
Affiliation(s)
- Mattia Rosso
- IPEM Institute for Systematic Musicology, Ghent University, Ghent, Belgium; Université de Lille, ULR 4072 - PSITEC - Psychologie: Interactions, Temps, Emotions, Cognition, Lille, France.
| | - Bart Moens
- IPEM Institute for Systematic Musicology, Ghent University, Ghent, Belgium
| | - Marc Leman
- IPEM Institute for Systematic Musicology, Ghent University, Ghent, Belgium
| | - Lousin Moumdjian
- IPEM Institute for Systematic Musicology, Ghent University, Ghent, Belgium; REVAL Rehabilitation Research Center, Faculty of Rehabilitation Sciences, Hasselt University, Hasselt, Belgium; UMSC Hasselt, Pelt, Belgium
| |
Collapse
|
15
|
Gunasekaran H, Azizi L, van Wassenhove V, Herbst SK. Characterizing endogenous delta oscillations in human MEG. Sci Rep 2023; 13:11031. [PMID: 37419933 PMCID: PMC10328979 DOI: 10.1038/s41598-023-37514-1] [Citation(s) in RCA: 2] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/05/2023] [Accepted: 06/22/2023] [Indexed: 07/09/2023] Open
Abstract
Rhythmic activity in the delta frequency range (0.5-3 Hz) is a prominent feature of brain dynamics. Here, we examined whether spontaneous delta oscillations, as found in invasive recordings in awake animals, can be observed in non-invasive recordings performed in humans with magnetoencephalography (MEG). In humans, delta activity is commonly reported when processing rhythmic sensory inputs, with direct relationships to behaviour. However, rhythmic brain dynamics observed during rhythmic sensory stimulation cannot be interpreted as an endogenous oscillation. To test for endogenous delta oscillations, we analysed human MEG data during rest. For comparison, we additionally analysed two conditions in which participants engaged in spontaneous finger tapping and silent counting, arguing that internally rhythmic behaviours could incite an otherwise silent neural oscillator. A novel set of analysis steps allowed us to show narrow spectral peaks in the delta frequency range at rest and during overt and covert rhythmic activity. Additional analyses in the time domain revealed that only the resting state condition warranted an interpretation of these peaks as endogenously periodic neural dynamics. In sum, this work shows that using advanced signal processing techniques, it is possible to observe endogenous delta oscillations in non-invasive recordings of human brain dynamics.
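One generic way to demonstrate a narrow spectral peak riding on an aperiodic 1/f-like background, in the spirit of (but not identical to) the analysis described here, is to compare a Welch PSD against a median-smoothed version of itself; a narrow peak survives the comparison while the smooth background cancels out:

```python
import numpy as np
from scipy.signal import welch, medfilt

fs, dur = 200.0, 120.0
rng = np.random.default_rng(2)
t = np.arange(0, dur, 1 / fs)
background = np.cumsum(rng.standard_normal(t.size))       # crude 1/f-like noise
background = (background - background.mean()) / background.std()
x = background + 0.5 * np.sin(2 * np.pi * 1.5 * t)        # narrow 1.5 Hz rhythm

freqs, psd = welch(x, fs=fs, nperseg=4096)
# a narrow peak stands out against a median-smoothed estimate of the background
ratio = psd / np.maximum(medfilt(psd, 31), 1e-12)
band = (freqs >= 0.5) & (freqs <= 3.0)                    # delta range
peak_f = freqs[band][np.argmax(ratio[band])]
print(f"narrow delta-band peak at ~{peak_f:.2f} Hz")
```

The median filter is deliberately wider than the peak, so the peak bins barely influence the local background estimate they are divided by.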
Collapse
Affiliation(s)
- Harish Gunasekaran
- Cognitive Neuroimaging Unit, NeuroSpin, CEA, INSERM, CNRS, Université Paris-Saclay, 91191, Gif/Yvette, France
| | - Leila Azizi
- Cognitive Neuroimaging Unit, NeuroSpin, CEA, INSERM, CNRS, Université Paris-Saclay, 91191, Gif/Yvette, France
| | - Virginie van Wassenhove
- Cognitive Neuroimaging Unit, NeuroSpin, CEA, INSERM, CNRS, Université Paris-Saclay, 91191, Gif/Yvette, France
| | - Sophie K Herbst
- Cognitive Neuroimaging Unit, NeuroSpin, CEA, INSERM, CNRS, Université Paris-Saclay, 91191, Gif/Yvette, France.
| |
Collapse
|
16
|
Musical tempo affects EEG spectral dynamics during subsequent time estimation. Biol Psychol 2023; 178:108517. [PMID: 36801434 DOI: 10.1016/j.biopsycho.2023.108517] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/28/2022] [Revised: 01/24/2023] [Accepted: 02/12/2023] [Indexed: 02/19/2023]
Abstract
The perception of time depends on the rhythmicity of internal and external synchronizers. One external synchronizer that affects time estimation is music. This study aimed to analyze the effects of musical tempi on EEG spectral dynamics during subsequent time estimation. Participants performed a time production task after (i) silence and (ii) listening to music at different tempi (90, 120, and 150 bpm) while EEG activity was recorded. While listening, there was an increase in alpha power at all tempi compared to the resting state and an increase in beta at the fastest tempo. The beta increase persisted during the subsequent time estimations, with higher beta power during the task after listening to music at the fastest tempo than during task performance without music. Spectral dynamics in frontal regions showed lower alpha activity in the final stages of time estimation after listening to music at 90 and 120 bpm than in the silence condition, and higher beta in the early stages at 150 bpm. Behaviorally, the 120 bpm musical tempo produced slight improvements. Listening to music modified tonic EEG activity, which subsequently affected EEG dynamics during time production. Music at a more optimal rate could have benefited temporal expectation and anticipation. The fastest musical tempo may have generated an over-activated state that affected subsequent time estimations. These results emphasize the importance of music as an external stimulus that can affect brain functional organization during time perception even after listening.
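The alpha and beta effects above rest on band-power estimates. A generic sketch of that computation (Welch PSD integrated over a band, applied to a synthetic signal; the study's exact EEG pipeline is not specified in the abstract):

```python
import numpy as np
from scipy.signal import welch

def band_power(x, fs, lo, hi):
    # integrate the Welch PSD over [lo, hi] Hz
    freqs, psd = welch(x, fs=fs, nperseg=int(2 * fs))
    sel = (freqs >= lo) & (freqs <= hi)
    return psd[sel].sum() * (freqs[1] - freqs[0])

fs = 250.0
t = np.arange(0, 30, 1 / fs)
rng = np.random.default_rng(4)
# synthetic EEG: strong 10 Hz (alpha) and weaker 20 Hz (beta) components in noise
eeg = (np.sin(2 * np.pi * 10 * t)
       + 0.3 * np.sin(2 * np.pi * 20 * t)
       + 0.5 * rng.standard_normal(t.size))
alpha = band_power(eeg, fs, 8, 12)
beta = band_power(eeg, fs, 13, 30)
print(f"alpha power {alpha:.3f} vs beta power {beta:.3f}")
```

A two-second Welch segment gives 0.5 Hz resolution, enough to separate the conventional alpha (8-12 Hz) and beta (13-30 Hz) bands used here.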
Collapse
|
17
|
Kazanina N, Tavano A. What neural oscillations can and cannot do for syntactic structure building. Nat Rev Neurosci 2023; 24:113-128. [PMID: 36460920 DOI: 10.1038/s41583-022-00659-5] [Citation(s) in RCA: 8] [Impact Index Per Article: 8.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 11/02/2022] [Indexed: 12/04/2022]
Abstract
Understanding what someone says requires relating the words in a sentence to one another as instructed by the grammatical rules of a language. In recent years, the neurophysiological basis of this process has become a prominent topic of discussion in cognitive neuroscience. Current proposals about the neural mechanisms of syntactic structure building converge on a key role for neural oscillations in this process, but they differ in terms of the exact function that is assigned to them. In this Perspective, we discuss two proposed functions for neural oscillations, chunking and multiscale information integration, and evaluate their merits and limitations, taking into account the fundamentally hierarchical nature of syntactic representations in natural languages. We highlight insights that provide a tangible starting point for a neurocognitive model of syntactic structure building.
Collapse
Affiliation(s)
- Nina Kazanina
- University of Bristol, Bristol, UK.
- Higher School of Economics, Moscow, Russia.
| | | |
Collapse
|
18
|
Makov S, Pinto D, Har-Shai Yahav P, Miller LM, Zion Golumbic E. "Unattended, distracting or irrelevant": Theoretical implications of terminological choices in auditory selective attention research. Cognition 2023; 231:105313. [PMID: 36344304 DOI: 10.1016/j.cognition.2022.105313] [Citation(s) in RCA: 1] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/17/2022] [Revised: 09/30/2022] [Accepted: 10/19/2022] [Indexed: 11/06/2022]
Abstract
For seventy years, auditory selective attention research has focused on the cognitive mechanisms of prioritizing the processing of a 'main' task-relevant stimulus in the presence of 'other' stimuli. However, a closer look at this body of literature reveals deep empirical inconsistencies and theoretical confusion regarding the extent to which this 'other' stimulus is processed. We argue that many key debates regarding attention arise, at least in part, from inappropriate terminological choices for experimental variables that may not accurately map onto the cognitive constructs they are meant to describe. Here we critically review the more common or disruptive terminological ambiguities, differentiate between methodology-based and theory-derived terms, and unpack the theoretical assumptions underlying different terminological choices. In particular, we offer an in-depth analysis of the terms 'unattended' and 'distractor' and demonstrate how their use can lead to conflicting theoretical inferences. We also offer a framework for thinking about terminology in a more productive and precise way, in the hope of fostering more productive debates and promoting more nuanced and accurate cognitive models of selective attention.
Collapse
Affiliation(s)
- Shiri Makov
- The Gonda Multidisciplinary Center for Brain Research, Bar Ilan University, Israel
| | - Danna Pinto
- The Gonda Multidisciplinary Center for Brain Research, Bar Ilan University, Israel
| | - Paz Har-Shai Yahav
- The Gonda Multidisciplinary Center for Brain Research, Bar Ilan University, Israel
| | - Lee M Miller
- The Center for Mind and Brain, University of California, Davis, CA, United States of America; Department of Neurobiology, Physiology, & Behavior, University of California, Davis, CA, United States of America; Department of Otolaryngology / Head and Neck Surgery, University of California, Davis, CA, United States of America
| | - Elana Zion Golumbic
- The Gonda Multidisciplinary Center for Brain Research, Bar Ilan University, Israel.
| |
Collapse
|
19
|
Snapiri L, Kaplan Y, Shalev N, Landau AN. Rhythmic modulation of visual discrimination is linked to individuals' spontaneous motor tempo. Eur J Neurosci 2023; 57:646-656. [PMID: 36512369 DOI: 10.1111/ejn.15898] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/18/2022] [Revised: 11/17/2022] [Accepted: 12/01/2022] [Indexed: 12/15/2022]
Abstract
The impact of external rhythmic structure on perception has been demonstrated across different modalities and experimental paradigms. However, recent findings emphasize substantial individual differences in rhythm-based perceptual modulation. Here, we examine the link between spontaneous rhythmic preferences, as measured through the motor system, and individual differences in rhythmic modulation of visual discrimination. As a first step, we measure individual rhythmic preferences using the spontaneous tapping task. Then we assess perceptual rhythmic modulation using a visual discrimination task in which targets can appear either in-phase or out-of-phase with a preceding rhythmic stream of visual stimuli. The tempo of the preceding stream was manipulated over different experimental blocks (0.77 Hz, 1.4 Hz, 2 Hz). We find that visual rhythmic stimulation modulates discrimination performance. The modulation is dependent on the tempo of stimulation, with maximal perceptual benefits for the slowest tempo of stimulation (0.77 Hz). Most importantly, the strength of modulation is also linked to individuals' spontaneous motor tempo. Individuals with slower spontaneous tempi show greater rhythmic modulation compared to individuals with faster spontaneous tempi. This finding suggests that different tempi affect the cognitive system with varying levels of efficiency and that self-generated rhythms impact our ability to utilize rhythmic structure in the environment for guiding perception and performance.
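Whether a target falls "in-phase" or "out-of-phase" with the preceding stream is determined by where its delay lands within the stimulation cycle. A minimal helper illustrating that arithmetic (the π/4 tolerance is an arbitrary choice for illustration, not a parameter from the study):

```python
import numpy as np

def rhythm_phase(delay_s, tempo_hz):
    # phase (radians) of a target delay within the entraining rhythm's cycle
    return (2 * np.pi * tempo_hz * delay_s) % (2 * np.pi)

def is_in_phase(delay_s, tempo_hz, tol=np.pi / 4):
    # "in-phase" if the delay lands within tol of a cycle boundary
    ph = rhythm_phase(delay_s, tempo_hz)
    return min(ph, 2 * np.pi - ph) < tol

# at 0.77 Hz the period is ~1.30 s: a delay of one full period is in-phase,
# half a period is in antiphase
period = 1 / 0.77
print(is_in_phase(period, 0.77), is_in_phase(period / 2, 0.77))
```

Note that the same physical delay is in-phase at one tempo and out-of-phase at another, which is why the tempo manipulation across blocks matters.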
Collapse
Affiliation(s)
- Leah Snapiri
- Department of Psychology, The Hebrew University of Jerusalem, Jerusalem, Israel
| | - Yael Kaplan
- Department of Psychology, The Hebrew University of Jerusalem, Jerusalem, Israel
| | - Nir Shalev
- Department of Experimental Psychology, University of Oxford, Oxford, UK
| | - Ayelet N Landau
- Department of Psychology, The Hebrew University of Jerusalem, Jerusalem, Israel; Department of Cognitive Science, The Hebrew University of Jerusalem, Jerusalem, Israel
| |
Collapse
|
20
|
Luo L, Lu L. Studying rhythm processing in speech through the lens of auditory-motor synchronization. Front Neurosci 2023; 17:1146298. [PMID: 36937684 PMCID: PMC10017839 DOI: 10.3389/fnins.2023.1146298] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/17/2023] [Accepted: 02/20/2023] [Indexed: 03/06/2023] Open
Abstract
Continuous speech is organized into a hierarchy of rhythms. Accurate processing of this rhythmic hierarchy through the interactions of auditory and motor systems is fundamental to speech perception and production. In this mini-review, we aim to evaluate the implementation of behavioral auditory-motor synchronization paradigms when studying rhythm processing in speech. First, we present an overview of the classic finger-tapping paradigm and its application in revealing differences in auditory-motor synchronization between the typical and clinical populations. Next, we highlight key findings on rhythm hierarchy processing in speech and non-speech stimuli from finger-tapping studies. Following this, we discuss the potential caveats of the finger-tapping paradigm and propose the speech-speech synchronization (SSS) task as a promising tool for future studies. Overall, we seek to raise interest in developing new methods to shed light on the neural mechanisms of speech processing.
Collapse
Affiliation(s)
- Lu Luo
- School of Psychology, Beijing Sport University, Beijing, China
- Laboratory of Sports Stress and Adaptation of General Administration of Sport, Beijing, China
| | - Lingxi Lu
- Center for the Cognitive Science of Language, Beijing Language and Culture University, Beijing, China
- *Correspondence: Lingxi Lu
| |
Collapse
|
21
|
Wu W, Huang X, Qi X, Lu Y. Bias of Attentional Oscillations in Individuals with Subthreshold Depression: Evidence from a Pre-Cueing Facial Expression Judgment Task. Int J Environ Res Public Health 2022; 19:14559. [PMID: 36361443 PMCID: PMC9654165 DOI: 10.3390/ijerph192114559] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Received: 10/12/2022] [Revised: 10/30/2022] [Accepted: 11/03/2022] [Indexed: 06/16/2023]
Abstract
Background: Study results regarding attentional bias in depressed individuals are inconsistent. Recent studies have found that attention is a discrete process, alternating between periods of either enhanced or diminished attention sensitivity. Whether a visual target can be detected depends on when it occurs relative to these oscillation rhythms. We infer that the inconsistency of attentional bias may be related to abnormal attentional oscillations in depressed individuals. Methods: A pre-cueing attentional task was used. We set 48 levels of stimulus onset asynchrony (SOA) between cues and targets and measured the response times (RTs) of participants, as well as their EEG signals. Results: The RTs showed patterns of behavioral oscillations. Repeated-measures ANOVA indicated that subthreshold depressed participants had significantly higher RTs for negative expressions than for neutral ones but significantly lower RTs for positive than for neutral ones. The frequency analysis indicated that the RT oscillation frequency of subthreshold depressed participants for negative/positive expressions differed from that for neutral expressions. The EEG time-frequency analysis showed that when faced with negative expressions, the neural alpha oscillatory power of subthreshold depressed participants was significantly lower than that of normal controls; when faced with positive expressions, it was significantly higher. Conclusion: Compared to normal persons, subthreshold depressed individuals may have biases in both the amplitude and frequency of attentional oscillations. These attentional biases correspond to the intensity of their neural alpha wave rhythms.
Collapse
|
22
|
Laroche J, Tomassini A, Volpe G, Camurri A, Fadiga L, D’Ausilio A. Interpersonal sensorimotor communication shapes intrapersonal coordination in a musical ensemble. Front Hum Neurosci 2022; 16:899676. [PMID: 36248684 PMCID: PMC9556642 DOI: 10.3389/fnhum.2022.899676] [Citation(s) in RCA: 4] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/19/2022] [Accepted: 09/01/2022] [Indexed: 11/25/2022] Open
Abstract
Social behaviors rely on the coordination of multiple effectors within one's own body as well as between the interacting bodies. However, little is known about how coupling at the interpersonal level impacts coordination among body parts at the intrapersonal level, especially in ecological, complex situations. Here, we perturbed interpersonal sensorimotor communication in violin players of an orchestra and investigated how this impacted the musicians' intrapersonal movement coordination. More precisely, first-section violinists were asked to turn their backs to the conductor and to face the second section of violinists, who still faced the conductor. Motion capture of head and bow kinematics showed that altering the usual interpersonal coupling scheme increased intrapersonal coordination. The perturbation also induced smaller yet more complex head movements, which spanned multiple, faster timescales that closely matched the metrical levels of the musical score. Importantly, the perturbation differentially increased intrapersonal coordination across these timescales. We interpret this behavioral shift as a sensorimotor strategy that exploits periodic movements to effectively tune sensory processing in time, allowing the players to cope with the disruption of the interpersonal coupling scheme. As such, head movements, which are usually deemed to fulfill communicative functions, may be adapted to help regulate one's own performance in time.
Collapse
Affiliation(s)
- Julien Laroche
- Center for Translational Neurophysiology of Speech and Communication, Italian Institute of Technology, Ferrara, Italy
| | - Alice Tomassini
- Center for Translational Neurophysiology of Speech and Communication, Italian Institute of Technology, Ferrara, Italy
| | - Gualtiero Volpe
- Casa Paganini – InfoMus Research Centre, Department of Informatics, Bioengineering, Robotics and Systems Engineering (DIBRIS), University of Genova, Genova, Italy
| | - Antonio Camurri
- Casa Paganini – InfoMus Research Centre, Department of Informatics, Bioengineering, Robotics and Systems Engineering (DIBRIS), University of Genova, Genova, Italy
| | - Luciano Fadiga
- Center for Translational Neurophysiology of Speech and Communication, Italian Institute of Technology, Ferrara, Italy
- Sezione di Fisiologia, Dipartimento di Neuroscienze e Riabilitazione, Università di Ferrara, Ferrara, Italy
| | - Alessandro D’Ausilio
- Center for Translational Neurophysiology of Speech and Communication, Italian Institute of Technology, Ferrara, Italy
- Sezione di Fisiologia, Dipartimento di Neuroscienze e Riabilitazione, Università di Ferrara, Ferrara, Italy
| |
Collapse
|
23
|
Weineck K, Wen OX, Henry MJ. Neural synchronization is strongest to the spectral flux of slow music and depends on familiarity and beat salience. eLife 2022; 11:e75515. [PMID: 36094165 PMCID: PMC9467512 DOI: 10.7554/elife.75515] [Citation(s) in RCA: 10] [Impact Index Per Article: 5.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/12/2021] [Accepted: 07/25/2022] [Indexed: 11/29/2022] Open
Abstract
Neural activity in the auditory system synchronizes to sound rhythms, and brain–environment synchronization is thought to be fundamental to successful auditory perception. Sound rhythms are often operationalized in terms of the sound’s amplitude envelope. We hypothesized that, especially for music, the envelope might not best capture the complex spectro-temporal fluctuations that give rise to beat perception and synchronized neural activity. This study investigated (1) neural synchronization to different musical features, (2) tempo-dependence of neural synchronization, and (3) dependence of synchronization on familiarity, enjoyment, and ease of beat perception. In this electroencephalography study, 37 human participants listened to tempo-modulated music (1–4 Hz). Independent of whether the analysis approach was based on temporal response functions (TRFs) or reliable components analysis (RCA), the spectral flux of music, as opposed to the amplitude envelope, evoked the strongest neural synchronization. Moreover, music with slower beat rates, high familiarity, and easy-to-perceive beats elicited the strongest neural response. Our results demonstrate the importance of spectro-temporal fluctuations in music for driving neural synchronization, and highlight its sensitivity to musical tempo, familiarity, and beat salience. When we listen to a melody, the activity of our neurons synchronizes to the music: in fact, it is likely that the closer the match, the better we can perceive the piece. However, it remains unclear exactly which musical features our brain cells synchronize to. Previous studies, which have often used ‘simplified’ music, have highlighted that the amplitude envelope (how the intensity of the sounds changes over time) could be involved in this phenomenon, alongside factors such as musical training, attention, familiarity with the piece or even enjoyment.
Whether differences in neural synchronization could explain why musical tastes vary between people is also still a matter of debate. In their study, Weineck et al. aim to better understand what drives neuronal synchronization to music. A technique known as electroencephalography was used to record brain activity in 37 volunteers listening to instrumental music whose tempo ranged from 60 to 240 beats per minute. The tunes varied across an array of features such as familiarity, enjoyment and how easy the beat was to perceive. Two different approaches were then used to calculate neural synchronization, which yielded converging results. The analyses revealed that three types of factors were associated with a strong neural synchronization. First, amongst the various cadences, a tempo of 60-120 beats per minute elicited the strongest match with neuronal activity. Interestingly, this beat is commonly found in Western pop music, is usually preferred by listeners, and often matches spontaneous body rhythms such as walking pace. Second, synchronization was linked to variations in pitch and sound quality (known as ‘spectral flux’) rather than in the amplitude envelope. And finally, familiarity and perceived beat saliency – but not enjoyment or musical expertise – were connected to stronger synchronization. These findings help to better understand how our brains allow us to perceive and connect with music. The work conducted by Weineck et al. should help other researchers to investigate this field; in particular, it shows how important it is to consider spectral flux rather than amplitude envelope in experiments that use actual music.
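Spectral flux, the feature that best predicted synchronization here, measures frame-to-frame change in the magnitude spectrum. A generic implementation of the standard definition (frame and hop sizes are illustrative, not the authors' parameters), demonstrated on a toy signal whose amplitude envelope is flat but whose pitch jumps:

```python
import numpy as np

def spectral_flux(x, n_fft=1024, hop=512):
    # sum of positive magnitude-spectrum changes between consecutive frames
    win = np.hanning(n_fft)
    frames = np.array([x[i:i + n_fft] * win
                       for i in range(0, len(x) - n_fft, hop)])
    mags = np.abs(np.fft.rfft(frames, axis=1))
    return np.maximum(np.diff(mags, axis=0), 0.0).sum(axis=1)

fs = 22050
t = np.arange(0, 2.0, 1 / fs)
# toy "music": constant-amplitude tone whose pitch jumps every 0.25 s,
# so the envelope is flat while the spectrum changes at each jump
pitches = 220 * 2 ** (np.arange(8) / 12)
tone = np.concatenate([np.sin(2 * np.pi * p * t[:len(t) // 8]) for p in pitches])
flux = spectral_flux(tone)
print(f"{flux.size} flux values; the largest mark the pitch changes")
```

The toy makes the abstract's point concrete: an envelope-based feature is blind to these pitch changes, while spectral flux spikes at each one.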
Collapse
Affiliation(s)
- Kristin Weineck
- Research Group "Neural and Environmental Rhythms", Max Planck Institute for Empirical Aesthetics, Frankfurt am Main, Germany; Goethe University Frankfurt, Institute for Cell Biology and Neuroscience, Frankfurt am Main, Germany
| | - Olivia Xin Wen
- Research Group "Neural and Environmental Rhythms", Max Planck Institute for Empirical Aesthetics, Frankfurt am Main, Germany
| | - Molly J Henry
- Research Group "Neural and Environmental Rhythms", Max Planck Institute for Empirical Aesthetics, Frankfurt am Main, Germany; Department of Psychology, Toronto Metropolitan University, Toronto, Canada
| |
Collapse
|
24
|
Zhang H, Yang S, Qiao Y, Ge Q, Tang Y, Northoff G, Zang Y. Default mode network mediates low-frequency fluctuations in brain activity and behavior during sustained attention. Hum Brain Mapp 2022; 43:5478-5489. [PMID: 35903957 PMCID: PMC9704793 DOI: 10.1002/hbm.26024] [Citation(s) in RCA: 2] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/09/2022] [Revised: 07/02/2022] [Accepted: 07/10/2022] [Indexed: 01/15/2023] Open
Abstract
The low-frequency (<0.1 Hz) fluctuation in sustained attention attracts enormous interest in cognitive neuroscience and clinical research because it often leads to cognitive and behavioral lapses. What is the neural source of this spontaneous fluctuation in sustained attention, and how does the neural fluctuation relate to behavioral fluctuation? Here, we address these questions by collecting and analyzing two independent fMRI and behavioral datasets. We show that the neural (fMRI) fluctuation in a key brain network, the default-mode network (DMN), mediates behavioral (reaction time) fluctuation during sustained attention. The DMN shows an increased amplitude of fluctuation that correlates with the behavioral fluctuation in a similar frequency range (0.01-0.1 Hz) but not in the lower (<0.01 Hz) or higher (>0.1 Hz) ranges. This was observed during both auditory and visual sustained attention and was replicable across the independent datasets. These results provide novel insight into the neural source of attention fluctuation and extend the earlier view that the DMN is simply deactivated during cognitive tasks. More generally, our findings highlight the temporal dynamics of the brain-behavior relationship.
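The "amplitude of fluctuation" within a band such as 0.01-0.1 Hz can be illustrated with an ALFF-style calculation (mean FFT amplitude across the band's frequency bins). This is a generic sketch of that family of measures, not the authors' exact analysis; the sampling interval and test signals are made-up values.

```python
import numpy as np

def band_amplitude(ts, dt, f_lo=0.01, f_hi=0.1):
    """ALFF-style amplitude of fluctuation: mean FFT amplitude of a
    demeaned time series within a frequency band, in signal units."""
    ts = np.asarray(ts, dtype=float)
    ts = ts - ts.mean()
    freqs = np.fft.rfftfreq(len(ts), d=dt)
    amps = np.abs(np.fft.rfft(ts)) / len(ts)
    band = (freqs >= f_lo) & (freqs <= f_hi)
    return amps[band].mean()

# A 0.05 Hz oscillation (inside the band) vs. a 0.20 Hz one (outside):
dt = 2.0                      # e.g. a 2 s sampling interval
t = np.arange(300) * dt       # 10-minute series
slow = np.sin(2 * np.pi * 0.05 * t)
fast = np.sin(2 * np.pi * 0.20 * t)
```

The band restriction is what lets such a measure separate the 0.01-0.1 Hz range from the slower and faster ranges contrasted in the abstract.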
Affiliation(s)
- Hang Zhang
- Centre for Cognition and Brain Disorders, The Affiliated Hospital of Hangzhou Normal University, Hangzhou, Zhejiang, China; Institute of Psychological Science, Hangzhou Normal University, Hangzhou, Zhejiang, China; Zhejiang Key Laboratory for Research in Assessment of Cognitive Impairment, Hangzhou, Zhejiang, China
- Shi‐You Yang
- Centre for Cognition and Brain Disorders, The Affiliated Hospital of Hangzhou Normal University, Hangzhou, Zhejiang, China; Institute of Psychological Science, Hangzhou Normal University, Hangzhou, Zhejiang, China; Zhejiang Key Laboratory for Research in Assessment of Cognitive Impairment, Hangzhou, Zhejiang, China
- Yang Qiao
- Centre for Cognition and Brain Disorders, The Affiliated Hospital of Hangzhou Normal University, Hangzhou, Zhejiang, China; Institute of Psychological Science, Hangzhou Normal University, Hangzhou, Zhejiang, China; Zhejiang Key Laboratory for Research in Assessment of Cognitive Impairment, Hangzhou, Zhejiang, China
- Qiu Ge
- Centre for Cognition and Brain Disorders, The Affiliated Hospital of Hangzhou Normal University, Hangzhou, Zhejiang, China; Institute of Psychological Science, Hangzhou Normal University, Hangzhou, Zhejiang, China; Zhejiang Key Laboratory for Research in Assessment of Cognitive Impairment, Hangzhou, Zhejiang, China
- Yi‐Yuan Tang
- College of Health Solutions, Arizona State University, Tempe, Arizona, USA
- Georg Northoff
- Centre for Cognition and Brain Disorders, The Affiliated Hospital of Hangzhou Normal University, Hangzhou, Zhejiang, China; Institute of Mental Health Research, University of Ottawa, Ottawa, Canada
- Yu‐Feng Zang
- Centre for Cognition and Brain Disorders, The Affiliated Hospital of Hangzhou Normal University, Hangzhou, Zhejiang, China; Institute of Psychological Science, Hangzhou Normal University, Hangzhou, Zhejiang, China; Zhejiang Key Laboratory for Research in Assessment of Cognitive Impairment, Hangzhou, Zhejiang, China
25
Marchetti G. The why of the phenomenal aspect of consciousness: Its main functions and the mechanisms underpinning it. Front Psychol 2022; 13:913309. [PMID: 35967722 PMCID: PMC9368316 DOI: 10.3389/fpsyg.2022.913309] [Citation(s) in RCA: 4] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/05/2022] [Accepted: 07/01/2022] [Indexed: 12/02/2022] Open
Abstract
What distinguishes conscious information processing from other kinds of information processing is its phenomenal aspect (PAC): the what-it-is-like for an agent to experience something. The PAC supplies the agent with a sense of self and informs the agent of how its self is affected by the agent's own operations. The PAC originates from the activity that attention performs to detect the state of what I define as "the self" (S). S is centered on, and develops from, a hierarchy of innate and acquired values, and is primarily expressed via the central and peripheral nervous systems; it maps the agent's body and cognitive capacities, and its interactions with the environment. The detection of the state of S by attention modulates the energy level of the organ of attention (OA), i.e., the neural substrate that underpins attention. This modulation generates the PAC. The PAC can be qualified along five dimensions: qualitative, quantitative, hedonic, temporal, and spatial. Each dimension can be traced back to a specific feature of the modulation of the energy level of the OA.
26
De Winne J, Devos P, Leman M, Botteldooren D. With No Attention Specifically Directed to It, Rhythmic Sound Does Not Automatically Facilitate Visual Task Performance. Front Psychol 2022; 13:894366. [PMID: 35756201 PMCID: PMC9226390 DOI: 10.3389/fpsyg.2022.894366] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/11/2022] [Accepted: 05/19/2022] [Indexed: 11/22/2022] Open
Abstract
In a century where humans and machines, whether powered by artificial intelligence or not, increasingly work together, it is of interest to understand human processing of multisensory stimuli in relation to attention and working memory. This paper explores whether and when supporting visual information with rhythmic auditory stimuli can optimize multisensory information processing, which in turn can make interactions between humans, or between machines and humans, more engaging, rewarding, and activating. For this purpose, a novel working memory paradigm was developed in which participants were presented with a series of five target digits randomly interchanged with five distractor digits; their goal was to remember the target digits and recall them orally. Depending on the condition, support was provided by audio and/or rhythm. Sound was expected to improve performance, the effect of sound was expected to differ between rhythmic and non-rhythmic stimuli, and some variability was expected across participants. The data were analyzed both with classical statistics and with predictive models that estimate outcomes from a range of input variables related to the experiment and the participant. The effect of auditory support was confirmed, but no difference was observed between rhythmic and non-rhythmic sounds. Overall performance was affected by individual differences, such as visual dominance or perceived task difficulty. Surprisingly, musical education did not significantly affect performance and even tended toward a negative effect. To better understand the underlying processes of attention, future work should also record brain activity, e.g., by means of electroencephalography (EEG).
Affiliation(s)
- Jorg De Winne
- Department of Information Technology, WAVES, Ghent University, Ghent, Belgium; Department of Art, Music and Theater Studies, Institute for Psychoacoustics and Electronic Music (IPEM), Ghent University, Ghent, Belgium
- Paul Devos
- Department of Information Technology, WAVES, Ghent University, Ghent, Belgium
- Marc Leman
- Department of Art, Music and Theater Studies, Institute for Psychoacoustics and Electronic Music (IPEM), Ghent University, Ghent, Belgium
- Dick Botteldooren
- Department of Information Technology, WAVES, Ghent University, Ghent, Belgium
27
Kachlicka M, Laffere A, Dick F, Tierney A. Slow phase-locked modulations support selective attention to sound. Neuroimage 2022; 252:119024. [PMID: 35231629 PMCID: PMC9133470 DOI: 10.1016/j.neuroimage.2022.119024] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/12/2022] [Revised: 02/16/2022] [Accepted: 02/19/2022] [Indexed: 11/16/2022] Open
Abstract
To make sense of complex soundscapes, listeners must select and attend to task-relevant streams while ignoring uninformative sounds. One possible neural mechanism underlying this process is alignment of endogenous oscillations with the temporal structure of the target sound stream. Such a mechanism has been suggested to mediate attentional modulation of neural phase-locking to the rhythms of attended sounds. However, such modulations are also compatible with an alternative framework in which attention acts as a filter that enhances exogenously driven neural auditory responses. Here we tested several predictions arising from the oscillatory account by playing two tone streams that varied across conditions in tone duration and presentation rate; participants attended to one stream or listened passively. Attentional modulation of the evoked waveform was roughly sinusoidal and scaled with presentation rate, whereas the passive response did not. However, there was only limited evidence for continuation of the modulations through the silence between sequences. These results suggest that attentionally driven changes in phase alignment reflect synchronization of slow endogenous activity with the temporal structure of attended stimuli.
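Attentional modulation of phase-locking of the kind this abstract describes is commonly quantified with the inter-trial phase-locking value: the length of the mean resultant vector of unit phasors across trials. The sketch below is that generic measure on synthetic phases, not this study's actual analysis; trial counts and noise levels are illustrative.

```python
import numpy as np

def phase_locking_value(phases):
    """Inter-trial phase consistency: length of the mean resultant vector
    of unit phasors. `phases` has shape (n_trials, n_times), in radians.
    Returns a value in [0, 1] per time point."""
    return np.abs(np.exp(1j * np.asarray(phases)).mean(axis=0))

rng = np.random.default_rng(1)
n_trials, n_times = 200, 50
# Phases tightly clustered around 0.3 rad across trials -> PLV near 1.
locked = 0.3 + 0.1 * rng.standard_normal((n_trials, n_times))
# Uniform random phases -> PLV near 1/sqrt(n_trials).
scattered = rng.uniform(-np.pi, np.pi, (n_trials, n_times))
```

High PLV by itself cannot distinguish the two frameworks the abstract contrasts, which is why the study probes the silence between sequences, where only an endogenous oscillation should keep going.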
Affiliation(s)
- Magdalena Kachlicka
- Department of Psychological Sciences, Birkbeck, University of London, Malet Street, London WC1E 7HX, England
- Aeron Laffere
- Department of Psychological Sciences, Birkbeck, University of London, Malet Street, London WC1E 7HX, England
- Fred Dick
- Department of Psychological Sciences, Birkbeck, University of London, Malet Street, London WC1E 7HX, England; Division of Psychology & Language Sciences, UCL, Gower Street, London WC1E 6BT, England
- Adam Tierney
- Department of Psychological Sciences, Birkbeck, University of London, Malet Street, London WC1E 7HX, England
28
Liu Q, Ulloa A, Horwitz B. The Spatiotemporal Neural Dynamics of Intersensory Attention Capture of Salient Stimuli: A Large-Scale Auditory-Visual Modeling Study. Front Comput Neurosci 2022; 16:876652. [PMID: 35645750 PMCID: PMC9133449 DOI: 10.3389/fncom.2022.876652] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/15/2022] [Accepted: 04/04/2022] [Indexed: 11/13/2022] Open
Abstract
The spatiotemporal dynamics of the neural mechanisms underlying endogenous (top-down) and exogenous (bottom-up) attention, and how attention is controlled or allocated in intersensory perception, are not fully understood. We investigated these issues using a biologically realistic large-scale neural network model of visual-auditory object processing in short-term memory. We incorporated into this model temporally changing neuronal mechanisms for the control of endogenous and exogenous attention. The model successfully performed various bimodal working memory tasks and produced simulated behavioral and neural results consistent with experimental findings. Simulated fMRI data were generated that constitute predictions testable in human experiments. Furthermore, in our visual-auditory bimodal simulations, we found that increased working memory load in one modality reduced distraction from the other modality, and we propose, based on our model, a possible network mediating this effect.
Affiliation(s)
- Qin Liu
- Brain Imaging and Modeling Section, National Institute on Deafness and Other Communication Disorders, National Institutes of Health, Bethesda, MD, United States
- Department of Physics, University of Maryland, College Park, College Park, MD, United States
- Antonio Ulloa
- Brain Imaging and Modeling Section, National Institute on Deafness and Other Communication Disorders, National Institutes of Health, Bethesda, MD, United States
- Center for Information Technology, National Institutes of Health, Bethesda, MD, United States
- Barry Horwitz
- Brain Imaging and Modeling Section, National Institute on Deafness and Other Communication Disorders, National Institutes of Health, Bethesda, MD, United States
- Correspondence: Barry Horwitz
29
Ferreri L, Versace R, Victor C, Plancher G. Temporal Predictions in Space: Isochronous Rhythms Promote Forward Projections of the Body. Front Psychol 2022; 13:832322. [PMID: 35602686 PMCID: PMC9115380 DOI: 10.3389/fpsyg.2022.832322] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/09/2021] [Accepted: 03/17/2022] [Indexed: 11/18/2022] Open
Abstract
A regular rhythmic stimulation increases people's ability to anticipate future events in time and to move their body in space. Temporal concepts are usually mapped onto spatial locations through a past-behind and future-ahead mapping. In this study, we tested the hypothesis that a regular rhythmic stimulation could promote forward-body (i.e., toward the future) projections in the peri-personal space. In a Visual Approach/Avoidance by the Self Task (VAAST), participants (N = 24) observed a visual scene on the screen (i.e., a music studio with a metronome in the middle). They were exposed to 3 s of auditory isochronous or non-isochronous rhythms, after which they were asked to make a perceptual judgment on the visual scene as quickly as possible (i.e., whether the metronome pendulum was pointing to the right or left). The responses could trigger a forward or backward visual flow, i.e., moving participants toward or away from the scene. Results showed a significant interaction between the rhythmic stimulation and the movement projections (p < 0.001): participants were faster for responses triggering forward-body projections (but not backward-body projections) after exposure to the isochronous (but not the non-isochronous) rhythm. By highlighting the strong link between isochronous rhythms and forward-body projections, these findings support the idea that temporal predictions driven by a regular auditory stimulation are grounded in a perception-action system integrating temporal and spatial information.
Affiliation(s)
- Gaën Plancher
- Laboratoire d’Étude des Mécanismes Cognitifs, Université Lumière Lyon 2, Lyon, France
30
Jones KT, Smith CC, Gazzaley A, Zanto TP. Research outside the laboratory: Longitudinal at-home neurostimulation. Behav Brain Res 2022; 428:113894. [DOI: 10.1016/j.bbr.2022.113894] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/30/2021] [Revised: 03/14/2022] [Accepted: 04/11/2022] [Indexed: 11/02/2022]
31
Kliger Amrani A, Zion Golumbic E. Memory-Paced Tapping to Auditory Rhythms: Effects of Rate, Speech, and Motor Engagement. J Speech Lang Hear Res 2022; 65:923-939. [PMID: 35133867 DOI: 10.1044/2021_jslhr-21-00406] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/14/2023]
Abstract
PURPOSE Humans have a near-automatic tendency to entrain their motor actions to rhythms in the environment. Entrainment has been hypothesized to play an important role in processing naturalistic stimuli, such as speech and music, which have intrinsically rhythmic properties. Here, we studied two facets of entraining one's rhythmic motor actions to an external stimulus: (a) synchronized finger tapping to auditory rhythmic stimuli and (b) memory-paced reproduction of a previously heard rhythm. METHOD Using modifications of the Synchronization-Continuation tapping paradigm, we studied how these two rhythmic behaviors were affected by different stimulus and task features. We tested synchronization and memory-paced tapping for a broad range of rates, with stimulus onset asynchronies ranging from subsecond to suprasecond, both for strictly isochronous tone sequences and for rhythmic speech stimuli (counting from 1 to 10), which are more ecological yet less isochronous. We also asked what role motor engagement plays in forming a stable internal representation of rhythms and guiding memory-paced tapping. RESULTS AND CONCLUSIONS Our results show that individuals can flexibly synchronize their motor actions to a very broad range of rhythms. However, this flexibility does not extend to memory-paced tapping, which is accurate only in a narrower range of rates, around ~1.5 Hz. This pattern suggests that intrinsic rhythmic defaults in the auditory and/or motor system influence the internal representation of rhythms in the absence of an external pacemaker. Interestingly, memory-paced tapping for speech rhythms and simple tone sequences shared similar "optimal rates," although with reduced accuracy, suggesting that internal constraints on rhythmic entrainment generalize to more ecological stimuli. Last, we found that actively synchronizing to tones, versus passively listening to them, led to more accurate memory-paced tapping performance, which emphasizes the importance of action-perception interactions in forming stable entrainment to external rhythms.
Affiliation(s)
- Anat Kliger Amrani
- The Leslie and Susan Gonda Multidisciplinary Brain Research Center, Bar-Ilan University, Ramat Gan, Israel
- Elana Zion Golumbic
- The Leslie and Susan Gonda Multidisciplinary Brain Research Center, Bar-Ilan University, Ramat Gan, Israel
32
Herbst SK, Stefanics G, Obleser J. Endogenous modulation of delta phase by expectation–A replication of Stefanics et al., 2010. Cortex 2022; 149:226-245. [DOI: 10.1016/j.cortex.2022.02.001] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/25/2021] [Revised: 01/31/2022] [Accepted: 02/01/2022] [Indexed: 11/03/2022]
33
Westerberg JA, Sigworth EA, Schall JD, Maier A. Pop-out search instigates beta-gated feature selectivity enhancement across V4 layers. Proc Natl Acad Sci U S A 2021; 118:e2103702118. [PMID: 34893538 PMCID: PMC8685673 DOI: 10.1073/pnas.2103702118] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 10/21/2021] [Indexed: 11/18/2022] Open
Abstract
Visual search is a workhorse for investigating how attention interacts with processing of sensory information. Attentional selection has been linked to altered cortical sensory responses and feature preferences (i.e., tuning). However, attentional modulation of feature selectivity during search is largely unexplored. Here we map the spatiotemporal profile of feature selectivity during singleton search. Monkeys performed a search where a pop-out feature determined the target of attention. We recorded laminar neural responses from visual area V4. We first identified "feature columns" which showed preference for individual colors. In the unattended condition, feature columns were significantly more selective in superficial relative to middle and deep layers. Attending a stimulus increased selectivity in all layers but not equally. Feature selectivity increased most in the deep layers, leading to higher selectivity in extragranular layers as compared to the middle layer. This attention-induced enhancement was rhythmically gated in phase with the beta-band local field potential. Beta power dominated both extragranular laminar compartments, but current source density analysis pointed to an origin in superficial layers, specifically. While beta-band power was present regardless of attentional state, feature selectivity was only gated by beta in the attended condition. Neither the beta oscillation nor its gating of feature selectivity varied with microsaccade production. Importantly, beta modulation of neural activity predicted response times, suggesting a direct link between attentional gating and behavioral output. Together, these findings suggest beta-range synaptic activation in V4's superficial layers rhythmically gates attentional enhancement of feature tuning in a way that affects the speed of attentional selection.
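The current source density analysis this abstract relies on to localize the beta generator is conventionally estimated as the negative second spatial derivative of the laminar LFP across equally spaced contacts. The sketch below illustrates that standard estimator on synthetic data; the authors' analysis may use a more elaborate CSD method, and the contact count and spacing here are made up.

```python
import numpy as np

def csd_second_derivative(lfp, spacing_mm):
    """Standard CSD estimate: negative discrete second spatial derivative
    of the LFP across equally spaced laminar contacts.
    lfp: (n_channels, n_times) -> (n_channels - 2, n_times)."""
    return -np.diff(lfp, n=2, axis=0) / spacing_mm**2

# A purely linear voltage gradient across depth contains no sources or
# sinks, so its CSD is zero everywhere:
depth_profile = np.linspace(0.0, 1.0, 8)          # 8 laminar contacts
lfp = np.outer(depth_profile, np.ones(100))       # constant in time
flat = csd_second_derivative(lfp, spacing_mm=0.1)
```

Because the second derivative cancels any spatially linear component, CSD separates local transmembrane currents from volume-conducted signals, which is what lets a band's power dominate one compartment while its generator sits in another.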
Affiliation(s)
- Jacob A Westerberg
- Department of Psychology, Vanderbilt Brain Institute, Vanderbilt Vision Research Center, Vanderbilt University, Nashville, TN 37240
- Jeffrey D Schall
- Centre for Vision Research, Vision: Science to Applications Program, Department of Biology and Department of Psychology, York University, Toronto, ON M3J 1P3, Canada
- Alexander Maier
- Department of Psychology, Vanderbilt Brain Institute, Vanderbilt Vision Research Center, Vanderbilt University, Nashville, TN 37240
34
Pei L, Longcamp M, Leung FKS, Ouyang G. Temporally resolved neural dynamics underlying handwriting. Neuroimage 2021; 244:118578. [PMID: 34534659 DOI: 10.1016/j.neuroimage.2021.118578] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/25/2021] [Revised: 08/31/2021] [Accepted: 09/14/2021] [Indexed: 01/22/2023] Open
Abstract
How do the temporal dynamics of neural activity encode highly coordinated visual-motor behaviour? To capture the millisecond-resolved neural activations associated with fine visual-motor skills, we devised a co-registration system to simultaneously record electroencephalogram and handwriting kinematics while participants performed four handwriting tasks (writing in Chinese/English scripts with their dominant/non-dominant hand). The neural activation associated with each stroke was clearly identified, with a well-structured and reliable pattern. The functional significance of this pattern was validated by its significant associations with language, hand, and the cognitive stages and kinematics of handwriting. Furthermore, handwriting rhythmicity was found to be synchronised to the brain's ongoing theta oscillation, and this synchronisation varied with language and hand. These findings suggest a link between motor skill formation and the interplay between rhythms in the brain and in the peripheral motor system.
Affiliation(s)
- Leisi Pei
- Faculty of Education, The University of Hong Kong, Hong Kong, China
- Guang Ouyang
- Faculty of Education, The University of Hong Kong, Hong Kong, China
35
De Kock R, Gladhill KA, Ali MN, Joiner WM, Wiener M. How movements shape the perception of time. Trends Cogn Sci 2021; 25:950-963. [PMID: 34531138 PMCID: PMC9991018 DOI: 10.1016/j.tics.2021.08.002] [Citation(s) in RCA: 20] [Impact Index Per Article: 6.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/22/2021] [Revised: 08/07/2021] [Accepted: 08/09/2021] [Indexed: 11/16/2022]
Abstract
In order to keep up with a changing environment, mobile organisms must be capable of deciding both where and when to move. This precision necessitates a strong sense of time, as otherwise we would fail in many of our movement goals. Yet, despite this intrinsic link, only recently have researchers begun to understand how these two features interact. Primarily, two effects have been observed: movements can bias time estimates, but they can also make them more precise. Here we review this literature and propose that both effects can be explained by a Bayesian cue combination framework, in which movement itself affords the most precise representation of time, which can influence perception in either feedforward or active sensing modes.
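For two Gaussian cues, the Bayesian cue-combination framework the authors invoke reduces to inverse-variance weighting: the fused estimate weights each cue by its reliability, and its variance is lower than either cue's alone. A worked sketch with hypothetical numbers (the specific means and variances below are illustrative, not from the review):

```python
def combine_cues(mu_a, var_a, mu_b, var_b):
    """Reliability-weighted fusion of two Gaussian estimates: each cue is
    weighted by its inverse variance, and the fused variance is smaller
    than either individual cue's variance."""
    w_a = (1.0 / var_a) / (1.0 / var_a + 1.0 / var_b)
    mu = w_a * mu_a + (1.0 - w_a) * mu_b
    var = 1.0 / (1.0 / var_a + 1.0 / var_b)
    return mu, var

# Hypothetical numbers: a precise movement-based duration estimate (0.9 s,
# var 0.01) dominates a noisier perceptual one (1.2 s, var 0.09).
mu, var = combine_cues(0.9, 0.01, 1.2, 0.09)  # mu = 0.93, var = 0.009
```

This single formula captures both effects the review describes: the fused estimate is biased toward the movement cue (movements bias time estimates) and has reduced variance (movements make them more precise).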
36
The influence of auditory rhythms on the speed of inferred motion. Atten Percept Psychophys 2021; 84:2360-2383. [PMID: 34435321 DOI: 10.3758/s13414-021-02364-4] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 07/26/2021] [Indexed: 12/24/2022]
Abstract
The present research explored the influence of isochronous auditory rhythms on the timing of movement-related prediction in two experiments. In both experiments, participants observed a moving disc that was visible for a predetermined period before disappearing behind a small, medium, or large occluded area for the remainder of its movement. In Experiment 1, the disc was visible for 1 s. During this period, participants were exposed to either a fast or slow auditory rhythm, or they heard nothing. They were instructed to press a key to indicate when they believed the moving disc had reached a specified location on the other side of the occluded area. The procedure measured the (signed) error in participants' estimate of the time it would take for a moving object to contact a stationary one. The principal results of Experiment 1 were main effects of the rate of the auditory rhythm and of the size of the occlusion on participants' judgments. In Experiment 2, the period of visibility was varied with size of the occlusion area to keep the total movement time constant for all three levels of occlusion. The results replicated the main effect of rhythm found in Experiment 1 and showed a small, significant interaction, but indicated no main effect of occlusion size. Overall, the results indicate that exposure to fast isochronous auditory rhythms during an interval of inferred motion can influence the imagined rate of such motion and suggest a possible role of an internal rhythmicity in the maintenance of temporally accurate dynamic mental representations.
37
Bauer AKR, van Ede F, Quinn AJ, Nobre AC. Rhythmic Modulation of Visual Perception by Continuous Rhythmic Auditory Stimulation. J Neurosci 2021; 41:7065-7075. [PMID: 34261698 PMCID: PMC8372019 DOI: 10.1523/jneurosci.2980-20.2021] [Citation(s) in RCA: 8] [Impact Index Per Article: 2.7] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/25/2020] [Revised: 04/16/2021] [Accepted: 05/29/2021] [Indexed: 11/21/2022] Open
Abstract
At any given moment our sensory systems receive multiple, often rhythmic, inputs from the environment. Processing of temporally structured events in one sensory modality has been proposed to guide both behavioral and neural processing of events in other sensory modalities, but whether such cross-modal guidance occurs remains unclear. Here, we used human electroencephalography (EEG) to test the cross-modal influences of a continuous auditory frequency-modulated (FM) sound on visual perception and visual cortical activity. We report systematic fluctuations in perceptual discrimination of brief visual stimuli in line with the phase of the FM-sound. We further show that this rhythmic modulation in visual perception is related to an accompanying rhythmic modulation of neural activity recorded over visual areas. Importantly, in our task, perceptual and neural visual modulations occurred without any abrupt and salient onsets in the energy of the auditory stimulation and without any rhythmic structure in the visual stimulus. As such, the results provide a critical validation for the existence and functional role of cross-modal entrainment and demonstrate its utility for organizing the perception of multisensory stimulation in the natural environment. SIGNIFICANCE STATEMENT Our sensory environment is filled with rhythmic structures that are often multisensory in nature. Here, we show that the alignment of neural activity to the phase of an auditory frequency-modulated (FM) sound has cross-modal consequences for vision, yielding systematic fluctuations in perceptual discrimination of brief visual stimuli that are mediated by an accompanying rhythmic modulation of neural activity recorded over visual areas. These cross-modal effects on visual neural activity and perception occurred without any abrupt and salient onsets in the energy of the auditory stimulation and without any rhythmic structure in the visual stimulus. The current work shows that continuous auditory fluctuations in the natural environment can provide a pacing signal for neural activity and perception across the senses.
Affiliation(s)
- Anna-Katharina R Bauer
- Department of Experimental Psychology, University of Oxford, Oxford OX2 6GG, United Kingdom
- Oxford Centre for Human Brain Activity, Wellcome Centre for Integrative Neuroimaging, Department of Psychiatry, University of Oxford, Oxford OX3 7JX, United Kingdom
- Freek van Ede
- Oxford Centre for Human Brain Activity, Wellcome Centre for Integrative Neuroimaging, Department of Psychiatry, University of Oxford, Oxford OX3 7JX, United Kingdom
- Institute for Brain and Behavior Amsterdam, Department of Experimental and Applied Psychology, Vrije Universiteit Amsterdam, Amsterdam 1081BT, The Netherlands
- Andrew J Quinn
- Department of Experimental Psychology, University of Oxford, Oxford OX2 6GG, United Kingdom
- Oxford Centre for Human Brain Activity, Wellcome Centre for Integrative Neuroimaging, Department of Psychiatry, University of Oxford, Oxford OX3 7JX, United Kingdom
- Anna C Nobre
- Department of Experimental Psychology, University of Oxford, Oxford OX2 6GG, United Kingdom
- Oxford Centre for Human Brain Activity, Wellcome Centre for Integrative Neuroimaging, Department of Psychiatry, University of Oxford, Oxford OX3 7JX, United Kingdom
38
Sensory attenuation is modulated by the contrasting effects of predictability and control. Neuroimage 2021; 237:118103. [PMID: 33957233 DOI: 10.1016/j.neuroimage.2021.118103] [Citation(s) in RCA: 6] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/09/2020] [Revised: 03/18/2021] [Accepted: 04/23/2021] [Indexed: 11/22/2022] Open
Abstract
Self-generated stimuli have been found to elicit a reduced sensory response compared with externally-generated stimuli. However, much of the literature has not adequately controlled for differences in the temporal predictability and temporal control of stimuli. In two experiments, we compared the N1 (and P2) components of the auditory-evoked potential to self- and externally-generated tones that differed with respect to these two factors. In Experiment 1 (n = 42), we found that increasing temporal predictability reduced N1 amplitude in a manner that may often account for the observed reduction in sensory response to self-generated sounds. We also observed that reducing temporal control over the tones resulted in a reduction in N1 amplitude. The contrasting effects of temporal predictability and temporal control on N1 amplitude meant that sensory attenuation prevailed when controlling for each. Experiment 2 (n = 38) explored the potential effect of selective attention on the results of Experiment 1 by modifying task requirements such that similar levels of attention were allocated to the visual stimuli across conditions. The results of Experiment 2 replicated those of Experiment 1, and suggested that the observed effects of temporal control and sensory attenuation were not driven by differences in attention. Given that self- and externally-generated sensations commonly differ with respect to both temporal predictability and temporal control, findings of the present study may necessitate a re-evaluation of the experimental paradigms used to study sensory attenuation.
Collapse
39
Bistable perception alternates between internal and external modes of sensory processing. iScience 2021; 24:102234. [PMID: 33748716 PMCID: PMC7967014 DOI: 10.1016/j.isci.2021.102234] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/28/2020] [Revised: 01/20/2021] [Accepted: 02/21/2021] [Indexed: 12/28/2022] Open
Abstract
Perceptual history can exert pronounced effects on the contents of conscious experience: when confronted with completely ambiguous stimuli, perception does not waver at random between diverging stimulus interpretations but sticks with recent percepts for prolonged intervals. Here, we investigated the relevance of perceptual history in situations more similar to everyday experience, where sensory stimuli are usually not completely ambiguous. Using partially ambiguous visual stimuli, we found that the balance between past and present is not stable over time but slowly fluctuates between two opposing modes. For time periods of up to several minutes, perception was either largely determined by perceptual history or driven predominantly by disambiguating sensory evidence. Computational modeling suggested that the construction of unambiguous conscious experiences is modulated by slow fluctuations between internally and externally oriented modes of sensory processing.
Collapse
40
Richards VM, Tisby MK, Suzuki-Gill EN, Shen Y. Sub-optimal construction of an auditory profile from temporally distributed spectral information. J Acoust Soc Am 2021; 149:1567. [PMID: 33765831 PMCID: PMC7943247 DOI: 10.1121/10.0003646] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 11/13/2020] [Revised: 02/12/2021] [Accepted: 02/16/2021] [Indexed: 06/12/2023]
Abstract
When spectral components of a complex sound are presented not simultaneously but distributed over time, human listeners can still, to a degree, perceptually recover the spectral profile of the sound. This capability of integrating spectral information over time was investigated using a cued informational masking paradigm. Listeners detected a 1-kHz pure tone in a simultaneous masker composed of six random-frequency tones drawn anew on every trial. The spectral profile of the masker was cued using a precursor sound that consisted of a sequence of 50-ms bursts separated by inter-burst intervals of 100 ms. Each burst in the precursor consisted of pure tones at the masker frequencies, with a tone appearing at each frequency with a given presentation probability. As the presentation probability increased across conditions, the detectability of the target improved, indicating that the precursor reliably cued the spectral content of the masker. For many listeners, performance did not significantly improve as the number of precursor bursts increased from 2 to 16, indicating inefficient integration of information beyond 2 bursts. Additional analyses suggest that when the intensity of the bursts is relatively constant, the contribution of the precursor is dominated by information in the initial burst.
Collapse
Affiliation(s)
- Virginia M Richards
- Department of Cognitive Sciences, University of California, Irvine, California 92687, USA
- Mariel Kazuko Tisby
- Department of Cognitive Sciences, University of California, Irvine, California 92687, USA
- Eli N Suzuki-Gill
- Department of Cognitive Sciences, University of California, Irvine, California 92687, USA
- Yi Shen
- Department of Speech and Hearing Sciences, University of Washington, Seattle, Washington 98105, USA
Collapse
41
Assaneo MF, Rimmele JM, Sanz Perl Y, Poeppel D. Speaking rhythmically can shape hearing. Nat Hum Behav 2021; 5:71-82. [PMID: 33046860 DOI: 10.1038/s41562-020-00962-0] [Citation(s) in RCA: 29] [Impact Index Per Article: 9.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/03/2020] [Accepted: 09/09/2020] [Indexed: 01/28/2023]
Abstract
Evidence suggests that temporal predictions arising from the motor system can enhance auditory perception. However, in speech perception, we lack evidence of perception being modulated by production. Here we show a behavioural protocol that captures the existence of such auditory-motor interactions. Participants performed a syllable discrimination task immediately after producing periodic syllable sequences. Two speech rates were explored: a 'natural' (individually preferred) and a fixed 'non-natural' (2 Hz) rate. Using a decoding approach, we show that perceptual performance is modulated by the stimulus phase determined by a participant's own motor rhythm. Remarkably, for 'natural' and 'non-natural' rates, this finding is restricted to a subgroup of the population with quantifiable auditory-motor coupling. The observed pattern is compatible with a neural model assuming a bidirectional interaction of auditory and speech motor cortices. Crucially, the model matches the experimental results only if it incorporates individual differences in the strength of the auditory-motor connection.
Collapse
Affiliation(s)
- M Florencia Assaneo
- Department of Psychology, New York University, New York, NY, USA; Instituto de Neurobiología, Universidad Nacional Autónoma de México, Santiago de Querétaro, Mexico
- Johanna M Rimmele
- Department of Neuroscience, Max-Planck-Institute for Empirical Aesthetics, Frankfurt am Main, Germany
- Yonatan Sanz Perl
- Department of Physics, FCEyN, University of Buenos Aires, Buenos Aires, Argentina; National Scientific and Technical Research Council (CONICET), Buenos Aires, Argentina; University of San Andrés, Buenos Aires, Argentina
- David Poeppel
- Department of Psychology, New York University, New York, NY, USA; Department of Neuroscience, Max-Planck-Institute for Empirical Aesthetics, Frankfurt am Main, Germany
Collapse
42
Terashima H, Kihara K, Kawahara JI, Kondo HM. Common principles underlie the fluctuation of auditory and visual sustained attention. Q J Exp Psychol (Hove) 2020; 74:705-715. [PMID: 33103992 PMCID: PMC8044612 DOI: 10.1177/1747021820972255] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/16/2022]
Abstract
Sustained attention plays an important role in adaptive behaviours in everyday activities. As previous studies have mostly focused on vision, and attentional resources have been thought to be specific to sensory modalities, it is still unclear how mechanisms of attentional fluctuations overlap between visual and auditory modalities. To reduce the effects of sudden stimulus onsets, we developed a new gradual-onset continuous performance task (gradCPT) in the auditory domain and compared dynamic fluctuation of sustained attention in vision and audition. In the auditory gradCPT, participants were instructed to listen to a stream of narrations and judge the gender of each narration. In the visual gradCPT, they were asked to observe a stream of scenery images and indicate whether the scene was a city or mountain. Our within-individual comparison revealed that auditory and visual attention are similar in terms of the false alarm rate and dynamic properties including fluctuation frequency. Absolute timescales of the fluctuation in the two modalities were comparable, notwithstanding the difference in stimulus onset asynchrony. The results suggest that fluctuations of visual and auditory attention are underpinned by common principles and support models with a more central, modality-general controller.
Collapse
Affiliation(s)
- Hiroki Terashima
- NTT Communication Science Laboratories, Nippon Telegraph and Telephone, Atsugi, Japan
- Ken Kihara
- Department of Information Technology and Human Factors, National Institute of Advanced Industrial Science and Technology (AIST), Tsukuba, Japan
- Jun I Kawahara
- Department of Psychology, Hokkaido University, Sapporo, Japan
- Hirohito M Kondo
- NTT Communication Science Laboratories, Nippon Telegraph and Telephone, Atsugi, Japan; School of Psychology, Chukyo University, Nagoya, Japan
Collapse
43
Visual aperiodic temporal prediction increases perceptual sensitivity and reduces response latencies. Acta Psychol (Amst) 2020; 209:103129. [PMID: 32619784 DOI: 10.1016/j.actpsy.2020.103129] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/07/2019] [Revised: 06/17/2020] [Accepted: 06/22/2020] [Indexed: 11/23/2022] Open
Abstract
As a predictive organ, the brain anticipates upcoming events to guide perception and action in the process of adaptive behavior. Classical models of oscillatory entrainment explain the behavioral facilitation that follows periodic stimulation but cannot account for facilitation by aperiodic stimulation. In the present study, by comparing participants' performance on periodic predictable (PP), aperiodic predictable (AP), and aperiodic unpredictable (AU) stimulus streams, we investigated the effect of an aperiodic predictable stream on perceptual sensitivity and response latencies in the visual modality. The results showed no difference between the PP and AP conditions in sensitivity (d') or reaction times (RTs), both of which differed significantly from those in the AU condition. Moreover, a significant correlation between d' and RTs was observed when predictability was present. These results indicate that aperiodic predictable stimulus streams increase perceptual sensitivity and reduce response latencies in a top-down manner: individuals proactively and flexibly predict upcoming events based on the temporal structure of visual stimuli in the service of adaptive behavior.
Collapse