1. Buaron B, Reznik D, Mukamel R. High or low expectations: Expected intensity of action outcome is embedded in action kinetics. Cognition 2024; 251:105887. [PMID: 39018636] [DOI: 10.1016/j.cognition.2024.105887]
Abstract
Goal-directed actions are performed in order to attain certain sensory consequences in the world. However, expected attributes of these consequences can affect the kinetics of the action. In a set of three studies (n = 120), we examined how expected attributes of stimulus outcome (intensity) shape the kinetics of the triggering action (applied force), even when the action kinetic and attribute are independent. We show that during action execution (button presses), the expected intensity of sensory outcome affects the applied force of the stimulus-producing action in an inverse fashion. Thus, participants applied more force when the expected intensity of the outcome was low (vs. high intensity outcome). In the absence of expectations or when actions were performed in response to the sensory event, no intensity-dependent force modulations were found. Thus, expectations of stimulus intensity and causality play an important role in shaping action kinetics. Finally, we examined the relationship between kinetics and perception and found no influence of applied force level on perceptual detection of low intensity (near-threshold) outcome stimuli, suggesting no causal link between the two. Taken together, our results demonstrate that action kinetics are embedded with high-level context such as the expectation of consequence intensity and the causal relationship with environmental cues.
Affiliation(s)
- Batel Buaron
- Sagol School of Neuroscience and School of Psychological Sciences, Tel-Aviv University, Israel
- Daniel Reznik
- Department of Psychology, Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
- Roy Mukamel
- Sagol School of Neuroscience and School of Psychological Sciences, Tel-Aviv University, Israel.
2. Tast V, Schröger E, Widmann A. Suppression and omission effects in auditory predictive processing-Two of the same? Eur J Neurosci 2024; 60:4049-4062. [PMID: 38764129] [DOI: 10.1111/ejn.16393]
Abstract
Recent theories describe perception as an inferential process based on internal predictive models that are adjusted by prediction violations (prediction error). Two different modulations of the auditory N1 event-related brain potential component are often discussed as an expression of auditory predictive processing. The sound-related N1 component is attenuated for self-generated sounds compared to the N1 elicited by externally generated sounds (N1 suppression). An omission-related component in the N1 time-range is elicited when the self-generated sounds are occasionally omitted (omission N1). Both phenomena were explained by action-related forward modelling, which takes place when the sensory input is predictable: prediction error signals are reduced when predicted sensory input is presented (N1 suppression) and elicited when predicted sensory input is omitted (omission N1). This common theoretical account is appealing but has not yet been directly tested. We manipulated the predictability of a sound in a self-generation paradigm in which, in two conditions, either 80% or 50% of the button presses did generate a sound, inducing a strong or a weak expectation for the occurrence of the sound. Consistent with the forward modelling account, an omission N1 was observed in the 80% but not in the 50% condition. However, N1 suppression was highly similar in both conditions. Thus, our results demonstrate a clear effect of predictability for the omission N1 but not for the N1 suppression. These results imply that the two phenomena rely (at least in part) on different mechanisms and challenge prediction related accounts of N1 suppression.
Affiliation(s)
- Valentina Tast
- Wilhelm Wundt Institute for Psychology, Leipzig University, Leipzig, Germany
- Erich Schröger
- Wilhelm Wundt Institute for Psychology, Leipzig University, Leipzig, Germany
- Andreas Widmann
- Wilhelm Wundt Institute for Psychology, Leipzig University, Leipzig, Germany
3. Gu J, Buidze T, Zhao K, Gläscher J, Fu X. The neural network of sensory attenuation: A neuroimaging meta-analysis. Psychon Bull Rev 2024. [PMID: 38954157] [DOI: 10.3758/s13423-024-02532-1]
Abstract
Sensory attenuation refers to the reduction in sensory intensity resulting from self-initiated actions compared to stimuli initiated externally. A classic example is scratching oneself without feeling itchy. This phenomenon extends across various sensory modalities, including visual, auditory, somatosensory, and nociceptive stimuli. The internal forward model proposes that during voluntary actions, an efferent copy of the action command is sent out to predict sensory feedback. This predicted sensory feedback is then compared with the actual sensory feedback, leading to the suppression or reduction of sensory stimuli originating from self-initiated actions. To further elucidate the neural mechanisms underlying sensory attenuation effect, we conducted an extensive meta-analysis of functional magnetic resonance imaging (fMRI) and positron emission tomography (PET) studies. Utilizing activation likelihood estimation (ALE) analysis, our results revealed significant activations in a prominent cluster encompassing the right superior temporal gyrus (rSTG), right middle temporal gyrus (rMTG), and right insula when comparing external-generated with self-generated conditions. Additionally, significant activation was observed in the right anterior cerebellum when comparing self-generated to external-generated conditions. Further analysis using meta-analytic connectivity modeling (MACM) unveiled distinct brain networks co-activated with the rMTG and right cerebellum, respectively. Based on these findings, we propose that sensory attenuation arises from the suppression of reflexive inputs elicited by self-initiated actions through the internal forward modeling of a cerebellum-centered action prediction network, enabling the "sensory conflict detection" regions to effectively discriminate between inputs resulting from self-induced actions and those originating externally.
Affiliation(s)
- Jingjin Gu
- State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, 100101, China
- Department of Psychology, University of the Chinese Academy of Sciences, Beijing, 100049, China
- Tatia Buidze
- Institute for Systems Neuroscience, University Medical Center Hamburg-Eppendorf, Hamburg, 20246, Germany
- Ke Zhao
- State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, 100101, China.
- Department of Psychology, University of the Chinese Academy of Sciences, Beijing, 100049, China.
- Jan Gläscher
- Institute for Systems Neuroscience, University Medical Center Hamburg-Eppendorf, Hamburg, 20246, Germany
- Xiaolan Fu
- State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, 100101, China
- Department of Psychology, University of the Chinese Academy of Sciences, Beijing, 100049, China
4. Balla VR, Kilencz T, Szalóki S, Dalos VD, Partanen E, Csifcsák G. Motor dominance and movement-outcome congruency influence the electrophysiological correlates of sensory attenuation for self-induced visual stimuli. Int J Psychophysiol 2024; 200:112344. [PMID: 38614439] [DOI: 10.1016/j.ijpsycho.2024.112344]
Abstract
This study explores the impact of movement-outcome congruency and motor dominance on the action-associated modulations of early visual event-related potentials (ERPs). Employing the contingent paradigm, participants with varying degrees of motor dominance were exposed to stimuli depicting left or right human hands in the corresponding visual hemifields. Stimuli were either passively observed or evoked by voluntary button-presses with the dominant or non-dominant hand, in a manner that was either congruent or incongruent with stimulus laterality and hemifield. Early occipital responses (C1 and P1 components) revealed modulations consistent with sensory attenuation (SA) for self-evoked stimuli. Our findings suggest that sensory attenuation during the initial stages of visual processing (C1 component) is a general phenomenon across all degrees of handedness and stimulus/movement combinations. However, the magnitude of C1 suppression was modulated by handedness and movement-stimulus congruency, reflecting stronger SA in right-handed participants for stimuli depicting the right hand, when elicited by actions of the corresponding hand, and measured above the contralateral occipital lobe. P1 modulation suggested concurrent but opposing influences of attention and sensory prediction, with more pronounced suppression following stimulus-congruent button-presses over the hemisphere contralateral to movement, especially in left-handed individuals. We suggest that effects of motor dominance on the degree of SA may stem from functional/anatomical asymmetries in the processing of body parts (C1) and attention networks (P1). Overall, our results demonstrate the modulating effect of hand dominance and movement-outcome congruency on SA, underscoring the need for deeper exploration of their interplay. Additional empirical evidence in this direction could substantiate a premotor account for action-associated modulation of early sensory processing in the visual domain.
Affiliation(s)
- Viktória Roxána Balla
- Cognitive Brain Research Unit, Department of Psychology and Logopedics, Faculty of Medicine, University of Helsinki, Finland.
- Tünde Kilencz
- Department of Psychiatry and Psychotherapy, Faculty of Medicine, Semmelweis University, Budapest, Hungary
- Szilvia Szalóki
- Department of Cognitive and Neuropsychology, Institute of Psychology, Faculty of Humanities and Social Sciences, University of Szeged, Hungary
- Vera Daniella Dalos
- Doctoral School of Interdisciplinary Medicine, Faculty of Medicine, University of Szeged, Hungary
- Eino Partanen
- Cognitive Brain Research Unit, Department of Psychology and Logopedics, Faculty of Medicine, University of Helsinki, Finland
- Gábor Csifcsák
- Department of Psychology, Faculty of Health Sciences, UiT The Arctic University of Norway, Tromsø, Norway
5. Harduf A, Panishev G, Harel EV, Stern Y, Salomon R. The bodily self from psychosis to psychedelics. Sci Rep 2023; 13:21209. [PMID: 38040825] [PMCID: PMC10692325] [DOI: 10.1038/s41598-023-47600-z]
Abstract
The sense of self is a foundational element of neurotypical human consciousness. We normally experience the world as embodied agents, with the unified sensation of our selfhood being nested in our body. Critically, the sense of self can be altered in psychiatric conditions such as psychosis and altered states of consciousness induced by psychedelic compounds. The similarity of phenomenological effects across psychosis and psychedelic experiences has given rise to the "psychotomimetic" theory suggesting that psychedelics simulate psychosis-like states. Moreover, psychedelic-induced changes in the sense of self have been related to reported improvements in mental health. Here we investigated the bodily self in psychedelic, psychiatric, and control populations. Using the Moving Rubber Hand Illusion, we tested (N = 75) patients with psychosis, participants with a history of substantial psychedelic experiences, and control participants to see how psychedelic and psychiatric experience impacts the bodily self. Results revealed that psychosis patients had reduced Body Ownership and Sense of Agency during volitional action. The psychedelic group reported subjective long-lasting changes to the sense of self, but no differences between control and psychedelic participants were found. Our results suggest that while psychedelics induce both acute and enduring subjective changes in the sense of self, these are not manifested at the level of the bodily self. Furthermore, our data show that bodily self-processing, related to volitional action, is disrupted in psychosis patients. We discuss these findings in relation to anomalous self-processing across psychedelic and psychotic experiences.
Affiliation(s)
- Amir Harduf
- The Multidisciplinary Brain Research Center, Bar-Ilan University, 5290002, Ramat-Gan, Israel
- The Faculty of Life Sciences, Bar-Ilan University, 5290002, Ramat-Gan, Israel
- Gabriella Panishev
- The Multidisciplinary Brain Research Center, Bar-Ilan University, 5290002, Ramat-Gan, Israel
- Eiran V Harel
- Beer Yaakov-Ness Ziona Mental Health Center, Beer Yaakov, Israel
- Yonatan Stern
- The Multidisciplinary Brain Research Center, Bar-Ilan University, 5290002, Ramat-Gan, Israel
- Department of Cognitive Sciences, University of Haifa, 3498838, Haifa, Israel
- Roy Salomon
- Department of Cognitive Sciences, University of Haifa, 3498838, Haifa, Israel.
6. Dery H, Buaron B, Mazinter R, Lavi S, Mukamel R. Playing with your ears: Audio-motor skill learning is sensitive to the lateral relationship between trained hand and ear. iScience 2023; 26:107720. [PMID: 37674982] [PMCID: PMC10477063] [DOI: 10.1016/j.isci.2023.107720]
Abstract
A salient feature of motor and sensory circuits in the brain is their contralateral hemispheric bias-a feature that might play a role in integration and learning of sensorimotor skills. In the current behavioral study, we examined whether the lateral configuration between sound-producing hand and feedback-receiving ear affects performance and learning of an audio-motor skill. Right-handed participants (n = 117) trained to play a piano sequence using their right or left hand while auditory feedback was presented monaurally, either to the right or left ear. Participants receiving auditory feedback to the contralateral ear during training performed better than participants receiving ipsilateral feedback (with respect to the training hand). Furthermore, in the Left-Hand training groups, the contralateral training advantage persisted in a generalization task. Our results demonstrate that audio-motor learning is sensitive to the lateral configuration between motor and sensory circuits and suggest that integration of neural activity across hemispheres facilitates such learning.
Affiliation(s)
- Hadar Dery
- School of Psychological Sciences, Tel Aviv University, Tel Aviv 6997801, Israel
- Batel Buaron
- School of Psychological Sciences, Tel Aviv University, Tel Aviv 6997801, Israel
- Sagol School of Neuroscience, Tel Aviv University, Tel Aviv 6997801, Israel
- Roni Mazinter
- School of Psychological Sciences, Tel Aviv University, Tel Aviv 6997801, Israel
- Shalev Lavi
- School of Psychological Sciences, Tel Aviv University, Tel Aviv 6997801, Israel
- Roy Mukamel
- School of Psychological Sciences, Tel Aviv University, Tel Aviv 6997801, Israel
- Sagol School of Neuroscience, Tel Aviv University, Tel Aviv 6997801, Israel
7. Vivaldo CA, Lee J, Shorkey M, Keerthy A, Rothschild G. Auditory cortex ensembles jointly encode sound and locomotion speed to support sound perception during movement. PLoS Biol 2023; 21:e3002277. [PMID: 37651461] [PMCID: PMC10499203] [DOI: 10.1371/journal.pbio.3002277]
Abstract
The ability to process and act upon incoming sounds during locomotion is critical for survival and adaptive behavior. Despite the established role that the auditory cortex (AC) plays in behavior- and context-dependent sound processing, previous studies have found that auditory cortical activity is on average suppressed during locomotion as compared to immobility. While suppression of auditory cortical responses to self-generated sounds results from corollary discharge, which weakens responses to predictable sounds, the functional role of weaker responses to unpredictable external sounds during locomotion remains unclear. In particular, whether suppression of external sound-evoked responses during locomotion reflects reduced involvement of the AC in sound processing or whether it results from masking by an alternative neural computation in this state remains unresolved. Here, we tested the hypothesis that rather than simple inhibition, reduced sound-evoked responses during locomotion reflect a tradeoff with the emergence of explicit and reliable coding of locomotion velocity. To test this hypothesis, we first used neural inactivation in behaving mice and found that the AC plays a critical role in sound-guided behavior during locomotion. To investigate the nature of this processing, we used two-photon calcium imaging of local excitatory auditory cortical neural populations in awake mice. We found that locomotion had diverse influences on activity of different neurons, with a net suppression of baseline-subtracted sound-evoked responses and neural stimulus detection, consistent with previous studies. Importantly, we found that the net inhibitory effect of locomotion on baseline-subtracted sound-evoked responses was strongly shaped by elevated ongoing activity that compressed the response dynamic range, and that rather than reflecting enhanced "noise," this ongoing activity reliably encoded the animal's locomotion speed. Decoding analyses revealed that locomotion speed and sound are robustly co-encoded by auditory cortical ensemble activity. Finally, we found consistent patterns of joint coding of sound and locomotion speed in electrophysiologically recorded activity in freely moving rats. Together, our data suggest that rather than being suppressed by locomotion, auditory cortical ensembles explicitly encode it alongside sound information to support sound perception during locomotion.
Affiliation(s)
- Carlos Arturo Vivaldo
- Department of Psychology, University of Michigan, Ann Arbor, Michigan, United States of America
- Joonyeup Lee
- Department of Psychology, University of Michigan, Ann Arbor, Michigan, United States of America
- MaryClaire Shorkey
- Department of Psychology, University of Michigan, Ann Arbor, Michigan, United States of America
- Ajay Keerthy
- Department of Psychology, University of Michigan, Ann Arbor, Michigan, United States of America
- Gideon Rothschild
- Department of Psychology, University of Michigan, Ann Arbor, Michigan, United States of America
- Kresge Hearing Research Institute and Department of Otolaryngology—Head and Neck Surgery, University of Michigan, Ann Arbor, Michigan, United States of America
8. Harrison AW, Hughes G, Rudman G, Christensen BK, Whitford TJ. Exploring the internal forward model: action-effect prediction and attention in sensorimotor processing. Cereb Cortex 2023. [PMID: 37288477] [DOI: 10.1093/cercor/bhad189]
Abstract
Action-effect predictions are believed to facilitate movement based on its association with sensory objectives and suppress the neurophysiological response to self- versus externally generated stimuli (i.e. sensory attenuation). However, research is needed to explore theorized differences in the use of action-effect prediction based on whether movement is uncued (i.e. volitional) or in response to external cues (i.e. stimulus-driven). While much of the sensory attenuation literature has examined effects involving the auditory N1, evidence is also conflicted regarding this component's sensitivity to action-effect prediction. In this study (n = 64), we explored the influence of action-effect contingency on event-related potentials associated with visually cued and uncued movement, as well as resultant stimuli. Our findings replicate recent evidence demonstrating reduced N1 amplitude for tones produced by stimulus-driven movement. Despite influencing motor preparation, action-effect contingency was not found to affect N1 amplitudes. Instead, we explore electrophysiological markers suggesting that attentional mechanisms may suppress the neurophysiological response to sound produced by stimulus-driven movement. Our findings demonstrate lateralized parieto-occipital activity that coincides with the auditory N1, corresponds to a reduction in its amplitude, and is topographically consistent with documented effects of attentional suppression. These results provide new insights into sensorimotor coordination and potential mechanisms underlying sensory attenuation.
Affiliation(s)
- Anthony W Harrison
- School of Psychology, UNSW Sydney, Mathews Building, Library Walk, Kensington NSW 2052, Australia
- Gethin Hughes
- Department of Psychology, University of Essex, Wivenhoe Park, Colchester CO4 3SQ, United Kingdom
- Gabriella Rudman
- School of Psychology, UNSW Sydney, Mathews Building, Library Walk, Kensington NSW 2052, Australia
- Bruce K Christensen
- Research School of Psychology, Building 39, The Australian National University, Science Rd, Canberra ACT 2601, Australia
- Thomas J Whitford
- School of Psychology, UNSW Sydney, Mathews Building, Library Walk, Kensington NSW 2052, Australia
9. Ody E, Straube B, He Y, Kircher T. Perception of self-generated and externally-generated visual stimuli: Evidence from EEG and behavior. Psychophysiology 2023; e14295. [PMID: 36966486] [DOI: 10.1111/psyp.14295]
Abstract
Efference copy-based forward model mechanisms may help us to distinguish between self-generated and externally-generated sensory consequences. Previous studies have shown that self-initiation modulates neural and perceptual responses to identical stimulation. For example, event-related potentials (ERPs) elicited by tones that follow a button press are reduced in amplitude relative to ERPs elicited by passively attended tones. However, previous EEG studies investigating visual stimuli in this context are rare, provide inconclusive results, and lack adequate control conditions with passive movements. Furthermore, although self-initiation is known to modulate behavioral responses, it is not known whether differences in the amplitude of ERPs also reflect differences in perception of sensory outcomes. In this study, we presented to participants visual stimuli consisting of gray discs following either active button presses, or passive button presses, in which an electromagnet moved the participant's finger. Two discs presented visually 500-1250 ms apart followed each button press, and participants judged which of the two was more intense. Early components of the primary visual response (N1 and P2) over the occipital electrodes were suppressed in the active condition. Interestingly, suppression in the intensity judgment task was only correlated with suppression of the visual P2 component. These data support the notion of efference copy-based forward model predictions in the visual sensory modality, but especially later processes (P2) seem to be perceptually relevant. Taken together, the results challenge the assumption that N1 differences reflect perceptual suppression and emphasize the relevance of the P2 ERP component.
Affiliation(s)
- Edward Ody
- Department of Psychiatry and Psychotherapy, University of Marburg, Rudolf Bultmann-Strasse 8, Marburg, 35039, Germany
- Benjamin Straube
- Department of Psychiatry and Psychotherapy, University of Marburg, Rudolf Bultmann-Strasse 8, Marburg, 35039, Germany
- Yifei He
- Department of Psychiatry and Psychotherapy, University of Marburg, Rudolf Bultmann-Strasse 8, Marburg, 35039, Germany
- Tilo Kircher
- Department of Psychiatry and Psychotherapy, University of Marburg, Rudolf Bultmann-Strasse 8, Marburg, 35039, Germany
10. Rineau AL, Bringoux L, Sarrazin JC, Berberian B. Being active over one's own motion: Considering predictive mechanisms in self-motion perception. Neurosci Biobehav Rev 2023; 146:105051. [PMID: 36669748] [DOI: 10.1016/j.neubiorev.2023.105051]
Abstract
Self-motion perception is a key element guiding pilots' behavior. Its importance is mostly revealed when impaired, leading in most cases to spatial disorientation which is still today a major factor of accidents occurrence. Self-motion perception is known as mainly based on visuo-vestibular integration and can be modulated by the physical properties of the environment with which humans interact. For instance, several studies have shown that the respective weight of visual and vestibular information depends on their reliability. More recently, it has been suggested that the internal state of an operator can also modulate multisensory integration. Interestingly, the systems' automation can interfere with this internal state through the loss of the intentional nature of movements (i.e., loss of agency) and the modulation of associated predictive mechanisms. In this context, one of the new challenges is to better understand the relationship between automation and self-motion perception. The present review explains how linking the concepts of agency and self-motion is a first approach to address this issue.
Affiliation(s)
- Anne-Laure Rineau
- Information Processing and Systems, ONERA, Salon de Provence, Base Aérienne 701, France.
- Bruno Berberian
- Information Processing and Systems, ONERA, Salon de Provence, Base Aérienne 701, France.
11. Press C, Thomas ER, Yon D. Cancelling cancellation? Sensorimotor control, agency, and prediction. Neurosci Biobehav Rev 2023; 145:105012. [PMID: 36565943] [DOI: 10.1016/j.neubiorev.2022.105012]
Abstract
For decades, classic theories of action control and action awareness have been built around the idea that the brain predictively 'cancels' expected action outcomes from perception. However, recent research casts doubt over this basic premise. What do these new findings mean for classic accounts of action? Should we now 'cancel' old data, theories and approaches generated under this idea? In this paper, we argue 'No'. While doubts about predictive cancellation may urge us to fundamentally rethink how predictions shape perception, the wider pyramid using these ideas to explain action control and agentic experiences can remain largely intact. Some adaptive functions assigned to predictive cancellation can be achieved through quasi-predictive processes, that influence perception without actively tracking the probabilistic structure of the environment. Other functions may rely upon truly predictive processes, but not require that these predictions cancel perception. Appreciating the role of these processes may help us to move forward in explaining how agents optimise their interactions with the external world, even if predictive cancellation is cancelled from theory.
Affiliation(s)
- Clare Press
- Department of Psychological Sciences, Birkbeck, University of London, Malet Street, London WC1E 7HX, UK; Wellcome Centre for Human Neuroimaging, UCL, 12 Queen Square, London WC1N 3AR, UK.
- Emily R Thomas
- Neuroscience Institute, New York University School of Medicine, 550 1st Ave, New York, NY 10016, USA
- Daniel Yon
- Department of Psychological Sciences, Birkbeck, University of London, Malet Street, London WC1E 7HX, UK
12. Paraskevoudi N, SanMiguel I. Sensory suppression and increased neuromodulation during actions disrupt memory encoding of unpredictable self-initiated stimuli. Psychophysiology 2022; 60:e14156. [PMID: 35918912] [PMCID: PMC10078310] [DOI: 10.1111/psyp.14156]
Abstract
Actions modulate sensory processing by attenuating responses to self- compared to externally generated inputs, which is traditionally attributed to stimulus-specific motor predictions. Yet, suppression has been also found for stimuli merely coinciding with actions, pointing to unspecific processes that may be driven by neuromodulatory systems. Meanwhile, the differential processing for self-generated stimuli raises the possibility of producing effects also on memory for these stimuli; however, evidence remains mixed as to the direction of the effects. Here, we assessed the effects of actions on sensory processing and memory encoding of concomitant, but unpredictable sounds, using a combination of self-generation and memory recognition task concurrently with EEG and pupil recordings. At encoding, subjects performed button presses that half of the time generated a sound (motor-auditory; MA) and listened to passively presented sounds (auditory-only; A). At retrieval, two sounds were presented and participants had to respond which one was present before. We measured memory bias and memory performance by having sequences where either both or only one of the test sounds were presented at encoding, respectively. Results showed worse memory performance (but no differences in memory bias), attenuated responses, and larger pupil diameter for MA compared to A sounds. Critically, the larger the sensory attenuation and pupil diameter, the worse the memory performance for MA sounds. Nevertheless, sensory attenuation did not correlate with pupil dilation. Collectively, our findings suggest that sensory attenuation and neuromodulatory processes coexist during actions, and both relate to disrupted memory for concurrent, albeit unpredictable sounds.
Affiliation(s)
- Nadia Paraskevoudi
- Institut de Neurociències, Universitat de Barcelona, Barcelona, Spain; Brainlab-Cognitive Neuroscience Research Group, Departament de Psicologia Clinica i Psicobiologia, University of Barcelona, Barcelona, Spain
- Iria SanMiguel
- Institut de Neurociències, Universitat de Barcelona, Barcelona, Spain; Brainlab-Cognitive Neuroscience Research Group, Departament de Psicologia Clinica i Psicobiologia, University of Barcelona, Barcelona, Spain; Institut de Recerca Sant Joan de Déu, Esplugues de Llobregat, Spain
13
Aberbach-Goodman S, Buaron B, Mudrik L, Mukamel R. Same Action, Different Meaning: Neural Substrates of Action Semantic Meaning. Cereb Cortex 2022; 32:4293-4303. [PMID: 35024783] [DOI: 10.1093/cercor/bhab483] [Citation(s) in RCA: 2] [Impact Index Per Article: 1.0] [Received: 07/18/2021] [Revised: 11/21/2021] [Accepted: 11/22/2021] [Indexed: 11/12/2022]
Abstract
Voluntary actions are shaped by desired goals and internal intentions. Multiple factors, including the planning of subsequent actions and the expectation of sensory outcome, have been shown to modulate the kinetics and neural activity patterns associated with otherwise similar goal-directed actions. Notably, in many real-world tasks actions can also vary in the semantic meaning they convey, yet little is known about how semantic meaning modulates the associated neurobehavioral measures. Here, we examined how behavioral and functional magnetic resonance imaging measures are modulated when subjects execute similar actions (button presses) with two different semantic meanings: answering "yes" or "no" to a binary question. Our findings reveal that, when subjects answered with their right hand, the two semantic meanings could be differentiated from voxel patterns in the frontoparietal cortex and lateral-occipital complex bilaterally. When they used their left hand, similar regions were found, albeit only at a more liberal threshold. Although subjects were faster to answer "yes" than "no" with their right hand, the neural differences cannot be explained by these kinetic differences. To the best of our knowledge, this is the first evidence that semantic meaning is embedded in the neural representation of actions, independent of alternative modulating factors such as kinetic and sensory features.
Affiliation(s)
- Shahar Aberbach-Goodman
- Sagol School of Neuroscience and School of Psychological Sciences, Tel Aviv University, Tel-Aviv 6997801, Israel
- Batel Buaron
- Sagol School of Neuroscience and School of Psychological Sciences, Tel Aviv University, Tel-Aviv 6997801, Israel
- Liad Mudrik
- Sagol School of Neuroscience and School of Psychological Sciences, Tel Aviv University, Tel-Aviv 6997801, Israel
- Roy Mukamel
- Sagol School of Neuroscience and School of Psychological Sciences, Tel Aviv University, Tel-Aviv 6997801, Israel
14
The auditory brain in action: Intention determines predictive processing in the auditory system-A review of current paradigms and findings. Psychon Bull Rev 2021; 29:321-342. [PMID: 34505988] [PMCID: PMC9038838] [DOI: 10.3758/s13423-021-01992-z] [Citation(s) in RCA: 14] [Impact Index Per Article: 4.7] [Accepted: 07/29/2021] [Indexed: 11/08/2022]
Abstract
According to ideomotor theory, actions serve to produce desired sensory outcomes. Perception, in turn, has been widely described in terms of sensory predictions arising from top-down input from higher-order cortical areas. Here, we show that action intention results in reliable top-down predictions that modulate auditory brain responses. We bring together several lines of research, including sensory attenuation, active oddball, and action-related omission studies. Together, the results suggest that intention-based predictions modulate several steps in the sound-processing hierarchy, from preattentive to evaluation-related processes, even when additional prediction sources (i.e., sound regularity) are controlled for. We propose an integrative theoretical framework, the extended auditory event representation system (AERS), a model compatible with ideomotor theory, the theory of event coding, and predictive coding. Although AERS was initially introduced to describe regularity-based auditory predictions, we argue that the extended model explains the effects of action intention on auditory processing while also allowing the differences and commonalities between intention- and regularity-based predictions to be studied. We therefore believe this framework can guide future research on action and perception.
15
Paraskevoudi N, SanMiguel I. Self-generation and sound intensity interactively modulate perceptual bias, but not perceptual sensitivity. Sci Rep 2021; 11:17103. [PMID: 34429453] [PMCID: PMC8385100] [DOI: 10.1038/s41598-021-96346-z] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.7] [Received: 11/27/2020] [Accepted: 08/02/2021] [Indexed: 02/07/2023]
Abstract
The ability to distinguish self-generated stimuli from those caused by external sources is critical for all behaving organisms. Although many studies point to sensory attenuation of self-generated stimuli, recent evidence suggests that motor actions can result in either attenuated or enhanced perceptual processing depending on the environmental context (i.e., stimulus intensity). The present study employed 2-AFC sound detection and loudness discrimination tasks to test whether sound source (self- or externally generated) and stimulus intensity (supra- or near-threshold) interactively modulate detection ability and loudness perception. Self-generation did not affect detection or discrimination sensitivity (i.e., detection thresholds and the Just Noticeable Difference, respectively). However, in the discrimination task we observed a significant interaction between self-generation and intensity on perceptual bias (i.e., the Point of Subjective Equality): supra-threshold self-generated sounds were perceived as softer than externally generated ones, while at near-threshold intensities self-generated sounds were perceived as louder than externally generated ones. Our findings provide empirical support for recent theories of how predictions and signal intensity modulate perceptual processing, pointing to interactive effects of intensity and self-generation that seem to be driven by a biased estimate of perceived loudness rather than by changes in detection and discrimination sensitivity.
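The bias and sensitivity measures named in this abstract (PSE and JND) fall out of a fitted psychometric function; a minimal stdlib-Python sketch using a logistic form, with all parameter values hypothetical rather than taken from the study:

```python
from math import exp, log

def psychometric(x, alpha, beta):
    """Logistic psychometric function: probability of judging the
    comparison louder than the standard at intensity x."""
    return 1.0 / (1.0 + exp(-(x - alpha) / beta))

def pse(alpha):
    """Point of Subjective Equality: the logistic midpoint, where the
    comparison is judged louder exactly 50% of the time."""
    return alpha

def jnd(beta):
    """Just Noticeable Difference: distance from the 50% to the 75% point,
    obtained by solving psychometric(x) = 0.75 for x - alpha."""
    return beta * log(3)

# Hypothetical parameters: self-generation shifts alpha (a bias, i.e. a PSE
# shift) while leaving beta (sensitivity, i.e. the JND) unchanged.
alpha_ext, alpha_self, beta = 60.0, 62.0, 2.0
print(pse(alpha_self) - pse(alpha_ext))  # -> 2.0 (PSE shift, dB)
print(round(jnd(beta), 2))               # -> 2.2 (JND, dB)
```

A PSE shift with an unchanged JND, as in this toy fit, is exactly the pattern the abstract reports: biased loudness estimates without a change in discrimination sensitivity.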
Affiliation(s)
- Nadia Paraskevoudi
- Brainlab-Cognitive Neuroscience Research Group, Department of Clinical Psychology and Psychobiology, University of Barcelona, P. Vall d'Hebron 171, 08035, Barcelona, Spain; Institute of Neurosciences, University of Barcelona, Barcelona, Spain
- Iria SanMiguel
- Brainlab-Cognitive Neuroscience Research Group, Department of Clinical Psychology and Psychobiology, University of Barcelona, P. Vall d'Hebron 171, 08035, Barcelona, Spain; Institute of Neurosciences, University of Barcelona, Barcelona, Spain; Institut de Recerca Sant Joan de Déu, Esplugues de Llobregat, Spain
16
Clayton KK, Asokan MM, Watanabe Y, Hancock KE, Polley DB. Behavioral Approaches to Study Top-Down Influences on Active Listening. Front Neurosci 2021; 15:666627. [PMID: 34305516] [PMCID: PMC8299106] [DOI: 10.3389/fnins.2021.666627] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Received: 02/10/2021] [Accepted: 06/09/2021] [Indexed: 11/21/2022]
Abstract
The massive network of descending corticofugal projections has long been recognized by anatomists, but its functional contributions to sound processing and auditory-guided behavior remain a mystery. Most efforts to characterize the auditory corticofugal system have been inductive, wherein function is inferred from a few studies employing a wide range of methods to manipulate different limbs of the descending system across a variety of species and preparations. An alternative approach, which we focus on here, is to first establish auditory-guided behaviors that reflect the contribution of top-down influences on auditory perception. To this end, we postulate that auditory corticofugal systems may contribute to active listening behaviors in which the timing of bottom-up sound cues can be predicted from top-down signals arising from cross-modal cues, temporal integration, or self-initiated movements. Here, we describe a behavioral framework for investigating how auditory perceptual performance is enhanced when subjects can anticipate the timing of upcoming target sounds. Our first paradigm, studied in both human subjects and mice, reports species-specific differences in visually cued expectation of sound onset in a signal-in-noise detection task. A second paradigm, performed in mice, reveals the benefits of temporal regularity as a perceptual grouping cue when detecting repeating target tones in complex background noise. A final behavioral approach demonstrates significant improvements in frequency discrimination threshold and perceptual sensitivity when auditory targets are presented at a predictable temporal interval following motor self-initiation of the trial. Collectively, these three behavioral approaches identify paradigms for studying top-down influences on sound perception that are amenable to head-fixed preparations in genetically tractable animals, where it is possible to monitor and manipulate particular nodes of the descending auditory pathway with unparalleled precision.
Affiliation(s)
- Kameron K. Clayton
- Eaton-Peabody Laboratories, Massachusetts Eye and Ear, Boston, MA, United States
- Meenakshi M. Asokan
- Eaton-Peabody Laboratories, Massachusetts Eye and Ear, Boston, MA, United States
- Yurika Watanabe
- Eaton-Peabody Laboratories, Massachusetts Eye and Ear, Boston, MA, United States
- Kenneth E. Hancock
- Eaton-Peabody Laboratories, Massachusetts Eye and Ear, Boston, MA, United States
- Department of Otolaryngology – Head and Neck Surgery, Harvard Medical School, Boston, MA, United States
- Daniel B. Polley
- Eaton-Peabody Laboratories, Massachusetts Eye and Ear, Boston, MA, United States
- Department of Otolaryngology – Head and Neck Surgery, Harvard Medical School, Boston, MA, United States
17
Reznik D, Guttman N, Buaron B, Zion-Golumbic E, Mukamel R. Action-locked Neural Responses in Auditory Cortex to Self-generated Sounds. Cereb Cortex 2021; 31:5560-5569. [PMID: 34185837] [DOI: 10.1093/cercor/bhab179] [Citation(s) in RCA: 12] [Impact Index Per Article: 4.0] [Received: 03/24/2021] [Revised: 05/24/2021] [Accepted: 05/25/2021] [Indexed: 11/14/2022]
Abstract
Sensory perception is a product of interactions between the internal state of an organism and the physical attributes of a stimulus. It has been shown across the animal kingdom that perception and sensory-evoked physiological responses are modulated depending on whether or not the stimulus is the consequence of voluntary actions. These phenomena are often attributed to motor signals sent to relevant sensory regions that convey information about upcoming sensory consequences. However, the neurophysiological signature of action-locked modulations in sensory cortex, and its relationship with perception, is still unclear. In the current study, we recorded neurophysiological (magnetoencephalography) and behavioral responses from 16 healthy subjects performing an auditory detection task of faint tones. Tones were either generated by subjects' voluntary button presses or occurred predictably following a visual cue. By introducing a constant temporal delay between button press/cue and tone delivery, and applying source-level analysis, we decoupled action-locked and auditory-locked activity in auditory cortex. We show action-locked evoked responses in auditory cortex following sound-triggering actions and preceding sound onset. Such evoked responses were not found for button presses that were not coupled with sounds, or for sounds delivered following a predictive visual cue. Our results provide evidence for efferent signals in human auditory cortex that are locked to voluntary actions coupled with future auditory consequences.
Affiliation(s)
- Daniel Reznik
- Max Planck Institute for Human Cognitive and Brain Sciences, Psychology Department, Leipzig, 04103, Germany
- Noa Guttman
- The Gonda Center for Multidisciplinary Brain Research, Bar-Ilan University, Ramat Gan, 5290002, Israel
- Batel Buaron
- Sagol School of Neuroscience and School of Psychological Sciences, Tel-Aviv University, 69978, Israel
- Elana Zion-Golumbic
- The Gonda Center for Multidisciplinary Brain Research, Bar-Ilan University, Ramat Gan, 5290002, Israel
- Roy Mukamel
- Sagol School of Neuroscience and School of Psychological Sciences, Tel-Aviv University, 69978, Israel
18
Ford JM, Roach BJ, Mathalon DH. Vocalizing and singing reveal complex patterns of corollary discharge function in schizophrenia. Int J Psychophysiol 2021; 164:30-40. [PMID: 33621618] [DOI: 10.1016/j.ijpsycho.2021.02.013] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.0] [Received: 07/23/2020] [Revised: 01/30/2021] [Accepted: 02/16/2021] [Indexed: 10/22/2022]
Abstract
INTRODUCTION: As we vocalize, our brains generate predictions of the sounds we produce, enabling suppression of neural responses when intentions match vocalizations and adjustments when they do not. This may be instantiated by efference copy and corollary discharge mechanisms, which are impaired in people with schizophrenia (SZ). Although innate, these mechanisms can be affected by intentions. We asked whether attending to pitch during vocalization would take these mechanisms "off-line" and reduce suppression. METHODS: Event-related potentials (ERPs) were recorded from 96 SZ and 92 healthy controls (HC) as they vocalized triplets in monotone (Phrase) or sang triplets in ascending thirds (Pitch). Pre-vocalization activity (Bereitschaftspotential, BP) and the N1 and P2 ERP components to sounds were compared during vocalization and playback. RESULTS: N1 was not as suppressed during Pitch as during Phrase. N1 suppression was not affected by SZ in either task when all data were collapsed across pitches (Pitch) and positions (Phrase). However, when binned according to vocalization performance, SZ showed less N1 suppression than HC at longer (>2 s) inter-stimulus intervals (Phrase) and inconsistent suppression across pitches (Pitch). Unlike N1, P2 was more suppressed during Pitch than Phrase and was not affected by SZ. BP was greater during vocalization than playback but did not contribute to the N1 or P2 effects. Pitch variability was inversely related to negative symptoms. CONCLUSIONS: Neural processing is not suppressed when patients and controls sing, and corollary discharge abnormalities in schizophrenia are seen only at long vocalization intervals.
Affiliation(s)
- Judith M Ford
- University of California, San Francisco (UCSF), United States of America; Veterans Affairs San Francisco Healthcare System, United States of America.
- Brian J Roach
- Veterans Affairs San Francisco Healthcare System, United States of America
- Daniel H Mathalon
- University of California, San Francisco (UCSF), United States of America; Veterans Affairs San Francisco Healthcare System, United States of America
19
Choi US, Sung YW, Ogawa S. Brain Plasticity Reflects Specialized Cognitive Development Induced by Musical Training. Cereb Cortex Commun 2021; 2:tgab037. [PMID: 34296181] [DOI: 10.1093/texcom/tgab037] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Received: 05/17/2021] [Accepted: 05/22/2021] [Indexed: 11/12/2022]
Abstract
Learning a musical instrument requires a long period of training and may induce structural and functional changes in the brain. Previous studies have shown brain plasticity resulting from training with a musical instrument. However, these studies did not distinguish the effects on brain plasticity of specific musical instruments, as they examined the brains of musicians who had learned a single musical instrument/genre and did not control for confounding factors, such as common or interactive effects of music training. To address this research gap, the present work investigated musicians who had experience with both the piano and a wind instrument (e.g., flute, trumpet, or clarinet). By examining the difference between the two musical instruments within the same subject, we avoided the effects common to all musical instruments and the confounding factors. We thereby identified several high-tier brain areas displaying plasticity specific to each musical instrument. Our findings show that learning a musical instrument may result in the development of high cognitive functions reflecting the skills and abilities unique to the instrument played.
Affiliation(s)
- Uk-Su Choi
- Gwangju Alzheimer's Disease and Related Dementias (GARD) Cohort Research Center, Chosun University, Gwangju 61452, Republic of Korea
- Yul-Wan Sung
- Kansei Fukushi Research Institute, Tohoku Fukushi University, Sendai, Miyagi 9893201, Japan
- Seiji Ogawa
- Kansei Fukushi Research Institute, Tohoku Fukushi University, Sendai, Miyagi 9893201, Japan
20
Neural correlates of implicit agency during the transition from adolescence to adulthood: An ERP study. Neuropsychologia 2021; 158:107908. [PMID: 34062152] [DOI: 10.1016/j.neuropsychologia.2021.107908] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Received: 07/29/2020] [Revised: 05/03/2021] [Accepted: 05/26/2021] [Indexed: 11/20/2022]
Abstract
Sense of agency (SoA), the experience of being in control of our voluntary actions and their outcomes, is a key feature of normal human experience. Frontoparietal brain circuits associated with SoA undergo a major maturational process during adolescence. To examine whether this translates into neurodevelopmental changes in the experience of agency, we investigated two key neural processes associated with SoA, the activity leading up to voluntary action (the readiness potential) and the activity associated with processing the action outcome (attenuation of the auditory N1 and P2 event-related potentials, ERPs), in mid-adolescents (13-14), late-adolescents (18-20), and adults (25-28) while they performed an intentional binding task. In this task, participants pressed a button (action) that delivered a tone (outcome) after a short delay and reported the time of the tone using the Libet clock. This action-outcome condition alternated with a no-action condition in which an identical tone was triggered by a computer. Mid-adolescents showed greater outcome binding, perceiving self-triggered tones as temporally closer to their actions than adults did, suggesting a greater experience of agency over the outcomes of voluntary actions during mid-adolescence. Consistent with this, greater attenuation of the neural response to self-triggered tones (specifically P2 attenuation) was found during mid-adolescence compared to the older age groups, and this enhanced attenuation decreased with age, as observed for outcome binding. However, there were no age-related differences in the readiness potential leading up to the voluntary action (button press) or in N1 attenuation to self-triggered tones. Notably, in mid-adolescents, greater outcome binding scores were positively associated with greater P2 attenuation and with smaller negativity in the late readiness potential. These findings suggest that the greater implicit agency experience observed during mid-adolescence may be mediated by neural over-suppression of action outcomes (auditory P2 attenuation) and over-reliance on motor preparation (the late readiness potential), both of which we found to become adult-like during late adolescence. Implications for adolescent development and SoA-related neurodevelopmental disorders are discussed.
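The outcome-binding measure used in such intentional binding tasks reduces to a difference of mean time-judgment errors; a minimal stdlib-Python sketch, with the sign convention and all numbers chosen for illustration (not the study's data):

```python
from statistics import mean

def outcome_binding(reported_operant, reported_baseline, actual=0.0):
    """Outcome binding: shift of the perceived tone time toward the action.
    Mean judgment error (reported - actual, in ms) in the action condition
    minus the mean error in the computer-triggered baseline condition;
    more negative values indicate stronger binding of the tone to the action."""
    err_operant = mean(r - actual for r in reported_operant)
    err_baseline = mean(r - actual for r in reported_baseline)
    return err_operant - err_baseline

# Hypothetical judgment errors in ms:
print(outcome_binding([-60, -80, -70], [-10, 0, 10]))  # -> -70.0
```

Under this convention, the stronger outcome binding reported for mid-adolescents would appear as a more negative `outcome_binding` value than in the adult group.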
21
Effector-independent brain network for auditory-motor integration: fMRI evidence from singing and cello playing. Neuroimage 2021; 237:118128. [PMID: 33989814] [DOI: 10.1016/j.neuroimage.2021.118128] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Received: 12/24/2020] [Revised: 04/13/2021] [Accepted: 04/25/2021] [Indexed: 11/22/2022]
Abstract
Many everyday tasks share high-level sensory goals but differ in the movements used to accomplish them. One example is musical pitch regulation, where the same notes can be produced using the vocal system or a musical instrument controlled by the hands. Cello playing has previously been shown to rely on brain structures within the singing network for the performance of single notes, except in areas related to primary motor control, suggesting that the brain networks for auditory feedback processing and sensorimotor integration may be shared (Segado et al. 2018). However, research has shown that singers and cellists alike can continue singing/playing in tune even in the absence of auditory feedback (Chen et al. 2013, Kleber et al. 2013), so different paradigms are required to test feedback monitoring and control mechanisms. In singing, auditory pitch feedback perturbation paradigms have shown that singers engage a network of brain regions including anterior cingulate cortex (ACC), anterior insula (aINS), and intraparietal sulcus (IPS) when compensating for altered pitch feedback, and posterior superior temporal gyrus (pSTG) and supramarginal gyrus (SMG) when ignoring it (Zarate et al. 2005, 2008). To determine whether the brain networks for cello playing and singing directly overlap in these sensorimotor integration areas, in the present study expert cellists were asked to compensate for or ignore introduced pitch perturbations while singing/playing during fMRI scanning. We found that cellists were able to sing/play target tones, and to compensate for and ignore introduced feedback perturbations, equally well. Brain activity overlapped for singing and playing in IPS and SMG when compensating, and in pSTG and dorsal premotor cortex (dPMC) when ignoring; differences between singing and playing across all three conditions were most prominent in M1, centered on the relevant motor effectors (hand, larynx). These findings support the hypothesis that pitch regulation during cello playing relies on structures within the singing network and suggest that differences arise primarily at the level of forward motor control.
22
Gale DJ, Areshenkoff CN, Honda C, Johnsrude IS, Flanagan JR, Gallivan JP. Motor Planning Modulates Neural Activity Patterns in Early Human Auditory Cortex. Cereb Cortex 2021; 31:2952-2967. [PMID: 33511976] [PMCID: PMC8107793] [DOI: 10.1093/cercor/bhaa403] [Citation(s) in RCA: 9] [Impact Index Per Article: 3.0] [Received: 08/11/2020] [Revised: 12/14/2020] [Accepted: 12/14/2020] [Indexed: 11/13/2022]
Abstract
It is well established that movement planning recruits motor-related cortical brain areas in preparation for the forthcoming action. Given that an integral component of the control of action is the processing of sensory information throughout movement, we predicted that movement planning might also modulate early sensory cortical areas, readying them for sensory processing during the unfolding action. To test this hypothesis, we performed two human functional magnetic resonance imaging (fMRI) studies involving separate delayed movement tasks and focused on premovement neural activity in early auditory cortex, given the area's direct connections to the motor system and evidence that it is modulated by motor cortex during movement in rodents. We show that effector-specific information (i.e., movements of the left vs. right hand in Experiment 1 and movements of the hand vs. eye in Experiment 2) can be decoded, well before movement, from neural activity in early auditory cortex. We find that this motor-related information is encoded in a separate subregion of auditory cortex than sensory-related information and is present even when movements are cued visually instead of auditorily. These findings suggest that action planning, in addition to preparing the motor system for movement, involves selectively modulating primary sensory areas based on the intended action.
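The abstract's central claim is that effector identity can be decoded from premovement voxel patterns. As a toy illustration of pattern decoding in general (not the authors' actual analysis pipeline, which is not specified here), a nearest-centroid classifier over made-up "voxel" patterns:

```python
from statistics import mean

def nearest_centroid_decode(train, test_pattern):
    """Assign test_pattern to the class whose mean training pattern
    (centroid) is nearest in Euclidean distance."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    centroids = {label: [mean(vox) for vox in zip(*patterns)]
                 for label, patterns in train.items()}
    return min(centroids, key=lambda label: dist(centroids[label], test_pattern))

# Made-up 4-voxel training patterns for two planned effectors:
train = {"left_hand":  [[1.1, 0.0, 0.9, 0.1], [0.9, 0.2, 1.1, -0.1]],
         "right_hand": [[0.1, 1.0, 0.0, 0.9], [-0.1, 0.8, 0.2, 1.1]]}
print(nearest_centroid_decode(train, [1.0, 0.0, 1.0, 0.1]))  # -> left_hand
```

Above-chance classification of held-out premovement patterns by any such decoder is what "effector-specific information can be decoded" means operationally.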
Affiliation(s)
- Daniel J Gale
- Centre for Neuroscience Studies, Queen’s University, Kingston, Ontario K7L 3N6, Canada
- Corson N Areshenkoff
- Centre for Neuroscience Studies, Queen’s University, Kingston, Ontario K7L 3N6, Canada
- Department of Psychology, Queen’s University, Kingston, Ontario K7L 3N6, Canada
- Claire Honda
- Department of Psychology, Queen’s University, Kingston, Ontario K7L 3N6, Canada
- Ingrid S Johnsrude
- Department of Psychology, University of Western Ontario, London, Ontario, N6A 3K7, Canada
- School of Communication Sciences and Disorders, University of Western Ontario, London, Ontario, N6A 3K7, Canada
- Brain and Mind Institute, University of Western Ontario, London, Ontario, N6A 3K7, Canada
- J Randall Flanagan
- Centre for Neuroscience Studies, Queen’s University, Kingston, Ontario K7L 3N6, Canada
- Department of Psychology, Queen’s University, Kingston, Ontario K7L 3N6, Canada
- Jason P Gallivan
- Centre for Neuroscience Studies, Queen’s University, Kingston, Ontario K7L 3N6, Canada
- Department of Psychology, Queen’s University, Kingston, Ontario K7L 3N6, Canada
- Department of Biomedical and Molecular Sciences, Queen’s University, Kingston, Ontario K7L 3N6, Canada
23
Sensory attenuation is modulated by the contrasting effects of predictability and control. Neuroimage 2021; 237:118103. [PMID: 33957233] [DOI: 10.1016/j.neuroimage.2021.118103] [Citation(s) in RCA: 6] [Impact Index Per Article: 2.0] [Received: 12/09/2020] [Revised: 03/18/2021] [Accepted: 04/23/2021] [Indexed: 11/22/2022]
Abstract
Self-generated stimuli have been found to elicit a reduced sensory response compared with externally-generated stimuli. However, much of the literature has not adequately controlled for differences in the temporal predictability and temporal control of stimuli. In two experiments, we compared the N1 (and P2) components of the auditory-evoked potential to self- and externally-generated tones that differed with respect to these two factors. In Experiment 1 (n = 42), we found that increasing temporal predictability reduced N1 amplitude in a manner that may often account for the observed reduction in sensory response to self-generated sounds. We also observed that reducing temporal control over the tones resulted in a reduction in N1 amplitude. The contrasting effects of temporal predictability and temporal control on N1 amplitude meant that sensory attenuation prevailed when controlling for each. Experiment 2 (n = 38) explored the potential effect of selective attention on the results of Experiment 1 by modifying task requirements such that similar levels of attention were allocated to the visual stimuli across conditions. The results of Experiment 2 replicated those of Experiment 1, and suggested that the observed effects of temporal control and sensory attenuation were not driven by differences in attention. Given that self- and externally-generated sensations commonly differ with respect to both temporal predictability and temporal control, findings of the present study may necessitate a re-evaluation of the experimental paradigms used to study sensory attenuation.
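The N1/P2 amplitude comparisons that abstracts like this one rest on reduce to mean voltages in a component window; a minimal stdlib-Python sketch, where the window bounds and waveform values are invented for illustration:

```python
def mean_amplitude(times_ms, erp_uv, t0, t1):
    """Mean voltage (microvolts) within the [t0, t1] ms window."""
    vals = [v for t, v in zip(times_ms, erp_uv) if t0 <= t <= t1]
    return sum(vals) / len(vals)

def n1_attenuation(times_ms, erp_self, erp_external, t0=80, t1=120):
    """Self-minus-external N1 amplitude. Because N1 is negative-going,
    a positive value means the self-generated response was attenuated
    (less negative) relative to the external one."""
    return (mean_amplitude(times_ms, erp_self, t0, t1)
            - mean_amplitude(times_ms, erp_external, t0, t1))

# Hypothetical single-channel condition averages (microvolts):
times = [70, 80, 90, 100, 110, 120, 130]
self_erp = [0.0, -1.0, -2.0, -3.0, -2.0, -1.0, 0.0]
ext_erp = [0.0, -2.0, -4.0, -6.0, -4.0, -2.0, 0.0]
print(round(n1_attenuation(times, self_erp, ext_erp), 2))  # -> 1.8
```

In the study's design, the same index would be computed separately across the temporal-predictability and temporal-control conditions to disentangle their contrasting effects on N1.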
24
Vroegh T, Wiesmann SL, Henschke S, Lange EB. Manual motor reaction while being absorbed into popular music. Conscious Cogn 2021; 89:103088. [PMID: 33636569] [DOI: 10.1016/j.concog.2021.103088] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Received: 09/01/2020] [Revised: 01/21/2021] [Accepted: 01/24/2021] [Indexed: 11/25/2022]
Abstract
In three experiments, we investigated the behavioral consequences of being absorbed in music for performance in a concurrent task. We tested two competing hypotheses: based on a cognitive-load account, captivation of attention by the music and state absorption might slow down reactions in the decisional task. Alternatively, music could induce spontaneous motor activity, and being absorbed in music might result in more autonomous, flow-driven behavior with quicker motor reactions. Participants performed a simple visual two-alternative forced-choice task while listening to popular musical excerpts. Subsequently, they rated their subjective experience using a short questionnaire. We presented music in four tempo categories (between 80 and 140 BPM) to account for a potential effect of tempo and an interaction between tempo and absorption. In Experiment 1, absorption was related to decreased reaction times (RTs) in the visual task. This effect was small, as expected in this setting, but replicable in Experiment 2. There was no effect of the music's tempo on RTs, but a tendency for mind wandering to relate to task performance. After slightly changing the study setting in Experiment 3, flow predicted decreased RTs, but absorption alone, as part of the flow construct, did not. In sum, we demonstrated that being absorbed in music can have the behavioral consequence of speeded manual reactions in specific task contexts, with people apparently integrating the music into active, flow-driven, and therefore enhanced performance. However, these relations depend on the task setting, and a systematic study of context is necessary to understand how induced states and their measurement contribute to the findings.
Affiliation(s)
- Thijs Vroegh
- Max Planck Institute for Empirical Aesthetics, Frankfurt, Germany
- Sandro L Wiesmann
- Max Planck Institute for Empirical Aesthetics, Frankfurt, Germany; Scene Grammar Lab, Department of Psychology, Goethe-University, Frankfurt, Germany
- Elke B Lange
- Max Planck Institute for Empirical Aesthetics, Frankfurt, Germany
25
Seidel A, Ghio M, Studer B, Bellebaum C. Illusion of control affects ERP amplitude reductions for auditory outcomes of self-generated actions. Psychophysiology 2021; 58:e13792. [PMID: 33604896] [DOI: 10.1111/psyp.13792] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.3] [Received: 03/30/2020] [Revised: 01/28/2021] [Accepted: 01/29/2021] [Indexed: 11/27/2022]
Abstract
The reduction of neural responses to self-generated compared to external stimuli is thought to result from the matching of motor-based sensory predictions with sensory reafferences, serving to identify changes in the environment that are caused by oneself. The amplitude of the auditory event-related potential (ERP) component N1 seems to closely reflect this matching process, while the later positive component (P2/P3a) has been associated with judgments of agency, which are also sensitive to contextual top-down information. In this study, we examined the effect of perceived control over sound production on the processing of self-generated and external stimuli, as reflected in these components. We used a new version of a classic two-button choice task to induce different degrees of the illusion of control (IoC) and recorded ERPs for the processing of self-generated and external sounds in a subsequent task. N1 amplitudes were reduced for self-generated compared to external sounds but were not significantly affected by IoC. P2/P3a amplitudes were affected by IoC: we found reduced P2/P3a amplitudes after high compared to low IoC induction training, but only for self-generated, not for external sounds. These findings suggest that prior contextual belief information induced by an IoC affects later processing as reflected in the P2/P3a, possibly for the formation of agency judgments, while early processing reflecting motor-based predictions is not affected.
Affiliation(s)
- Alexander Seidel
- Institute of Experimental Psychology, Heinrich-Heine University Düsseldorf, Düsseldorf, Germany
- Marta Ghio
- CIMeC - Center for Mind/Brain Sciences, University of Trento, Trento, Italy
- Bettina Studer
- Institute of Clinical Neuroscience and Medical Psychology, Medical Faculty, Heinrich-Heine-University Düsseldorf, Düsseldorf, Germany; Department of Neurology, Mauritius Hospital Meerbusch, Meerbusch, Germany
- Christian Bellebaum
- Institute of Experimental Psychology, Heinrich-Heine University Düsseldorf, Düsseldorf, Germany

26
Endo N, Ito T, Mochida T, Ijiri T, Watanabe K, Nakazawa K. Precise force controls enhance loudness discrimination of self-generated sound. Exp Brain Res 2021; 239:1141-1149. [PMID: 33555383 DOI: 10.1007/s00221-020-05993-7] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/11/2020] [Accepted: 11/19/2020] [Indexed: 10/22/2022]
Abstract
Motor execution alters sensory processing. Studies have shown that loudness perception changes when a sound is generated by active movement. However, it remains unknown whether and how motor-related changes in loudness perception depend on the task demands of motor execution. We examined whether different levels of precision demand in motor control affect loudness perception. We carried out a loudness discrimination test in which the sound stimulus was produced in conjunction with a force generation task. We tested three target force amplitude levels. The force target was presented on a monitor as a fixed visual target, and the generated force was shown on the same monitor as a moving visual cursor. Participants adjusted their force amplitude within a predetermined range, without overshooting, using the visual target and moving cursor. In the control condition, the sound and visual stimuli were generated externally (without a force generation task). Discrimination performance improved significantly when the sound was produced by the force generation task compared to the control condition, in which the sound was produced externally, although this improvement did not vary with the target force amplitude level. The results suggest that the demand for precise control to produce a fixed amount of force may be key to obtaining the facilitatory effect of motor execution on auditory processing.
Affiliation(s)
- Nozomi Endo
- Department of Life Sciences, Graduate School of Arts and Sciences, The University of Tokyo, 3-8-1, Komaba, Meguro-ku, Tokyo, 153-8902, Japan; Faculty of Science and Engineering, Waseda University, 3-4-1, Ohkubo, Shinjuku-ku, Tokyo, 169-8555, Japan; Japan Society for the Promotion of Science, 5-3-1 Kojimachi, Chiyoda-ku, Tokyo, 102-0083, Japan
- Takayuki Ito
- Univ. Grenoble Alpes, Grenoble-INP, CNRS, GIPSA-Lab, 11 rue des Mathématiques, Grenoble Campus BP46, 38402, Saint Martin D'heres Cedex, France; Haskins Laboratories, 300 George Street, New Haven, CT, 06511, USA
- Takemi Mochida
- NTT Communication Science Laboratories, 3-1, Morinosato Wakamiya, Atsugi-shi, Kanagawa, 243-0198, Japan
- Tetsuya Ijiri
- Department of Life Sciences, Graduate School of Arts and Sciences, The University of Tokyo, 3-8-1, Komaba, Meguro-ku, Tokyo, 153-8902, Japan
- Katsumi Watanabe
- Faculty of Science and Engineering, Waseda University, 3-4-1, Ohkubo, Shinjuku-ku, Tokyo, 169-8555, Japan; Art & Design, University of New South Wales, Oxford St & Greens Rd, Paddington, NSW 202, Australia
- Kimitaka Nakazawa
- Department of Life Sciences, Graduate School of Arts and Sciences, The University of Tokyo, 3-8-1, Komaba, Meguro-ku, Tokyo, 153-8902, Japan

27
Neszmélyi B, Horváth J. Action-related auditory ERP attenuation is not modulated by action effect relevance. Biol Psychol 2021; 161:108029. [PMID: 33556451 DOI: 10.1016/j.biopsycho.2021.108029] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/13/2020] [Revised: 01/13/2021] [Accepted: 01/26/2021] [Indexed: 10/22/2022]
Abstract
Event-related potentials (ERPs) elicited by self-induced sounds are often smaller than ERPs elicited by identical, but externally generated, sounds. This action-related auditory ERP attenuation is more pronounced when self-induced sounds are intermixed with similar sounds generated by an external source. The current study explored whether attentional factors contribute to this phenomenon. Participants performed tone-eliciting actions, while the action-tone contingency and the set of additional action effects (tactile only, tactile and visual) were manipulated in a blocked manner. Previous action-tone contingency effects were replicated, but adding other sensory action consequences did not influence the magnitude of auditory ERP attenuation. This suggests that the amount of attention allocated to concurrent non-auditory action effects does not substantially affect the magnitude of action-related auditory ERP attenuation, and is consistent with the assumption that action-related auditory ERP attenuation might be related to the process of distinguishing self-induced stimuli from externally generated ones.
Affiliation(s)
- Bence Neszmélyi
- Institute of Cognitive Neuroscience and Psychology, Research Centre for Natural Sciences, Budapest, Hungary; Budapest University of Technology and Economics, Budapest, Hungary; Pázmány Péter Catholic University, Budapest, Hungary.
- János Horváth
- Institute of Cognitive Neuroscience and Psychology, Research Centre for Natural Sciences, Budapest, Hungary; Károli Gáspár University of the Reformed Church in Hungary, Hungary

28
Schmitter CV, Steinsträter O, Kircher T, van Kemenade BM, Straube B. Commonalities and differences in predictive neural processing of discrete vs continuous action feedback. Neuroimage 2021; 229:117745. [PMID: 33454410 DOI: 10.1016/j.neuroimage.2021.117745] [Citation(s) in RCA: 8] [Impact Index Per Article: 2.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/30/2020] [Revised: 01/04/2021] [Accepted: 01/09/2021] [Indexed: 11/16/2022] Open
Abstract
Sensory action consequences are highly predictable and thus engage less neural resources compared to externally generated sensory events. While this has frequently been observed to lead to attenuated perceptual sensitivity and suppression of activity in sensory cortices, some studies conversely reported enhanced perceptual sensitivity for action consequences. These divergent findings might be explained by the type of action feedback, i.e., discrete outcomes vs. continuous feedback. Therefore, in the present study we investigated the impact of discrete and continuous action feedback on perceptual and neural processing during action feedback monitoring. During fMRI data acquisition, participants detected temporal delays (0-417 ms) between actively or passively generated wrist movements and visual feedback that was either continuously provided during the movement or that appeared as a discrete outcome. Both feedback types resulted in (1) a neural suppression effect (active<passive) in a largely shared network including bilateral visual and somatosensory cortices, cerebellum and temporoparietal areas. Yet, compared to discrete outcomes, (2) processing continuous feedback led to stronger suppression in right superior temporal gyrus (STG), Heschl's gyrus, and insula suggesting specific suppression of features linked to continuous feedback. Furthermore, (3) BOLD suppression in visual cortex for discrete outcomes was specifically related to perceptual enhancement. Together, these findings indicate that neural representations of discrete and continuous action feedback are similarly suppressed but might depend on different predictive mechanisms, where reduced activation in visual cortex reflects facilitation specifically for discrete outcomes, and predictive processing in STG, Heschl's gyrus, and insula is particularly relevant for continuous feedback.
Affiliation(s)
- Christina V Schmitter
- Department of Psychiatry and Psychotherapy, University of Marburg, Rudolf-Bultmann-Strasse 8, 35039 Marburg, Germany; Center for Mind, Brain and Behavior (CMBB), University of Marburg and Justus Liebig University Giessen, Hans-Meerwein-Strasse 6, 35032 Marburg, Germany.
- Olaf Steinsträter
- Department of Psychiatry and Psychotherapy, University of Marburg, Rudolf-Bultmann-Strasse 8, 35039 Marburg, Germany; Center for Mind, Brain and Behavior (CMBB), University of Marburg and Justus Liebig University Giessen, Hans-Meerwein-Strasse 6, 35032 Marburg, Germany; Core Facility Brain Imaging, University of Marburg, Rudolf-Bultmann-Strasse 8, 35039 Marburg, Germany.
- Tilo Kircher
- Department of Psychiatry and Psychotherapy, University of Marburg, Rudolf-Bultmann-Strasse 8, 35039 Marburg, Germany; Center for Mind, Brain and Behavior (CMBB), University of Marburg and Justus Liebig University Giessen, Hans-Meerwein-Strasse 6, 35032 Marburg, Germany.
- Bianca M van Kemenade
- Department of Psychiatry and Psychotherapy, University of Marburg, Rudolf-Bultmann-Strasse 8, 35039 Marburg, Germany; Center for Mind, Brain and Behavior (CMBB), University of Marburg and Justus Liebig University Giessen, Hans-Meerwein-Strasse 6, 35032 Marburg, Germany.
- Benjamin Straube
- Department of Psychiatry and Psychotherapy, University of Marburg, Rudolf-Bultmann-Strasse 8, 35039 Marburg, Germany; Center for Mind, Brain and Behavior (CMBB), University of Marburg and Justus Liebig University Giessen, Hans-Meerwein-Strasse 6, 35032 Marburg, Germany.

29
Majchrowicz B, Wierzchoń M. Sensory attenuation of action outcomes of varying amplitude and valence. Conscious Cogn 2020; 87:103058. [PMID: 33278651 DOI: 10.1016/j.concog.2020.103058] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/21/2020] [Revised: 11/13/2020] [Accepted: 11/19/2020] [Indexed: 01/23/2023]
Abstract
Stimuli caused by self-initiated actions are perceived as less intense than those caused externally; this effect is called sensory attenuation (SA). In two experiments, we assessed the impact of outcome amplitude and affective valence on SA and on explicit ratings of the sense of agency. This allowed us to test the predictions of the available SA frameworks and better understand the link between SA, affect, and agency. The results indicated that SA can be reversed, and that such sensory amplification is driven by low-amplitude and positive-valence outcomes. We also show that intentional action influences the perceived valence of outcomes, and that modulations of the explicit sense of agency diverge from those of SA. Our study shows that valence influences the processing of the amplitude of intentional action outcomes and suggests that none of the currently available frameworks does full justice to SA's variability.
Affiliation(s)
- Bartosz Majchrowicz
- Consciousness Lab, Institute of Psychology, Jagiellonian University, Krakow, Poland.
- Michał Wierzchoń
- Consciousness Lab, Institute of Psychology, Jagiellonian University, Krakow, Poland

30
Buaron B, Reznik D, Gilron R, Mukamel R. Voluntary Actions Modulate Perception and Neural Representation of Action-Consequences in a Hand-Dependent Manner. Cereb Cortex 2020; 30:6097-6107. [PMID: 32607565 DOI: 10.1093/cercor/bhaa156] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/10/2020] [Revised: 05/17/2020] [Accepted: 05/18/2020] [Indexed: 12/13/2022] Open
Abstract
Evoked neural activity in sensory regions and perception of sensory stimuli are modulated when the stimuli are the consequence of voluntary movement, as opposed to an external source. It has been suggested that such modulations are due to motor commands that are sent to relevant sensory regions during voluntary movement. However, given the anatomical-functional laterality bias of the motor system, it is plausible that the pattern of such behavioral and neural modulations will also exhibit a similar bias, depending on the effector triggering the stimulus (e.g., right/left hand). Here, we examined this issue in the visual domain using behavioral and neural measures (fMRI). Healthy participants judged the relative brightness of identical visual stimuli that were either self-triggered (using right/left hand button presses), or triggered by the computer. Stimuli were presented either in the right or left visual field. Despite identical physical properties of the visual consequences, we found stronger perceptual modulations when the triggering hand was ipsi- (rather than contra-) lateral to the stimulated visual field. Additionally, fMRI responses in visual cortices differentiated between stimuli triggered by right/left hand. Our findings support a model in which voluntary actions induce sensory modulations that follow the anatomical-functional bias of the motor system.
Affiliation(s)
- Batel Buaron
- Sagol School of Neuroscience, School of Psychological Sciences, Tel-Aviv University, Tel Aviv 69978, Israel
- Daniel Reznik
- Department of Psychology, Center for Brain Science, Harvard University, Cambridge, MA 02138, USA
- Ro'ee Gilron
- Department of Neurological Surgery, UCSF School of Medicine, UCSF, San Francisco, CA 94115, USA
- Roy Mukamel
- Sagol School of Neuroscience, School of Psychological Sciences, Tel-Aviv University, Tel Aviv 69978, Israel

31
Arikan BE, van Kemenade BM, Podranski K, Steinsträter O, Straube B, Kircher T. Perceiving your hand moving: BOLD suppression in sensory cortices and the role of the cerebellum in the detection of feedback delays. J Vis 2020; 19:4. [PMID: 31826249 DOI: 10.1167/19.14.4] [Citation(s) in RCA: 19] [Impact Index Per Article: 4.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/24/2022] Open
Abstract
Sensory consequences of self-generated as opposed to externally generated movements are perceived as less intense and lead to less neural activity in corresponding sensory cortices, presumably due to predictive mechanisms. Self-generated sensory inputs have been mostly studied in a single modality, using abstract feedback, with control conditions not differentiating efferent from reafferent feedback. Here we investigated the neural processing of (a) naturalistic action-feedback associations of (b) self-generated versus externally generated movements, and (c) how an additional (auditory) modality influences neural processing and detection of delays. Participants executed wrist movements using a passive movement device (PMD) as they watched their movements in real time or with variable delays (0-417 ms). The task was to judge whether there was a delay between the movement and its visual feedback. In the externally generated condition, movements were induced by the PMD to disentangle efferent from reafferent feedback. Half of the trials involved auditory beeps coupled to the onset of the visual feedback. We found reduced BOLD activity in visual, auditory, and somatosensory areas during self-generated compared with externally generated movements in unimodal and bimodal conditions. Anterior and posterior cerebellar areas were engaged for trials in which action-feedback delays were detected for self-generated movements. Specifically, the left cerebellar lobule IX was functionally connected with the right superior occipital gyrus. The results indicate efference copy-based predictive mechanisms specific to self-generated movements, leading to BOLD suppression in sensory areas. In addition, our results support the cerebellum's role in the detection of temporal prediction errors during our actions and their consequences.
Affiliation(s)
- B Ezgi Arikan
- Department of Psychology, Justus-Liebig University Giessen, Giessen, Germany
- Bianca M van Kemenade
- Department of Psychiatry and Psychotherapy, Philipps University Marburg, Marburg, Germany
- Kornelius Podranski
- Department of Psychiatry and Psychotherapy, Philipps University Marburg, Marburg, Germany; Core Facility Brain Imaging, Faculty of Medicine, Philipps University Marburg, Marburg, Germany; Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
- Olaf Steinsträter
- Department of Psychiatry and Psychotherapy, Philipps University Marburg, Marburg, Germany; Core Facility Brain Imaging, Faculty of Medicine, Philipps University Marburg, Marburg, Germany
- Benjamin Straube
- Department of Psychiatry and Psychotherapy, Philipps University Marburg, Marburg, Germany
- Tilo Kircher
- Department of Psychiatry and Psychotherapy, Philipps University Marburg, Marburg, Germany

32
Myers JC, Mock JR, Golob EJ. Sensorimotor Integration Can Enhance Auditory Perception. Sci Rep 2020; 10:1496. [PMID: 32001755 PMCID: PMC6992622 DOI: 10.1038/s41598-020-58447-z] [Citation(s) in RCA: 11] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/03/2019] [Accepted: 01/08/2020] [Indexed: 11/26/2022] Open
Abstract
Whenever we move, speak, or play musical instruments, our actions generate auditory sensory input. The sensory consequences of our actions are thought to be predicted via sensorimotor integration, which involves anatomical and functional links between auditory and motor brain regions. The physiological connections are relatively well established, but less is known about how sensorimotor integration affects auditory perception. The sensory attenuation hypothesis suggests that the perceived loudness of self-generated sounds is attenuated to help distinguish self-generated sounds from ambient sounds. Sensory attenuation would work for louder ambient sounds, but could lead to less accurate perception if the ambient sounds were quieter. We hypothesize that a key function of sensorimotor integration is the facilitated processing of self-generated sounds, leading to more accurate perception under most conditions. The sensory attenuation hypothesis predicts better performance for higher but not lower intensity comparisons, whereas sensory facilitation predicts improved perception regardless of comparison sound intensity. A series of experiments tested these hypotheses, with results supporting the enhancement hypothesis. Overall, people were more accurate at comparing the loudness of two sounds when making one of the sounds themselves. We propose that the brain selectively modulates the perception of self-generated sounds to enhance representations of action consequences.
Affiliation(s)
- John C Myers
- Department of Psychology, University of Texas, San Antonio, USA.
- Jeffrey R Mock
- Department of Psychology, University of Texas, San Antonio, USA
- Edward J Golob
- Department of Psychology, University of Texas, San Antonio, USA

33
Parlikar R, Bose A, Venkatasubramanian G. Schizophrenia and Corollary Discharge: A Neuroscientific Overview and Translational Implications. Clin Psychopharmacol Neurosci 2019; 17:170-182. [PMID: 30905117 PMCID: PMC6478093 DOI: 10.9758/cpn.2019.17.2.170] [Citation(s) in RCA: 9] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 06/07/2018] [Revised: 07/25/2018] [Accepted: 08/02/2018] [Indexed: 01/10/2023]
Abstract
The corollary discharge mechanism refers to the suppression of the sensory consequences of self-generated actions, a process that serves to distinguish self from non-self based on the origin of an action. It explains, for example, why we cannot tickle ourselves. This review discusses how the corollary discharge model is an essential neural integration mechanism central to motor functioning across the animal kingdom. Research conducted in the field of corollary discharge is reviewed to understand its neuroanatomical and neurophysiological basis and to gain insight into the biochemical basis of its dysfunction. The article also explores the role of corollary discharge and its dysfunction in the symptoms of schizophrenia, discussing findings from corollary discharge studies in schizophrenia populations. Lastly, the link between schizophrenia psychopathology and corollary discharge dysfunction is highlighted, and a case is made for correcting the corollary discharge deficit in schizophrenia through neuromodulation.
Affiliation(s)
- Rujuta Parlikar
- WISER Program, Department of Psychiatry, National Institute of Mental Health and Neurosciences, Bangalore, India
- Anushree Bose
- WISER Program, Department of Psychiatry, National Institute of Mental Health and Neurosciences, Bangalore, India
- Ganesan Venkatasubramanian
- WISER Program, Department of Psychiatry, National Institute of Mental Health and Neurosciences, Bangalore, India

34
Motor output, neural states and auditory perception. Neurosci Biobehav Rev 2019; 96:116-126. [DOI: 10.1016/j.neubiorev.2018.10.021] [Citation(s) in RCA: 20] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/05/2018] [Revised: 10/26/2018] [Accepted: 10/29/2018] [Indexed: 12/12/2022]

35
Pérez A, Dumas G, Karadag M, Duñabeitia JA. Differential brain-to-brain entrainment while speaking and listening in native and foreign languages. Cortex 2018; 111:303-315. [PMID: 30598230 DOI: 10.1016/j.cortex.2018.11.026] [Citation(s) in RCA: 37] [Impact Index Per Article: 6.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/03/2018] [Revised: 09/28/2018] [Accepted: 11/29/2018] [Indexed: 10/27/2022]
Abstract
The study explores interbrain neural coupling when interlocutors engage in a conversation in their native or a nonnative language. To this end, electroencephalographic hyperscanning was used to study brain-to-brain phase synchronization during a two-person, turn-taking verbal exchange with no visual contact, in either a native or a foreign language context. Results show that the coupling strength between brain signals increases in both the native and the foreign language context, specifically in the alpha frequency band. A difference in brain-to-speech entrainment between native and foreign languages is also shown. These results indicate that between-brain similarities in the timing of neural activations and their spatial distributions change depending on the language code used. We argue that factors like linguistic alignment, joint attention, and brain entrainment to speech operate with a language-idiosyncratic neural configuration, modulating the alignment of neural activity between speakers and listeners. Other possible factors leading to the differential interbrain synchronization patterns, as well as the potential features of brain-to-brain entrainment as a mechanism, are briefly discussed. We conclude that linguistic context should be considered when addressing interpersonal communication. These findings open doors to quantifying linguistic interactions.
Affiliation(s)
- Alejandro Pérez
- Centre for French & Linguistics, University of Toronto Scarborough, Toronto, Canada; Psychology Department, University of Toronto Scarborough, Toronto, Canada; BCBL, Basque Center on Cognition Brain and Language, Donostia-San Sebastián, Spain.
- Guillaume Dumas
- Human Genetics and Cognitive Functions Unit, Institut Pasteur, Paris, France; CNRS UMR 3571 Genes, Synapses and Cognition, Institut Pasteur, Paris, France; Human Genetics and Cognitive Functions, University Paris Diderot, Sorbonne Paris Cité, Paris, France
- Melek Karadag
- Centre for Speech, Language and the Brain, Department of Psychology, University of Cambridge, Cambridge, United Kingdom
- Jon Andoni Duñabeitia
- BCBL, Basque Center on Cognition Brain and Language, Donostia-San Sebastián, Spain; Facultad de Lenguas y Educación, Universidad Nebrija, Madrid, Spain

36
Saltuklaroglu T, Bowers A, Harkrider AW, Casenhiser D, Reilly KJ, Jenson DE, Thornton D. EEG mu rhythms: Rich sources of sensorimotor information in speech processing. Brain Lang 2018; 187:41-61. [DOI: 10.1016/j.bandl.2018.09.005] [Citation(s) in RCA: 12] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Track Full Text] [Subscribe] [Scholar Register] [Received: 04/11/2017] [Revised: 09/27/2017] [Accepted: 09/23/2018] [Indexed: 06/09/2023]
Affiliation(s)
- Tim Saltuklaroglu
- Department of Audiology and Speech-Language Pathology, University of Tennessee Health Sciences, Knoxville, TN 37996, USA.
- Andrew Bowers
- University of Arkansas, Epley Center for Health Professions, 606 N. Razorback Road, Fayetteville, AR 72701, USA
- Ashley W Harkrider
- Department of Audiology and Speech-Language Pathology, University of Tennessee Health Sciences, Knoxville, TN 37996, USA
- Devin Casenhiser
- Department of Audiology and Speech-Language Pathology, University of Tennessee Health Sciences, Knoxville, TN 37996, USA
- Kevin J Reilly
- Department of Audiology and Speech-Language Pathology, University of Tennessee Health Sciences, Knoxville, TN 37996, USA
- David E Jenson
- Department of Speech and Hearing Sciences, Elson S. Floyd College of Medicine, Spokane, WA 99210-1495, USA
- David Thornton
- Department of Hearing, Speech, and Language Sciences, Gallaudet University, 800 Florida Avenue NE, Washington, DC 20002, USA

37
Csifcsák G, Balla VR, Dalos VD, Kilencz T, Biró EM, Urbán G, Szalóki S. Action-associated modulation of visual event-related potentials evoked by abstract and ecological stimuli. Psychophysiology 2018; 56:e13289. [DOI: 10.1111/psyp.13289] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/22/2018] [Revised: 08/10/2018] [Accepted: 08/14/2018] [Indexed: 11/28/2022]
Affiliation(s)
- Gábor Csifcsák
- Faculty of Health Sciences, Department of Psychology, UiT The Arctic University of Norway, Tromsø, Norway; Faculty of Arts, Department of Cognitive and Neuropsychology, Institute of Psychology, University of Szeged, Szeged, Hungary
- Viktória Roxána Balla
- Cognitive Brain Research Unit, Faculty of Medicine, Department of Psychology and Logopedics, University of Helsinki, Helsinki, Finland
- Vera Daniella Dalos
- Faculty of Arts, Department of Cognitive and Neuropsychology, Institute of Psychology, University of Szeged, Szeged, Hungary
- Tünde Kilencz
- Faculty of Arts, Department of Cognitive and Neuropsychology, Institute of Psychology, University of Szeged, Szeged, Hungary
- Edit Magdolna Biró
- Faculty of Medicine, Department of Psychiatry, University of Szeged, Szeged, Hungary
- Gábor Urbán
- Faculty of Arts, Department of Cognitive and Neuropsychology, Institute of Psychology, University of Szeged, Szeged, Hungary
- Szilvia Szalóki
- Faculty of Medicine, Department of Psychiatry, University of Szeged, Szeged, Hungary

38
Reznik D, Simon S, Mukamel R. Predicted sensory consequences of voluntary actions modulate amplitude of preceding readiness potentials. Neuropsychologia 2018; 119:302-307. [PMID: 30172828 DOI: 10.1016/j.neuropsychologia.2018.08.028] [Citation(s) in RCA: 22] [Impact Index Per Article: 3.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/04/2018] [Revised: 08/20/2018] [Accepted: 08/29/2018] [Indexed: 10/28/2022]
Abstract
Self-generated, voluntary actions are preceded by a slow negativity in the scalp electroencephalography (EEG) signal recorded from frontal regions (termed the 'readiness potential'; RP). This signal, and its lateralized subcomponent (LRP), is mainly regarded as preparatory motor activity associated with the forthcoming voluntary motor act. However, it is not clear whether this neural signature reflects preparatory motor activity, expectation of the action's sensory consequences, or both. Here we recorded EEG data from 14 healthy subjects while they performed self-paced button presses with their right index and middle fingers. Button presses with one finger triggered a sound (motor+sound condition), while button presses with the other finger did not (motor-only condition). Additionally, subjects listened to externally generated sounds delivered at expected times (sound-only condition). We found that the RP amplitude (locked to the time of the button press) was significantly more negative in the motor+sound than in the motor-only condition. Importantly, no signal negativity was observed prior to expected sound delivery in the sound-only condition. Thus, the differences in RP amplitude between the motor+sound and motor-only conditions go beyond mere expectation of a forthcoming auditory sound. Our results suggest that information regarding expected auditory consequences is represented in the RP preceding voluntary action execution.
Affiliation(s)
- Daniel Reznik
- Sagol School of Neuroscience and School of Psychological Sciences, Tel-Aviv University, Tel-Aviv 69978, Israel
- Shiri Simon
- Sagol School of Neuroscience and School of Psychological Sciences, Tel-Aviv University, Tel-Aviv 69978, Israel
- Roy Mukamel
- Sagol School of Neuroscience and School of Psychological Sciences, Tel-Aviv University, Tel-Aviv 69978, Israel.

39
Abstract
Hearing is often viewed as a passive process: Sound enters the ear, triggers a cascade of activity through the auditory system, and culminates in an auditory percept. In contrast to a passive process, motor-related signals strongly modulate the auditory system from the eardrum to the cortex. The motor modulation of auditory activity is best documented during speech and other vocalizations but can also be detected during a wide variety of other sound-generating behaviors. An influential idea is that these motor-related signals suppress neural responses to predictable movement-generated sounds, thereby enhancing sensitivity to environmental sounds during movement while helping to detect errors in learned acoustic behaviors, including speech and musicianship. Findings in humans, monkeys, songbirds, and mice provide new insights into the circuits that convey motor-related signals to the auditory system, while lending support to the idea that these signals function predictively to facilitate hearing and vocal learning.
Affiliation(s)
- David M Schneider
- Department of Neurobiology, Duke University, Durham, North Carolina 27710, USA
- Current affiliation: Center for Neural Science, New York University, New York, New York 10003, USA
- Richard Mooney
- Department of Neurobiology, Duke University, Durham, North Carolina 27710, USA

40
Bansal S, Ford JM, Spering M. The function and failure of sensory predictions. Ann N Y Acad Sci 2018; 1426:199-220. [PMID: 29683518 DOI: 10.1111/nyas.13686] [Citation(s) in RCA: 36] [Impact Index Per Article: 6.0] [Received: 12/22/2017] [Revised: 02/26/2018] [Accepted: 02/27/2018] [Indexed: 01/24/2023]
Abstract
Humans and other primates are equipped with neural mechanisms that allow them to automatically make predictions about future events, facilitating processing of expected sensations and actions. Prediction-driven control and monitoring of perceptual and motor acts are vital to normal cognitive functioning. This review provides an overview of corollary discharge mechanisms involved in predictions across sensory modalities and discusses consequences of predictive coding for cognition and behavior. Converging evidence now links impairments in corollary discharge mechanisms to neuropsychiatric symptoms such as hallucinations and delusions. We review studies supporting a prediction-failure hypothesis of perceptual and cognitive disturbances. We also outline neural correlates underlying prediction function and failure, highlighting similarities across the visual, auditory, and somatosensory systems. In linking basic psychophysical and psychophysiological evidence of visual, auditory, and somatosensory prediction failures to neuropsychiatric symptoms, our review furthers our understanding of disease mechanisms.
Affiliation(s)
- Sonia Bansal
- Maryland Psychiatric Research Center, University of Maryland, Catonsville, Maryland
- Judith M Ford
- University of California and Veterans Affairs Medical Center, San Francisco, California
- Miriam Spering
- Department of Ophthalmology and Visual Sciences, University of British Columbia, Vancouver, British Columbia, Canada

41
Baum F, Wolfensteller U, Ruge H. Learning-Related Brain-Electrical Activity Dynamics Associated with the Subsequent Impact of Learnt Action-Outcome Associations. Front Hum Neurosci 2017; 11:252. [PMID: 28555101 PMCID: PMC5430059 DOI: 10.3389/fnhum.2017.00252] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Received: 01/12/2017] [Accepted: 04/27/2017] [Indexed: 11/13/2022]
Abstract
Goal-directed behavior relies on the integration of anticipated outcomes into action planning based on acquired knowledge about the current contingencies between behavioral responses (R) and desired outcomes (O) under specific stimulus conditions (S). According to ideomotor theory, bidirectional R-O associations are an integral part of this knowledge structure. Previous EEG studies have identified neural activity markers linked to the involvement of such associations, but the initial acquisition process has not yet been characterized. The present study thus examined brain-electrical activity dynamics during the rapid acquisition of novel bidirectional R-O associations during instructed S-R learning. Within a trial, we inspected response-locked and stimulus-locked activity dynamics in order to identify markers linked to the forward and backward activation of bidirectional R-O associations as they were being increasingly strengthened under forced choice conditions. We found that a post-response anterior negativity following auditory outcomes was increasingly attenuated as a function of the acquired association strength. This suggests that previously reported action-induced sensory attenuation effects under extensively trained free choice conditions can be established within a few repetitions of specific R-O pairings under forced choice conditions. Furthermore, we observed the even more rapid development of a post-response but pre-outcome fronto-central positivity which was reduced for high R-O learners, which might indicate the rapid deployment of preparatory attention towards predictable outcomes. Finally, we identified a learning-related stimulus-locked activity modulation within the visual P1-N1 latency range which might reflect the multi-sensory integration of the perceived antecedent visual stimulus with the anticipated auditory outcome.
Affiliation(s)
- Fabian Baum
- Department of Psychology, Technische Universität Dresden, Dresden, Germany
- Uta Wolfensteller
- Department of Psychology, Technische Universität Dresden, Dresden, Germany
- Hannes Ruge
- Department of Psychology, Technische Universität Dresden, Dresden, Germany
- *Correspondence: Hannes Ruge

42
Uhlig CH, Gutschalk A. Transient human auditory cortex activation during volitional attention shifting. PLoS One 2017; 12:e0172907. [PMID: 28273110 PMCID: PMC5342206 DOI: 10.1371/journal.pone.0172907] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.6] [Received: 10/30/2016] [Accepted: 02/02/2017] [Indexed: 11/29/2022]
Abstract
While strong activation of auditory cortex is generally found for exogenous orienting of attention, endogenous, intra-modal shifting of auditory attention has not yet been demonstrated to evoke transient activation of the auditory cortex. Here, we used fMRI to test if endogenous shifting of attention is also associated with transient activation of the auditory cortex. In contrast to previous studies, attention shifts were completely self-initiated and not cued by transient auditory or visual stimuli. Stimuli were two dichotic, continuous streams of tones, whose perceptual grouping was not ambiguous. Participants were instructed to continuously focus on one of the streams and switch between the two after a while, indicating the time and direction of each attentional shift by pressing one of two response buttons. The BOLD response around the time of the button presses revealed robust activation of the auditory cortex, along with activation of a distributed task network. To test if the transient auditory cortex activation was specifically related to auditory orienting, a self-paced motor task was added, where participants were instructed to ignore the auditory stimulation while they pressed the response buttons in alternation and at a similar pace. Results showed that attentional orienting produced stronger activity in auditory cortex, but auditory cortex activation was also observed for button presses without focused attention to the auditory stimulus. The response related to attention shifting was stronger contralateral to the side where attention was shifted to. Contralateral-dominant activation was also observed in dorsal parietal cortex areas, confirming previous observations for auditory attention shifting in studies that used auditory cues.
Affiliation(s)
- Christian Harm Uhlig
- Department of Neurology, Ruprecht-Karls-Universität Heidelberg, Heidelberg, Germany
- Alexander Gutschalk
- Department of Neurology, Ruprecht-Karls-Universität Heidelberg, Heidelberg, Germany

43
Domínguez-Borràs J, Rieger SW, Corradi-Dell'Acqua C, Neveu R, Vuilleumier P. Fear Spreading Across Senses: Visual Emotional Events Alter Cortical Responses to Touch, Audition, and Vision. Cereb Cortex 2017; 27:68-82. [PMID: 28365774 PMCID: PMC5939199 DOI: 10.1093/cercor/bhw337] [Citation(s) in RCA: 14] [Impact Index Per Article: 2.0] [Received: 04/20/2016] [Revised: 09/07/2016] [Indexed: 12/01/2022]
Abstract
Attention and perception are potentiated for emotionally significant stimuli, promoting efficient reactivity and survival. But does such enhancement extend to stimuli simultaneously presented across different sensory modalities? We used functional magnetic resonance imaging in humans to examine the effects of visual emotional signals on concomitant sensory inputs in auditory, somatosensory, and visual modalities. First, we identified sensory areas responsive to task-irrelevant tones, touches, or flickers, presented bilaterally while participants attended to either a neutral or a fearful face. Then, we measured whether these responses were modulated by the emotional content of the face. Sensory responses in primary cortices were enhanced for auditory and tactile stimuli when these appeared with fearful faces, compared with neutral, but striate cortex responses to the visual stimuli were reduced in the left hemisphere, plausibly as a consequence of sensory competition. Finally, conjunction and functional connectivity analyses identified 2 distinct networks presumably responsible for these emotional modulatory processes, involving cingulate, insular, and orbitofrontal cortices for the increased sensory responses, and ventrolateral prefrontal cortex for the decreased sensory responses. These results suggest that emotion tunes the excitability of sensory systems across multiple modalities simultaneously, allowing the individual to adaptively process incoming inputs in a potentially threatening environment.
Affiliation(s)
- Judith Domínguez-Borràs
- Laboratory for Behavioral Neurology and Imaging of Cognition, Department of Neuroscience, University Medical Center, CH-1211 Geneva, Switzerland
- Swiss Center for Affective Sciences, University of Geneva, Campus Biotech, CH-1202 Geneva, Switzerland
- Sebastian Walter Rieger
- Swiss Center for Affective Sciences, University of Geneva, Campus Biotech, CH-1202 Geneva, Switzerland
- Geneva Neuroscience Center, University of Geneva, CH-1211 Geneva, Switzerland
- Corrado Corradi-Dell'Acqua
- Laboratory for Behavioral Neurology and Imaging of Cognition, Department of Neuroscience, University Medical Center, CH-1211 Geneva, Switzerland
- Swiss Center for Affective Sciences, University of Geneva, Campus Biotech, CH-1202 Geneva, Switzerland
- Department of Psychology, FPSE, University of Geneva, CH-1205 Geneva, Switzerland
- Rémi Neveu
- Laboratory for Behavioral Neurology and Imaging of Cognition, Department of Neuroscience, University Medical Center, CH-1211 Geneva, Switzerland
- Swiss Center for Affective Sciences, University of Geneva, Campus Biotech, CH-1202 Geneva, Switzerland
- Patrik Vuilleumier
- Laboratory for Behavioral Neurology and Imaging of Cognition, Department of Neuroscience, University Medical Center, CH-1211 Geneva, Switzerland
- Swiss Center for Affective Sciences, University of Geneva, Campus Biotech, CH-1202 Geneva, Switzerland
- Geneva Neuroscience Center, University of Geneva, CH-1211 Geneva, Switzerland
- Department of Neurology, University Hospital, CH-1211 Geneva, Switzerland

44
Chhabra H, Sowmya S, Sreeraj VS, Kalmady SV, Shivakumar V, Amaresha AC, Narayanaswamy JC, Venkatasubramanian G. Auditory false perception in schizophrenia: Development and validation of auditory signal detection task. Asian J Psychiatr 2016; 24:23-27. [PMID: 27931901 DOI: 10.1016/j.ajp.2016.08.006] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.4] [Received: 05/06/2016] [Revised: 08/13/2016] [Accepted: 08/13/2016] [Indexed: 01/30/2023]
Abstract
Auditory hallucinations constitute an important symptom component in 70-80% of schizophrenia patients. These hallucinations are proposed to occur due to an imbalance between perceptual expectation and external input, resulting in attachment of meaning to abstract noises; signal detection theory has been proposed to explain these phenomena. In this study, we describe the development of an auditory signal detection task using a carefully chosen set of English words that could be tested successfully in schizophrenia patients coming from varying linguistic, cultural and social backgrounds. Schizophrenia patients with significant auditory hallucinations (N=15) and healthy controls (N=15) performed the auditory signal detection task wherein they were instructed to differentiate between a 5-s burst of plain white noise and voiced-noise. The analysis showed that false alarms (p=0.02), discriminability index (p=0.001) and decision bias (p=0.004) were significantly different between the two groups. There was a significant negative correlation between false alarm rate and decision bias. These findings lend further support to an impaired perceptual expectation system in schizophrenia patients.
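The discriminability index and decision bias compared between groups above are standard signal-detection-theory quantities derived from hit and false-alarm rates. A minimal sketch of how such indices are typically computed from a yes/no detection task follows; the function name and the log-linear correction are illustrative choices, not taken from the study itself:

```python
from statistics import NormalDist

def sdt_indices(hits, misses, false_alarms, correct_rejections):
    """Compute signal-detection indices from yes/no detection counts.

    d' = z(H) - z(F) is the discriminability index; c = -(z(H) + z(F)) / 2
    is the decision bias (a negative c indicates a liberal criterion, i.e.
    a tendency to report hearing a signal that is not there).
    """
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    # Log-linear correction keeps rates away from 0 and 1, which would
    # otherwise produce infinite z-scores.
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    d_prime = z(hit_rate) - z(fa_rate)
    criterion = -(z(hit_rate) + z(fa_rate)) / 2
    return d_prime, criterion
```

On this scheme, a participant with many false alarms (reporting a voice in plain white noise) would show a lower d' and a more negative c than one who responds only to genuine voiced-noise trials.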
Affiliation(s)
- Harleen Chhabra
- The Schizophrenia Clinic, Department of Psychiatry & Translational Psychiatry Laboratory, Neurobiology Research Centre, National Institute of Mental Health and Neurosciences, Bangalore, India
- Selvaraj Sowmya
- The Schizophrenia Clinic, Department of Psychiatry & Translational Psychiatry Laboratory, Neurobiology Research Centre, National Institute of Mental Health and Neurosciences, Bangalore, India
- Vanteemar S Sreeraj
- The Schizophrenia Clinic, Department of Psychiatry & Translational Psychiatry Laboratory, Neurobiology Research Centre, National Institute of Mental Health and Neurosciences, Bangalore, India
- Sunil V Kalmady
- The Schizophrenia Clinic, Department of Psychiatry & Translational Psychiatry Laboratory, Neurobiology Research Centre, National Institute of Mental Health and Neurosciences, Bangalore, India
- Venkataram Shivakumar
- The Schizophrenia Clinic, Department of Psychiatry & Translational Psychiatry Laboratory, Neurobiology Research Centre, National Institute of Mental Health and Neurosciences, Bangalore, India
- Anekal C Amaresha
- The Schizophrenia Clinic, Department of Psychiatry & Translational Psychiatry Laboratory, Neurobiology Research Centre, National Institute of Mental Health and Neurosciences, Bangalore, India
- Janardhanan C Narayanaswamy
- The Schizophrenia Clinic, Department of Psychiatry & Translational Psychiatry Laboratory, Neurobiology Research Centre, National Institute of Mental Health and Neurosciences, Bangalore, India
- Ganesan Venkatasubramanian
- The Schizophrenia Clinic, Department of Psychiatry & Translational Psychiatry Laboratory, Neurobiology Research Centre, National Institute of Mental Health and Neurosciences, Bangalore, India

45
Abstract
OBJECTIVE To investigate cerebral gray matter volume alterations in unilateral sudden sensorineural hearing loss patients within the acute period using voxel-based morphometry, and to determine whether hearing impairment is associated with regional gray matter alterations in unilateral sudden sensorineural hearing loss patients. STUDY DESIGN Prospective case study. SETTING Tertiary class A teaching hospital. PATIENTS Thirty-nine patients with left-side unilateral sudden sensorineural hearing loss and 47 patients with right-side unilateral sudden sensorineural hearing loss. INTERVENTION Diagnostic. MAIN OUTCOME MEASURE To compare the regional gray matter of unilateral sudden sensorineural hearing loss patients and healthy control participants. RESULTS Compared with control groups, patients with left-side unilateral sudden sensorineural hearing loss had significant gray matter reductions in the right middle temporal gyrus and right superior temporal gyrus, whereas patients with right-side unilateral sudden sensorineural hearing loss showed gray matter decreases in the left superior temporal gyrus and left middle temporal gyrus. A significant negative correlation with the duration of the sudden sensorineural hearing loss (R = -0.427, p = 0.012 for left-side unilateral SSNHL and R = -0.412, p = 0.013 for right-side unilateral SSNHL) was also found in these brain areas. There was no region with increased gray matter found in either group of unilateral sudden sensorineural hearing loss patients. CONCLUSIONS This study confirms, using voxel-based morphometry, that detectable morphological changes (decreased gray matter in the contralateral auditory cortex) occur in unilateral SSNHL patients within the acute period. The gray matter volumes of these brain areas also show a negative correlation with the duration of the disease, which suggests gradual structural impairment of the brain as the disease progresses.
46

Hall AJ, Butler BE, Lomber SG. The cat's meow: A high-field fMRI assessment of cortical activity in response to vocalizations and complex auditory stimuli. Neuroimage 2016; 127:44-57. [DOI: 10.1016/j.neuroimage.2015.11.056] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.3] [Received: 07/27/2015] [Revised: 11/22/2015] [Accepted: 11/24/2015] [Indexed: 01/26/2023]

47
Wikman PA, Vainio L, Rinne T. The effect of precision and power grips on activations in human auditory cortex. Front Neurosci 2015; 9:378. [PMID: 26528121 PMCID: PMC4606019 DOI: 10.3389/fnins.2015.00378] [Citation(s) in RCA: 16] [Impact Index Per Article: 1.8] [Received: 06/26/2015] [Accepted: 09/28/2015] [Indexed: 11/23/2022]
Abstract
The neuroanatomical pathways interconnecting auditory and motor cortices play a key role in current models of human auditory cortex (AC). Evidently, auditory-motor interaction is important in speech and music production, but the significance of these cortical pathways in other auditory processing is not well known. We investigated the general effects of motor responding on AC activations to sounds during auditory and visual tasks (motor regions were not imaged). During all task blocks, subjects detected targets in the designated modality, reported the relative number of targets at the end of the block, and ignored the stimuli presented in the opposite modality. In each block, they were also instructed to respond to targets either using a precision grip, power grip, or to give no overt target responses. We found that motor responding strongly modulated AC activations. First, during both visual and auditory tasks, activations in widespread regions of AC decreased when subjects made precision and power grip responses to targets. Second, activations in AC were modulated by grip type during the auditory but not during the visual task. Further, the motor effects were distinct from the present strong attention-related modulations in AC. These results are consistent with the idea that operations in AC are shaped by its connections with motor cortical regions.
Affiliation(s)
- Patrik A Wikman
- Institute of Behavioural Sciences, University of Helsinki, Helsinki, Finland
- Lari Vainio
- Institute of Behavioural Sciences, University of Helsinki, Helsinki, Finland
- Teemu Rinne
- Institute of Behavioural Sciences, University of Helsinki, Helsinki, Finland; Advanced Magnetic Imaging Centre, Aalto University School of Science, Espoo, Finland

48
Jenson D, Harkrider AW, Thornton D, Bowers AL, Saltuklaroglu T. Auditory cortical deactivation during speech production and following speech perception: an EEG investigation of the temporal dynamics of the auditory alpha rhythm. Front Hum Neurosci 2015; 9:534. [PMID: 26500519 PMCID: PMC4597480 DOI: 10.3389/fnhum.2015.00534] [Citation(s) in RCA: 23] [Impact Index Per Article: 2.6] [Received: 05/12/2015] [Accepted: 09/14/2015] [Indexed: 11/22/2022]
Abstract
Sensorimotor integration (SMI) across the dorsal stream enables online monitoring of speech. Jenson et al. (2014) used independent component analysis (ICA) and event related spectral perturbation (ERSP) analysis of electroencephalography (EEG) data to describe anterior sensorimotor (e.g., premotor cortex, PMC) activity during speech perception and production. The purpose of the current study was to identify and temporally map neural activity from posterior (i.e., auditory) regions of the dorsal stream in the same tasks. Perception tasks required "active" discrimination of syllable pairs (/ba/ and /da/) in quiet and noisy conditions. Production conditions required overt production of syllable pairs and nouns. ICA performed on concatenated raw 68 channel EEG data from all tasks identified bilateral "auditory" alpha (α) components in 15 of 29 participants localized to pSTG (left) and pMTG (right). ERSP analyses were performed to reveal fluctuations in the spectral power of the α rhythm clusters across time. Production conditions were characterized by significant α event related synchronization (ERS; pFDR < 0.05) concurrent with EMG activity from speech production, consistent with speech-induced auditory inhibition. Discrimination conditions were also characterized by α ERS following stimulus offset. Auditory α ERS in all conditions temporally aligned with PMC activity reported in Jenson et al. (2014). These findings are indicative of speech-induced suppression of auditory regions, possibly via efference copy. The presence of the same pattern following stimulus offset in discrimination conditions suggests that sensorimotor contributions following speech perception reflect covert replay, and that covert replay provides one source of the motor activity previously observed in some speech perception tasks. To our knowledge, this is the first time that inhibition of auditory regions by speech has been observed in real-time with the ICA/ERSP technique.
Affiliation(s)
- David Jenson
- Department of Audiology and Speech Pathology, University of Tennessee Health Science Center, Knoxville, TN, USA
- Ashley W. Harkrider
- Department of Audiology and Speech Pathology, University of Tennessee Health Science Center, Knoxville, TN, USA
- David Thornton
- Department of Audiology and Speech Pathology, University of Tennessee Health Science Center, Knoxville, TN, USA
- Andrew L. Bowers
- Department of Communication Disorders, University of Arkansas, Fayetteville, AR, USA
- Tim Saltuklaroglu
- Department of Audiology and Speech Pathology, University of Tennessee Health Science Center, Knoxville, TN, USA

49
Reznik D, Henkin Y, Levy O, Mukamel R. Perceived loudness of self-generated sounds is differentially modified by expected sound intensity. PLoS One 2015; 10:e0127651. [PMID: 25992603 PMCID: PMC4436370 DOI: 10.1371/journal.pone.0127651] [Citation(s) in RCA: 33] [Impact Index Per Article: 3.7] [Received: 03/04/2015] [Accepted: 04/17/2015] [Indexed: 11/30/2022]
Abstract
Performing actions with sensory consequences modifies physiological and behavioral responses relative to otherwise identical sensory input perceived in a passive manner. It is assumed that such modifications occur through an efference copy sent from motor cortex to sensory regions during performance of voluntary actions. In the auditory domain most behavioral studies report attenuated perceived loudness of self-generated auditory action-consequences. However, several recent behavioral and physiological studies report enhanced responses to such consequences. Here we manipulated the intensity of self-generated and externally-generated sounds and examined the type of perceptual modification (enhancement vs. attenuation) reported by healthy human subjects. We found that when the intensity of self-generated sounds was low, perceived loudness was enhanced. Conversely, when the intensity of self-generated sounds was high, perceived loudness was attenuated. These results might reconcile some of the apparent discrepancies in the reported literature and suggest that efference copies can adapt perception according to the differential sensory context of voluntary actions.
Affiliation(s)
- Daniel Reznik
- School of Psychological Sciences, Tel Aviv University, Tel Aviv, Israel
- Sagol School of Neuroscience, Tel Aviv University, Tel Aviv, Israel
- Yael Henkin
- Sagol School of Neuroscience, Tel Aviv University, Tel Aviv, Israel
- Department of Communication Disorders, Sackler Faculty of Medicine, Tel Aviv University, Tel Aviv, Israel
- Hearing, Speech, and Language Center, Sheba Medical Center, Tel Hashomer, Ramat Gan, Israel
- Osnat Levy
- Sagol School of Neuroscience, Tel Aviv University, Tel Aviv, Israel
- Roy Mukamel
- School of Psychological Sciences, Tel Aviv University, Tel Aviv, Israel
- Sagol School of Neuroscience, Tel Aviv University, Tel Aviv, Israel

50
Enhanced auditory evoked activity to self-generated sounds is mediated by primary and supplementary motor cortices. J Neurosci 2015; 35:2173-80. [PMID: 25653372 DOI: 10.1523/jneurosci.3723-14.2015] [Citation(s) in RCA: 33] [Impact Index Per Article: 3.7] [Indexed: 11/21/2022]
Abstract
Accumulating evidence demonstrates that responses in auditory cortex to auditory consequences of self-generated actions are modified relative to the responses evoked by identical sounds generated by an external source. Such modifications have been suggested to occur through a corollary discharge sent from the motor system, although the exact neuroanatomical origin is unknown. Furthermore, since tactile input has also been shown to modify responses in auditory cortex, it is not even clear whether the source of such modifications is motor output or somatosensory feedback. We recorded functional magnetic resonance imaging (fMRI) data from healthy human subjects (n = 11) while manipulating the rate at which they performed sound-producing actions with their right hand. In addition, we manipulated the amount of tactile feedback to examine the relative roles of motor and somatosensory cortices in modifying evoked activity in auditory cortex (superior temporal gyrus). We found an enhanced fMRI signal in left auditory cortex during perception of self-generated sounds relative to passive listening to identical sounds. Moreover, the signal difference between active and passive conditions in left auditory cortex covaried with the rate of sound-producing actions and was invariant to the amount of tactile feedback. Together with functional connectivity analysis, our results suggest motor output from supplementary motor area and left primary motor cortex as the source of signal modification in auditory cortex during perception of self-generated sounds. Motor signals from these regions could represent a predictive signal of the expected auditory consequences of the performed action.