1. Preisig BC, Hervais-Adelman A. The Predictive Value of Individual Electric Field Modeling for Transcranial Alternating Current Stimulation Induced Brain Modulation. Front Cell Neurosci 2022; 16:818703. PMID: 35273479; PMCID: PMC8901488; DOI: 10.3389/fncel.2022.818703.
Abstract
There is considerable individual variability in the reported effectiveness of non-invasive brain stimulation. This variability has often been ascribed to differences in neuroanatomy and the resulting differences in the electric field induced inside the brain. In this study, we addressed the question of whether individual differences in the induced electric field can predict the neurophysiological and behavioral consequences of gamma-band tACS. In a within-subject experiment, bi-hemispheric gamma-band tACS and sham stimulation were applied in alternating blocks to the participants' superior temporal lobe, while task-evoked auditory brain activity was measured with concurrent functional magnetic resonance imaging (fMRI) and a dichotic listening task. Gamma tACS was applied with different interhemispheric phase lags. In a recent study, we showed that anti-phase tACS (180° interhemispheric phase lag), but not in-phase tACS (0° interhemispheric phase lag), selectively modulates interhemispheric brain connectivity. Using a T1 structural image of each participant's brain, an individual simulation of the induced electric field was computed. From these simulations, we derived two predictor variables: maximal strength (the average of the 10,000 voxels with the largest electric field values) and precision of the electric field (the spatial correlation between the electric field and the task-evoked brain activity during sham stimulation). We found considerable variability in the individual strength and precision of the electric fields. Importantly, the strength of the electric field over the right hemisphere predicted individual differences in tACS-induced brain connectivity changes. Moreover, we found in both hemispheres a statistical trend for an effect of electric field strength on tACS-induced BOLD signal changes. In contrast, the precision of the electric field did not predict any neurophysiological measure, and neither strength nor precision predicted interhemispheric integration. In conclusion, we found evidence for a dose-response relationship between individual differences in electric fields and tACS-induced activity and connectivity changes in concurrent fMRI. However, the fact that this relationship was stronger in the right hemisphere suggests that the relationship between electric field parameters, neurophysiology, and behavior may be more complex for bi-hemispheric tACS.
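The two predictor variables defined above lend themselves to a compact computation. Below is a minimal sketch, not the authors' code: it assumes the simulated field magnitude and the sham-block task activation are available as equally shaped 3-D NumPy arrays with a boolean brain mask, and all variable names and array sizes are illustrative.

```python
import numpy as np

def efield_strength(efield, n_top=10_000):
    """Mean of the n_top voxels with the largest field magnitude."""
    flat = np.sort(efield.ravel())
    return flat[-n_top:].mean()

def efield_precision(efield, task_activity, mask):
    """Spatial (Pearson) correlation between the simulated field and the
    task-evoked activity during sham, restricted to voxels in the mask."""
    return np.corrcoef(efield[mask], task_activity[mask])[0, 1]

# Hypothetical usage with random volumes standing in for real data:
rng = np.random.default_rng(0)
efield = rng.random((91, 109, 91))
sham_bold = rng.random((91, 109, 91))
brain_mask = efield > 0.05
print(efield_strength(efield), efield_precision(efield, sham_bold, brain_mask))
```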
2. Preisig BC, Eggenberger N, Cazzoli D, Nyffeler T, Gutbrod K, Annoni JM, Meichtry JR, Nef T, Müri RM. Multimodal Communication in Aphasia: Perception and Production of Co-speech Gestures During Face-to-Face Conversation. Front Hum Neurosci 2018; 12:200. PMID: 29962942; PMCID: PMC6010555; DOI: 10.3389/fnhum.2018.00200.
Abstract
The role of nonverbal communication in patients with post-stroke language impairment (aphasia) is not yet fully understood. This study investigated how aphasic patients perceive and produce co-speech gestures during face-to-face interaction, and whether distinct brain lesions would predict the frequency of spontaneous co-speech gesturing. For this purpose, we recorded samples of conversations in patients with aphasia and healthy participants. Gesture perception was assessed by means of a head-mounted eye-tracking system, and the produced co-speech gestures were coded according to a linguistic classification system. The main results are that meaning-laden gestures (e.g., iconic gestures representing object shapes) are more likely to attract visual attention than meaningless hand movements, and that patients with aphasia are more likely to fixate co-speech gestures overall than healthy participants. This implies that patients with aphasia may benefit from the multimodal information provided by co-speech gestures. On the level of co-speech gesture production, we found that patients with damage to the anterior part of the arcuate fasciculus showed a higher frequency of meaning-laden gestures. This area lies in close vicinity to the premotor cortex and is considered to be important for speech production. This may suggest that the use of meaning-laden gestures depends on the integrity of patients’ speech production abilities.
3. Preisig BC, Riecke L, Sjerps MJ, Kösem A, Kop BR, Bramson B, Hagoort P, Hervais-Adelman A. Selective modulation of interhemispheric connectivity by transcranial alternating current stimulation influences binaural integration. Proc Natl Acad Sci U S A 2021; 118:e2015488118. PMID: 33568530; PMCID: PMC7896308; DOI: 10.1073/pnas.2015488118.
Abstract
Brain connectivity plays a major role in the encoding, transfer, and integration of sensory information. Interregional synchronization of neural oscillations in the γ-frequency band has been suggested as a key mechanism underlying perceptual integration. In a recent study, we found evidence for this hypothesis, showing that the modulation of interhemispheric oscillatory synchrony by means of bihemispheric high-density transcranial alternating current stimulation (HD-tACS) affects binaural integration of dichotic acoustic features. Here, we aimed to establish a direct link between oscillatory synchrony, effective brain connectivity, and binaural integration. We experimentally manipulated oscillatory synchrony (using bihemispheric γ-tACS with different interhemispheric phase lags) and assessed the effect on effective brain connectivity and binaural integration (as measured with functional MRI and a dichotic listening task, respectively). We found that tACS reduced intrahemispheric connectivity within the auditory cortices and that anti-phase tACS (interhemispheric phase lag 180°) modulated connectivity between the two auditory cortices. Importantly, the changes in intra- and interhemispheric connectivity induced by tACS were correlated with changes in perceptual integration. Our results indicate that γ-band synchronization between the two auditory cortices plays a functional role in binaural integration, supporting the proposed role of interregional oscillatory synchrony in perceptual integration.
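The experimental manipulation, bihemispheric stimulation with a controlled interhemispheric phase lag, can be illustrated with a few lines of signal generation. This is a minimal sketch assuming sinusoidal gamma-band (e.g., 40 Hz) currents; the amplitude, duration, and sampling rate are illustrative, not the study's stimulation parameters.

```python
import numpy as np

def tacs_waveforms(freq_hz=40.0, phase_lag_deg=0.0, amp_ma=1.0,
                   duration_s=1.0, fs_hz=10_000):
    """Left/right stimulation currents (mA) with a given interhemispheric lag."""
    t = np.arange(0.0, duration_s, 1.0 / fs_hz)
    left = amp_ma * np.sin(2.0 * np.pi * freq_hz * t)
    right = amp_ma * np.sin(2.0 * np.pi * freq_hz * t + np.deg2rad(phase_lag_deg))
    return t, left, right

t, l_in, r_in = tacs_waveforms(phase_lag_deg=0.0)        # in-phase condition
t, l_anti, r_anti = tacs_waveforms(phase_lag_deg=180.0)  # anti-phase condition
```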
4. Schumacher R, Cazzoli D, Eggenberger N, Preisig B, Nef T, Nyffeler T, Gutbrod K, Annoni JM, Müri RM. Cue Recognition and Integration - Eye Tracking Evidence of Processing Differences in Sentence Comprehension in Aphasia. PLoS One 2015; 10:e0142853. PMID: 26562795; PMCID: PMC4642964; DOI: 10.1371/journal.pone.0142853.
Abstract
Purpose: We aimed to further elucidate whether aphasic patients’ difficulties in understanding non-canonical sentence structures, such as Passive or Object-Verb-Subject sentences, can be attributed to impaired morphosyntactic cue recognition and to problems in integrating competing interpretations.
Methods: A sentence-picture matching task with canonical and non-canonical spoken sentences was performed using concurrent eye tracking. Accuracy, reaction time, and eye tracking data (fixations) of 50 healthy subjects and 12 aphasic patients were analysed.
Results: Patients showed increased error rates and reaction times, as well as delayed fixation preferences for target pictures in non-canonical sentences. Patients’ fixation patterns differed from those of healthy controls and revealed deficits in recognizing and immediately integrating morphosyntactic cues.
Conclusion: Our study corroborates the notion that difficulties in understanding syntactically complex sentences are attributable to a processing deficit encompassing delayed and therefore impaired recognition and integration of cues, as well as increased competition between interpretations.
5. Cazzoli D, Hopfner S, Preisig B, Zito G, Vanbellingen T, Jäger M, Nef T, Mosimann U, Bohlhalter S, Müri RM. The influence of naturalistic, directionally non-specific motion on the spatial deployment of visual attention in right-hemispheric stroke. Neuropsychologia 2016; 92:181-189. DOI: 10.1016/j.neuropsychologia.2016.04.017.
6. Preisig BC, Sjerps MJ, Hervais-Adelman A, Kösem A, Hagoort P, Riecke L. Bilateral Gamma/Delta Transcranial Alternating Current Stimulation Affects Interhemispheric Speech Sound Integration. J Cogn Neurosci 2019; 32:1242-1250. PMID: 31682569; DOI: 10.1162/jocn_a_01498.
Abstract
Perceiving speech requires the integration of different speech cues, that is, formants. When the speech signal is split so that different cues are presented to the right and left ear (dichotic listening), comprehension requires the integration of binaural information. Based on prior electrophysiological evidence, we hypothesized that the integration of dichotically presented speech cues is enabled by interhemispheric phase synchronization between primary and secondary auditory cortex in the gamma frequency band. We tested this hypothesis by applying transcranial alternating current stimulation (tACS) bilaterally above the superior temporal lobe to induce or disrupt interhemispheric gamma-phase coupling. In contrast to initial predictions, we found that gamma tACS applied in-phase above the two hemispheres (interhemispheric lag 0°) perturbs interhemispheric integration of speech cues, possibly because the applied stimulation perturbs an inherent phase lag between the left and right auditory cortex. We also observed this disruptive effect when applying antiphasic delta tACS (interhemispheric lag 180°). We conclude that interhemispheric phase coupling plays a functional role in interhemispheric speech integration. The direction of this effect may depend on the stimulation frequency.
7. Vanbellingen T, Schumacher R, Eggenberger N, Hopfner S, Cazzoli D, Preisig BC, Bertschi M, Nyffeler T, Gutbrod K, Bassetti CL, Bohlhalter S, Müri RM. Different visual exploration of tool-related gestures in left hemisphere brain damaged patients is associated with poor gestural imitation. Neuropsychologia 2015; 71:158-164. PMID: 25841335; DOI: 10.1016/j.neuropsychologia.2015.04.001.
Abstract
According to the direct matching hypothesis, perceived movements automatically activate existing motor components through matching of the perceived gesture and its execution. The aim of the present study was to test the direct matching hypothesis by assessing whether visual exploration behavior correlates with deficits in gestural imitation in left hemisphere damaged (LHD) patients. Eighteen LHD patients and twenty healthy control subjects took part in the study. Gesture imitation performance was measured by the test for upper limb apraxia (TULIA). Visual exploration behavior was measured by an infrared eye-tracking system. Short videos including forty gestures (20 meaningless and 20 communicative gestures) were presented. Cumulative fixation duration was measured in different regions of interest (ROIs), namely the face, the gesturing hand, the body, and the surrounding environment. Compared to healthy subjects, patients fixated the ROIs comprising the face and the gesturing hand significantly less during the exploration of emblematic and tool-related gestures. Moreover, visual exploration of tool-related gestures significantly correlated with tool-related imitation as measured by TULIA in LHD patients. Patients and controls did not differ in the visual exploration of meaningless gestures, and no significant relationships were found between visual exploration behavior and the imitation of emblematic and meaningless gestures in TULIA. The present study thus suggests that altered visual exploration may lead to disturbed imitation of tool-related gestures, but not of emblematic and meaningless gestures. Consequently, our findings partially support the direct matching hypothesis.
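The core dependent measure, cumulative fixation duration per region of interest, reduces to a simple aggregation over fixations. A minimal sketch follows, assuming fixations are given as (x, y, duration) tuples and ROIs as rectangles; the ROI names and coordinates are hypothetical.

```python
def cumulative_fixation_duration(fixations, rois):
    """Sum fixation durations (ms) per ROI.

    fixations: iterable of (x, y, duration_ms) tuples.
    rois: dict mapping ROI name -> (x_min, y_min, x_max, y_max) rectangle.
    """
    totals = {name: 0.0 for name in rois}
    for x, y, dur in fixations:
        for name, (x0, y0, x1, y1) in rois.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                totals[name] += dur
                break  # assign each fixation to at most one ROI
    return totals

# Hypothetical usage:
rois = {"face": (300, 0, 500, 200), "hand": (200, 400, 600, 700)}
fixations = [(350, 120, 240.0), (410, 550, 180.0), (10, 10, 90.0)]
print(cumulative_fixation_duration(fixations, rois))
```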
8. Preisig BC, Sjerps MJ. Hemispheric specializations affect interhemispheric speech sound integration during duplex perception. J Acoust Soc Am 2019; 145:EL190. PMID: 31067965; DOI: 10.1121/1.5092829.
Abstract
The present study investigated whether speech-related spectral information benefits from initially predominant right- or left-hemisphere processing. Normal-hearing individuals categorized speech sounds composed of an ambiguous base (perceptually intermediate between /ga/ and /da/), presented to one ear, and a disambiguating low or high F3 chirp, presented to the other ear. Shorter response times were found when the chirp was presented to the left ear than to the right ear (inducing initially right-hemisphere chirp processing), but there were no between-ear differences in the strength of overall integration. The results are in line with the assumption of a right-hemispheric dominance for spectral processing.
9. Preisig BC, Eggenberger N, Zito G, Vanbellingen T, Schumacher R, Hopfner S, Gutbrod K, Nyffeler T, Cazzoli D, Annoni JM, Bohlhalter S, Müri RM. Eye Gaze Behavior at Turn Transition: How Aphasic Patients Process Speakers' Turns during Video Observation. J Cogn Neurosci 2016; 28:1613-1624. PMID: 27243612; DOI: 10.1162/jocn_a_00983.
Abstract
The human turn-taking system regulates the smooth and precise exchange of speaking turns during face-to-face interaction. Recent studies investigated the processing of ongoing turns during conversation by measuring the eye movements of noninvolved observers. The findings suggest that humans shift their gaze toward the next speaker in anticipation, before the start of the next turn. Moreover, there is evidence that the ability to detect turn transitions in time relies mainly on the lexico-syntactic content of the conversation. Consequently, patients with aphasia, who often experience deficits in both semantic and syntactic processing, might have difficulties detecting turn transitions and shifting their gaze in time. To test this assumption, we presented video vignettes of natural conversations to aphasic patients and healthy controls while their eye movements were measured. The frequency and latency of event-related gaze shifts, with respect to the end of the current turn in the videos, were compared between the two groups. Our results suggest that, compared with healthy controls, aphasic patients have a reduced probability of shifting their gaze at turn transitions but do not show significantly increased gaze shift latencies. In healthy controls, but not in aphasic patients, the probability of shifting the gaze at a turn transition was increased when the video content of the current turn had a higher lexico-syntactic complexity. Furthermore, the results from voxel-based lesion-symptom mapping indicate that the association between lexico-syntactic complexity and gaze shift latency in aphasic patients is predicted by brain lesions located in the posterior branch of the left arcuate fasciculus. Higher lexico-syntactic processing demands seem to lead to a reduced gaze shift probability in aphasic patients. This finding may reflect missed opportunities for patients to place their contributions during everyday conversation.
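The latency measure, the timing of a gaze shift relative to the end of the current turn, can be made concrete with a small helper. This is a sketch under stated assumptions: gaze-shift onsets and the turn end are in milliseconds, and the analysis window around the turn end is a hypothetical choice, not the study's.

```python
def gaze_shift_latency(shift_onsets_ms, turn_end_ms, window_ms=(-1500.0, 1500.0)):
    """Latency (ms) of the first gaze shift within a window around the turn end.

    Negative latency = anticipatory shift before the turn ended.
    Returns None if no shift falls in the window (a missed transition).
    """
    lo = turn_end_ms + window_ms[0]
    hi = turn_end_ms + window_ms[1]
    in_window = [t for t in shift_onsets_ms if lo <= t <= hi]
    return min(in_window) - turn_end_ms if in_window else None

# Hypothetical usage: a shift 200 ms before the turn end yields -200.0.
print(gaze_shift_latency([3800.0, 5200.0], turn_end_ms=4000.0))
```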
10. Preisig BC, Riecke L, Hervais-Adelman A. Speech sound categorization: The contribution of non-auditory and auditory cortical regions. Neuroimage 2022; 258:119375. PMID: 35700949; DOI: 10.1016/j.neuroimage.2022.119375.
Abstract
Which processes in the human brain lead to the categorical perception of speech sounds? Investigation of this question is hampered by the fact that categorical speech perception is normally confounded by acoustic differences in the stimulus. By using ambiguous sounds, however, it is possible to dissociate acoustic from perceptual stimulus representations. Twenty-seven normally hearing individuals took part in an fMRI study in which they were presented with an ambiguous syllable (intermediate between /da/ and /ga/) in one ear and with a disambiguating acoustic feature (the third formant, F3) in the other ear. Multi-voxel pattern searchlight analysis was used to identify brain areas that consistently differentiated between response patterns associated with different syllable reports. By comparing responses to different stimuli with identical syllable reports, and to identical stimuli with different syllable reports, we disambiguated whether these regions primarily differentiated the acoustics of the stimuli or the syllable report. We found that BOLD activity patterns in left perisylvian regions (STG, SMG), left inferior frontal regions (vMC, IFG, AI), left supplementary motor cortex (SMA/pre-SMA), and right motor and somatosensory regions (M1/S1) represent listeners' syllable reports irrespective of stimulus acoustics. Most of these regions lie outside what is traditionally regarded as auditory or phonological processing areas. Our results indicate that the process of speech sound categorization implicates decision-making mechanisms and auditory-motor transformations.
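Searchlight analysis itself follows a simple recipe: slide a small neighborhood over the brain and ask, at each position, how well a classifier decodes the condition of interest from the local activity pattern. The sketch below is not the authors' pipeline; it assumes trial-wise beta images in a 4-D array (x, y, z, trials) with one syllable label per trial, and uses a cubic neighborhood as a cheap stand-in for a spherical searchlight.

```python
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

def searchlight_accuracy(betas, labels, mask, radius=2):
    """Cross-validated decoding accuracy of the syllable report per voxel.

    betas: 4-D array (x, y, z, n_trials) of trial-wise activity estimates.
    labels: length-n_trials array of syllable reports (e.g., 'da' vs. 'ga').
    mask: 3-D boolean brain mask; accuracy is computed at voxels inside it.
    """
    acc = np.full(mask.shape, np.nan)
    for x, y, z in np.argwhere(mask):
        patch = betas[max(x - radius, 0):x + radius + 1,
                      max(y - radius, 0):y + radius + 1,
                      max(z - radius, 0):z + radius + 1, :]
        X = patch.reshape(-1, patch.shape[-1]).T  # trials x local voxels
        acc[x, y, z] = cross_val_score(LinearSVC(), X, labels, cv=5).mean()
    return acc
```

This brute-force loop is slow but transparent; optimized searchlight implementations (e.g., nilearn's SearchLight) exist for real data.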
11. van Nispen K, Sekine K, van der Meulen I, Preisig BC. Gesture in the eye of the beholder: An eye-tracking study on factors determining the attention for gestures produced by people with aphasia. Neuropsychologia 2022; 174:108315. DOI: 10.1016/j.neuropsychologia.2022.108315.
12. Preisig BC, Meyer M. Predictive coding and dimension-selective attention enhance the lateralization of spoken language processing. Neurosci Biobehav Rev 2025; 172:106111. PMID: 40118260; DOI: 10.1016/j.neubiorev.2025.106111.
Abstract
Hemispheric lateralization in speech and language processing exemplifies functional brain specialization. Seminal work in patients with left hemisphere damage highlighted the left-hemispheric dominance in language functions. However, speech processing is not confined to the left hemisphere. Hence, some researchers associate lateralization with auditory processing asymmetries: slow temporal and fine spectral acoustic information is preferentially processed in right auditory regions, while faster temporal information is primarily handled by left auditory regions. Other scholars posit that lateralization relates more to linguistic processing, particularly for speech and speech-like stimuli. We argue that these seemingly distinct accounts are interdependent. Linguistic analysis of speech relies on top-down processes, such as predictive coding and dimension-selective auditory attention, which enhance lateralized processing by engaging left-lateralized sensorimotor networks. Our review highlights that lateralization is weaker for simple sounds, stronger for speech-like sounds, and strongest for meaningful speech. Evidence shows that predictive speech processing and selective attention enhance lateralization. We illustrate that these top-down processes rely on left-lateralized sensorimotor networks and provide insights into the role of these networks in speech processing.