1
Fiorini L, Di Russo F, Lucia S, Bianco V. Modality predictability modulation confirms the sensorial readiness function of the pre-stimulus activity in sensory brain areas. Cortex 2023; 159:193-204. PMID: 36640619; DOI: 10.1016/j.cortex.2022.12.008
Abstract
The auditory Positivity (aP) and the visual Negativity (vN) are recently discovered modality-specific event-related potential (ERP) components associated with sensory readiness, and they seem promising tools for studying anticipatory perception and attention. However, a crucial question about these waves remains open: it is still unclear whether they are genuinely related to sensory readiness or instead reflect stimulus predictability, since earlier studies observed them only in tasks where stimuli were repeatedly presented in a single sensory modality. To disentangle this issue, we used an experimental design consisting of three passive tasks: a unimodal auditory condition, a unimodal visual condition, and an intermodal condition in which the visual and auditory stimuli alternated unpredictably. We then compared the amplitudes of the aP and vN across the three conditions and performed correlation analyses between pre-stimulus and post-stimulus components. Crucially, the components still occurred in the intermodal condition, but with reduced amplitudes relative to the unimodal conditions, indicating that they are only partially task-dependent and that expectancy may modulate them. This result is in line with the "modality-shift effect" cost phenomenon, which can occur even in passive tasks and before stimulus presentation. In addition, the amplitudes of the post-stimulus components correlated with the pre-stimulus ERPs. Collectively, the present study confirms that the aP and vN reflect sensory readiness processes that "boost" the post-stimulus auditory N1 and visual P1 components.
Affiliation(s)
- Linda Fiorini
- Dept. of Movement, Human and Health Sciences, University of Rome "Foro Italico", Rome, Italy; IMT School for Advanced Studies Lucca, Lucca, Italy.
- Francesco Di Russo
- Dept. of Movement, Human and Health Sciences, University of Rome "Foro Italico", Rome, Italy; IRCCS Fondazione Santa Lucia, Rome, Italy
- Stefania Lucia
- Dept. of Movement, Human and Health Sciences, University of Rome "Foro Italico", Rome, Italy
- Valentina Bianco
- Laboratory of Cognitive Neuroscience, Department of Languages and Literatures, Communication, Education and Society, University of Udine, Udine, Italy
2
Setti W, Cuturi LF, Cocchi E, Gori M. Spatial Memory and Blindness: The Role of Visual Loss on the Exploration and Memorization of Spatialized Sounds. Front Psychol 2022; 13:784188. PMID: 35686077; PMCID: PMC9171105; DOI: 10.3389/fpsyg.2022.784188
Abstract
Spatial memory relies on the encoding, storage, and retrieval of knowledge about objects’ positions in the surrounding environment. Blind people have to rely on sensory modalities other than vision to memorize items that are spatially displaced; however, to date, very little is known about the influence of early visual deprivation on a person’s ability to remember and process sound locations. To fill this gap, we tested sighted and congenitally blind adults and adolescents in an audio-spatial memory task inspired by the classical card game “Memory.” In this task, subjects (blind, n = 12; sighted, n = 12) had to find pairs among sounds (i.e., animal calls) presented on an audio-tactile device composed of loudspeakers covered by tactile sensors. To accomplish this, participants had to remember the positions of the spatialized sounds and develop a proper mental spatial representation of their locations. The test comprised two experimental conditions of increasing difficulty, depending on the number of sounds to be remembered (8 vs. 24). Results showed that sighted participants outperformed blind participants in both conditions. The findings are discussed in light of the crucial role of visual experience in manipulating auditory spatial representations, particularly in relation to the ability to explore complex acoustic configurations.
Affiliation(s)
- Walter Setti
- Unit for Visually Impaired People (U-VIP), Italian Institute of Technology, Genoa, Italy
- Luigi F. Cuturi
- Unit for Visually Impaired People (U-VIP), Italian Institute of Technology, Genoa, Italy
- Monica Gori
- Unit for Visually Impaired People (U-VIP), Italian Institute of Technology, Genoa, Italy
3
Multisensory stimuli shift perceptual priors to facilitate rapid behavior. Sci Rep 2021; 11:23052. PMID: 34845325; PMCID: PMC8629992; DOI: 10.1038/s41598-021-02566-8
Abstract
Multisensory stimuli speed behavioral responses, but the mechanisms subserving these effects remain disputed. Historically, the observation that multisensory reaction times (RTs) outpace models assuming independent sensory channels has been taken as evidence for multisensory integration (the "redundant target effect"; RTE). However, this interpretation has been challenged by alternative explanations based on stimulus sequence effects, RT variability, and/or negative correlations in unisensory processing. To clarify the mechanisms subserving the RTE, we collected RTs from 78 undergraduates in a multisensory simple RT task. Based on previous neurophysiological findings, we hypothesized that the RTE was unlikely to reflect these alternative mechanisms, and more likely reflected pre-potentiation of sensory responses through crossmodal phase-resetting. Contrary to accounts based on stimulus sequence effects, we found that preceding stimuli explained only 3-9% of the variance in apparent RTEs. Comparing three plausible evidence accumulator models, we found that multisensory RT distributions were best explained by increased sensory evidence at stimulus onset. Because crossmodal phase-resetting increases cortical excitability before sensory input arrives, these results are consistent with a mechanism based on pre-potentiation through phase-resetting. Mathematically, this model entails increasing the prior log-odds of stimulus presence, providing a potential link between neurophysiological, behavioral, and computational accounts of multisensory interactions.
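The winning model in this abstract, increased sensory evidence at stimulus onset, amounts to raising the prior log-odds of stimulus presence before evidence arrives. A minimal accumulator sketch of that idea follows; all parameter values are illustrative assumptions, not values from the study:

```python
import random

def simulate_rt(start_logodds, drift=0.08, threshold=3.0, noise=0.5, rng=None):
    """First-passage time (in time steps) of a noisy evidence accumulator.

    start_logodds -- prior log-odds of stimulus presence at onset; the
    multisensory condition is modeled purely as a higher starting point,
    i.e. the accumulator begins closer to the decision threshold.
    """
    rng = rng or random.Random()
    evidence = start_logodds
    t = 0
    while evidence < threshold:
        evidence += drift + rng.gauss(0.0, noise)  # noisy sensory evidence
        t += 1
    return t

def mean_rt(start_logodds, n=2000, seed=1):
    rng = random.Random(seed)
    return sum(simulate_rt(start_logodds, rng=rng) for _ in range(n)) / n

# Raising the prior log-odds at stimulus onset (multisensory pre-potentiation)
# shortens the mean first-passage time, i.e. speeds the simulated RTs.
rt_uni = mean_rt(start_logodds=0.0)
rt_multi = mean_rt(start_logodds=1.0)
assert rt_multi < rt_uni
```

With everything else held fixed, only the starting point differs between conditions, which is exactly the "increased sensory evidence at stimulus onset" mechanism the abstract describes.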
4
Shiroshita Y, Kirimoto H, Watanabe T, Yunoki K, Sobue I. Event-related potentials evoked by skin puncture reflect activation of Aβ fibers: comparison with intraepidermal and transcutaneous electrical stimulations. PeerJ 2021; 9:e12250. PMID: 34707936; PMCID: PMC8504465; DOI: 10.7717/peerj.12250
Abstract
Background Recently, event-related potentials (ERPs) evoked by skin puncture, commonly used for blood sampling, have received attention as a pain assessment tool in neonates. However, their latency appears to be far shorter than the latency of ERPs evoked by intraepidermal electrical stimulation (IES), which selectively activates nociceptive Aδ and C fibers. To clarify this important issue, we examined whether ERPs evoked by skin puncture appropriately reflect central nociceptive processing, as is the case with IES. Methods In Experiment 1, we recorded evoked potentials to the click sound produced by a lance device (click-only), lance stimulation with the click sound (click+lance), or lance stimulation with white noise (WN+lance) in eight healthy adults to investigate the effect of the click sound on the ERP evoked by skin puncture. In Experiment 2, we tested 18 healthy adults and recorded evoked potentials to shallow lance stimulation (SL) with a blade that did not reach the dermis (0.1 mm insertion depth); normal lance stimulation (CL) (1 mm depth); transcutaneous electrical stimulation (ES), which mainly activates Aβ fibers; and IES, which selectively activates Aδ fibers when low stimulation current intensities are applied. White noise was presented continuously during the experiments. The stimulations were applied to the hand dorsum. In the SL, the lance device did not touch the skin and the blade was inserted to a depth of 0.1 mm into the epidermis, where the free nerve endings of Aδ fibers are located, which minimized the tactile sensation caused by the device touching the skin and the activation of Aβ fibers by the blade reaching the dermis. In the CL, as in clinical use, the lance device touched the skin and the blade reached a depth of 1 mm from the skin surface, i.e., the depth of the dermis at which the Aβ fibers are located.
Results The ERP N2 latencies for click-only (122 ± 2.9 ms) and click+lance (121 ± 6.5 ms) were significantly shorter than that for WN+lance (154 ± 7.1 ms). The ERP P2 latency for click-only (191 ± 11.3 ms) was significantly shorter than those for click+lance (249 ± 18.6 ms) and WN+lance (253 ± 11.2 ms). This suggests that the click sound shortens the N2 latency of the ERP evoked by skin puncture. The ERP N2 latencies for SL, CL, ES, and IES were 146 ± 8.3, 149 ± 9.9, 148 ± 13.1, and 197 ± 21.2 ms, respectively. The ERP P2 latencies were 250 ± 18.2, 251 ± 14.1, 237 ± 26.3, and 294 ± 30.0 ms, respectively. The ERP latency for SL was significantly shorter than that for IES and was similar to that for ES. This suggests that the penetration force generated by the blade of the lance device activates the Aβ fibers, consequently shortening the ERP latency. Conclusions Lance ERP may reflect the activation of Aβ fibers rather than Aδ fibers. A pain index that correctly and reliably reflects nociceptive processing must be developed to improve pain assessment and management in neonates.
Affiliation(s)
- Yui Shiroshita
- Department of Nursing Science, Graduate School of Biomedical and Health Sciences, Hiroshima University, Hiroshima, Japan
- Hikari Kirimoto
- Department of Sensorimotor Neuroscience, Graduate School of Biomedical and Health Sciences, Hiroshima University, Hiroshima, Japan
- Tatsunori Watanabe
- Department of Sensorimotor Neuroscience, Graduate School of Biomedical and Health Sciences, Hiroshima University, Hiroshima, Japan
- Keisuke Yunoki
- Department of Sensorimotor Neuroscience, Graduate School of Biomedical and Health Sciences, Hiroshima University, Hiroshima, Japan
- Ikuko Sobue
- Department of Nursing Science, Graduate School of Biomedical and Health Sciences, Hiroshima University, Hiroshima, Japan
5
Fiorini L, Berchicci M, Mussini E, Bianco V, Lucia S, Di Russo F. Neural Basis of Anticipatory Multisensory Integration. Brain Sci 2021; 11:843. PMID: 34201992; PMCID: PMC8301880; DOI: 10.3390/brainsci11070843
Abstract
The brain is able to combine different sensory information to enhance the perception of salient events, yielding a unified perceptual experience of multisensory events. Multisensory integration has been widely studied, and the literature supports the hypothesis that it can occur across various stages of stimulus processing, under both bottom-up and top-down control. However, evidence for anticipatory multisensory integration, occurring in the foreperiod preceding the expected stimulus in passive tasks, is missing. By means of event-related potentials (ERPs), it has recently been proposed that unimodal visual and auditory stimulations are preceded by sensory-specific readiness activities. Accordingly, in the present study, we tested for multisensory integration in the endogenous anticipatory phase of sensory processing by combining visual and auditory stimuli in unimodal and multimodal passive ERP paradigms. Results showed that the modality-specific pre-stimulus ERP components (i.e., the auditory positivity, aP, and the visual negativity, vN) started earlier and were larger during multimodal stimulation than the sum of the ERPs elicited by the unimodal stimulations. The same amplitude effect was also present for the early auditory N1 and visual P1 components. This anticipatory multisensory effect thus seems to boost the magnitude of early stimulus processing, paving the way for new perspectives on the neural basis of multisensory integration.
Affiliation(s)
- Linda Fiorini
- Department of Movement, Human and Health Sciences, University of Rome “Foro Italico”, 00135 Rome, Italy
- Department of Psychology, University of Rome “La Sapienza”, 00185 Rome, Italy
- Marika Berchicci
- Department of Movement, Human and Health Sciences, University of Rome “Foro Italico”, 00135 Rome, Italy
- University “G. d’Annunzio” of Chieti-Pescara, 66100 Chieti, Italy
- Elena Mussini
- Department of Movement, Human and Health Sciences, University of Rome “Foro Italico”, 00135 Rome, Italy
- University “G. d’Annunzio” of Chieti-Pescara, 66100 Chieti, Italy
- Valentina Bianco
- Department of Movement, Human and Health Sciences, University of Rome “Foro Italico”, 00135 Rome, Italy
- Department of Languages and Literatures, Communication, Education and Society, University of Udine, 33100 Udine, Italy
- Stefania Lucia
- Department of Movement, Human and Health Sciences, University of Rome “Foro Italico”, 00135 Rome, Italy
- Francesco Di Russo
- Department of Movement, Human and Health Sciences, University of Rome “Foro Italico”, 00135 Rome, Italy
- IRCCS Fondazione Santa Lucia, 00179 Rome, Italy
6
Shaw LH, Freedman EG, Crosse MJ, Nicholas E, Chen AM, Braiman MS, Molholm S, Foxe JJ. Operating in a Multisensory Context: Assessing the Interplay Between Multisensory Reaction Time Facilitation and Inter-sensory Task-switching Effects. Neuroscience 2020; 436:122-135. PMID: 32325100; DOI: 10.1016/j.neuroscience.2020.04.013
Abstract
Individuals respond faster to presentations of bisensory stimuli (e.g. audio-visual targets) than to presentations of either unisensory constituent in isolation (i.e. to the auditory-alone or visual-alone components of an audio-visual stimulus). This well-established multisensory speeding effect, termed the redundant signals effect (RSE), is not predicted by simple linear summation of the unisensory response time probability distributions. Rather, the speeding is typically faster than this prediction, leading researchers to ascribe the RSE to a so-called co-activation account. According to this account, multisensory neural processing occurs whereby the unisensory inputs are integrated to produce more effective sensory-motor activation. However, the typical paradigm used to test for RSE involves random sequencing of unisensory and bisensory inputs in a mixed design, raising the possibility of an alternate attention-switching account. This intermixed design requires participants to switch between sensory modalities on many task trials (e.g. from responding to a visual stimulus to an auditory stimulus). Here we show that much, if not all, of the RSE under this paradigm can be attributed to slowing of reaction times to unisensory stimuli resulting from modality switching, and is not in fact due to speeding of responses to AV stimuli. As such, the present data do not support a co-activation account, but rather suggest that switching and mixing costs akin to those observed during classic task-switching paradigms account for the observed RSE.
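The independent-channels benchmark at issue in this abstract is typically evaluated with Miller's (1982) race-model inequality, which bounds the bisensory RT CDF by the sum of the unisensory CDFs. A minimal sketch on made-up RT samples; the data below are illustrative, not taken from the study:

```python
def ecdf(sample, t):
    """Empirical CDF of an RT sample evaluated at time t (ms)."""
    return sum(rt <= t for rt in sample) / len(sample)

def race_violation(rt_av, rt_a, rt_v, times):
    """Largest excess of F_AV(t) over min(1, F_A(t) + F_V(t)).

    A positive value violates the race-model inequality: no race between
    independent unisensory channels can produce a bisensory distribution
    that fast, so co-activation is suspected.
    """
    return max(ecdf(rt_av, t) - min(1.0, ecdf(rt_a, t) + ecdf(rt_v, t))
               for t in times)

# Hypothetical RT samples (ms); the bisensory condition is fast enough in
# its short tail to exceed the race-model bound.
rt_a = [310, 330, 350, 370, 390, 410]
rt_v = [320, 340, 360, 380, 400, 420]
rt_av = [240, 250, 260, 300, 320, 340]
times = range(200, 450, 10)
print(race_violation(rt_av, rt_a, rt_v, times) > 0)  # prints True
```

The authors' point is that such a test is only meaningful once modality-switch slowing of the unisensory RTs has been accounted for, since inflated unisensory CDFs make the bound artificially easy to exceed.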
Affiliation(s)
- Luke H Shaw
- The Cognitive Neurophysiology Laboratory, The Del Monte Institute for Neuroscience, Department of Neuroscience, University of Rochester School of Medicine and Dentistry, Rochester, NY 14642, USA
- Edward G Freedman
- The Cognitive Neurophysiology Laboratory, The Del Monte Institute for Neuroscience, Department of Neuroscience, University of Rochester School of Medicine and Dentistry, Rochester, NY 14642, USA
- Michael J Crosse
- The Cognitive Neurophysiology Laboratory, Department of Pediatrics & Neuroscience, Albert Einstein College of Medicine & Montefiore Medical Center, Bronx, NY 10461, USA
- Eric Nicholas
- The Cognitive Neurophysiology Laboratory, The Del Monte Institute for Neuroscience, Department of Neuroscience, University of Rochester School of Medicine and Dentistry, Rochester, NY 14642, USA
- Allen M Chen
- The Cognitive Neurophysiology Laboratory, The Del Monte Institute for Neuroscience, Department of Neuroscience, University of Rochester School of Medicine and Dentistry, Rochester, NY 14642, USA
- Matthew S Braiman
- The Cognitive Neurophysiology Laboratory, The Del Monte Institute for Neuroscience, Department of Neuroscience, University of Rochester School of Medicine and Dentistry, Rochester, NY 14642, USA
- Sophie Molholm
- The Cognitive Neurophysiology Laboratory, The Del Monte Institute for Neuroscience, Department of Neuroscience, University of Rochester School of Medicine and Dentistry, Rochester, NY 14642, USA; The Cognitive Neurophysiology Laboratory, Department of Pediatrics & Neuroscience, Albert Einstein College of Medicine & Montefiore Medical Center, Bronx, NY 10461, USA
- John J Foxe
- The Cognitive Neurophysiology Laboratory, The Del Monte Institute for Neuroscience, Department of Neuroscience, University of Rochester School of Medicine and Dentistry, Rochester, NY 14642, USA; The Cognitive Neurophysiology Laboratory, Department of Pediatrics & Neuroscience, Albert Einstein College of Medicine & Montefiore Medical Center, Bronx, NY 10461, USA
7
Keller AS, Payne L, Sekuler R. Characterizing the roles of alpha and theta oscillations in multisensory attention. Neuropsychologia 2017; 99:48-63. PMID: 28259771; DOI: 10.1016/j.neuropsychologia.2017.02.021
Abstract
Cortical alpha oscillations (8-13 Hz) appear to play a role in suppressing distractions when just one sensory modality is being attended, but do they also contribute when attention is distributed over multiple sensory modalities? For an answer, we examined cortical oscillations in human subjects dividing attention between auditory and visual sequences. In Experiment 1, subjects performed an oddball task with auditory, visual, or simultaneous audiovisual sequences in separate blocks, while the electroencephalogram was recorded using high-density scalp electrodes. Alpha oscillations were present continuously over posterior regions while subjects attended to auditory sequences, supporting the idea that the brain suppresses processing of visual input in order to favor auditory processing. In the divided-attention audiovisual condition, an oddball (a rare, unusual stimulus) could occur in either the auditory or the visual domain, requiring that attention be divided between the two modalities. Fronto-central theta-band (4-7 Hz) activity was strongest in this audiovisual condition, when subjects monitored auditory and visual sequences simultaneously. Theta oscillations have been associated with both attention and short-term memory. Experiment 2 sought to distinguish these possible roles of fronto-central theta activity during multisensory divided attention. Using a modified version of the oddball task from Experiment 1, Experiment 2 showed that differences in theta power among conditions were independent of short-term memory load. Having ruled out theta's association with short-term memory load, we conclude that fronto-central theta activity is likely a marker of multisensory divided attention.
Affiliation(s)
- Arielle S Keller
- Volen Center for Complex Systems, Brandeis University, 415 South Street, Waltham MA 02453, USA.
- Lisa Payne
- Swarthmore College, 500 College Ave, Swarthmore PA 19081, USA.
- Robert Sekuler
- Volen Center for Complex Systems, Brandeis University, 415 South Street, Waltham MA 02453, USA.
8
Juan C, Cappe C, Alric B, Roby B, Gilardeau S, Barone P, Girard P. The variability of multisensory processes of natural stimuli in human and non-human primates in a detection task. PLoS One 2017; 12:e0172480. PMID: 28212416; PMCID: PMC5315309; DOI: 10.1371/journal.pone.0172480
Abstract
Background Behavioral studies in both humans and animals generally converge on the dogma that multisensory integration improves reaction times (RTs) in comparison to unimodal stimulation. These multisensory effects depend on diverse conditions, among which spatial and temporal congruence are the most studied. Moreover, most studies use relatively simple stimuli, whereas in everyday life we are confronted with a large variety of complex stimulations that constantly change our attentional focus over time, a modality switch that can impact stimulus detection. In the present study, we examined potential sources of the variability in reaction times and multisensory gains with respect to the intrinsic features of a large set of natural stimuli. Methodology/Principal findings Rhesus macaque monkeys and human subjects performed a simple audio-visual stimulus detection task in which a large collection of unimodal and bimodal natural stimuli with semantic specificities was presented at different saliencies. Although we reproduced the well-established redundant signal effect, we failed to reveal a systematic violation of the race model, which is considered to demonstrate multisensory integration. In both monkeys and humans, our study revealed a large range of multisensory gains, with both negative and positive values. While modality switching has clear effects on reaction times, one of the main causes of the variability in multisensory gains appeared to be the intrinsic physical parameters of the stimuli. Conclusion/Significance Based on the variability of multisensory benefits, our results suggest that the neuronal mechanisms responsible for the redundant signal effect (interactions vs. integration) depend strongly on stimulus complexity, suggesting different contributions of uni- and multisensory brain regions. Further, in a simple detection task, the semantic value of individual stimuli had no significant impact on task performance, although such effects are probably present in more cognitive tasks.
Affiliation(s)
- Cécile Juan
- Cerco, CNRS UMR 5549, Toulouse, France
- Université de Toulouse, UPS, Centre de Recherche Cerveau et Cognition, Toulouse, France
- Céline Cappe
- Cerco, CNRS UMR 5549, Toulouse, France
- Université de Toulouse, UPS, Centre de Recherche Cerveau et Cognition, Toulouse, France
- Baptiste Alric
- Cerco, CNRS UMR 5549, Toulouse, France
- Université de Toulouse, UPS, Centre de Recherche Cerveau et Cognition, Toulouse, France
- Benoit Roby
- Cerco, CNRS UMR 5549, Toulouse, France
- Université de Toulouse, UPS, Centre de Recherche Cerveau et Cognition, Toulouse, France
- Sophie Gilardeau
- Cerco, CNRS UMR 5549, Toulouse, France
- Université de Toulouse, UPS, Centre de Recherche Cerveau et Cognition, Toulouse, France
- Pascal Barone
- Cerco, CNRS UMR 5549, Toulouse, France
- Université de Toulouse, UPS, Centre de Recherche Cerveau et Cognition, Toulouse, France
- Pascal Girard
- Cerco, CNRS UMR 5549, Toulouse, France
- Université de Toulouse, UPS, Centre de Recherche Cerveau et Cognition, Toulouse, France
- INSERM, Toulouse, France
9
De Meo R, Murray MM, Clarke S, Matusz PJ. Top-down control and early multisensory processes: chicken vs. egg. Front Integr Neurosci 2015; 9:17. PMID: 25784863; PMCID: PMC4347447; DOI: 10.3389/fnint.2015.00017
Affiliation(s)
- Rosanna De Meo
- Neuropsychology and Neurorehabilitation Service, Centre Hospitalier Universitaire Vaudois and University of Lausanne, Lausanne, Switzerland
- Micah M Murray
- Neuropsychology and Neurorehabilitation Service, Centre Hospitalier Universitaire Vaudois and University of Lausanne, Lausanne, Switzerland; Electroencephalography Brain Mapping Core, Center for Biomedical Imaging (CIBM), Lausanne and Geneva, Switzerland
- Stephanie Clarke
- Neuropsychology and Neurorehabilitation Service, Centre Hospitalier Universitaire Vaudois and University of Lausanne, Lausanne, Switzerland
- Pawel J Matusz
- The Laboratory for Investigative Neurophysiology, Neuropsychology and Neurorehabilitation Service and Department of Radiology, Centre Hospitalier Universitaire Vaudois and University of Lausanne, Lausanne, Switzerland; Faculty in Wroclaw, University of Social Sciences and Humanities, Wroclaw, Poland; Attention, Brain and Cognitive Development Group, Department of Experimental Psychology, University of Oxford, Oxford, UK
10
Smolka E, Gondan M, Rösler F. Take a stand on understanding: electrophysiological evidence for stem access in German complex verbs. Front Hum Neurosci 2015; 9:62. PMID: 25767442; PMCID: PMC4341544; DOI: 10.3389/fnhum.2015.00062
Abstract
The lexical representation of complex words in Indo-European languages is generally assumed to depend on semantic compositionality. This study investigated whether semantically compositional and noncompositional derivations are accessed via their constituent units or as whole words. In an overt visual priming experiment (300 ms stimulus onset asynchrony, SOA), event-related potentials (ERPs) were recorded for verbs (e.g., ziehen, “pull”) that were preceded by purely semantically related verbs (e.g., zerren, “drag”), by morphologically related and semantically compositional verbs (e.g., zuziehen, “pull together”), by morphologically related and semantically noncompositional verbs (e.g., erziehen, “educate”), by orthographically similar verbs (e.g., zielen, “aim”), or by unrelated verbs (e.g., tarnen, “mask”). Compared to the unrelated condition, which evoked an N400 effect with the largest amplitude at centro-parietal recording sites, the N400 was reduced in all other conditions. The rank order of N400 amplitudes turned out as follows: morphologically related and semantically compositional ≈ morphologically related and semantically noncompositional < purely semantically related < orthographically similar < unrelated. Surprisingly, morphologically related primes produced similar N400 modulations—irrespective of their semantic compositionality. The control conditions with orthographic similarity confirmed that these morphological effects were not the result of a simple form overlap between primes and targets. Our findings suggest that the lexical representation of German complex verbs refers to their base form, regardless of meaning compositionality. Theories of the lexical representation of German words need to incorporate this aspect of language processing in German.
Collapse
Affiliation(s)
- Eva Smolka
- Department of Linguistics, University of Konstanz, Konstanz, Germany
- Matthias Gondan
- Department of Psychology, University of Copenhagen, Copenhagen, Denmark
- Frank Rösler
- Biological Psychology and Neuropsychology, University of Hamburg, Hamburg, Germany
11
Miles E, Brown R, Poliakoff E. Investigating the nature and time-course of the modality shift effect between vision and touch. Q J Exp Psychol (Hove) 2011; 64:871-88. DOI: 10.1080/17470218.2010.514054
Abstract
It is well known that stimuli grab attention to their location, but do they also grab attention to their sensory modality? The modality shift effect (MSE), the observation that responding to a stimulus leads to reaction time benefits for subsequent stimuli in the same modality, suggests that this may be the case. If noninformative cue stimuli, which do not require a response, also led to benefits for their modality, this would suggest that the effect is automatic. We investigated the time-course of the visuotactile MSE and the difference between the effects of cues and targets. In Experiment 1, when visual and tactile tasks and stimulus locations were matched, uninformative cues did not lead to reaction time benefits for targets in the same modality. However, the modality of the previous target led to a significant MSE. Only stimuli that require a response, therefore, appear to produce reaction time benefits for their modality. In Experiment 2, increasing attention to the cue stimuli attenuated the effect of the previous target, but the cues still did not produce an MSE. In Experiment 3, an MSE was demonstrated between successive targets, and this effect decreased with increasing intertrial intervals. Overall, these studies demonstrate how cue- and target-induced effects interact and suggest that modalities do not automatically capture attention as locations do; rather, the MSE is more similar to other task repetition effects.
12
Collins J, Pecher D, Zeelenberg R, Coulson S. Modality Switching in a Property Verification Task: An ERP Study of What Happens When Candles Flicker after High Heels Click. Front Psychol 2011; 2:10. PMID: 21713128; PMCID: PMC3111443; DOI: 10.3389/fpsyg.2011.00010
Abstract
The perceptual modalities associated with property words, such as flicker or click, have previously been demonstrated to affect subsequent property verification judgments (Pecher et al., 2003). Known as the conceptual modality switch effect, this finding supports the claim that brain systems for perception and action help subserve the representation of concepts. The present study addressed the cognitive and neural substrate of this effect by recording event-related potentials (ERPs) as participants performed a property verification task with visual or auditory properties in key trials. We found that for visual property verifications, modality switching was associated with an increased amplitude N400. For auditory verifications, switching led to a larger late positive complex. Observed ERP effects of modality switching suggest property words access perceptual brain systems. Moreover, the timing and pattern of the effects suggest perceptual systems impact the decision-making stage in the verification of auditory properties, and the semantic stage in the verification of visual properties.
Affiliation(s)
- Jennifer Collins
- Brain and Cognition Lab, Department of Cognitive Science, University of California San Diego San Diego, CA, USA
13
Abstract
In divided-attention tasks with two classes of target stimuli, participants typically respond more quickly if both targets are presented simultaneously, as compared with single-target presentation (redundant-signals effect). Different explanations exist for this effect, including serial, parallel, and coactivation models of information processing. In two experiments, we investigated redundancy gains in simple and go/no-go responses to auditory-visual stimuli presented with an onset asynchrony. In Experiment 1, go/no-go discrimination was performed for near-threshold and suprathreshold stimuli. Response times in both the simple and go/no-go responses were well explained by a common coactivation model assuming linear superposition of modality-specific activation. In Experiment 2, the go/no-go task was made more difficult. Participants had to respond to high-frequency tones or right-tilted Gabor patches and to withhold their response for low tones and left-tilted Gabors. Redundancy gains were consistent with coactivation models; however, channel-specific buildup of evidence seems to occur at different speeds in the two tasks. Response times of 1 participant support a serial self-terminating model of modality-specific information processing. Supplemental materials for this article may be downloaded from http://app.psychonomic-journals.org/content/supplemental.
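The race model comparison described above is commonly operationalized with Miller's race model inequality, which bounds the redundant-condition reaction time (RT) distribution by the sum of the unisensory distributions: F_AV(t) ≤ F_A(t) + F_V(t). A violation of this bound argues against a parallel race and for coactivation. The sketch below is not taken from the cited paper; it is a minimal illustration of the standard test using empirical CDFs, with all function and variable names being assumptions.

```python
import numpy as np

def race_model_violation(rt_a, rt_v, rt_av, t_grid=None):
    """Test Miller's race model inequality: F_AV(t) <= F_A(t) + F_V(t).

    rt_a, rt_v, rt_av -- reaction times from the auditory-only,
    visual-only, and redundant (audiovisual) conditions.
    Returns the maximum violation over the time grid; a positive value
    means the inequality is violated, favoring coactivation models.
    """
    rt_a, rt_v, rt_av = (np.asarray(x, dtype=float) for x in (rt_a, rt_v, rt_av))
    if t_grid is None:
        # Evaluate at every observed RT
        t_grid = np.sort(np.concatenate([rt_a, rt_v, rt_av]))

    def ecdf(x, t):
        # Empirical CDF of sample x evaluated at the points in t
        return np.searchsorted(np.sort(x), t, side="right") / len(x)

    f_a, f_v, f_av = (ecdf(x, t_grid) for x in (rt_a, rt_v, rt_av))
    bound = np.minimum(f_a + f_v, 1.0)  # Miller's bound, capped at 1
    return float(np.max(f_av - bound))
```

For example, redundant-condition RTs that are uniformly faster than either unisensory distribution produce a positive violation, while redundant RTs no faster than the better unisensory condition stay at or below zero.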
14
The race model inequality for censored reaction time distributions. Atten Percept Psychophys 2010; 72:839-47. [DOI: 10.3758/app.72.3.839]
15
An exploratory event-related potential study of multisensory integration in sensory over-responsive children. Brain Res 2010; 1321:67-77. [PMID: 20097181 DOI: 10.1016/j.brainres.2010.01.043]
Abstract
Children who are over-responsive to sensation have defensive and "fight or flight" reactions to ordinary levels of sensory stimulation in the environment. Based on clinical observations, sensory over-responsivity is hypothesized to reflect atypical neural integration of sensory input. To examine a possible underlying neural mechanism of the disorder, integration of simultaneous multisensory auditory and somatosensory stimulation was studied in twenty children with sensory over-responsivity (SOR) using event-related potentials (ERPs). Three types of sensory stimuli were presented and ERPs were recorded from thirty-two scalp electrodes while participants watched a silent cartoon: bilateral auditory clicks, right somatosensory median nerve electrical pulses, or both simultaneously. The paradigm was passive; no behavioral responses were required. To examine integration, responses to simultaneous multisensory auditory-somatosensory stimulation were compared to the sum of unisensory auditory plus unisensory somatosensory responses in four time-windows (60-80 ms, 80-110 ms, 110-150 ms, and 180-220 ms). Specific midline and lateral electrode sites were examined over scalp regions where auditory-somatosensory integration was expected based on previous studies. Midline electrode sites (Fz, Cz, and Pz) showed significant integration during two time-windows: 60-80 ms and 180-220 ms. Significant integration was also found at the contralateral electrode site (C3) for the time-window between 180 and 220 ms. At ipsilateral electrode sites (C4 and CP6), no significant integration was found during any of the time-windows (i.e., the multisensory ERP was not significantly different from the summed unisensory ERP). These results demonstrate that multisensory integration (MSI) can be reliably measured in children with SOR and provide evidence that multisensory auditory-somatosensory input is integrated during both early and later stages of sensory information processing, mainly over fronto-central scalp regions.
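The additive-model comparison used in this abstract (multisensory response vs. the sum of the two unisensory responses, averaged within time windows) can be sketched as follows. This is a generic illustration, not the authors' analysis code; the function name, argument names, and window layout are assumptions.

```python
import numpy as np

def additive_model_difference(erp_av, erp_a, erp_s, times, windows):
    """Mean (multisensory - summed-unisensory) amplitude per time window.

    erp_av, erp_a, erp_s -- 1-D trial-averaged ERPs at one electrode for
    the simultaneous (AV), auditory-only (A), and somatosensory-only (S)
    conditions, sampled at the latencies in `times`.
    windows -- list of (start, end) pairs in the same units as `times`.
    A nonzero difference in a window is taken as evidence of nonlinear
    (non-additive) multisensory integration there.
    """
    summed = erp_a + erp_s
    result = {}
    for lo, hi in windows:
        mask = (times >= lo) & (times < hi)
        result[(lo, hi)] = float(np.mean(erp_av[mask] - summed[mask]))
    return result
```

In practice such differences are computed per participant and electrode and then tested statistically across the group within each window; the sketch only shows the core contrast.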
16
Costa MH, Tavares MC. Removing harmonic power line interference from biopotential signals in low cost acquisition systems. Comput Biol Med 2009; 39:519-26. [DOI: 10.1016/j.compbiomed.2009.03.004]
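A common baseline for the problem this entry addresses — power line interference at the fundamental and its harmonics in biopotential recordings — is a cascade of narrow IIR notch filters. The sketch below is such a generic baseline, not the adaptive method proposed in the cited paper; the function name and parameter defaults (50 Hz mains, three harmonics, Q = 30) are assumptions.

```python
import numpy as np
from scipy.signal import iirnotch, filtfilt

def remove_powerline_harmonics(x, fs, f0=50.0, n_harmonics=3, q=30.0):
    """Suppress power line interference at f0 and its harmonics.

    Applies a cascade of second-order IIR notch filters, one per
    harmonic below the Nyquist frequency, run zero-phase via filtfilt
    so ERP/biopotential waveform latencies are not distorted.
    """
    y = np.asarray(x, dtype=float)
    for k in range(1, n_harmonics + 1):
        fk = k * f0
        if fk >= fs / 2:  # skip harmonics at or above Nyquist
            break
        b, a = iirnotch(fk, q, fs=fs)
        y = filtfilt(b, a, y)
    return y
```

Fixed notches like these remove narrowband interference well but cannot track mains frequency drift, which is one motivation for the adaptive approaches studied for low-cost acquisition systems.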