101
Dewey JA, Carr TH. Is that what I wanted to do? Cued vocalizations influence the phenomenology of controlling a moving object. Conscious Cogn 2012; 21:507-25. [PMID: 22301454] [DOI: 10.1016/j.concog.2012.01.004]
Abstract
The phenomenology of controlled action depends on comparisons between predicted and actually perceived sensory feedback, called action-effects. We investigated whether intervening task-irrelevant but semantically related information influences the monitoring processes that give rise to a sense of control. Participants judged whether a moving box "obeyed" or "disobeyed" their own arrow keystrokes (Experiments 1 and 2) or visual cues representing the computer's choices (Experiment 3). During 1s delays between keystrokes/cues and box movements, participants vocalized directions ("up", "down", "left", or "right") cued by letters inside the box. Congruency of cued vocalizations was manipulated relative to previously selected keystrokes and upcoming box movements. In Experiment 1, reported obey moves and feelings of control reflected the true frequency of obey moves, but were also modulated by vocalizations. Incongruent vocalizations reduced reported obey moves, whereas congruent vocalizations increased them. In Experiment 2, vocalizations had stronger effects when their congruence with primary-task box movement was consistent for several consecutive moves before congruence changed. In Experiment 3, analogous impacts of vocalizations occurred when the computer selected the directions and participants judged whether the computer had control of the box. We conclude that predicted and perceived action-effects associated with semantically related but separate and ostensibly irrelevant actions can be conflated with one another. This interference is not restricted to actions performed with the same effector or within the same modality, or even by the same actor. Thus, in estimating degrees of control, the mind integrates across ongoing action systems, whether or not they are logically task-relevant.
Affiliation(s)
- John A Dewey
- Department of Psychology, Michigan State University, East Lansing, MI 48824, USA.
102
Top down influence on visuo-tactile interaction modulates neural oscillatory responses. Neuroimage 2012; 59:3406-17. [DOI: 10.1016/j.neuroimage.2011.11.076]
103
Abstract
We recently used an audiovisual illusion (Shams et al., 2000) during fast and accurate reaching movements and showed that susceptibility to the fusion illusion is reduced at high limb velocities (Tremblay and Nguyen, 2010). This study aimed to determine whether auditory information processing is suppressed during voluntary action (Chapman and Beauchamp, 2006), which could explain reduced fusion during reaching movements. Instead of asking our participants to report the number of flashes, we asked them to report the number of beeps (Andersen et al., 2004). Before each trial, participants were asked to fixate on a target LED presented on a horizontal reaching surface. The secondary stimuli combined 3 flash levels (0, 1, or 2) with 2 beep levels (1 or 2). During control tests, the secondary stimuli were presented at rest. In the experimental phase, stimuli were presented 0, 100, or 200 ms relative to the onset of a fast and accurate movement. Participants reported the number of beeps after each trial. A 3 flash × 2 beep × 4 presentation condition (0, 100, 200 ms + Control) ANOVA revealed that participants were less accurate at perceiving the actual number of beeps during the movement than in the control condition. More importantly, the number of flashes influenced the number of perceived beeps during the movement but not in the control condition. Lastly, no relationship was found between limb velocity and the number of perceived beeps. These results indicate that auditory information is significantly suppressed during goal-directed action, but this mechanism alone fails to explain the link between limb velocity and the fusion illusion.
104
105
Innes-Brown H, Barutchu A, Shivdasani MN, Crewther DP, Grayden DB, Paolini AG. Susceptibility to the flash-beep illusion is increased in children compared to adults. Dev Sci 2011; 14:1089-99. [DOI: 10.1111/j.1467-7687.2011.01059.x]
106
Bolognini N, Maravita A. Uncovering Multisensory Processing through Non-Invasive Brain Stimulation. Front Psychol 2011; 2:46. [PMID: 21716922] [PMCID: PMC3110874] [DOI: 10.3389/fpsyg.2011.00046]
Abstract
Most current knowledge about the mechanisms of multisensory integration of environmental stimuli by the human brain derives from neuroimaging experiments. However, neuroimaging studies do not always provide conclusive evidence about the causal role of a given area in multisensory interactions, since these techniques mainly derive correlations between brain activations and behavior. Conversely, techniques of non-invasive brain stimulation (NIBS) represent a unique and powerful approach for informing models of causal relations between specific brain regions and individual cognitive and perceptual functions. Although NIBS has been widely used in cognitive neuroscience, its use in the study of multisensory processing in the human brain is a relatively novel field of research. In this paper, we review and discuss recent studies that have used two techniques of NIBS, namely transcranial magnetic stimulation and transcranial direct current stimulation, to investigate the causal involvement of unisensory and heteromodal cortical areas in multisensory processing, the effects of multisensory cues on cortical excitability in unisensory areas, and the putative functional connections among different cortical areas subserving multisensory interactions. The emerging view is that NIBS is an essential tool for neuroscientists seeking causal relationships between a given area or network and multisensory processes. With its already large and fast-increasing usage, future work using NIBS in isolation, as well as in conjunction with different neuroimaging techniques, could substantially improve our understanding of multisensory processing in the human brain.
Affiliation(s)
- Nadia Bolognini
- Department of Psychology, University of Milano-Bicocca Milan, Italy
107
Familiarity of objects affects susceptibility to the sound-induced flash illusion. Neurosci Lett 2011; 492:19-22. [DOI: 10.1016/j.neulet.2011.01.042]
108
Setti A, Burke KE, Kenny RA, Newell FN. Is inefficient multisensory processing associated with falls in older people? Exp Brain Res 2011; 209:375-84. [PMID: 21293851] [DOI: 10.1007/s00221-011-2560-z]
Abstract
Although falling is a significant problem for older persons, little is understood about its underlying causes. Spatial cognition and balance maintenance rely on the efficient integration of information across the main senses. We investigated general multisensory efficiency in older persons with a history of falls compared to age- and sensory-acuity-matched controls and younger adults using a sound-induced flash illusion. Older fallers were as susceptible to the illusion as age-matched non-fallers or younger adults at a short delay of 70 ms between the auditory and visual stimuli. Both older adult groups were more susceptible to the illusion at longer stimulus onset asynchronies (SOAs) than younger adults. However, with increasing delays between the visual and auditory stimuli, older fallers did not show a decline in the frequency at which the illusion was experienced, even with delays of up to 270 ms. We argue that this relatively higher susceptibility to the illusion reflects inefficient audio-visual processing in the central nervous system and has important implications for the diagnosis and rehabilitation of falling in older persons.
Affiliation(s)
- Annalisa Setti
- School of Psychology, Institute of Neuroscience, Lloyd Building, Trinity College Dublin, Dublin 2, Ireland
109
Besson P, Richiardi J, Bourdin C, Bringoux L, Mestre DR, Vercher JL. Bayesian networks and information theory for audio-visual perception modeling. Biol Cybern 2010; 103:213-226. [PMID: 20502912] [DOI: 10.1007/s00422-010-0392-8]
Abstract
Through their different senses, human observers acquire multiple streams of information from their environment. Complex cross-modal interactions occur during this perceptual process. This article proposes a framework to analyze and model these interactions through a rigorous and systematic data-driven process. This requires considering the general relationships between the physical events or factors involved in the process, not only in quantitative terms, but also in terms of the influence of one factor on another. We use tools from information theory and probabilistic reasoning to derive relationships between the random variables of interest, where the central notion is that of conditional independence. Using mutual information analysis to guide the model elicitation process, a probabilistic causal model encoded as a Bayesian network is obtained. We exemplify the method using data collected in an audio-visual localization task with human subjects, and we show that it yields a well-motivated model with good predictive ability. The model elicitation process offers new prospects for the investigation of the cognitive mechanisms of multisensory perception.
Affiliation(s)
- Patricia Besson
- Institute of Movement Sciences, CNRS & Université de la Méditerranée, Marseille, France.
110
Latinus M, VanRullen R, Taylor MJ. Top-down and bottom-up modulation in processing bimodal face/voice stimuli. BMC Neurosci 2010; 11:36. [PMID: 20222946] [PMCID: PMC2850913] [DOI: 10.1186/1471-2202-11-36]
Abstract
Background: Processing of multimodal information is a critical capacity of the human brain, with classic studies showing bimodal stimulation either facilitating or interfering in perceptual processing. Comparing activity to congruent and incongruent bimodal stimuli can reveal sensory dominance in particular cognitive tasks.
Results: We investigated audiovisual interactions driven by stimulus properties (bottom-up influences) or by task (top-down influences) on congruent and incongruent simultaneously presented faces and voices while ERPs were recorded. Subjects performed gender categorisation, directing attention either to faces or to voices, and also judged whether the face/voice stimuli were congruent in terms of gender. Behaviourally, the unattended modality affected processing in the attended modality: the disruption was greater for attended voices. ERPs revealed top-down modulations of early brain processing (30-100 ms) over unisensory cortices. No effects were found on the N170 or VPP, but from 180-230 ms larger right frontal activity was seen for incongruent than congruent stimuli.
Conclusions: Our data demonstrate that in a gender categorisation task the processing of faces dominates over the processing of voices. Brain activity showed different modulation by top-down and bottom-up information: top-down influences modulated early brain activity, whereas bottom-up interactions occurred relatively late.
Affiliation(s)
- Marianne Latinus
- Université de Toulouse, UPS, CNRS, Centre de recherche Cerveau et Cognition, Toulouse, France.
111
Kawabe T, Shirai N, Wada Y, Miura K, Kanazawa S, Yamaguchi MK. The audiovisual tau effect in infancy. PLoS One 2010; 5:e9503. [PMID: 20209137] [PMCID: PMC2831064] [DOI: 10.1371/journal.pone.0009503]
Abstract
Background: Perceived spatial intervals between successive flashes can be distorted by varying the temporal intervals between them (the "tau effect"). A previous study showed that a tau effect for visual flashes could be induced when they were accompanied by auditory beeps with varied temporal intervals (an audiovisual tau effect).
Methodology/Principal Findings: We conducted two experiments to investigate whether the audiovisual tau effect occurs in infancy. Forty-eight infants aged 5-8 months took part in this study. In Experiment 1, infants were familiarized with audiovisual stimuli consisting of three pairs of two flashes and three beeps. The onsets of the first and third pairs of flashes were respectively matched to those of the first and third beeps. The onset of the second pair of flashes was separated from that of the second beep by 150 ms. Following the familiarization phase, infants were exposed to a test stimulus composed of two vertical arrays of three static flashes with different spatial intervals. We hypothesized that if the audiovisual tau effect occurred in infancy then infants would preferentially look at the flash array with spatial intervals that would be expected to be different from the perceived spatial intervals between flashes they were exposed to in the familiarization phase. The results of Experiment 1 supported this hypothesis. In Experiment 2, the first and third beeps were removed from the familiarization stimuli, resulting in the disappearance of the audiovisual tau effect. This indicates that the modulation of temporal intervals among flashes by beeps was essential for the audiovisual tau effect to occur.
Conclusions/Significance: These results suggest that the cross-modal processing that underlies the audiovisual tau effect occurs even in early infancy. In particular, the results indicate that audiovisual modulation of temporal intervals emerges by 5-8 months of age.
Affiliation(s)
- Takahiro Kawabe
- Institute for Advanced Study, Kyushu University, Fukuoka, Fukuoka, Japan.
112
Tremblay L, Nguyen T. Real-time decreased sensitivity to an audio-visual illusion during goal-directed reaching. PLoS One 2010; 5:e8952. [PMID: 20126451] [PMCID: PMC2813281] [DOI: 10.1371/journal.pone.0008952]
Abstract
In humans, sensory afferences are combined and integrated by the central nervous system (Ernst MO, Bülthoff HH (2004) Trends Cogn. Sci. 8: 162–169) and appear to provide a holistic representation of the environment. Empirical studies have repeatedly shown that vision dominates the other senses, especially for tasks with spatial demands. In contrast, it has also been observed that sound can strongly alter the perception of visual events. For example, when presented with 2 flashes and 1 beep in a very brief period of time, humans often report seeing 1 flash (i.e. fusion illusion, Andersen TS, Tiippana K, Sams M (2004) Brain Res. Cogn. Brain Res. 21: 301–308). However, it is not known how an unfolding movement modulates the contribution of vision to perception. Here, we used the audio-visual illusion to demonstrate that goal-directed movements can alter visual information processing in real-time. Specifically, the fusion illusion was linearly reduced as a function of limb velocity. These results suggest that cue combination and integration can be modulated in real-time by goal-directed behaviors; perhaps through sensory gating (Chapman CE, Beauchamp E (2006) J. Neurophysiol. 96: 1664–1675) and/or altered sensory noise (Ernst MO, Bülthoff HH (2004) Trends Cogn. Sci. 8: 162–169) during limb movements.
Affiliation(s)
- Luc Tremblay
- Faculty of Physical Education and Health, University of Toronto, Toronto, Ontario, Canada
113
Mortensen DH, Bech S, Begault DR, Adelstein BD. The relative importance of visual, auditory, and haptic information for the user's experience of mechanical switches. Perception 2009; 38:1560-71. [PMID: 19950486] [DOI: 10.1068/p5929]
Abstract
While the use of hand tools and other everyday manually controlled devices is naturally accompanied by multisensory feedback, the deployment of fully multimodal virtual interfaces requires that haptic, acoustic, and visual cues be synthesised. The complexity and character of this synthesis will depend on a thorough understanding of the multimodal perceptual experience, including the interrelations between the individual sensory channels during manual interaction. In this study seventy participants were asked to rank the manual operation of ten electromechanical switches according to preference. The participants were randomly assigned in groups of ten to one of seven sensory presentation conditions. These conditions comprised six bimodal and unimodal sensory combinations created by selectively restricting the flow of haptic, auditory, and visual information, plus one condition in which full sensory information was available. A principal components analysis on the obtained ranking data indicated that the sensory conditions with unimpeded haptic information were clearly distinct from those in which the haptic cues were impeded. The analysis also showed that, for switch use, the unimodal haptic condition most closely approached the condition with combined haptic, auditory, and visual feedback, compared with all of the conditions where haptic feedback was restricted.
114
The impact of spatial incongruence on an auditory-visual illusion. PLoS One 2009; 4:e6450. [PMID: 19649293] [PMCID: PMC2714182] [DOI: 10.1371/journal.pone.0006450]
Abstract
Background: The sound-induced flash illusion is an auditory-visual illusion: when a single flash is presented along with two or more beeps, observers report seeing two or more flashes. Previous research has shown that the illusion gradually disappears as the temporal delay between auditory and visual stimuli increases, suggesting that the illusion is consistent with existing temporal rules of neural activation in the superior colliculus to multisensory stimuli. However, little is known about the effect of spatial incongruence, and whether the illusion follows the corresponding spatial rule. If the illusion occurs less strongly when auditory and visual stimuli are separated, then integrative processes supporting the illusion must be strongly dependent on spatial congruence. In this case, the illusion would be consistent with both the spatial and temporal rules describing response properties of multisensory neurons in the superior colliculus.
Methodology/Principal Findings: The main aim of this study was to investigate the importance of spatial congruence in the flash-beep illusion. Selected combinations of one to four short flashes and zero to four short 3.5 kHz tones were presented. Observers were asked to count the number of flashes they saw. After replication of the basic illusion using centrally-presented stimuli, the auditory and visual components of the illusion stimuli were presented either both 10 degrees to the left or right of fixation (spatially congruent) or on opposite (spatially incongruent) sides, for a total separation of 20 degrees.
Conclusions/Significance: The sound-induced flash fission illusion was successfully replicated. However, when the sources of the auditory and visual stimuli were spatially separated, perception of the illusion was unaffected, suggesting that the "spatial rule" does not extend to describing behavioural responses in this illusion. We also found no evidence for an associated "fusion" illusion reportedly occurring when multiple flashes are accompanied by a single beep.
115
Auditory dominance over vision in the perception of interval duration. Exp Brain Res 2009; 198:49-57. [PMID: 19597804] [DOI: 10.1007/s00221-009-1933-z]
116
Kawabe T. Audiovisual temporal capture underlies flash fusion. Exp Brain Res 2009; 198:195-208. [PMID: 19521693] [DOI: 10.1007/s00221-009-1877-3]
Abstract
When sequential visual flashes are accompanied by a lower number of sequential auditory pulses, the perceived number of visual flashes is lower than the actual number, an illusion termed 'flash fusion'. We examined whether temporal capture of flashes by pulses underlay flash fusion. One of the visual flashes was given a luminance increment, and observers reported which flash had the luminance increment. Results showed that the pulse strongly captured the flashes in its temporal vicinity, resulting in flash fusion. Moreover, when one of the successive pulses was given a higher frequency than others, the luminance increment was perceptually paired with the pulse with the higher frequency. The pairing of audiovisual features disappeared when the temporal pattern of the pulse frequency was difficult for the observer to anticipate. These data indicate that flash fusion is caused by temporal capture of flashes by the pulse, and that feature matching between auditory and visual signals also contributes to the modulation of perceived temporal structure of flashes during flash fusion.
117
The role of attention on the integration of visual and inertial cues. Exp Brain Res 2009; 198:287-300. [PMID: 19350230] [PMCID: PMC2733186] [DOI: 10.1007/s00221-009-1767-8]
Abstract
The extent to which attending to one stimulus while ignoring another influences the integration of visual and inertial (vestibular, somatosensory, proprioceptive) stimuli is currently unknown. It is also unclear how cue integration is affected by an awareness of cue conflicts. We investigated these questions using a turn-reproduction paradigm, where participants were seated on a motion platform equipped with a projection screen and were asked to actively return a combined visual and inertial whole-body rotation around an earth-vertical axis. By introducing cue conflicts during the active return and asking the participants whether they had noticed a cue conflict, we measured the influence of each cue on the response. We found that the task instruction had a significant effect on cue weighting in the response, with a higher weight assigned to the attended modality, only when participants noticed the cue conflict. This suggests that participants used task-induced attention to reduce the influence of stimuli that conflict with the task instructions.
118
Vroomen J, Keetels M. Sounds change four-dot masking. Acta Psychol (Amst) 2009; 130:58-63. [PMID: 19012870] [DOI: 10.1016/j.actpsy.2008.10.001]
Abstract
The temporal occurrence of a flash can be shifted towards a slightly offset sound (temporal ventriloquism). Here we examined whether four-dot masking is affected by this phenomenon. In Experiment 1, we demonstrate that there is release from four-dot masking if two sounds--one before the target and one after the mask--are presented at approximately 100 ms intervals rather than at approximately 0 ms intervals or a silent condition. In Experiment 2, we show that the release from masking originates from an alerting effect of the first sound, and a temporal ventriloquist effect from the first and second sounds that lengthened the perceived interval between target and mask, thereby leaving more time for the target to consolidate. Results thus show that sounds penetrate the visual system at more than one level.
119
Luu S, Wong W, Noordin H. Low-level audiovisual synchrony: experiments and model. Jpn Psychol Res 2008. [DOI: 10.1111/j.1468-5884.2008.00377.x]
120
Andersen TS, Mamassian P. Audiovisual integration of stimulus transients. Vision Res 2008; 48:2537-44. [DOI: 10.1016/j.visres.2008.08.018]
121
Bruns P, Getzmann S. Audiovisual influences on the perception of visual apparent motion: exploring the effect of a single sound. Acta Psychol (Amst) 2008; 129:273-83. [PMID: 18790468] [DOI: 10.1016/j.actpsy.2008.08.002]
Abstract
Previous research has shown that irrelevant sounds can facilitate the perception of visual apparent motion. Here the effectiveness of a single sound to facilitate motion perception was investigated in three experiments. Observers were presented with two discrete lights temporally separated by stimulus onset asynchronies from 0 to 350 ms. After each trial, observers classified their impression of the stimuli using a categorisation system. A short sound presented temporally (and spatially) midway between the lights facilitated the impression of motion relative to baseline (lights without sound), whereas a sound presented either before the first or after the second light or simultaneously with the lights did not affect motion impression. The facilitation effect also occurred with the sound presented far from the visual display, as well as with a continuous sound that started with the first light and terminated with the second light. No facilitation of visual motion perception occurred if the sound was part of a tone sequence that allowed for intramodal perceptual grouping of the auditory stimuli prior to the critical audiovisual stimuli. Taken together, the findings are consistent with a low-level audiovisual integration approach in which the perceptual system merges temporally proximate sound and light stimuli, thereby provoking the impression of a single multimodal moving object.
Affiliation(s)
- Patrick Bruns
- Biological Psychology and Neuropsychology, University of Hamburg, Von-Melle-Park 11, D-20146 Hamburg, Germany.
122
Auvray M, Spence C. The multisensory perception of flavor. Conscious Cogn 2008; 17:1016-31. [PMID: 17689100] [DOI: 10.1016/j.concog.2007.06.005]
Abstract
Following on from ecological theories of perception, such as the one proposed by Gibson [Gibson, J. J. (1966). The senses considered as perceptual systems. Boston: Houghton Mifflin], this paper reviews the literature on the multisensory interactions underlying the perception of flavor in order to determine the extent to which it is really appropriate to consider flavor perception as a distinct perceptual system. We propose that the multisensory perception of flavor may be indicative of the fact that the taxonomy currently used to define our senses is simply not appropriate. According to the view outlined here, the act of eating allows the different qualities of foodstuffs to be combined into unified percepts, and flavor can be used as a term to describe the combination of tastes, smells, trigeminal, and tactile sensations, as well as the visual and auditory cues, that we perceive when tasting food.
Affiliation(s)
- Malika Auvray
- Department of Experimental Psychology, Oxford University, South Parks Road, Oxford OX1 3UD, UK.
123
On perceived synchrony—neural dynamics of audiovisual illusions and suppressions. Brain Res 2008; 1220:132-41. [DOI: 10.1016/j.brainres.2007.09.045]
124
Cortical processes underlying sound-induced flash fusion. Brain Res 2008; 1242:102-15. [PMID: 18585695] [DOI: 10.1016/j.brainres.2008.05.023]
Abstract
When two brief flashes presented in rapid succession (<100 ms apart) are paired with a single auditory stimulus, subjects often report perceiving only a single flash [Andersen, T.S., Tiippana, K., Sams, M., 2004. Factors influencing audiovisual fission and fusion illusions. Brain Res. Cogn. Brain Res. 21, 301-308; Shams, L., Iwaki, S., Chawla, A., Bhattacharya, J., 2005a. Early modulation of visual cortex by sound: an MEG study. Neurosci. Lett. 378, 76-81; Shams, L., Ma, W.J., Beierholm, U., 2005b. Sound-induced flash illusion as an optimal percept. Neuroreport 16, 1923-1927]. We used event-related potentials (ERPs) to investigate the timing and localization of the cortical processes that underlie this sound-induced flash fusion, which is complementary to the sound-induced extra flash illusion that we analyzed previously [Mishra, J., Martinez, A., Sejnowski, T.J., Hillyard, S.A., 2007. Early cross-modal interactions in auditory and visual cortex underlie a sound-induced visual illusion. J. Neurosci. 27, 4120-4131]. The difference ERP that represented the cross-modal interaction between the visual (two flashes) and auditory (one sound) constituents of the bimodal stimulus revealed a positive component elicited 160-190 ms after stimulus onset, which was markedly attenuated in subjects who did not perceive the second flash. This component, previously designated as PD180 [Mishra et al., 2007], was localized by dipole modeling to polysensory superior temporal cortex. PD180 was found to covary in amplitude across subjects with the visual evoked N1 component (148-184 ms), suggesting that inter-individual differences in perceiving the illusion are based at least in part on differences in visual processing. A trial-by-trial analysis found that the PD180, as well as a subsequent modulation in visual cortex at 228-248 ms, was diminished on trials when the two flashes were perceived as one relative to trials when two flashes were correctly reported. These results suggest that the sound-induced flash fusion is based on an interaction between polysensory and visual cortical areas.
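The cross-modal difference ERP described above follows the standard additive model: the interaction is whatever remains after subtracting the summed unimodal responses from the bimodal response. A minimal sketch with random stand-in data (array shapes and condition labels are illustrative, not the study's):

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials, n_times = 100, 512  # hypothetical epoch count and length

# Trial-averaged (grand-average) ERPs for each condition.
erp_av = rng.normal(size=(n_trials, n_times)).mean(axis=0)  # flashes + sound
erp_a = rng.normal(size=(n_trials, n_times)).mean(axis=0)   # sound alone
erp_v = rng.normal(size=(n_trials, n_times)).mean(axis=0)   # flashes alone

# Additive model: a nonzero residual marks a cross-modal interaction
# beyond the linear sum of the unimodal responses.
interaction = erp_av - (erp_a + erp_v)
```

Components such as the PD180 would then be measured on `interaction` within their latency window (e.g., 160-190 ms after stimulus onset).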
125
McCormick D, Mamassian P. What does the illusory-flash look like? Vision Res 2007; 48:63-9. PMID: 18054372. DOI: 10.1016/j.visres.2007.10.010.
Abstract
In the illusory-flash effect (Shams, L., Kamitani, Y., & Shimojo, S. (2000). Illusions. What you see is what you hear. Nature, 408, 788), one flash presented with two tones has a tendency to be seen as two flashes. Previous studies of this effect have been ill-equipped to establish whether this illusory-flash is the result of a genuine percept, or that of a shift in criterion. We addressed this issue by using a stimulus comprising two locations. This enabled contrast-threshold measurement by means of a location detection task. High-contrast white or black flashes were presented simultaneously to both locations, followed by threshold contrast flashes of the same contrast polarity at the two locations in half of the trials; observers reported whether or not the low-contrast flashes had been present. Irrelevant to the task, half of the trials contained one tone, the other half contained two tones. In this way, we were able to compute the change in sensitivity and shift in criterion between illusory and non-illusory trials. We observe both a decrease in visual sensitivity and a criterion shift in the illusory-flash conditions. In a second experiment, we were interested in determining whether this change in visual sensitivity gave rise to measurable visual attributes of the illusory-flash. If it has a contrast, it should interact with a spatio-temporally concurrent real flash. Using a similar two-location stimulus presentation, we found that under certain conditions, we were able to infer the polarity of the perceived illusory-flash. We conclude that the illusory-flash is indeed a perceptual effect with psychophysically assessable characteristics.
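The sensitivity-versus-criterion distinction tested above maps onto standard equal-variance signal detection theory: d′ falls when perceptual sensitivity genuinely drops, whereas the criterion c shifts when observers merely change their response bias. A sketch with illustrative (made-up) hit and false-alarm rates:

```python
from statistics import NormalDist

def dprime_and_criterion(hit_rate, fa_rate):
    """Equal-variance signal-detection indices: d' and criterion c."""
    z = NormalDist().inv_cdf  # inverse standard-normal CDF
    d_prime = z(hit_rate) - z(fa_rate)
    criterion = -0.5 * (z(hit_rate) + z(fa_rate))
    return d_prime, criterion

# Hypothetical rates: one-tone trials vs. illusion-inducing two-tone trials.
d1, c1 = dprime_and_criterion(0.80, 0.20)  # one tone: symmetric, unbiased
d2, c2 = dprime_and_criterion(0.65, 0.25)  # two tones: lower d', shifted c
```

A joint drop in d′ and shift in c, as the authors report, indicates the illusion involves both a genuine perceptual change and a decision-level bias.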
Affiliation(s)
- David McCormick
- Laboratoire Psychologie de la Perception, CNRS & Université Paris Descartes, 45 rue des Saints-Pères, 75270 Paris Cedex 06, France.
126
Tremblay C, Champoux F, Voss P, Bacon BA, Lepore F, Théoret H. Speech and non-speech audio-visual illusions: a developmental study. PLoS One 2007; 2:e742. PMID: 17710142. PMCID: PMC1937019. DOI: 10.1371/journal.pone.0000742.
Abstract
It is well known that simultaneous presentation of incongruent audio and visual stimuli can lead to illusory percepts. Recent data suggest that distinct processes underlie non-specific intersensory speech as opposed to non-speech perception. However, the development of both speech and non-speech intersensory perception across childhood and adolescence remains poorly defined. Thirty-eight observers aged 5 to 19 were tested on the McGurk effect (an audio-visual illusion involving speech), the Illusory Flash effect and the Fusion effect (two audio-visual illusions not involving speech) to investigate the development of audio-visual interactions and contrast speech vs. non-speech developmental patterns. Whereas the strength of audio-visual speech illusions varied as a direct function of maturational level, performance on non-speech illusory tasks appeared to be homogeneous across all ages. These data support the existence of independent maturational processes underlying speech and non-speech audio-visual illusory effects.
Affiliation(s)
- Corinne Tremblay
- Department of Psychology, University of Montreal, Montreal, Canada
- Research Center, Sainte-Justine Hospital, Montreal, Canada
- François Champoux
- Speech Language Pathology and Audiology, University of Montreal, Montreal, Canada
- Patrice Voss
- Department of Psychology, University of Montreal, Montreal, Canada
- Benoit A. Bacon
- Department of Psychology, Bishop's University, Sherbrooke, Quebec, Canada
- Franco Lepore
- Department of Psychology, University of Montreal, Montreal, Canada
- Research Center, Sainte-Justine Hospital, Montreal, Canada
- Hugo Théoret
- Department of Psychology, University of Montreal, Montreal, Canada
- Research Center, Sainte-Justine Hospital, Montreal, Canada
- To whom correspondence should be addressed.
127
Watkins S, Shams L, Josephs O, Rees G. Activity in human V1 follows multisensory perception. Neuroimage 2007; 37:572-8. PMID: 17604652. DOI: 10.1016/j.neuroimage.2007.05.027.
Abstract
When a single brief visual flash is accompanied by two auditory bleeps, it is frequently perceived incorrectly as two flashes. Such illusory multisensory perception is associated with increased activation of retinotopic human primary visual cortex (V1) suggesting that such activity reflects subjective perception [Watkins, S., Shams, L., Tanaka, S., Haynes, J.D., Rees, G., 2006. Sound alters activity in human V1 in association with illusory visual perception. Neuroimage. 31, 1247-1256]. However, an alternate possibility is that increased V1 activity reflects either fluctuating attention or auditory-visual perceptual matching on illusion trials. Here, we rule out these possibilities by studying the complementary illusion, where a double flash is accompanied by a single bleep and perceived incorrectly as a single flash. We replicate findings of increased activity in retinotopic V1 when a single flash is perceived incorrectly as two flashes, and now show that activity is decreased in retinotopic V1 when a double flash is perceived incorrectly as a single flash. Our findings provide strong support for the notion that human V1 activity reflects subjective perception in these multisensory illusions.
Affiliation(s)
- S Watkins
- Wellcome Trust Centre for Neuroimaging, University College London, 12 Queen Square, London WC1N 3BG, UK.
128
van Atteveldt NM, Formisano E, Goebel R, Blomert L. Top-down task effects overrule automatic multisensory responses to letter-sound pairs in auditory association cortex. Neuroimage 2007; 36:1345-60. PMID: 17513133. DOI: 10.1016/j.neuroimage.2007.03.065.
Abstract
In alphabetic scripts, letters and speech sounds are the basic elements of correspondence between spoken and written language. In two previous fMRI studies, we showed that the response to speech sounds in the auditory association cortex was enhanced by congruent letters and suppressed by incongruent letters. Interestingly, temporal synchrony was critical for this congruency effect to occur. We interpreted these results as a neural correlate of letter-sound integration, driven by the learned congruency of letter-sound pairs. The present event-related fMRI study was designed to address two questions that could not directly be addressed in the previous studies, due to their passive nature and blocked design. Specifically: (1) to examine whether the enhancement/suppression of auditory cortex are truly multisensory integration effects or can be explained by different attention levels during congruent/incongruent blocks, and (2) to examine the effect of top-down task demands on the neural integration of letter-sound pairs. Firstly, we replicated the previous results with random stimulus presentation, which rules out an explanation of the congruency effect in auditory cortex solely in terms of attention. Secondly, we showed that the effects of congruency and temporal asynchrony in the auditory association cortex were absent during active matching. This indicates that multisensory responses in the auditory association cortex heavily depend on task demands. Without task instructions, the auditory cortex is modulated to favor the processing of congruent and synchronous information. This modulation is overruled during explicit matching when all audiovisual stimuli are equally relevant, independent of congruency and temporal relation.
Affiliation(s)
- Nienke M van Atteveldt
- University of Maastricht, Faculty of Psychology, Department of Cognitive Neuroscience, 6200 MD Maastricht, The Netherlands.
129
Mishra J, Martinez A, Sejnowski TJ, Hillyard SA. Early cross-modal interactions in auditory and visual cortex underlie a sound-induced visual illusion. J Neurosci 2007; 27:4120-31. PMID: 17428990. PMCID: PMC2905511. DOI: 10.1523/jneurosci.4912-06.2007.
Abstract
When a single flash of light is presented interposed between two brief auditory stimuli separated by 60-100 ms, subjects typically report perceiving two flashes (Shams et al., 2000, 2002). We investigated the timing and localization of the cortical processes that underlie this illusory flash effect in 34 subjects by means of 64-channel recordings of event-related potentials (ERPs). A difference ERP calculated to isolate neural activity associated with the illusory second flash revealed an early modulation of visual cortex activity at 30-60 ms after the second sound, which was larger in amplitude in subjects who saw the illusory flash more frequently. These subjects also showed this early modulation in response to other combinations of auditory and visual stimuli, thus pointing to consistent individual differences in the neural connectivity that underlies cross-modal integration. The overall pattern of cortical activity associated with the cross-modally induced illusory flash, however, differed markedly from that evoked by a real second flash. A trial-by-trial analysis showed that short-latency ERP activity localized to auditory cortex and polymodal cortex of the temporal lobe, concurrent with gamma bursts in visual cortex, were associated with perception of the double-flash illusion. These results provide evidence that perception of the illusory second flash is based on a very rapid dynamic interplay between auditory and visual cortical areas that is triggered by the second sound.
Affiliation(s)
- Antigona Martinez
- Department of Neurosciences, University of California, San Diego, La Jolla, California 92093
- Nathan S. Kline Institute for Psychiatric Research, Orangeburg, New York 10962
- Terrence J. Sejnowski
- Division of Biological Sciences and Howard Hughes Medical Institute, Computational Neurobiology Laboratory, Salk Institute, La Jolla, California 92037
- Steven A. Hillyard
- Department of Neurosciences, University of California, San Diego, La Jolla, California 92093
130
Meylan RV, Murray MM. Auditory-visual multisensory interactions attenuate subsequent visual responses in humans. Neuroimage 2007; 35:244-54. PMID: 17215144. DOI: 10.1016/j.neuroimage.2006.11.033.
Abstract
Effects of multisensory interactions on how subsequent sensory inputs are processed remain poorly understood. We investigated whether multisensory interactions between rudimentary visual and auditory stimuli (flashes and beeps) affect later visual processing. A 2 x 3 design varied the number of flashes (1 or 2) with the number of beeps (0, 1, or 2) presented on each trial, such that '2F1B' refers to the presentation of 2 flashes with 1 beep. Beeps, when present, were synchronous with the first flash, and pairs of stimuli within a trial were separated by a 52 ms ISI. Subjects indicated the number of flashes presented. Electrical neuroimaging of 128-channel event-related potentials assessed both the electric field strength and topography. Responses to a visual stimulus that was preceded by a multisensory event were isolated by calculating the difference between the 2F1B and 1F1B conditions, and responses to a visual stimulus preceded by a unisensory event were isolated by calculating the difference between the 2F0B and 1F0B conditions (MUL and VIS, respectively). Comparison of MUL and VIS revealed that the processing of visual information was significantly attenuated approximately 160 ms after the onset of the second flash when it was preceded by a multisensory event. Source estimations further indicated that this attenuation occurred within low-level visual cortices. Multisensory interactions are thus ongoing in low-level visual cortices and affect incoming sensory processing. These data provide evidence that multisensory interactions are not restricted in time and can dramatically influence the treatment of subsequent stimuli, opening new lines of multisensory research.
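The MUL and VIS contrasts can be sketched directly from the abstract's condition labels; the ERP arrays below are random stand-ins for real grand averages, so only the arithmetic is meaningful:

```python
import numpy as np

rng = np.random.default_rng(1)
n_times = 300  # hypothetical number of samples per epoch

# Grand-average ERPs for the four conditions named in the abstract.
erp = {c: rng.normal(size=n_times) for c in ("2F1B", "1F1B", "2F0B", "1F0B")}

# Second-flash response when preceded by a multisensory (flash + beep) event...
mul = erp["2F1B"] - erp["1F1B"]
# ...versus when preceded by a unisensory (flash-only) event.
vis = erp["2F0B"] - erp["1F0B"]

# Comparing the two isolates the attenuation attributable to the
# preceding multisensory interaction.
attenuation = mul - vis
```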
Affiliation(s)
- Raphaël V Meylan
- The Functional Electrical Neuroimaging Laboratory, Neuropsychology Division, Centre Hospitalier Universitaire Vaudois, Lausanne, Switzerland
131
Holmes NP, Sanabria D, Calvert GA, Spence C. Multisensory interactions follow the hands across the midline: evidence from a non-spatial visual-tactile congruency task. Brain Res 2006; 1077:108-15. PMID: 16483553. PMCID: PMC1482253. DOI: 10.1016/j.brainres.2005.11.010.
Abstract
Crossing the hands over, whether across the body midline or with respect to each other, leads to measurable changes in spatial compatibility, spatial attention, and frequently to a general decrement in discrimination performance for tactile stimuli. The majority of multisensory crossed hands effects, however, have been demonstrated with explicit or implicit spatial discrimination tasks, raising the question of whether non-spatial discrimination tasks also show spatial effects when the hands are crossed. We designed a novel, non-spatial tactile discrimination task to address this issue. Participants made speeded discriminations of single- versus double-pulse vibrotactile targets, while trying to ignore simultaneous visual distractor stimuli, in both hands uncrossed and hands crossed postures. Tactile discrimination performance was significantly affected by the visual distractors (demonstrating a significant crossmodal congruency effect) and was affected most by visual distractors in the same external location as the tactile target (i.e., spatial modulation), regardless of the posture (uncrossed or crossed) of the hands (i.e., spatial 'remapping' of visual-tactile interactions). Finally, crossing the hands led to a general performance decrement with visual distractors, but not in a control task with unimodal visual or tactile judgements. These results demonstrate, for the first time, significant spatial and postural modulations of crossmodal congruency effects in a non-spatial discrimination task.
Affiliation(s)
- Nicholas P Holmes
- Department of Experimental Psychology, South Parks Road, Oxford OX1 3UD, UK.
132
Sanabria D, Soto-Faraco S, Spence C. Spatiotemporal interactions between audition and touch depend on hand posture. Exp Brain Res 2005; 165:505-14. PMID: 15942735. DOI: 10.1007/s00221-005-2327-5.
Abstract
We report two experiments designed to assess the consequences of posture change on audiotactile spatiotemporal interactions. In Experiment 1, participants had to discriminate the direction of an auditory stream (consisting of the sequential presentation of two tones from different spatial positions) while attempting to ignore a task-irrelevant tactile stream (consisting of the sequential presentation of two vibrations, one to each of the participant's hands). The tactile stream presented to the participants' hands was either spatiotemporally congruent or incongruent with respect to the sounds. A significant decrease in performance in incongruent trials compared with congruent trials was demonstrated when the participants adopted an uncrossed-hands posture but not when their hands were crossed over the midline. In Experiment 2, we investigated the ability of participants to discriminate the direction of two sequentially presented tactile stimuli (one presented to each hand) as a function of the presence of congruent vs incongruent auditory distractors. Here, the crossmodal effect was stronger in the crossed-hands posture than in the uncrossed-hands posture. These results demonstrate the reciprocal nature of audiotactile interactions in spatiotemporal processing, and highlight the important role played by body posture in modulating such crossmodal interactions.
Affiliation(s)
- Daniel Sanabria
- Department of Experimental Psychology, University of Oxford, South Parks Road, Oxford, OX1 3UD, UK.
133
Andersen TS, Tiippana K, Sams M. Maximum Likelihood Integration of rapid flashes and beeps. Neurosci Lett 2005; 380:155-60. PMID: 15854769. DOI: 10.1016/j.neulet.2005.01.030.
Abstract
Maximum likelihood models of multisensory integration are theoretically attractive because the goals and assumptions of sensory information processing are explicitly stated in such optimal models. When subjects perceive stimuli categorically, as opposed to on a continuous scale, Maximum Likelihood Integration (MLI) can occur before or after categorization-early or late. We introduce early MLI and apply it to the audiovisual perception of rapid beeps and flashes. We compare it to late MLI and show that early MLI is a better fitting and more parsimonious model. We also show that early MLI is better able to account for the effects of information reliability, modality appropriateness and intermodal attention which affect multisensory perception.
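For continuous estimates, the textbook maximum-likelihood rule weights each modality by its reliability (inverse variance). The paper's early and late categorical MLI models build on this idea; the sketch below shows only the standard continuous form, with made-up means and variances:

```python
def mle_fuse(mu_a, var_a, mu_v, var_v):
    """Reliability-weighted (inverse-variance) fusion of two cue estimates."""
    w_a = (1 / var_a) / (1 / var_a + 1 / var_v)  # auditory weight
    mu_fused = w_a * mu_a + (1 - w_a) * mu_v
    var_fused = 1 / (1 / var_a + 1 / var_v)  # always <= min(var_a, var_v)
    return mu_fused, var_fused

# Audition times rapid sequences more reliably than vision, so the fused
# count estimate is pulled toward the auditory one (illustrative numbers).
mu, var = mle_fuse(mu_a=2.0, var_a=0.1, mu_v=1.0, var_v=0.4)  # mu == 1.8
```

The reduced fused variance is the signature prediction of optimal integration: combining cues never hurts the precision of the estimate.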
Affiliation(s)
- Tobias S Andersen
- Laboratory of Computational Engineering, Helsinki University of Technology, P.O. Box 3000, 02015 HUT, Finland.