1. Hu R, Li S, Yuan P, Wang Y, Jiang Y. Temporal integration by multi-level regularities fosters the emergence of dynamic conscious experience. Ann N Y Acad Sci 2024; 1533:156-168. PMID: 38294967. DOI: 10.1111/nyas.15099.
Abstract
The relationship between integration and awareness is central to contemporary theories and research on consciousness. Here, we investigated whether and how information integration over time, by incorporating the underlying regularities, contributes to our awareness of the dynamic world. Using binocular rivalry, we demonstrated that structured visual streams, constituted by shape, motion, or idiom sequences containing perceptual- or semantic-level regularities, predominated over their nonstructured but otherwise matched counterparts in the competition for visual awareness. Despite the apparent resemblance, a substantial dissociation of the observed rivalry advantages emerged between perceptual- and semantic-level regularities. These effects stem from nonconscious and conscious temporal integration processes, respectively, with the former but not the latter being vulnerable to perturbations in the spatiotemporal integration window. These findings corroborate the essential role of structure-guided information integration in visual awareness and highlight a multi-level mechanism where temporal integration by perceptually and semantically defined regularities fosters the emergence of continuous conscious experience.
Affiliation(s)
- Ruichen Hu
- State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
- Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
- Shuo Li
- State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
- Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
- Peijun Yuan
- State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
- Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
- Ying Wang
- State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
- Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
- Yi Jiang
- State Key Laboratory of Brain and Cognitive Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
- Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
2. Montoya S, Badde S. Only visible flicker helps flutter: Tactile-visual integration breaks in the absence of visual awareness. Cognition 2023; 238:105528. PMID: 37354787. DOI: 10.1016/j.cognition.2023.105528.
Abstract
Combining information from multiple senses enhances our perception of the world. Whether we need to be aware of all stimuli to benefit from multisensory integration, however, is still under investigation. Here, we tested whether tactile frequency perception benefits from the presence of congruent visual flicker even if the flicker is so rapid that it is perceptually fused into a steady light and therefore invisible. Our participants completed a tactile frequency discrimination task given either unisensory tactile or congruent tactile-visual stimulation. Tactile and tactile-visual test frequencies ranged from far below to far above participants' flicker fusion thresholds (determined separately). For frequencies distinctly below their flicker fusion threshold, participants performed significantly better given tactile-visual stimulation than when presented with only tactile stimuli. Yet, for frequencies above their flicker fusion threshold, participants' tactile frequency perception did not benefit from the presence of congruent but likely fused, and thus invisible, visual flicker. The results matched the predictions of an ideal-observer model in which tactile-visual integration is conditional on awareness of both stimuli. In contrast, it was impossible to reproduce the observed results with a model that assumed tactile-visual integration proceeds irrespective of stimulus awareness. In sum, we show that the benefit of congruent visual stimulation for tactile flutter frequency perception depends on the visibility of the visual flicker, suggesting that multisensory integration requires awareness.
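The awareness-gated ideal observer described here can be sketched as a toy reliability-weighted (maximum-likelihood) combiner whose fusion step only fires when the visual flicker is consciously perceived. This is an illustrative sketch, not the authors' fitted model: the function name `fused_estimate`, the variance values, and the binary awareness gate are all assumptions introduced for the example.

```python
# Toy sketch of an awareness-gated ideal observer (illustrative only).
# Assumed names and numbers: fused_estimate, var_t, var_v, the awareness flag.

def fused_estimate(tactile_hz, visual_hz, var_t, var_v, visual_aware):
    """Reliability-weighted combination of tactile and visual frequency
    estimates, applied only when the visual flicker reaches awareness."""
    if not visual_aware:
        # Flicker fused into a steady light: fall back on touch alone.
        return tactile_hz, var_t
    w_v = var_t / (var_t + var_v)           # inverse-variance weighting
    est = w_v * visual_hz + (1 - w_v) * tactile_hz
    var = (var_t * var_v) / (var_t + var_v)  # fused variance is lower than either
    return est, var

# Below the flicker-fusion threshold: congruent vision sharpens the estimate.
est_lo, var_lo = fused_estimate(10.0, 10.0, var_t=4.0, var_v=1.0, visual_aware=True)
# Above threshold: the flicker is invisible, so no multisensory benefit.
est_hi, var_hi = fused_estimate(40.0, 40.0, var_t=4.0, var_v=1.0, visual_aware=False)
```

Under this gate, fused variance shrinks only on aware trials, which is the qualitative pattern the abstract reports; a model without the gate would predict a benefit at all frequencies.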
Affiliation(s)
- Sofia Montoya
- Department of Psychology, Tufts University, 490 Boston Avenue, Medford, MA 02155, USA
- Stephanie Badde
- Department of Psychology, Tufts University, 490 Boston Avenue, Medford, MA 02155, USA
3. Sathian K, Lacey S. Cross-Modal Interactions of the Tactile System. Curr Dir Psychol Sci 2022; 31:411-418. PMID: 36408466. PMCID: PMC9674209. DOI: 10.1177/09637214221101877.
Abstract
The sensory systems responsible for perceptions of touch, vision, hearing, etc. have traditionally been regarded as mostly separate, only converging at late stages of processing. Contrary to this dogma, recent work has shown that interactions between the senses are robust and abundant. Touch and vision are both commonly used to obtain information about a number of object properties, and share perceptual and neural representations in many domains. Additionally, visuotactile interactions are implicated in the sense of body ownership, as revealed by powerful illusions that can be evoked by manipulating these interactions. Touch and hearing both rely in part on temporal frequency information, leading to a number of audiotactile interactions reflecting a good deal of perceptual and neural overlap. The focus in sensory neuroscience and psychophysics is now on characterizing the multisensory interactions that lead to our panoply of perceptual experiences.
Affiliation(s)
- K. Sathian
- Department of Neurology, Penn State Health Milton S. Hershey Medical Center
- Department of Neural & Behavioral Sciences, Penn State College of Medicine
- Department of Psychology, Penn State College of Liberal Arts
- Simon Lacey
- Department of Neurology, Penn State Health Milton S. Hershey Medical Center
- Department of Neural & Behavioral Sciences, Penn State College of Medicine
4. Ono M, Hirose N, Mori S. Tactile information affects alternating visual percepts during binocular rivalry using naturalistic objects. Cogn Res Princ Implic 2022; 7:40. PMID: 35543826. PMCID: PMC9095789. DOI: 10.1186/s41235-022-00390-w.
Abstract
INTRODUCTION: Past studies have provided evidence that the effects of tactile stimulation on binocular rivalry are mediated by primitive features (orientation and spatial frequency) common to vision and touch. In this study, we examined whether such effects on binocular rivalry can be obtained through the roughness of naturalistic objects. In three experiments, the total dominance time of the visual percepts of two objects was measured under binocular rivalry while participants touched one of the objects. RESULTS: In Experiment 1, the total dominance time for the images of artificial turf and a bathmat was prolonged by congruent tactile stimulation and shortened by incongruent tactile stimulation. In Experiment 2, we used the same stimuli but rotated their visual images in opposite directions. The dominance time for either image was prolonged by congruent tactile stimulation. In Experiment 3, we used different types of stimuli, smooth marble and rough fabric, and found significant effects of congruent and incongruent tactile stimulation on the dominance time of the visual percepts. CONCLUSION: These three experiments demonstrate that visuo-tactile interaction in binocular rivalry can be mediated by roughness.
Affiliation(s)
- Mikoto Ono
- Department of Informatics, Graduate School of Information Science and Electrical Engineering, Kyushu University, 744 Motooka, Nishi-ku, Fukuoka City, Fukuoka 819-0395, Japan
- Nobuyuki Hirose
- Department of Informatics, Graduate School of Information Science and Electrical Engineering, Kyushu University, 744 Motooka, Nishi-ku, Fukuoka City, Fukuoka 819-0395, Japan
- Shuji Mori
- Department of Informatics, Graduate School of Information Science and Electrical Engineering, Kyushu University, 744 Motooka, Nishi-ku, Fukuoka City, Fukuoka 819-0395, Japan
5. Direction-selective modulation of visual motion rivalry by collocated tactile motion. Atten Percept Psychophys 2022; 84:899-914. PMID: 35194773. PMCID: PMC9001558. DOI: 10.3758/s13414-022-02453-y.
Abstract
Early models of multisensory integration posited that cross-modal signals converge only in higher-order association cortices and that vision automatically dominates. However, recent studies have challenged this view. In this study, we examined the importance of motion-axis alignment and spatial alignment across visual and tactile stimuli, as well as the effect of hand visibility, on visuo-tactile interactions. Using binocular rivalry, opposed motions were presented to each eye and participants were required to track the perceived visual direction. A tactile motion, either a leftward or a rightward sweep across the fingerpad, was presented intermittently. Results showed that tactile effects on visual percepts depended on the alignment of motion axes: rivalry between up/down visual motions was not modulated at all by left/right tactile motion. On the other hand, visual percepts could be altered by tactile motion signals when both modalities shared a common axis of motion: a tactile stimulus could maintain the dominance duration of a congruent visual stimulus and shorten its suppression period. The effects were also conditional on the spatial alignment of the visual and tactile stimuli, being eliminated when the tactile device was displaced 15 cm to the right of the visual stimulus. In contrast, visibility of the hand touching the tactile stimulus facilitated congruent switches relative to a visual-only baseline but did not confer a significant advantage overall. In sum, these results reveal a low-level sensory interaction that is conditional on the visual and tactile stimuli sharing a common motion axis and location in space.
6. Vestibular and active self-motion signals drive visual perception in binocular rivalry. iScience 2021; 24:103417. PMID: 34877486. PMCID: PMC8632839. DOI: 10.1016/j.isci.2021.103417.
Abstract
Multisensory integration helps the brain build reliable models of the world and resolve ambiguities. Visual interactions with sound and touch are well established, but vestibular influences on vision are less well studied. Here, we test the vestibular influence on vision using horizontally opposed motions presented one to each eye so that visual perception is unstable and alternates irregularly. Passive, whole-body rotations in the yaw plane stabilized visual alternations, with perceived direction oscillating congruently with rotation (leftward motion during leftward rotation, and vice versa). This demonstrates that a purely vestibular signal can resolve ambiguous visual motion and determine visual perception. Active self-rotation following the same sinusoidal profile also entrained vision to the rotation cycle, more strongly and with a shorter time lag, likely because of efference copy and predictive internal models. Both experiments show that visual ambiguity provides an effective paradigm to reveal how vestibular and motor inputs can shape visual perception.
Highlights:
- Binocular rivalry between left/right motions is stabilized by congruent head movement
- Left/right head rotations entrain rivalry dynamics so the matching direction is perceived
- Active and passive rotations both drive rivalry dominance to match rotation direction
- Resolving ambiguous vision occurs in a broader vestibular and action-based context
7. Motyka P, Kozłowska Z, Litwin P. Perceptual Awareness of Optic Flows Paced Optimally and Non-optimally to Walking Speed. Perception 2021; 50:797-818. PMID: 34459288. DOI: 10.1177/03010066211034368.
Abstract
Previous research suggests that visual processing depends strongly on locomotor activity and is tuned to optic flows consistent with self-motion speed. Here, we used a binocular rivalry paradigm to investigate whether perceptual access to optic flows depends on their optimality in relation to walking velocity. Participants walked at two different speeds on a treadmill while viewing discrepant visualizations of a virtual tunnel in each eye. We hypothesized that visualizations paced appropriately to the walking speed would be perceived longer than non-optimal (too fast/slow) ones. The presented optic-flow speeds were predetermined individually in a task based on matching visual speed to both walking velocities. In addition, perceptual preference for optimal optic flows was expected to increase with the proprioceptive ability to detect threshold-level changes in walking speed. Whereas faster (more familiar) optic flows showed enhanced access to awareness during faster compared with slower walking, for slower visual flows only a nonsignificant tendency toward the analogous effect was observed. These effects did not depend on individual proprioceptive sensitivity. Our findings concur with the emerging view that the velocity of one's locomotion is used to calibrate visual perception of self-motion, and they extend the scope of reported action effects on visual awareness.
Affiliation(s)
- Paweł Motyka
- Faculty of Psychology, University of Warsaw, Poland
- Piotr Litwin
- Faculty of Psychology, University of Warsaw, Poland
8. Delong P, Noppeney U. Semantic and spatial congruency mould audiovisual integration depending on perceptual awareness. Sci Rep 2021; 11:10832. PMID: 34035358. PMCID: PMC8149651. DOI: 10.1038/s41598-021-90183-w.
Abstract
Information integration is considered a hallmark of human consciousness. Recent research has challenged this tenet by showing multisensory interactions in the absence of awareness. This psychophysics study assessed the impact of spatial and semantic correspondences on audiovisual binding in the presence and absence of visual awareness by combining forward-backward masking with spatial ventriloquism. Observers were presented with object pictures and synchronous sounds that were spatially and/or semantically congruent or incongruent. On each trial observers located the sound, identified the picture and rated the picture's visibility. We observed a robust ventriloquist effect for subjectively visible and invisible pictures indicating that pictures that evade our perceptual awareness influence where we perceive sounds. Critically, semantic congruency enhanced these visual biases on perceived sound location only when the picture entered observers' awareness. Our results demonstrate that crossmodal influences operating from vision to audition and vice versa are interactively controlled by spatial and semantic congruency in the presence of awareness. However, when visual processing is disrupted by masking procedures audiovisual interactions no longer depend on semantic correspondences.
Affiliation(s)
- Patrycja Delong
- Centre for Computational Neuroscience and Cognitive Robotics, University of Birmingham, Birmingham, UK
- Uta Noppeney
- Centre for Computational Neuroscience and Cognitive Robotics, University of Birmingham, Birmingham, UK
- Donders Institute for Brain, Cognition and Behaviour, Radboud University, Nijmegen, The Netherlands
9. Motyka P, Akbal M, Litwin P. Forward optic flow is prioritised in visual awareness independently of walking direction. PLoS One 2021; 16:e0250905. PMID: 33945563. PMCID: PMC8096117. DOI: 10.1371/journal.pone.0250905.
Abstract
When two different images are presented separately to each eye, one experiences alternations between them, a phenomenon called binocular rivalry. Previous studies have shown that exposure to signals from other senses can enhance the access of stimulation-congruent images to conscious perception. However, despite our ability to infer perceptual consequences from bodily movements, evidence that action can have an analogous influence on visual awareness is scarce and mainly limited to hand movements. Here, we investigated whether one's direction of locomotion affects perceptual access to optic flow patterns during binocular rivalry. Participants walked forwards and backwards on a treadmill while viewing highly realistic visualisations of self-motion in a virtual environment. We hypothesised that visualisations congruent with walking direction would predominate in visual awareness over incongruent ones, and that this effect would increase with the precision of one's active proprioception. These predictions were not confirmed: optic flow consistent with forward locomotion was prioritised in visual awareness independently of walking direction and proprioceptive abilities. Our findings suggest a limited role for kinaesthetic-proprioceptive information in disambiguating the visually perceived direction of self-motion and indicate that vision might be tuned to the (expanding) optic flow patterns prevalent in everyday life.
Affiliation(s)
- Paweł Motyka
- Faculty of Psychology, University of Warsaw, Warsaw, Poland
- Mert Akbal
- Department of Neurology, Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
- Academy of Fine Arts Saar, Saarbrücken, Germany
- Piotr Litwin
- Faculty of Psychology, University of Warsaw, Warsaw, Poland
10. Lunghi C, Galli-Resta L, Binda P, Cicchini GM, Placidi G, Falsini B, Morrone MC. Visual Cortical Plasticity in Retinitis Pigmentosa. Invest Ophthalmol Vis Sci 2019; 60:2753-2763. PMID: 31247082. PMCID: PMC6746622. DOI: 10.1167/iovs.18-25750.
Abstract
Purpose: Retinitis pigmentosa is a family of genetic diseases inducing progressive photoreceptor degeneration. There is no cure for retinitis pigmentosa, but prospective therapeutic strategies are aimed at restoring or substituting retinal input. Yet, it is unclear whether the visual cortex of retinitis pigmentosa patients retains the plasticity needed to react to the restored visual input.
Methods: To investigate short-term visual cortical plasticity in retinitis pigmentosa, we tested the effect of short-term (2 hours) monocular deprivation on sensory ocular dominance (measured with binocular rivalry) in a group of 14 patients diagnosed with retinitis pigmentosa with central visual field sparing greater than 20° in diameter.
Results: After deprivation, most patients showed a perceptual shift in ocular dominance in favor of the deprived eye (P < 0.001), as did control subjects, indicating a level of visual cortical plasticity in the normal range. The deprivation effect correlated negatively with visual acuity (r = −0.63, P = 0.015) and with the amplitude of the central 18° focal electroretinogram (r = −0.68, P = 0.015) of the deprived eye, revealing that in retinitis pigmentosa stronger visual impairment is associated with higher plasticity.
Conclusions: Our results provide a new tool to assess the ability of retinitis pigmentosa patients to adapt to altered visual inputs and suggest that in retinitis pigmentosa the adult brain has sufficient short-term plasticity to benefit from prospective therapies.
Affiliation(s)
- Claudia Lunghi
- Laboratoire des systèmes perceptifs, Département d'études Cognitives, École Normale Supérieure, PSL University, CNRS, Paris, France
- Department of Translational Research and New Technologies in Medicine and Surgery, University of Pisa, Pisa, Italy
- Paola Binda
- Department of Translational Research and New Technologies in Medicine and Surgery, University of Pisa, Pisa, Italy
- Institute of Neuroscience CNR, Pisa, Italy
- Giorgio Placidi
- Department of Ophthalmology, Policlinico Gemelli, Università Cattolica del Sacro Cuore, Rome, Italy
- Benedetto Falsini
- Department of Ophthalmology, Policlinico Gemelli, Università Cattolica del Sacro Cuore, Rome, Italy
- Maria Concetta Morrone
- Department of Translational Research and New Technologies in Medicine and Surgery, University of Pisa, Pisa, Italy
- IRCCS Stella Maris, Calambrone (Pisa), Italy
11. Kim S, Kim J. Effects of Multimodal Association on Ambiguous Perception in Binocular Rivalry. Perception 2019; 48:796-819. DOI: 10.1177/0301006619867023.
Abstract
When the two eyes view dissimilar images, an observer typically experiences ambiguous perception, called binocular rivalry, in which subjective perception fluctuates between the two inputs. This perceptual instability often comprises exclusive dominance of each image and a transition state, the piecemeal state, in which the two images are intermingled in a patchwork manner. Here, we investigated the effects of multimodal association for a sensory-congruent pair, an arbitrary pair, and a reverse pair on the piecemeal state, in order to see how each level of association affects ambiguous perception during binocular rivalry. To induce the multisensory associations, we designed a matching task with audiovisual feedback in which subjects were required to respond according to given pairing rules. We found that explicit audiovisual associations can substantially affect the piecemeal state during binocular rivalry and that this congruency effect, which reduces the amount of visual ambiguity, originates primarily from explicit audiovisual association training rather than from common sensory features. Furthermore, when one item is associated with multiple others, recent and preexisting associations work collectively to influence perceptual ambiguity during rivalry. Our findings show that learned multimodal associations directly affect the temporal dynamics of ambiguous perception during binocular rivalry by modulating not only exclusive dominance but also the piecemeal state in a systematic manner.
Affiliation(s)
- Sungyong Kim
- Graduate School of Culture Technology, Korea Advanced Institute of Science and Technology, Daejeon, Republic of Korea
- Jeounghoon Kim
- Graduate School of Culture Technology, Korea Advanced Institute of Science and Technology, Daejeon, Republic of Korea
- School of Humanities and Social Sciences, Korea Advanced Institute of Science and Technology, Daejeon, Republic of Korea
12.
Abstract
There is an ongoing debate about whether multisensory interactions require awareness of the sensory signals. Static visual and tactile stimuli have been shown to influence each other even in the absence of visual awareness. However, it is unclear whether this finding generalizes to dynamic contexts. In the present study, we presented visual and tactile motion stimuli and induced fluctuations of visual awareness by means of binocular rivalry: two gratings drifting in opposite directions were displayed, one to each eye. One visual motion stimulus dominated and reached awareness while the other was suppressed from awareness. Tactile motion stimuli were presented at random time points during the visual stimulation. The motion direction of a tactile stimulus always matched the direction of one of the concurrently presented visual stimuli. The visual gratings were differently tinted, and participants reported the color of the currently seen stimulus. Tactile motion delayed the perceptual switches that ended dominance periods of congruently moving visual stimuli, compared with switches during visual-only stimulation. In addition, tactile motion fostered the return to dominance of suppressed, congruently moving visual stimuli, but only if the tactile motion started at a late stage of the ongoing visual suppression period, when perceptual suppression is typically already decreasing. These results suggest that visual awareness facilitates but does not gate multisensory interactions between visual and tactile motion signals.
13. Davidson MJ, Alais D, van Boxtel JJA, Tsuchiya N. Attention periodically samples competing stimuli during binocular rivalry. eLife 2018; 7:e40868. PMID: 30507378. PMCID: PMC6298779. DOI: 10.7554/elife.40868.
Abstract
The attentional sampling hypothesis suggests that attention rhythmically enhances sensory processing when attending to a single (~8 Hz), or multiple (~4 Hz) objects. Here, we investigated whether attention samples sensory representations that are not part of the conscious percept during binocular rivalry. When crossmodally cued toward a conscious image, subsequent changes in consciousness occurred at ~8 Hz, consistent with the rates of undivided attentional sampling. However, when attention was cued toward the suppressed image, changes in consciousness slowed to ~3.5 Hz, indicating the division of attention away from the conscious visual image. In the electroencephalogram, we found that at attentional sampling frequencies, the strength of inter-trial phase-coherence over fronto-temporal and parieto-occipital regions correlated with changes in perception. When cues were not task-relevant, these effects disappeared, confirming that perceptual changes were dependent upon the allocation of attention, and that attention can flexibly sample away from a conscious image in a task-dependent manner.
Affiliation(s)
- Matthew J Davidson
- School of Psychological Sciences, Faculty of Medicine, Nursing, and Health Sciences, Monash University, Melbourne, Australia
- Monash Institute of Cognitive and Clinical Neurosciences, Monash University, Melbourne, Australia
- David Alais
- School of Psychology, The University of Sydney, Camperdown, Australia
- Jeroen JA van Boxtel
- School of Psychological Sciences, Faculty of Medicine, Nursing, and Health Sciences, Monash University, Melbourne, Australia
- Monash Institute of Cognitive and Clinical Neurosciences, Monash University, Melbourne, Australia
- School of Psychology, Faculty of Health, University of Canberra, Canberra, Australia
- Naotsugu Tsuchiya
- School of Psychological Sciences, Faculty of Medicine, Nursing, and Health Sciences, Monash University, Melbourne, Australia
- Monash Institute of Cognitive and Clinical Neurosciences, Monash University, Melbourne, Australia
14.
Abstract
The spatial context in which we view a visual stimulus strongly determines how we perceive the stimulus. In the visual tilt illusion, the perceived orientation of a visual grating is affected by the orientation signals in its surrounding context. Conceivably, the spatial context in which a visual grating is perceived can be defined by interactive multisensory information rather than visual signals alone. Here, we tested the hypothesis that tactile signals engage the neural mechanisms supporting visual contextual modulation. Because tactile signals also convey orientation information and touch can selectively interact with visual orientation perception, we predicted that tactile signals would modulate the visual tilt illusion. We applied a bias-free method to measure the tilt illusion while testing visual-only, tactile-only or visuo-tactile contextual surrounds. We found that a tactile context can influence visual tilt perception. Moreover, combining visual and tactile orientation information in the surround results in a larger tilt illusion relative to the illusion achieved with the visual-only surround. These results demonstrate that the visual tilt illusion is subject to multisensory influences and imply that non-visual signals access the neural circuits whose computations underlie the contextual modulation of vision.
15. Delong P, Aller M, Giani AS, Rohe T, Conrad V, Watanabe M, Noppeney U. Invisible Flashes Alter Perceived Sound Location. Sci Rep 2018; 8:12376. PMID: 30120294. PMCID: PMC6098122. DOI: 10.1038/s41598-018-30773-3.
Abstract
Information integration across the senses is fundamental for effective interactions with our environment. The extent to which signals from different senses can interact in the absence of awareness is controversial. Combining the spatial ventriloquist illusion and dynamic continuous flash suppression (dCFS), we investigated in two experiments whether visual signals that observers do not consciously perceive can influence the spatial perception of sounds. Importantly, dCFS obliterated visual awareness on only a fraction of trials, allowing us to compare spatial ventriloquism for physically identical flashes that were judged as visible or invisible. Our results show a stronger ventriloquist effect for visible than for invisible flashes. Critically, a robust ventriloquist effect emerged also for invisible flashes, even when participants were at chance when locating the flash. Collectively, our findings demonstrate that signals we are not aware of in one sensory modality can alter the spatial perception of signals in another sensory modality.
Collapse
Affiliation(s)
- Patrycja Delong
- Computational Neuroscience and Cognitive Robotics Centre, University of Birmingham, B15 2TT, Birmingham, UK.
| | - Máté Aller
- Computational Neuroscience and Cognitive Robotics Centre, University of Birmingham, B15 2TT, Birmingham, UK
| | - Anette S Giani
- Max Planck Institute for Biological Cybernetics, 72076, Tübingen, Germany
| | - Tim Rohe
- Max Planck Institute for Biological Cybernetics, 72076, Tübingen, Germany
| | - Verena Conrad
- Max Planck Institute for Biological Cybernetics, 72076, Tübingen, Germany
| | - Masataka Watanabe
- Max Planck Institute for Biological Cybernetics, 72076, Tübingen, Germany
| | - Uta Noppeney
- Computational Neuroscience and Cognitive Robotics Centre, University of Birmingham, B15 2TT, Birmingham, UK
- Max Planck Institute for Biological Cybernetics, 72076, Tübingen, Germany
| |
Collapse
|
16
|
Deroy O, Faivre N, Lunghi C, Spence C, Aller M, Noppeney U. The Complex Interplay Between Multisensory Integration and Perceptual Awareness. Multisens Res 2018; 29:585-606. [PMID: 27795942 DOI: 10.1163/22134808-00002529] [Citation(s) in RCA: 27] [Impact Index Per Article: 4.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/22/2022]
Abstract
The integration of information has been considered a hallmark of human consciousness, as it requires that information be made globally available through widespread neural interactions. Yet the complex interdependencies between multisensory integration and perceptual awareness, or consciousness, remain to be defined. While perceptual awareness has traditionally been studied in a single sense, in recent years we have witnessed a surge of interest in the role of multisensory integration in perceptual awareness. Based on a recent IMRF symposium on multisensory awareness, this review discusses three key questions from conceptual, methodological, and experimental perspectives: (1) What do we study when we study multisensory awareness? (2) What is the relationship between multisensory integration and perceptual awareness? (3) Which experimental approaches are most promising for characterizing multisensory awareness? We hope that this review paper will provoke lively discussions, novel experiments, and conceptual considerations to advance our understanding of the multifaceted interplay between multisensory integration and consciousness.
Collapse
Affiliation(s)
- O Deroy
- Centre for the Study of the Senses, Institute of Philosophy, School of Advanced Study, University of London, London, UK
| | - N Faivre
- Laboratory of Cognitive Neuroscience, Brain Mind Institute, Ecole Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
| | - C Lunghi
- Department of Translational Research on New Technologies in Medicine and Surgery, University of Pisa, Pisa, Italy
| | - C Spence
- Crossmodal Research Laboratory, Department of Experimental Psychology, Oxford University, Oxford, UK
| | - M Aller
- Computational Neuroscience and Cognitive Robotics Centre, University of Birmingham, Birmingham, UK
| | - U Noppeney
- Computational Neuroscience and Cognitive Robotics Centre, University of Birmingham, Birmingham, UK
| |
Collapse
|
17
|
Piazza EA, Denison RN, Silver MA. Recent cross-modal statistical learning influences visual perceptual selection. J Vis 2018; 18:1. [PMID: 29497742 PMCID: PMC5837665 DOI: 10.1167/18.3.1] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/24/2022] Open
Abstract
Incoming sensory signals are often ambiguous and consistent with multiple perceptual interpretations. Information from one sensory modality can help to resolve ambiguity in another modality, but the mechanisms by which multisensory associations come to influence the contents of conscious perception are unclear. We asked whether and how novel statistical information about the coupling between sounds and images influences the early stages of awareness of visual stimuli. We exposed subjects to consistent, arbitrary pairings of sounds and images and then measured the impact of this recent passive statistical learning on subjects' initial conscious perception of a stimulus by employing binocular rivalry, a phenomenon in which incompatible images presented separately to the two eyes result in a perceptual alternation between the two images. On each trial of the rivalry test, subjects were presented with a pair of rivalrous images (one of which had been consistently paired with a specific sound during exposure while the other had not) and an accompanying sound. We found that, at the onset of binocular rivalry, an image was significantly more likely to be perceived, and was perceived for a longer duration, when it was presented with its paired sound than when presented with other sounds. Our results indicate that recently acquired multisensory information helps resolve sensory ambiguity, and they demonstrate that statistical learning is a fast, flexible mechanism that facilitates this process.
Collapse
Affiliation(s)
- Elise A Piazza
- Princeton Neuroscience Institute, Princeton University, Princeton, NJ, USA
- Vision Science Graduate Group, University of California, Berkeley, Berkeley, CA, USA
- Helen Wills Neuroscience Institute, University of California, Berkeley, Berkeley, CA, USA
| | - Rachel N Denison
- Helen Wills Neuroscience Institute, University of California, Berkeley, Berkeley, CA, USA
- Department of Psychology and Center for Neural Science, New York University, New York, NY, USA
| | - Michael A Silver
- Vision Science Graduate Group, University of California, Berkeley, Berkeley, CA, USA
- Helen Wills Neuroscience Institute, University of California, Berkeley, Berkeley, CA, USA
- School of Optometry, University of California, Berkeley, Berkeley, CA, USA
| |
Collapse
|
18
|
Foisy A, Kapoula Z. Plantar Exteroceptive Inefficiency causes an asynergic use of plantar and visual afferents for postural control: Best means of remediation. Brain Behav 2017. [PMID: 28638699 PMCID: PMC5474697 DOI: 10.1002/brb3.658] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Indexed: 11/07/2022] Open
Abstract
INTRODUCTION Some subjects have difficulty integrating both visual and plantar inputs, showing at the same time a "postural blindness" and a Plantar Exteroceptive Inefficiency (PEI). The former corresponds to better stability with eyes closed (EC) than with eyes open (EO), while the latter is defined as better stability on foam than on firm ground. Clinical studies have reported that manipulating either plantar or visual input can affect the weight of both cues in postural control, suggesting interdependence in their use. The purpose of this experiment was to better characterize the PEI phenomenon and to determine whether such a synergy can be objectified. METHODS We recruited 48 subjects (25 ± 3.3 years) and assessed their balance with a force platform, EO and EC, at 40 or 200 cm, on firm ground, Dépron® foam, Dynachoc® foam, or on a 3 mm-thick Anterior Bar AB®. We assessed their sensorial preferences through their PQ and RQ. RESULTS The main result is that a synergy normally exists in the use of plantar and visual afferents, but only at 40 cm and in the absence of PEI. CONCLUSIONS Plantar Exteroceptive Inefficiency interferes with the role of vision in postural control; its effects are distance-specific and are better revealed by Dépron® foam, and the AB® improves posture but does not resolve the visual-podal asynergy. These results also have clinical relevance, as they indicate the best distance and choice of foam for diagnosing PEI. Finally, they suggest restricting the use of the AB®, which is commonly employed. These findings can be useful for clinicians concerned with foot, eye, and posture.
Collapse
Affiliation(s)
- Arnaud Foisy
- IRIS Team, Physiopathologie de la Vision et Motricité Binoculaire FR3636 Neurosciences CNRS, University Paris Descartes Paris France
| | - Zoï Kapoula
- IRIS Team, Physiopathologie de la Vision et Motricité Binoculaire FR3636 Neurosciences CNRS, University Paris Descartes Paris France
| |
Collapse
|
19
|
Newly acquired audio-visual associations bias perception in binocular rivalry. Vision Res 2017; 133:121-129. [DOI: 10.1016/j.visres.2017.02.001] [Citation(s) in RCA: 9] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/15/2016] [Revised: 02/11/2017] [Accepted: 02/17/2017] [Indexed: 11/16/2022]
|
20
|
Faivre N, Arzi A, Lunghi C, Salomon R. Consciousness is more than meets the eye: a call for a multisensory study of subjective experience. Neurosci Conscious 2017; 2017:nix003. [PMID: 30042838 PMCID: PMC6007148 DOI: 10.1093/nc/nix003] [Citation(s) in RCA: 17] [Impact Index Per Article: 2.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/02/2016] [Revised: 02/06/2017] [Accepted: 02/16/2017] [Indexed: 11/17/2022] Open
Abstract
Over the last 30 years, our understanding of the neurocognitive bases of consciousness has improved, mostly through studies employing vision. While studying consciousness in the visual modality presents clear advantages, we believe that a comprehensive scientific account of subjective experience must not neglect other exteroceptive and interoceptive signals, as well as the role of multisensory interactions in perceptual and self-consciousness. Here, we briefly review four distinct lines of work which converge in documenting how multisensory signals are processed across several levels and contents of consciousness. Namely, how multisensory interactions occur when consciousness is prevented because of perceptual manipulations (i.e., subliminal stimuli) or because of low vigilance states (i.e., sleep, anesthesia), how interactions between exteroceptive and interoceptive signals give rise to bodily self-consciousness, and how multisensory signals are combined to form metacognitive judgments. By describing the interactions between multisensory signals at the perceptual, cognitive, and metacognitive levels, we illustrate how stepping out of the visual comfort zone may help in deriving refined accounts of consciousness, and may allow us to cancel out the idiosyncrasies of each sense to delineate supramodal mechanisms involved in consciousness.
Collapse
Affiliation(s)
- Nathan Faivre
- Laboratory of Cognitive Neuroscience, Brain Mind Institute, Faculty of Life Sciences, Swiss Federal Institute of Technology (EPFL), Geneva, Switzerland
- Centre d’Economie de la Sorbonne, CNRS UMR 8174, Paris, France
| | - Anat Arzi
- Department of Psychology, University of Cambridge, Cambridge, UK
| | - Claudia Lunghi
- Department of Translational Research and New Technologies in Medicine and Surgery, University of Pisa, Pisa, Italy
- Institute of Neuroscience, National Research Council (CNR), Pisa, Italy
| | - Roy Salomon
- Gonda Multidisciplinary Brain Research Center, Bar-Ilan University, Ramat-Gan, Israel
| |
Collapse
|
21
|
Abstract
To efficiently interact with the external environment, our nervous system combines information arising from different sensory modalities. Recent evidence suggests that cross-modal interactions can be automatic and even unconscious, reflecting the ecological relevance of cross-modal processing. Here, we use continuous flash suppression (CFS) to directly investigate whether haptic signals can interact with visual signals outside of visual awareness. We measured suppression durations of visual gratings rendered invisible by CFS either during visual stimulation alone or during visuo-haptic stimulation. We found that active exploration of a haptic grating congruent in orientation with the suppressed visual grating reduced suppression durations both compared with visual-only stimulation and to incongruent visuo-haptic stimulation. We also found that the facilitatory effect of touch on visual suppression disappeared when the visual and haptic gratings were mismatched in either spatial frequency or orientation. Together, these results demonstrate that congruent touch can accelerate the rise to consciousness of a suppressed visual stimulus and that this unconscious cross-modal interaction depends on visuo-haptic congruency. Furthermore, since CFS suppression is thought to occur early in visual cortical processing, our data reinforce the evidence suggesting that visuo-haptic interactions can occur at the earliest stages of cortical processing.
Collapse
Affiliation(s)
- Claudia Lunghi
- Department of Translational Research and New Technologies in Medicine and Surgery, University of Pisa, Pisa, Italy
- Institute of Neuroscience, CNR, Pisa, Italy
| | - Luca Lo Verde
- Institute of Neuroscience, CNR, Pisa, Italy
- Department NEUROFARBA, University of Florence, Florence, Italy
| | - David Alais
- School of Psychology, University of Sydney, NSW, Australia
| |
Collapse
|
22
|
Abstract
It is known that, after a prolonged period of visual deprivation, the adult visual cortex can be recruited for nonvisual processing, reflecting cross-modal plasticity. Here, we investigated whether cross-modal plasticity can occur at short timescales in the typical adult brain by comparing the interaction between vision and touch during binocular rivalry before and after a brief period of monocular deprivation, which strongly alters ocular balance in favor of the deprived eye. While viewing dichoptically two gratings of orthogonal orientation, participants were asked to actively explore a haptic grating congruent in orientation to one of the two rivalrous stimuli. We repeated this procedure before and after 150 min of monocular deprivation. We first confirmed that haptic stimulation interacted with vision during rivalry, promoting dominance of the congruent visuo-haptic stimulus, and that monocular deprivation increased the dominance of the deprived eye and decreased that of the nondeprived eye. Interestingly, after deprivation, we found that the effect of touch did not change for the nondeprived eye, whereas it disappeared for the deprived eye, which was potentiated after deprivation. The absence of visuo-haptic interaction for the deprived eye lasted for over 1 hr and was not attributable to masking induced by the stronger response of the deprived eye, as confirmed by a control experiment. Taken together, our results demonstrate that the adult human visual cortex retains a high degree of cross-modal plasticity, which can occur even at very short timescales.
Collapse
Affiliation(s)
- Luca Lo Verde
- University of Florence
- Institute of Neuroscience, Consiglio Nazionale Delle Ricerche, Pisa
| | | | - Claudia Lunghi
- Institute of Neuroscience, Consiglio Nazionale Delle Ricerche, Pisa
- University of Pisa
| |
Collapse
|
23
|
When audiovisual correspondence disturbs visual processing. Exp Brain Res 2016; 234:1325-32. [PMID: 26884130 DOI: 10.1007/s00221-016-4591-y] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/06/2015] [Accepted: 01/30/2016] [Indexed: 10/22/2022]
Abstract
Multisensory integration is known to create a more robust and reliable perceptual representation of one's environment. Specifically, a congruent auditory input can make a visual stimulus more salient, consequently enhancing the visibility and detection of the visual target. However, it remains largely unknown whether a congruent auditory input can also impair visual processing. In the current study, we demonstrate that temporally congruent auditory input disrupts visual processing, consequently slowing down visual target detection. More importantly, this cross-modal inhibition occurs only when the contrast of visual targets is high. When the contrast of visual targets is low, enhancement of visual target detection is observed, consistent with the prediction based on the principle of inverse effectiveness (PIE) in cross-modal integration. This switch of the behavioral effect of audiovisual interaction from benefit to cost further extends the PIE to encompass suppressive cross-modal interactions.
Collapse
|
24
|
Niechwiej-Szwedo E, Chin J, Wolfe PJ, Popovich C, Staines WR. Abnormal visual experience during development alters the early stages of visual-tactile integration. Behav Brain Res 2016; 304:111-9. [PMID: 26896697 DOI: 10.1016/j.bbr.2016.02.018] [Citation(s) in RCA: 10] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/10/2015] [Revised: 02/10/2016] [Accepted: 02/13/2016] [Indexed: 11/18/2022]
Abstract
Visual experience during the critical periods in early postnatal life is necessary for the normal development of the visual system. Disruption of visual input during this period results in amblyopia, which is associated with reduced activation of the striate and extrastriate cortices. It is well known that visual input converges with other sensory signals and exerts a significant influence on cortical processing in multiple association areas. Recent work in healthy adults has also shown that task-relevant visual input can modulate neural excitability at very early stages of information processing in the primary somatosensory cortex. Here we used electroencephalography to investigate visual-tactile interactions in adults with abnormal binocular vision due to amblyopia and strabismus. The results revealed three main findings. First, in comparison to a visually normal control group, participants with abnormal vision had a significantly lower amplitude of the P50 somatosensory event-related potential (ERP) when visual and tactile stimuli were presented concurrently. Second, the amplitude of the P100 somatosensory ERP was significantly greater in participants with abnormal vision. These results indicate that task-relevant visual input does not significantly influence the excitability of the primary somatosensory cortex; instead, the excitability of the secondary somatosensory cortex is increased. Third, participants with abnormal vision had a higher amplitude of the P1 visual ERP when a tactile stimulus was presented concurrently. Importantly, these results were not modulated by viewing condition, which indicates that the impact of amblyopia on crossmodal interactions is not simply related to reduced visual acuity, as the effect was evident when viewing with the unaffected eye and binocularly. These results indicate that the consequences of abnormal visual experience on neurophysiological processing extend beyond the primary and secondary visual areas to other modality-specific areas.
Collapse
Affiliation(s)
| | - Jessica Chin
- Department of Kinesiology, University of Waterloo, Waterloo, Canada
| | - Paul J Wolfe
- Department of Kinesiology, University of Waterloo, Waterloo, Canada
| | | | | |
Collapse
|