1
Chetverikov A, Jehee JFM. Motion direction is represented as a bimodal probability distribution in the human visual cortex. Nat Commun 2023; 14:7634. PMID: 37993430; PMCID: PMC10665457; DOI: 10.1038/s41467-023-43251-w.
Abstract
Humans infer motion direction from noisy sensory signals. We hypothesize that to make these inferences more precise, the visual system computes motion direction not only from velocity but also spatial orientation signals - a 'streak' created by moving objects. We implement this hypothesis in a Bayesian model, which quantifies knowledge with probability distributions, and test its predictions using psychophysics and fMRI. Using a probabilistic pattern-based analysis, we decode probability distributions of motion direction from trial-by-trial activity in the human visual cortex. Corroborating the predictions, the decoded distributions have a bimodal shape, with peaks that predict the direction and magnitude of behavioral errors. Interestingly, we observe similar bimodality in the distribution of the observers' behavioral responses across trials. Together, these results suggest that observers use spatial orientation signals when estimating motion direction. More broadly, our findings indicate that the cortical representation of low-level visual features, such as motion direction, can reflect a combination of several qualitatively distinct signals.
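The abstract does not spell out the model itself; as a rough, self-contained sketch of the core idea (a unimodal velocity likelihood multiplied by a 180-degree-ambiguous orientation "streak" likelihood yields a bimodal posterior with an attenuated opposite-direction peak), with all parameter values purely illustrative:

```python
import numpy as np

theta = np.deg2rad(np.arange(360.0))   # motion-direction grid, one point per degree
true_dir = np.deg2rad(90.0)            # hypothetical true direction

def von_mises(x, mu, kappa):
    # Unnormalized von Mises density; the normalizer cancels below.
    return np.exp(kappa * np.cos(x - mu))

# Velocity cue: unimodal likelihood centered on the true direction.
lik_velocity = von_mises(theta, true_dir, kappa=2.0)
# Orientation "streak" cue: orientation is 180-deg ambiguous, so its
# likelihood peaks at both the true direction and the opposite one.
lik_streak = von_mises(2.0 * theta, 2.0 * true_dir, kappa=3.0)

posterior = lik_velocity * lik_streak  # flat prior assumed
posterior /= posterior.sum()

peak = int(np.argmax(posterior))       # primary peak: the true direction
```

With these settings the primary peak sits at 90 degrees and a smaller local maximum appears at 270 degrees, i.e. the posterior is bimodal, matching the shape the authors decode from cortex.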
Affiliation(s)
- Andrey Chetverikov: Donders Institute for Brain, Cognition, and Behavior, Radboud University, Nijmegen, The Netherlands; Department of Psychosocial Science, Faculty of Psychology, University of Bergen, Bergen, Norway.
- Janneke F M Jehee: Donders Institute for Brain, Cognition, and Behavior, Radboud University, Nijmegen, The Netherlands.
2
Retter TL, Webster MA, Jiang F. Directional Visual Motion Is Represented in the Auditory and Association Cortices of Early Deaf Individuals. J Cogn Neurosci 2019; 31:1126-1140. PMID: 30726181; PMCID: PMC6599583; DOI: 10.1162/jocn_a_01378.
Abstract
Individuals who are deaf since early life may show enhanced performance at some visual tasks, including discrimination of directional motion. The neural substrates of such behavioral enhancements remain difficult to identify in humans, although neural plasticity has been shown for early deaf people in the auditory and association cortices, including the primary auditory cortex (PAC) and STS region, respectively. Here, we investigated whether neural responses in auditory and association cortices of early deaf individuals are reorganized to be sensitive to directional visual motion. To capture direction-selective responses, we recorded fMRI responses frequency-tagged to the 0.1-Hz presentation of central directional (100% coherent random dot) motion persisting for 2 sec contrasted with nondirectional (0% coherent) motion for 8 sec. We found direction-selective responses in the STS region in both deaf and hearing participants, but the extent of activation in the right STS region was 5.5 times larger for deaf participants. Minimal but significant direction-selective responses were also found in the PAC of deaf participants, both at the group level and in five of six individuals. In response to stimuli presented separately in the right and left visual fields, the relative activation across the right and left hemispheres was similar in both the PAC and STS region of deaf participants. Notably, the enhanced right-hemisphere activation could support the right visual field advantage reported previously in behavioral studies. Taken together, these results show that the reorganized auditory cortices of early deaf individuals are sensitive to directional motion. Speculatively, these results suggest that auditory and association regions can be remapped to support enhanced visual performance.
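Frequency tagging of this kind can be illustrated with a toy simulation: a response locked to the 0.1-Hz stimulation cycle is recovered as a spectral peak at the tag frequency and quantified against neighboring frequency bins. This is a generic sketch, not the authors' analysis pipeline, and every value in it is illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
tr, n = 1.0, 300                 # illustrative sampling interval (s) and volume count
t = np.arange(n) * tr
f_tag = 0.1                      # stimulation frequency: one 10-s cycle (2 s + 8 s)

# Simulated voxel time course: a response locked to the stimulation cycle plus noise.
bold = 1.5 * np.sin(2 * np.pi * f_tag * t) + rng.normal(0.0, 1.0, n)

amp = np.abs(np.fft.rfft(bold)) / n
freqs = np.fft.rfftfreq(n, d=tr)
i_tag = int(np.argmin(np.abs(freqs - f_tag)))

# Tagged-response strength: amplitude at the tag frequency relative to
# nearby bins (skipping immediate neighbors to avoid spectral leakage).
neighbors = np.r_[amp[i_tag - 3:i_tag - 1], amp[i_tag + 2:i_tag + 4]]
snr = amp[i_tag] / neighbors.mean()
```

A direction-selective region would show `snr` well above 1 when directional and nondirectional periods alternate at the tag frequency.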
3
Disentangling locus of perceptual learning in the visual hierarchy of motion processing. Sci Rep 2019; 9:1557. PMID: 30733535; PMCID: PMC6367332; DOI: 10.1038/s41598-018-37892-x.
Abstract
Visual perceptual learning (VPL) can lead to long-lasting perceptual improvements. One of the central topics in VPL studies is the locus of plasticity in the visual processing hierarchy. Here, we tackled this question in the context of motion processing. We took advantage of an established transition from component-dependent representations at the earliest level to pattern-dependent representations at the middle level of cortical motion processing. Two groups of participants were trained on the same motion direction identification task using either grating or plaid stimuli. A set of pre- and post-training tests was used to determine the degree of learning specificity and generalizability. This approach allowed us to disentangle the contributions of different processing stages to behavioral improvements. We observed complete bidirectional transfer of learning between component and pattern stimuli moving in the same directions, indicating learning-induced plasticity at intermediate levels of motion processing. Moreover, we found that motion VPL is specific to the trained stimulus direction, speed, size, and contrast, arguing against non-sensory, decision-level enhancements. Taken together, these results indicate that, at least for the stimuli and task used here, motion VPL most likely alters visual computations at the middle stage of motion processing.
4
Representation of Auditory Motion Directions and Sound Source Locations in the Human Planum Temporale. J Neurosci 2019; 39:2208-2220. PMID: 30651333; DOI: 10.1523/JNEUROSCI.2289-18.2018.
Abstract
The ability to compute the location and direction of sounds is a crucial perceptual skill for efficiently interacting with dynamic environments. How the human brain implements spatial hearing is, however, poorly understood. In our study, we used fMRI to characterize the brain activity of male and female humans listening to sounds moving left, right, up, and down, as well as static sounds. Whole-brain univariate results contrasting moving and static sounds varying in their location revealed a robust functional preference for auditory motion in bilateral human planum temporale (hPT). Using independently localized hPT, we show that this region contains information about auditory motion directions and, to a lesser extent, sound source locations. Moreover, hPT showed an axis-of-motion organization reminiscent of the functional organization of the middle-temporal cortex (hMT+/V5) for vision. Importantly, whereas motion direction and location rely on partially shared pattern geometries in hPT, as demonstrated by successful cross-condition decoding, the responses elicited by static and moving sounds were significantly distinct. Altogether, our results demonstrate that hPT codes for auditory motion and location, but that the underlying neural computation linked to motion processing is more reliable and partially distinct from the one supporting sound source location.
SIGNIFICANCE STATEMENT Compared with what we know about visual motion, little is known about how the brain implements spatial hearing. Our study reveals that motion directions and sound source locations can be reliably decoded in the human planum temporale (hPT) and that they rely on partially shared pattern geometries. Our study therefore sheds important new light on how computing the location or direction of sounds is implemented in the human auditory cortex, by showing that those two computations rely on partially shared neural codes. Furthermore, our results show that the neural representation of moving sounds in hPT follows a "preferred axis of motion" organization, reminiscent of the coding mechanisms typically observed in the occipital middle-temporal cortex (hMT+/V5) region for computing visual motion.
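Cross-condition decoding, as invoked above, means training a classifier on patterns from one condition and testing it on the other; shared pattern geometry shows up as above-chance transfer. A toy numpy sketch under simplified assumptions (nearest-centroid classifier, per-condition mean patterns removed; every name and parameter here is hypothetical, not taken from the study):

```python
import numpy as np

rng = np.random.default_rng(1)
n_vox, n_per_class, n_classes = 60, 40, 4   # illustrative sizes

base = rng.normal(0.0, 1.0, (n_classes, n_vox))  # shared pattern geometry
offset = rng.normal(0.0, 2.0, n_vox)             # condition-specific shift

def simulate(means):
    # Draw noisy trial patterns around each class mean.
    X = np.vstack([m + rng.normal(0.0, 1.0, (n_per_class, n_vox)) for m in means])
    y = np.repeat(np.arange(n_classes), n_per_class)
    return X, y

X_train, y_train = simulate(base)           # "moving" condition
X_test, y_test = simulate(base + offset)    # "static" condition, shifted geometry

# Remove each condition's mean pattern so only the shared geometry remains.
X_train -= X_train.mean(axis=0)
X_test -= X_test.mean(axis=0)

centroids = np.array([X_train[y_train == k].mean(axis=0) for k in range(n_classes)])
dists = ((X_test[:, None, :] - centroids[None]) ** 2).sum(-1)
acc = (dists.argmin(axis=1) == y_test).mean()   # cross-condition accuracy; chance = 0.25
```

Because the two simulated conditions share class geometry up to a common shift, the decoder transfers well above the 25% chance level.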
5
Cocchi L, Yang Z, Zalesky A, Stelzer J, Hearne LJ, Gollo LL, Mattingley JB. Neural decoding of visual stimuli varies with fluctuations in global network efficiency. Hum Brain Mapp 2017; 38:3069-3080. PMID: 28342260; DOI: 10.1002/hbm.23574.
Abstract
Functional magnetic resonance imaging (fMRI) studies have shown that neural activity fluctuates spontaneously between different states of global synchronization over a timescale of several seconds. Such fluctuations generate transient states of high and low correlation across distributed cortical areas. It has been hypothesized that such fluctuations in global efficiency might alter the patterns of activity elicited in local neuronal populations by incoming sensory stimuli. To test this prediction, we used a linear decoder to discriminate patterns of neural activity elicited by face and motion stimuli presented periodically while participants underwent time-resolved fMRI. As predicted, decoding was reliably higher during states of high global efficiency than during states of low efficiency, and this difference was evident across both visual and nonvisual cortical regions. The results indicate that slow fluctuations in global network efficiency are associated with variations in the pattern of activity across widespread cortical regions responsible for representing distinct categories of visual stimulus. More broadly, the findings highlight the importance of understanding the impact of global fluctuations in functional connectivity on specialized, stimulus-driven neural processes.
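A linear decoder of the kind described can be sketched in a few lines. Here the high- and low-efficiency states are modeled crudely as low versus high trial-to-trial noise; that mapping, and every parameter below, is an assumption of this illustration rather than the authors' method:

```python
import numpy as np

rng = np.random.default_rng(2)
n_vox = 50
mu_face = rng.normal(0.0, 1.0, n_vox)    # hypothetical class mean patterns
mu_motion = rng.normal(0.0, 1.0, n_vox)

def run_state(noise_sd, n_trials=200):
    # Simulate trials of each class, fit class centroids on one half,
    # and score a nearest-centroid linear decoder on the other half.
    Xf = mu_face + rng.normal(0.0, noise_sd, (n_trials, n_vox))
    Xm = mu_motion + rng.normal(0.0, noise_sd, (n_trials, n_vox))
    half = n_trials // 2
    cf, cm = Xf[:half].mean(0), Xm[:half].mean(0)
    test = np.vstack([Xf[half:], Xm[half:]])
    labels = np.r_[np.zeros(half), np.ones(half)]
    pred = (((test - cm) ** 2).sum(1) < ((test - cf) ** 2).sum(1)).astype(float)
    return (pred == labels).mean()

acc_high_eff = run_state(noise_sd=1.0)   # "high global efficiency": cleaner patterns
acc_low_eff = run_state(noise_sd=6.0)    # "low efficiency": noisier patterns
```

Under this simplification, decoding accuracy is higher in the low-noise ("high efficiency") state, mirroring the direction of the reported effect.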
Affiliation(s)
- Luca Cocchi: Queensland Brain Institute, The University of Queensland, Brisbane, Australia; QIMR Berghofer Medical Research Institute, Brisbane, Australia.
- Zhengyi Yang: Queensland Brain Institute, The University of Queensland, Brisbane, Australia; Brainnetome Center, Institute of Automation, Chinese Academy of Sciences, Beijing, China; School of Information Technology and Electrical Engineering, The University of Queensland, Brisbane, Australia.
- Andrew Zalesky: Melbourne Neuropsychiatry Centre, The University of Melbourne, Melbourne, Australia.
- Johannes Stelzer: Department of Biomedical Magnetic Resonance Imaging, University Hospital Tuebingen, Germany; Magnetic Resonance Centre, Max Planck Institute for Biological Cybernetics, Tuebingen, Germany.
- Luke J Hearne: Queensland Brain Institute, The University of Queensland, Brisbane, Australia.
- Jason B Mattingley: Queensland Brain Institute, The University of Queensland, Brisbane, Australia; School of Psychology, The University of Queensland, Brisbane, Australia.
6
Christophel TB, Haynes JD. Decoding complex flow-field patterns in visual working memory. Neuroimage 2014; 91:43-51. DOI: 10.1016/j.neuroimage.2014.01.025.
7
Palomares M, Ales JM, Wade AR, Cottereau BR, Norcia AM. Distinct effects of attention on the neural responses to form and motion processing: a SSVEP source-imaging study. J Vis 2012; 12:15. PMID: 23019120; DOI: 10.1167/12.10.15.
Abstract
We measured neural responses to local and global aspects of form and motion stimuli using frequency-tagged, steady-state visual evoked potentials (SSVEPs) combined with magnetic resonance imaging (MRI) data. Random dot stimuli were used to portray either dynamic Glass patterns (Glass, 1969) or coherent motion displays. SSVEPs were used to estimate neural activity in a set of fMRI-defined visual areas in each subject. To compare activity associated with local versus global processing, we analyzed two frequency components of the SSVEP in each visual area: the high temporal frequency at which the local dots were updated (30 Hz) and the much lower frequency corresponding to updates in the global structure (0.83 Hz). Local and global responses were evaluated in the context of two different behavioral tasks: subjects had to either direct their attention toward or away from the global coherence of the stimuli. The data show that the effect of attention on global and local responses is both stimulus and visual area dependent. When attention was directed away from stimulus coherence, both local and global responses were higher in the coherent motion than in the Glass pattern condition. Directing attention to coherence in Glass patterns enhanced global activity in areas LOC, hMT+, V4, V3a, and V1, while attention to global motion modulated responses by a smaller amount in a smaller set of areas: V4, hMT+, and LOC. In contrast, directing attention toward stimulus coherence weakly increased local responses to both coherent motion and Glass patterns. These results suggest that visual attention differentially modulates the activity of early visual areas at both local and global levels of structural encoding.
8
Abstract
Considerable information about mental states can be decoded from noninvasive measures of human brain activity. Analyses of brain activity patterns can reveal what a person is seeing, perceiving, attending to, or remembering. Moreover, multidimensional models can be used to investigate how the brain encodes complex visual scenes or abstract semantic information. Such feats of "brain reading" or "mind reading," though impressive, raise important conceptual, methodological, and ethical issues. What does successful decoding reveal about the cognitive functions performed by a brain region? How should brain signals be spatially selected and mathematically combined to ensure that decoding reflects inherent computations of the brain rather than those performed by the decoder? We highlight recent advances and describe how multivoxel pattern analysis can provide a window into mind-brain relationships with unprecedented specificity, when carefully applied. However, as brain-reading technology advances, issues of neuroethics and mental privacy will be important to consider.
Affiliation(s)
- Frank Tong: Psychology Department and Vanderbilt Vision Research Center, Vanderbilt University, Nashville, Tennessee 37240, USA.