1.
Hu Y, Yu Q. Spatiotemporal dynamics of self-generated imagery reveal a reverse cortical hierarchy from cue-induced imagery. Cell Rep 2023; 42:113242. PMID: 37831604. DOI: 10.1016/j.celrep.2023.113242.
Abstract
Visual imagery allows for the construction of rich internal experience in our mental world. However, it has remained poorly understood how imagery experience is generated volitionally as opposed to being cue driven. Here, using electroencephalography and functional magnetic resonance imaging, we systematically investigate the spatiotemporal dynamics of self-generated imagery by having participants volitionally imagine one of the orientations from a learned pool. We contrast self-generated imagery with cue-induced imagery, where participants imagined line orientations based on previously acquired associative cues. Our results reveal overlapping neural signatures of cue-induced and self-generated imagery. Yet these neural signatures display substantially different sensitivities to the two types of imagery: self-generated imagery is supported by an enhanced involvement of the anterior cortex in representing imagery contents, whereas cue-induced imagery is supported by enhanced imagery representations in the posterior visual cortex. These results jointly support a reverse cortical hierarchy for generating and maintaining imagery contents in self-generated versus externally cued imagery.
Affiliation(s)
- Yiheng Hu: Institute of Neuroscience, Center for Excellence in Brain Science and Intelligence Technology, Chinese Academy of Sciences, Shanghai 200031, China; University of Chinese Academy of Sciences, Beijing 100049, China
- Qing Yu: Institute of Neuroscience, Center for Excellence in Brain Science and Intelligence Technology, Chinese Academy of Sciences, Shanghai 200031, China
2.
Gjorgieva E, Morales-Torres R, Cabeza R, Woldorff MG. Neural retrieval processes occur more rapidly for visual mental images that were previously encoded with high-vividness. Cereb Cortex 2023; 33:10234-10244. PMID: 37526263. DOI: 10.1093/cercor/bhad278.
Abstract
Visual mental imagery refers to our ability to experience visual images in the absence of sensory stimulation. Studies have shown that visual mental imagery can improve episodic memory, but the neural mechanisms underlying this improvement remain poorly understood. Using electroencephalography, we examined the neural processes associated with the retrieval of previously generated visual mental images, focusing on how vividness at generation modulates retrieval processes. Participants viewed word stimuli referring to common objects, formed a visual mental image of each word's referent, and rated the vividness of the mental image. This was followed by a surprise old/new recognition task. We compared retrieval performance for items rated as high- versus low-vividness at encoding. High-vividness items were retrieved with faster reaction times and higher confidence ratings in the memory judgment. While controlling for confidence, neural measures indicated that high-vividness items produced an earlier decrease in alpha-band activity at retrieval compared with low-vividness items, suggesting earlier memory reinstatement. Even when low-vividness items were remembered with high confidence, they were not retrieved as quickly as high-vividness items. These results indicate that vividly encoded mental images are subsequently retrieved more rapidly than low-vividness items.
Affiliation(s)
- Eva Gjorgieva: Department of Psychology and Neuroscience, Duke University, Durham, NC 27708, United States; Center for Cognitive Neuroscience, Duke Institute for Brain Sciences, Duke University, Durham, NC 27708, United States
- Ricardo Morales-Torres: Department of Psychology and Neuroscience, Duke University, Durham, NC 27708, United States; Center for Cognitive Neuroscience, Duke Institute for Brain Sciences, Duke University, Durham, NC 27708, United States
- Roberto Cabeza: Department of Psychology and Neuroscience, Duke University, Durham, NC 27708, United States; Center for Cognitive Neuroscience, Duke Institute for Brain Sciences, Duke University, Durham, NC 27708, United States
- Marty G Woldorff: Department of Psychology and Neuroscience, Duke University, Durham, NC 27708, United States; Center for Cognitive Neuroscience, Duke Institute for Brain Sciences, Duke University, Durham, NC 27708, United States; Department of Psychiatry, Duke University, Durham, NC 27708, United States
3.
Li S, Zeng X, Shao Z, Yu Q. Neural Representations in Visual and Parietal Cortex Differentiate between Imagined, Perceived, and Illusory Experiences. J Neurosci 2023; 43:6508-6524. PMID: 37582626. PMCID: PMC10513072. DOI: 10.1523/jneurosci.0592-23.2023.
Abstract
Humans constantly receive massive amounts of information, both perceived from the external environment and imagined from the internal world. To function properly, the brain needs to correctly identify the origin of information being processed. Recent work has suggested common neural substrates for perception and imagery. However, it has remained unclear how the brain differentiates between external and internal experiences with shared neural codes. Here we tested this question in human participants (male and female) by systematically investigating the neural processes underlying the generation and maintenance of visual information from voluntary imagery, veridical perception, and illusion. The inclusion of illusion allowed us to differentiate between objective and subjective internality: while illusion has an objectively internal origin and can be viewed as involuntary imagery, it is also subjectively perceived as having an external origin like perception. Combining fMRI, eye-tracking, multivariate decoding, and encoding approaches, we observed superior orientation representations in parietal cortex during imagery compared with perception, and conversely in early visual cortex. This imagery dominance gradually developed along a posterior-to-anterior cortical hierarchy from early visual to parietal cortex, emerged in the early epoch of imagery and sustained into the delay epoch, and persisted across varied imagined contents. Moreover, representational strength of illusion was more comparable to imagery in early visual cortex, but more comparable to perception in parietal cortex, suggesting that content-specific representations in parietal cortex, as opposed to early visual cortex, differentiate between subjectively internal and external experiences. These findings together support a domain-general engagement of parietal cortex in internally generated experience.

Significance Statement: How does the brain differentiate between imagined and perceived experiences? Combining fMRI, eye-tracking, multivariate decoding, and encoding approaches, the current study revealed enhanced stimulus-specific representations in visual imagery originating from parietal cortex, supporting the subjective experience of imagery. This neural principle was further validated by evidence from visual illusion, wherein illusion resembled perception and imagery at different levels of the cortical hierarchy. Our findings provide direct evidence for the critical role of parietal cortex as a domain-general region for content-specific imagery, and offer new insights into the neural mechanisms underlying the differentiation between subjectively internal and external experiences.
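This entry's conclusions rest on multivariate decoding of fMRI activity patterns. As a rough, self-contained illustration of what such decoding involves, a nearest-centroid correlation classifier on simulated voxel patterns is sketched below; the data, voxel count, and noise level are invented for the example, and this is not the authors' analysis pipeline:

```python
import numpy as np

def correlation_decoder(train_patterns, train_labels, test_patterns):
    """Nearest-centroid correlation decoder: assign each test pattern to the
    class whose mean training pattern it correlates with most strongly."""
    classes = np.unique(train_labels)
    centroids = np.stack([train_patterns[train_labels == c].mean(axis=0)
                          for c in classes])

    def zscore(a):  # z-score each pattern so dot products act like correlations
        return (a - a.mean(-1, keepdims=True)) / a.std(-1, keepdims=True)

    sims = zscore(test_patterns) @ zscore(centroids).T  # trials x classes
    return classes[sims.argmax(axis=1)]

# Simulated example: two stimulus classes, 40 "voxels", noisy trial patterns.
rng = np.random.default_rng(0)
class_means = rng.normal(size=(2, 40))
labels = np.repeat([0, 1], 50)
train = class_means[labels] + 0.5 * rng.normal(size=(100, 40))
test = class_means[labels] + 0.5 * rng.normal(size=(100, 40))
accuracy = (correlation_decoder(train, labels, test) == labels).mean()
```

With this much simulated signal relative to noise, decoding accuracy on the held-out patterns is near ceiling; real fMRI decoding operates the same way but at far lower signal-to-noise.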
Affiliation(s)
- Siyi Li: Institute of Neuroscience, Center for Excellence in Brain Science and Intelligence Technology, Chinese Academy of Sciences, Shanghai 200031, China
- Xuemei Zeng: Institute of Neuroscience, Center for Excellence in Brain Science and Intelligence Technology, Chinese Academy of Sciences, Shanghai 200031, China
- Zhujun Shao: Institute of Neuroscience, Center for Excellence in Brain Science and Intelligence Technology, Chinese Academy of Sciences, Shanghai 200031, China; University of Chinese Academy of Sciences, Beijing 100049, China
- Qing Yu: Institute of Neuroscience, Center for Excellence in Brain Science and Intelligence Technology, Chinese Academy of Sciences, Shanghai 200031, China
4.
Kuroki D, Pronk T. jsQuestPlus: A JavaScript implementation of the QUEST+ method for estimating psychometric function parameters in online experiments. Behav Res Methods 2023; 55:3179-3186. PMID: 36070128. PMCID: PMC9450820. DOI: 10.3758/s13428-022-01948-8.
Abstract
The two Bayesian adaptive psychometric methods QUEST (Watson & Pelli, 1983) and QUEST+ (Watson, 2017) are widely used to estimate psychometric parameters, especially the threshold, in laboratory-based psychophysical experiments. Given the increase in online psychophysical experiments in recent years, there is a growing need to have the QUEST and QUEST+ methods available online as well. We developed JavaScript libraries for both; this article introduces one of them: jsQuestPlus. We offer integrations with online experimental tools such as jsPsych (de Leeuw, 2015), PsychoPy/JS (Peirce et al., 2019), and lab.js (Henninger et al., 2021). We measured the computation time required by jsQuestPlus under four conditions. Our simulations on 37 browser-computer combinations showed that the mean initialization time was 461.08 ms, 95% CI [328.29, 593.87]; the mean time required to determine the stimulus parameters for the next trial was less than 1 ms; and the mean update time was 79.39 ms, 95% CI [46.22, 112.55], even in extremely demanding conditions. Additionally, psychometric parameters were estimated as accurately as with the original QUEST+ method. We conclude that jsQuestPlus is fast and accurate enough for online psychophysical experiments despite the complexity of the matrix calculations. The latest version of jsQuestPlus can be downloaded freely from https://github.com/kurokida/jsQuestPlus under the MIT license.
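jsQuestPlus itself is JavaScript, but the core QUEST+ computation it implements is language-agnostic: maintain a posterior over candidate psychometric parameters and choose each stimulus to minimize the expected entropy of that posterior (Watson, 2017). Below is a minimal one-parameter sketch with an illustrative logistic psychometric function and invented grid values; it shows the algorithm only and is not the jsQuestPlus API:

```python
import math

def p_correct(x, t, slope=10.0, guess=0.5):
    """Logistic psychometric function for a 2AFC task (illustrative choice)."""
    return guess + (1 - guess) / (1 + math.exp(-slope * (x - t)))

def entropy(p):
    return -sum(q * math.log(q) for q in p if q > 0)

def normalize(p):
    s = sum(p)
    return [q / s for q in p]

class QuestPlusSketch:
    """Grid-based QUEST+ over a single threshold parameter."""

    def __init__(self, thresholds, intensities):
        self.thresholds = thresholds    # candidate threshold values
        self.intensities = intensities  # allowed stimulus intensities
        self.posterior = [1.0 / len(thresholds)] * len(thresholds)  # flat prior

    def expected_entropy(self, x):
        """Expected posterior entropy after presenting intensity x."""
        likes = [p_correct(x, t) for t in self.thresholds]
        p_yes = sum(l * q for l, q in zip(likes, self.posterior))
        post_yes = normalize([q * l for q, l in zip(self.posterior, likes)])
        post_no = normalize([q * (1 - l) for q, l in zip(self.posterior, likes)])
        return p_yes * entropy(post_yes) + (1 - p_yes) * entropy(post_no)

    def next_intensity(self):
        """Pick the stimulus expected to be most informative."""
        return min(self.intensities, key=self.expected_entropy)

    def update(self, x, correct):
        """Bayesian update of the threshold posterior after one trial."""
        self.posterior = normalize([
            q * (p_correct(x, t) if correct else 1 - p_correct(x, t))
            for q, t in zip(self.posterior, self.thresholds)])

    def estimate(self):
        """Posterior-mean threshold estimate."""
        return sum(t * q for t, q in zip(self.thresholds, self.posterior))
```

Running this loop against a simulated observer with a known threshold lets one check that the posterior mean converges toward the true value, which is the accuracy property the article reports for jsQuestPlus.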
Affiliation(s)
- Daiichiro Kuroki: Department of Psychology, School of Letters, Kyushu University, 744 Motooka, Nishi-ku, Fukuoka 819-0395, Japan
- Thomas Pronk: Behavioural Science Lab, Faculty of Social and Behavioural Sciences, University of Amsterdam, Amsterdam, Netherlands
5.
Preparatory attention to visual features primarily relies on non-sensory representation. Sci Rep 2022; 12:21726. PMID: 36526653. PMCID: PMC9758135. DOI: 10.1038/s41598-022-26104-2.
Abstract
Prior knowledge of behaviorally relevant information promotes preparatory attention before the appearance of stimuli. A key question is how our brain represents the attended information during preparation. A sensory template hypothesis assumes that preparatory signals evoke neural activity patterns resembling the perception of the attended stimuli, whereas a non-sensory, abstract template hypothesis assumes that preparatory signals reflect an abstraction of the attended stimuli. To test these hypotheses, we used fMRI and multivariate analysis to characterize neural activity patterns while human participants prepared to attend to a feature and then selected it from a compound stimulus. In an fMRI experiment using a basic visual feature (motion direction), we observed reliable decoding of the to-be-attended feature from preparatory activity in both visual and frontoparietal areas. However, while the neural patterns constructed from a single feature in a baseline task generalized to the activity patterns during stimulus selection, they did not generalize to the activity patterns during preparation. Our findings thus suggest that neural signals during attentional preparation are predominantly non-sensory in nature and may reflect an abstraction of the attended feature. Such a representation could provide efficient and stable guidance of attention.
6.
Zhao YJ, Kay KN, Tian Y, Ku Y. Sensory Recruitment Revisited: Ipsilateral V1 Involved in Visual Working Memory. Cereb Cortex 2021; 32:1470-1479. PMID: 34476462. DOI: 10.1093/cercor/bhab300.
Abstract
The "sensory recruitment hypothesis" posits an essential role of sensory cortices in working memory, beyond the well-accepted frontoparietal areas. Yet this hypothesis has recently been challenged. In the present study, participants performed a delayed orientation-recall task while high-spatial-resolution 3 T functional magnetic resonance imaging (fMRI) signals were measured in posterior cortices. A multivariate inverted encoding model approach was used to decode remembered orientations from blood-oxygen-level-dependent fMRI signals in visual cortices during the delay period. We found that not only did activity in the contralateral primary visual cortex (V1) retain high-fidelity representations of the visual stimuli, but activity in the ipsilateral V1 also contained such orientation tuning. Moreover, although the encoded tuning faded in the contralateral V1 during the late delay period, tuning information in the ipsilateral V1 remained sustained. The ipsilateral representation was also present in secondary visual cortex (V2), but not in other higher-level visual areas. These results support the sensory recruitment hypothesis and extend it to ipsilateral sensory areas, indicating a distributed involvement of visual areas in visual working memory.
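The inverted encoding model mentioned here works in two steps: fit a linear mapping from hypothetical orientation-tuned channels to voxel responses on training data, then invert that mapping to recover channel responses from new voxel patterns. The sketch below runs on simulated data; the channel count, basis shape, and all numbers are illustrative assumptions, not this study's exact analysis:

```python
import numpy as np

def make_basis(n_channels=8, n_orientations=180):
    """Orientation-tuned channel basis: half-wave-rectified cosines raised to
    the 7th power, tiling the 0-179 degree orientation space."""
    centers = np.arange(n_channels) * (180 / n_channels)
    theta = np.arange(n_orientations)
    # orientation wraps at 180 deg, so double the angles before taking cos
    d = np.deg2rad(2 * (theta[:, None] - centers[None, :]))
    return np.maximum(np.cos(d), 0) ** 7          # (orientations, channels)

def train_encoding_model(voxels, orientations, basis):
    """Least-squares fit of channel-to-voxel weights W in voxels = C @ W."""
    C = basis[orientations]                        # trials x channels
    W, *_ = np.linalg.lstsq(C, voxels, rcond=None)
    return W                                       # channels x voxels

def invert_encoding_model(voxels, W):
    """Invert the model: recover channel responses for new voxel patterns."""
    C_hat, *_ = np.linalg.lstsq(W.T, voxels.T, rcond=None)
    return C_hat.T                                 # trials x channels

# Simulated check: recover the orientation of a held-out voxel pattern.
rng = np.random.default_rng(0)
W_true = rng.normal(size=(8, 50))                  # 50 simulated "voxels"
train_oris = rng.integers(0, 180, size=200)
basis = make_basis()
train_vox = basis[train_oris] @ W_true + 0.1 * rng.normal(size=(200, 50))
W = train_encoding_model(train_vox, train_oris, basis)
channel_resp = invert_encoding_model(basis[[90]] @ W_true, W)[0]
decoded_center = (np.arange(8) * 22.5)[channel_resp.argmax()]
```

On this simulated data the reconstructed channel profile peaks at the channel tuned nearest the presented 90-degree orientation; the study applies the same logic to delay-period fMRI signals.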
Affiliation(s)
- Yi-Jie Zhao: Institute of Science and Technology for Brain-Inspired Intelligence, Fudan University, Shanghai 200433, China; Center for Brain and Mental Well-being, Department of Psychology, Sun Yat-sen University, Guangzhou 510006, China; Peng Cheng Laboratory, Shenzhen 518055, China; School of Psychology and Cognitive Science, East China Normal University, Shanghai 200062, China
- Kendrick N Kay: Center for Magnetic Resonance Research, Department of Radiology, University of Minnesota, Minneapolis, MN 55455, USA
- Yonghong Tian: Peng Cheng Laboratory, Shenzhen 518055, China; School of Electronic Engineering and Computer Science, Peking University, Beijing 100871, China
- Yixuan Ku: Center for Brain and Mental Well-being, Department of Psychology, Sun Yat-sen University, Guangzhou 510006, China; Peng Cheng Laboratory, Shenzhen 518055, China
7.
Hallenbeck GE, Sprague TC, Rahmati M, Sreenivasan KK, Curtis CE. Working memory representations in visual cortex mediate distraction effects. Nat Commun 2021; 12:4714. PMID: 34354071. PMCID: PMC8342709. DOI: 10.1038/s41467-021-24973-1.
Abstract
Although the contents of working memory can be decoded from visual cortex activity, these representations may play a limited role if they are not robust to distraction. We used model-based fMRI to estimate the impact of distracting visual tasks on working memory representations in several visual field maps in visual and frontoparietal association cortex. Here, we show distraction causes the fidelity of working memory representations to briefly dip when both the memorandum and distractor are jointly encoded by the population activities. Distraction induces small biases in memory errors which can be predicted by biases in neural decoding in early visual cortex, but not other regions. Although distraction briefly disrupts working memory representations, the widespread redundancy with which working memory information is encoded may protect against catastrophic loss. In early visual cortex, the neural representation of information in working memory and behavioral performance are intertwined, solidifying its importance in visual memory.

The relative roles of visual, parietal, and frontal cortex in working memory have been actively debated. Here, the authors show that distraction impacts visual working memory representations in primary visual areas, indicating that these regions play a key role in the maintenance of working memory.
Affiliation(s)
- Thomas C Sprague: Department of Psychology, New York University, New York, NY, USA; Department of Psychological and Brain Sciences, University of California, Santa Barbara, CA, USA
- Masih Rahmati: Department of Psychology, New York University, New York, NY, USA; Center for Neural Science, New York University, New York, NY, USA
- Kartik K Sreenivasan: Division of Science and Mathematics, New York University Abu Dhabi, Abu Dhabi, UAE
- Clayton E Curtis: Department of Psychology, New York University, New York, NY, USA; Center for Neural Science, New York University, New York, NY, USA