1. Valencia GN, Khoo S, Wong T, Ta J, Hou B, Barsalou LW, Hazen K, Lin HH, Wang S, Brefczynski-Lewis JA, Frum CA, Lewis JW. Chinese-English bilinguals show linguistic-perceptual links in the brain associating short spoken phrases with corresponding real-world natural action sounds by semantic category. Lang Cogn Neurosci 2021; 36:773-790. PMID: 34568509; PMCID: PMC8462789; DOI: 10.1080/23273798.2021.1883073.
Abstract
Higher cognitive functions such as linguistic comprehension must ultimately relate to perceptual systems in the brain, though how and why these links form remains unclear. The different brain networks that mediate perception of real-world natural sounds have recently been proposed to respect a taxonomic model of acoustic-semantic categories. Using functional magnetic resonance imaging (fMRI) with Chinese/English bilingual listeners, the present study explored whether reception of short spoken phrases, in both Chinese (Mandarin) and English, describing corresponding sound-producing events would engage overlapping brain regions at a semantic category level. The results revealed a double dissociation of cortical regions that were preferential for representing knowledge of human versus environmental action events, whether conveyed through natural sounds or the corresponding spoken phrases in either language. These findings of cortical hubs exhibiting linguistic-perceptual knowledge links at a semantic category level should help advance neurocomputational models of the neurodevelopment of language systems.
Affiliation(s)
- Gabriela N. Valencia
- Department of Neuroscience, Rockefeller Neuroscience Institute, West Virginia University (WVU), Morgantown, WV 26506, USA
- Stephanie Khoo
- Department of Neuroscience, Rockefeller Neuroscience Institute, West Virginia University (WVU), Morgantown, WV 26506, USA
- Ting Wong
- Department of Neuroscience, Rockefeller Neuroscience Institute, West Virginia University (WVU), Morgantown, WV 26506, USA
- Joseph Ta
- Department of Neuroscience, Rockefeller Neuroscience Institute, West Virginia University (WVU), Morgantown, WV 26506, USA
- Bob Hou
- Department of Radiology, Center for Advanced Imaging
- Kirk Hazen
- Department of English, West Virginia University
- Shuo Wang
- Department of Chemical and Biomedical Engineering
- Julie A. Brefczynski-Lewis
- Department of Neuroscience, Rockefeller Neuroscience Institute, West Virginia University (WVU), Morgantown, WV 26506, USA
- Chris A. Frum
- Department of Neuroscience, Rockefeller Neuroscience Institute, West Virginia University (WVU), Morgantown, WV 26506, USA
- James W. Lewis
- Department of Neuroscience, Rockefeller Neuroscience Institute, West Virginia University (WVU), Morgantown, WV 26506, USA
2. Latini F, Trevisi G, Fahlström M, Jemstedt M, Alberius Munkhammar Å, Zetterling M, Hesselager G, Ryttlefors M. New Insights Into the Anatomy, Connectivity and Clinical Implications of the Middle Longitudinal Fasciculus. Front Neuroanat 2021; 14:610324. PMID: 33584207; PMCID: PMC7878690; DOI: 10.3389/fnana.2020.610324.
Abstract
The middle longitudinal fascicle (MdLF) is a long associative white matter tract connecting the superior temporal gyrus (STG) with the parietal and occipital lobes. Previous studies have reported different cortical terminations and a possible segmentation pattern of the tract. In this study, we performed a post-mortem white matter dissection of 12 human hemispheres and in vivo deterministic fiber tracking in 24 subjects from the Human Connectome Project to establish whether a consistent organization of fibers exists among the MdLF subcomponents and to acquire anatomical information on each subcomponent. Moreover, two clinical cases of brain tumors impinging on MdLF territories are reported to further discuss the anatomical results in light of previously published data on the functional involvement of this bundle. The main finding is that the MdLF is consistently organized into two layers: an antero-ventral segment (aMdLF) connecting the anterior STG (including the temporal pole and planum polare) with the extrastriate lateral occipital cortex, and a posterior-dorsal segment (pMdLF) connecting the posterior STG, anterior transverse temporal gyrus, and planum temporale with the superior parietal lobule and lateral occipital cortex. The anatomical connectivity pattern and quantitative differences between the MdLF subcomponents, along with the clinical cases reported in this paper, support a role for the MdLF in high-order functions related to acoustic information. We suggest that the pMdLF may contribute to the learning process associated with verbal-auditory stimuli, especially in the left hemisphere, while the aMdLF may play a role in processing and retrieving auditory information already consolidated within the temporal lobe.
Affiliation(s)
- Francesco Latini
- Neurosurgical Unit, Department of Surgery, Ospedale Santo Spirito, Pescara, Italy
- Gianluca Trevisi
- Neurosurgical Unit, Department of Surgery, Ospedale Santo Spirito, Pescara, Italy
- Markus Fahlström
- Section of Radiology, Department of Surgical Sciences, Uppsala University, Uppsala, Sweden
- Malin Jemstedt
- Section of Speech-Language Pathology, Department of Neuroscience, Uppsala University, Uppsala, Sweden
- Maria Zetterling
- Section of Neurosurgery, Department of Neuroscience, Uppsala University, Uppsala, Sweden
- Göran Hesselager
- Section of Neurosurgery, Department of Neuroscience, Uppsala University, Uppsala, Sweden
- Mats Ryttlefors
- Section of Neurosurgery, Department of Neuroscience, Uppsala University, Uppsala, Sweden
3. Csonka M, Mardmomen N, Webster PJ, Brefczynski-Lewis JA, Frum C, Lewis JW. Meta-Analyses Support a Taxonomic Model for Representations of Different Categories of Audio-Visual Interaction Events in the Human Brain. Cereb Cortex Commun 2021; 2:tgab002. PMID: 33718874; PMCID: PMC7941256; DOI: 10.1093/texcom/tgab002.
Abstract
Our ability to perceive meaningful action events involving objects, people, and other animate agents is characterized in part by an interplay of visual and auditory sensory processing and their cross-modal interactions. However, this multisensory ability can be altered or dysfunctional in some hearing and sighted individuals, and in some clinical populations. The present meta-analysis sought to test current hypotheses regarding neurobiological architectures that may mediate audio-visual multisensory processing. Reported coordinates from 82 neuroimaging studies (137 experiments) that revealed some form of audio-visual interaction in discrete brain regions were compiled, converted to a common coordinate space, and then organized along specific categorical dimensions to generate activation likelihood estimate (ALE) brain maps and various contrasts of those derived maps. The results revealed brain regions (cortical "hubs") preferentially involved in multisensory processing along different stimulus category dimensions, including 1) living versus nonliving audio-visual events, 2) audio-visual events involving vocalizations versus actions by living sources, 3) emotionally valent events, and 4) dynamic-visual versus static-visual audio-visual stimuli. These meta-analysis results are discussed in the context of neurocomputational theories of semantic knowledge representations and perception, and the brain volumes of interest are available for download to facilitate data interpretation for future neuroimaging studies.
Affiliation(s)
- Matt Csonka
- Department of Neuroscience, Rockefeller Neuroscience Institute, West Virginia University, Morgantown, WV 26506, USA
- Nadia Mardmomen
- Department of Neuroscience, Rockefeller Neuroscience Institute, West Virginia University, Morgantown, WV 26506, USA
- Paula J Webster
- Department of Neuroscience, Rockefeller Neuroscience Institute, West Virginia University, Morgantown, WV 26506, USA
- Julie A Brefczynski-Lewis
- Department of Neuroscience, Rockefeller Neuroscience Institute, West Virginia University, Morgantown, WV 26506, USA
- Chris Frum
- Department of Neuroscience, Rockefeller Neuroscience Institute, West Virginia University, Morgantown, WV 26506, USA
- James W Lewis
- Department of Neuroscience, Rockefeller Neuroscience Institute, West Virginia University, Morgantown, WV 26506, USA
4. Arioli M, Ricciardi E, Cattaneo Z. Social cognition in the blind brain: A coordinate-based meta-analysis. Hum Brain Mapp 2020; 42:1243-1256. PMID: 33320395; PMCID: PMC7927293; DOI: 10.1002/hbm.25289.
Abstract
Social cognition skills are typically acquired on the basis of visual information (e.g., the observation of gaze, facial expressions, and gestures). In light of this, a critical issue is whether and how the lack of visual experience affects the neurocognitive mechanisms underlying social skills. This issue has been largely neglected in the literature on blindness, even though difficulties in social interaction may be particularly salient in the lives of blind individuals (especially children). Here we provide a meta-analysis of neuroimaging studies reporting brain activations associated with the representation of the self and others in early blind individuals and in sighted controls. Our results indicate that early blindness does not critically impact the development of the "social brain": social tasks performed on the basis of auditory or tactile information drove consistent activations in nodes of the action observation network that are typically active during the actual observation of others in sighted individuals. Interestingly, though, activations along this network appeared more left-lateralized in blind than in sighted participants. These results may have important implications for the development of specific training programs to improve social skills in blind children and young adults.
Affiliation(s)
- Maria Arioli
- Department of Psychology, University of Milano-Bicocca, Milan, Italy
- Zaira Cattaneo
- Department of Psychology, University of Milano-Bicocca, Milan, Italy; IRCCS Mondino Foundation, Pavia, Italy
5. What and where in the auditory systems of sighted and early blind individuals: Evidence from representational similarity analysis. J Neurol Sci 2020; 413:116805. PMID: 32259708; DOI: 10.1016/j.jns.2020.116805.
Abstract
Separate ventral and dorsal streams in the auditory system have been proposed to process sound identification and localization, respectively. Despite the popularity of this dual-pathway model, it remains controversial how independent the two neural pathways are and whether visual experience can influence this distinct cortical organizational scheme. In this study, representational similarity analysis (RSA) was used to explore the functional roles of distinct cortical regions lying within either the ventral or dorsal auditory stream of sighted and early blind (EB) participants. We found functionally segregated auditory networks in both the sighted and EB groups, in which the anterior superior temporal gyrus (aSTG) and inferior frontal junction (IFJ) were more related to sound identification, while the posterior superior temporal gyrus (pSTG) and inferior parietal lobe (IPL) preferred sound localization. These findings indicate that visual experience may not influence this functional dissociation and that the cortex of the human brain may be organized according to task-specific, modality-independent principles. Meanwhile, partial overlap of spatial and non-spatial auditory information processing was observed, illustrating the existence of interaction between the two auditory streams. Furthermore, we investigated the effect of visual experience on the neural bases of auditory perception and observed cortical reorganization in EB participants, in whom the middle occipital gyrus was recruited to process auditory information. Our findings delineate the distinct cortical networks that abstractly encode sound identification and localization, confirm the existence of interaction between the two streams from a multivariate perspective, and further suggest that visual experience might not impact the functional specialization of auditory regions.
6. Zhang C, Lee TMC, Fu Y, Ren C, Chan CCH, Tao Q. Properties of cross-modal occipital responses in early blindness: An ALE meta-analysis. NeuroImage Clin 2019; 24:102041. PMID: 31677587; PMCID: PMC6838549; DOI: 10.1016/j.nicl.2019.102041.
Abstract
Highlights:
- ALE meta-analysis reveals distributed brain networks for object and spatial functions in individuals with early blindness.
- ALE contrast analysis reveals specific activations in the left cuneus and lingual gyrus for language function, suggesting a reverse hierarchical organization of the visual cortex in early blind individuals.
- The findings contribute to visual rehabilitation in blind individuals by revealing function-dependent and sensory-independent networks during nonvisual processing.
Cross-modal occipital responses appear to be essential for nonvisual processing in individuals with early blindness. However, it is not clear whether the recruitment of occipital regions depends on functional domain or sensory modality. The current study utilized a coordinate-based meta-analysis to identify the distinct brain regions involved in the functional domains of object, spatial/motion, and language processing and the common brain regions involved in both auditory and tactile modalities in individuals with early blindness. Following the PRISMA guidelines, a total of 55 studies were included in the meta-analysis. The specific analyses revealed the brain regions that are consistently recruited for each function, such as the dorsal fronto-parietal network for spatial function and ventral occipito-temporal network for object function. This is consistent with the literature, suggesting that the two visual streams are preserved in early blind individuals. The contrast analyses found specific activations in the left cuneus and lingual gyrus for language function. This finding is novel and suggests a reverse hierarchical organization of the visual cortex for early blind individuals. The conjunction analyses found common activations in the right middle temporal gyrus, right precuneus and a left parieto-occipital region. Clinically, this work contributes to visual rehabilitation in early blind individuals by revealing the function-dependent and sensory-independent networks during nonvisual processing.
Affiliation(s)
- Caiyun Zhang
- Psychology Department, School of Medicine, Jinan University, Guangzhou 510632, China
- Tatia M C Lee
- Laboratory of Neuropsychology, The University of Hong Kong, Hong Kong, China; Laboratory of Cognitive Affective Neuroscience, The University of Hong Kong, Hong Kong, China; The Affiliated Brain Hospital of Guangzhou Medical University, Guangzhou, China
- Yunwei Fu
- Guangdong-Hongkong-Macau Institute of CNS Regeneration, Ministry of Education CNS Regeneration Collaborative Joint Laboratory, Jinan University, Guangzhou 510632, China
- Chaoran Ren
- Guangdong-Hongkong-Macau Institute of CNS Regeneration, Ministry of Education CNS Regeneration Collaborative Joint Laboratory, Jinan University, Guangzhou 510632, China; Guangdong Key Laboratory of Brain Function and Diseases, Jinan University, Guangzhou 510632, China; Co-innovation Center of Neuroregeneration, Nantong University, Nantong 226001, China; Center for Brain Science and Brain-Inspired Intelligence, Guangdong-Hong Kong-Macao Greater Bay Area, Guangzhou, China
- Chetwyn C H Chan
- Applied Cognitive Neuroscience Laboratory, Department of Rehabilitation Sciences, The Hong Kong Polytechnic University, Hong Kong, China
- Qian Tao
- Psychology Department, School of Medicine, Jinan University, Guangzhou 510632, China; Center for Brain Science and Brain-Inspired Intelligence, Guangdong-Hong Kong-Macao Greater Bay Area, Guangzhou, China
7. Isayama R, Vesia M, Jegatheeswaran G, Elahi B, Gunraj CA, Cardinali L, Farnè A, Chen R. Rubber hand illusion modulates the influences of somatosensory and parietal inputs to the motor cortex. J Neurophysiol 2019; 121:563-573. DOI: 10.1152/jn.00345.2018.
Abstract
The rubber hand illusion (RHI) paradigm experimentally produces an illusion of rubber hand ownership and arm shift by simultaneously stroking a rubber hand in view and a participant's visually occluded hand. It involves visual, tactile, and proprioceptive multisensory integration and activates multisensory areas in the brain, including the posterior parietal cortex (PPC). Multisensory inputs are transformed into outputs for motor control in association areas such as the PPC. A behavioral study reported decreased motor performance after RHI. However, it remains unclear whether RHI modifies the interactions between sensory and motor systems and between the PPC and the primary motor cortex (M1). We used transcranial magnetic stimulation (TMS) to examine the functional connections from the primary somatosensory and association cortices to M1 and from the PPC to M1 during RHI. In experiment 1, short-latency afferent inhibition (SAI) and long-latency afferent inhibition (LAI) were measured before and immediately after a synchronous (RHI) or an asynchronous (control) condition. In experiment 2, the PPC-M1 interaction was measured using two coils. We found that SAI and LAI were reduced in the synchronous condition compared with baseline, suggesting that RHI decreased somatosensory processing in the primary sensory and association cortices projecting to M1. We also found that greater inhibitory PPC-M1 interaction was associated with stronger RHI as assessed by questionnaire. Our findings suggest that RHI modulates both early and late stages of tactile afferent processing, altering M1 excitability by reducing the gain of somatosensory afferents to resolve conflicts among multisensory inputs.
NEW & NOTEWORTHY Perception of one's own body parts involves integrating different sensory information and is important for motor control. We found decreased effects of cutaneous stimulation on motor cortical excitability during the rubber hand illusion (RHI), which may reflect a decreased gain of tactile input to resolve multisensory conflicts. RHI strength correlated with the degree of inhibitory posterior parietal cortex-motor cortex interaction, indicating that the parietal-motor connection is involved in resolving sensory conflicts and body ownership during RHI.
Affiliation(s)
- Reina Isayama
- Division of Neurology, Department of Medicine, University of Toronto, Toronto, Ontario, Canada
- Division of Brain, Imaging and Behaviour – Systems Neuroscience, Krembil Research Institute, Toronto, Ontario, Canada
- Michael Vesia
- Division of Brain, Imaging and Behaviour – Systems Neuroscience, Krembil Research Institute, Toronto, Ontario, Canada
- Gaayathiri Jegatheeswaran
- Division of Neurology, Department of Medicine, University of Toronto, Toronto, Ontario, Canada
- Division of Brain, Imaging and Behaviour – Systems Neuroscience, Krembil Research Institute, Toronto, Ontario, Canada
- Behzad Elahi
- Division of Neurology, Department of Medicine, University of Toronto, Toronto, Ontario, Canada
- Department of Neurology, Tufts Medical Center, Tufts School of Medicine, Boston, Massachusetts
- Carolyn A. Gunraj
- Division of Brain, Imaging and Behaviour – Systems Neuroscience, Krembil Research Institute, Toronto, Ontario, Canada
- Lucilla Cardinali
- Integrative Multisensory Perception Action & Cognition team (ImpAct), Lyon Neuroscience Research Center, Lyon, France
- The Brain and Mind Institute, University of Western Ontario, London, Ontario, Canada
- Alessandro Farnè
- Integrative Multisensory Perception Action & Cognition team (ImpAct), Lyon Neuroscience Research Center, Lyon, France
- Robert Chen
- Division of Neurology, Department of Medicine, University of Toronto, Toronto, Ontario, Canada
- Division of Brain, Imaging and Behaviour – Systems Neuroscience, Krembil Research Institute, Toronto, Ontario, Canada
8. Embodying functionally relevant action sounds in patients with spinal cord injury. Sci Rep 2018; 8:15641. PMID: 30353071; PMCID: PMC6199269; DOI: 10.1038/s41598-018-34133-z.
Abstract
Growing evidence indicates that perceptual-motor codes may be associated with and influenced by actual bodily states. Following a spinal cord injury (SCI), for example, individuals exhibit reduced visual sensitivity to biological motion. However, a dearth of direct evidence exists about whether profound alterations in sensorimotor traffic between the body and brain influence audio-motor representations. We tested 20 wheelchair-bound individuals with lower skeletal-level SCI who were unable to feel and move their lower limbs, but have retained upper limb function. In a two-choice, matching-to-sample auditory discrimination task, the participants were asked to determine which of two action sounds matched a sample action sound presented previously. We tested aural discrimination ability using sounds that arose from wheelchair, upper limb, lower limb, and animal actions. Our results indicate that an inability to move the lower limbs did not lead to impairment in the discrimination of lower limb-related action sounds in SCI patients. Importantly, patients with SCI discriminated wheelchair sounds more quickly than individuals with comparable auditory experience (i.e. physical therapists) and inexperienced, able-bodied subjects. Audio-motor associations appear to be modified and enhanced to incorporate external salient tools that now represent extensions of their body schemas.
9. de Borst AW, de Gelder B. Mental Imagery Follows Similar Cortical Reorganization as Perception: Intra-Modal and Cross-Modal Plasticity in Congenitally Blind. Cereb Cortex 2018; 29:2859-2875. DOI: 10.1093/cercor/bhy151.
Abstract
Cortical plasticity in congenitally blind individuals leads to cross-modal activation of the visual cortex and may lead to superior perceptual processing in the intact sensory domains. Although mental imagery is often defined as a quasi-perceptual experience, it is unknown whether it follows similar cortical reorganization as perception in blind individuals. In this study, we show that auditory versus tactile perception evokes similar intra-modal discriminative patterns in congenitally blind compared with sighted participants. These results indicate that cortical plasticity following visual deprivation does not influence the broad intra-modal organization of auditory and tactile perception as measured by our task. Furthermore, not only the blind but also the sighted participants showed cross-modal discriminative patterns for perception modality in the visual cortex. During mental imagery, both groups showed similar decoding accuracies for imagery modality in the intra-modal primary sensory cortices. However, no cross-modal discriminative information for imagery modality was found in the early visual cortex of blind participants, in contrast to the sighted participants. We did find evidence of cross-modal activation of higher visual areas in blind participants, including the representation of specific imagined auditory features in visual area V4.
Affiliation(s)
- A W de Borst
- Department of Computer Science, University College London, London, UK
- Brain and Emotion Lab, Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, the Netherlands
- B de Gelder
- Department of Computer Science, University College London, London, UK
- Brain and Emotion Lab, Department of Cognitive Neuroscience, Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, the Netherlands
10. Dormal G, Pelland M, Rezk M, Yakobov E, Lepore F, Collignon O. Functional Preference for Object Sounds and Voices in the Brain of Early Blind and Sighted Individuals. J Cogn Neurosci 2018; 30:86-106. DOI: 10.1162/jocn_a_01186.
Abstract
Sounds activate occipital regions in early blind individuals. However, how different sound categories map onto specific regions of the occipital cortex remains a matter of debate. We used fMRI to characterize brain responses of early blind and sighted individuals to familiar object sounds, human voices, and their respective low-level control sounds. In addition, sighted participants were tested while viewing pictures of faces, objects, and phase-scrambled control pictures. In both early blind and sighted groups, a double dissociation was evidenced in bilateral auditory cortices between responses to voices and object sounds: Voices elicited categorical responses in bilateral superior temporal sulci, whereas object sounds elicited categorical responses along the lateral fissure bilaterally, including the primary auditory cortex and planum temporale. Outside the auditory regions, object sounds also elicited categorical responses in the left lateral and ventral occipitotemporal regions in both groups. These regions also showed a response preference for images of objects in the sighted group, suggesting a functional specialization that is independent of sensory input and visual experience. Between-group comparisons revealed that, only in the blind group, categorical responses to object sounds extended more posteriorly into the occipital cortex. Functional connectivity analyses evidenced a selective increase in the functional coupling between these reorganized regions and regions of the ventral occipitotemporal cortex in the blind group. In contrast, vocal sounds did not elicit preferential responses in the occipital cortex in either group. Nevertheless, enhanced voice-selective connectivity between the left temporal voice area and the right fusiform gyrus was found in the blind group. Altogether, these findings suggest that, in the absence of developmental vision, separate auditory categories are not equipotent in driving selective auditory recruitment of occipitotemporal regions, and they highlight the presence of domain-selective constraints on the expression of cross-modal plasticity.
Affiliation(s)
- Olivier Collignon
- University of Montreal
- University of Louvain
- McGill University, Montreal, Canada
11. Brefczynski-Lewis JA, Lewis JW. Auditory object perception: A neurobiological model and prospective review. Neuropsychologia 2017; 105:223-242. PMID: 28467888; PMCID: PMC5662485; DOI: 10.1016/j.neuropsychologia.2017.04.034.
Abstract
Interaction with the world is a multisensory experience, but most of what is known about the neural correlates of perception comes from studying vision. Auditory inputs enter the cortex with their own set of unique qualities and support oral communication, speech, music, and the understanding of the emotional and intentional states of others, all of which are central to the human experience. To better understand how the auditory system develops, recovers after injury, and how it may have transitioned in its functions over the course of hominin evolution, advances are needed in models of how the human brain is organized to process real-world natural sounds and "auditory objects". This review presents a simple, fundamental neurobiological model of hearing perception at a category level that incorporates principles of bottom-up signal processing together with top-down constraints from grounded cognition theories of knowledge representation. Though mostly derived from the human neuroimaging literature, this theoretical framework highlights rudimentary principles of real-world sound processing that may apply to most if not all mammalian species with hearing and acoustic communication abilities. The model encompasses three basic categories of sound-source: (1) action sounds (non-vocalizations) produced by 'living things', with human (conspecific) and non-human animal sources representing two subcategories; (2) action sounds produced by 'non-living things', including environmental sources and human-made machinery; and (3) vocalizations ('living things'), with human versus non-human animals as two subcategories therein. The model is presented in the context of cognitive architectures relating to multisensory, sensory-motor, and spoken language organizations. The model's predictive value is further discussed in the context of anthropological theories of the evolution of oral communication and the neurodevelopment of spoken language proto-networks in infants and toddlers. These phylogenetic and ontogenetic frameworks both entail cortical network maturations that are proposed to be organized, at least in part, around a number of universal acoustic-semantic signal attributes of natural sounds, which are addressed herein.
Affiliation(s)
- Julie A Brefczynski-Lewis
- Blanchette Rockefeller Neuroscience Institute, West Virginia University, Morgantown, WV 26506, USA; Department of Physiology, Pharmacology, & Neuroscience, West Virginia University, PO Box 9229, Morgantown, WV 26506, USA
- James W Lewis
- Blanchette Rockefeller Neuroscience Institute, West Virginia University, Morgantown, WV 26506, USA; Department of Physiology, Pharmacology, & Neuroscience, West Virginia University, PO Box 9229, Morgantown, WV 26506, USA
12
Fang Y, Chen Q, Lingnau A, Han Z, Bi Y. Areas Recruited during Action Understanding Are Not Modulated by Auditory or Sign Language Experience. Front Hum Neurosci 2016; 10:94. [PMID: 27014025 PMCID: PMC4781852 DOI: 10.3389/fnhum.2016.00094] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/14/2015] [Accepted: 02/22/2016] [Indexed: 11/26/2022] Open
Abstract
The observation of other people’s actions recruits a network of areas including the inferior frontal gyrus (IFG), the inferior parietal lobule (IPL), and posterior middle temporal gyrus (pMTG). These regions have been shown to be activated through both visual and auditory inputs. Intriguingly, previous studies found no engagement of IFG and IPL for deaf participants during non-linguistic action observation, leading to the proposal that auditory experience or sign language usage might shape the functionality of these areas. To understand which variables induce plastic changes in areas recruited during the processing of other people’s actions, we examined the effects of tasks (action understanding and passive viewing) and effectors (arm actions vs. leg actions), as well as sign language experience in a group of 12 congenitally deaf signers and 13 hearing participants. In Experiment 1, we found a stronger activation during an action recognition task in comparison to a low-level visual control task in IFG, IPL and pMTG in both deaf signers and hearing individuals, but no effect of auditory or sign language experience. In Experiment 2, we replicated the results of the first experiment using a passive viewing task. Together, our results provide robust evidence demonstrating that the response obtained in IFG, IPL, and pMTG during action recognition and passive viewing is not affected by auditory or sign language experience, adding further support for the supra-modal nature of these regions.
Affiliation(s)
- Yuxing Fang
- State Key Laboratory of Cognitive Neuroscience and Learning and IDG/McGovern Institute for Brain Research, Beijing Normal University, Beijing, China
- Quanjing Chen
- State Key Laboratory of Cognitive Neuroscience and Learning and IDG/McGovern Institute for Brain Research, Beijing Normal University, Beijing, China
- Angelika Lingnau
- Center for Mind/Brain Sciences, University of Trento, Rovereto, Italy; Department of Psychology and Cognitive Science, University of Trento, Rovereto, Italy; Department of Psychology, Royal Holloway University of London, Egham, UK
- Zaizhu Han
- State Key Laboratory of Cognitive Neuroscience and Learning and IDG/McGovern Institute for Brain Research, Beijing Normal University, Beijing, China
- Yanchao Bi
- State Key Laboratory of Cognitive Neuroscience and Learning and IDG/McGovern Institute for Brain Research, Beijing Normal University, Beijing, China
13
Abstract
We review our recent behavioural and imaging studies testing the consequences of congenital blindness on the chemical senses in comparison with the condition of anosmia. We found that congenitally blind (CB) subjects have increased sensitivity for orthonasal odorants and recruit their visually deprived occipital cortex to process orthonasal olfactory stimuli. In sharp contrast, CB perform less well than sighted controls in taste and retronasal olfaction, i.e. when processing chemicals inside the mouth. Interestingly, CB do not recruit their occipital cortex to process taste stimuli. In contrast to these findings in blindness, congenital anosmia is associated with lower taste and trigeminal sensitivity, accompanied by weaker activations within the 'flavour network' upon exposure to such stimuli. We conclude that functional adaptations to congenital anosmia or blindness are quite distinct, such that CB can train their exteroceptive chemical senses and recruit normally visual cortical areas to process chemical information from the surrounding environment.
14
Neural correlates of taste perception in congenital blindness. Neuropsychologia 2015; 70:227-34. [PMID: 25708174 DOI: 10.1016/j.neuropsychologia.2015.02.027] [Citation(s) in RCA: 14] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/07/2014] [Revised: 02/13/2015] [Accepted: 02/19/2015] [Indexed: 11/22/2022]
Abstract
Sight is undoubtedly important for the perception and assessment of the palatability of tastants. Although many studies have addressed the consequences of visual impairment on food selection, feeding behavior, eating habits and taste perception, nothing is known about the neural correlates of gustation in blindness. In the current study we examined brain responses during gustation using functional magnetic resonance imaging (fMRI). We scanned nine congenitally blind subjects and 14 blindfolded sighted controls, matched in age, gender and body mass index (BMI), while they judged either the intensity or the (un)pleasantness of different tastes (sweet, bitter) or of artificial saliva delivered intra-orally. The fMRI data indicated that during gustation, congenitally blind individuals show weaker activation of the primary taste cortex (right posterior insula and overlying Rolandic operculum) and the hypothalamus. In sharp contrast with the results of many other sensory processing studies in congenitally blind subjects, spanning touch, audition and smell, the occipital cortex was not recruited during taste processing, suggesting the absence of taste-related compensatory crossmodal responses there. These results underscore our earlier behavioral demonstration that congenitally blind subjects have lower gustatory sensitivity than normally sighted individuals. We hypothesize that, owing to underexposure to a variety of tastants, training-induced crossmodal sensory plasticity to gustatory stimulation does not occur in blind subjects.
15
Burton H, Snyder AZ, Raichle ME. Resting state functional connectivity in early blind humans. Front Syst Neurosci 2014; 8:51. [PMID: 24778608 PMCID: PMC3985019 DOI: 10.3389/fnsys.2014.00051] [Citation(s) in RCA: 66] [Impact Index Per Article: 6.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/17/2013] [Accepted: 03/19/2014] [Indexed: 12/21/2022] Open
Abstract
Task-based neuroimaging studies in early blind humans (EB) have demonstrated heightened visual cortex responses to non-visual paradigms. Several prior functional connectivity studies in EB have shown altered connections consistent with these task-based results. But these studies generally did not consider behavioral adaptations to lifelong blindness typically observed in EB. Enhanced cognitive abilities shown in EB include greater serial recall and attention to memory. Here, we address the question of the extent to which brain intrinsic activity in EB reflects such adaptations. We performed a resting-state functional magnetic resonance imaging study contrasting 14 EB with 14 age- and gender-matched normally sighted controls (NS). A principal finding was markedly greater functional connectivity in EB between visual cortex and regions typically associated with memory and cognitive control of attention. In contrast, correlations between visual cortex and non-deprived sensory cortices were significantly lower in EB. Thus, the available data, including that obtained in prior task-based and resting-state fMRI studies, as well as the present results, indicate that visual cortex in EB becomes more heavily incorporated into functional systems instantiating episodic recall and attention to non-visual events. Moreover, EB appear to show a reduction in interactions between visual and non-deprived sensory cortices, possibly reflecting suppression of inter-sensory distracting activity.
Affiliation(s)
- Harold Burton
- Department of Anatomy and Neurobiology, Washington University School of Medicine, St. Louis, MO, USA; Department of Radiology, Washington University School of Medicine, St. Louis, MO, USA
- Abraham Z Snyder
- Department of Radiology, Washington University School of Medicine, St. Louis, MO, USA
- Marcus E Raichle
- Department of Radiology, Washington University School of Medicine, St. Louis, MO, USA
16
Spatially distributed effects of mental exhaustion on resting-state FMRI networks. PLoS One 2014; 9:e94222. [PMID: 24705397 PMCID: PMC3976406 DOI: 10.1371/journal.pone.0094222] [Citation(s) in RCA: 33] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/30/2013] [Accepted: 03/13/2014] [Indexed: 12/29/2022] Open
Abstract
Brain activity during rest is spatially coherent over functional connectivity networks called resting-state networks. In resting-state functional magnetic resonance imaging, independent component analysis yields spatially distributed network representations reflecting distinct mental processes, such as intrinsic (default) or extrinsic (executive) attention, and sensory inhibition or excitation. These aspects can be related to different treatments or subjective experiences. Among these, exhaustion is a common psychological state induced by prolonged mental performance. Using repeated functional magnetic resonance imaging sessions and spatial independent component analysis, we explored the effect of several hours of sustained cognitive performance on the resting human brain. Resting-state functional magnetic resonance imaging was performed on the same healthy volunteers on two days, with and without an intensive psychological treatment (skill training and sustained practice with a flight simulator), and before, during and after that treatment. After each scan, subjects rated their level of exhaustion and performed an N-back task to evaluate any decrease in cognitive performance. Spatial maps of selected resting-state network components were statistically evaluated across time points to detect possible changes induced by the sustained mental performance. The intensive treatment had a significant effect on exhaustion and effort ratings, but no effect on N-back performance. Significant changes in the most exhausted state were observed in the early visual processing and anterior default mode networks (enhancement) and in the fronto-parietal executive networks (suppression), suggesting that mental exhaustion is associated with a more idling brain state in which internal attention processes are facilitated to the detriment of more extrinsic processes. The described application may inspire future indicators of the level of fatigue in the neural attention system.
17
Mind the blind brain to understand the sighted one! Is there a supramodal cortical functional architecture? Neurosci Biobehav Rev 2014; 41:64-77. [DOI: 10.1016/j.neubiorev.2013.10.006] [Citation(s) in RCA: 103] [Impact Index Per Article: 10.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/05/2013] [Revised: 08/13/2013] [Accepted: 10/03/2013] [Indexed: 11/20/2022]
18
Bedny M, Saxe R. Insights into the origins of knowledge from the cognitive neuroscience of blindness. Cogn Neuropsychol 2013; 29:56-84. [PMID: 23017086 DOI: 10.1080/02643294.2012.713342] [Citation(s) in RCA: 26] [Impact Index Per Article: 2.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/27/2022]
Abstract
Children learn about the world through senses such as touch, smell, vision, and audition, but they conceive of the world in terms of objects, events, agents, and their mental states. A fundamental question in cognitive science is how nature and nurture contribute to the development of such conceptual categories. What innate mechanisms do children bring to the learning problem? How does experience contribute to development? In this article we discuss insights into these longstanding questions from cognitive neuroscience studies of blindness. Despite drastically different sensory experiences, behavioural and neuroscientific work suggests that blind children acquire typical concepts of objects, actions, and mental states. Blind people think and talk about these categories in ways that are similar to sighted people. Neuroimaging reveals that blind people make such judgements relying on the same neural mechanisms as sighted people. One way to interpret these findings is that neurocognitive development is largely hardwired, and so differences in experience have little consequence. Contrary to this interpretation, neuroimaging studies also show that blindness profoundly reorganizes the visual system. Most strikingly, developmental blindness enables "visual" circuits to participate in high-level cognitive functions, including language processing. Thus, blindness qualitatively changes sensory representations, but leaves conceptual representations largely unchanged. The effect of sensory experience on concepts is modest, despite the brain's potential for neuroplasticity.
Affiliation(s)
- Marina Bedny
- Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, MA 02139, USA.
19
Beyond motor scheme: a supramodal distributed representation in the action-observation network. PLoS One 2013; 8:e58632. [PMID: 23472216 PMCID: PMC3589380 DOI: 10.1371/journal.pone.0058632] [Citation(s) in RCA: 20] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/03/2012] [Accepted: 02/05/2013] [Indexed: 11/25/2022] Open
Abstract
The representation of actions within the action-observation network is thought to rely on a distributed functional organization. Furthermore, recent findings indicate that the action-observation network encodes not merely the observed motor act, but rather a representation that is independent of a specific sensory modality or sensory experience. In the present study, we wished to determine to what extent this distributed and 'more abstract' representation of action is truly supramodal, i.e. shares a common coding across sensory modalities. To this aim, a pattern recognition approach was employed to analyze neural responses in sighted and congenitally blind subjects during visual and/or auditory presentation of hand-made actions. Multivoxel pattern analysis (MVPA)-based classifiers discriminated action from non-action stimuli across sensory conditions (visual and auditory) and experimental groups (blind and sighted). Moreover, these classifiers labeled as 'action' the pattern of neural responses evoked during actual motor execution. Interestingly, discriminative information for the action/non-action classification was located in a bilateral, but left-prevalent, network that strongly overlaps with brain regions known to form the action-observation network and the human mirror system. The ability to identify action features with an MVPA-based classifier in both sighted and blind individuals, independently of the sensory modality conveying the stimuli, clearly supports the hypothesis of a supramodal, distributed functional representation of actions, mainly within the action-observation network.
20
Zilbovicius M, Saitovitch A, Popa T, Rechtman E, Diamandis L, Chabane N, Brunelle F, Samson Y, Boddaert N. Autism, social cognition and superior temporal sulcus. Open J Psychiatry 2013. [DOI: 10.4236/ojpsych.2013.32a008] [Citation(s) in RCA: 12] [Impact Index Per Article: 1.1] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/20/2022]
21
Saitovitch A, Bargiacchi A, Chabane N, Brunelle F, Samson Y, Boddaert N, Zilbovicius M. Social cognition and the superior temporal sulcus: implications in autism. Rev Neurol (Paris) 2012; 168:762-70. [PMID: 22981269 DOI: 10.1016/j.neurol.2012.07.017] [Citation(s) in RCA: 29] [Impact Index Per Article: 2.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/28/2012] [Accepted: 07/23/2012] [Indexed: 10/27/2022]
Abstract
The most common clinical sign of autism spectrum disorders (ASD) is social interaction impairment, which is associated with communication deficits and stereotyped behaviors. Based on brain-imaging results, our hypothesis is that abnormalities in the superior temporal sulcus (STS) are highly implicated in ASD. These abnormalities are characterized by decreased grey matter concentration, hypoperfusion at rest and abnormal activation during social tasks. STS anatomofunctional anomalies occurring early in brain development could constitute the first step in the cascade of neural dysfunctions underlying autism. The STS is known to be highly implicated in social perception processing, from the perception of biological movements, such as body movements or eye gaze, to more complex social cognition processes. Among the impairments that can be described in social perception processing, eye gaze perception is particularly relevant to autism. Gaze abnormalities can now be objectively measured using eye-tracking methodology. In the present work, we review recent data on STS contributions to normal social cognition and its implication in autism, with a particular focus on eye gaze perception.
Affiliation(s)
- A Saitovitch
- Unité Inserm 1000, service de radiologie pédiatrique, hôpital Necker-Enfants-Malades, AP-HP, université Paris V René-Descartes, 149, rue de Sèvres, 75015 Paris cedex 15, France.