1
How speech is produced and perceived in the human cortex. Nature 2024; 626:485-486. [PMID: 38297041] [DOI: 10.1038/d41586-024-00078-9]
2
Dynamics and maintenance of categorical responses in primary auditory cortex during task engagement. eLife 2023; 12:e85706. [PMID: 37970945] [DOI: 10.7554/elife.85706]
Abstract
Grouping sets of sounds into relevant categories is an important cognitive ability that enables the association of stimuli with appropriate goal-directed behavioral responses. In perceptual tasks, the primary auditory cortex (A1) assumes a prominent role by concurrently encoding both sound sensory features and task-related variables. Here, we sought to explore the role of A1 in the initiation of sound categorization, shedding light on its involvement in this cognitive process. We trained ferrets to discriminate click trains of different rates in a Go/No-Go delayed categorization task and recorded neural activity during both active behavior and passive exposure to the same sounds. Purely categorical response components were extracted and analyzed separately from sensory responses to reveal their contributions to the overall population response throughout the trials. We found that categorical activity emerged during sound presentation in the population average and was present in both the active and passive states. However, upon task engagement, categorical responses to the No-Go category became suppressed in the population code, leading to an asymmetrical representation of the Go stimuli relative to the No-Go sounds and the pre-stimulus baseline. The population code underwent an abrupt change at stimulus offset, with sustained responses after the Go sounds during the delay period. Notably, the categorical responses observed during the stimulus period exhibited a significant correlation with those extracted from the delay epoch, suggesting an early involvement of A1 in stimulus categorization.
3
Early selection of task-relevant features through population gating. Nat Commun 2023; 14:6837. [PMID: 37884507] [PMCID: PMC10603060] [DOI: 10.1038/s41467-023-42519-5]
Abstract
Brains can gracefully weed out irrelevant stimuli to guide behavior. This feat is believed to rely on a progressive selection of task-relevant stimuli across the cortical hierarchy, but the specific across-area interactions enabling stimulus selection are still unclear. Here, we propose that population gating, occurring within primary auditory cortex (A1) but controlled by top-down inputs from the prelimbic region of the medial prefrontal cortex (mPFC), can support across-area stimulus selection. Examining single-unit activity recorded while rats performed an auditory context-dependent task, we found that A1 encoded relevant and irrelevant stimuli along a common dimension of its neural space, yet the encoding of the relevant stimulus was enhanced along an extra dimension. In turn, mPFC encoded only the stimulus relevant to the ongoing context. To identify candidate mechanisms for stimulus selection within A1, we reverse-engineered low-rank recurrent neural networks (RNNs) trained on a similar task. Our analyses predicted that two context-modulated neural populations gated their preferred stimulus in opposite contexts, which we confirmed in further analyses of A1. Finally, we show in a two-region RNN how population gating within A1 could be controlled by top-down inputs from mPFC, enabling flexible across-area communication despite fixed inter-areal connectivity.
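The population-gating mechanism described in this abstract can be caricatured in a few lines. This is a hypothetical toy model for illustration only, not the paper's trained low-rank RNN; the function name, gains, and stimulus values are all assumptions. Context sets the gain of two stimulus-preferring populations, so only the contextually relevant stimulus reaches the readout:

```python
def gated_readout(stim_a, stim_b, context):
    """Minimal sketch of context-dependent population gating.

    Two populations each prefer one stimulus; the context signal sets
    each population's gain, so only the relevant stimulus drives the
    readout. (Hypothetical toy model, not the paper's low-rank RNN.)
    """
    gain_a = 1.0 if context == "A" else 0.0  # population A gated on in context A
    gain_b = 1.0 if context == "B" else 0.0  # population B gated on in context B
    pop_a = gain_a * stim_a
    pop_b = gain_b * stim_b
    return pop_a + pop_b

# In context A, only stimulus A reaches the readout:
print(gated_readout(stim_a=2.0, stim_b=5.0, context="A"))  # 2.0
```

The point of the sketch is that the readout weights never change; only the context-driven gains do, which is what allows flexible selection despite fixed connectivity.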
4
Pupillary dynamics reflect the impact of temporal expectation on detection strategy. iScience 2023; 26:106000. [PMID: 36798438] [PMCID: PMC9926307] [DOI: 10.1016/j.isci.2023.106000]
Abstract
Perceptual decision-making in everyday life is informed by experience. In particular, temporal expectation can ease the detection of relevant events in noisy sensory streams. Here, we investigated whether humans can extract hidden temporal cues from the occurrences of probabilistic targets and use them to inform target detection in a complex acoustic stream. To understand which neural mechanisms implement the influence of temporal expectation on decision-making, we used pupillometry as a proxy for underlying neuromodulatory activity. We found that participants' detection strategy was influenced by the hidden temporal context and correlated with sound-evoked pupil dilation. A model of urgency fitted on false alarms predicted detection reaction times. Altogether, these findings suggest that temporal expectation informs decision-making and could be implemented through neuromodulator-mediated urgency signals.
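The urgency account in this abstract can be illustrated with a minimal deterministic model (an illustrative sketch with assumed parameter names and values, not the fitted model from the study): an urgency signal that grows with temporal expectation multiplies the sensory evidence, so higher expectation yields earlier detection for the same evidence.

```python
def detection_time(evidence, hazard_rate, threshold=1.0, dt=0.01, t_max=5.0):
    """Toy urgency-gated detection: a ramping urgency signal multiplies
    constant sensory evidence, so the same evidence crosses the decision
    bound earlier when temporal expectation (hazard rate) is higher.
    Illustrative only; all parameters are assumptions."""
    t = dt
    while t < t_max:
        urgency = 1.0 + hazard_rate * t      # urgency ramps with expectation
        if evidence * urgency >= threshold:  # commit when gated evidence crosses bound
            return t
        t += dt
    return t_max

# Higher hazard rate -> earlier detection for identical evidence:
print(detection_time(0.5, hazard_rate=2.0) < detection_time(0.5, hazard_rate=0.5))  # True
```

In this toy form the crossing time is simply 1/hazard_rate, which makes the qualitative effect of expectation on reaction time easy to see.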
5
EEG evoked activity suggests amodal evidence integration in multisensory decision-making. J Vis 2022. [DOI: 10.1167/jov.22.14.3963]
6
Distinct higher-order representations of natural sounds in human and ferret auditory cortex. eLife 2021; 10:e65566. [PMID: 34792467] [PMCID: PMC8601661] [DOI: 10.7554/elife.65566]
Abstract
Little is known about how neural representations of natural sounds differ across species. For example, speech and music play a unique role in human hearing, yet it is unclear how auditory representations of speech and music differ between humans and other animals. Using functional ultrasound imaging, we measured responses in ferrets to a set of natural and spectrotemporally matched synthetic sounds previously tested in humans. Ferrets showed lower-level frequency and modulation tuning similar to that observed in humans. However, while humans showed substantially larger responses to natural vs. synthetic speech and music in non-primary regions, ferret responses to natural and synthetic sounds were closely matched throughout primary and non-primary auditory cortex, even when tested with ferret vocalizations. This finding reveals that auditory representations in humans and ferrets diverge sharply at late stages of cortical processing, potentially driven by the higher-order processing demands of speech and music.
7
Characterizing amplitude and frequency modulation cues in natural soundscapes: A pilot study on four habitats of a biosphere reserve. J Acoust Soc Am 2020; 147:3260. [PMID: 32486802] [DOI: 10.1121/10.0001174]
Abstract
Natural soundscapes correspond to the acoustical patterns produced by biological and geophysical sound sources at different spatial and temporal scales in a given habitat. This pilot study aims to characterize the temporal-modulation information available to humans when perceiving variations in soundscapes within and across natural habitats. This is addressed by processing soundscapes from a previous study [Krause, Gage, and Joo (2011). Landscape Ecol. 26, 1247] with models of human auditory processing that extract modulation cues at the output of cochlear filters. The soundscapes represent combinations of elevation, animal diversity, and vegetation diversity in four habitats of the biosphere reserve in Sequoia National Park (Sierra Nevada, USA). Bayesian statistical analyses and support vector machine classifiers indicate that: (i) amplitude-modulation (AM) and frequency-modulation (FM) spectra distinguish the soundscapes associated with each habitat; and (ii) for each habitat, diurnal and seasonal variations are associated with salient changes in AM and FM cues at rates between about 1 and 100 Hz in the low (<0.5 kHz) and high (>1-3 kHz) audio-frequency ranges. Support vector machine classifications further indicate that soundscape variations can be classified accurately based on these perceptually inspired representations.
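The kind of AM cue analyzed in this study can be extracted roughly as follows. This is a simplified single-band sketch, not the paper's cochlear-filterbank pipeline, and the test signal and modulation rate below are invented for illustration: take the envelope of the signal via the analytic signal, then the spectrum of that envelope.

```python
import numpy as np

def am_spectrum(signal, fs):
    """Sketch of an amplitude-modulation (AM) spectrum: extract the
    envelope via the analytic signal, then take the spectrum of the
    envelope. A real auditory model would first split the sound into
    cochlear frequency bands; this single-band version is a
    simplified illustration, not the paper's exact pipeline."""
    n = len(signal)
    # analytic signal via FFT (equivalent to scipy.signal.hilbert)
    spec = np.fft.fft(signal)
    h = np.zeros(n)
    h[0] = 1
    h[1:(n + 1) // 2] = 2
    if n % 2 == 0:
        h[n // 2] = 1
    envelope = np.abs(np.fft.ifft(spec * h))
    envelope -= envelope.mean()             # remove DC before AM analysis
    am = np.abs(np.fft.rfft(envelope)) / n  # AM spectrum of the envelope
    freqs = np.fft.rfftfreq(n, 1 / fs)
    return freqs, am

# A 1 kHz carrier amplitude-modulated at 8 Hz shows an AM peak near 8 Hz:
fs = 4000
t = np.arange(fs) / fs  # 1 s of signal
x = (1 + 0.8 * np.sin(2 * np.pi * 8 * t)) * np.sin(2 * np.pi * 1000 * t)
freqs, am = am_spectrum(x, fs)
print(round(freqs[np.argmax(am[1:]) + 1]))  # 8
```

An FM spectrum would be obtained analogously from the instantaneous frequency (the derivative of the analytic signal's phase) rather than from its magnitude.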
8
Mechanical coupling through the skin affects whisker movements and tactile information encoding. J Neurophysiol 2019; 122:1606-1622. [PMID: 31411931] [DOI: 10.1152/jn.00863.2018]
Abstract
Rats use their whiskers to extract sensory information from their environment. While exploring, they analyze peripheral stimuli distributed over several whiskers. Previous studies have reported cross-whisker integration of information at several levels of the neuronal pathways from whisker follicles to the somatosensory cortex. In the present study, we investigated the possible coupling between whiskers at a preneuronal level, transmitted by the skin and muscles between follicles. First, we quantified the movement induced on one whisker by deflecting another whisker. Our results show significant mechanical coupling, predominantly when a given whisker's caudal neighbor in the same row is deflected. The magnitude of the effect was correlated with the diameter of the deflected whisker. In addition to changes in whisker angle, we observed curvature changes when the whisker shaft was constrained distally from the base. Second, we found that trigeminal ganglion neurons innervating a given whisker follicle fire action potentials in response to high-magnitude deflections of an adjacent whisker. This functional coupling also shows a bias toward the caudal neighbor located in the same row. Finally, we designed a two-whisker biomechanical model to investigate transmission of forces across follicles. Analysis of the whisker-follicle contact forces suggests that activation of mechanoreceptors in the ring sinus region could account for our electrophysiological results. The model can fully explain the observed caudal bias by the gradient in whisker diameter, with a possible contribution of the intrinsic muscles connecting follicles. Overall, our study demonstrates the functional relevance of mechanical coupling to early information processing in the whisker system.
NEW & NOTEWORTHY Rodents explore their environment actively by touching objects with their whiskers. A major challenge is to understand how sensory inputs from different whiskers are merged to form a coherent tactile percept. We demonstrate that external sensory events on one whisker can influence the position of another whisker and, importantly, that they can trigger the activity of mechanoreceptors at its base. This cross-whisker interaction occurs preneuronally, through mechanical transmission of forces in the skin.
9
Temporal binding across senses facilitates change detection within senses. J Vis 2019. [DOI: 10.1167/19.10.19a]
10
Dissociating task acquisition from expression during learning reveals latent knowledge. Nat Commun 2019; 10:2151. [PMID: 31089133] [PMCID: PMC6517418] [DOI: 10.1038/s41467-019-10089-0]
Abstract
Performance on cognitive tasks during learning is used to measure knowledge, yet this approach remains controversial because such testing is susceptible to contextual factors. To what extent does performance during learning depend on the testing context rather than on underlying knowledge? We trained mice, rats, and ferrets on a range of tasks to examine how testing context impacts the acquisition of knowledge versus its expression. We interleaved reinforced trials with probe trials in which we omitted reinforcement. Across tasks, animals of each species performed markedly better in probe trials during learning, and inter-animal variability was strikingly reduced. Reinforcement feedback is thus critical for learning-related behavioral improvements but, paradoxically, masks the expression of underlying knowledge. We capture these results with a network model in which learning occurs during reinforced trials while context modulates only the read-out parameters. Probing learning by omitting reinforcement thus uncovers latent knowledge and identifies context, not “smartness”, as the major source of individual variability. Performance is generally used as a metric to assay whether an animal has learned a particular perceptual task. Here the authors demonstrate that in probe trials without the possibility of reward, animals perform the correct instrumental response, suggesting latent knowledge of the task well before it is manifest in their performance.
11
Multi-scale mapping along the auditory hierarchy using high-resolution functional UltraSound in the awake ferret. eLife 2018; 7:35028. [PMID: 29952750] [PMCID: PMC6039176] [DOI: 10.7554/elife.35028]
Abstract
A major challenge in neuroscience is to longitudinally monitor whole-brain activity across multiple spatial scales in the same animal. Functional UltraSound (fUS) is an emerging technology that offers images of cerebral blood volume over large portions of the brain. Here we show for the first time its capability to resolve the functional organization of sensory systems at multiple scales in awake animals: within small structures, by precisely mapping and differentiating sensory responses, and between structures, by elucidating the connectivity scheme of top-down projections. We demonstrate that fUS provides stable (over days), yet rapid, highly resolved 3D tonotopic maps in the auditory pathway of awake ferrets, revealing its unprecedented functional resolution (100/300 µm). This was achieved in four different brain regions, including very small (1–2 mm³), deeply situated (8 mm deep) subcortical structures, and structures previously undescribed in the ferret. Furthermore, we used fUS to map long-distance projections from frontal cortex, a key source of sensory response modulation, to auditory cortex.
12
Evidence Integration in Natural Acoustic Textures during Active and Passive Listening. eNeuro 2018; 5:ENEURO.0090-18.2018. [PMID: 29662943] [PMCID: PMC5898696] [DOI: 10.1523/eneuro.0090-18.2018]
Abstract
Many natural sounds, for example wind, rain, or applause, can be well described on a statistical level. Even though the spectro-temporal profile of these acoustic textures is highly dynamic, changes in their statistics are indicative of relevant changes in the environment. Here, we investigated the neural representation of change detection in natural textures in humans, and specifically addressed whether active task engagement is required for the neural representation of this change in statistics. Subjects listened to natural textures whose spectro-temporal statistics were modified at variable times by a variable amount. Subjects were instructed either to report the detection of changes (active) or to passively listen to the stimuli. A subset of passive subjects had performed the active task before (passive-aware vs. passive-naive). Psychophysically, longer exposure to pre-change statistics was correlated with faster reaction times and better discrimination performance. EEG recordings revealed that the build-up rate and size of parieto-occipital (PO) potentials reflected change size and change time. Reduced effects were observed in the passive conditions. While P2 responses were comparable across conditions, the slope and height of PO potentials scaled with task involvement. Neural source localization identified a parietal source as the main contributor of change-specific potentials, in addition to more limited contributions from auditory and frontal sources. In summary, the detection of statistical changes in natural acoustic textures is predominantly reflected at parietal locations, both at the scalp and at the source level. The scaling in magnitude across different levels of task involvement suggests a context-dependent degree of evidence integration.
13
Abstract
Natural sounds, such as wind or rain, are characterized by the statistical occurrence of their constituents. Despite their complexity, listeners readily detect changes in these contexts. We here address the neural basis of statistical decision-making using a combination of psychophysics, EEG, and modelling. In a texture-based change-detection paradigm, human performance and reaction times improved with longer pre-change exposure, consistent with improved estimation of baseline statistics. Change-locked and decision-related EEG responses were found at a centro-parietal scalp location, whose slope depended on change size, consistent with sensory evidence accumulation. The potential's amplitude scaled with the duration of pre-change exposure, suggesting a time-dependent decision threshold. Auditory cortex-related potentials showed no response to the change. A dual-timescale statistical estimation model accounted for subjects' performance. Furthermore, a decision-augmented auditory cortex model accounted for performance and reaction times, suggesting that the primary cortical representation requires little post-processing to enable change detection in complex acoustic environments. [DOI: 10.7554/eLife.24910.001]
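A dual-timescale estimator of the kind mentioned above can be sketched as two exponential running means compared against a threshold. This is an illustrative toy detector only; the smoothing rates, threshold, and simulated change are assumptions, not the paper's fitted model.

```python
import numpy as np

def detect_change(samples, fast=0.2, slow=0.01, threshold=2.0):
    """Toy dual-timescale change detector: a slow running mean tracks
    baseline statistics while a fast running mean tracks the recent
    past; a change is flagged when the two diverge. Illustrative
    sketch only -- parameter values are assumptions, not fitted ones."""
    m_fast = m_slow = samples[0]
    for i, x in enumerate(samples):
        m_fast += fast * (x - m_fast)  # fast exponential average
        m_slow += slow * (x - m_slow)  # slow baseline estimate
        if abs(m_fast - m_slow) > threshold:
            return i                   # sample index at detection
    return None

rng = np.random.default_rng(0)
pre = rng.normal(0.0, 1.0, 500)   # baseline statistics
post = rng.normal(4.0, 1.0, 200)  # mean shift at sample 500
idx = detect_change(np.concatenate([pre, post]))
print(idx is not None and idx >= 500)  # True
```

Longer pre-change exposure lets the slow estimate settle, which is one intuition for why performance improved with exposure duration in the study.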
14
Whisker Contact Detection of Rodents Based on Slow and Fast Mechanical Inputs. Front Behav Neurosci 2017; 10:251. [PMID: 28119582] [PMCID: PMC5222834] [DOI: 10.3389/fnbeh.2016.00251]
Abstract
Rodents use their whiskers to locate nearby objects with extreme precision. To perform such tasks, they need to detect whisker/object contacts with high temporal accuracy. This contact detection is conveyed by classes of mechanoreceptors whose neural activity is sensitive to either slow or fast time-varying mechanical stresses acting at the base of the whiskers. We developed a biomimetic approach to separate and characterize slow quasi-static and fast vibrational stress signals acting on the whisker base during realistic exploratory phases, using experiments on both real and artificial whiskers. Both slow and fast mechanical inputs are successfully captured by a mechanical model of the whisker. We present and discuss consequences of the whisking process in purely mechanical terms and hypothesize that free whisking in air sets a mechanical threshold for contact detection. The time resolution and robustness of contact-detection strategies based on either slow or fast stress signals are determined. Contact detection based on the vibrational signal is faster and more robust to exploratory conditions than detection based on the slow quasi-static component, although both components allow localizing the object.
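The slow/fast decomposition and the free-whisking threshold hypothesis can be mimicked with a simple numerical sketch. The signal, noise levels, and threshold factor below are hypothetical, and a moving average stands in for the paper's mechanical model.

```python
import numpy as np

def split_slow_fast(signal, window=25):
    """Separate a whisker-base stress signal into a slow quasi-static
    component (moving average) and a fast vibrational residual.
    A simplified stand-in for the paper's mechanical decomposition."""
    kernel = np.ones(window) / window
    slow = np.convolve(signal, kernel, mode="same")
    fast = signal - slow
    return slow, fast

def contact_detected(fast, free_whisking_fast, k=6.0):
    """Flag contact when the vibrational component exceeds a threshold
    set by free whisking in air (k standard deviations), echoing the
    hypothesis that free whisking sets the detection floor."""
    threshold = k * np.std(free_whisking_fast)
    return np.abs(fast) > threshold

rng = np.random.default_rng(1)
air = rng.normal(0.0, 0.01, 1000)  # free-whisking vibration baseline
touch = air.copy()
touch[600:610] += 0.2              # brief contact-induced transient
_, fast_air = split_slow_fast(air)
_, fast_touch = split_slow_fast(touch)
hits = contact_detected(fast_touch, fast_air)
print(bool(hits.any()))  # the transient crosses the free-whisking threshold
```

The design choice mirrors the abstract's argument: the baseline vibration during free whisking, not an arbitrary constant, defines what counts as a contact event.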
15
Change Detection in Auditory Textures. Adv Exp Med Biol 2016; 894:229-239. [PMID: 27080663] [DOI: 10.1007/978-3-319-25474-6_24]
Abstract
Many natural sounds, e.g. wind, fire, or rain, have spectrotemporal signatures only at a statistical level. While their local structure is highly variable, the spectrotemporal statistics of these auditory textures can be used for recognition. This suggests the existence of a neural representation of these statistics. To explore their encoding, we investigated the detectability of changes in spectral statistics in relation to the properties of the change. To achieve precise parameter control, we designed a minimal sound texture, a modified cloud of tones, which retains the central property of auditory textures: solely statistical predictability. Listeners had to rapidly detect a change in the frequency marginal probability of the tone cloud, occurring at a random time. The size of the change as well as the time available to sample the original statistics were found to correlate positively with performance and negatively with reaction time, suggesting the accumulation of noisy evidence. In summary, we quantified dynamic aspects of change detection in statistically defined contexts and found evidence of integration of statistical information.
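The stimulus described here is easy to sketch: a tone cloud is a sequence of tones drawn from a marginal probability distribution over frequencies, and a change in statistics is a switch of that marginal. The frequencies and probabilities below are invented for illustration, not the study's values.

```python
import numpy as np

def tone_cloud(n_tones, marginal, freqs, rng):
    """Draw a sequence of tone frequencies from a marginal probability
    distribution over a fixed frequency set -- a minimal 'tone cloud'
    in the spirit of the paper's stimulus, with assumed values."""
    return rng.choice(freqs, size=n_tones, p=marginal)

rng = np.random.default_rng(0)
freqs = np.array([500.0, 1000.0, 2000.0, 4000.0])  # Hz, hypothetical set
pre_marginal = np.array([0.25, 0.25, 0.25, 0.25])  # uniform before the change
post_marginal = np.array([0.10, 0.40, 0.40, 0.10]) # biased after the change
cloud = np.concatenate([
    tone_cloud(200, pre_marginal, freqs, rng),     # baseline statistics
    tone_cloud(200, post_marginal, freqs, rng),    # changed statistics
])
# The post-change segment over-represents the middle frequencies:
mid = np.isin(cloud, [1000.0, 2000.0])
print(mid[200:].mean() > mid[:200].mean())  # True
```

Any individual tone sequence is unpredictable; only the marginal changes, which is exactly the "solely statistical predictability" property the paragraph emphasizes.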
16
Whisker encoding of mechanical events during active tactile exploration. Front Behav Neurosci 2012; 6:74. [PMID: 23133410] [PMCID: PMC3490139] [DOI: 10.3389/fnbeh.2012.00074]
Abstract
Rats use their whiskers to extract a wealth of information about their immediate environment, such as the shape, position, or texture of an object. The information is conveyed to mechanoreceptors located within the whisker follicle in the form of a sequence of whisker deflections induced by the whisker/object contact interaction. How the whiskers filter and shape the mechanical information and effectively participate in the coding of tactile features remains an open question to date. In the present article, a biomechanical model was developed that provides predictions of whisker dynamics during active tactile exploration, amenable to quantitative experimental comparison. This model is based on a decomposition of the whisker profile into a slow, quasi-static sequence and rapid resonant small-scale vibrations. It was applied to the typical situation of a rat actively whisking across a solid object. Having derived the quasi-static sequence of whisker deformation, the resonant properties of the whisker were analyzed, taking into account the boundary conditions imposed by the whisker/surface contact. We then focused on two elementary mechanical events that are expected to trigger significant neural responses, namely (1) the whisker/object first contact and (2) the whisker detachment from the object. Both events were found to trigger a deflection wave propagating upward to the mystacial pad at a constant velocity of ≈3–5 m/s. This yielded a characteristic mechanical signature at the whisker base, in the form of a large peak of negative curvature occurring ≈4 ms after the event has been triggered. The dependence of the amplitude and lag of this mechanical signal on the main contextual parameters (such as radial or angular distance) was investigated. The model was validated experimentally by comparing its predictions to high-speed video recordings of shock-induced whisker deflections performed on anesthetized rats. The consequences of these results for possible tactile encoding schemes are briefly discussed.
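As a rough consistency check on the numbers reported above (with an assumed path length, since the exact geometry is not given here), a centimeter-scale travel distance at the reported 3–5 m/s propagation speed indeed yields a lag of a few milliseconds:

```python
def arrival_lag_ms(path_length_m, speed_m_per_s):
    """Time for a deflection wave to travel from the contact point to
    the whisker base, in milliseconds. Path length is an assumption."""
    return path_length_m / speed_m_per_s * 1000.0

# A ~2 cm path at 5 m/s gives a lag on the order of the ~4 ms
# signature reported at the whisker base:
print(round(arrival_lag_ms(0.02, 5.0), 3))  # 4.0
```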