1
Nahhas MK, Türp JC, Cattin P, Gerig N, Wilhelm E, Rauter G. Toward Wearables for Bruxism Detection: Voluntary Oral Behaviors Sound Recorded Across the Head Depend on Transducer Placement. Clin Exp Dent Res 2024; 10:e70001. [PMID: 39308130] [PMCID: PMC11417139] [DOI: 10.1002/cre2.70001]
Abstract
OBJECTIVES: Bruxism is a parafunctional orofacial behavior. For diagnosis, wearable devices that use sounds as biomarkers can provide the necessary information. Human beings emit various verbal and nonverbal sounds, making it challenging to identify bruxism-induced sounds. We investigated whether the acoustic emissions of different oral behaviors have distinctive characteristics and whether transducer placement affects the recorded sound signals. MATERIALS AND METHODS: Sounds from five oral behaviors were investigated: jaw clenching, teeth grinding, reading, eating, and drinking. Eight transducers were used; six were attached to the temporal, frontal, and zygomatic bones with medical tape, and two were integrated into two commercial earphones. Data from 15 participants were analyzed using time-domain energy, spectral flux, and zero crossing rate (ZCR). RESULTS: All oral behaviors showed distinct characteristic features except jaw clenching, although a peak, possibly due to tooth tapping, appeared in the recording before its expected onset. For teeth grinding, transducer placement had no significant impact (p > 0.05) on energy, spectral flux, or ZCR. For jaw clenching, placement had a significant impact on spectral flux (p < 0.01). For reading and eating, placement had a significant impact on energy (p < 0.05 for reading, p < 0.01 for eating), spectral flux (p < 0.001 for reading, p < 0.01 for eating), and ZCR (p < 0.001 for both). For drinking, placement had a significant impact only on ZCR (p < 0.01). CONCLUSIONS: We were able to record the sounds of various oral behaviors from different locations on the head. However, the ears were an advantageous transducer location, since they can compensate for various head movements and ear-worn devices are socially acceptable.
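The three descriptors named in the abstract (time-domain energy, spectral flux, ZCR) are standard short-time audio features. A minimal NumPy sketch of their textbook definitions (these formulas are common conventions, not taken from the paper, and the example signal is hypothetical) might look like:

```python
import numpy as np

def short_time_features(frame, prev_frame):
    """Frame-level acoustic features often used to characterize
    oral-behavior sounds: energy, spectral flux, and ZCR."""
    # Time-domain energy: sum of squared samples in the frame.
    energy = np.sum(frame ** 2)
    # Zero crossing rate: fraction of adjacent sample pairs
    # whose signs differ.
    zcr = np.mean(np.abs(np.diff(np.signbit(frame).astype(int))))
    # Spectral flux: L2 distance between the normalized
    # magnitude spectra of successive frames.
    def norm_spectrum(x):
        mag = np.abs(np.fft.rfft(x))
        s = mag.sum()
        return mag / s if s > 0 else mag
    flux = np.sqrt(np.sum((norm_spectrum(frame) - norm_spectrum(prev_frame)) ** 2))
    return energy, flux, zcr

# Example: a 440 Hz tone frame following a silent frame.
fs = 8000
t = np.arange(256) / fs
tone = np.sin(2 * np.pi * 440 * t)
energy, flux, zcr = short_time_features(tone, np.zeros(256))
```

A steady tone yields high energy and low ZCR, while broadband chewing or grinding noise would raise both ZCR and flux, which is the intuition behind using these features to separate the behaviors.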
Affiliation(s)
- Mohammad Khair Nahhas
- BIROMED‐Lab, Department of Biomedical Engineering, University of Basel, Allschwil, Switzerland
- Jens Christoph Türp
- Division of Temporomandibular Disorders and Orofacial Pain, Department of Oral Health and Medicine, University Center for Dental Medicine Basel UZB, Basel, Switzerland
- Philippe Cattin
- CIAN, Department of Biomedical Engineering, University of Basel, Allschwil, Switzerland
- Nicolas Gerig
- BIROMED‐Lab, Department of Biomedical Engineering, University of Basel, Allschwil, Switzerland
- Elisabeth Wilhelm
- Discrete Technology and Production Automation, Engineering and Technology Institute Groningen, Faculty of Science and Engineering, University of Groningen, Groningen, The Netherlands
- Georg Rauter
- BIROMED‐Lab, Department of Biomedical Engineering, University of Basel, Allschwil, Switzerland
2
Pérez-Valenzuela C, Vicencio-Jiménez S, Caballero M, Delano PH, Elgueda D. Wireless electrocochleography in awake chinchillas: A model to study crossmodal modulations at the peripheral level. Hear Res 2024; 451:109093. [PMID: 39094370] [DOI: 10.1016/j.heares.2024.109093]
Abstract
The discovery and development of electrocochleography (ECochG) in animal models has been fundamental for its implementation in clinical audiology and neurotology. In our laboratory, the use of round-window ECochG recordings in chinchillas has allowed a better understanding of auditory efferent functioning. In previous work, we provided evidence of the corticofugal modulation of auditory-nerve and cochlear responses during visual attention and working memory. However, whether these cognitive top-down mechanisms to the most peripheral structures of the auditory pathway are also active during audiovisual crossmodal stimulation is unknown. Here, we introduce a new technique, wireless ECochG, to record compound action potentials of the auditory nerve (CAP), cochlear microphonics (CM), and round-window noise (RWN) in awake chinchillas during a paradigm of crossmodal (visual and auditory) stimulation. We compared ECochG data obtained from four awake chinchillas recorded with a wireless ECochG system with wired ECochG recordings from six anesthetized animals. Although ECochG experiments with the wireless system had a lower signal-to-noise ratio than wired recordings, their quality was sufficient to compare ECochG potentials in awake crossmodal conditions. We found non-significant differences in CAP and CM amplitudes in response to audiovisual stimulation compared to auditory stimulation alone (clicks and tones). On the other hand, spontaneous auditory-nerve activity (RWN) was modulated by visual crossmodal stimulation, suggesting that visual crossmodal stimulation can modulate spontaneous but not evoked auditory-nerve activity. However, given the limited sample of 10 animals (4 wireless and 6 wired), these results should be interpreted cautiously. Future experiments are required to substantiate these conclusions. In addition, we introduce the use of wireless ECochG in animal models as a useful tool for translational research.
Affiliation(s)
- Sergio Vicencio-Jiménez
- Departamento de Neurociencia, Facultad de Medicina, Universidad de Chile, Santiago, Chile; Johns Hopkins School of Medicine, Otolaryngology-Head and Neck Surgery Department, Baltimore, MD 21231, USA; Biomedical Neuroscience Institute, Facultad de Medicina, Universidad de Chile, Santiago, Chile
- Mia Caballero
- Departamento de Neurociencia, Facultad de Medicina, Universidad de Chile, Santiago, Chile
- Paul H Delano
- Departamento de Neurociencia, Facultad de Medicina, Universidad de Chile, Santiago, Chile; Servicio Otorrinolaringología, Hospital Clínico de la Universidad de Chile, Santiago, Chile; Centro Avanzado de Ingeniería Eléctrica y Electrónica, AC3E, Universidad Técnica Federico Santa María, Valparaíso, Chile; Biomedical Neuroscience Institute, Facultad de Medicina, Universidad de Chile, Santiago, Chile
- Diego Elgueda
- Departamento de Patología Animal, Facultad de Ciencias Veterinarias y Pecuarias, Universidad de Chile 8820808, Santiago, Chile
3
Bidelman GM, Sisson A, Rizzi R, MacLean J, Baer K. Myogenic artifacts masquerade as neuroplasticity in the auditory frequency-following response. Front Neurosci 2024; 18:1422903. [PMID: 39040631] [PMCID: PMC11260751] [DOI: 10.3389/fnins.2024.1422903]
Abstract
The frequency-following response (FFR) is an evoked potential that provides a neural index of complex sound encoding in the brain. FFRs have been widely used to characterize speech and music processing and experience-dependent neuroplasticity (e.g., learning and musicianship), and as biomarkers for hearing and language-based disorders that distort receptive communication abilities. It is widely assumed that FFRs stem from a mixture of phase-locked neurogenic activity from brainstem and cortical structures along the hearing neuraxis. In this study, we challenge this prevailing view by demonstrating that upwards of ~50% of the FFR can originate from an unexpected myogenic source: contamination from the postauricular muscle (PAM) vestigial startle reflex. We measured PAM, transient auditory brainstem responses (ABRs), and sustained frequency-following response (FFR) potentials reflecting myogenic (PAM) and neurogenic (ABR/FFR) responses in young, normal-hearing listeners with varying degrees of musical training. We first establish that PAM artifact is present in all ears, varies with electrode proximity to the muscle, and can be experimentally manipulated by directing listeners' eye gaze toward the ear of sound stimulation. We then show this muscular noise easily confounds auditory FFRs, spuriously amplifying responses 3-4-fold with tandem PAM contraction and even explaining putative FFR enhancements observed in highly skilled musicians. Our findings expose a new and unrecognized myogenic contribution to the FFR that drives its large inter-subject variability and cast doubt on whether changes in the response typically attributed to neuroplasticity/pathology are solely of brain origin.
Affiliation(s)
- Gavin M. Bidelman
- Department of Speech, Language and Hearing Sciences, Indiana University, Bloomington, IN, United States
- Program in Neuroscience, Indiana University, Bloomington, IN, United States
- Cognitive Science Program, Indiana University, Bloomington, IN, United States
- Alexandria Sisson
- Department of Speech, Language and Hearing Sciences, Indiana University, Bloomington, IN, United States
- Rose Rizzi
- Department of Speech, Language and Hearing Sciences, Indiana University, Bloomington, IN, United States
- Program in Neuroscience, Indiana University, Bloomington, IN, United States
- Jessica MacLean
- Department of Speech, Language and Hearing Sciences, Indiana University, Bloomington, IN, United States
- Program in Neuroscience, Indiana University, Bloomington, IN, United States
- Kaitlin Baer
- School of Communication Sciences and Disorders, University of Memphis, Memphis, TN, United States
- Veterans Affairs Medical Center, Memphis, TN, United States
4
Prendergast G, Sathe TS, Heinrich A, Munro KJ. Acoustic reflexes: should we be paying more attention? Int J Audiol 2024; 63:221-225. [PMID: 36811451] [DOI: 10.1080/14992027.2023.2174455]
Abstract
OBJECTIVE: The clinical audiology test battery often involves playing physically simple sounds of questionable ecological value to the listener. In this technical report, we revisit how valid this approach is using an automated, involuntary auditory response: the acoustic reflex threshold (ART). DESIGN: The ART was estimated four times in each individual in a quasi-random ordering of task conditions. The baseline condition (referred to as Neutral) measured the ART following standard clinical practice. Three experimental conditions were then used in which a secondary task was performed while the reflex was measured: auditory attention, auditory distraction, and visual distraction tasks. STUDY SAMPLE: Thirty-eight participants (27 males) with a mean age of 23 years were tested. All participants were audiometrically healthy. RESULTS: The ART was elevated when a visual task was performed at the same time as the measurements were taken. Performing an auditory task did not affect the ART. CONCLUSIONS: These data indicate that simple audiometric measures widely used in the clinic can be affected by central, non-auditory processes, even in healthy, normal-hearing volunteers. The role of cognition and attention in auditory responses will become ever more important in the coming years.
Affiliation(s)
- Garreth Prendergast
- Manchester Centre for Audiology and Deafness (ManCAD), School of Health Sciences, The University of Manchester, Manchester, UK
- Tanvi S Sathe
- Manchester Centre for Audiology and Deafness (ManCAD), School of Health Sciences, The University of Manchester, Manchester, UK
- Antje Heinrich
- Manchester Centre for Audiology and Deafness (ManCAD), School of Health Sciences, The University of Manchester, Manchester, UK
- Kevin J Munro
- Manchester Centre for Audiology and Deafness (ManCAD), School of Health Sciences, The University of Manchester, Manchester, UK
- Manchester Academic Health Science Centre, Manchester University Hospitals NHS Foundation Trust, Manchester, UK
5
Hu J, Vetter P. How the eyes respond to sounds. Ann N Y Acad Sci 2024; 1532:18-36. [PMID: 38152040] [DOI: 10.1111/nyas.15093]
Abstract
Eye movements have been extensively studied with respect to visual stimulation. However, we live in a multisensory world, and how the eyes are driven by other senses has been explored much less. Here, we review the evidence on how audition can trigger and drive different eye responses and which cortical and subcortical neural correlates are involved. We provide an overview on how different types of sounds, from simple tones and noise bursts to spatially localized sounds and complex linguistic stimuli, influence saccades, microsaccades, smooth pursuit, pupil dilation, and eye blinks. The reviewed evidence reveals how the auditory system interacts with the oculomotor system, both behaviorally and neurally, and how this differs from visually driven eye responses. Some evidence points to multisensory interaction, and potential multisensory integration, but the underlying computational and neural mechanisms are still unclear. While there are marked differences in how the eyes respond to auditory compared to visual stimuli, many aspects of auditory-evoked eye responses remain underexplored, and we summarize the key open questions for future research.
Affiliation(s)
- Junchao Hu
- Visual and Cognitive Neuroscience Lab, Department of Psychology, University of Fribourg, Fribourg, Switzerland
- Petra Vetter
- Visual and Cognitive Neuroscience Lab, Department of Psychology, University of Fribourg, Fribourg, Switzerland
6
King CD, Lovich SN, Murphy DL, Landrum R, Kaylie D, Shera CA, Groh JM. Individual similarities and differences in eye-movement-related eardrum oscillations (EMREOs). Hear Res 2023; 440:108899. [PMID: 37979436] [PMCID: PMC11081086] [DOI: 10.1016/j.heares.2023.108899]
Abstract
We recently discovered a unique type of otoacoustic emission (OAE) time-locked to the onset (and offset) of saccadic eye movements and occurring in the absence of external sound (Gruters et al., 2018). How and why these eye-movement-related eardrum oscillations (EMREOs) are generated is unknown, with a role in visual-auditory integration being the likeliest candidate. Clues to both the drivers of EMREOs and their purpose can be gleaned by examining responses in normal hearing human subjects. Do EMREOs occur in all individuals with normal hearing? If so, what components of the response occur most consistently? Understanding which attributes of EMREOs are similar across participants and which show more variability will provide the groundwork for future comparisons with individuals with hearing abnormalities affecting the ear's various motor components. Here we report that in subjects with normal hearing thresholds and normal middle ear function, all ears show (a) measurable EMREOs (mean: 58.7 dB SPL; range 45-67 dB SPL for large contralateral saccades), (b) a phase reversal for contra- versus ipsilaterally-directed saccades, (c) a large peak in the signal occurring soon after saccade onset, (d) an additional large peak time-locked to saccade offset and (e) evidence that saccade duration is encoded in the signal. We interpret the attributes of EMREOs that are most consistent across subjects as the ones that are most likely to play an essential role in their function. The individual differences likely reflect normal variation in individuals' auditory system anatomy and physiology, much like traditional measures of auditory function such as auditory-evoked OAEs, tympanometry and auditory-evoked potentials. Future work will compare subjects with different types of auditory dysfunction to population data from normal hearing subjects. 
Overall, these findings provide important context for the widespread observations of visual- and eye-movement related signals found in cortical and subcortical auditory areas of the brain.
Affiliation(s)
- Cynthia D King
- Department of Neurobiology, Duke University Medical Center, Durham, NC, USA; Department of Psychology and Neuroscience, Duke University, Durham, NC, USA; Center for Cognitive Neuroscience, Duke University, Durham, NC, USA; Duke Institute for Brain Sciences, Duke University, Durham, NC, USA
- Stephanie N Lovich
- Department of Neurobiology, Duke University Medical Center, Durham, NC, USA; Department of Psychology and Neuroscience, Duke University, Durham, NC, USA; Center for Cognitive Neuroscience, Duke University, Durham, NC, USA; Duke Institute for Brain Sciences, Duke University, Durham, NC, USA
- David LK Murphy
- Department of Psychology and Neuroscience, Duke University, Durham, NC, USA; Center for Cognitive Neuroscience, Duke University, Durham, NC, USA; Duke Institute for Brain Sciences, Duke University, Durham, NC, USA
- Rachel Landrum
- Department of Neurobiology, Duke University Medical Center, Durham, NC, USA; Department of Psychology and Neuroscience, Duke University, Durham, NC, USA; Center for Cognitive Neuroscience, Duke University, Durham, NC, USA; Duke Institute for Brain Sciences, Duke University, Durham, NC, USA
- David Kaylie
- Department of Otolaryngology, Duke University Medical Center, Durham, NC, USA
- Christopher A Shera
- Department of Otolaryngology, University of Southern California, Los Angeles, CA, USA
- Jennifer M Groh
- Department of Neurobiology, Duke University Medical Center, Durham, NC, USA; Department of Psychology and Neuroscience, Duke University, Durham, NC, USA; Center for Cognitive Neuroscience, Duke University, Durham, NC, USA; Duke Institute for Brain Sciences, Duke University, Durham, NC, USA; Department of Computer Science, Duke University, Durham, NC, USA; Department of Biomedical Engineering, Duke University, Durham, NC, USA
7
Lovich SN, King CD, Murphy DLK, Landrum RE, Shera CA, Groh JM. Parametric information about eye movements is sent to the ears. Proc Natl Acad Sci U S A 2023; 120:e2303562120. [PMID: 37988462] [PMCID: PMC10691342] [DOI: 10.1073/pnas.2303562120]
Abstract
Eye movements alter the relationship between the visual and auditory spatial scenes. Signals related to eye movements affect neural pathways from the ear through auditory cortex and beyond, but how these signals contribute to computing the locations of sounds with respect to the visual scene is poorly understood. Here, we evaluated the information contained in eye movement-related eardrum oscillations (EMREOs), pressure changes recorded in the ear canal that occur in conjunction with simultaneous eye movements. We show that EMREOs contain parametric information about horizontal and vertical eye displacement as well as initial/final eye position with respect to the head. The parametric information in the horizontal and vertical directions can be modeled as combining linearly, allowing accurate prediction of the EMREOs associated with oblique (diagonal) eye movements. Target location can also be inferred from the EMREO signals recorded during eye movements to those targets. We hypothesize that the (currently unknown) mechanism underlying EMREOs could impose a two-dimensional eye-movement-related transfer function on any incoming sound, permitting subsequent processing stages to compute the positions of sounds in relation to the visual scene.
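The abstract's central modeling claim, that horizontal and vertical parametric information combines linearly to predict EMREOs for oblique saccades, can be illustrated with a toy sketch. The basis waveforms and amplitudes below are hypothetical, purely for illustration; only the additive structure reflects the reported finding.

```python
import numpy as np

# Hypothetical per-degree basis waveforms: the EMREO contribution
# per degree of horizontal and of vertical eye displacement.
t = np.linspace(0, 0.1, 200)                # 100 ms after saccade onset
h_basis = np.sin(2 * np.pi * 30 * t)        # horizontal component (per deg)
v_basis = 0.5 * np.cos(2 * np.pi * 30 * t)  # vertical component (per deg)

def predict_emreo(dx_deg, dy_deg):
    """Predict the EMREO for a saccade as a linear combination
    of its horizontal and vertical displacement components."""
    return dx_deg * h_basis + dy_deg * v_basis

# Under this model, an oblique 10 deg rightward / 5 deg upward
# saccade is exactly the sum of the purely horizontal and purely
# vertical predictions.
oblique = predict_emreo(10, 5)
assert np.allclose(oblique, predict_emreo(10, 0) + predict_emreo(0, 5))
```

In practice the basis waveforms would be estimated by regression from EMREOs recorded during purely horizontal and purely vertical saccades, then evaluated against recordings from oblique movements.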
Affiliation(s)
- Stephanie N. Lovich
- Department of Psychology and Neuroscience, Duke University, Durham, NC 27708
- Department of Neurobiology, Duke University, Durham, NC 27710
- Center for Cognitive Neuroscience, Duke University, Durham, NC 27708
- Duke Institute for Brain Sciences, Duke University, Durham, NC 27708
- Cynthia D. King
- Department of Psychology and Neuroscience, Duke University, Durham, NC 27708
- Department of Neurobiology, Duke University, Durham, NC 27710
- Center for Cognitive Neuroscience, Duke University, Durham, NC 27708
- Duke Institute for Brain Sciences, Duke University, Durham, NC 27708
- David L. K. Murphy
- Department of Psychology and Neuroscience, Duke University, Durham, NC 27708
- Center for Cognitive Neuroscience, Duke University, Durham, NC 27708
- Duke Institute for Brain Sciences, Duke University, Durham, NC 27708
- Rachel E. Landrum
- Department of Psychology and Neuroscience, Duke University, Durham, NC 27708
- Department of Neurobiology, Duke University, Durham, NC 27710
- Center for Cognitive Neuroscience, Duke University, Durham, NC 27708
- Duke Institute for Brain Sciences, Duke University, Durham, NC 27708
- Christopher A. Shera
- Department of Otolaryngology, University of Southern California, Los Angeles, CA 90007
- Jennifer M. Groh
- Department of Psychology and Neuroscience, Duke University, Durham, NC 27708
- Department of Neurobiology, Duke University, Durham, NC 27710
- Center for Cognitive Neuroscience, Duke University, Durham, NC 27708
- Duke Institute for Brain Sciences, Duke University, Durham, NC 27708
- Department of Computer Science, Duke University, Durham, NC 27708
- Department of Biomedical Engineering, Duke University, Durham, NC 27708
8
Bröhl F, Kayser C. Detection of Spatially Localized Sounds Is Robust to Saccades and Concurrent Eye Movement-Related Eardrum Oscillations (EMREOs). J Neurosci 2023; 43:7668-7677. [PMID: 37734948] [PMCID: PMC10634546] [DOI: 10.1523/jneurosci.0818-23.2023]
Abstract
Hearing is an active process, and recent studies show that even the ear is affected by cognitive states or motor actions. One example is the movement of the eardrum induced by saccadic eye movements, known as "eye movement-related eardrum oscillations" (EMREOs). While these are systematically shaped by the direction and size of saccades, the consequences of saccadic eye movements and their resulting EMREOs for hearing remain unclear. We here studied their implications for the detection of near-threshold clicks in human participants. Across three experiments, sound detection was not affected by the time of presentation relative to saccade onset, nor by saccade amplitude or direction. While the EMREOs were shaped by the direction and amplitude of the saccadic movement, inducing covert shifts in spatial attention did not affect the EMREO, suggesting that this signature of active sensing is restricted to overt changes in visual focus. Importantly, in our experiments, fluctuations in EMREO amplitude were not related to detection performance, at least when monaural cues are sufficient. Hence, while eye movements may shape the transduction of acoustic information, the behavioral implications remain to be understood.
SIGNIFICANCE STATEMENT: Previous studies suggest that oculomotor behavior may influence how we perceive spatially localized sounds. Recent work has introduced a new perspective on this question by showing that eye movements can directly modulate the eardrum. Yet, it remains unclear whether this signature of active hearing accounts for behavioral effects. We here show that overt but not covert changes in visual attention modulate the eardrum, but these modulations do not interfere with the detection of sounds. Our results provide a starting point for a deeper understanding of the interplay between oculomotor behavior and the active ear.
Affiliation(s)
- Felix Bröhl
- Department for Cognitive Neuroscience, Faculty of Biology, Bielefeld University, 33615 Bielefeld, Germany
- Center for Cognitive Interaction Technology, Bielefeld University, 33615 Bielefeld, Germany
- Christoph Kayser
- Department for Cognitive Neuroscience, Faculty of Biology, Bielefeld University, 33615 Bielefeld, Germany
- Center for Cognitive Interaction Technology, Bielefeld University, 33615 Bielefeld, Germany
9
Lovich SN, King CD, Murphy DLK, Abbasi H, Bruns P, Shera CA, Groh JM. Conserved features of eye movement related eardrum oscillations (EMREOs) across humans and monkeys. Philos Trans R Soc Lond B Biol Sci 2023; 378:20220340. [PMID: 37545299] [PMCID: PMC10404921] [DOI: 10.1098/rstb.2022.0340]
Abstract
Auditory and visual information involve different coordinate systems, with auditory spatial cues anchored to the head and visual spatial cues anchored to the eyes. Information about eye movements is therefore critical for reconciling visual and auditory spatial signals. The recent discovery of eye movement-related eardrum oscillations (EMREOs) suggests that this process could begin as early as the auditory periphery. How this reconciliation might happen remains poorly understood. Because humans and monkeys both have mobile eyes and therefore both must perform this shift of reference frames, comparison of the EMREO across species can provide insight into shared, and therefore important, parameters of the signal. Here we show that rhesus monkeys, like humans, have a consistent, significant EMREO signal that carries parametric information about eye displacement as well as onset times of eye movements. The dependence of the EMREO on the horizontal displacement of the eye is its most consistent feature and is shared across behavioural tasks, subjects and species. Differences chiefly involve the waveform frequency (higher in monkeys than in humans), the patterns of individual variation (more prominent in monkeys than in humans), and the waveform of the EMREO once factors due to horizontal and vertical eye displacements were controlled for. This article is part of the theme issue 'Decision and control processes in multisensory perception'.
Affiliation(s)
- Stephanie N. Lovich
- Department of Psychology and Neuroscience, Duke University, Durham, NC 27708-0187, USA
- Department of Neurobiology, Duke University, Durham, NC 27708-0187, USA
- Center for Cognitive Neuroscience, Duke University, Durham, NC 27708-0187, USA
- Duke Institute for Brain Sciences, Duke University, Durham, NC 27708-0187, USA
- Cynthia D. King
- Department of Psychology and Neuroscience, Duke University, Durham, NC 27708-0187, USA
- Department of Neurobiology, Duke University, Durham, NC 27708-0187, USA
- Center for Cognitive Neuroscience, Duke University, Durham, NC 27708-0187, USA
- Duke Institute for Brain Sciences, Duke University, Durham, NC 27708-0187, USA
- David L. K. Murphy
- Department of Psychology and Neuroscience, Duke University, Durham, NC 27708-0187, USA
- Center for Cognitive Neuroscience, Duke University, Durham, NC 27708-0187, USA
- Duke Institute for Brain Sciences, Duke University, Durham, NC 27708-0187, USA
- Department of Psychiatry and Behavioral Sciences, Duke University, Durham, NC 27708-0187, USA
- Hossein Abbasi
- Biological Psychology and Neuropsychology, University of Hamburg, 20146 Hamburg, Germany
- Patrick Bruns
- Biological Psychology and Neuropsychology, University of Hamburg, 20146 Hamburg, Germany
- Christopher A. Shera
- Department of Otolaryngology, University of Southern California, Los Angeles, CA 90033, USA
- Jennifer M. Groh
- Department of Psychology and Neuroscience, Duke University, Durham, NC 27708-0187, USA
- Department of Neurobiology, Duke University, Durham, NC 27708-0187, USA
- Center for Cognitive Neuroscience, Duke University, Durham, NC 27708-0187, USA
- Duke Institute for Brain Sciences, Duke University, Durham, NC 27708-0187, USA
- Department of Computer Science, Duke University, Durham, NC 27708-0187, USA
- Department of Biomedical Engineering, Duke University, Durham, NC 27708-0187, USA
10
Fetsch CR, Noppeney U. How the brain controls decision making in a multisensory world. Philos Trans R Soc Lond B Biol Sci 2023; 378:20220332. [PMID: 37545306] [PMCID: PMC10404917] [DOI: 10.1098/rstb.2022.0332]
Abstract
Sensory systems evolved to provide the organism with information about the environment to guide adaptive behaviour. Neuroscientists and psychologists have traditionally considered each sense independently, a legacy of Aristotle and a natural consequence of their distinct physical and anatomical bases. However, from the point of view of the organism, perception and sensorimotor behaviour are fundamentally multi-modal; after all, each modality provides complementary information about the same world. Classic studies revealed much about where and how sensory signals are combined to improve performance, but these tended to treat multisensory integration as a static, passive, bottom-up process. It has become increasingly clear how this approach falls short, ignoring the interplay between perception and action, the temporal dynamics of the decision process and the many ways by which the brain can exert top-down control of integration. The goal of this issue is to highlight recent advances on these higher order aspects of multisensory processing, which together constitute a mainstay of our understanding of complex, natural behaviour and its neural basis. This article is part of the theme issue 'Decision and control processes in multisensory perception'.
Affiliation(s)
- Christopher R. Fetsch
- Solomon H. Snyder Department of Neuroscience, Zanvyl Krieger Mind/Brain Institute, Johns Hopkins University, Baltimore, MD 21218, USA
- Uta Noppeney
- Donders Institute for Brain, Cognition and Behaviour, Radboud University, 6525 EN Nijmegen, Netherlands
11
King CD, Lovich SN, Murphy DLK, Landrum R, Kaylie D, Shera CA, Groh JM. Individual similarities and differences in eye-movement-related eardrum oscillations (EMREOs). bioRxiv 2023:2023.03.09.531896. [PMID: 36945521] [PMCID: PMC10028987] [DOI: 10.1101/2023.03.09.531896]
Abstract
We recently discovered a unique type of low-frequency otoacoustic emission (OAE) time-locked to the onset (and offset) of saccadic eye movements and occurring in the absence of external sound (Gruters et al., 2018). How and why these eye-movement-related eardrum oscillations (EMREOs) are generated is unknown, with a role in visual-auditory integration being the likeliest candidate. Clues to both the drivers of EMREOs and their purpose can be gleaned by examining responses in normal-hearing human subjects. Do EMREOs occur in all individuals with normal hearing? If so, what components of the response occur most consistently? Understanding which attributes of EMREOs are similar across participants and which show more variability will provide the groundwork for future comparisons with individuals with hearing abnormalities affecting the ear's various motor components. Here we report that in subjects with normal hearing thresholds and normal middle ear function, all ears show (a) measurable EMREOs (mean: 58.7 dB SPL; range 45-67 dB SPL for large contralateral saccades), (b) a phase reversal for contra- versus ipsilaterally-directed saccades, (c) a large peak in the signal occurring soon after saccade onset, (d) an additional large peak time-locked to saccade offset, and (e) evidence that saccade duration is encoded in the signal. We interpret the attributes of EMREOs that are most consistent across subjects as the ones most likely to play an essential role in their function. The individual differences likely reflect normal variation in individuals' auditory system anatomy and physiology, much like traditional measures of auditory function such as auditory-evoked OAEs, tympanometry and auditory-evoked potentials. Future work will compare subjects with different types of auditory dysfunction to population data from normal-hearing subjects. Overall, these findings provide important context for the widespread observations of visual- and eye-movement-related signals found in cortical and subcortical auditory areas of the brain.
12
Lovich SN, King CD, Murphy DLK, Abbasi H, Bruns P, Shera CA, Groh J. Conserved features of eye movement related eardrum oscillations (EMREOs) across humans and monkeys. bioRxiv 2023:2023.03.08.531768. [PMID: 36945629] [PMCID: PMC10028923] [DOI: 10.1101/2023.03.08.531768]
Abstract
Auditory and visual information involve different coordinate systems, with auditory spatial cues anchored to the head and visual spatial cues anchored to the eyes. Information about eye movements is therefore critical for reconciling visual and auditory spatial signals. The recent discovery of eye movement-related eardrum oscillations (EMREOs) suggests that this process could begin as early as the auditory periphery. How this reconciliation might happen remains poorly understood. Because humans and monkeys both have mobile eyes and therefore both must perform this shift of reference frames, comparison of the EMREO across species can provide insights into shared, and therefore important, parameters of the signal. Here we show that rhesus monkeys, like humans, have a consistent, significant EMREO signal that carries parametric information about eye displacement as well as onset times of eye movements. The dependence of the EMREO on the horizontal displacement of the eye is its most consistent feature and is shared across behavioral tasks, subjects, and species. Differences chiefly involve the waveform frequency (higher in monkeys than in humans), the patterns of individual variation (more prominent in monkeys than in humans), and the waveform of the EMREO once factors due to horizontal and vertical eye displacements were controlled for.
13
Leszczynski M, Bickel S, Nentwich M, Russ BE, Parra L, Lakatos P, Mehta A, Schroeder CE. Saccadic modulation of neural excitability in auditory areas of the neocortex. Curr Biol 2023; 33:1185-1195.e6. [PMID: 36863343] [PMCID: PMC10424710] [DOI: 10.1016/j.cub.2023.02.018]
Abstract
In natural "active" vision, humans and other primates use eye movements (saccades) to sample bits of information from visual scenes. In the visual cortex, non-retinal signals linked to saccades shift visual cortical neurons into a high excitability state as each saccade ends. The extent of this saccadic modulation outside of the visual system is unknown. Here, we show that during natural viewing, saccades modulate excitability in numerous auditory cortical areas with a temporal pattern complementary to that seen in visual areas. Control somatosensory cortical recordings indicate that the temporal pattern is unique to auditory areas. Bidirectional functional connectivity patterns suggest that these effects may arise from regions involved in saccade generation. We propose that by using saccadic signals to yoke excitability states in auditory areas to those in visual areas, the brain can improve information processing in complex natural settings.
Affiliation(s)
- Marcin Leszczynski
- Departments of Psychiatry and Neurology, Columbia University College of Physicians and Surgeons, New York, NY 10032, USA; Translational Neuroscience Lab Division, Center for Biomedical Imaging and Neuromodulation, Nathan Kline Institute, Orangeburg, NY 10962, USA; Cognitive Science Department, Institute of Philosophy, Jagiellonian University, Krakow 31-007, Poland
- Stephan Bickel
- Translational Neuroscience Lab Division, Center for Biomedical Imaging and Neuromodulation, Nathan Kline Institute, Orangeburg, NY 10962, USA; The Feinstein Institutes for Medical Research, Northwell Health, Manhasset, NY 11030, USA; Departments of Neurosurgery and Neurology, Zucker School of Medicine at Hofstra/Northwell, Manhasset, NY 11549, USA
- Maximilian Nentwich
- Biomedical Engineering Department, City College, CUNY, New York, NY 10031, USA
- Brian E Russ
- Translational Neuroscience Lab Division, Center for Biomedical Imaging and Neuromodulation, Nathan Kline Institute, Orangeburg, NY 10962, USA; Nash Family Department of Neuroscience and Friedman Brain Institute, Icahn School of Medicine at Mount Sinai, New York, NY 10029, USA; Department of Psychiatry, New York University at Langone, New York, NY 10016, USA
- Lucas Parra
- Biomedical Engineering Department, City College, CUNY, New York, NY 10031, USA
- Peter Lakatos
- Translational Neuroscience Lab Division, Center for Biomedical Imaging and Neuromodulation, Nathan Kline Institute, Orangeburg, NY 10962, USA; Department of Psychiatry, New York University at Langone, New York, NY 10016, USA
- Ashesh Mehta
- The Feinstein Institutes for Medical Research, Northwell Health, Manhasset, NY 11030, USA; Departments of Neurosurgery and Neurology, Zucker School of Medicine at Hofstra/Northwell, Manhasset, NY 11549, USA
- Charles E Schroeder
- Departments of Psychiatry and Neurology, Columbia University College of Physicians and Surgeons, New York, NY 10032, USA; Translational Neuroscience Lab Division, Center for Biomedical Imaging and Neuromodulation, Nathan Kline Institute, Orangeburg, NY 10962, USA

14
Köhler MHA, Weisz N. Cochlear Theta Activity Oscillates in Phase Opposition during Interaural Attention. J Cogn Neurosci 2023; 35:588-602. [PMID: 36626349] [DOI: 10.1162/jocn_a_01959]
Abstract
It is widely established that sensory perception is a rhythmic rather than a continuous process. In the context of auditory perception, this effect has so far been established only at the cortical and behavioral levels. Yet, the unique architecture of the auditory sensory system allows its primary sensory cortex to modulate the processes of its sensory receptors at the cochlear level. Previously, we demonstrated the existence of a genuine cochlear theta (∼6-Hz) rhythm that is modulated in amplitude by intermodal selective attention. As that study's paradigm was not suited to assessing attentional effects on the oscillatory phase of cochlear activity, the question of whether attention can also affect the temporal organization of the cochlea's ongoing activity remained open. The present study uses an interaural attention paradigm to investigate ongoing otoacoustic activity during a stimulus-free cue-target interval and an omission period of the auditory target in humans. We were able to replicate the existence of the cochlear theta rhythm. Importantly, we found significant phase opposition between the two ears and attention conditions in anticipatory as well as cochlear oscillatory activity during target presentation. The amplitude, however, was unaffected by interaural attention. These results are the first to demonstrate that intermodal and interaural attention deploy different aspects of excitation and inhibition at the first level of auditory processing: whereas intermodal attention modulates the level of cochlear activity, interaural attention modulates its timing.
Affiliation(s)
- Nathan Weisz
- University of Salzburg; Paracelsus Medical University, Salzburg, Austria

15
Popov T, Gips B, Weisz N, Jensen O. Brain areas associated with visual spatial attention display topographic organization during auditory spatial attention. Cereb Cortex 2023; 33:3478-3489. [PMID: 35972419] [PMCID: PMC10068281] [DOI: 10.1093/cercor/bhac285]
Abstract
Spatially selective modulation of alpha power (8-14 Hz) is a robust finding in electrophysiological studies of visual attention and has recently been generalized to auditory spatial attention. This modulation pattern is interpreted as reflecting a top-down mechanism for suppressing distracting input from unattended directions of sound origin. The present study on auditory spatial attention extends this interpretation by demonstrating that alpha power modulation is closely linked to oculomotor action. We designed an auditory paradigm in which participants were required to attend to upcoming sounds from one of 24 loudspeakers arranged in a circular array around the head. Maintaining the location of an auditory cue was associated with a topographically modulated distribution of posterior alpha power resembling the findings known from visual attention. Multivariate analyses allowed the prediction of the sound location in the horizontal plane. Importantly, this prediction was also possible when derived from signals capturing saccadic activity. A control experiment on auditory spatial attention confirmed that, in the absence of any visual or auditory input, lateralization of alpha power is linked to the lateralized direction of gaze. Attending to an auditory target thus engages oculomotor and visual cortical areas in a topographic manner akin to the retinotopic organization associated with visual attention.
Affiliation(s)
- Tzvetan Popov
- Methods of Plasticity Research, Department of Psychology, University of Zurich, Zurich, Switzerland
- Department of Psychology, University of Konstanz, Konstanz, Germany
- Bart Gips
- NATO Science and Technology Organization Centre for Maritime Research and Experimentation (CMRE), La Spezia 19126, Italy
- Nathan Weisz
- Centre for Cognitive Neuroscience and Department of Psychology, University of Salzburg, Salzburg, Austria
- Ole Jensen
- School of Psychology, University of Birmingham, Birmingham, UK

16
Human middle-ear muscle pulls change tympanic-membrane shape and low-frequency middle-ear transmission magnitudes and delays. Hear Res 2023; 430:108721. [PMID: 36821982] [DOI: 10.1016/j.heares.2023.108721]
Abstract
The three-bone flexible ossicular chain in mammals may allow independent alterations of middle-ear (ME) sound transmission via its two attached muscles, for both acoustic and non-acoustic stimuli. The tensor tympani (TT) muscle, which has its insertion on the malleus neck, is thought to increase tension of the tympanic membrane (TM). The stapedius (St) muscle, which has its insertion on the stapes posterior crus, is known to stiffen the stapes annular ligament. We produced ME changes in human cadaveric temporal bones by statically pulling on the TT and St muscles. The 3D static TM shape and sound-induced umbo motions from 20 Hz to 10 kHz were measured with optical coherence tomography (OCT); stapes motion was measured using laser-Doppler vibrometry (LDV). TT pulls made the TM shape more conical and moved the umbo medially, while St pulls moved the umbo laterally. In response to sound below about 1 kHz, stapes-velocity magnitudes generally decreased by about 10 dB due to TT pulls and 5 dB due to St pulls. In the 250 to 500 Hz region, the group delay calculated from stapes-velocity phase showed a decrease in transmission delay of about 150 µs by TT pulls and 60 µs by St pulls. Our interpretation of these results is that ME-muscle activity may provide a way of mechanically changing interaural time- and level-difference cues. These effects could help the brain align head-centered auditory and ocular-centered visual representations of the environment.
17
Schroeer A, Andersen MR, Rank ML, Hannemann R, Petersen EB, Rønne FM, Strauss DJ, Corona-Strauss FI. Assessment of Vestigial Auriculomotor Activity to Acoustic Stimuli Using Electrodes In and Around the Ear. Trends Hear 2023; 27:23312165231200158. [PMID: 37830146] [PMCID: PMC10588413] [DOI: 10.1177/23312165231200158]
Abstract
Recently, it has been demonstrated that electromyographic (EMG) activity of auricular muscles in humans, especially the postauricular muscle (PAM), depends on the spatial location of auditory stimuli. This observation has only been shown using wet electrodes placed directly on auricular muscles. To move towards a more applied, out-of-the-laboratory setting, this study aims to investigate if similar results can be obtained using electrodes placed in custom-fitted earpieces. Furthermore, with the exception of the ground electrode, only dry-contact electrodes were used to record EMG signals, which require little to no skin preparation and can therefore be applied extremely fast. In two experiments, auditory stimuli were presented to ten participants from different spatial directions. In experiment 1, stimuli were rapid onset naturalistic stimuli presented in silence, and in experiment 2, the corresponding participant's first name, presented in a "cocktail party" environment. In both experiments, ipsilateral responses were significantly larger than contralateral responses. Furthermore, machine learning models objectively decoded the direction of stimuli significantly above chance level on a single trial basis (PAM: ≈ 80%, in-ear: ≈ 69%). There were no significant differences when participants repeated the experiments after several weeks. This study provides evidence that auricular muscle responses can be recorded reliably using an almost entirely dry-contact in-ear electrode system. The location of the PAM, and the fact that in-ear electrodes can record comparable signals, would make hearing aids interesting devices to record these auricular EMG signals and potentially utilize them as control signals in the future.
Affiliation(s)
- Andreas Schroeer
- Systems Neuroscience and Neurotechnology Unit, Faculty of Medicine, Saarland University and School of Engineering, htw saar, Homburg/Saar, Germany
- Center for Digital Neurotechnologies Saar, Homburg/Saar, Germany
- Eline Borch Petersen
- WS Audiology AS, Erlangen, Germany
- Scientific Audiology Department, WS Audiology AS, Lynge, Denmark
- Filip Marchman Rønne
- WS Audiology AS, Erlangen, Germany
- Scientific Audiology Department, WS Audiology AS, Lynge, Denmark
- Daniel J. Strauss
- Systems Neuroscience and Neurotechnology Unit, Faculty of Medicine, Saarland University and School of Engineering, htw saar, Homburg/Saar, Germany
- Center for Digital Neurotechnologies Saar, Homburg/Saar, Germany
- Key Numerics – Neurocognitive Technologies GmbH, Saarbruecken, Germany
- Farah I. Corona-Strauss
- Systems Neuroscience and Neurotechnology Unit, Faculty of Medicine, Saarland University and School of Engineering, htw saar, Homburg/Saar, Germany
- Center for Digital Neurotechnologies Saar, Homburg/Saar, Germany
- Key Numerics – Neurocognitive Technologies GmbH, Saarbruecken, Germany

18
The Readiness Potential Correlates with Action-Linked Modulation of Visual Accuracy. eNeuro 2022; 9:ENEURO.0085-22.2022. [PMID: 36351819] [PMCID: PMC9698660] [DOI: 10.1523/eneuro.0085-22.2022]
Abstract
Visual accuracy is consistently shown to be modulated around the time of action execution. The neural underpinning of this motor-induced modulation of visual perception is still unclear. Here, we investigate with EEG whether it is related to the readiness potential, an event-related potential (ERP) linked to motor preparation. Across 18 human participants, the magnitude of visual modulation following a voluntary button press was found to correlate with the readiness potential amplitude measured during visual discrimination. The amplitude of participants' readiness potential in a purely motor task also correlated with the extent of the motor-induced modulation of visual perception in the visuomotor task. These results provide strong evidence that perceptual changes close to action execution are associated with motor preparation processes and that this mechanism is independent of task contingencies. Further, our findings suggest that the readiness potential provides a fingerprint of individual visuomotor interaction.
19
Artiran S, Ravisankar R, Luo S, Chukoskie L, Cosman P. Measuring Social Modulation of Gaze in Autism Spectrum Condition With Virtual Reality Interviews. IEEE Trans Neural Syst Rehabil Eng 2022; 30:2373-2384. [PMID: 35969548] [DOI: 10.1109/tnsre.2022.3198933]
Abstract
Gaze behavior in dyadic conversations can indicate active listening and attention. However, gaze behavior that differs from the engagement cues expected during neurotypical social interaction may be interpreted as uninterested or inattentive, which can be problematic in both personal and professional situations. Neurodivergent individuals, such as those with autism spectrum conditions, often exhibit differences in social communication broadly, including in gaze behavior. This project aims to support situational social gaze practice through a virtual reality (VR) mock job interview using the HTC Vive Pro Eye VR headset. We show how gaze behavior varies in the mock job interview between neurodivergent and neurotypical participants. We also investigate the social modulation of gaze behavior based on conversational role (speaking and listening). Our three main contributions are: (i) a system for fully automatic analysis of social modulation of gaze behavior using a portable VR headset with a novel realistic mock job interview, (ii) a signal processing pipeline, which employs Kalman filtering and spatial-temporal density-based clustering techniques, that can improve the accuracy of the headset's built-in eye-tracker, and (iii) being the first to investigate social modulation of gaze behavior among neurotypical and neurodivergent individuals in immersive VR.
20
Tasko SM, Deiters KK, Flamme GA, Smith MV, Murphy WJ, Jones HG, Greene NT, Ahroon WA. Effects of unilateral eye closure on middle ear muscle contractions. Hear Res 2022; 424:108594. [DOI: 10.1016/j.heares.2022.108594]
21
Comparing Vestibular Responses to Linear and Angular Whole-Body Accelerations in Real and Immersive Environments. Ann Biomed Eng 2022; 50:575-586. [PMID: 35325362] [DOI: 10.1007/s10439-022-02947-8]
Abstract
The vestibular end organs differ in terms of anatomical and physiological characteristics. Stimuli from different sensory modalities, including visual stimuli and vestibular sensation, can influence these organs differently. This paper explores differences between vestibular responses to axial tilts in physical and virtual environments. Four passive whole-body movements (linear: up-down; angular: yaw, pitch, and roll) were applied to twenty-seven healthy participants, once using a hydraulic chair (physical) and once visually using a head-mounted display (virtual). Electrovestibulography (EVestG) was used as the outcome measure to investigate the magnitude of vestibular-response change in both ears for physical and virtual stimuli. Three features, average action potential (AP) area, AP amplitude, and mean detected firing-rate change, were used as indices of response. The results show that for both physical and virtual stimuli, (1) pitch and roll tilts generally produce the largest EVestG changes compared with other tilts, (2) roll and pitch tilt responses are not significantly different from each other, and (3) right-side and left-side roll tilt responses are not significantly different. The findings indicate that although visually and physically induced vestibular responses differ in terms of afferent activity, visual stimuli can still produce distinct responses to different axial tilts.
22
Sou KL, Say A, Xu H. Unity Assumption in Audiovisual Emotion Perception. Front Neurosci 2022; 16:782318. [PMID: 35310087] [PMCID: PMC8931414] [DOI: 10.3389/fnins.2022.782318]
Abstract
We experience various sensory stimuli every day, and the brain must integrate them across the senses. How does this integration occur? What are the inherent mechanisms in this integration? The “unity assumption” proposes that a perceiver's belief in the unity of individual unisensory information modulates the degree of multisensory integration. However, this has yet to be verified or quantified in the context of semantic emotion integration. In the present study, we investigate the ability of subjects to judge the intensities and degrees of similarity in faces and voices of two emotions (angry and happy). We found more similar stimulus intensities to be associated with stronger likelihoods of the face and voice being integrated. More interestingly, multisensory integration in emotion perception was observed to follow a Gaussian distribution as a function of the emotion intensity difference between the face and voice, with the optimal cut-off at about a 2.50-point difference on a 7-point Likert scale. This provides a quantitative estimation of the multisensory integration function in audio-visual semantic emotion perception with regard to stimulus intensity. Moreover, to investigate the variation of multisensory integration across the population, we examined the effects of personality and autistic traits of participants. Here, we found no correlation of autistic traits with unisensory processing in a nonclinical population. Our findings shed light on the current understanding of multisensory integration mechanisms.
Affiliation(s)
- Ka Lon Sou
- Psychology, School of Social Sciences, Nanyang Technological University, Singapore, Singapore
- Humanities, Arts and Social Sciences, Singapore University of Technology and Design, Singapore, Singapore
- Ashley Say
- Psychology, School of Social Sciences, Nanyang Technological University, Singapore, Singapore
- Hong Xu
- Psychology, School of Social Sciences, Nanyang Technological University, Singapore, Singapore

23
Tsai TI, Dlugaiczyk J, Bardins S, Huppert D, Brandt T, Wuehr M. Physiological oculo-auricular-facial-mandibular synkinesis elicited in humans by gaze deviations. J Neurophysiol 2022; 127:984-994. [PMID: 35235436] [DOI: 10.1152/jn.00199.2021]
Abstract
Integrated motor behaviors involving ocular motion-associated movements of the head, neck, pinna, and parts of the face are commonly seen in animals orienting to a visual target. A number of coordinated movements, possibly vestiges of these orienting behaviors, have also been observed in humans making rapid gaze shifts to horizontal extremes. Since such integrated mechanisms point to a non-pathological co-activation of several anatomically separate cranial circuits in humans, it is important to see how the different pairs of integrative motor behaviors with a common trigger (i.e., ocular motion) manifest in relation to one another. Here, we systematically examined the pattern of eye movement-induced recruitment of multiple cranial muscles in humans. Simultaneous video-oculography and bilateral surface electromyograms of the transverse auricular, temporalis, frontalis, and masseter muscles were recorded in 15 healthy subjects (8 females; 29.3 ± 5.2 years) while they made head-fixed, horizontal saccadic, pursuit, and optokinetic eye movements. Potential chin laterotrusion linked to contractions of masticatory muscles was captured with a yaw-fixed accelerometer. Our findings objectively show an orchestrated aural-facial-masticatory muscle response to a range of horizontal eye movements (prevalence of 21-93%). These responses were most prominent during eccentric saccades. We further reveal distinctions between the various observed activation patterns in terms of their profile (transient or sustained), laterality (with respect to direction of gaze), and timing (with respect to saccade onset). Possible underlying neural substrates, their atavistic behavioral significance, and potential clinical applications for monitoring sensory attention and designing attention-directed hearing aids are discussed.
Affiliation(s)
- Tina I Tsai
- German Center for Vertigo and Balance Disorders, University Hospital, Ludwig-Maximilians-University Munich, Munich, Germany
- Julia Dlugaiczyk
- German Center for Vertigo and Balance Disorders, University Hospital, Ludwig-Maximilians-University Munich, Munich, Germany; Department of Otorhinolaryngology, Head and Neck Surgery, University Hospital Zurich (USZ), University of Zurich, Switzerland
- Stanislav Bardins
- German Center for Vertigo and Balance Disorders, University Hospital, Ludwig-Maximilians-University Munich, Munich, Germany
- Doreen Huppert
- German Center for Vertigo and Balance Disorders, University Hospital, Ludwig-Maximilians-University Munich, Munich, Germany; Department of Neurology, University Hospital, Ludwig-Maximilians-University Munich, Munich, Germany
- Thomas Brandt
- German Center for Vertigo and Balance Disorders, University Hospital, Ludwig-Maximilians-University Munich, Munich, Germany
- Max Wuehr
- German Center for Vertigo and Balance Disorders, University Hospital, Ludwig-Maximilians-University Munich, Munich, Germany

24
Shera CA. Whistling While it Works: Spontaneous Otoacoustic Emissions and the Cochlear Amplifier. J Assoc Res Otolaryngol 2022; 23:17-25. [PMID: 34981262] [PMCID: PMC8782959] [DOI: 10.1007/s10162-021-00829-9]
Abstract
Perhaps the most striking evidence for active processes operating within the inner ears of mammals and non-mammals alike is their ability to spontaneously produce sound. Predicted by Thomas Gold in 1948, some 30 years prior to their discovery, the narrow-band sounds now known as spontaneous otoacoustic emissions (SOAEs) remain incompletely understood, their origins controversial. Without a single equation in the main text, we review the essential concepts underlying the "local-" and "global-oscillator" frameworks for understanding SOAE generation. Comparing their key assumptions and predictions, we relate the two frameworks to unresolved questions about the biophysical mechanisms of cochlear amplification.
Affiliation(s)
- Christopher A Shera
- Caruso Department of Otolaryngology and Department of Physics & Astronomy, University of Southern California, Los Angeles, CA 90033, USA

25
Delayed Auditory Brainstem Responses (ABR) in children after sight-recovery. Neuropsychologia 2021; 163:108089. [PMID: 34801518] [DOI: 10.1016/j.neuropsychologia.2021.108089]
Abstract
Studies in non-human animal models have revealed that in early development, the onset of visual input gates the critical-period closure of some auditory functions. The study of rare individuals whose sight was restored after a period of congenital blindness offers a unique opportunity to assess whether early visual input is a prerequisite for the full development of auditory functions in humans as well. Here, we investigated whether a few months of delayed visual onset would affect the development of Auditory Brainstem Responses (ABRs). ABRs are widely used in clinical practice to assess both the functionality and the development of the subcortical auditory pathway, and they provide reliable data at the individual level. We collected ABRs from two case studies: young children (both under 5 years of age) who experienced transient visual deprivation from birth due to congenital bilateral dense cataracts (BC) and who acquired sight at about two months of age. As controls, we tested 41 children with typical development (sighted controls, SC), as well as two children who were treated (at about two months of age) for congenital monocular cataracts (MC). The SC group data served to predict, at the individual level, the wave latencies of each BC and MC participant. Statistics were performed at both the single-subject and group levels on the latencies of the main ABR waves (I, III, V, and SN10). Results revealed delayed response latencies for both BC children compared with the SC group, starting from wave III. Conversely, no difference emerged between the MC children and the SC group. These findings suggest that when the onset of patterned visual input is delayed, the functional development of the subcortical auditory pathway lags behind typical developmental trajectories. Ultimately, these results support the presence of a crossmodal sensitive period in the human subcortical auditory system.
26
Thompson-Bell J, Martin A, Hobkinson C. ‘Unusual ingredients’: Developing a cross-domain model for multisensory artistic practice linking food and music. International Journal of Food Design 2021. [DOI: 10.1386/ijfd_00032_1]
Abstract
This article explores linkages between sensory experiences of food and music in light of recent research from gastrophysics, 4E cognition (i.e. embodied, embedded, extended and enactive) and ecological perception theory. Drawing on these research disciplines, this article outlines a model for multisensory artistic practice, and a taxonomy of cross-domain creative strategies, based on the identification of sensory affordances between the domains of food and music. Food objects are shown to ‘afford’ cross-domain interrelationships with sound stimuli based on our capacity to sense their material characteristics, and to make sense of them through prior experience and contextual association. We propose that multisensory artistic works can themselves afford extended forms of sensory awareness by synthesizing and mediating stimuli across the selected domains, in order to form novel, or unexpected, sensory linkages. These ideas are explored with reference to an ongoing artistic research project entitled ‘Unusual ingredients’, creating new music to complement and enhance the characteristics of selected food.
Affiliation(s)
- Caroline Hobkinson
- Independent Artist and Fellow, Royal Anthropological Institute
27
Abstract
Coordination between different sensory systems is a necessary element of sensory processing. Where and how signals from different sense organs converge onto common neural circuitry have become topics of increasing interest in recent years. In this article, we focus specifically on visual-auditory interactions in areas of the mammalian brain that are commonly considered to be auditory in function. The auditory cortex and inferior colliculus are two key points of entry where visual signals reach the auditory pathway, and both contain visual- and/or eye movement-related signals in humans and other animals. The visual signals observed in these auditory structures reflect a mixture of visual modulation of auditory-evoked activity and visually driven responses that are selective for stimulus location or features. These key response attributes also appear in the classic visual pathway but may play a different role in the auditory pathway: to modify auditory rather than visual perception. Finally, while this review focuses on two particular areas of the auditory pathway where this question has been studied, robust descending as well as ascending connections within this pathway suggest that undiscovered visual signals may be present at other stages as well. Expected final online publication date for the Annual Review of Vision Science, Volume 7 is September 2021. Please see http://www.annualreviews.org/page/journal/pubdates for revised estimates.
Affiliation(s)
- Meredith N Schmehl
- Department of Neurobiology; Center for Cognitive Neuroscience; Duke Institute for Brain Sciences, Duke University, Durham, North Carolina 27708, USA
- Jennifer M Groh
- Department of Neurobiology; Department of Psychology & Neuroscience; Department of Computer Science; Department of Biomedical Engineering; Center for Cognitive Neuroscience; Duke Institute for Brain Sciences, Duke University, Durham, North Carolina 27708, USA
28
Gallagher L, Diop M, Olson ES. Time-domain and frequency-domain effects of tensor tympani contraction on middle ear sound transmission in gerbil. Hear Res 2021; 405:108231. [PMID: 33915400] [PMCID: PMC8113157] [DOI: 10.1016/j.heares.2021.108231]
Abstract
The middle ear is a high-fidelity, broadband impedance transformer that transmits acoustic stimuli at the eardrum to the inner ear. It is home to the two smallest muscles in mammalian species, which modulate middle ear transmission. Of this pair, the function of the tensor tympani muscle (TTM) has remained obscure. We investigated the acoustic effects of this muscle in young adult gerbils. We measured changes in middle ear vibration produced by pulse-train-elicited TTM contraction, in the time domain with a click stimulus and in the frequency domain with multitone zwuis stimuli. In our click experiments, there was generally a small reduction in the primary peak of the response and a slight increase in the subsequent ringing, but there was essentially no change in the delay of the click response at the umbo (less than 1 µs change). In our multitone experiments, there were consistent patterns of attenuation and enhancement in the velocity responses at the umbo and ossicles. TTM contraction produced a narrow band of enhancement around 6 kHz (maximally ~5 dB) that can be modeled with an increased stiffness of an overdamped spring-mass resonance. At frequencies below 2 kHz and above 35 kHz, TTM contraction attenuated middle ear vibrations by as much as fivefold.
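The stiffness interpretation above can be made concrete with a toy driven spring-mass-damper: raising the stiffness attenuates velocity well below resonance and boosts it near the new, higher resonance. The parameter values below (mass, damping, rest and contracted stiffness) are arbitrary illustrations, not the paper's fitted values, so the frequencies are in a toy range rather than kHz.

```python
import math

def velocity_change_db(freqs_hz, m=1.0, c=8.0, k_rest=1.0e3, k_tense=1.5e3):
    """dB change in velocity magnitude of a driven spring-mass-damper when
    stiffness rises from k_rest to k_tense (toy model of TTM contraction)."""
    def vmag(f, k):
        # |V| for unit sinusoidal force: omega / sqrt((k - m*omega^2)^2 + (c*omega)^2)
        w = 2 * math.pi * f
        return w / math.sqrt((k - m * w * w) ** 2 + (c * w) ** 2)
    return [20 * math.log10(vmag(f, k_tense) / vmag(f, k_rest)) for f in freqs_hz]
```

With these toy parameters the stiffened resonance sits near sqrt(k_tense/m)/2π ≈ 6.2 Hz, so the low-frequency attenuation and near-resonance enhancement mirror, in shape only, the sub-2 kHz attenuation and ~6 kHz boost reported.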
Affiliation(s)
- Liam Gallagher
- OTO/HNS, Columbia University Medical Center, New York, United States
- Mohamed Diop
- OTO/HNS, Columbia University Medical Center, New York, United States
- Elizabeth S Olson
- OTO/HNS and BME, Columbia University, 630 W 168th Street, New York, NY 10032, United States
29
Bell A, Jedrzejczak WW. Muscles in and around the ear as the source of "physiological noise" during auditory selective attention: A review and novel synthesis. Eur J Neurosci 2021; 53:2726-2739. [PMID: 33484588] [DOI: 10.1111/ejn.15122]
Abstract
The sensitivity of the auditory system is regulated via two major efferent pathways: the medial olivocochlear system, which connects to the outer hair cells, and the middle ear muscles, the tensor tympani and stapedius. The role of the former system in suppressing otoacoustic emissions has been extensively studied, but that of the complementary network has not. In studies of selective attention, decreases in otoacoustic emissions from contralateral stimulation have been ascribed to the medial olivocochlear system, but the acknowledged problem is that the results can be confounded by parallel muscle activity. Here, the potential role of the muscle system is examined through a wide but not exhaustive review of the selective attention literature, and the unifying hypothesis is made that the prominent "physiological noise" detected in such experiments, which is reduced during attention, is the sound produced by the muscles in proximity to the ear, including the middle ear muscles. All muscles produce low-frequency sound during contraction, but the implications for selective attention experiments, in which muscles near the ear are likely to be active, have not been adequately considered. This review and synthesis suggests that selective attention may reduce physiological noise in the ear canal by reducing the activity of muscles close to the ear. Indeed, such an experiment has already been done, but the significance of its findings has not been widely appreciated. Further sets of experiments are needed in this area.
Affiliation(s)
- Andrew Bell
- Eccles Institute of Neuroscience, John Curtin School of Medical Research, Australian National University, Canberra, ACT, Australia
30
Mohl JT, Pearson JM, Groh JM. Monkeys and humans implement causal inference to simultaneously localize auditory and visual stimuli. J Neurophysiol 2020; 124:715-727. [PMID: 32727263] [DOI: 10.1152/jn.00046.2020]
Abstract
The environment is sampled by multiple senses, which are woven together to produce a unified perceptual state. However, optimally unifying such signals requires assigning particular signals to the same or different underlying objects or events. Many prior studies (especially in animals) have assumed fusion of cross-modal information, whereas recent work in humans has begun to probe the appropriateness of this assumption. Here we present results from a novel behavioral task in which both monkeys (Macaca mulatta) and humans localized visual and auditory stimuli and reported their perceived sources through saccadic eye movements. When the locations of visual and auditory stimuli were widely separated, subjects made two saccades, whereas when the two stimuli were presented at the same location they made only a single saccade. Intermediate levels of separation produced mixed response patterns: a single saccade to an intermediate position on some trials or separate saccades to both locations on others. The distribution of responses was well described by a hierarchical causal inference model that accurately predicted both the explicit "same vs. different" source judgments and the biases in localization of the source(s) under each of these conditions. The results from this task are broadly consistent with prior work in humans across a wide variety of analogous tasks, extending the study of multisensory causal inference to nonhuman primates and to a natural behavioral task with both a categorical assay of the number of perceived sources and a continuous report of the perceived position of the stimuli.

NEW & NOTEWORTHY: We developed a novel behavioral paradigm for the study of multisensory causal inference in both humans and monkeys and found that both species make causal judgments in the same Bayes-optimal fashion. To our knowledge, this is the first demonstration of behavioral causal inference in animals, and this cross-species comparison lays the groundwork for future experiments using neuronal recording techniques that are impractical or impossible in human subjects.
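The causal-inference account can be sketched with the standard Bayesian model from the cue-combination literature: weigh the likelihood that a single source generated both cues against the likelihood of two independent sources. The sketch below marginalizes numerically over source position; the sensory-noise and prior parameters are illustrative placeholders, not values fitted to this study's data.

```python
import math

def p_common(x_v, x_a, sd_v=2.0, sd_a=8.0, sd_prior=15.0, prior_common=0.5):
    """Posterior probability that a visual cue at x_v and an auditory cue at
    x_a (degrees) share one source, via numeric marginalization over source
    position (generic Bayesian causal inference, not the paper's fitted model)."""
    def norm(x, mu, sd):
        return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))
    ds = 0.1
    grid = [s / 10.0 for s in range(-600, 601)]  # candidate source positions, deg
    # C = 1: one hidden source s generates both cues.
    l1 = sum(norm(x_v, s, sd_v) * norm(x_a, s, sd_a) * norm(s, 0, sd_prior)
             for s in grid) * ds
    # C = 2: each cue has its own independent source.
    def l_single(x, sd):
        return sum(norm(x, s, sd) * norm(s, 0, sd_prior) for s in grid) * ds
    l2 = l_single(x_v, sd_v) * l_single(x_a, sd_a)
    return l1 * prior_common / (l1 * prior_common + l2 * (1 - prior_common))
```

Coincident cues yield a high common-cause probability (predicting one saccade), widely separated cues a low one (predicting two saccades), and intermediate separations fall in between, which is the qualitative pattern of responses described above.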
Affiliation(s)
- Jeff T Mohl
- Duke Institute for Brain Sciences; Center for Cognitive Neuroscience; Department of Neurobiology, Duke University, Durham, North Carolina
- John M Pearson
- Duke Institute for Brain Sciences; Center for Cognitive Neuroscience; Department of Neurobiology; Department of Psychology and Neuroscience, Duke University, Durham, North Carolina; Department of Biostatistics and Bioinformatics, Duke University Medical School, Durham, North Carolina
- Jennifer M Groh
- Duke Institute for Brain Sciences; Center for Cognitive Neuroscience; Department of Neurobiology; Department of Psychology and Neuroscience, Duke University, Durham, North Carolina
31
Battal C, Occelli V, Bertonati G, Falagiarda F, Collignon O. General Enhancement of Spatial Hearing in Congenitally Blind People. Psychol Sci 2020; 31:1129-1139. [PMID: 32846109] [DOI: 10.1177/0956797620935584]
Abstract
Vision is thought to support the development of spatial abilities in the other senses. If this is true, how does spatial hearing develop in people lacking visual experience? We comprehensively addressed this question by investigating auditory-localization abilities in 17 congenitally blind and 17 sighted individuals using a psychophysical minimum-audible-angle task that lacked sensorimotor confounds. Participants were asked to compare the relative position of two sound sources located in central and peripheral, horizontal and vertical, or frontal and rear spaces. We observed unequivocal enhancement of spatial-hearing abilities in congenitally blind people, irrespective of the field of space that was assessed. Our results conclusively demonstrate that visual experience is not a prerequisite for developing optimal spatial-hearing abilities and that, in striking contrast, the lack of vision leads to a general enhancement of auditory-spatial skills.
Affiliation(s)
- Ceren Battal
- Institute for Research in Psychology, Institute of Neuroscience, Université Catholique de Louvain; Center for Mind/Brain Sciences, University of Trento
- Federica Falagiarda
- Institute for Research in Psychology, Institute of Neuroscience, Université Catholique de Louvain
- Olivier Collignon
- Institute for Research in Psychology, Institute of Neuroscience, Université Catholique de Louvain; Center for Mind/Brain Sciences, University of Trento
32
O'Connell MN, Barczak A, McGinnis T, Mackin K, Mowery T, Schroeder CE, Lakatos P. The Role of Motor and Environmental Visual Rhythms in Structuring Auditory Cortical Excitability. iScience 2020; 23:101374. [PMID: 32738615] [PMCID: PMC7394914] [DOI: 10.1016/j.isci.2020.101374]
Abstract
Previous studies indicate that motor sampling patterns modulate neuronal excitability in sensory brain regions by entraining brain rhythms, a process termed motor-initiated entrainment. In addition, rhythms of the external environment are also capable of entraining brain rhythms. Our first goal was to investigate the properties of motor-initiated entrainment in the auditory system using a prominent visual motor sampling pattern in primates, saccades. Second, we wanted to determine whether and how motor-initiated entrainment interacts with visual environmental entrainment. We examined laminar profiles of neuronal ensemble activity in primary auditory cortex and found that whereas motor-initiated entrainment has a suppressive effect, visual environmental entrainment has an enhancing effect. We also found that these processes are temporally coupled, and their temporal relationship ensures that their effects on excitability are complementary rather than interfering. Altogether, our results demonstrate that motor and sensory systems continuously interact in orchestrating the brain's context for the optimal sampling of our multisensory environment.
Affiliation(s)
- Monica N O'Connell
- Translational Neuroscience Division, Center for Biomedical Imaging and Neuromodulation, Nathan S. Kline Institute for Psychiatric Research, Orangeburg, NY 10962, USA
- Annamaria Barczak
- Translational Neuroscience Division, Center for Biomedical Imaging and Neuromodulation, Nathan S. Kline Institute for Psychiatric Research, Orangeburg, NY 10962, USA
- Tammy McGinnis
- Translational Neuroscience Division, Center for Biomedical Imaging and Neuromodulation, Nathan S. Kline Institute for Psychiatric Research, Orangeburg, NY 10962, USA
- Kieran Mackin
- Translational Neuroscience Division, Center for Biomedical Imaging and Neuromodulation, Nathan S. Kline Institute for Psychiatric Research, Orangeburg, NY 10962, USA
- Todd Mowery
- Center for Neural Science, New York University, 4 Washington Place, New York, NY 10003, USA
- Charles E Schroeder
- Translational Neuroscience Division, Center for Biomedical Imaging and Neuromodulation, Nathan S. Kline Institute for Psychiatric Research, Orangeburg, NY 10962, USA; Departments of Neurological Surgery and Psychiatry, Columbia University College of Physicians and Surgeons, New York, NY 10032, USA
- Peter Lakatos
- Translational Neuroscience Division, Center for Biomedical Imaging and Neuromodulation, Nathan S. Kline Institute for Psychiatric Research, Orangeburg, NY 10962, USA; Department of Psychiatry, New York University School of Medicine, New York, NY 10016, USA
33
Strauss DJ, Corona-Strauss FI, Schroeer A, Flotho P, Hannemann R, Hackley SA. Vestigial auriculomotor activity indicates the direction of auditory attention in humans. eLife 2020; 9:54536. [PMID: 32618268] [PMCID: PMC7334025] [DOI: 10.7554/elife.54536]
Abstract
Unlike dogs and cats, people do not point their ears as they focus attention on novel, salient, or task-relevant stimuli. Our species may nevertheless have retained a vestigial pinna-orienting system that has persisted as a 'neural fossil' within the brain for about 25 million years. Consistent with this hypothesis, we demonstrate that the direction of auditory attention is reflected in sustained electrical activity of muscles within the vestigial auriculomotor system. Surface electromyograms (EMGs) were taken from muscles that either move the pinna or alter its shape. To assess reflexive, stimulus-driven attention, we presented novel sounds from speakers at four different lateral locations while the participants silently read a boring text in front of them. To test voluntary, goal-directed attention, we instructed participants to listen to a short story coming from one of these speakers while ignoring a competing story from the corresponding speaker on the opposite side. In both experiments, EMG recordings showed larger activity at the ear on the side of the attended stimulus, but with slightly different patterns. Upward movement (perking) differed according to the lateral focus of attention only during voluntary orienting; rearward folding of the pinna's upper-lateral edge exhibited such differences only during reflexive orienting. The existence of a pinna-orienting system in humans, one that is experimentally accessible, offers opportunities for basic as well as applied science.

Dogs, cats, monkeys and other animals perk their ears in the direction of sounds they are interested in. Humans and their closest ape relatives, however, appear to have lost this ability. Some humans are able to wiggle their ears, suggesting that some of the brain circuits and muscles that allow automatic ear movements towards sounds are still present. This may be a 'vestigial feature', an ability that is maintained even though it no longer serves its original purpose. Now, Strauss et al. show that vestigial movements of muscles around the ear indicate the direction of sounds a person is paying attention to. In the experiments, human volunteers tried to read a boring text while surprising sounds such as a traffic jam, a crying baby, or footsteps played. During this exercise, Strauss et al. recorded the electrical activity in the muscles of the volunteers' ears to see if the muscles responded to the direction the sound came from. In a second set of experiments, the same electrical recordings were made as participants listened to a podcast while a second podcast was playing from a different direction. The individuals' ears were also recorded using high-resolution video. Both sets of experiments revealed tiny involuntary movements in muscles surrounding the ear closest to the direction of a sound the person was listening to. When the participants tried to listen to one podcast and tune out another, they also made ear 'perking' movements in the direction of their preferred podcast. The results suggest that movements of the vestigial muscles in the human ear indicate the direction of sounds a person is paying attention to. These tiny movements could be used to develop better hearing aids that sense the electrical activity in the ear muscles and amplify sounds the person is trying to focus on, while minimizing other sounds.
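The core contrast reported, "larger activity at the ear on the side of the attended stimulus", can be reduced to an attended-minus-unattended difference in RMS EMG amplitude between the two ears. The helper below is our illustrative reduction, not the authors' analysis pipeline, and all names are ours.

```python
import math

def emg_lateralization(left_emg, right_emg, attend_side):
    """Attended-minus-unattended contrast of RMS surface-EMG amplitude from
    the two ears; positive values mean more activity at the attended ear."""
    def rms(samples):
        return math.sqrt(sum(v * v for v in samples) / len(samples))
    attended, other = ((left_emg, right_emg) if attend_side == "left"
                       else (right_emg, left_emg))
    return rms(attended) - rms(other)
```

A positive contrast when attending left and a negative one when attending right (with the same recordings) would reproduce the reported lateralization pattern.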
Affiliation(s)
- Daniel J Strauss
- Systems Neuroscience and Neurotechnology Unit, Faculty of Medicine, Saarland University & School of Engineering, htw saar, Homburg/Saar, Germany
- Farah I Corona-Strauss
- Systems Neuroscience and Neurotechnology Unit, Faculty of Medicine, Saarland University & School of Engineering, htw saar, Homburg/Saar, Germany
- Andreas Schroeer
- Systems Neuroscience and Neurotechnology Unit, Faculty of Medicine, Saarland University & School of Engineering, htw saar, Homburg/Saar, Germany
- Philipp Flotho
- Systems Neuroscience and Neurotechnology Unit, Faculty of Medicine, Saarland University & School of Engineering, htw saar, Homburg/Saar, Germany
- Ronny Hannemann
- Audiological Research Unit, Sivantos GmbH, Erlangen, Germany
- Steven A Hackley
- Clinical and Cognitive Neuroscience Laboratory, Department of Psychological Sciences, University of Missouri, Columbia, United States
34
No intermodal interference effects of threatening information during concurrent audiovisual stimulation. Neuropsychologia 2019; 136:107283. [PMID: 31783079] [DOI: 10.1016/j.neuropsychologia.2019.107283]
Abstract
Changes in attention can result in sensory processing trade-off effects, in which sensory cortical responses to attended stimuli are heightened and responses to competing distractors are attenuated. However, it is unclear whether competition or facilitation effects are observed at the level of sensory cortex when attending to competing stimuli in two modalities. The present study used electroencephalography (EEG) and frequency tagging to quantitatively assess auditory-visual interactions during sustained multimodal sensory stimulation. The emotional content of a 6.66 Hz rapid serial visual presentation (RSVP) was manipulated to elicit well-established emotional attention effects, while a constant 63 dB tone with a 40.8 Hz modulation served as a concurrent auditory stimulus in two experiments. As a directed-attention manipulation, participants were instructed to detect transient sound-level events in the auditory stream in Experiment 1. To manipulate attention through threat anticipation, participants were instructed to expect an aversive noise burst after a higher 40.8 Hz modulated tone in Experiment 2. Each stimulus evoked reliable steady-state sensory cortical responses in all participants (n = 30) in both experiments. The visual cortical responses were modulated by the auditory detection task, but not by threat anticipation: visual responses were smaller during auditory streams with a transient target than during uninterrupted auditory streams. Conversely, visual stimulus condition had no significant effects on auditory sensory cortical responses in either experiment. These results indicate that visual content has neither a competition nor a facilitation effect on concurrent auditory sensory cortical processing. They further indicate that competition effects of auditory stream content on sustained visuocortical responses are limited to auditory target processing.
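Frequency tagging works because each steady-state response concentrates power exactly at its stimulation frequency (here 6.66 Hz for the visual stream and 40.8 Hz for the auditory tone), so each response can be read out with a single-bin discrete Fourier transform. A generic sketch of that readout, with made-up test frequencies rather than the study's:

```python
import cmath
import math

def tagged_amplitude(signal, fs, f_tag):
    """Peak amplitude of the steady-state component at the tagging frequency:
    project the recording onto a complex exponential at f_tag (a single-bin DFT).
    Assumes the signal spans a whole number of cycles of f_tag."""
    n = len(signal)
    coef = sum(x * cmath.exp(-2j * math.pi * f_tag * k / fs)
               for k, x in enumerate(signal)) / n
    return 2 * abs(coef)  # one-sided (peak) amplitude
```

Because distinct integer-cycle tags are orthogonal over the analysis window, concurrent visual and auditory responses can be separated from the same EEG recording.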
35
Deiters KK, Flamme GA, Tasko SM, Murphy WJ, Greene NT, Jones HG, Ahroon WA. Generalizability of clinically measured acoustic reflexes to brief sounds. J Acoust Soc Am 2019; 146:3993. [PMID: 31795698] [PMCID: PMC7043895] [DOI: 10.1121/1.5132705]
Abstract
Middle ear muscle contractions (MEMC) can be elicited in response to high-level sounds, and have been used clinically as acoustic reflexes (ARs) during evaluations of auditory system integrity. The results of clinical AR evaluations do not necessarily generalize to different signal types or durations. The purpose of this study was to evaluate the likelihood of observing MEMC in response to brief sound stimuli (tones, recorded gunshots, noise) in adult participants (N = 190) exhibiting clinical ARs and excellent hearing sensitivity. Results revealed that the presence of clinical ARs was not a sufficient indication that listeners will also exhibit MEMC for brief sounds. Detection rates varied across stimulus types between approximately 20% and 80%. Probabilities of observing MEMC also differed by clinical AR magnitude and latency, and declined over the period of minutes during the course of the MEMC measurement series. These results provide no support for the inclusion of MEMC as a protective factor in damage-risk criteria for impulsive noises, and the limited predictability of whether a given individual will exhibit MEMC in response to a brief sound indicates a need to measure and control for MEMC in studies evaluating pharmaceutical interventions for hearing loss.
Affiliation(s)
- Kristy K Deiters
- Stephenson and Stephenson Research and Consulting (SASRAC), Forest Grove, Oregon 97116, USA
- Gregory A Flamme
- Stephenson and Stephenson Research and Consulting (SASRAC), Forest Grove, Oregon 97116, USA
- Stephen M Tasko
- Stephenson and Stephenson Research and Consulting (SASRAC), Forest Grove, Oregon 97116, USA
- William J Murphy
- National Institute for Occupational Safety and Health (NIOSH), Cincinnati, Ohio 45226, USA
- Nathaniel T Greene
- United States (U.S.) Army Aeromedical Research Lab (USAARL), Fort Rucker, Alabama 36362, USA
- Heath G Jones
- United States (U.S.) Army Aeromedical Research Lab (USAARL), Fort Rucker, Alabama 36362, USA
- William A Ahroon
- United States (U.S.) Army Aeromedical Research Lab (USAARL), Fort Rucker, Alabama 36362, USA
36
Kaminski JA, Sterzer P, Mishara AL. "Seeing Rain": Integrating phenomenological and Bayesian predictive coding approaches to visual hallucinations and self-disturbances (Ichstörungen) in schizophrenia. Conscious Cogn 2019; 73:102757. [PMID: 31284176] [DOI: 10.1016/j.concog.2019.05.005]
Abstract
We present a schizophrenia patient who reports "seeing rain" with attendant somatosensory features which separate him from his surroundings. Because visual/multimodal hallucinations are understudied in schizophrenia, we examine a case history to determine the role of these hallucinations in self-disturbances (Ichstörungen). Developed by the early Heidelberg School, self-disturbances comprise two components: 1. The self experiences its own automatic processing as alien to self in a split-off, "doubled-I." 2. In "I-paralysis," the disruption to automatic processing is now outside the self in omnipotent agents. Self-disturbances (as indicated by visual/multimodal hallucinations) involve impairment in the ability to predict moment-to-moment experiences in the ongoing perception-action cycle. The phenomenological approach to subjective experience of self-disturbances complements efforts to model psychosis using the computational framework of hierarchical predictive coding. We conclude that self-disturbances play an adaptive, compensatory role following the uncoupling of perception and action, and possibly, other low-level perceptual anomalies.
Affiliation(s)
- J A Kaminski
- Department of Psychiatry and Psychotherapy, Campus Charité Mitte, Charité - Universitätsmedizin, D-10117 Berlin, Germany; Berlin Institute of Health (BIH), D-10117 Berlin, Germany
- P Sterzer
- Department of Psychiatry and Psychotherapy, Campus Charité Mitte, Charité - Universitätsmedizin, D-10117 Berlin, Germany
- A L Mishara
- The Chicago School of Professional Psychology, Los Angeles Campus, Los Angeles, CA, United States
37
Huang Y, Heil P, Brosch M. Associations between sounds and actions in early auditory cortex of nonhuman primates. eLife 2019; 8:43281. [PMID: 30946010] [PMCID: PMC6467566] [DOI: 10.7554/elife.43281]
Abstract
An individual may need to take different actions in response to the same stimulus in different situations to achieve a given goal. The selection of the appropriate action hinges on previously learned associations between stimuli, actions, and outcomes in those situations. Here, using a go/no-go paradigm and a symmetrical reward, we show that early auditory cortex of nonhuman primates represents such associations, in both the spiking activity and the local field potentials. Sound-evoked neuronal responses changed with sensorimotor associations shortly after sound onset, and the neuronal responses were largest when the sound signaled that a no-go response was required in a trial to obtain a reward. Our findings suggest that association processes take place in the auditory system and do not necessarily rely on association cortex. Thus, auditory cortex may contribute to a rapid selection of the appropriate motor responses to sounds during goal-directed behavior.
Affiliation(s)
- Ying Huang
- Special Lab Primate Neurobiology, Leibniz Institute for Neurobiology, Magdeburg, Germany; Center for Behavioral Brain Sciences, Otto-von-Guericke-University, Magdeburg, Germany
- Peter Heil
- Center for Behavioral Brain Sciences, Otto-von-Guericke-University, Magdeburg, Germany; Department Systems Physiology of Learning, Leibniz Institute for Neurobiology, Magdeburg, Germany
- Michael Brosch
- Special Lab Primate Neurobiology, Leibniz Institute for Neurobiology, Magdeburg, Germany; Center for Behavioral Brain Sciences, Otto-von-Guericke-University, Magdeburg, Germany
38
Prior expectation of objects in space is dependent on the direction of gaze. Cognition 2019; 182:220-226. [DOI: 10.1016/j.cognition.2018.10.011]
39
Francis NA, Zhao W, Guinan JJ Jr. Auditory Attention Reduced Ear-Canal Noise in Humans by Reducing Subject Motion, Not by Medial Olivocochlear Efferent Inhibition: Implications for Measuring Otoacoustic Emissions During a Behavioral Task. Front Syst Neurosci 2018; 12:42. [PMID: 30271329] [PMCID: PMC6146202] [DOI: 10.3389/fnsys.2018.00042]
Abstract
Otoacoustic emissions (OAEs) are often measured to non-invasively determine activation of medial olivocochlear (MOC) efferents in humans. Usually these experiments assume that ear-canal noise remains constant. However, changes in ear-canal noise have been reported in some behavioral experiments. We studied the variability of ear-canal noise in eight subjects who performed a two-interval-forced-choice (2IFC) sound-level-discrimination task on monaural tone pips in masker noise. Ear-canal noise was recorded directly from the unstimulated ear opposite the task ear. Recordings were also made with similar sounds presented, but no task done. In task trials, ear-canal noise was reduced at the time the subject did the discrimination, relative to the ear-canal noise level earlier in the trial. In two subjects, there was a decrease in ear-canal noise, primarily at 1-2 kHz, with a time course similar to that expected from inhibition by MOC activity elicited by the task-ear masker noise. These were the only subjects with spontaneous OAEs (SOAEs). We hypothesize that the SOAEs were inhibited by MOC activity elicited by the task-ear masker. Based on the standard rationale in OAE experiments that large bursts of ear-canal noise are artifacts due to subject movement, ear-canal noise bursts above a sound-level criterion were removed. As the criterion was lowered and more high- and moderate-level ear-canal noise bursts were removed, the reduction in ear-canal noise level at the time of the 2IFC discrimination decreased to almost zero, for the six subjects without SOAEs. This pattern is opposite that expected from MOC-induced inhibition (which is greater on lower-level sounds), but can be explained by the hypothesis that subjects move less and create fewer bursts of ear-canal noise when they concentrate on doing the task. In no-task trials for these six subjects, the ear-canal noise level was little changed throughout the trial. 
Our results show that measurements of MOC effects on OAEs must measure and account for changes in ear-canal noise, especially in behavioral experiments. The results also provide a novel way of showing the time course of the buildup of attention via the time course of the reduction in ear-canal noise.
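The artifact-rejection procedure described above (removing ear-canal noise bursts whose level exceeds a sound-level criterion, then lowering that criterion and re-measuring the noise level) can be illustrated with a minimal sketch. This is not the authors' code; the window length, burst levels, and criterion values are assumptions chosen only to show the mechanics of a criterion sweep:

```python
import numpy as np

def rms_db(x):
    """RMS level of a signal segment, in dB re 1.0."""
    return 20 * np.log10(np.sqrt(np.mean(np.square(x))) + 1e-12)

def reject_bursts(segments, criterion_db):
    """Keep only segments whose RMS level is below the criterion."""
    return [s for s in segments if rms_db(s) < criterion_db]

rng = np.random.default_rng(0)
# Simulated trial: 100 short windows of baseline ear-canal noise (~-60 dB),
# with three injected high-level "subject movement" bursts (~-20 dB).
segments = [rng.normal(0, 1e-3, 512) for _ in range(100)]
for i in (10, 40, 70):
    segments[i] = rng.normal(0, 1e-1, 512)

# Sweep the rejection criterion downward and re-measure the mean noise level,
# mirroring the analysis rationale described in the abstract.
for criterion in (-10.0, -30.0):
    kept = reject_bursts(segments, criterion)
    levels = [rms_db(s) for s in kept]
    print(f"criterion {criterion} dB: kept {len(kept)} windows, "
          f"mean level {np.mean(levels):.1f} dB")
```

With a lax criterion, the movement bursts dominate the measured noise level; as the criterion is lowered, the bursts are rejected and the residual level reflects only the baseline noise, which is the quantity compared across trial epochs in the study.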
Affiliation(s)
- Nikolas A. Francis
- Speech and Hearing Bioscience and Technology, Harvard-Massachusetts Institute of Technology (MIT) Division of Health Sciences and Technology, Cambridge, MA, United States; Eaton Peabody Laboratories, Department of Otolaryngology, Massachusetts Eye and Ear, Boston, MA, United States
- Wei Zhao
- Eaton Peabody Laboratories, Department of Otolaryngology, Massachusetts Eye and Ear, Boston, MA, United States; Department of Otolaryngology, Harvard Medical School, Harvard University, Boston, MA, United States
- John J. Guinan Jr.
- Speech and Hearing Bioscience and Technology, Harvard-Massachusetts Institute of Technology (MIT) Division of Health Sciences and Technology, Cambridge, MA, United States; Eaton Peabody Laboratories, Department of Otolaryngology, Massachusetts Eye and Ear, Boston, MA, United States; Department of Otolaryngology, Harvard Medical School, Harvard University, Boston, MA, United States
|
40
|
Mifsud NG, Beesley T, Watson TL, Elijah RB, Sharp TS, Whitford TJ. Attenuation of visual evoked responses to hand and saccade-initiated flashes. Cognition 2018; 179:14-22. [PMID: 29894867 DOI: 10.1016/j.cognition.2018.06.005] [Citation(s) in RCA: 12] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/18/2017] [Revised: 06/04/2018] [Accepted: 06/05/2018] [Indexed: 12/01/2022]
Abstract
Sensory attenuation refers to reduced brain responses to self-initiated sensations relative to those produced by the external world. It is a low-level process that may be linked to higher-level cognitive tasks such as reality monitoring. The phenomenon is often explained by prediction-error mechanisms assumed to apply universally across sensory modalities; however, it is most widely reported for auditory stimuli resulting from self-initiated hand movements. The present series of event-related potential (ERP) experiments explored the generalizability of sensory attenuation to the visual domain by exposing participants to flashes initiated by either their own button press or a volitional saccade and comparing these conditions to identical, computer-initiated stimuli. The key results showed that the largest reduction of anterior visual N1 amplitude occurred for saccade-initiated flashes, while button press-initiated flashes evoked an intermediate response between the saccade-initiated and externally initiated conditions. This indicates that sensory attenuation occurs for visual stimuli and suggests that the degree of electrophysiological attenuation may relate to the causal likelihood of pairings between the type of motor action and the modality of its sensory response.
Affiliation(s)
- Nathan G Mifsud
- School of Psychology, UNSW Sydney, Sydney, New South Wales, Australia
- Tom Beesley
- School of Psychology, UNSW Sydney, Sydney, New South Wales, Australia
- Tamara L Watson
- School of Social Sciences and Psychology, Western Sydney University, Bankstown, New South Wales, Australia
- Ruth B Elijah
- School of Psychology, UNSW Sydney, Sydney, New South Wales, Australia
- Tegan S Sharp
- School of Psychology, UNSW Sydney, Sydney, New South Wales, Australia
- Thomas J Whitford
- School of Psychology, UNSW Sydney, Sydney, New South Wales, Australia
|