1
Snir A, Cieśla K, Ozdemir G, Vekslar R, Amedi A. Localizing 3D motion through the fingertips: Following in the footsteps of elephants. iScience 2024; 27:109820. PMID: 38799571; PMCID: PMC11126990; DOI: 10.1016/j.isci.2024.109820.
Abstract
Each sense serves a specific function in spatial perception, and together the senses form a joint multisensory representation of space. Hearing, for instance, enables localization throughout the entire 3D external space, whereas touch traditionally allows localization only of objects in contact with the body (i.e., within the peripersonal space). We use an in-house touch-motion algorithm (TMA) to evaluate individuals' capability to understand externalized 3D information through touch, a skill acquired neither during individual development nor in evolution. Four experiments demonstrate rapid learning and high accuracy in localizing motion from vibrotactile inputs on the fingertips, as well as successful audio-tactile integration in background noise. Subjective reports from some participants imply spatial experiences through visualization and the perception of tactile "moving" sources beyond reach. We discuss our findings with respect to the development of new skills in the adult brain, including the combination of a newly acquired "sense" with an existing one and computation-based brain organization.
Affiliation(s)
- Adi Snir
- The Baruch Ivcher Institute for Brain, Cognition, and Technology, The Baruch Ivcher School of Psychology, Reichman University, HaUniversita 8, Herzliya 461010, Israel
- Katarzyna Cieśla
- The Baruch Ivcher Institute for Brain, Cognition, and Technology, The Baruch Ivcher School of Psychology, Reichman University, HaUniversita 8, Herzliya 461010, Israel
- World Hearing Centre, Institute of Physiology and Pathology of Hearing, Mokra 17, 05-830 Kajetany, Nadarzyn, Poland
- Gizem Ozdemir
- The Baruch Ivcher Institute for Brain, Cognition, and Technology, The Baruch Ivcher School of Psychology, Reichman University, HaUniversita 8, Herzliya 461010, Israel
- Rotem Vekslar
- The Baruch Ivcher Institute for Brain, Cognition, and Technology, The Baruch Ivcher School of Psychology, Reichman University, HaUniversita 8, Herzliya 461010, Israel
- Amir Amedi
- The Baruch Ivcher Institute for Brain, Cognition, and Technology, The Baruch Ivcher School of Psychology, Reichman University, HaUniversita 8, Herzliya 461010, Israel
2
Shestopalova LB, Petropavlovskaia EA, Salikova DA, Semenova VV. Temporal integration of sound motion: Motion-onset response and perception. Hear Res 2024; 441:108922. PMID: 38043403; DOI: 10.1016/j.heares.2023.108922.
Abstract
The purpose of our study was to estimate the time interval required for integrating the acoustic changes associated with sound motion, using both psychophysical and EEG measures. Healthy listeners performed direction-identification tasks under dichotic conditions in the delayed-motion paradigm. The minimal audible movement angle (MAMA) was measured over velocities from 60 to 360 deg/s. We also measured the minimal duration of motion at which listeners could identify its direction. EEG was recorded in the same group of subjects during passive listening, and motion-onset responses (MOR) were analyzed. MAMA increased linearly with motion velocity, and the minimum audible angle (MAA) calculated from this linear function was about 2 deg. For higher velocities of the delayed motion, we found 2- to 3-fold better spatial resolution than previously reported for motion starting at sound onset. The time required for optimal discrimination of motion direction was about 34 ms. The main finding of our study was that both the direction-identification time obtained in the behavioral task and the cN1 latency behaved like hyperbolic functions of the sound's velocity. Direction-identification time decreased asymptotically to 8 ms, which we consider the minimal integration time for detection of an instantaneous shift. Peak latency of cN1 also decreased with increasing velocity and asymptotically approached 137 ms; this limit corresponded to the latency of the response to an instantaneous sound shift and was 37 ms later than the latency of the sound-onset response. The direction-discrimination time (34 ms) was of the same magnitude as the additional time required for motion processing to be reflected in the MOR potential. Thus, MOR latency can be viewed as a neurophysiological index of temporal integration. Based on these findings, we may assume that no measurable MOR would be evoked by slowly moving stimuli, as they would reach their MAMAs in a time longer than the optimal integration time.
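The quantitative relations reported in this abstract can be illustrated with a minimal numerical sketch. The functional forms below are an assumption on our part (a linear MAMA with the MAA as intercept and the integration time as slope, and the resulting hyperbolic identification time); only the parameter values (~2 deg MAA, ~34 ms integration time, ~8 ms asymptote) come from the abstract:

```python
def mama_deg(velocity_deg_s, maa_deg=2.0, t_int_s=0.034):
    """Minimal audible movement angle in degrees.

    Assumed linear in velocity, with the ~2 deg MAA as the
    zero-velocity intercept and the ~34 ms optimal integration
    time as the slope (one simple form consistent with the text).
    """
    return maa_deg + t_int_s * velocity_deg_s


def ident_time_ms(velocity_deg_s, maa_deg=2.0, t_min_ms=8.0):
    """Direction-identification time in milliseconds.

    Assumed hyperbolic in velocity, decreasing asymptotically to
    the ~8 ms minimal integration time as velocity grows.
    """
    return t_min_ms + 1000.0 * maa_deg / velocity_deg_s


# Faster motion needs a larger angle but less time to identify:
for v in (60.0, 180.0, 360.0):
    print(v, round(mama_deg(v), 2), round(ident_time_ms(v), 2))
```

Under these assumed forms, a 60 deg/s stimulus yields a MAMA of about 4 deg and an identification time of roughly 41 ms, while at 360 deg/s the angle grows but the time approaches the 8 ms floor, matching the qualitative pattern the authors describe.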
Affiliation(s)
- Lidia B Shestopalova
- Pavlov Institute of Physiology, Russian Academy of Sciences, Makarova emb., 6, 199034, Saint Petersburg, Russia
- Diana A Salikova
- Pavlov Institute of Physiology, Russian Academy of Sciences, Makarova emb., 6, 199034, Saint Petersburg, Russia
- Varvara V Semenova
- Pavlov Institute of Physiology, Russian Academy of Sciences, Makarova emb., 6, 199034, Saint Petersburg, Russia
3
Fine I, Park WJ. Do you hear what I see? How do early blind individuals experience object motion? Philos Trans R Soc Lond B Biol Sci 2023; 378:20210460. PMID: 36511418; PMCID: PMC9745882; DOI: 10.1098/rstb.2021.0460.
Abstract
One of the most important tasks for 3D vision is tracking the movement of objects in space. The ability of early blind individuals to understand motion in the environment from noisy and unreliable auditory information is an impressive example of cortical adaptation that is only just beginning to be understood. Here, we compare visual and auditory motion processing, and discuss the effect of early blindness on the perception of auditory motion. Blindness leads to cross-modal recruitment of the visual motion area hMT+ for auditory motion processing. Meanwhile, the planum temporale, associated with auditory motion in sighted individuals, shows reduced selectivity for auditory motion. We discuss how this dramatic shift in the cortical basis of motion processing might influence the perceptual experience of motion in early blind individuals. This article is part of a discussion meeting issue 'New approaches to 3D vision'.
Affiliation(s)
- Ione Fine
- Department of Psychology, University of Washington, Seattle, WA 98195-1525, USA
- Woon Ju Park
- Department of Psychology, University of Washington, Seattle, WA 98195-1525, USA
4
Towards a Consensus on an ICF-Based Classification System for Horizontal Sound-Source Localization. J Pers Med 2022; 12:1971. PMID: 36556192; PMCID: PMC9786639; DOI: 10.3390/jpm12121971.
Abstract
This study aimed to develop a consensus classification system for reporting sound-localization testing results, especially in the field of cochlear implantation. Against the background of the wide variation in localization testing procedures and reporting metrics, a novel classification system is proposed that reports localization errors according to the widely accepted International Classification of Functioning, Disability and Health (ICF) framework. The resulting HEARRING_LOC_ICF scale uses the ICF grades: 0 (no impairment), 1 (mild impairment), 2 (moderate impairment), 3 (severe impairment), and 4 (complete impairment). Applying the classification system retrospectively to data from cohorts of normal-hearing and cochlear-implant listeners at our institutes demonstrated improved comparability of localization results across institutes, testing setups, and listeners. The system should facilitate multi-center studies and better meta-analyses of data, supporting improved evidence-based practice in the field.
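A graded scale of this kind is straightforward to apply in analysis code. The sketch below maps a horizontal localization error to the five ICF grades; note that both the choice of error metric (RMS error in degrees) and the cut-off values are illustrative placeholders, not the actual HEARRING_LOC_ICF thresholds, which are defined in the paper itself:

```python
def icf_grade(rms_error_deg, cutoffs=(10.0, 25.0, 45.0, 70.0)):
    """Map an RMS localization error (degrees) to an ICF grade 0-4.

    Grades follow the ICF convention: 0 no impairment, 1 mild,
    2 moderate, 3 severe, 4 complete impairment. The cut-off
    values here are hypothetical, for illustration only.
    """
    for grade, upper_limit in enumerate(cutoffs):
        if rms_error_deg < upper_limit:
            return grade
    return 4  # error exceeds all cut-offs: complete impairment


# Example: grade a few listeners' errors on the same scale
for err in (5.0, 30.0, 90.0):
    print(err, icf_grade(err))
```

Reducing each listener's result to a single grade in this way is what makes scores comparable across test setups with different loudspeaker layouts, which is the stated goal of the classification system.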