1. Sound Localization Ability in Dogs. Vet Sci 2022;9(11):619. DOI: 10.3390/vetsci9110619.
Abstract
The minimum audible angle (MAA), defined as the smallest detectable difference between the azimuths of two identical sound sources, is a standard measure of spatial auditory acuity in animals. Few studies have explored the MAA of dogs, and those that have used methods that do not allow potential improvement throughout the assessment and tested very few dogs. To overcome these limits, we adopted a staircase method with 10 dogs, using a two-alternative forced-choice procedure with two sound sources and testing angles of separation from 60° to 1°. The staircase method permits the level of difficulty to be continuously adapted to each dog and allows for the observation of improvement over time. The dogs' average MAA was 7.6°, although with large interindividual variability, ranging from 1.3° to 13.2°. A global improvement was observed across the procedure, substantiated by a gradual lowering of the MAA and of choice latency across sessions. The results indicate that the staircase method is feasible and reliable for assessing auditory spatial localization in dogs, highlighting the importance of using an appropriate method in a sensory discrimination task so as to allow improvement over time. The results also reveal that the MAA of dogs is more variable than previously reported, potentially reaching values lower than 2°. Although no clear patterns of association emerged between MAA and dog characteristics such as ear shape, head shape or age, the results suggest the value of conducting larger-scale studies to determine whether these or other factors influence sound localization abilities in dogs.
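To illustrate the adaptive staircase idea described above, here is a minimal Python sketch of a generic 2-down/1-up staircase that estimates a threshold from the reversal points. The step rule, starting angle, reversal criterion, and the toy observer are illustrative assumptions, not the parameters used in the study.

```python
import random
import statistics

def staircase_maa(respond, start_deg=60.0, min_deg=1.0, max_deg=60.0,
                  step_factor=0.5, n_reversals=8):
    """Run a 2-down/1-up staircase and estimate the threshold (MAA) as the
    mean angle at the reversal points. `respond(angle)` returns True when
    the subject picks the correct source at that angular separation."""
    angle = start_deg
    streak = 0        # consecutive correct responses
    last_dir = 0      # -1 = last step made the task harder, +1 = easier
    reversal_angles = []
    while len(reversal_angles) < n_reversals:
        if respond(angle):
            streak += 1
            if streak < 2:
                continue          # need two correct in a row to step down
            streak, new_dir = 0, -1
            new_angle = max(min_deg, angle * step_factor)
        else:
            streak, new_dir = 0, +1
            new_angle = min(max_deg, angle / step_factor)
        if last_dir and new_dir != last_dir:
            reversal_angles.append(angle)   # direction flipped: a reversal
        last_dir, angle = new_dir, new_angle
    return statistics.mean(reversal_angles)

# Toy observer with a true MAA of ~8°: mostly correct above it, at chance below.
def toy_observer(angle, true_maa=8.0):
    p_correct = 0.95 if angle >= true_maa else 0.5
    return random.random() < p_correct

print(round(staircase_maa(toy_observer), 1))
```

A 2-down/1-up rule of this kind converges on the angle yielding about 70.7% correct responses, which is one conventional definition of threshold.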
2. Honda A, Tsunokake S, Suzuki Y, Sakamoto S. Auditory Subjective-Straight-Ahead Blurs during Significantly Slow Passive Body Rotation. Iperception 2022;13:20416695211070616. PMID: 35024134; PMCID: PMC8744180; DOI: 10.1177/20416695211070616.
Abstract
This paper reports on the deterioration in sound-localization accuracy during listeners' head and body movements. We investigated sound-localization accuracy during passive body rotations at speeds in the range of 0.625-5°/s. Participants were asked to judge the position of a 30-ms noise stimulus relative to their subjective-straight-ahead reference. Results indicated that sound-localization resolution degraded during passive rotation, irrespective of rotation speed, even at speeds as low as 0.625°/s.
Affiliation(s)
- Akio Honda: Department of Information Design, Faculty of Informatics, Shizuoka Institute of Science and Technology, Fukuroi, Japan
- Sayaka Tsunokake: Research Institute of Electrical Communication, Tohoku University, Sendai, Japan
- Yôiti Suzuki: Research Institute of Electrical Communication, Tohoku University, Sendai, Japan
- Shuichi Sakamoto: Research Institute of Electrical Communication, Tohoku University, Sendai, Japan
3. Risoud M, Hanson JN, Gauvrit F, Renard C, Bonne NX, Vincent C. Azimuthal sound source localization of various sound stimuli under different conditions. Eur Ann Otorhinolaryngol Head Neck Dis 2019;137:21-29. PMID: 31582332; DOI: 10.1016/j.anorl.2019.09.007.
Abstract
AIM: To evaluate azimuthal sound-source localization performance under different conditions, with a view to optimizing a routine sound localization protocol. MATERIAL AND METHOD: Two groups of healthy, normal-hearing subjects were tested identically, except that one had to keep the head still while the other was allowed to turn it. Sound localization was tested without and then with a right ear plug (acute auditory asymmetry) for each of the following sound stimuli: pulsed narrow-band noise centered on 250 Hz; continuous narrow-band noise centered on 2000 Hz, 4000 Hz and 8000 Hz; continuous 4000 Hz warble; pulsed white noise; and a word ("lac" (lake)). Root mean square error was used to quantify sound-source localization accuracy. RESULTS: With the head fixed, localization was significantly disturbed by the earplug for all stimuli (P<0.05). The most discriminating stimulus was continuous narrow-band noise centered on 4000 Hz: area under the ROC curve (AUC), 0.99 [95% CI, 0.95-1.01] for screening and 0.85 [0.82-0.89] for diagnosis. With the head mobile, localization was significantly better than with the head fixed for the 4000 and 8000 Hz stimuli (P<0.05). The most discriminating stimulus was continuous narrow-band noise centered on 2000 Hz: AUC, 0.90 [0.83-0.97] for screening and 0.75 [0.71-0.79] for diagnosis. In both conditions, pulsed stimuli (250 Hz narrow-band, white noise or word) were less difficult to localize than continuous stimuli. CONCLUSION: The test was more sensitive with the head immobile. Continuous narrow-band stimulation centered on 4000 Hz most effectively explored interaural level difference. Pulsed narrow-band stimulation centered on 250 Hz most effectively explored interaural time difference. Testing with the head mobile, closer to real-life conditions, was most effective with continuous narrow-band stimulation centered on 2000 Hz.
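As a quick illustration of the accuracy metric used here, the snippet below computes root mean square (RMS) azimuth error from paired target and response angles. The wrap-around handling and the sample values are assumptions for illustration, not data from the study.

```python
import math

def rms_azimuth_error(targets_deg, responses_deg):
    """RMS localization error in degrees. Each angular difference is
    wrapped into [-180, 180) so that, e.g., 350° vs 10° counts as 20°."""
    def wrapped_diff(a, b):
        return (a - b + 180.0) % 360.0 - 180.0
    sq = [wrapped_diff(r, t) ** 2 for t, r in zip(targets_deg, responses_deg)]
    return math.sqrt(sum(sq) / len(sq))

# Hypothetical trial data: loudspeaker azimuths and a listener's responses.
targets = [-60, -30, 0, 30, 60]
responses = [-52, -35, 4, 25, 71]
print(round(rms_azimuth_error(targets, responses), 1))  # ≈ 7.1°
```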
Affiliation(s)
- M Risoud: Department of Otology and Neurotology, CHU de Lille, 59000 Lille, France; Inserm U1008 - Controlled Drug Delivery Systems and Biomaterials, University of Lille, CHU de Lille, 59000 Lille, France
- J-N Hanson: Department of Otology and Neurotology, CHU de Lille, 59000 Lille, France
- F Gauvrit: Department of Otology and Neurotology, CHU de Lille, 59000 Lille, France
- C Renard: Department of Otology and Neurotology, CHU de Lille, 59000 Lille, France
- N-X Bonne: Department of Otology and Neurotology, CHU de Lille, 59000 Lille, France; Inserm U1192 - Proteomics Inflammatory Response Mass Spectrometry (PRISM), University of Lille, CHU de Lille, 59000 Lille, France
- C Vincent: Department of Otology and Neurotology, CHU de Lille, 59000 Lille, France; Inserm U1008 - Controlled Drug Delivery Systems and Biomaterials, University of Lille, CHU de Lille, 59000 Lille, France
4. Pöntynen H, Salminen NH. Resolving front-back ambiguity with head rotation: The role of level dynamics. Hear Res 2019;377:196-207. PMID: 30981050; DOI: 10.1016/j.heares.2019.03.020.
Abstract
Making small head movements facilitates spatial hearing by resolving front-back confusions, otherwise common in free field sound source localization. The changes in interaural time difference (ITD) in response to head rotation provide a robust front-back cue, but whether interaural level difference (ILD) can be used as a dynamic cue is not clear. Therefore, the purpose of the present study was to assess the usefulness of dynamic ILD as a localization cue. The results show that human listeners were capable of correctly indicating the front-back dimension of high-frequency sinusoids based on level dynamics in free field conditions, but only if a wide movement range was allowed (±40°). When the free field conditions were replaced by simplistic headphone stimulation, front-back responses were in agreement with the simulated source directions even with relatively small movement ranges (±5°), whenever monaural sound level and ILD changed monotonically in response to head rotation. In conclusion, human listeners can use level dynamics as a front-back localization cue when the dynamics are monotonic. However, in free field conditions and particularly for narrowband target signals, this is often not the case. Therefore, the primary limiting factor in the use of dynamic level cues resides in the acoustic domain behavior of the cue itself, rather than in potential processing limitations or strategies of the human auditory system.
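The dynamic ITD cue that this abstract takes as the established baseline can be demonstrated with a toy calculation. The sketch below uses the textbook sine-law ITD approximation (not a formula from the paper, and with an assumed head radius) to show why the direction of ITD change during a head turn distinguishes a front source from its back-hemisphere mirror.

```python
import math

A, C = 0.0875, 343.0  # assumed head radius (m) and speed of sound (m/s)

def itd_us(azimuth_deg):
    """Sine-law ITD approximation in microseconds; azimuth is head-relative."""
    return 2 * A / C * math.sin(math.radians(azimuth_deg)) * 1e6

# A front source at 30° and its back-hemisphere mirror at 150° produce
# identical static ITDs -- the classic front-back confusion.
for world_az in (30, 150):
    before = itd_us(world_az)
    after = itd_us(world_az - 10)   # head turns 10° toward the source side
    print(f"source at {world_az:3d}°: ITD {before:5.0f} -> {after:5.0f} µs")
# The ITD shrinks for the front source and grows for the back source, so the
# sign of the change during rotation disambiguates front from back.
```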
Affiliation(s)
- Henri Pöntynen: Aalto Acoustics Lab, Department of Signal Processing and Acoustics, School of Electrical Engineering, Aalto University, FI-02150, Espoo, Finland
- Nelli H Salminen: Aalto Acoustics Lab, Department of Signal Processing and Acoustics, School of Electrical Engineering, Aalto University, FI-02150, Espoo, Finland
5. Risoud M, Hanson JN, Gauvrit F, Renard C, Lemesre PE, Bonne NX, Vincent C. Sound source localization. Eur Ann Otorhinolaryngol Head Neck Dis 2018;135:259-264. PMID: 29731298; DOI: 10.1016/j.anorl.2018.04.009.
Abstract
Sound source localization, i.e. determining the position of a sound source in three dimensions (azimuth, elevation and distance), is paramount in everyday life. It is based on three types of cue: two binaural (interaural time difference and interaural level difference) and one monaural spectral cue (the head-related transfer function). These cues are complementary and vary according to the acoustic characteristics of the incident sound. The objective of this report is to update the current state of knowledge on the physical basis of spatial sound localization.
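To make the interaural time difference cue concrete, the sketch below evaluates the classic spherical-head (Woodworth) ITD approximation. The head radius is an assumed average and the formula is a textbook approximation, not one given in this report.

```python
import math

HEAD_RADIUS_M = 0.0875   # assumed average human head radius
SPEED_OF_SOUND = 343.0   # m/s in air at ~20 °C

def woodworth_itd(azimuth_deg):
    """Spherical-head (Woodworth) ITD approximation, valid for source
    azimuths between 0° (straight ahead) and 90° (directly to the side)."""
    theta = math.radians(azimuth_deg)
    return HEAD_RADIUS_M / SPEED_OF_SOUND * (theta + math.sin(theta))

for az in (0, 30, 60, 90):
    print(f"{az:2d}°: ITD ≈ {woodworth_itd(az) * 1e6:.0f} µs")
# 90° yields ≈ 656 µs, close to the often-quoted maximum human ITD.
```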
Affiliation(s)
- M Risoud: Department of Otology and Neurotology, CHU de Lille, 59000 Lille, France; Inserm U1008 - Controlled Drug Delivery Systems and Biomaterials, Université de Lille 2, CHU de Lille, 59000 Lille, France
- J-N Hanson: Department of Otology and Neurotology, CHU de Lille, 59000 Lille, France
- F Gauvrit: Department of Otology and Neurotology, CHU de Lille, 59000 Lille, France
- C Renard: Department of Otology and Neurotology, CHU de Lille, 59000 Lille, France
- P-E Lemesre: Department of Otology and Neurotology, CHU de Lille, 59000 Lille, France
- N-X Bonne: Department of Otology and Neurotology, CHU de Lille, 59000 Lille, France; Inserm U1008 - Controlled Drug Delivery Systems and Biomaterials, Université de Lille 2, CHU de Lille, 59000 Lille, France
- C Vincent: Department of Otology and Neurotology, CHU de Lille, 59000 Lille, France; Inserm U1008 - Controlled Drug Delivery Systems and Biomaterials, Université de Lille 2, CHU de Lille, 59000 Lille, France
6. Honda A, Ohba K, Iwaya Y, Suzuki Y. Detection of Sound Image Movement During Horizontal Head Rotation. Iperception 2016;7:2041669516669614. PMID: 27698993; PMCID: PMC5030746; DOI: 10.1177/2041669516669614.
Abstract
Movement detection for a virtual sound source was measured during the listener's horizontal head rotation. Listeners were instructed to rotate their heads at a given speed. A trial consisted of two intervals. During each interval, a virtual sound source was presented 60° to the right or left of the listener, who was instructed to rotate the head to face the sound image position. In one of the two intervals, the sound position was shifted slightly in the middle of the rotation. Listeners were asked to judge in which interval of the trial the sound stimulus moved. Results suggest that detection thresholds are higher when listeners rotate their heads. Moreover, this effect was found to be independent of the rotation velocity.
Affiliation(s)
- Akio Honda: Yamanashi Eiwa College, Yamanashi, Japan
7. Genzel D, Firzlaff U, Wiegrebe L, MacNeilage PR. Dependence of auditory spatial updating on vestibular, proprioceptive, and efference copy signals. J Neurophysiol 2016;116:765-75. PMID: 27169504; DOI: 10.1152/jn.00052.2016.
Abstract
Humans localize sounds by comparing inputs across the two ears, resulting in a head-centered representation of sound-source position. When the head moves, information about head movement must be combined with the head-centered estimate to correctly update the world-centered sound-source position. Spatial updating has been extensively studied in the visual system, but less is known about how head movement signals interact with binaural information during auditory spatial updating. In the current experiments, listeners compared the world-centered azimuthal position of two sound sources presented before and after a head rotation that depended on condition. In the active condition, subjects rotated their head by ∼35° to the left or right, following a pretrained trajectory. In the passive condition, subjects were rotated along the same trajectory in a rotating chair. In the cancellation condition, subjects rotated their head as in the active condition, but the chair was counter-rotated on the basis of head-tracking data such that the head effectively remained fixed in space while the body rotated beneath it. Subjects updated most accurately in the passive condition but erred in the active and cancellation conditions. Performance is interpreted as reflecting the accuracy of perceived head rotation across conditions, which is modeled as a linear combination of proprioceptive/efference copy signals and vestibular signals. Resulting weights suggest that auditory updating is dominated by vestibular signals but with significant contributions from proprioception/efference copy. Overall, results shed light on the interplay of sensory and motor signals that determine the accuracy of auditory spatial updating.
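The linear-combination model described here can be made concrete with a few lines of code. In the sketch below, perceived rotation is simulated as a weighted sum of vestibular and proprioceptive/efference copy estimates, and the weights are then recovered by least squares. The simulated rotations, noise levels, and generating weights are invented for illustration; they are not the study's fitted values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate trials: head-in-space rotation (vestibular signal) and a noisy
# head-on-body estimate (proprioception/efference copy), in degrees.
n = 200
vestibular = rng.uniform(-35, 35, n)
proprio = vestibular + rng.normal(0, 3, n)

# Assumed "true" perceptual weights used to generate the responses.
w_vest, w_prop = 0.7, 0.3
perceived = w_vest * vestibular + w_prop * proprio + rng.normal(0, 2, n)

# Recover the weights from the simulated responses by least squares.
X = np.column_stack([vestibular, proprio])
weights, *_ = np.linalg.lstsq(X, perceived, rcond=None)
print(f"fitted weights: vestibular {weights[0]:.2f}, proprio {weights[1]:.2f}")
```

In the study's logic, conditions such as cancellation (body rotating under a fixed head) decouple the two signals, which is what makes the weights identifiable from behavior.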
Affiliation(s)
- Daria Genzel: Department Biology II, Ludwig-Maximilian University of Munich, Planegg-Martinsried, Germany; Bernstein Center for Computational Neuroscience Munich, Planegg-Martinsried, Germany
- Uwe Firzlaff: Bernstein Center for Computational Neuroscience Munich, Planegg-Martinsried, Germany; Chair of Zoology, Technische Universität München, Freising-Weihenstephan, Germany
- Lutz Wiegrebe: Department Biology II, Ludwig-Maximilian University of Munich, Planegg-Martinsried, Germany; Bernstein Center for Computational Neuroscience Munich, Planegg-Martinsried, Germany
- Paul R MacNeilage: Bernstein Center for Computational Neuroscience Munich, Planegg-Martinsried, Germany; Deutsches Schwindel- und Gleichgewichtszentrum, University Hospital of Munich, Munich, Germany
8. Carlile S, Leung J. The Perception of Auditory Motion. Trends Hear 2016;20:2331216516644254. PMID: 27094029; PMCID: PMC4871213; DOI: 10.1177/2331216516644254.
Abstract
The growing availability of efficient and relatively inexpensive virtual auditory display technology has provided new research platforms to explore the perception of auditory motion. At the same time, deployment of these technologies in command and control as well as in entertainment roles is generating an increasing need to better understand the complex processes underlying auditory motion perception. This is a particularly challenging processing feat because it involves the rapid deconvolution of the relative changes in the locations of sound sources produced by rotations and translations of the head in space (self-motion) to enable the perception of actual source motion. The fact that we perceive our auditory world to be stable despite almost continual movement of the head demonstrates the efficiency and effectiveness of this process. This review examines the acoustical basis of auditory motion perception and a wide range of psychophysical, electrophysiological, and cortical imaging studies that have probed the limits and possible mechanisms underlying this perception.
Affiliation(s)
- Simon Carlile: School of Medical Sciences, University of Sydney, NSW, Australia; Starkey Hearing Research Center, Berkeley, CA, USA
- Johahn Leung: School of Medical Sciences, University of Sydney, NSW, Australia
9. Leung J, Wei V, Burgess M, Carlile S. Head Tracking of Auditory, Visual, and Audio-Visual Targets. Front Neurosci 2016;9:493. PMID: 26778952; PMCID: PMC4701932; DOI: 10.3389/fnins.2015.00493.
Abstract
The ability to actively follow a moving auditory target with our heads remains unexplored even though it is a common behavioral response. Previous studies of auditory motion perception have focused on conditions where the subjects are passive. The current study examined head tracking behavior to a moving auditory target along a horizontal 100° arc in the frontal hemisphere, with velocities ranging from 20 to 110°/s. By integrating high fidelity virtual auditory space with a high-speed visual presentation we compared tracking responses to auditory targets against visual-only and audio-visual "bisensory" stimuli. Three metrics were measured: onset, RMS, and gain error. The results showed that tracking accuracy (RMS error) varied linearly with target velocity, with a significantly higher rate in audition. Also, when the target moved faster than 80°/s, onset and RMS error were significantly worse in audition than in the other modalities, while responses in the visual and bisensory conditions were statistically identical for all metrics measured. Lastly, audio-visual facilitation was not observed when tracking bisensory targets.
Affiliation(s)
- Johahn Leung: Auditory Neuroscience Laboratory, School of Medical Sciences, University of Sydney, Sydney, NSW, Australia
10. Ruhland JL, Jones AE, Yin TCT. Dynamic sound localization in cats. J Neurophysiol 2015;114:958-68. PMID: 26063772; DOI: 10.1152/jn.00105.2015.
Abstract
Sound localization in cats and humans relies on head-centered acoustic cues. Studies have shown that humans are able to localize sounds during rapid head movements that are directed toward the target or other objects of interest. We studied whether cats are able to utilize similar dynamic acoustic cues to localize acoustic targets delivered during rapid eye-head gaze shifts. We trained cats with visual-auditory two-step tasks in which we presented a brief sound burst during saccadic eye-head gaze shifts toward a prior visual target. No consistent or significant differences in accuracy or precision were found between this dynamic task (2-step saccade) and the comparable static task (single saccade when the head is stable) in either horizontal or vertical direction. Cats appear to be able to process dynamic auditory cues and execute complex motor adjustments to accurately localize auditory targets during rapid eye-head gaze shifts.
Affiliation(s)
- Janet L Ruhland: Department of Neuroscience and Neuroscience Training Program, University of Wisconsin, Madison, Wisconsin
- Amy E Jones: Department of Neuroscience and Neuroscience Training Program, University of Wisconsin, Madison, Wisconsin
- Tom C T Yin: Department of Neuroscience and Neuroscience Training Program, University of Wisconsin, Madison, Wisconsin
11. Teramoto W, Cui Z, Sakamoto S, Gyoba J. Distortion of auditory space during visually induced self-motion in depth. Front Psychol 2014;5:848. PMID: 25140162; PMCID: PMC4122181; DOI: 10.3389/fpsyg.2014.00848.
Abstract
Perception of self-motion is based on the integration of multiple sensory inputs, in particular from the vestibular and visual systems. Our previous study demonstrated that vestibular linear acceleration information distorted auditory space perception (Teramoto et al., 2012). However, it is unclear whether this phenomenon is contingent on vestibular signals or whether it can be caused by inputs from other sensory modalities involved in self-motion perception. Here, we investigated whether visual linear self-motion information can also alter auditory space perception. Large-field visual motion was presented to induce self-motion perception with constant accelerations (Experiment 1) and a constant velocity (Experiment 2) either in a forward or backward direction. During participants' experience of self-motion, a short noise burst was delivered from one of the loudspeakers aligned parallel to the motion direction along a wall to the left of the listener. Participants indicated from which direction the sound was presented, forward or backward, relative to their coronal (i.e., frontal) plane. Results showed that the sound position aligned with the subjective coronal plane (SCP) was significantly displaced in the direction of self-motion, especially in the backward self-motion condition as compared with a no motion condition. These results suggest that self-motion information, irrespective of its origin, is crucial for auditory space perception.
Affiliation(s)
- Wataru Teramoto: Department of Computer Science and Systems Engineering, Muroran Institute of Technology, Muroran, Japan; Research Institute of Electrical Communication, Tohoku University, Sendai, Japan
- Zhenglie Cui: Research Institute of Electrical Communication, Tohoku University, Sendai, Japan
- Shuichi Sakamoto: Research Institute of Electrical Communication, Tohoku University, Sendai, Japan
- Jiro Gyoba: Department of Psychology, Graduate School of Arts and Letters, Tohoku University, Sendai, Japan
12. Teramoto W, Sakamoto S, Furune F, Gyoba J, Suzuki Y. Compression of auditory space during forward self-motion. PLoS One 2012;7:e39402. PMID: 22768076; PMCID: PMC3387142; DOI: 10.1371/journal.pone.0039402.
Abstract
BACKGROUND: Spatial inputs from the auditory periphery can change with movements of the head or whole body relative to the sound source. Nevertheless, humans can perceive a stable auditory environment and appropriately react to a sound source. This suggests that the inputs are reinterpreted in the brain while being integrated with information about the movements. Little is known, however, about how these movements modulate auditory perceptual processing. Here, we investigate the effect of linear acceleration on auditory space representation. METHODOLOGY/PRINCIPAL FINDINGS: Participants were passively transported forward or backward at constant accelerations using a robotic wheelchair. An array of loudspeakers was aligned parallel to the motion direction along a wall to the right of the listener. A short noise burst was presented during the self-motion from one of the loudspeakers when the listener's physical coronal plane reached the location of one of the speakers (null point). In Experiments 1 and 2, the participants indicated in which direction the sound was presented, forward or backward, relative to their subjective coronal plane. The results showed that the sound position aligned with the subjective coronal plane was displaced ahead of the null point only during forward self-motion, and that the magnitude of the displacement increased with increasing acceleration. Experiment 3 investigated the structure of the auditory space in the traveling direction during forward self-motion. The sounds were presented at various distances from the null point. The participants indicated the perceived sound location by pointing with a rod. All the sounds that were actually located in the traveling direction were perceived as being biased towards the null point. CONCLUSIONS/SIGNIFICANCE: These results suggest a distortion of the auditory space in the direction of movement during forward self-motion. The underlying mechanism might involve anticipatory spatial shifts in auditory receptive field locations driven by afferent signals from the vestibular system.
Affiliation(s)
- Wataru Teramoto: Research Institute of Electrical Communication, Tohoku University, Sendai, Japan
13. Barnett-Cowan M, Raeder SM, Bülthoff HH. Persistent perceptual delay for head movement onset relative to auditory stimuli of different durations and rise times. Exp Brain Res 2012;220:41-50. PMID: 22580574; PMCID: PMC3375482; DOI: 10.1007/s00221-012-3112-x.
Abstract
The perception of simultaneity between auditory and vestibular information is crucially important for maintaining a coherent representation of the acoustic environment whenever the head moves. It has recently been reported, however, that despite having similar transduction latencies, vestibular stimuli are perceived significantly later than auditory stimuli when the two are generated simultaneously. This suggests that the perceptual latency of a head movement is longer than that of a co-occurring sound. However, these studies paired a vestibular stimulation of long duration (~1 s) and of a continuously changing temporal envelope with a brief (10-50 ms) sound pulse. In the present study, the stimuli were matched for temporal envelope duration and shape. Participants judged the temporal order of two stimuli: the onset of an active head movement and the onset of brief (50 ms) or long (1,400 ms) sounds with a square- or raised-cosine-shaped envelope. Consistent with previous reports, head movement onset had to precede the onset of a brief sound by about 73 ms in order for the stimuli to be perceived as simultaneous. Head movements paired with long square sounds (~100 ms) were not significantly different from those paired with brief sounds. Surprisingly, head movements paired with long raised-cosine sounds (~115 ms) had to be presented even earlier than with brief stimuli. This additional lead time could not be accounted for by differences in the comparison stimulus characteristics (temporal envelope duration and shape). Rather, differences between sound conditions were found to be attributable to variability in the time for the head movement to reach peak velocity: the head moved faster when paired with a brief sound. The persistent lead time required for vestibular stimulation provides further evidence that the perceptual latency of vestibular stimulation is greater than that of the other senses.
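Lead times like the ~73 ms reported here are typically read off as the point of subjective simultaneity (PSS) of a psychometric function fitted to temporal-order judgments. The sketch below fits a cumulative Gaussian to hypothetical TOJ data with SciPy; the SOA values and response proportions are invented for illustration and are not data from the paper.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

# Hypothetical TOJ data: SOA = sound onset minus head-movement onset (ms),
# and the proportion of "sound came first" responses at each SOA.
soa = np.array([-100, -50, 0, 50, 100, 150, 200, 250], dtype=float)
p_sound_first = np.array([0.98, 0.93, 0.82, 0.62, 0.40, 0.22, 0.08, 0.03])

def psychometric(x, mu, sigma):
    """Cumulative Gaussian: "sound first" responses fall as the sound is
    delayed. mu is the point of subjective simultaneity (PSS)."""
    return norm.cdf(-(x - mu) / sigma)

(pss, sigma), _ = curve_fit(psychometric, soa, p_sound_first, p0=[50.0, 80.0])
print(f"PSS ≈ {pss:.0f} ms (head movement must lead the sound by this much)")
```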
Affiliation(s)
- Michael Barnett-Cowan: Department of Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Tübingen, Germany
14. Vicario CM, Martino D, Pavone EF, Fuggetta G. Lateral head turning affects temporal memory. Percept Mot Skills 2011;113:3-10. PMID: 21987905; DOI: 10.2466/04.22.pms.113.4.3-10.
Abstract
Spatial attention is a key factor in the exploration and processing of the surrounding environment, and plays a role in linking magnitudes such as space, time, and number. The present work evaluates whether shifting the coordinates of spatial attention through rotational head movements affects the ability to estimate the duration of different time intervals. A computer-based implicit timing task was employed, in which participants were asked to attend to and verbally report colour changes of sequential stimuli displayed on a computer screen; subsequently, they were required to reproduce the temporal duration (ranging between 5 and 80 sec.) of the perceived stimuli using the computer keyboard. There was statistically significant overestimation of the 80-sec. intervals exclusively in the rightward head-rotation posture, whereas head posture did not affect timing performance for shorter intervals. These findings support the hypothesis that the coordinates of spatial attention influence the ability to process time, consistent with the existence of common cortical metrics of space and time in healthy humans.