1
Lu F, Li Y, Yang J, Wang A, Zhang M. Auditory affective content facilitates time-to-contact estimation of visual affective targets. Front Psychol 2023; 14:1105824. PMID: 37207030; PMCID: PMC10188967; DOI: 10.3389/fpsyg.2023.1105824.
Abstract
Reacting to a moving object requires estimating when it will reach its destination, also referred to as time-to-contact (TTC) estimation. Although the TTC of threatening visual moving objects is known to be underestimated, the effect of the affective content of auditory information on visual TTC estimation remains unclear. We manipulated velocity and presentation time to investigate the TTC of a threatening or non-threatening target with the addition of auditory information. In the task, a visual or audiovisual target moved from right to left and disappeared behind an occluder. Participants estimated the TTC of the target by pressing a button when they thought the target had reached a destination behind the occluder. Behaviorally, the additional auditory affective content facilitated TTC estimation, and velocity was a more critical factor than presentation time in determining the audiovisual threat facilitation effect. Overall, the results indicate that auditory affective content can influence TTC estimation and that velocity provides more information for TTC estimation than presentation time.
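Note: as a hedged illustration of the dependent measure in this kind of prediction-motion paradigm (a sketch based on the task described above; the symbols are illustrative and do not come from the paper), the target disappears at time t_occ and covers an occluded distance d at constant velocity v, so
\[ t_{\text{arrival}} = t_{\text{occ}} + \frac{d}{v}, \qquad \text{error} = t_{\text{response}} - t_{\text{arrival}}, \]
where a negative error corresponds to an underestimated TTC (responding too early) and a positive error to an overestimated TTC.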
Affiliation(s)
- Feifei Lu
- Department of Psychology, Research Center for Psychology and Behavioral Sciences, Soochow University, Suzhou, China
- You Li
- College of Chinese Language and Culture, Jinan University, Guangzhou, China
- Jiajia Yang
- Applied Brain Science Lab, Faculty of Interdisciplinary Science and Engineering in Health Systems, Okayama University, Okayama, Japan
- Aijun Wang
- Department of Psychology, Research Center for Psychology and Behavioral Sciences, Soochow University, Suzhou, China
- *Correspondence: Aijun Wang
- Ming Zhang
- Department of Psychology, Research Center for Psychology and Behavioral Sciences, Soochow University, Suzhou, China
- Cognitive Neuroscience Laboratory, Graduate School of Interdisciplinary Science and Engineering in Health Systems, Okayama University, Okayama, Japan
- *Correspondence: Ming Zhang
2
Oberfeld D, Wessels M, Büttner D. Overestimated time-to-collision for quiet vehicles: Evidence from a study using a novel audiovisual virtual-reality system for traffic scenarios. Accid Anal Prev 2022; 175:106778. PMID: 35878469; DOI: 10.1016/j.aap.2022.106778.
Abstract
To avoid collision, pedestrians intending to cross a road need to estimate the time-to-collision (TTC) of an approaching vehicle. Here, we present a novel interactive audiovisual virtual-reality system for investigating how the acoustic characteristics (loudness and engine type) of vehicles influence TTC estimation. Using acoustic recordings of real vehicles as source signals, the dynamic spatial sound fields corresponding to a vehicle approaching in an urban setting are generated based on physical modeling of the sound propagation between vehicle and pedestrian and are presented via sound field synthesis. We studied TTC estimation for vehicles with internal combustion engines and for loudness-matched electric vehicles. The vehicle sound levels were varied by 10 dB, independently of the speed, presented TTC, and vehicle type. In an auditory-only condition, the cars were not visible, and lower car loudness resulted in considerably longer TTC estimates. Importantly, car loudness also had a significant effect in the same direction on TTC estimates in an audiovisual condition, in which the cars were additionally presented visually via interactive virtual-reality simulations. Thus, pedestrians use auditory information when estimating TTC, even when full visual information is available. At equal loudness, TTC judgments for electric and conventional vehicles were virtually identical, indicating that loudness has a stronger effect than spectral differences. Because TTC overestimation can result in risky road-crossing decisions, the results imply that vehicle loudness should be considered an important factor in pedestrian safety.
Affiliation(s)
- Daniel Oberfeld
- Institute of Psychology, Section Experimental Psychology, Johannes Gutenberg-Universität Mainz, Wallstrasse 3, Mainz 55122, Germany
- Marlene Wessels
- Institute of Psychology, Section Experimental Psychology, Johannes Gutenberg-Universität Mainz, Wallstrasse 3, Mainz 55122, Germany
- David Büttner
- Institute of Psychology, Section Experimental Psychology, Johannes Gutenberg-Universität Mainz, Wallstrasse 3, Mainz 55122, Germany
3
The influence of time structure on prediction motion in visual and auditory modalities. Atten Percept Psychophys 2021; 84:1994-2001. PMID: 34725775; DOI: 10.3758/s13414-021-02369-z.
Abstract
People can usually estimate the position of a moving object correctly even when it temporarily moves behind an occluder. This type of occluded motion has been studied in the laboratory with prediction motion (PM) tasks. Previous publications have emphasized that people may use mental imagery or the oculomotor system to estimate the arrival of a moving stimulus at a target location. However, these two accounts cannot explain the performance differences observed across different conditions. Our study tested the role of time structure in a time-to-collision (TTC) task using visual and auditory modalities. In the visual condition, a moving red bar travelled from left to right and was invisible throughout its course, but flashed at the initial point and at the point of occlusion. The auditory condition was identical except that the flashes were replaced by clicks at the initial and occluded points. The results showed that participants performed better in the equal time structure condition. The comparison between the two modalities demonstrated a similar tendency, suggesting that common cognitive processes may operate across visual and auditory modalities when participants use temporal cues to judge TTC.
4
The influence of auditory rhythms on the speed of inferred motion. Atten Percept Psychophys 2021; 84:2360-2383. PMID: 34435321; DOI: 10.3758/s13414-021-02364-4.
Abstract
The present research explored the influence of isochronous auditory rhythms on the timing of movement-related prediction in two experiments. In both experiments, participants observed a moving disc that was visible for a predetermined period before disappearing behind a small, medium, or large occluded area for the remainder of its movement. In Experiment 1, the disc was visible for 1 s. During this period, participants were exposed to either a fast or slow auditory rhythm, or they heard nothing. They were instructed to press a key to indicate when they believed the moving disc had reached a specified location on the other side of the occluded area. The procedure measured the (signed) error in participants' estimate of the time it would take for a moving object to contact a stationary one. The principal results of Experiment 1 were main effects of the rate of the auditory rhythm and of the size of the occlusion on participants' judgments. In Experiment 2, the period of visibility was varied with size of the occlusion area to keep the total movement time constant for all three levels of occlusion. The results replicated the main effect of rhythm found in Experiment 1 and showed a small, significant interaction, but indicated no main effect of occlusion size. Overall, the results indicate that exposure to fast isochronous auditory rhythms during an interval of inferred motion can influence the imagined rate of such motion and suggest a possible role of an internal rhythmicity in the maintenance of temporally accurate dynamic mental representations.
5
Huygelier H, van Ee R, Lanssens A, Wagemans J, Gillebert CR. Audiovisual looming signals are not always prioritised: evidence from exogenous, endogenous and sustained attention. Journal of Cognitive Psychology 2021. DOI: 10.1080/20445911.2021.1896528.
Affiliation(s)
- Hanne Huygelier
- Department of Brain and Cognition, KU Leuven, Leuven, Belgium
- Raymond van Ee
- Department of Brain and Cognition, KU Leuven, Leuven, Belgium
- Donders Institute for Brain, Cognition and Behavior, Radboud University, Nijmegen, Netherlands
- Armien Lanssens
- Department of Brain and Cognition, KU Leuven, Leuven, Belgium
- Johan Wagemans
- Department of Brain and Cognition, KU Leuven, Leuven, Belgium
6
Stafford J, Rodger M. Educating Older Adults’ Attention towards and Away from Gap-Specifying Information in a Virtual Road-Crossing Task. Ecological Psychology 2020. DOI: 10.1080/10407413.2020.1826322.
7
Auditory pitch glides influence time-to-contact judgements of visual stimuli. Exp Brain Res 2019; 237:1907-1917. PMID: 31104086; DOI: 10.1007/s00221-019-05561-8.
Abstract
A common experimental task used to study the accuracy of estimating when a moving object arrives at a designated location is the time-to-contact (TTC) task. Previous studies have shown that auditory motion cues influence TTC estimates of a visually moving object. However, the extent to which sound can influence the TTC of visual targets remains unclear. Some studies on the crossmodal correspondence between pitch and speed suggest that descending-pitch sounds are associated with faster speeds than ascending-pitch sounds, owing to an internal model of gravity. Other studies have shown the opposite pitch-speed mapping (i.e., ascending pitch associated with faster speeds) and no influence of gravity heuristics. Here, we explored whether auditory pitch glides (continuous pure tones either ascending or descending in pitch) influence TTC estimates of a vertically moving visual target, and whether any observed effects are consistent with a gravity-centered or gravity-unrelated pitch-speed mapping. Subjects estimated when a disc moving upward or downward at a constant speed reached a visual landmark after the disc disappeared behind an occluder, under three conditions: with an accompanying ascending pitch glide, with a descending pitch glide, or with no sound. Overall, subjects underestimated TTC with ascending pitch glides and overestimated TTC with descending pitch glides, compared to the no-sound condition. These biases were consistent in both directions of disc motion. The results suggest that subjects adopted a gravity-unrelated pitch-speed mapping in which ascending pitch is associated with faster speeds and descending pitch with slower speeds.
8
Dittrich S, Noesselt T. Temporal Audiovisual Motion Prediction in 2D- vs. 3D-Environments. Front Psychol 2018; 9:368. PMID: 29618999; PMCID: PMC5871701; DOI: 10.3389/fpsyg.2018.00368.
Abstract
Predicting motion is essential for many everyday activities, e.g., in road traffic. Previous studies on motion prediction have failed to find consistent results, which might be due to the use of very different stimulus material and behavioural tasks. Here, we directly tested the influence of task (detection, extrapolation) and stimulus features (visual vs. audiovisual and three-dimensional vs. non-three-dimensional) on temporal motion prediction in two psychophysical experiments. In both experiments a ball followed a trajectory toward the observer and temporarily disappeared behind an occluder. In audiovisual conditions a moving white-noise sound (congruent or incongruent with the visual motion direction) was presented concurrently. In Experiment 1 the ball reappeared on a predictable or a non-predictable trajectory and participants detected when the ball reappeared. In Experiment 2 the ball did not reappear after occlusion and participants judged when it would reach a specified position at one of two possible distances from the occluder (extrapolation task). Both experiments were conducted in three-dimensional space (using a stereoscopic screen and polarised glasses) and also without stereoscopic presentation. Participants benefitted from visually predictable trajectories and from concurrent sounds during detection. Additionally, visual facilitation was more pronounced for non-3D stimulation in the detection task. In contrast, for the more complex extrapolation task, group-mean results indicated that auditory information impaired motion prediction. However, a post hoc cross-validation procedure (split-half) revealed that participants varied in their ability to use sounds during motion extrapolation: most participants selectively profited from either near or far extrapolation distances but were impaired at the other. We propose that interindividual differences in extrapolation efficiency might be the mechanism governing this effect. Together, our results indicate that both a realistic experimental environment and subject-specific differences modulate audiovisual motion prediction and need to be considered in future research.
Affiliation(s)
- Sandra Dittrich
- Department of Biological Psychology, Otto von Guericke University Magdeburg, Magdeburg, Germany
- Tömme Noesselt
- Department of Biological Psychology, Otto von Guericke University Magdeburg, Magdeburg, Germany
- Center for Behavioral Brain Sciences, Magdeburg, Germany
9
Sound changes that lead to seeing longer-lasting shapes. Atten Percept Psychophys 2018; 80:986-998. PMID: 29380283; DOI: 10.3758/s13414-017-1475-6.
Abstract
To survive, people must construct an accurate representation of the world around them. There is a body of research on visual scene analysis, and a largely separate literature on auditory scene analysis. The current study follows up research from the smaller literature on audiovisual scene analysis. Prior work demonstrated that when there is an abrupt size change to a moving object, observers tend to see two objects rather than one: the abrupt visual change enhances visible persistence of the briefly presented different-sized object. Moreover, if a sequence of tones accompanies the moving object, visible persistence is enhanced if the tone frequency suddenly changes at the same time that the object's size changes. Here, we show that although a sound change must occur at roughly the same time as a visual change to enhance visible persistence, there is a fairly wide time frame during which the sound change can occur. In addition, the impact of a sound change on visible persistence is not simply a matter of the physical pattern: the same pattern of sound can enhance visible persistence or not, depending on how the pattern is itself perceived. Specifically, a change in a tone's frequency can enhance visible persistence when it accompanies a visual size change, but the same frequency change will not do so if it is embedded in a larger pattern that makes the change merely a continuation of alternating frequencies. The current study supports a scene analysis process that is both multimodal and actively constructive.
10
Chotsrisuparat C, Koning A, Jacobs R, van Lier R. Auditory Rhythms Influence Judged Time to Contact of an Occluded Moving Object. Multisens Res 2017. DOI: 10.1163/22134808-00002592.
Abstract
We studied the expected moment of reappearance of a moving object after it disappeared from sight. In particular, we investigated whether auditory rhythms influence time-to-contact (TTC) judgments. Using displays in which a moving disk disappears behind an occluder, we examined whether an accompanying auditory rhythm influences the expected TTC of the occluded moving object. We manipulated a baseline auditory rhythm, consisting of equal sound and pause durations, in two ways: either the pause durations or the sound durations were increased to create slower rhythms. Participants had to press a button at the moment they expected the disk to reappear. Variations in pause duration (Experiments 1 and 2) affected expected TTC, in contrast to variations in sound duration (Experiment 3). These results show that auditory rhythms affect the expected reappearance of an occluded moving object and suggest that temporal auditory grouping is an important factor in TTC judgments.
Affiliation(s)
- Chayada Chotsrisuparat
- Radboud University, Donders Institute for Brain, Cognition and Behaviour, Nijmegen, The Netherlands
- Arno Koning
- Radboud University, Donders Institute for Brain, Cognition and Behaviour, Nijmegen, The Netherlands
- Richard Jacobs
- Radboud University, Donders Institute for Brain, Cognition and Behaviour, Nijmegen, The Netherlands
- Rob van Lier
- Radboud University, Donders Institute for Brain, Cognition and Behaviour, Nijmegen, The Netherlands
11
Neuhoff JG. Looming sounds are perceived as faster than receding sounds. Cognitive Research: Principles and Implications 2016; 1:15. PMID: 28180166; PMCID: PMC5256440; DOI: 10.1186/s41235-016-0017-4.
Abstract
Each year thousands of people are killed by looming motor vehicles. Throughout our evolutionary history looming objects have posed a threat to survival and perceptual systems have evolved unique solutions to confront these environmental challenges. Vision provides an accurate representation of time-to-contact with a looming object and usually allows us to interact successfully with the object if required. However, audition functions as a warning system and yields an anticipatory representation of arrival time, indicating that the object has arrived when it is still some distance away. The bias provides a temporal margin of safety that allows more time to initiate defensive actions. In two studies this bias was shown to influence the perception of the speed of looming and receding sound sources. Listeners heard looming and receding sound sources and judged how fast they were moving. Listeners perceived the speed of looming sounds as faster than that of equivalent receding sounds. Listeners also showed better discrimination of the speed of looming sounds than receding sounds. Finally, close sounds were perceived as faster than distant sounds. The results suggest a prioritization of the perception of the speed of looming and receding sounds that mirrors the level of threat posed by moving objects in the environment.
Affiliation(s)
- John G Neuhoff
- Department of Psychology, The College of Wooster, Wooster, OH 44691 USA
12
Rosenblum LD, Dorsi J, Dias JW. The Impact and Status of Carol Fowler's Supramodal Theory of Multisensory Speech Perception. Ecological Psychology 2016. DOI: 10.1080/10407413.2016.1230373.
13
Rosenblum LD, Dias JW, Dorsi J. The supramodal brain: implications for auditory perception. Journal of Cognitive Psychology 2016. DOI: 10.1080/20445911.2016.1181691.
14
DeLucia PR, Preddy D, Oberfeld D. Audiovisual Integration of Time-to-Contact Information for Approaching Objects. Multisens Res 2016; 29:365-95. DOI: 10.1163/22134808-00002520.
Abstract
Previous studies of time-to-collision (TTC) judgments of approaching objects focused on the effectiveness of visual TTC information in the optical expansion pattern (e.g., visual tau, disparity). Fewer studies examined the effectiveness of auditory TTC information in the pattern of increasing intensity (auditory tau), or measured the integration of auditory and visual TTC information. Here, participants judged the TTC of an approaching object presented in the visual or auditory modality, or both concurrently. The TTC information provided by the two modalities was jittered slightly against each other, so that auditory and visual TTC were not perfectly correlated. A psychophysical reverse-correlation approach was used to estimate the influence of auditory and visual cues on TTC estimates. TTC estimates were shorter in the auditory than in the visual condition. On average, TTC judgments in the audiovisual condition were not significantly different from judgments in the visual condition. However, multiple regression analyses showed that TTC estimates were based on both auditory and visual information. Although heuristic cues (final sound pressure level, final optical size) and more reliable information (relative rate of change in acoustic intensity, optical expansion) contributed to auditory and visual judgments, the effect of heuristics was greater in the auditory condition. Although auditory and visual information both influenced judgments, concurrent presentation of both did not result in lower response variability than presentation of either one alone; there was no multimodal advantage. The relative weightings of heuristics and more reliable information differed between auditory and visual TTC judgments, and when both were available, visual information was weighted more heavily.
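Note: for readers unfamiliar with the tau variables mentioned above, the standard first-order TTC relations (given here as a sketch under simplifying assumptions, not as the formulas used in this paper) are
\[ \mathrm{TTC} \approx \tau_{\text{visual}} = \frac{\theta(t)}{\mathrm{d}\theta/\mathrm{d}t}, \qquad \mathrm{TTC} \approx \tau_{\text{auditory}} = 2\,\frac{I(t)}{\mathrm{d}I/\mathrm{d}t}, \]
where \(\theta\) is the optical angle subtended by the approaching object and \(I\) is the acoustic intensity at the observer. The visual relation assumes a small optical angle and constant approach speed; the auditory relation additionally assumes a point source whose intensity follows the inverse-square law.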
Affiliation(s)
- Patricia R. DeLucia
- Department of Psychological Sciences, MS 2051, Texas Tech University, Lubbock, TX 79409-2051, USA
- Doug Preddy
- Department of Psychological Sciences, MS 2051, Texas Tech University, Lubbock, TX 79409-2051, USA
- Daniel Oberfeld
- Department of Psychology, Johannes Gutenberg-Universität, 55099 Mainz, Germany
15
Sustained Magnetic Responses in Temporal Cortex Reflect Instantaneous Significance of Approaching and Receding Sounds. PLoS One 2015; 10:e0134060. PMID: 26226395; PMCID: PMC4520611; DOI: 10.1371/journal.pone.0134060.
Abstract
Rising sound intensity often signals an approaching sound source and can serve as a powerful warning cue, eliciting phasic attention, perceptual biases and emotional responses. How the evaluation of approaching sounds unfolds over time remains elusive. Here, we capitalised on the temporal resolution of magnetoencephalography (MEG) to investigate how the perception of approaching and receding sounds is dynamically encoded in humans. We compared magnetic responses to the intensity envelopes of complex sounds with responses to white-noise sounds, in which intensity change is not perceived as approach. Sustained magnetic fields over temporal sensors tracked intensity change in complex sounds in an approximately linear fashion, an effect not seen for intensity change in white-noise sounds, or for overall intensity. Hence, these fields are likely to track approach/recession, but not the apparent (instantaneous) distance of the sound source, or its intensity as such. The bilateral inferior temporal gyrus and the right temporo-parietal junction emerged as likely sources of this activity. Our results indicate that discrete temporal cortical areas parametrically encode the behavioural significance of moving sound sources, with the signal unfolding in a manner reminiscent of evidence accumulation. These findings may help explain how acoustic percepts are evaluated as behaviourally relevant and highlight a crucial role of cortical areas in this process.
16
Hearing brighter: Changing in-depth visual perception through looming sounds. Cognition 2014; 132:312-23. DOI: 10.1016/j.cognition.2014.04.011.
17
Abstract
PURPOSE To determine how accurate normally sighted male and female pedestrians were at making time-to-arrival (TTA) judgments of approaching vehicles when using just their hearing or both their hearing and vision. METHODS Ten male and 14 female subjects with confirmed normal vision and hearing estimated the TTA of approaching vehicles along an unsignalized street under two sensory conditions: (1) using both habitual vision and hearing and (2) using habitual hearing only. All subjects estimated how long the approaching vehicle would take to reach them (i.e., the TTA). The actual TTA of vehicles was also measured using custom-made sensors. The error in TTA judgments for each subject under each sensory condition was calculated as the difference between the actual and estimated TTA. A secondary timing experiment was also conducted to adjust each subject's TTA judgments for their "internal metronome." RESULTS Error in TTA judgments changed significantly as a function of both the actual TTA (p < 0.0001) and sensory condition (p < 0.0001). Although no main effect for gender was found (p = 0.19), the way the TTA judgments varied within each sensory condition for each gender was different (p < 0.0001). Females tended to be as accurate under either condition (p ≥ 0.01), with the exception of TTA judgments made when the actual TTA was 2 seconds or less and 8 seconds or longer, during which the vision-and-hearing condition was more accurate (p ≤ 0.002). Males made more accurate TTA judgments under the hearing only condition for actual TTA values 5 seconds or less (p < 0.0001), after which there were no significant differences between the two conditions (p ≥ 0.01). CONCLUSIONS Our data suggest that males and females use visual and auditory information differently when making TTA judgments. Although the sensory condition did not affect the females' accuracy in judgments, males initially tended to be more accurate when using their hearing only.
18
19
Neuhoff JG, Long KL, Worthington RC. Strength and physical fitness predict the perception of looming sounds. Evol Hum Behav 2012. DOI: 10.1016/j.evolhumbehav.2011.11.001.
20
Ghazanfar A. Unity of the Senses for Primate Vocal Communication. Front Neurosci 2011. DOI: 10.1201/b11092-41.
21
Ghazanfar A. Unity of the Senses for Primate Vocal Communication. Front Neurosci 2011. DOI: 10.1201/9781439812174-41.
22
Predicting the position of moving audiovisual stimuli. Exp Brain Res 2010; 203:249-60. DOI: 10.1007/s00221-010-2224-4.
23
Ghazanfar AA. The multisensory roles for auditory cortex in primate vocal communication. Hear Res 2009; 258:113-20. PMID: 19371776; PMCID: PMC2787678; DOI: 10.1016/j.heares.2009.04.003.
Abstract
Primate vocal communication is a fundamentally multisensory behavior, and this is reflected in the different roles that brain regions play in mediating it. Auditory cortex is illustrative: I will argue that it is influenced by the visual, somatosensory, proprioceptive and motor modalities during vocal communication. The data reviewed here suggest that investigating auditory cortex through the lens of a specific behavior may lead to a much clearer picture of its functions and dynamic organization. One possibility is that, beyond its tonotopic and cytoarchitectural organization, the auditory cortex may be organized according to ethologically relevant actions. Such action-specific representations would be overlaid on top of traditional mapping schemes and would help mediate motor and multisensory processes related to a particular type of behavior.
Affiliation(s)
- Asif A Ghazanfar
- Neuroscience Institute, Departments of Psychology and Ecology & Evolutionary Biology, Princeton University, Princeton, NJ 08540, USA.
24
Cappe C, Thut G, Romei V, Murray MM. Selective integration of auditory-visual looming cues by humans. Neuropsychologia 2009; 47:1045-52. DOI: 10.1016/j.neuropsychologia.2008.11.003.
25
Abstract
Speech perception is inherently multimodal. Visual speech (lip-reading) information is used by all perceivers and readily integrates with auditory speech. Imaging research suggests that the brain treats auditory and visual speech similarly. These findings have led some researchers to consider that speech perception works by extracting amodal information that takes the same form across modalities. From this perspective, speech integration is a property of the input information itself. Amodal speech information could explain the reported automaticity, immediacy, and completeness of audiovisual speech integration. However, recent findings suggest that speech integration can be influenced by higher cognitive properties such as lexical status and semantic context. Proponents of amodal accounts will need to explain these results.
26
Maier JX, Chandrasekaran C, Ghazanfar AA. Integration of bimodal looming signals through neuronal coherence in the temporal lobe. Curr Biol 2008; 18:963-8. PMID: 18585039; DOI: 10.1016/j.cub.2008.05.043.
Abstract
The ability to integrate information across multiple sensory systems offers several behavioral advantages, from quicker reaction times and more accurate responses to better detection and more robust learning. At the neural level, multisensory integration requires large-scale interactions between different brain regions: the convergence of information from separate sensory modalities, represented by distinct neuronal populations. The interactions between these neuronal populations must be fast and flexible, so that behaviorally relevant signals belonging to the same object or event can be immediately integrated and integration of unrelated signals can be prevented. Looming signals are a particular class of signals that are behaviorally relevant for animals and that occur in both the auditory and visual domain. These signals indicate the rapid approach of objects and provide highly salient warning cues about impending impact. We show here that multisensory integration of auditory and visual looming signals may be mediated by functional interactions between auditory cortex and the superior temporal sulcus, two areas involved in integrating behaviorally relevant auditory-visual signals. Audiovisual looming signals elicited increased gamma-band coherence between these areas, relative to unimodal or receding-motion signals. This suggests that the neocortex uses fast, flexible intercortical interactions to mediate multisensory integration.
Affiliation(s)
- Joost X Maier
- Max Planck Institute for Biological Cybernetics, Spemannstrasse 38, 72076 Tuebingen, Germany
27
Bach DR, Schächinger H, Neuhoff JG, Esposito F, Di Salle F, Lehmann C, Herdener M, Scheffler K, Seifritz E. Rising sound intensity: an intrinsic warning cue activating the amygdala. Cereb Cortex 2007; 18:145-50. PMID: 17490992; DOI: 10.1093/cercor/bhm040.
Abstract
Human subjects overestimate the change of rising-intensity sounds compared with falling-intensity sounds. Rising sound intensity has therefore been proposed to be an intrinsic warning cue. To test this hypothesis, we presented rising-, falling-, and constant-intensity sounds to healthy humans and gathered psychophysiological and behavioral responses. Brain activity was measured using event-related functional magnetic resonance imaging. We found that rising compared with falling sound intensity facilitates the autonomic orienting reflex and phasic alertness to auditory targets. Rising-intensity sounds produced neural activity in the amygdala, accompanied by activity in the intraparietal sulcus, superior temporal sulcus, and temporal plane. Our results indicate that rising sound intensity is an elementary warning cue that elicits adaptive responses by recruiting attentional and physiological resources. Regions involved in cross-modal integration were activated by rising sound intensity, whereas involvement of the right-hemisphere phasic alertness network was not supported by this study.
Affiliation(s)
- Dominik R Bach
- University Hospital of Psychiatry, University of Bern, 3000 Bern, Switzerland.