1
Lu F, Li Y, Yang J, Wang A, Zhang M. Auditory affective content facilitates time-to-contact estimation of visual affective targets. Front Psychol 2023; 14:1105824. [PMID: 37207030] [PMCID: PMC10188967] [DOI: 10.3389/fpsyg.2023.1105824]
Abstract
Reacting to a moving object requires the ability to estimate when it will reach its destination, referred to as time-to-contact (TTC) estimation. Although TTC estimates for threatening visual moving objects are known to be underestimated, the effect of the affective content of auditory information on visual TTC estimation remains unclear. We manipulated velocity and presentation time to investigate the TTC of a threat or non-threat target with added auditory information. In the task, a visual or audiovisual target moved from right to left and disappeared behind an occluder. Participants estimated the TTC of the target by pressing a button when they judged that the target had contacted a destination behind the occluder. Behaviorally, the added auditory affective content facilitated TTC estimation, and velocity was a more critical factor than presentation time in determining the audiovisual threat facilitation effect. Overall, the results indicate that auditory affective content can influence visual TTC estimation and that velocity is more informative than presentation time for this estimate.
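For readers unfamiliar with this prediction-motion procedure, the sketch below illustrates how an actual TTC and a signed estimation error can be computed under a constant-velocity assumption; the function names and numerical values are illustrative and not taken from the study.

```python
# Minimal sketch of the prediction-motion (PM) paradigm used in TTC studies.
# Assumptions (not from the paper): constant target velocity, 1-D motion,
# and illustrative values for the occluded distance and velocity.

def actual_ttc(occluded_distance: float, velocity: float) -> float:
    """Time for the target to traverse the occluded distance at constant velocity."""
    return occluded_distance / velocity

def signed_error(response_time: float, occluded_distance: float, velocity: float) -> float:
    """Positive values = overestimation (late response), negative = underestimation."""
    return response_time - actual_ttc(occluded_distance, velocity)

# Example: a target moving at 8 deg/s disappears 4 deg of visual angle before
# the contact point; the participant presses the button after 0.45 s.
print(actual_ttc(4.0, 8.0))          # 0.5 s actual TTC
print(signed_error(0.45, 4.0, 8.0))  # -0.05 s -> slight underestimation
```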
Affiliation(s)
- Feifei Lu
- Department of Psychology, Research Center for Psychology and Behavioral Sciences, Soochow University, Suzhou, China
- You Li
- College of Chinese Language and Culture, Jinan University, Guangzhou, China
- Jiajia Yang
- Applied Brain Science Lab, Faculty of Interdisciplinary Science and Engineering in Health Systems, Okayama University, Okayama, Japan
- Aijun Wang
- Department of Psychology, Research Center for Psychology and Behavioral Sciences, Soochow University, Suzhou, China
- *Correspondence: Aijun Wang
- Ming Zhang
- Department of Psychology, Research Center for Psychology and Behavioral Sciences, Soochow University, Suzhou, China
- Cognitive Neuroscience Laboratory, Graduate School of Interdisciplinary Science and Engineering in Health Systems, Okayama University, Okayama, Japan
- *Correspondence: Ming Zhang
2
Wessels M, Zähme C, Oberfeld D. Auditory Information Improves Time-to-collision Estimation for Accelerating Vehicles. Curr Psychol 2022. [DOI: 10.1007/s12144-022-03375-6]
Abstract
To cross a road safely, pedestrians estimate the time remaining until an approaching vehicle arrives at their location (time-to-collision, TTC). For visually presented accelerating objects, however, TTC estimates are known to show a first-order pattern, indicating that acceleration is not adequately taken into account. We investigated whether added vehicle sound can reduce these estimation errors. Twenty-five participants estimated the TTC of vehicles approaching at constant velocity or accelerating, from a pedestrian's perspective at the curb in a traffic simulation. For accelerating vehicles presented visually only, the TTC estimates showed the expected first-order pattern and thus large estimation errors. With added vehicle sound, the first-order pattern was largely removed, and TTC estimates were significantly more accurate than with visual-only presentation. For constant velocities, TTC estimates were predominantly accurate in both presentation conditions. Taken together, the sound of an accelerating vehicle can compensate for erroneous visual TTC estimates, presumably by promoting the consideration of acceleration.
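The "first-order pattern" means that observers estimate TTC from current distance and velocity alone, ignoring acceleration. The sketch below contrasts such a first-order estimate with the true arrival time of a constantly accelerating vehicle; the numbers are illustrative and not taken from the study.

```python
import math

# Minimal sketch contrasting a first-order TTC estimate (distance / velocity,
# acceleration ignored) with the true TTC of a constantly accelerating vehicle.

def first_order_ttc(distance: float, velocity: float) -> float:
    """TTC estimate that ignores acceleration."""
    return distance / velocity

def true_ttc(distance: float, velocity: float, acceleration: float) -> float:
    """Solve distance = v*t + 0.5*a*t**2 for t (positive root)."""
    if acceleration == 0:
        return distance / velocity
    return (-velocity + math.sqrt(velocity**2 + 2 * acceleration * distance)) / acceleration

# A vehicle 30 m away, currently at 10 m/s, accelerating at 2 m/s^2:
print(first_order_ttc(30.0, 10.0))  # 3.0 s -- overestimates the remaining time
print(true_ttc(30.0, 10.0, 2.0))    # ~2.42 s -- the vehicle actually arrives earlier
```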
3
Carlini A, Bigand E. Does Sound Influence Perceived Duration of Visual Motion? Front Psychol 2021; 12:751248. [PMID: 34925155] [PMCID: PMC8675101] [DOI: 10.3389/fpsyg.2021.751248]
Abstract
Multimodal perception is a key factor in obtaining a rich and meaningful representation of the world. However, how the individual stimuli combine to determine the overall percept remains an open question. The present work investigates the effect of sound on the bimodal perception of motion. A moving visual target was presented to participants, together with a concurrent sound, in a time reproduction task. Particular attention was paid to the structure of both the auditory and the visual stimuli. Four different laws of motion were tested for the visual target, one of which was biological. Nine different sound profiles were tested, from a simple constant sound to more variable and complex pitch profiles, always presented synchronously with the motion. Participants' responses show that constant sounds produce the worst duration estimation performance, even worse than the silent condition; more complex sounds, by contrast, yield significantly better performance. The structure of the visual stimulus and that of the auditory stimulus appear to influence performance independently. Biological motion provides the best performance, while motion with a constant-velocity profile provides the worst. The results clearly show that a concurrent sound influences the unified perception of motion; the type and magnitude of the bias depend on the structure of the sound stimulus. Contrary to expectations, the best performance is not produced by the simplest stimuli, but rather by more complex stimuli that are richer in information.
Affiliation(s)
- Alessandro Carlini
- Laboratory for Research on Learning and Development, CNRS UMR 5022, University of Burgundy, Dijon, France
- Emmanuel Bigand
- Laboratory for Research on Learning and Development, CNRS UMR 5022, University of Burgundy, Dijon, France
4
The influence of auditory rhythms on the speed of inferred motion. Atten Percept Psychophys 2021; 84:2360-2383. [PMID: 34435321] [DOI: 10.3758/s13414-021-02364-4]
Abstract
The present research explored the influence of isochronous auditory rhythms on the timing of movement-related prediction in two experiments. In both experiments, participants observed a moving disc that was visible for a predetermined period before disappearing behind a small, medium, or large occluded area for the remainder of its movement. In Experiment 1, the disc was visible for 1 s. During this period, participants were exposed to either a fast or a slow auditory rhythm, or they heard nothing. They were instructed to press a key to indicate when they believed the moving disc had reached a specified location on the other side of the occluded area. The procedure measured the (signed) error in participants' estimate of the time it would take for a moving object to contact a stationary one. The principal results of Experiment 1 were main effects of the rate of the auditory rhythm and of the size of the occlusion on participants' judgments. In Experiment 2, the period of visibility was varied with the size of the occluded area to keep the total movement time constant across all three levels of occlusion. The results replicated the main effect of rhythm found in Experiment 1 and showed a small but significant interaction, but indicated no main effect of occlusion size. Overall, the results indicate that exposure to fast isochronous auditory rhythms during an interval of inferred motion can influence the imagined rate of such motion and suggest a possible role of internal rhythmicity in the maintenance of temporally accurate dynamic mental representations.
5
Auditory pitch glides influence time-to-contact judgements of visual stimuli. Exp Brain Res 2019; 237:1907-1917. [PMID: 31104086] [DOI: 10.1007/s00221-019-05561-8]
Abstract
A common experimental task used to study the accuracy of estimating when a moving object arrives at a designated location is the time-to-contact (TTC) task. Previous studies have shown evidence that sound motion cues influence TTC estimates of a visually moving object. However, the extent to which sound can influence the TTC of visual targets remains unclear. Some studies on the crossmodal correspondence between pitch and speed suggest that descending-pitch sounds are associated with faster speeds than ascending-pitch sounds, owing to an internal model of gravity. Other studies have shown the opposite pitch-speed mapping (i.e., ascending pitch associated with faster speeds) and no influence of gravity heuristics. Here, we explored whether auditory pitch glides, continuous pure tones either ascending or descending in pitch, influence TTC estimates of a vertically moving visual target, and whether any observed effects are consistent with a gravity-centered or gravity-unrelated pitch-speed mapping. Subjects estimated when a disc moving either upward or downward at a constant speed reached a visual landmark after the disc disappeared behind an occluder, under three conditions: with an accompanying ascending pitch glide, with a descending pitch glide, or with no sound. Overall, subjects underestimated TTC with ascending pitch glides and overestimated TTC with descending pitch glides, compared to the no-sound condition. These biases were consistent in both directions of disc motion. The results suggest that subjects adopted a gravity-unrelated pitch-speed mapping in which ascending pitch is associated with faster speeds and descending pitch with slower speeds.
6
Dittrich S, Noesselt T. Temporal Audiovisual Motion Prediction in 2D- vs. 3D-Environments. Front Psychol 2018; 9:368. [PMID: 29618999] [PMCID: PMC5871701] [DOI: 10.3389/fpsyg.2018.00368]
Abstract
Predicting motion is essential for many everyday activities, for example in road traffic. Previous studies on motion prediction have failed to find consistent results, which might be due to the use of very different stimulus material and behavioural tasks. Here, we directly tested the influence of task (detection, extrapolation) and stimulus features (visual vs. audiovisual and three-dimensional vs. non-three-dimensional) on temporal motion prediction in two psychophysical experiments. In both experiments a ball followed a trajectory toward the observer and temporarily disappeared behind an occluder. In audiovisual conditions a moving white-noise sound (congruent or incongruent with the visual motion direction) was presented concurrently. In Experiment 1 the ball reappeared on a predictable or a non-predictable trajectory and participants detected when the ball reappeared. In Experiment 2 the ball did not reappear after occlusion and participants judged when the ball would reach a specified position at one of two possible distances from the occluder (extrapolation task). Both experiments were conducted in three-dimensional space (using a stereoscopic screen and polarised glasses) and also without stereoscopic presentation. Participants benefitted from visually predictable trajectories and concurrent sounds during detection, and visual facilitation was more pronounced for non-3D stimulation in the detection task. In contrast, for the more complex extrapolation task, group mean results indicated that auditory information impaired motion prediction. However, a post hoc cross-validation procedure (split-half) revealed that participants varied in their ability to use sounds during motion extrapolation: most participants selectively profited from either near or far extrapolation distances but were impaired at the other. We propose that interindividual differences in extrapolation efficiency might be the mechanism governing this effect. Together, our results indicate that both a realistic experimental environment and subject-specific differences modulate audiovisual motion prediction and need to be considered in future research.
Affiliation(s)
- Sandra Dittrich
- Department of Biological Psychology, Otto von Guericke University Magdeburg, Magdeburg, Germany
- Tömme Noesselt
- Department of Biological Psychology, Otto von Guericke University Magdeburg, Magdeburg, Germany
- Center for Behavioral Brain Sciences, Magdeburg, Germany
7
Rosemann S, Wefel IM, Elis V, Fahle M. Audio-visual interaction in visual motion detection: Synchrony versus Asynchrony. J Optom 2017; 10:242-251. [PMID: 28237358] [PMCID: PMC5595265] [DOI: 10.1016/j.optom.2016.12.003]
Abstract
OBJECTIVE: Detection and identification of moving targets is of paramount importance in everyday life, even though it is not widely tested in optometric practice, mostly for technical reasons. There are clear indications in the literature that vision and hearing interact in the perception of moving targets, for example in noisy surroundings and in understanding speech. The main aim of visual perception, the ability that optometry seeks to optimize, is the identification of objects, from everyday objects to letters, as well as the spatial orientation of observers in natural surroundings. To serve this aim, corresponding visual and acoustic features from the rich spectrum of signals supplied by natural environments have to be combined.
METHODS: We investigated the influence of an auditory motion stimulus on visual motion detection, using both a concrete auditory motion (left/right movement) and an abstract one (increase/decrease of pitch).
RESULTS: Incongruent audiovisual stimuli led to significantly poorer detection than the visual-only condition. In addition, detection was significantly better in congruent than in incongruent abstract trials. For the concrete stimuli, the detection threshold was significantly better in asynchronous audiovisual conditions than in the unimodal visual condition.
CONCLUSION: We find a clear but complex pattern of partly synergistic and partly inhibitory audio-visual interactions. Asynchrony appears to play only a positive role in audiovisual motion, whereas incongruence mostly disturbs detection in simultaneous abstract configurations but not in concrete ones. As in speech perception by hearing-impaired patients, patients with visual deficits should be able to benefit from acoustic information.
Affiliation(s)
- Stephanie Rosemann
- Department of Human-Neurobiology, University of Bremen, Hochschulring 18, 28359 Bremen, Germany
- Inga-Maria Wefel
- Department of Human-Neurobiology, University of Bremen, Hochschulring 18, 28359 Bremen, Germany
- Volkan Elis
- Department of Human-Neurobiology, University of Bremen, Hochschulring 18, 28359 Bremen, Germany
- Manfred Fahle
- Department of Human-Neurobiology, University of Bremen, Hochschulring 18, 28359 Bremen, Germany
8
Estimating the relative weights of visual and auditory tau versus heuristic-based cues for time-to-contact judgments in realistic, familiar scenes by older and younger adults. Atten Percept Psychophys 2017; 79:929-944. [DOI: 10.3758/s13414-016-1270-9]
9
DeLucia PR, Preddy D, Oberfeld D. Audiovisual Integration of Time-to-Contact Information for Approaching Objects. Multisens Res 2016; 29:365-395. [DOI: 10.1163/22134808-00002520]
Abstract
Previous studies of time-to-collision (TTC) judgments of approaching objects focused on the effectiveness of visual TTC information in the optical expansion pattern (e.g., visual tau, disparity). Fewer studies have examined the effectiveness of auditory TTC information in the pattern of increasing intensity (auditory tau), or measured the integration of auditory and visual TTC information. Here, participants judged the TTC of an approaching object presented in the visual or auditory modality, or in both concurrently. The TTC information provided by the two modalities was jittered slightly against each other, so that auditory and visual TTC were not perfectly correlated. A psychophysical reverse-correlation approach was used to estimate the influence of auditory and visual cues on TTC estimates. TTC estimates were shorter in the auditory than in the visual condition. On average, TTC judgments in the audiovisual condition did not differ significantly from judgments in the visual condition. However, multiple regression analyses showed that TTC estimates were based on both auditory and visual information. Although heuristic cues (final sound pressure level, final optical size) and more reliable information (relative rate of change in acoustic intensity, optical expansion) contributed to both auditory and visual judgments, the effect of heuristics was greater in the auditory condition. Although auditory and visual information both influenced judgments, concurrent presentation of the two did not reduce response variability compared to presentation of either one alone; there was no multimodal advantage. The relative weightings of heuristics and more reliable information differed between auditory and visual TTC judgments, and when both were available, visual information was weighted more heavily.
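As a rough illustration of the tau variables mentioned in this abstract, the sketch below computes a visual tau from optical expansion and an intensity-based auditory tau; the inverse-square intensity assumption and all values are illustrative simplifications rather than the paper's exact stimulus model.

```python
import numpy as np

# Minimal sketch of tau-like TTC information, using finite differences over a
# short signal history. Assumptions (not from the paper): constant approach
# speed, a 1-m-wide object, and a point sound source with inverse-square intensity.

def visual_tau(theta: np.ndarray, dt: float) -> float:
    """TTC from optical expansion: tau = theta / (d(theta)/dt), from the last two samples."""
    dtheta_dt = (theta[-1] - theta[-2]) / dt
    return theta[-1] / dtheta_dt

def auditory_tau(intensity: np.ndarray, dt: float) -> float:
    """TTC from the relative rate of change of intensity.
    For an inverse-square point source, TTC = 2 * I / (dI/dt)."""
    di_dt = (intensity[-1] - intensity[-2]) / dt
    return 2.0 * intensity[-1] / di_dt

# Object approaching at 5 m/s, sampled at 100 Hz over the last 100 ms, ending ~9.55 m away:
dt = 0.01
d = 10.0 - 5.0 * np.arange(0, 0.1, dt)   # distances over the visible history
theta = 2 * np.arctan(0.5 / d)           # optical angle of a 1-m-wide object
intensity = 1.0 / d**2                   # inverse-square sound intensity
print(visual_tau(theta, dt), auditory_tau(intensity, dt))  # both ~1.9 s (9.55 m / 5 m/s)
```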
Affiliation(s)
- Patricia R. DeLucia
- Department of Psychological Sciences, MS 2051, Texas Tech University, Lubbock, TX 79409-2051, USA
- Doug Preddy
- Department of Psychological Sciences, MS 2051, Texas Tech University, Lubbock, TX 79409-2051, USA
- Daniel Oberfeld
- Department of Psychology, Johannes Gutenberg-Universität, 55099 Mainz, Germany
10
Cognitive and motor aspects of a coincidence-timing task in Cerebral Palsy children. Neurosci Lett 2015; 602:33-37. [DOI: 10.1016/j.neulet.2015.06.043]
11
Blind(fold)ed by science: a constant target-heading angle is used in visual and nonvisual pursuit. Psychon Bull Rev 2013; 20:923-934. [PMID: 23440726] [DOI: 10.3758/s13423-013-0412-5]
Abstract
Previous work investigating the strategies that observers use to intercept moving targets has shown that observers maintain a constant target-heading angle (CTHA) to achieve interception. Most of this work has concluded, or indirectly assumed, that vision is necessary to do this. We investigated whether blindfolded pursuers chasing a ball carrier holding a beeping football would use the same strategy that sighted observers use. Results confirm that both blindfolded and sighted pursuers use a CTHA strategy to intercept targets, whether jogging or walking, irrespective of football experience and of path and speed deviations of the ball carrier during the pursuit. This work shows that interception of moving targets may draw on different sensory modalities to drive behavior that leads to the same end result. This has potential implications for a supramodal representation of motion perception in the human brain.
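As an illustration of the CTHA idea, the sketch below computes the target-heading angle, the angle between the pursuer's heading and the line of sight to the target, for a pursuer on an interception course; the simulation parameters are illustrative and not taken from the study.

```python
import numpy as np

# Minimal sketch of the constant target-heading angle (CTHA) strategy: on an
# interception course with constant velocities, the angle between the pursuer's
# heading and the bearing to the target stays constant over time.

def target_heading_angle(pursuer, heading, target):
    """Signed angle (radians) between the pursuer's heading and the bearing to the target."""
    bearing = target - pursuer
    a = np.arctan2(bearing[1], bearing[0]) - np.arctan2(heading[1], heading[0])
    return np.arctan2(np.sin(a), np.cos(a))  # wrap to (-pi, pi]

# Target starts at (10, 10) moving along +x at 3 m/s; pursuer starts at the origin
# at 4 m/s. Interception time from (10 + 3t)^2 + 10^2 = (4t)^2 -> 7t^2 - 60t - 200 = 0.
t_int = (60 + np.sqrt(60**2 + 4 * 7 * 200)) / (2 * 7)   # ~11.14 s
intercept = np.array([10 + 3 * t_int, 10.0])
heading = intercept / np.linalg.norm(intercept)          # fixed unit heading

dt, target, pursuer = 0.5, np.array([10.0, 10.0]), np.array([0.0, 0.0])
for _ in range(5):
    print(np.degrees(target_heading_angle(pursuer, heading, target)))  # stays ~32 deg
    target = target + np.array([3.0, 0.0]) * dt
    pursuer = pursuer + 4.0 * heading * dt
```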