1
Rummukainen OS, Schlecht SJ, Habets EAP. No dynamic visual capture for self-translation minimum audible angle. The Journal of the Acoustical Society of America 2020; 148:EL77. [PMID: 32752782] [DOI: 10.1121/10.0001588]
Abstract
Auditory localization is affected by visual cues. This study focuses on a scenario in which dynamic sound localization cues are induced by lateral listener self-translation relative to a stationary sound source, with matching or mismatching dynamic visual cues. The audio-only self-translation minimum audible angle (ST-MAA) was previously shown to be 3.3° in the horizontal plane in front of the listener. The present study found that the addition of visual cues has no significant effect on the ST-MAA.
Affiliation(s)
- Olli S Rummukainen
- International Audio Laboratories Erlangen, A Joint Institution of the Friedrich-Alexander-University Erlangen-Nürnberg and Fraunhofer Institute for Integrated Circuits, Erlangen, Germany
- Sebastian J Schlecht
- Department of Signal Processing and Acoustics and Department of Media, Aalto University, Espoo, Finland
- Emanuël A P Habets
- International Audio Laboratories Erlangen, A Joint Institution of the Friedrich-Alexander-University Erlangen-Nürnberg and Fraunhofer Institute for Integrated Circuits, Erlangen, Germany
2
Abstract
Previous studies have demonstrated that visual apparent motion can alter the judgment of auditory apparent motion. We investigated the effect of visual apparent motion on judgments of the direction of tactile apparent motion. When visual motion was presented at the same time as, but in a direction opposite to, tactile motion, accuracy in judging the direction of tactile apparent motion was substantially reduced. This reduction in performance is referred to as ‘the congruency effect’. Similar effects were observed when the visual display was placed either near to the tactile display or at some distance from the tactile display (experiment 1). In experiment 2, the relative alignment between the visual and tactile directions of motion was varied. The size of the congruency effect was similar at 0° and 45° alignments but much reduced at a 90° alignment. In experiment 3, subjects made confidence ratings of their judgments of the direction of the tactile motion. The results indicated that the congruency effect was not due to subjects being unsure of the direction of motion and being forced to guess. In experiment 4, static visual stimuli were shown to have no effect on the judgments of direction of the tactile stimuli. The extent to which the congruency effect reflects capture effects and is the result of perceptual versus post-perceptual processes is discussed.
Affiliation(s)
- James C Craig
- Department of Psychology, Indiana University, Bloomington, IN 47405, USA.
3
Abstract
When two discrete stimuli are presented in rapid succession, observers typically report a movement of the lead stimulus toward the lag stimulus. The object of this study was to investigate crossmodal effects of irrelevant sounds on this illusion of visual apparent motion. Observers were presented with two visual stimuli that were temporally separated by interstimulus onset intervals from 0 to 350 ms. After each trial, observers classified their impression of the stimuli using a categorisation system. The presentation of short sounds intervening between the visual stimuli facilitated the impression of apparent motion relative to baseline (visual stimuli without sounds), whereas sounds presented before the first and after the second visual stimulus as well as simultaneously presented sounds reduced the motion impression. The results demonstrate an effect of the temporal structure of irrelevant sounds on visual apparent motion that is discussed in light of a related multisensory phenomenon, ‘temporal ventriloquism’, on the assumption that sounds can attract lights in the temporal dimension.
4
Crossmodal interactions and multisensory integration in the perception of audio-visual motion — A free-field study. Brain Res 2012; 1466:99-111. [DOI: 10.1016/j.brainres.2012.05.015]
5
Merlo JL. Cross-modal congruency benefits for combined tactile and visual signaling. American Journal of Psychology 2011. [DOI: 10.5406/amerjpsyc.124.4.0413]
6
Gori M, Mazzilli G, Sandini G, Burr D. Cross-Sensory Facilitation Reveals Neural Interactions between Visual and Tactile Motion in Humans. Front Psychol 2011; 2:55. [PMID: 21734892] [PMCID: PMC3110703] [DOI: 10.3389/fpsyg.2011.00055]
Abstract
Many recent studies show that the human brain integrates information across the different senses and that stimuli of one sensory modality can enhance the perception of other modalities. Here we study the processes that mediate cross-modal facilitation and summation between visual and tactile motion. We find that while summation produced a generic, non-specific improvement of thresholds, probably reflecting higher-order interaction of decision signals, facilitation reveals a strong, direction-specific interaction, which we believe reflects sensory interactions. We measured visual and tactile velocity discrimination thresholds over a wide range of base velocities and conditions. Thresholds for both visual and tactile stimuli showed the characteristic “dipper function,” with the minimum thresholds occurring at a given “pedestal speed.” When coherent visual and tactile stimuli were combined (summation condition), the thresholds for these multisensory stimuli also showed a “dipper function,” with the minimum thresholds occurring in a similar range to that for unisensory signals. However, the improvement of multisensory thresholds was weak, not directionally specific, and well predicted by the maximum-likelihood estimation model (agreeing with previous research). A different technique (facilitation) did, however, reveal direction-specific enhancement. Adding a non-informative “pedestal” motion stimulus in one sensory modality (vision or touch) selectively lowered thresholds in the other, by the same amount as pedestals in the same modality. Facilitation did not occur for neutral stimuli such as sounds (which would also have reduced temporal uncertainty), nor for motion in the opposite direction, even in blocked trials in which the subjects knew that the motion was in the opposite direction, showing that the facilitation was not under subject control. Cross-sensory facilitation is strong evidence for functionally relevant cross-sensory integration at early levels of sensory processing.
Affiliation(s)
- Monica Gori
- Istituto Italiano di Tecnologia, Robotics, Brain and Cognitive Sciences, Genova, Italy
7
Hsieh IH, Petrosyan A, Gonçalves OF, Hickok G, Saberi K. Cross-modulation interference with lateralization of mixed-modulated waveforms. Journal of Speech, Language, and Hearing Research 2010; 53:1417-1428. [PMID: 20689037] [DOI: 10.1044/1092-4388(2010/09-0206)]
Abstract
PURPOSE: This study investigated the ability to use spatial information in mixed-modulated (MM) sounds containing concurrent frequency-modulated (FM) and amplitude-modulated (AM) components, by exploring the patterns of interference that arise when different modulation types originate from different loci, as may occur in a multisource acoustic field.
METHOD: Interaural delay thresholds were measured in 5 normal-hearing adults for an AM sound in the presence of an interfering FM sound, and vice versa, as a function of interferer modulation rate. In addition, the effects of near versus remote interferer rates, and of fixed versus randomized interferer interaural delay, were investigated.
RESULTS: AM interfered with lateralization of FM at all modulation rates. However, FM interfered with AM lateralization only when the FM rate was higher than the AM rate. This rate asymmetry was surprising given the prevalence of low-frequency dominance in lateralization, but was predicted by a cross-correlation model of binaural interaction. Effects were similar for fixed and randomized interferer interaural delays.
CONCLUSIONS: The results suggest that in multisource environments, sources containing different modulation types significantly interfere with localization in complex ways that reveal interactions between modulation type and rate. These findings contribute to the understanding of auditory object formation and localization.
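The cross-correlation model invoked in the results estimates a sound's interaural delay as the lag that maximizes the correlation between the left- and right-ear signals. A minimal illustrative sketch of that idea (the synthetic amplitude-modulated tone and all parameters below are invented for illustration; this is not the specific model used in the study):

```python
import math

def cross_correlation_lag(left, right, max_lag):
    """Return the lag (in samples) that maximizes the cross-correlation
    between left- and right-ear signals; a positive lag means the right
    signal lags the left."""
    best_lag, best_val = 0, float("-inf")
    n = len(left)
    for lag in range(-max_lag, max_lag + 1):
        val = sum(left[i] * right[i + lag]
                  for i in range(n) if 0 <= i + lag < n)
        if val > best_val:
            best_lag, best_val = lag, val
    return best_lag

# Synthetic 500 Hz tone with a 40 Hz amplitude modulation, sampled at 8 kHz
fs = 8000
sig = [math.sin(2 * math.pi * 500 * t / fs)
       * (1 + 0.5 * math.sin(2 * math.pi * 40 * t / fs))
       for t in range(400)]
left = sig
right = [0.0] * 3 + sig[:-3]  # right ear delayed by 3 samples

lag = cross_correlation_lag(left, right, max_lag=10)  # recovers the 3-sample delay
```

The modulation envelope is what disambiguates the delay here: the 500 Hz carrier repeats every 16 samples, so its autocorrelation alone is periodic, but the envelope makes the true lag the global peak.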
Affiliation(s)
- I-Hui Hsieh
- Department of Cognitive Sciences, University of California, Irvine, Irvine, CA 92692, USA
8
Influences of intra- and crossmodal grouping on visual and tactile Ternus apparent motion. Brain Res 2010; 1354:152-62. [DOI: 10.1016/j.brainres.2010.07.064]
9
Merlo JL, Duley AR, Hancock PA. Cross-modal congruency benefits for combined tactile and visual signaling. American Journal of Psychology 2010; 123:413-24. [DOI: 10.5406/amerjpsyc.123.4.0413]
10
Bentvelzen A, Leung J, Alais D. Discriminating Audiovisual Speed: Optimal Integration of Speed Defaults to Probability Summation When Component Reliabilities Diverge. Perception 2009; 38:966-87. [DOI: 10.1068/p6261]
Abstract
We investigated audiovisual speed perception to test the maximum-likelihood-estimation (MLE) model of multisensory integration. According to MLE, audiovisual speed perception will be based on a weighted average of the visual and auditory speed estimates, with each component weighted by its inverse variance, a statistically optimal combination that produces a fused estimate with minimised variance and thereby affords maximal discrimination. We used virtual auditory space to create ecologically valid auditory motion, together with visual apparent motion around an array of 63 LEDs. To degrade the usual dominance of vision over audition, we added positional jitter to the motion sequences, and also measured peripheral trajectories. Both factors degraded visual speed discrimination, while auditory speed perception was unaffected by trajectory location. In the bimodal conditions, a speed conflict was introduced (48° s−1 versus 60° s−1) and two measures were taken: perceived audiovisual speed, and the precision (variability) of audiovisual speed discrimination. These measures showed only a weak tendency to follow MLE predictions. However, splitting the data into two groups based on whether the unimodal component weights were similar or disparate revealed interesting findings: similarly weighted components were integrated in a manner closely matching MLE predictions, while dissimilarly weighted components (greater than a 3:1 difference) were integrated according to probability-summation predictions. These results suggest that different multisensory integration strategies may be implemented depending on relative component reliabilities, with MLE integration vetoed when component weights are highly disparate.
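The MLE rule referred to above has a simple closed form: each unimodal estimate is weighted by its inverse variance, and the fused estimate has lower variance than either component alone. A minimal sketch (the speed estimates and variances below are invented for illustration, not data from the study):

```python
def mle_combine(estimates, variances):
    """Inverse-variance (MLE) cue combination:
    w_i = (1/var_i) / sum_j (1/var_j);  fused = sum_i w_i * s_i;
    fused variance = 1 / sum_j (1/var_j)."""
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    fused = sum(w * s for w, s in zip(weights, estimates)) / total
    fused_var = 1.0 / total
    return fused, fused_var

# Hypothetical visual (48 deg/s, var 4) and auditory (60 deg/s, var 16) estimates
fused, fused_var = mle_combine([48.0, 60.0], [4.0, 16.0])
# fused = 50.4: pulled toward the more reliable (visual) cue
# fused_var = 3.2: below both unimodal variances
```

The veto behaviour reported in the abstract amounts to abandoning this weighted average when one weight dwarfs the other (beyond roughly 3:1), rather than letting the unreliable cue contribute.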
Affiliation(s)
- Adam Bentvelzen
- School of Psychology, University of Sydney, Sydney 2006, Australia
- Johahn Leung
- School of Psychology, University of Sydney, Sydney 2006, Australia
- David Alais
- School of Psychology, University of Sydney, Sydney 2006, Australia
11
Occelli V, Spence C, Zampini M. The effect of sound intensity on the audiotactile crossmodal dynamic capture effect. Exp Brain Res 2008; 193:409-19. [PMID: 19011842] [DOI: 10.1007/s00221-008-1637-9]
Abstract
We investigated the effect of varying sound intensity on the audiotactile crossmodal dynamic capture effect. Participants had to discriminate the direction of a target stream (tactile, Experiment 1; auditory, Experiment 2) while trying to ignore the direction of a distractor stream presented in a different modality (auditory, Experiment 1; tactile, Experiment 2). The distractor streams could either be spatiotemporally congruent or incongruent with respect to the target stream. In half of the trials, the participants were presented with auditory stimuli at 75 dB(A) while in the other half of the trials they were presented with auditory stimuli at 82 dB(A). Participants' performance on both tasks was significantly affected by the intensity of the sounds. Namely, the crossmodal capture of tactile motion by audition was stronger with the more intense (vs. less intense) auditory distractors (Experiment 1), whereas the capture effect exerted by the tactile distractors was stronger for less intense (than for more intense) auditory targets (Experiment 2). The crossmodal dynamic capture was larger in Experiment 1 than in Experiment 2, with a stronger congruency effect when the target streams were presented in the tactile (vs. auditory) modality. Two explanations are put forward to account for these results: an attentional biasing toward the more intense auditory stimuli, and a modulation induced by the relative perceptual weight of, respectively, the auditory and the tactile signals.
Affiliation(s)
- Valeria Occelli
- Department of Cognitive Sciences and Education, University of Trento, Rovereto, TN, Italy.
12
Bruns P, Getzmann S. Audiovisual influences on the perception of visual apparent motion: exploring the effect of a single sound. Acta Psychol (Amst) 2008; 129:273-83. [PMID: 18790468] [DOI: 10.1016/j.actpsy.2008.08.002]
Abstract
Previous research has shown that irrelevant sounds can facilitate the perception of visual apparent motion. Here, the effectiveness of a single sound in facilitating motion perception was investigated in three experiments. Observers were presented with two discrete lights temporally separated by stimulus onset asynchronies from 0 to 350 ms. After each trial, observers classified their impression of the stimuli using a categorisation system. A short sound presented temporally (and spatially) midway between the lights facilitated the impression of motion relative to baseline (lights without sound), whereas a sound presented either before the first light, after the second light, or simultaneously with the lights did not affect the motion impression. The facilitation effect also occurred with the sound presented far from the visual display, as well as with a continuous sound that started with the first light and terminated with the second. No facilitation of visual motion perception occurred if the sound was part of a tone sequence that allowed for intramodal perceptual grouping of the auditory stimuli prior to the critical audiovisual stimuli. Taken together, the findings are consistent with a low-level audiovisual integration approach in which the perceptual system merges temporally proximate sound and light stimuli, thereby provoking the impression of a single multimodal moving object.
Affiliation(s)
- Patrick Bruns
- Biological Psychology and Neuropsychology, University of Hamburg, Von-Melle-Park 11, D-20146 Hamburg, Germany.
13
Zvyagintsev M, Nikolaev AR, Mathiak KA, Menning H, Hertrich I, Mathiak K. Predictability modulates motor-auditory interactions in self-triggered audio-visual apparent motion. Exp Brain Res 2008; 189:289-300. [PMID: 18500638] [DOI: 10.1007/s00221-008-1423-8]
Abstract
We studied the effect of predictability in an audio-visual apparent motion task using magnetoencephalography. Synchronous sequences of audio-visual stimuli were self-triggered by the subjects. The task was to detect the direction of the apparent motion in experimental blocks in which the motion either started from the side selected by the subject (predictable condition) or from a random side (unpredictable condition). The magnetic fields yielded three patterns of activity, in the motor, auditory, and visual areas. Comparison of dipole strength between the predictable and unpredictable conditions revealed a significant difference in preparatory motor activity in the time interval from -450 to -100 ms before self-triggering of the stimulus. Perception of the audio-visual apparent motion was also modulated by predictability; however, the modulation was found only for the auditory activity, not the visual activity. The effect of predictability was selective, modulating only the auditory component N1 (100 ms after the stimulus), which reflects the initial evaluation of stimulus meaning. Importantly, the preparatory motor activity correlated with the subsequent auditory activity mainly within the same hemisphere. Similar modulation by predictability of the motor and auditory activities suggests interactions between these two systems within an action-perception cycle. The mechanism of these interactions can be understood as an effect of anticipation of one's own action outcomes on preparatory motor and perceptual activity.
Affiliation(s)
- Mikhail Zvyagintsev
- Department of Psychiatry and Psychotherapy, RWTH Aachen, Pauwelsstr. 30, 52074 Aachen, Germany.
14
Soto-Faraco S, Kingstone A, Spence C. Integrating motion information across sensory modalities: the role of top-down factors. Prog Brain Res 2007; 155:273-86. [PMID: 17027394] [DOI: 10.1016/s0079-6123(06)55016-2]
Abstract
Recent studies have highlighted the influence of multisensory integration mechanisms in the processing of motion information. One central issue in this research area concerns the extent to which the behavioral correlates of these effects can be attributed to late post-perceptual (i.e., response-related or decisional) processes rather than to perceptual mechanisms of multisensory binding. We investigated the influence of various top-down factors on the phenomenon of crossmodal dynamic capture, whereby the direction of motion in one sensory modality (audition) is strongly influenced by motion presented in another sensory modality (vision). In Experiment 1, we introduced extensive feedback in order to manipulate the motivation level of participants and the extent of their practice with the task. In Experiment 2, we reduced the variability of the irrelevant (visual) distractor stimulus by making its direction predictable beforehand. In Experiment 3, we investigated the effects of changing the stimulus-response mapping (task). None of these manipulations exerted any noticeable influence on the overall pattern of crossmodal dynamic capture that was observed. We therefore conclude that the integration of multisensory motion cues is robust to a number of top-down influences, thereby revealing that the crossmodal dynamic capture effect reflects the relatively automatic integration of multisensory motion information.
Affiliation(s)
- Salvador Soto-Faraco
- ICREA and Parc Científic de Barcelona - Universitat de Barcelona, Barcelona, Spain.
15
Bensmaïa SJ, Killebrew JH, Craig JC. Influence of visual motion on tactile motion perception. J Neurophysiol 2006; 96:1625-37. [PMID: 16723415] [PMCID: PMC1839045] [DOI: 10.1152/jn.00192.2006]
Abstract
Subjects were presented with pairs of tactile drifting sinusoids and made speed discrimination judgments. On some trials, a visual drifting sinusoid, which subjects were instructed to ignore, was presented simultaneously with one of the two tactile stimuli. When the visual and tactile gratings drifted in the same direction (i.e., from left to right), the visual distractors were found to increase the perceived speed of the tactile gratings. The effect of the visual distractors was proportional to their temporal frequency but not to their perceived speed. When the visual and tactile gratings drifted in opposite directions, the distracting effect of the visual distractors was either substantially reduced or, in some cases, reversed (i.e., the distractors slowed the perceived speed of the tactile gratings). This result suggests that the observed visual-tactile interaction is dependent on motion and not simply on the oscillations inherent in drifting sinusoids. Finally, we find that disrupting the temporal synchrony between the visual and tactile stimuli eliminates the distracting effect of the visual stimulus. We interpret this latter finding as evidence that the observed visual-tactile interaction operates at the sensory level and does not simply reflect a response bias.
Affiliation(s)
- S. J. Bensmaïa
- Department of Neuroscience and Krieger Mind/Brain Institute, Johns Hopkins University, Baltimore, Maryland
- J. H. Killebrew
- Krieger Mind/Brain Institute, Johns Hopkins University, Baltimore, Maryland; and
| | - J. C. Craig
- Department of Psychological and Brain Sciences, Indiana University, Bloomington, Indiana
16
Sanabria D, Soto-Faraco S, Spence C. Assessing the effect of visual and tactile distractors on the perception of auditory apparent motion. Exp Brain Res 2005; 166:548-58. [PMID: 16132965] [DOI: 10.1007/s00221-005-2395-6]
Abstract
In this study we investigated the effect of the directional congruency of tactile, visual, or bimodal visuotactile apparent motion distractors on the perception of auditory apparent motion. Participants had to judge the direction in which an auditory apparent motion stream moved (left-to-right or right-to-left) while trying to ignore one of a range of distractor stimuli, including unimodal tactile or visual, bimodal visuotactile, and crossmodal (i.e., composed of one visual and one tactile stimulus) distractors. Significant crossmodal dynamic capture effects (i.e., better performance when the target and distractor stimuli moved in the same direction rather than in opposite directions) were demonstrated in all conditions. Bimodal distractors elicited more crossmodal dynamic capture than unimodal distractors, thus providing the first empirical demonstration of the effect of information presented simultaneously in two irrelevant sensory modalities on the perception of motion in a third (target) sensory modality. The results of a second experiment demonstrated that the capture effect reported in the crossmodal distractor condition was most probably attributable to the combined effect of the individual static distractors (i.e., to ventriloquism) rather than to any emergent property of crossmodal apparent motion.
Affiliation(s)
- Daniel Sanabria
- Department of Experimental Psychology, University of Oxford, South Parks Road, Oxford, OX1 3UD, UK
17
Sanabria D, Soto-Faraco S, Spence C. Spatiotemporal interactions between audition and touch depend on hand posture. Exp Brain Res 2005; 165:505-14. [PMID: 15942735] [DOI: 10.1007/s00221-005-2327-5]
Abstract
We report two experiments designed to assess the consequences of posture change on audiotactile spatiotemporal interactions. In Experiment 1, participants had to discriminate the direction of an auditory stream (consisting of the sequential presentation of two tones from different spatial positions) while attempting to ignore a task-irrelevant tactile stream (consisting of the sequential presentation of two vibrations, one to each of the participant's hands). The tactile stream presented to the participants' hands was either spatiotemporally congruent or incongruent with respect to the sounds. A significant decrease in performance in incongruent trials compared with congruent trials was demonstrated when the participants adopted an uncrossed-hands posture but not when their hands were crossed over the midline. In Experiment 2, we investigated the ability of participants to discriminate the direction of two sequentially presented tactile stimuli (one presented to each hand) as a function of the presence of congruent vs incongruent auditory distractors. Here, the crossmodal effect was stronger in the crossed-hands posture than in the uncrossed-hands posture. These results demonstrate the reciprocal nature of audiotactile interactions in spatiotemporal processing, and highlight the important role played by body posture in modulating such crossmodal interactions.
Affiliation(s)
- Daniel Sanabria
- Department of Experimental Psychology, University of Oxford, South Parks Road, Oxford, OX1 3UD, UK.