1
Zhao S, Brown CA, Holt LL, Dick F. Robust and Efficient Online Auditory Psychophysics. Trends Hear 2022; 26:23312165221118792. PMID: 36131515; PMCID: PMC9500270; DOI: 10.1177/23312165221118792.
Abstract
Most human auditory psychophysics research has historically been conducted in carefully controlled environments with calibrated audio equipment, and over potentially hours of repetitive testing with expert listeners. Here, we operationally define such conditions as having high 'auditory hygiene'. From this perspective, conducting auditory psychophysical paradigms online presents a serious challenge, in that results may hinge on absolute sound presentation level, reliably estimated perceptual thresholds, low and controlled background noise levels, and sustained motivation and attention. We introduce a set of procedures that address these challenges and facilitate auditory hygiene for online auditory psychophysics. First, we establish a simple means of setting sound presentation levels. Across a set of four level-setting conditions conducted in person, we demonstrate the stability and robustness of this level setting procedure in open air and controlled settings. Second, we test participants' tone-in-noise thresholds using widely adopted online experiment platforms and demonstrate that reliable threshold estimates can be derived online in approximately one minute of testing. Third, using these level and threshold setting procedures to establish participant-specific stimulus conditions, we show that an online implementation of the classic probe-signal paradigm can be used to demonstrate frequency-selective attention on an individual-participant basis, using a third of the trials used in recent in-lab experiments. Finally, we show how threshold and attentional measures relate to well-validated assays of online participants' in-task motivation, fatigue, and confidence. This demonstrates the promise of online auditory psychophysics for addressing new auditory perception and neuroscience questions quickly, efficiently, and with more diverse samples. Code for the tests is publicly available through Pavlovia and Gorilla.
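The rapid tone-in-noise thresholding described above can be illustrated with a generic 2-down/1-up adaptive staircase, a standard procedure that converges near the 70.7%-correct point. This is a sketch, not the authors' published code; the function name, parameters, and the simulated listener below are illustrative assumptions.

```python
def two_down_one_up(trial_fn, start_level=70.0, step=4.0, n_reversals=8):
    """Generic 2-down/1-up staircase for rapid threshold estimation.

    trial_fn(level) -> True if the listener detected the tone at `level`
    (e.g. dB SNR). Two correct responses in a row make the next trial
    harder; one error makes it easier. The threshold estimate is the
    mean level at the final reversals.
    """
    level, streak, direction = start_level, 0, 0
    reversals = []
    while len(reversals) < n_reversals:
        if trial_fn(level):
            streak += 1
            if streak == 2:            # two correct in a row -> harder
                streak = 0
                if direction == +1:    # track was rising: record a reversal
                    reversals.append(level)
                direction = -1
                level -= step
        else:                          # one error -> easier
            streak = 0
            if direction == -1:        # track was falling: record a reversal
                reversals.append(level)
            direction = +1
            level += step
    tail = reversals[n_reversals // 2:]  # discard early reversals
    return sum(tail) / len(tail)
```

With a simulated deterministic listener who detects any tone at or above 60 dB SNR, `two_down_one_up(lambda level: level >= 60.0)` oscillates between 58 and 62 and returns 60.0.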
Affiliation(s)
- Sijia Zhao
- Department of Experimental Psychology, University of Oxford, Oxford, UK
- Christopher A. Brown
- Department of Communication Science and Disorders, University of Pittsburgh, Pittsburgh, PA, USA
- Lori L. Holt
- Department of Psychology, Carnegie Mellon University, Pittsburgh, PA, USA
- Neuroscience Institute, Carnegie Mellon University, Pittsburgh, PA, USA
- Frederic Dick
- Department of Psychological Sciences, Birkbeck College, University of London, London, UK
- Department of Experimental Psychology, PALS, University College London, London, UK
2
Nees MA. Have We Forgotten Auditory Sensory Memory? Retention Intervals in Studies of Nonverbal Auditory Working Memory. Front Psychol 2016; 7:1892. PMID: 27994565; PMCID: PMC5133429; DOI: 10.3389/fpsyg.2016.01892.
Abstract
Researchers have shown increased interest in mechanisms of working memory for nonverbal sounds such as music and environmental sounds. These studies often have used two-stimulus comparison tasks: two sounds separated by a brief retention interval (often 3-5 s) are compared, and a "same" or "different" judgment is recorded. Researchers seem to have assumed that sensory memory has a negligible impact on performance in auditory two-stimulus comparison tasks. This assumption is examined in detail in this comment. According to seminal texts and recent research reports, sensory memory persists in parallel with working memory for a period of time following hearing a stimulus and can influence behavioral responses on memory tasks. Unlike verbal working memory studies that use serial recall tasks, research paradigms for exploring nonverbal working memory-especially two-stimulus comparison tasks-may not be differentiating working memory from sensory memory processes in analyses of behavioral responses, because retention interval durations have not excluded the possibility that the sensory memory trace drives task performance. This conflation of different constructs may be one contributor to discrepant research findings and the resulting proliferation of theoretical conjectures regarding mechanisms of working memory for nonverbal sounds.
3
Petsas T, Harrison J, Kashino M, Furukawa S, Chait M. The effect of distraction on change detection in crowded acoustic scenes. Hear Res 2016; 341:179-189. PMID: 27598040; PMCID: PMC5090045; DOI: 10.1016/j.heares.2016.08.015.
Abstract
In this series of behavioural experiments we investigated the effect of distraction on the maintenance of acoustic scene information in short-term memory. Stimuli were artificial acoustic 'scenes' composed of several (up to twelve) concurrent tone-pip streams ('sources'). A gap (1000 ms) was inserted partway through each scene; in 50% of trials, a change, the appearance of a new source or the disappearance of an existing source, occurred after the gap. Listeners were instructed to monitor the unfolding 'soundscapes' for these events. Distraction was measured by presenting distractor stimuli during the gap. Experiment 1 used a dual-task design in which listeners performed a task with varying attentional demands ('High Demand' vs. 'Low Demand') on brief auditory (Experiment 1a) or visual (Experiment 1b) signals presented during the gap. Experiments 2 and 3 required participants to ignore the distractor sounds and focus on the change detection task. Our results demonstrate that the maintenance of scene information in short-term memory is influenced by the availability of attentional and/or processing resources during the gap, and that this dependence appears to be modality specific. We also show that these processes are susceptible to bottom-up driven distraction even when the distractors are not novel but occur on every trial. Change detection performance is systematically linked with the independently determined perceptual salience of the distractor sound. The findings also suggest that the present task may be a useful objective means of determining relative perceptual salience.
Highlights:
- Distraction is measured by presenting distractor stimuli during a scene gap.
- Scene maintenance in memory depends on the availability of resources during the gap.
- This dependence appears to be modality specific.
- Scene maintenance is also prone to bottom-up distraction even when distractors are not novel.
- Performance depends on the perceptual salience of the distractor sound.
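In a yes/no design like this one, where a change occurs on 50% of trials, detection performance is conventionally summarized with the sensitivity index d′. A minimal sketch (not code from the paper; the log-linear correction used here is one common choice for handling perfect hit or false-alarm rates):

```python
from statistics import NormalDist

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Sensitivity index d' for a yes/no change-detection task.

    Applies a log-linear correction (add 0.5 to each count) so the
    z-transform stays finite when a rate would be 0 or 1.
    """
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    return z(hit_rate) - z(fa_rate)
```

Chance performance (equal hit and false-alarm rates) yields d′ = 0; more salient changes push d′ higher.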
Affiliation(s)
- Makio Kashino
- Human Information Science Laboratory, NTT Communication Science Laboratories, NTT Corporation, 3-1, Morinosato-Wakamiya, Atsugi-shi, Kanagawa, Japan
- Shigeto Furukawa
- Human Information Science Laboratory, NTT Communication Science Laboratories, NTT Corporation, 3-1, Morinosato-Wakamiya, Atsugi-shi, Kanagawa, Japan
- Maria Chait
- UCL Ear Institute, 332 Gray's Inn Rd, London, UK.
4
Zimmermann JF, Moscovitch M, Alain C. Attending to auditory memory. Brain Res 2015; 1640:208-21. PMID: 26638836; DOI: 10.1016/j.brainres.2015.11.032.
Abstract
Attention to memory describes the process of attending to memory traces when the object is no longer present. It has been studied primarily for representations of visual stimuli, with only a few studies examining attention to sound object representations in short-term memory. Here, we review the interplay of attention and auditory memory with an emphasis on 1) attending to auditory memory in the absence of related external stimuli (i.e., reflective attention) and 2) effects of existing memory on guiding attention. Attention to auditory memory is discussed in the context of change deafness, and we argue that failures to detect changes in our auditory environments are most likely the result of a faulty comparison of incoming and stored information. Objects are the primary building blocks of auditory attention, but attention can also be directed to individual features (e.g., pitch). We review short-term and long-term memory guided modulation of attention based on characteristic features, location, and/or semantic properties of auditory objects, and propose that auditory attention-to-memory pathways emerge after sensory memory. A neural model for auditory attention to memory is developed, which comprises two separate pathways in the parietal cortex, one involved in attention to higher-order features and the other involved in attention to sensory information. This article is part of a Special Issue entitled SI: Auditory working memory.
Affiliation(s)
- Jacqueline F Zimmermann
- University of Toronto, Department of Psychology, Sidney Smith Hall, 100 St. George Street, Toronto, Ontario, Canada M5S 3G3; Rotman Research Institute, Baycrest Hospital, 3560 Bathurst Street, Toronto, Ontario, Canada M6A 2E1.
- Morris Moscovitch
- University of Toronto, Department of Psychology, Sidney Smith Hall, 100 St. George Street, Toronto, Ontario, Canada M5S 3G3; Rotman Research Institute, Baycrest Hospital, 3560 Bathurst Street, Toronto, Ontario, Canada M6A 2E1
- Claude Alain
- University of Toronto, Department of Psychology, Sidney Smith Hall, 100 St. George Street, Toronto, Ontario, Canada M5S 3G3; Rotman Research Institute, Baycrest Hospital, 3560 Bathurst Street, Toronto, Ontario, Canada M6A 2E1; Institute of Medical Sciences, University of Toronto, Toronto, Ontario, Canada
5
McLachlan N. A neurocognitive model of recognition and pitch segregation. J Acoust Soc Am 2011; 130:2845-2854. PMID: 22087913; DOI: 10.1121/1.3643082.
Abstract
This paper describes a neurocognitive model of pitch segregation in which it is proposed that recognition mechanisms initiate early in auditory processing pathways, so that long-term memory templates may be employed to segregate and integrate auditory features. In this model, neural representations of pitch height are primed by the location and pattern of excitation across auditory filter channels in relation to long-term memory templates for common stimuli. Since waveform-driven pitch mechanisms may produce information at multiple frequencies for tonal stimuli, pitch priming was assumed to include competitive inhibition that would allow only one pitch estimate at any time. Consequently, concurrent pitch information must be relayed to short-term memory via a parallel mechanism that employs pitch information contained in the long-term memory template of the chord. Pure tones, harmonic complexes, and two-pitch chords of harmonic complexes were correctly classified by correlating templates (comprising auditory nerve excitation and off-frequency inhibition) with the excitation patterns of the stimuli. The model then replicated behavioral data for pitch matching of concurrent vowels. Comparison of model outputs to the behavioral data suggests that an inability to recognize a stimulus was associated with poor pitch segregation due to the use of inappropriate pitch priming strategies.
Affiliation(s)
- Neil McLachlan
- Centre for Music, Mind and Wellbeing, School of Psychological Sciences, The University of Melbourne, Parkville, 3010, Victoria, Australia.
6
Sedda A, Monaco S, Bottini G, Goodale MA. Integration of visual and auditory information for hand actions: preliminary evidence for the contribution of natural sounds to grasping. Exp Brain Res 2011; 209:365-74. PMID: 21290243; DOI: 10.1007/s00221-011-2559-5.
Abstract
When we reach out to grasp objects, vision plays a major role in the control of our movements. Nevertheless, other sensory modalities contribute to the fine-tuning of our actions. Even olfaction has been shown to play a role in the scaling of movements directed at objects. Much less is known about how auditory information might be used to program grasping movements. The aim of our study was to investigate how the sound of a target object affects the planning of grasping movements in normal right-handed subjects. We performed an experiment in which auditory information could be used to infer the size of targets while the availability of visual information was varied from trial to trial. Classical kinematic parameters (such as grip aperture) were measured to evaluate the influence of auditory information. In addition, an optimal-inference model was applied to the data. The scaling of grip aperture indicated that the introduction of sound allowed subjects to infer the size of the object when vision was not available. Moreover, auditory information affected grip aperture even when vision was available. Our findings suggest that the differences in the natural impact sounds of objects of different sizes being placed on a surface can be used to plan grasping movements.
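The "optimal inference" analysis mentioned above is in the spirit of standard maximum-likelihood cue combination, in which each cue is weighted by its inverse variance. A minimal sketch under that assumption (not the authors' actual model; the function name and example values are illustrative):

```python
def combine_cues(estimates, variances):
    """Maximum-likelihood combination of independent Gaussian cues,
    e.g. a visual and an auditory estimate of object size.

    Each cue is weighted by its inverse variance (its reliability);
    returns (combined_estimate, combined_variance). The combined
    variance is always smaller than that of the best single cue.
    """
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    combined = sum(w * s for w, s in zip(weights, estimates)) / total
    return combined, 1.0 / total
```

For example, `combine_cues([10.0, 14.0], [1.0, 4.0])` pulls the combined size estimate toward the first, more reliable cue.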
Affiliation(s)
- Anna Sedda
- Department of Psychology, University of Pavia, Piazza Botta 6, 27100 Pavia, Italy.
7
McKeown D, Mills R, Mercer T. Comparisons of Complex Sounds across Extended Retention Intervals Survives Reading Aloud. Perception 2011; 40:1193-205. DOI: 10.1068/p6988.
Abstract
A simple experimental arrangement is designed to foil verbal rehearsal during an extended (from 5 to 30 s) retention interval across which participants attempt to discriminate two periodic complex sounds. Sounds have an abstract timbre that does not lend itself to verbal labeling, they differ across trials so that no ‘standard’ comparison stimulus is built up by the participants, and the spectral change to be discriminated is very slight and therefore does not shift the stimulus into a new verbal category. And, crucially, in one experimental condition, participants read aloud during most of the retention interval. Despite these precautions, performance is robust across the extended retention interval. The inference is that one form of auditory memory does not require verbal rehearsal. Nevertheless, modest forgetting occurred. Whatever form memory takes in this situation, it is not totally secure from disruption.
Affiliation(s)
- Tom Mercer
- Division of Psychology, School of Applied Sciences, University of Wolverhampton, Wolverhampton WV1 1LY, UK
8
Updating and feature overwriting in short-term memory for timbre. Atten Percept Psychophys 2010; 72:2289-303. PMID: 21097870; DOI: 10.3758/bf03196702.