1
Kuroki D, Pronk T. jsQuestPlus: A JavaScript implementation of the QUEST+ method for estimating psychometric function parameters in online experiments. Behav Res Methods 2023; 55:3179-3186. PMID: 36070128; PMCID: PMC9450820; DOI: 10.3758/s13428-022-01948-8.
Abstract
The two Bayesian adaptive psychometric methods QUEST (Watson & Pelli, 1983) and QUEST+ (Watson, 2017) are widely used to estimate psychometric parameters, especially the threshold, in laboratory-based psychophysical experiments. Considering the increase in online psychophysical experiments in recent years, there is a growing need to have the QUEST and QUEST+ methods available online as well. We developed JavaScript libraries for both; this article introduces one of them: jsQuestPlus. We offer integrations with online experimental tools such as jsPsych (de Leeuw, 2015), PsychoPy/JS (Peirce et al., 2019), and lab.js (Henninger et al., 2021). We measured the computation time required by jsQuestPlus under four conditions. Our simulations on 37 browser-computer combinations showed that the mean initialization time was 461.08 ms, 95% CI [328.29, 593.87], the mean computation time required to determine the stimulus parameters for the next trial was less than 1 ms, and the mean update time was 79.39 ms, 95% CI [46.22, 112.55], even in extremely demanding conditions. Additionally, jsQuestPlus estimated psychometric parameters as accurately as the original QUEST+ implementation. We conclude that jsQuestPlus is fast and accurate enough to conduct online psychophysical experiments despite the complexity of the matrix calculations involved. The latest version of jsQuestPlus can be downloaded freely from https://github.com/kurokida/jsQuestPlus under the MIT license.
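The QUEST+ procedure summarized in the abstract can be sketched in JavaScript. The following is an illustrative re-implementation of the method's core loop only (Bayesian posterior update plus entropy-minimizing stimulus selection), not the jsQuestPlus API; the parameter grid, the fixed Weibull slope, guess and lapse rates, and all function names are assumptions made for the sketch.

```javascript
// Illustrative QUEST+ sketch: estimate a threshold by choosing, on each
// trial, the stimulus that minimizes the expected posterior entropy,
// then updating the posterior over thresholds with Bayes' rule.

// Hypothetical candidate thresholds (dB) with a uniform prior.
const thresholds = Array.from({ length: 41 }, (_, i) => -20 + i); // -20..20 dB
let posterior = thresholds.map(() => 1 / thresholds.length);

// Candidate stimulus intensities (dB).
const stimuli = Array.from({ length: 41 }, (_, i) => -20 + i);

// Probability of a correct response given stimulus x and threshold t:
// Weibull psychometric function with fixed slope, guess rate, and lapse rate.
function pCorrect(x, t) {
  const guess = 0.5, lapse = 0.02, slope = 3.5;
  return guess + (1 - guess - lapse) *
         (1 - Math.exp(-Math.pow(10, slope * (x - t) / 20)));
}

// Shannon entropy (bits) of a discrete distribution.
function entropy(p) {
  return -p.reduce((h, pi) => h + (pi > 0 ? pi * Math.log2(pi) : 0), 0);
}

// Posterior after observing response r (1 = correct, 0 = incorrect) to stimulus x.
function update(prior, x, r) {
  const unnorm = prior.map((p, i) => {
    const pc = pCorrect(x, thresholds[i]);
    return p * (r === 1 ? pc : 1 - pc);
  });
  const z = unnorm.reduce((a, b) => a + b, 0);
  return unnorm.map(p => p / z);
}

// Pick the stimulus whose expected posterior entropy is smallest.
function nextStimulus(prior) {
  let best = stimuli[0], bestH = Infinity;
  for (const x of stimuli) {
    const pC = prior.reduce((s, p, i) => s + p * pCorrect(x, thresholds[i]), 0);
    const h = pC * entropy(update(prior, x, 1)) +
              (1 - pC) * entropy(update(prior, x, 0));
    if (h < bestH) { bestH = h; best = x; }
  }
  return best;
}
```

In a trial loop one would call `nextStimulus(posterior)`, present that intensity, and replace `posterior` with `update(posterior, x, r)`; the library additionally precomputes likelihoods over multi-parameter grids, which is where the initialization cost reported above comes from.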
Affiliation(s)
- Daiichiro Kuroki
- Department of Psychology, School of Letters, Kyushu University, 744 Motooka, Nishi-ku, Fukuoka, 819-0395, Japan.
- Thomas Pronk
- Behavioural Science Lab, Faculty of Social and Behavioural Sciences, University of Amsterdam, Amsterdam, Netherlands
2
Schramm M, Goregliad Fjaellingsdal T, Aslan B, Jung P, Lux S, Schulze M, Philipsen A. Electrophysiological evidence for increased auditory crossmodal activity in adult ADHD. Front Neurosci 2023; 17:1227767. PMID: 37706153; PMCID: PMC10495991; DOI: 10.3389/fnins.2023.1227767. Open access.
Abstract
Background: Attention deficit and hyperactivity disorder (ADHD) is a neurodevelopmental disorder characterized by core symptoms of inattention and/or impulsivity and hyperactivity. To understand the basis of this multifaceted disorder, the investigation of sensory processing aberrancies has recently attracted growing interest. For example, during the processing of auditory stimuli, comparably low sensory thresholds may account for symptoms such as higher distractibility and auditory hypersensitivity in patients with ADHD. It has further been shown that deficiencies exist not only on an intramodal but also on a multimodal level. There is evidence that the visual cortex shows more activation during a focused auditory task in adults with ADHD than in healthy controls. This crossmodal activation is interpreted as the reallocation of additional attentional resources to the visual domain as well as deficient sensory inhibition. In this study, we used, for the first time, electroencephalography (EEG) to identify potentially abnormally regulated crossmodal activation in adult ADHD. Methods: Fifteen adult subjects with clinically diagnosed ADHD and 14 healthy controls comparable in age and gender were included. The ERP components P50, P100, N100, P200, and N200 were measured during the performance of unimodal auditory and visual discrimination tasks in a block design. Sensory profiles and ADHD symptoms were assessed with inattention as well as childhood ADHD scores. For evaluating intramodal and crossmodal activations, we chose four EEG channels for statistical analysis and group-wise comparison. Results: At the occipital channel O2, which reflects possible crossmodal activations, a significantly enhanced P200 amplitude was measured in the patient group. At the intramodal channels, a significantly enhanced N200 amplitude was observed in the control group. Statistical analysis of behavioral data showed poorer performance of subjects with ADHD as well as higher discrimination thresholds. Further, correlating the assessed sensory profiles with the EEG parameters revealed a negative correlation between the P200 component and sensation-seeking behavior. Conclusion: Our findings show increased auditory crossmodal activity that might reflect altered allocation of stimulus processing resources in ADHD, which may have consequences for later, higher-order attentional deployment. Further, the enhanced P200 amplitude might reflect increased sensory registration and therefore deficient inhibition mechanisms in adults with ADHD.
Affiliation(s)
- Mia Schramm
- Department of Psychiatry and Psychotherapy, University of Bonn, Bonn, Germany
- Tatiana Goregliad Fjaellingsdal
- Department of Neurology, University of Lübeck, Lübeck, Germany
- Department of Psychology, University of Lübeck, Lübeck, Germany
- Center of Brain, Behavior and Metabolism (CBBM), University of Lübeck, Lübeck, Germany
- Behrem Aslan
- Department of Psychiatry and Psychotherapy, University of Bonn, Bonn, Germany
- Paul Jung
- Department of Psychiatry and Psychotherapy, University of Bonn, Bonn, Germany
- Silke Lux
- Department of Psychiatry and Psychotherapy, University of Bonn, Bonn, Germany
- Marcel Schulze
- Department of Psychiatry and Psychotherapy, University of Bonn, Bonn, Germany
- Alexandra Philipsen
- Department of Psychiatry and Psychotherapy, University of Bonn, Bonn, Germany
3
Orf M, Wöstmann M, Hannemann R, Obleser J. Target enhancement but not distractor suppression in auditory neural tracking during continuous speech. iScience 2023; 26:106849. PMID: 37305701; PMCID: PMC10251127; DOI: 10.1016/j.isci.2023.106849. Open access.
Abstract
Selective attention modulates the neural tracking of speech in auditory cortical regions. It is unclear whether this attentional modulation is dominated by enhanced target tracking, or suppression of distraction. To settle this long-standing debate, we employed an augmented electroencephalography (EEG) speech-tracking paradigm with target, distractor, and neutral streams. Concurrent target speech and distractor (i.e., sometimes relevant) speech were juxtaposed with a third, never task-relevant speech stream serving as neutral baseline. Listeners had to detect short target repeats and committed more false alarms originating from the distractor than from the neutral stream. Speech tracking revealed target enhancement but no distractor suppression below the neutral baseline. Speech tracking of the target (not distractor or neutral speech) explained single-trial accuracy in repeat detection. In sum, the enhanced neural representation of target speech is specific to processes of attentional gain for behaviorally relevant target speech rather than neural suppression of distraction.
Affiliation(s)
- Martin Orf
- Department of Psychology, University of Lübeck, Lübeck, Germany
- Center of Brain, Behavior and Metabolism (CBBM), University of Lübeck, Lübeck, Germany
- Malte Wöstmann
- Department of Psychology, University of Lübeck, Lübeck, Germany
- Center of Brain, Behavior and Metabolism (CBBM), University of Lübeck, Lübeck, Germany
- Jonas Obleser
- Department of Psychology, University of Lübeck, Lübeck, Germany
- Center of Brain, Behavior and Metabolism (CBBM), University of Lübeck, Lübeck, Germany
4
Exploring the effectiveness of auditory, visual, and audio-visual sensory cues in a multiple object tracking environment. Atten Percept Psychophys 2022; 84:1611-1624. PMID: 35610410; PMCID: PMC9232473; DOI: 10.3758/s13414-022-02492-5.
Abstract
Maintaining object correspondence among multiple moving objects is an essential task of the perceptual system in many everyday activities. A substantial body of research has confirmed that observers are able to track multiple target objects amongst identical distractors based only on their spatiotemporal information. However, naturalistic tasks typically involve the integration of information from more than one modality, and there is limited research investigating whether auditory and audio-visual cues improve tracking. In two experiments, we asked participants to track either five target objects, or three versus five target objects, amongst indistinguishable distractor objects for 14 s. During the tracking interval, the target objects occasionally bounced against the boundary of a centralised orange circle. A visual cue, an auditory cue, both, or neither coincided with these collisions. Following the motion interval, participants were asked to indicate all target objects. Across both experiments and both set sizes, our results indicated that visual and auditory cues increased tracking accuracy, although visual cues were more effective than auditory cues. Audio-visual cues, however, did not increase tracking performance beyond the level of purely visual cues in either the high or the low load condition. We discuss the theoretical implications of our findings for multiple object tracking as well as for the principles of multisensory integration.
5
James LS, Baier AL, Page RA, Clements P, Hunter KL, Taylor RC, Ryan MJ. Cross-modal facilitation of auditory discrimination in a frog. Biol Lett 2022; 18:20220098. PMID: 35765810; PMCID: PMC9240679; DOI: 10.1098/rsbl.2022.0098. Open access.
Abstract
Stimulation in one sensory modality can affect perception in a separate modality, resulting in diverse effects including illusions in humans. This can also result in cross-modal facilitation, a process whereby sensory performance in one modality is improved by stimulation in another modality. For instance, a simple sound can improve performance in a visual task in both humans and cats. However, the range of contexts and underlying mechanisms that evoke such facilitation effects remain poorly understood. Here, we demonstrated cross-modal facilitation in wild-caught túngara frogs, a species with well-studied acoustic preferences in females. We first identified that a combined visual and seismic cue (vocal sac movement and water ripple) was behaviourally relevant for females choosing between two courtship calls in a phonotaxis assay. We then found that this combined cross-modal stimulus rescued a species-typical acoustic preference in the presence of background noise that otherwise abolished the preference. These results highlight how cross-modal stimulation can prime attention in receivers to improve performance during decision-making. With this, we provide the foundation for future work uncovering the processes and conditions that promote cross-modal facilitation effects.
Affiliation(s)
- Logan S. James
- Department of Integrative Biology, University of Texas, Austin, TX 78712, USA
- Smithsonian Tropical Research Institute, Apartado 0843-03092, Balboa, Ancón, Republic of Panama
- A. Leonie Baier
- Department of Integrative Biology, University of Texas, Austin, TX 78712, USA
- Smithsonian Tropical Research Institute, Apartado 0843-03092, Balboa, Ancón, Republic of Panama
- Rachel A. Page
- Smithsonian Tropical Research Institute, Apartado 0843-03092, Balboa, Ancón, Republic of Panama
- Paul Clements
- Henson School of Technology, Salisbury University, 1101 Camden Ave, Salisbury, MD 21801, USA
- Kimberly L. Hunter
- Department of Biological Sciences, Salisbury University, 1101 Camden Ave, Salisbury, MD 21801, USA
- Ryan C. Taylor
- Smithsonian Tropical Research Institute, Apartado 0843-03092, Balboa, Ancón, Republic of Panama
- Department of Biological Sciences, Salisbury University, 1101 Camden Ave, Salisbury, MD 21801, USA
- Michael J. Ryan
- Department of Integrative Biology, University of Texas, Austin, TX 78712, USA
- Smithsonian Tropical Research Institute, Apartado 0843-03092, Balboa, Ancón, Republic of Panama