1
Undurraga JA, Luke R, Van Yper L, Monaghan JJM, McAlpine D. The neural representation of an auditory spatial cue in the primate cortex. Curr Biol 2024; 34:2162-2174.e5. [PMID: 38718798] [DOI: 10.1016/j.cub.2024.04.034]
Abstract
Humans make use of small differences in the timing of sounds at the two ears, known as interaural time differences (ITDs), to locate their sources. Despite extensive investigation, however, the neural representation of ITDs in the human brain is contentious, particularly the range of ITDs explicitly represented by dedicated neural detectors. Here, using magneto- and electro-encephalography (MEG and EEG), we demonstrate evidence of a sparse neural representation of ITDs in the human cortex. The magnitude of cortical activity to sounds presented via insert earphones oscillated as a function of increasing ITD, within and beyond auditory cortical regions, and listeners rated the perceptual quality of these sounds according to the same oscillating pattern. This pattern was accurately described by a population of model neurons with preferred ITDs constrained to the narrow, sound-frequency-dependent range evident in other mammalian species. When scaled for head size, the distribution of ITD detectors in the human cortex is remarkably like that recorded in vivo from the cortex of rhesus monkeys, another large primate that uses ITDs for source localization. The data solve a long-standing issue concerning the neural representation of ITDs in humans and suggest a representation that scales for head size and sound frequency in an optimal manner.
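As a rough illustration of the sparse, range-constrained detector population this abstract describes, the toy model below sums Gaussian ITD tuning curves whose preferred ITDs are confined to roughly a quarter period of a 500 Hz tone. All parameter values (tuning width, detector count) are invented for illustration and are not taken from the study.

```python
import numpy as np

def population_response(stim_itd_us, preferred_itds_us, sigma_us=150.0):
    # Summed activity of Gaussian ITD-tuned detectors (purely illustrative).
    return float(np.exp(-0.5 * ((stim_itd_us - preferred_itds_us) / sigma_us) ** 2).sum())

# Hypothetical frequency-dependent range: preferred ITDs confined to about a
# quarter period of the stimulus tone, as reported for other mammals.
freq_hz = 500.0
quarter_period_us = 1e6 / (4.0 * freq_hz)   # 500 us at 500 Hz
preferred = np.linspace(-quarter_period_us, quarter_period_us, 21)

resp_midline = population_response(0.0, preferred)      # ITD inside the coded range
resp_remote = population_response(2000.0, preferred)    # ITD far outside it
```

Under this sketch, ITDs far beyond the coded range evoke little population activity, which is one way a "sparse" representation can manifest in aggregate MEG/EEG signals.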
Affiliation(s)
- Jaime A Undurraga
- Department of Linguistics, Macquarie University, 16 University Avenue, Sydney, NSW 2109, Australia; Interacoustics Research Unit, Technical University of Denmark, Ørsteds Plads, Building 352, 2800 Kgs. Lyngby, Denmark
- Robert Luke
- Department of Linguistics, Macquarie University, 16 University Avenue, Sydney, NSW 2109, Australia; The Bionics Institute, 384-388 Albert St., East Melbourne, VIC 3002, Australia
- Lindsey Van Yper
- Department of Linguistics, Macquarie University, 16 University Avenue, Sydney, NSW 2109, Australia; Institute of Clinical Research, University of Southern Denmark, 5230 Odense, Denmark; Research Unit for ORL, Head & Neck Surgery and Audiology, Odense University Hospital & University of Southern Denmark, 5230 Odense, Denmark
- Jessica J M Monaghan
- Department of Linguistics, Macquarie University, 16 University Avenue, Sydney, NSW 2109, Australia; National Acoustic Laboratories, Australian Hearing Hub, 16 University Avenue, Sydney, NSW 2109, Australia
- David McAlpine
- Department of Linguistics, Macquarie University, 16 University Avenue, Sydney, NSW 2109, Australia; Macquarie University Hearing and the Australian Hearing Hub, Macquarie University, 16 University Avenue, Sydney, NSW 2109, Australia
2
Montoya S, Badde S. Only visible flicker helps flutter: Tactile-visual integration breaks in the absence of visual awareness. Cognition 2023; 238:105528. [PMID: 37354787] [DOI: 10.1016/j.cognition.2023.105528]
Abstract
Combining information from multiple senses enhances our perception of the world. Whether we need to be aware of all stimuli to benefit from multisensory integration, however, is still under investigation. Here, we tested whether tactile frequency perception benefits from the presence of congruent visual flicker even if the flicker is so rapid that it is perceptually fused into a steady light and therefore invisible. Our participants completed a tactile frequency discrimination task given either unisensory tactile or congruent tactile-visual stimulation. Tactile and tactile-visual test frequencies ranged from far below to far above participants' flicker fusion threshold (determined separately). For frequencies distinctly below their flicker fusion threshold, participants performed significantly better given tactile-visual stimulation than when presented with only tactile stimuli. Yet, for frequencies above their flicker fusion threshold, participants' tactile frequency perception did not profit from the presence of congruent but likely fused and thus invisible visual flicker. The results matched the predictions of an ideal-observer model in which tactile-visual integration is conditional on awareness of both stimuli. In contrast, it was impossible to reproduce the observed results with a model that assumed tactile-visual integration proceeds irrespective of stimulus awareness. In sum, we revealed that the benefits of congruent visual stimulation for tactile flutter frequency perception depend on the visibility of the visual flicker, suggesting that multisensory integration requires awareness.
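The awareness-gated ideal-observer idea can be sketched as a reliability-weighted cue-combination rule that falls back to the unisensory tactile estimate when the flicker is perceptually fused. This is a minimal sketch of the gating logic, not the paper's actual model; the sigma values are hypothetical.

```python
def estimate_variance(sigma_t, sigma_v, flicker_visible):
    # Awareness-gated maximum-likelihood cue combination (illustrative):
    # if the flicker is fused into a steady light (invisible), only the
    # tactile cue contributes to the frequency estimate.
    if not flicker_visible:
        return sigma_t ** 2
    # Reliability-weighted fusion of tactile and visual frequency estimates:
    # the combined variance is always smaller than either unisensory variance.
    return 1.0 / (1.0 / sigma_t ** 2 + 1.0 / sigma_v ** 2)
```

With equal cue noise (sigma_t = sigma_v = 2.0), the gated model predicts a variance of 2.0 when the flicker is visible but 4.0 when it is fused, mirroring the reported benefit below, and its absence above, the flicker fusion threshold.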
Affiliation(s)
- Sofia Montoya
- Department of Psychology, Tufts University, 490 Boston Avenue, 02155 Medford, MA, USA
- Stephanie Badde
- Department of Psychology, Tufts University, 490 Boston Avenue, 02155 Medford, MA, USA
3
Villalonga MB, Sekuler R. Keep your finger on the pulse: Better rate perception and gap detection with vibrotactile compared to visual stimuli. Atten Percept Psychophys 2023; 85:2004-2017. [PMID: 37587355] [PMCID: PMC10545646] [DOI: 10.3758/s13414-023-02736-y]
Abstract
Important characteristics of the environment can be represented in the temporal pattern of sensory stimulation. In two experiments, we compared accuracy of temporal processing by different modalities. Experiment 1 examined binary categorization of rate for visual (V) or vibrotactile (T) stimulus pulses presented at either 4 or 6 Hz. Inter-pulse intervals were either constant or variable, perturbed by random Gaussian variates. Subjects categorized the rate of T pulse sequences more accurately than V sequences. In V conditions only, subjects disproportionately tended to mis-categorize 4-Hz pulse rates, for all but the most variable sequences. In Experiment 2, we compared gap detection thresholds across modalities, using the same V and T pulses from Experiment 1, as well as with bimodal (VT) pulses. Visual gap detection thresholds were approximately three times larger than tactile thresholds. Additionally, performance with VT stimuli seemed to be nearly completely dominated by their T components. Together, these results suggest (i) that vibrotactile temporal acuity surpasses visual temporal acuity, and (ii) that vibrotactile stimulation has considerable, untapped potential to convey temporal information like that needed for eyes-free alerting signals.
Affiliation(s)
- Robert Sekuler
- Department of Psychology, Brandeis University, Waltham, MA, USA
- Program in Neuroscience, Brandeis University, Waltham, MA, USA
4
Macklin AS, Yau JM, Fischer-Baum S, O'Malley MK. Representational Similarity Analysis for Tracking Neural Correlates of Haptic Learning on a Multimodal Device. IEEE Trans Haptics 2023; 16:424-435. [PMID: 37556331] [PMCID: PMC10605963] [DOI: 10.1109/toh.2023.3303838]
Abstract
A goal of wearable haptic devices has been to enable haptic communication, where individuals learn to map information typically processed visually or aurally to haptic cues via a process of cross-modal associative learning. Neural correlates have been used to evaluate haptic perception and may provide a more objective approach to assess association performance than more commonly used behavioral measures of performance. In this article, we examine Representational Similarity Analysis (RSA) of electroencephalography (EEG) as a framework to evaluate how the neural representation of multifeatured haptic cues changes with association training. We focus on the first phase of cross-modal associative learning, perception of multimodal cues. A participant learned to map phonemes to multimodal haptic cues, and EEG data were acquired before and after training to create neural representational spaces that were compared to theoretical models. Our perceptual model showed better correlations to the neural representational space before training, while the feature-based model showed better correlations with the post-training data. These results suggest that training may lead to a sharpening of the sensory response to haptic cues. Our results show promise that an EEG-RSA approach can capture a shift in the representational space of cues, as a means to track haptic learning.
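The core RSA comparison step described here reduces to correlating the upper triangles of a neural representational dissimilarity matrix (RDM) and a model RDM. The sketch below assumes a rank (Spearman) correlation with no tied dissimilarities, for brevity; the toy RDM values are invented and the paper's actual pipeline may differ.

```python
import numpy as np

def rdm_correlation(rdm_a, rdm_b):
    # Spearman correlation between the upper triangles of two representational
    # dissimilarity matrices (assumes no tied dissimilarities, for brevity).
    iu = np.triu_indices_from(rdm_a, k=1)
    ra = np.argsort(np.argsort(rdm_a[iu])).astype(float)  # ranks of dissimilarities
    rb = np.argsort(np.argsort(rdm_b[iu])).astype(float)
    ra -= ra.mean()
    rb -= rb.mean()
    return float((ra * rb).sum() / np.sqrt((ra ** 2).sum() * (rb ** 2).sum()))

# Toy 3-condition RDMs: a "neural" RDM and a candidate model RDM whose
# dissimilarities happen to share the same rank order.
neural = np.array([[0.0, 0.2, 0.8],
                   [0.2, 0.0, 0.5],
                   [0.8, 0.5, 0.0]])
model = np.array([[0.0, 0.1, 0.9],
                  [0.1, 0.0, 0.4],
                  [0.9, 0.4, 0.0]])
```

Comparing the neural RDM against several candidate model RDMs (e.g., a perceptual model versus a feature-based model) and tracking which correlation wins before versus after training is the logic the abstract describes.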
5
Yi L, Sekuler R. Audiovisual interaction with rate-varying signals. Iperception 2022; 13:20416695221116653. [PMID: 36467124] [PMCID: PMC9716610] [DOI: 10.1177/20416695221116653]
Abstract
A task-irrelevant, amplitude-modulating sound influences perception of a size-modulating visual stimulus. To probe the limits of this audiovisual interaction we vary the second temporal derivative of object size and of sound amplitude. In the study's first phase subjects see a visual stimulus size-modulating with f″(x) > 0, = 0, or < 0, and judge each one's rate as increasing, constant, or decreasing. Visual stimuli are accompanied by a steady, non-modulated auditory stimulus. The novel combination of multiple stimuli and multi-alternative responses allows subjects' similarity space to be estimated from the stimulus-response confusion matrix. In the study's second phase, rate-varying visual stimuli are presented in concert with auditory stimuli whose second derivative also varied. Subjects identified each visual stimulus as one of the three types, while trying to ignore the accompanying sound. Unlike some previous results with f″(x) fixed at 0, performance benefits relatively little when visual and auditory stimuli share the same directional change in modulation. However, performance does drop when visual and auditory stimuli differ in their directions of rate change. Our task's computational demands may make it particularly vulnerable to the effects of a dynamic task-irrelevant stimulus.
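One common way to estimate a similarity space from a stimulus-response confusion matrix is to row-normalize the counts into confusion probabilities and then symmetrize. The sketch below uses invented counts for the three rate categories; the paper's actual estimation procedure may differ.

```python
import numpy as np

def confusions_to_similarity(conf):
    # Row-normalize a stimulus (rows) x response (columns) count matrix into
    # confusion probabilities, then symmetrize to get a similarity estimate
    # between stimulus classes: classes confused more often are more similar.
    p = conf / conf.sum(axis=1, keepdims=True)
    return (p + p.T) / 2.0

# Toy counts for three rate categories: increasing, constant, decreasing.
conf = np.array([[30.0,  8.0,  2.0],
                 [ 6.0, 28.0,  6.0],
                 [ 2.0, 10.0, 28.0]])
sim = confusions_to_similarity(conf)
```

In this toy matrix, "increasing" is confused with "constant" far more often than with "decreasing", so the derived similarity space places the adjacent categories closer together.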
Affiliation(s)
- Long Yi
- Volen Center for Complex Systems, Brandeis University, Waltham, MA, USA
- Robert Sekuler
- Volen Center for Complex Systems, Brandeis University, Waltham, MA, USA
6
Lohse M, Zimmer-Harwood P, Dahmen JC, King AJ. Integration of somatosensory and motor-related information in the auditory system. Front Neurosci 2022; 16:1010211. [PMID: 36330342] [PMCID: PMC9622781] [DOI: 10.3389/fnins.2022.1010211]
Abstract
An ability to integrate information provided by different sensory modalities is a fundamental feature of neurons in many brain areas. Because visual and auditory inputs often originate from the same external object, which may be located some distance away from the observer, the synthesis of these cues can improve localization accuracy and speed up behavioral responses. By contrast, multisensory interactions occurring close to the body typically involve a combination of tactile stimuli with other sensory modalities. Moreover, most activities involving active touch generate sound, indicating that stimuli in these modalities are frequently experienced together. In this review, we examine the basis for determining sound-source distance and the contribution of auditory inputs to the neural encoding of space around the body. We then consider the perceptual consequences of combining auditory and tactile inputs in humans and discuss recent evidence from animal studies demonstrating how cortical and subcortical areas work together to mediate communication between these senses. This research has shown that somatosensory inputs interface with and modulate sound processing at multiple levels of the auditory pathway, from the cochlear nucleus in the brainstem to the cortex. Circuits involving inputs from the primary somatosensory cortex to the auditory midbrain have been identified that mediate suppressive effects of whisker stimulation on auditory thalamocortical processing, providing a possible basis for prioritizing the processing of tactile cues from nearby objects. Close links also exist between audition and movement, and auditory responses are typically suppressed by locomotion and other actions. These movement-related signals are thought to cancel out self-generated sounds, but they may also affect auditory responses via the associated somatosensory stimulation or as a result of changes in brain state. 
Together, these studies highlight the importance of considering both multisensory context and movement-related activity in order to understand how the auditory cortex operates during natural behaviors, paving the way for future work to investigate auditory-somatosensory interactions in more ecological situations.
7
Sharma D, Ng KKW, Birznieks I, Vickery RM. Auditory clicks elicit equivalent temporal frequency perception to tactile pulses: A cross-modal psychophysical study. Front Neurosci 2022; 16:1006185. [PMID: 36161171] [PMCID: PMC9500524] [DOI: 10.3389/fnins.2022.1006185]
Abstract
Both hearing and touch are sensitive to the frequency of mechanical oscillations—sound waves and tactile vibrations, respectively. The mounting evidence of parallels in temporal frequency processing between the two sensory systems led us to directly address the question of perceptual frequency equivalence between touch and hearing using stimuli of simple and more complex temporal features. In a cross-modal psychophysical paradigm, subjects compared the perceived frequency of pulsatile mechanical vibrations to that elicited by pulsatile acoustic (click) trains, and vice versa. Non-invasive pulsatile stimulation designed to excite a fixed population of afferents was used to induce desired temporal spike trains at frequencies spanning flutter up to vibratory hum (>50 Hz). The cross-modal perceived frequency for regular test pulse trains of either modality was a close match to the presented stimulus physical frequency up to 100 Hz. We then tested whether the recently discovered “burst gap” temporal code for frequency, that is shared by the two senses, renders an equivalent cross-modal frequency perception. When subjects compared trains comprising pairs of pulses (bursts) in one modality against regular trains in the other, the cross-sensory equivalent perceptual frequency best corresponded to the silent interval between the successive bursts in both auditory and tactile test stimuli. These findings suggest that identical acoustic and vibrotactile pulse trains, regardless of pattern, elicit equivalent frequencies, and imply analogous temporal frequency computation strategies in both modalities. This perceptual correspondence raises the possibility of employing a cross-modal comparison as a robust standard to overcome the prevailing methodological limitations in psychophysical investigations and strongly encourages cross-modal approaches for transmitting sensory information such as translating pitch into a similar pattern of vibration on the skin.
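A small numerical sketch of the "burst gap" rule as the abstract describes it: perceived frequency follows the inverse of the silent interval between successive bursts rather than the mean pulse rate or the burst rate. Parameter values are invented for illustration.

```python
def predicted_frequency_hz(burst_period_s, pulses_per_burst, intra_burst_interval_s):
    # Silent gap = time from the last pulse of one burst to the first pulse of
    # the next; the burst-gap code predicts perceived frequency = 1 / gap.
    gap_s = burst_period_s - (pulses_per_burst - 1) * intra_burst_interval_s
    return 1.0 / gap_s

# Regular 50 Hz train (one pulse per "burst"): gap equals the period -> 50 Hz.
regular = predicted_frequency_hz(0.02, 1, 0.0)

# Paired pulses 10 ms apart, repeating every 40 ms: mean pulse rate is 50 Hz
# and burst rate is 25 Hz, but the 30 ms gap predicts roughly 33 Hz.
paired = predicted_frequency_hz(0.04, 2, 0.01)
```

The point of the sketch is that the gap-based prediction dissociates from both the mean pulse rate and the burst rate for grouped pulse trains, which is what makes the cross-modal match reported here diagnostic of a shared code.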
Affiliation(s)
- Deepak Sharma
- School of Biomedical Sciences, The University of New South Wales (UNSW Sydney), Sydney, NSW, Australia
- Neuroscience Research Australia, Sydney, NSW, Australia
- Kevin K. W. Ng
- Center for Social and Affective Neuroscience, Department of Biomedical and Clinical Sciences, Linköping University, Linköping, Sweden
- Ingvars Birznieks
- School of Biomedical Sciences, The University of New South Wales (UNSW Sydney), Sydney, NSW, Australia
- Neuroscience Research Australia, Sydney, NSW, Australia
- Bionics and Bio-Robotics, Tyree Foundation Institute of Health Engineering, The University of New South Wales (UNSW Sydney), Sydney, NSW, Australia
- Richard M. Vickery
- School of Biomedical Sciences, The University of New South Wales (UNSW Sydney), Sydney, NSW, Australia
- Neuroscience Research Australia, Sydney, NSW, Australia
- Bionics and Bio-Robotics, Tyree Foundation Institute of Health Engineering, The University of New South Wales (UNSW Sydney), Sydney, NSW, Australia
8
Kim HS, Kim KB, Lee JH, Jung JJ, Kim YJ, Kim SP, Choi MH, Yi JH, Chung SC. Mid-Air Tactile Sensations Evoked by Laser-Induced Plasma: A Neurophysiological Study. Front Neurosci 2021; 15:733423. [PMID: 34658771] [PMCID: PMC8517193] [DOI: 10.3389/fnins.2021.733423]
Abstract
This study demonstrates the feasibility of a mid-air means of haptic stimulation at a long distance using the plasma effect induced by laser. We hypothesize that the stress wave generated by laser-induced plasma in the air can propagate through the air to reach the nearby human skin and evoke tactile sensation. To validate this hypothesis, we investigated somatosensory responses in the human brain to laser plasma stimuli by analyzing electroencephalography (EEG) in 14 participants. Three types of stimuli were provided to the index finger: a plasma stimulus induced from the laser, a mechanical stimulus transferred through a Styrofoam stick, and a sham stimulus providing only the sound of the plasma and mechanical stimuli at the same time. The event-related desynchronization/synchronization (ERD/S) of sensorimotor rhythms (SMRs) in EEG was analyzed. Every participant verbally reported that they could feel a soft tap on the finger in response to the laser stimulus, but not to the sham stimulus. The spectrogram of EEG evoked by laser stimulation was similar to that evoked by mechanical stimulation; alpha ERD and beta ERS were present over the sensorimotor area in response to laser as well as mechanical stimuli. A decoding analysis revealed that classification error increased when discriminating ERD/S patterns between laser and mechanical stimuli, compared to the case of discriminating between laser and sham, or mechanical and sham stimuli. Our neurophysiological results confirm that tactile sensation can be evoked by the plasma effect induced by laser in the air, which may provide a mid-air haptic stimulation method.
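ERD/ERS is conventionally expressed as the percent change of band power during the task window relative to a pre-stimulus baseline: negative values indicate desynchronization (ERD), positive values synchronization (ERS). A minimal sketch with invented power values follows; this is the standard convention, not the authors' specific pipeline.

```python
def erd_ers_percent(power_task, power_baseline):
    # Percent band-power change relative to baseline:
    # negative -> event-related desynchronization (ERD),
    # positive -> event-related synchronization (ERS).
    return 100.0 * (power_task - power_baseline) / power_baseline

# Invented example values (arbitrary power units):
alpha = erd_ers_percent(6.0, 10.0)   # alpha power drops -> ERD
beta = erd_ers_percent(13.0, 10.0)   # beta power rises -> ERS
```

Applied per frequency band and channel over the sensorimotor area, this yields the alpha-ERD / beta-ERS pattern the abstract reports for both laser and mechanical stimuli.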
Affiliation(s)
- Hyung-Sik Kim
- Department of Biomedical Engineering, BK21 Plus Research Institute of Biomedical Engineering, School of ICT Convergence Engineering, College of Science and Technology, Konkuk University, Chungju-si, South Korea
- Kyu Beom Kim
- Department of Biomedical Engineering, BK21 Plus Research Institute of Biomedical Engineering, School of ICT Convergence Engineering, College of Science and Technology, Konkuk University, Chungju-si, South Korea
- Je-Hyeop Lee
- Department of Biomedical Engineering, BK21 Plus Research Institute of Biomedical Engineering, School of ICT Convergence Engineering, College of Science and Technology, Konkuk University, Chungju-si, South Korea
- Jin-Ju Jung
- Department of Biomedical Engineering, BK21 Plus Research Institute of Biomedical Engineering, School of ICT Convergence Engineering, College of Science and Technology, Konkuk University, Chungju-si, South Korea
- Ye-Jin Kim
- Department of Biomedical Engineering, BK21 Plus Research Institute of Biomedical Engineering, School of ICT Convergence Engineering, College of Science and Technology, Konkuk University, Chungju-si, South Korea
- Sung-Phil Kim
- Department of Biomedical Engineering, Ulsan National Institute of Science and Technology, Ulsan, South Korea
- Mi-Hyun Choi
- Department of Biomedical Engineering, BK21 Plus Research Institute of Biomedical Engineering, School of ICT Convergence Engineering, College of Science and Technology, Konkuk University, Chungju-si, South Korea
- Jeong-Han Yi
- Department of Biomedical Engineering, BK21 Plus Research Institute of Biomedical Engineering, School of ICT Convergence Engineering, College of Science and Technology, Konkuk University, Chungju-si, South Korea
- Soon-Cheol Chung
- Department of Biomedical Engineering, BK21 Plus Research Institute of Biomedical Engineering, School of ICT Convergence Engineering, College of Science and Technology, Konkuk University, Chungju-si, South Korea
9
Quantitative EEG measures in profoundly deaf and normal hearing individuals while performing a vibrotactile temporal discrimination task. Int J Psychophysiol 2021; 166:71-82. [PMID: 34023377] [DOI: 10.1016/j.ijpsycho.2021.05.007]
Abstract
Challenges in early oral language acquisition in profoundly deaf individuals have an impact on cognitive neurodevelopment. This has led to the exploration of alternative sound-perception methods involving training of vibrotactile discrimination of sounds within the language spectrum. In particular, stimulus duration plays an important role in linguistic categorical perception. We comparatively evaluated vibrotactile temporal discrimination of sound and how specific training can modify the underlying electrical brain activity. Fifteen profoundly deaf (PD) and 15 normal-hearing (NH) subjects performed a vibrotactile oddball task with simultaneous EEG recording, before and after a short training period (five one-hour sessions over 2.5-3 weeks). The stimuli consisted of 700 Hz pure tones of different durations (target: long, 500 ms; non-target: short, 250 ms). The sound-wave stimuli were delivered by a small device worn on the right index finger. A similar behavioral training effect was observed in both groups, showing significant improvement in sound-duration discrimination. However, quantitative EEG measurements revealed distinct neurophysiological patterns characterized by higher and more diffuse delta-band magnitudes in the PD group, together with a generalized decrement in absolute power in both groups that might reflect a facilitating process associated with learning. Furthermore, training-related changes were found in the beta band in NH subjects. The findings suggest that PD individuals have different cognitive adaptive mechanisms, which are not a mere amplification effect due to greater cortical excitability.
10
Cléry JC, Hori Y, Schaeffer DJ, Gati JS, Pruszynski JA, Everling S. Whole brain mapping of somatosensory responses in awake marmosets investigated with ultra-high-field fMRI. J Neurophysiol 2020; 124:1900-1913. [PMID: 33112698] [DOI: 10.1152/jn.00480.2020]
Abstract
The common marmoset (Callithrix jacchus) is a small-bodied New World primate that is becoming an important model for studying brain function. Despite several studies exploring the somatosensory system of marmosets, all results have come from anesthetized animals using invasive techniques and postmortem analyses. Here, we demonstrate the feasibility of obtaining high-quality and reproducible somatosensory mapping in awake marmosets with functional magnetic resonance imaging (fMRI). We acquired fMRI sequences in four animals while they received tactile stimulation (via air puffs) delivered to the face, arm, or leg. We found a topographic body representation, with the leg represented in the most medial part, the face in the most lateral part, and the arm between them within areas 3a, 3b, and 1/2. A similar leg-to-face sequence from caudal to rostral sites was identified in areas S2 and PV. By generating functional connectivity maps from seeds defined in the primary and second somatosensory regions, we identified two clusters of tactile representation within the posterior and midcingulate cortex. However, unlike in humans and macaques, no clear somatotopic maps were observed there. At the subcortical level, we found a somatotopic body representation in the thalamus and, for the first time in marmosets, in the putamen. These maps have organizations similar to those previously found in Old World macaque monkeys and humans, suggesting that these subcortical somatotopic organizations were already established before Old and New World primates diverged. Our results show the first whole-brain mapping of somatosensory responses acquired noninvasively in awake marmosets.
NEW & NOTEWORTHY We used somatosensory stimulation combined with functional MRI (fMRI) in awake marmosets to reveal the topographic body representation in areas S1 and S2, the thalamus, and the putamen. We showed the existence of a body-representation organization within the thalamus and the cingulate cortex by computing functional connectivity maps from seeds defined in S1/S2, using resting-state fMRI data. This noninvasive approach will be essential for chronic studies by guiding invasive recording and manipulation techniques.
Affiliation(s)
- Justine C Cléry
- Centre for Functional and Metabolic Mapping, Robarts Research Institute, The University of Western Ontario, London, Ontario, Canada
- Yuki Hori
- Centre for Functional and Metabolic Mapping, Robarts Research Institute, The University of Western Ontario, London, Ontario, Canada
- David J Schaeffer
- Centre for Functional and Metabolic Mapping, Robarts Research Institute, The University of Western Ontario, London, Ontario, Canada
- Joseph S Gati
- Centre for Functional and Metabolic Mapping, Robarts Research Institute, The University of Western Ontario, London, Ontario, Canada; Department of Medical Biophysics, The University of Western Ontario, London, Ontario, Canada
- J Andrew Pruszynski
- Centre for Functional and Metabolic Mapping, Robarts Research Institute, The University of Western Ontario, London, Ontario, Canada; Department of Physiology and Pharmacology, The University of Western Ontario, London, Ontario, Canada
- Stefan Everling
- Centre for Functional and Metabolic Mapping, Robarts Research Institute, The University of Western Ontario, London, Ontario, Canada; Department of Physiology and Pharmacology, The University of Western Ontario, London, Ontario, Canada
11
Tommerdahl M, Francisco E, Holden J, Lensch R, Tommerdahl A, Kirsch B, Dennis R, Favorov O. An Accurate Measure of Reaction Time can Provide Objective Metrics of Concussion. 2020. [DOI: 10.37714/josam.v2i2.31]
Abstract
There have been numerous reports of neurological assessments of post-concussed athletes, and many deploy some type of reaction time assessment. However, most of the assessment tools currently deployed rely on consumer-grade computer systems to collect these data. In a previous report, we used robotics to demonstrate the inaccuracies that typical consumer-grade hardware and software introduce when collecting these metrics (Holden et al, 2020). In that same report, we described the accuracy of a tactile-based reaction time test (administered with the Brain Gauge) as approximately 0.3 msec and discussed the shortcomings of other methods for collecting reaction time. The latency errors introduced by those alternative methods were reported as high as 400 msec, and the system variabilities could be as high as 80 msec; these values are far above the control values previously reported for reaction time (200-220 msec) and reaction time variability (10-20 msec). In this report, we examined the reaction time and reaction time variability of 396 concussed individuals and found significant differences in the reaction time metrics obtained from concussed and non-concussed individuals for 14-21 days post-concussion. A survey of the literature did not reveal comparable sensitivity in reaction time testing in concussion studies using alternative methods. This finding was consistent with the prediction put forth by Holden and colleagues from robotics testing of the consumer-grade computer systems commonly utilized by researchers conducting reaction time testing on concussed individuals. The significant difference in fidelity between the methods commonly used by concussion researchers is attributed to differences in the accuracy of the measures deployed and/or the increase in biological fidelity of tactile-based reaction times over visually administered reaction time tests.
Additionally, while most of the commonly used computerized testing assessment tools require a pre-season baseline test to predict a neurological insult, the tactile-based methods reported in this paper did not utilize any baselines for comparison. The reaction time data reported here come from one test of a battery administered to the study population, and this is the first of a series of papers that will examine each of those tests independently.
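The two metrics compared above, mean reaction time and trial-to-trial reaction time variability, reduce to a mean and a sample standard deviation over trials; a toy sketch with invented trial values follows (this is the conventional definition, not necessarily the authors' exact computation).

```python
import statistics

def rt_metrics_ms(reaction_times_ms):
    # Mean reaction time and trial-to-trial variability (sample SD),
    # the two summary metrics compared between groups in the abstract.
    return statistics.fmean(reaction_times_ms), statistics.stdev(reaction_times_ms)

# Invented trial values near the quoted control ranges
# (~200-220 msec mean, ~10-20 msec variability).
mean_rt, rt_var = rt_metrics_ms([202, 211, 218, 206, 215])
```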