1. Klever L, Beyvers MC, Fiehler K, Mamassian P, Billino J. Cross-modal metacognition: Visual and tactile confidence share a common scale. J Vis 2023;23(5):3. [PMID: 37140913] [PMCID: PMC10166118] [DOI: 10.1167/jov.23.5.3]
Abstract
Humans can judge the quality of their perceptual decisions, an ability known as perceptual confidence. Previous work suggested that confidence can be evaluated on an abstract scale that can be sensory modality-independent or even domain-general. However, evidence is still scarce on whether confidence judgments can be directly made across visual and tactile decisions. Here, we investigated in a sample of 56 adults whether visual and tactile confidence share a common scale by measuring visual contrast and vibrotactile discrimination thresholds in a confidence-forced choice paradigm. Confidence judgments were made about the correctness of the perceptual decision between two trials involving either the same or different modalities. To estimate confidence efficiency, we compared discrimination thresholds obtained from all trials to those from trials judged to be relatively more confident. We found evidence for metaperception because higher confidence was associated with better perceptual performance in both modalities. Importantly, participants were able to judge their confidence across modalities without any costs in metaperceptual sensitivity and only minor changes in response times compared to unimodal confidence judgments. In addition, we were able to predict cross-modal confidence well from unimodal judgments. In conclusion, our findings show that perceptual confidence is computed on an abstract scale and that it can assess the quality of our decisions across sensory modalities.
Affiliation(s)
- Lena Klever
- Experimental Psychology, Justus Liebig University Giessen, Giessen, Germany
- Center for Mind, Brain, and Behavior (CMBB), University of Marburg and Justus Liebig University Giessen, Giessen, Germany
- Katja Fiehler
- Experimental Psychology, Justus Liebig University Giessen, Giessen, Germany
- Center for Mind, Brain, and Behavior (CMBB), University of Marburg and Justus Liebig University Giessen, Giessen, Germany
- Pascal Mamassian
- Laboratoire des Systèmes Perceptifs, Département d'études Cognitives, École Normale Supérieure, PSL University, Paris, France
- Jutta Billino
- Experimental Psychology, Justus Liebig University Giessen, Giessen, Germany
- Center for Mind, Brain, and Behavior (CMBB), University of Marburg and Justus Liebig University Giessen, Giessen, Germany
2. Dozio N, Maggioni E, Pittera D, Gallace A, Obrist M. May I Smell Your Attention: Exploration of Smell and Sound for Visuospatial Attention in Virtual Reality. Front Psychol 2021;12:671470. [PMID: 34366990] [PMCID: PMC8339311] [DOI: 10.3389/fpsyg.2021.671470]
Abstract
When interacting with technology, attention is mainly driven by audiovisual and increasingly haptic stimulation. Olfactory stimuli are widely neglected, although the sense of smell influences many of our daily life choices, affects our behavior, and can catch and direct our attention. In this study, we investigated the effect of smell and sound on visuospatial attention in a virtual environment. We implemented the Bells Test, an established neuropsychological test to assess attentional and visuospatial disorders, in virtual reality (VR). We conducted an experiment with 24 participants comparing the performance of users under three experimental conditions (smell, sound, and smell and sound). The results show that multisensory stimuli play a key role in driving the attention of the participants and highlight asymmetries in directing spatial attention. We discuss the relevance of the results within and beyond human-computer interaction (HCI), particularly with regard to the opportunity of using VR for rehabilitation and assessment procedures for patients with spatial attention deficits.
Affiliation(s)
- Nicolò Dozio
- Politecnico di Milano, Department of Mechanical Engineering, Milan, Italy
- Sussex Computer-Human Interaction Lab, Department of Informatics, University of Sussex, Brighton, United Kingdom
- Emanuela Maggioni
- Sussex Computer-Human Interaction Lab, Department of Informatics, University of Sussex, Brighton, United Kingdom
- Department of Computer Science, University College London, London, United Kingdom
- Dario Pittera
- Sussex Computer-Human Interaction Lab, Department of Informatics, University of Sussex, Brighton, United Kingdom
- Ultraleap Ltd., Bristol, United Kingdom
- Alberto Gallace
- Mind and Behavior Technological Center - MibTec, University of Milano-Bicocca, Milan, Italy
- Marianna Obrist
- Sussex Computer-Human Interaction Lab, Department of Informatics, University of Sussex, Brighton, United Kingdom
- Department of Computer Science, University College London, London, United Kingdom
3. Manshad MS, Brannon D. Haptic-payment: Exploring vibration feedback as a means of reducing overspending in mobile payment. J Bus Res 2021;122:88-96. [PMID: 32934427] [PMCID: PMC7484625] [DOI: 10.1016/j.jbusres.2020.08.049]
Abstract
The proliferation of mobile payment applications in recent years has decoupled the physical act of paying from the consumption experience. Prior research suggests that this decreases the psychological sense of loss or 'pain' that consumers feel when making a purchase with more direct payment types (such as cash) and leads them to spend more money. To help address this issue, the present research explores, designs, and tests haptic vibration feedback configurations aimed at restoring the 'pain' of paying with cashless payment options (i.e., online and mobile payment). Counter-intuitively, the present research finds that lower- (vs. higher-) intensity vibration feedback reduces participants' reported willingness-to-spend when compared to a control group that does not receive any vibration feedback. This work is one of the first to explore the role of haptic vibration feedback in nudging consumers to reduce their spending when using cashless payment methods.
Affiliation(s)
- Muhanad Shakir Manshad
- Department of Computer Information Systems, Multiexperience Lab, Monfort College of Business, University of Northern Colorado, Greeley, CO 80639, United States
- Daniel Brannon
- Department of Marketing, Multiexperience Lab, Monfort College of Business, University of Northern Colorado, Greeley, CO 80639, United States
4.
Abstract
In everyday life, mentalizing is nested in a rich context of cognitive faculties and background information that potentially contribute to its success. Yet, we know little about these modulating effects. Here we propose that humans develop a naïve psychological model of attention (featured as a goal-dependent, intentional relation to the environment) and use this to fine-tune their mentalizing attempts, presuming that the way people represent their environment is influenced by the cognitive priorities (attention) their current intentions create. The attention model provides an opportunity to tailor mental state inferences to the temporary features of the agent whose mind is in the focus of mentalizing. The ability to trace attention is an exceptionally powerful aid for mindreading. Knowledge about the partner's attention provides background information; moreover, because it is grounded in the partner's current intentions, attention has direct relevance to the ongoing interaction. Furthermore, due to its causal connection to intentions, the output of the attention model remains valid for a prolonged but predictable amount of time: for as long as the evoking intention is in place. The naïve attention model theory is offered as a novel theory of social attention that both incorporates existing evidence and identifies new directions for research.
5. Invitto S, Montinaro R, Ciccarese V, Venturella I, Fronda G, Balconi M. Smell and 3D Haptic Representation: A Common Pathway to Understand Brain Dynamics in a Cross-Modal Task. A Pilot OERP and fNIRS Study. Front Behav Neurosci 2019;13:226. [PMID: 31616263] [PMCID: PMC6775200] [DOI: 10.3389/fnbeh.2019.00226]
Abstract
Cross-modal perception allows olfactory information to integrate with other sensory modalities. Olfactory representations are processed by multisensory cortical pathways, where the aspects related to the haptic sensations are integrated. This complex reality allows the development of an integrated perception, where olfactory aspects compete with haptic and/or trigeminal activations. It is assumed that this integration involves both perceptive electrophysiological and metabolic/hemodynamic aspects, but there are no studies evaluating these activations in parallel. The aim of this study was to investigate brain dynamics during a cross-modal olfactory and haptic attention task, preceded by an exploratory session. The assessment of cross-modal dynamics was conducted through simultaneous electroencephalography (EEG) and functional near-infrared spectroscopy (fNIRS) recording, evaluating both electrophysiological and hemodynamic activities. The study consisted of two experimental sessions and was conducted with a sample of ten healthy subjects (mean age 25 ± 5.2 years). In Session 1, the subjects were trained to manipulate 3D haptic models (HC) and to smell different scents (SC). In Session 2, the subjects were tested during an attentive olfactory task, in order to investigate the olfactory event-related potentials (OERP) N1 and late positive component (LPC), and EEG rhythms associated with fNIRS components (oxy-Hb and deoxy-Hb). The main results of this study highlighted, in Task 1, a higher fNIRS oxy-Hb response during SC and a positive correlation with the delta rhythm in the central and parietal EEG region of interest. In Session 2, the N1 OERP highlighted a greater amplitude in SC. A negative correlation was found in HC for the deoxy-Hb parietal with frontal and central N1, and for the oxy-Hb frontal with N1 in the frontal, central and parietal regions of interests (ROIs). A negative correlation was found in parietal LPC amplitude with central deoxy-Hb. 
The data suggest that cross-modal valence modifies the attentional olfactory response and that the dorsal cortical/metabolic pathways are involved in these responses. This can be considered as an important starting point for understanding integrated cognition, as the subject could perceive in an ecological context.
Affiliation(s)
- Sara Invitto
- Human Anatomy and Neuroscience Laboratory, Department of Biological and Environmental Sciences and Technologies, University of Salento, Lecce, Italy
- Laboratory of Interdisciplinary Research Applied to Medicine, University of Salento-Vito Fazzi Hospital, Lecce, Italy
| | - Roberta Montinaro
- Human Anatomy and Neuroscience Laboratory, Department of Biological and Environmental Sciences and Technologies, University of Salento, Lecce, Italy
- Irene Venturella
- Research Unit in Affective and Social Neuroscience, Department of Psychology, Catholic University of Milan, Milan, Italy
- Giulia Fronda
- Research Unit in Affective and Social Neuroscience, Department of Psychology, Catholic University of Milan, Milan, Italy
- Michela Balconi
- Research Unit in Affective and Social Neuroscience, Department of Psychology, Catholic University of Milan, Milan, Italy
6. The approach of visual stimuli influences expectations about stimulus types for subsequent somatosensory stimuli. Exp Brain Res 2018;236:1563-1571. [DOI: 10.1007/s00221-018-5244-0]
7. Holmes NP, Tamè L. Multisensory Perception: Magnetic Disruption of Attention in Human Parietal Lobe. Curr Biol 2018;28:R259-R261. [DOI: 10.1016/j.cub.2018.01.078]
8. Santangelo V, Spence C. Assessing the Automaticity of the Exogenous Orienting of Tactile Attention. Perception 2016;36:1497-1505. [DOI: 10.1068/p5848]
Abstract
We examined whether or not abrupt tactile onsets are capable of exogenously capturing tactile spatial attention when visual spatial attention is focused elsewhere. In experiment 1, we compared performance under dual-task conditions (where participants performed a tactile exogenous cuing task and a rapid serial visual presentation—RSVP—task at the same time) with their performance under single-task conditions (where the participants had to perform only the cuing task, although the RSVP stream was still presented in the background) and to a no-stream condition (where only the cuing task was presented). Tactile cuing was completely suppressed in both the dual-task and single-task conditions, showing that exogenous tactile spatial orienting is modulated by visual-spatial attention, which hence appears to be far from truly automatic. In experiment 2, we demonstrated that the abolishment of exogenous tactile orienting was not caused by the transient presentation of abrupt onset stimuli (letters). These results therefore show that exogenous spatial attentional orienting toward abrupt peripheral tactile stimuli is possible as long as perceptual resources are not depleted by a perceptually demanding (RSVP) task.
Affiliation(s)
- Valerio Santangelo
- Department of Psychology, University of Rome “La Sapienza”, via dei Marsi 78, I 00185 Rome, Italy
9. Gallace A, Tan HZ, Spence C. Numerosity Judgments for Tactile Stimuli Distributed over the Body Surface. Perception 2016;35:247-266. [PMID: 16583769] [DOI: 10.1068/p5380]
Abstract
A large body of research now supports the claim that two different and dissociable processes are involved in making numerosity judgments regarding visual stimuli: subitising (fast and nearly errorless) for up to 4 stimuli, and counting (slow and error-prone) when more than 4 stimuli are presented. We studied tactile numerosity judgments for combinations of 1–7 vibrotactile stimuli presented simultaneously over the body surface. In experiment 1, the stimuli were presented once, while in experiment 2 conditions of single presentation and repeated presentation of the stimulus were compared. Neither experiment provided any evidence for a discontinuity in the slope of either the RT or error data, suggesting that subitisation does not occur for tactile stimuli. By systematically varying the intensity of the vibrotactile stimuli in experiment 3, we were able to demonstrate that participants were not simply using the ‘global intensity’ of the whole tactile display to make their tactile numerosity judgments, but were, instead, using information concerning the number of tactors activated. The results of the three experiments reported here are discussed in relation to current theories of counting and subitising, and potential implications for the design of tactile user interfaces are highlighted.
Affiliation(s)
- Alberto Gallace
- Department of Experimental Psychology, University of Oxford, South Parks Road, Oxford, OX1 3UD, UK.
10. Pasqualotto A, Dumitru ML, Myachykov A. Editorial: Multisensory Integration: Brain, Body, and World. Front Psychol 2016;6:2046. [PMID: 26793155] [PMCID: PMC4709421] [DOI: 10.3389/fpsyg.2015.02046]
Affiliation(s)
- Magda L Dumitru
- Department of Cognitive Science, Macquarie University, Sydney, NSW, Australia
- Cognitive Science Department, Graduate School of Informatics, Middle East Technical University, Ankara, Turkey
- Andriy Myachykov
- Department of Psychology, Northumbria University, Newcastle upon Tyne, UK
- School of Psychology, Centre for Cognition and Decision Making, National Research University Higher School of Economics, Moscow, Russia
11. Stephan DN, Koch I. Tactile Stimuli Increase Effects of Modality Compatibility in Task Switching. Exp Psychol 2015;62:276-284. [PMID: 26421450] [DOI: 10.1027/1618-3169/a000291]
Abstract
Modality compatibility refers to the similarity of stimulus modality and modality of response-related sensory consequences. Previous dual-task studies found increased switch costs for modality incompatible tasks (auditory-manual/visual-vocal) compared to modality compatible tasks (auditory-vocal/visual-manual). The present task-switching study further examined modality compatibility and investigated vibrotactile stimulation as a novel alternative to visual stimulation. Interestingly, a stronger modality compatibility effect on switch costs was revealed for the group with tactile-auditory stimulation compared to the visual-auditory stimulation group. We suggest that the modality compatibility effect is based on crosstalk of central processing codes due to ideomotor "backward" linkages between the anticipated response effects and the stimuli indicating this response. This crosstalk is increased in the tactile-auditory stimulus group compared to the visual-auditory stimulus group due to a higher degree of ideomotor-compatibility in the tactile-manual tasks. Since crosstalk arises between tasks, performance is only affected in task switching and not in single tasks.
Affiliation(s)
| | - Iring Koch
- Institute of Psychology, RWTH Aachen University, Aachen, Germany
12. Pawluk DTV, Adams RJ, Kitada R. Designing Haptic Assistive Technology for Individuals Who Are Blind or Visually Impaired. IEEE Trans Haptics 2015;8:258-278. [PMID: 26336151] [DOI: 10.1109/toh.2015.2471300]
Abstract
This paper considers issues relevant for the design and use of haptic technology for assistive devices for individuals who are blind or visually impaired in some of the major areas of importance: Braille reading, tactile graphics, orientation and mobility. We show that there is a wealth of behavioral research that is highly applicable to assistive technology design. In a few cases, conclusions from behavioral experiments have been directly applied to design with positive results. Differences in brain organization and performance capabilities between individuals who are "early blind" and "late blind" when using the same tactile/haptic accommodations, such as Braille, suggest the importance of training and assessing these groups individually. Practical restrictions on device design, such as performance limitations of the technology and cost, raise questions as to which aspects of these restrictions are truly important to overcome to achieve high performance. In general, this raises the question of what it means to provide functional equivalence as opposed to sensory equivalence.
13. Harris LR, Carnevale MJ, D’Amour S, Fraser LE, Harrar V, Hoover AEN, Mander C, Pritchett LM. How our body influences our perception of the world. Front Psychol 2015;6:819. [PMID: 26124739] [PMCID: PMC4464078] [DOI: 10.3389/fpsyg.2015.00819]
Abstract
Incorporating the fact that the senses are embodied is necessary for an organism to interpret sensory information. Before a unified perception of the world can be formed, sensory signals must be processed with reference to body representation. The various attributes of the body such as shape, proportion, posture, and movement can be both derived from the various sensory systems and can affect perception of the world (including the body itself). In this review we examine the relationships between sensory and motor information, body representations, and perceptions of the world and the body. We provide several examples of how the body affects perception (including but not limited to body perception). First we show that body orientation affects visual distance perception and object orientation. Also, visual-auditory crossmodal correspondences depend on the orientation of the body: audio "high" frequencies correspond to a visual "up" defined by both gravity and body coordinates. Next, we show that the perceived location of touch is affected by the orientation of the head and eyes on the body, suggesting a visual component to coding body locations. Additionally, the reference frame used for coding touch locations seems to depend on whether gaze is static or moved relative to the body during the tactile task. Perceived attributes of the body, such as body size, affect tactile perception even at the level of detection thresholds and two-point discrimination. Next, long-range tactile masking provides clues to the posture of the body in a canonical body schema. Finally, ownership of seen body parts depends on the orientation and perspective of the body part in view. Together, all of these findings demonstrate how sensory and motor information, body representations, and perceptions (of the body and the world) are interdependent.
Affiliation(s)
- Laurence R. Harris
- Multisensory Integration Laboratory, The Centre for Vision Research, York University, Toronto, ON, Canada
- Department of Psychology, York University, Toronto, ON, Canada
- Michael J. Carnevale
- Multisensory Integration Laboratory, The Centre for Vision Research, York University, Toronto, ON, Canada
- Department of Psychology, York University, Toronto, ON, Canada
- Sarah D’Amour
- Multisensory Integration Laboratory, The Centre for Vision Research, York University, Toronto, ON, Canada
- Department of Psychology, York University, Toronto, ON, Canada
- Lindsey E. Fraser
- Multisensory Integration Laboratory, The Centre for Vision Research, York University, Toronto, ON, Canada
- Department of Psychology, York University, Toronto, ON, Canada
- Vanessa Harrar
- School of Optometry, University of Montreal, Montreal, QC, Canada
- Adria E. N. Hoover
- Multisensory Integration Laboratory, The Centre for Vision Research, York University, Toronto, ON, Canada
- Department of Psychology, York University, Toronto, ON, Canada
- Charles Mander
- Multisensory Integration Laboratory, The Centre for Vision Research, York University, Toronto, ON, Canada
- Department of Psychology, York University, Toronto, ON, Canada
- Lisa M. Pritchett
- Multisensory Integration Laboratory, The Centre for Vision Research, York University, Toronto, ON, Canada
- Department of Psychology, York University, Toronto, ON, Canada
14. Approach of visual stimuli modulates spatial expectations for subsequent somatosensory stimuli. Int J Psychophysiol 2015;96:176-182. [PMID: 25889695] [DOI: 10.1016/j.ijpsycho.2015.04.002]
Abstract
To examine how the approach of visual stimuli toward the body influences expectations regarding subsequent somatosensory stimuli, we recorded event-related brain potentials (ERPs; nose reference) during a simple reaction-time task with somatosensory stimuli. Twelve participants were asked to place their arms on a desk, and three LEDs were placed between their arms at equal intervals. Electrical stimuli were presented to the left (or right) wrist at a high probability (80%) or to the opposite wrist at a low probability (20%). Each trial was composed of three visual stimuli followed by one electrical stimulus. In Experiment 1, the right, center, and left (or left, center, and right) LEDs were turned on sequentially toward the wrist to which the high probability somatosensory stimuli were presented (congruent condition), or the center LED was turned on three times (neutral condition). Experiment 2 was composed of the congruent condition and the inverse of the congruent condition (incongruent condition). In both experiments, the reaction times to low probability stimuli were longer than those to high probability stimuli. Moreover, the low probability stimuli elicited a larger P3 amplitude than the high probability stimuli. In addition, the P3 amplitude was higher under the visual approach condition (i.e., the congruent condition in each experiment) than under the control condition (i.e., the neutral and incongruent conditions). Furthermore, no effect on the CNV amplitude before the somatosensory stimuli was found. These results suggest that visual stimuli directed toward the body induce an automatic spatial expectation for subsequent somatosensory stimuli.
15. Coté CA. Visual attention in a visual-haptic, cross-modal matching task in children and adults. Percept Mot Skills 2015;120:381-396. [PMID: 25871471] [DOI: 10.2466/22.pms.120v13x9]
Abstract
Visual fixation patterns were analyzed to gain insight into developmental changes in attention allocation in a cross-modal task. Two patterns that have been associated with increased task difficulty, gaze aversion and fixation duration, were recorded using an eye-tracker. In this exploratory study, 37 elementary age children (M age 7-10 yr.) and 23 undergraduates engaged in visual-only and haptic-visual shape-matching tasks. Theoretical assumptions underlying this study are that children have greater limitations on attention capacity compared to adults, and that a task presented in the cross-modal condition would pose special demands on this capacity. A 2×2 (uni- or cross-modal×age group) repeated-measures analysis of variance (ANOVA) was used to analyze both gaze aversion and average fixation duration. Children averted gaze significantly more during the cross-modal condition, supporting the idea that children use gaze aversion as an attention-shifting mechanism. Mean fixation duration increased for both groups in the cross-modal condition. Due to the small number and limited age range of the children as well as the limited number of task items, interpretations are made with caution.
16. Rubber hand illusion reduces discomfort caused by cold stimulus. PLoS One 2014;9:e109909. [PMID: 25295527] [PMCID: PMC4190400] [DOI: 10.1371/journal.pone.0109909]
Abstract
There is a growing interest in body-ownership disruptions and their consequences for subjective experiences such as tactile sensations or pain. Here, we investigated the effect of the rubber hand illusion (RHI) on the perceived discomfort caused by a cold stimulus applied to the real hand. The results showed reduced discomfort to cold reflected in behavioural and subjective measures. The stronger the illusion, the later the cold temperature became unpleasant and the less intense the experience was rated. We discuss the link between thermoception and body ownership as well as possible theoretical and methodological implications for studies on pain experience under RHI.
17. Langerak RM, La Mantia CL, Brown LE. Global and local processing near the left and right hands. Front Psychol 2013;4:793. [PMID: 24194725] [PMCID: PMC3810600] [DOI: 10.3389/fpsyg.2013.00793]
Abstract
Visual targets can be processed more quickly and reliably when a hand is placed near the target. Both unimodal and bimodal representations of hands are largely lateralized to the contralateral hemisphere, and since each hemisphere demonstrates specialized cognitive processing, it is possible that targets appearing near the left hand may be processed differently than targets appearing near the right hand. The purpose of this study was to determine whether visual processing near the left and right hands interacts with hemispheric specialization. We presented hierarchical-letter stimuli (e.g., small characters used as local elements to compose large characters at the global level) near the left or right hands separately and instructed participants to discriminate the presence of target letters (X and O) from non-target letters (T and U) at either the global or local levels as quickly as possible. Targets appeared at either the global or local level of the display, at both levels, or were absent from the display; participants made foot-press responses. When discriminating target presence at the global level, participants responded more quickly to stimuli presented near the left hand than near either the right hand or in the no-hand condition. Hand presence did not influence target discrimination at the local level. Our interpretation is that left-hand presence may help participants discriminate global information, a right hemisphere (RH) process, and that the left hand may influence visual processing in a way that is distinct from the right hand.
Affiliation(s)
- Robin M Langerak
- Department of Psychology, Trent University, Peterborough, ON, Canada
18. Van Damme S, Crombez G, Eccleston C, Goubert L. Impaired disengagement from threatening cues of impending pain in a crossmodal cueing paradigm. Eur J Pain 2004;8:227-236. [DOI: 10.1016/j.ejpain.2003.08.005]
Abstract
This paper reports an experimental investigation of attentional engagement to and disengagement from cues of impending pain. Pain-free volunteers performed a cueing task in which they were instructed to detect somatosensory and tone targets. Target stimuli were preceded by visual cues informing participants of the modality of the impending stimuli. Participants were randomly assigned to a pain group (n = 54) or to a control group (n = 53). Somatosensory targets consisted of painful electrocutaneous stimuli in the pain group and non-painful vibrotactile targets in the control group. Analyses revealed a similar amount of attentional engagement to both cues signalling somatosensory targets, irrespective of their threat value. However, participants had significantly more difficulty in disengaging attention from a threatening cue of impending pain compared to a cue signalling the non-painful vibrotactile target. Our findings provide further evidence that pain cues demand attention, particularly resulting in impaired disengagement.
Affiliation(s)
- Stefaan Van Damme
- Department of Experimental Clinical and Health Psychology, Ghent University, Ghent, Belgium.
19
Influences of intra- and crossmodal grouping on visual and tactile Ternus apparent motion. Brain Res 2010; 1354:152-62. [DOI: 10.1016/j.brainres.2010.07.064] [Citation(s) in RCA: 15] [Impact Index Per Article: 1.1] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/25/2010] [Revised: 07/14/2010] [Accepted: 07/18/2010] [Indexed: 11/19/2022]
20
Abstract
In six experiments, subjects judged the sizes of squares that were presented visually and/or haptically, in unimodal or bimodal conditions. We were interested in which mode most affected size judgments in the bimodal condition when the squares presented to each mode actually differed in size. Three factors varied: whether haptic exploration was passive or active, whether the choice set from which the subjects selected their responses was visual or haptic, and whether cutaneous information was provided in addition to kinesthetic information. To match the task for each mode, visual presentations consisted of a cursor that moved along a square pathway to correspond to the haptic experience of successive segments revealed during exploration. We found that the visual influence on size judgments was greater than the influence of haptics when the haptic experience involved only kinesthesis, passive movement, and a visual choice set. However, when cutaneous input was added to kinesthetic information, size judgments were most influenced by the haptic mode. The results support hypotheses of sensory integration, rather than capture of one sense by the other.
21
Sela L, Sobel N. Human olfaction: a constant state of change-blindness. Exp Brain Res 2010; 205:13-29. [PMID: 20603708 PMCID: PMC2908748 DOI: 10.1007/s00221-010-2348-6] [Citation(s) in RCA: 111] [Impact Index Per Article: 7.9] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/18/2010] [Accepted: 06/21/2010] [Indexed: 12/01/2022]
Abstract
Paradoxically, although humans have a superb sense of smell, they don’t trust their nose. Furthermore, although human odorant detection thresholds are very low, only unusually high odorant concentrations spontaneously shift our attention to olfaction. Here we suggest that this lack of olfactory awareness reflects the nature of olfactory attention, which is shaped by the spatial and temporal envelopes of olfaction. Regarding the spatial envelope, selective attention is allocated in space. Humans direct an attentional spotlight within spatial coordinates in both vision and audition. Human olfactory spatial abilities, however, are minimal. Thus, with no olfactory space, there is no arena for olfactory selective attention. Regarding the temporal envelope, whereas vision and audition consist of nearly continuous input, olfactory input is discrete, made of sniffs widely separated in time. If similar temporal breaks are artificially introduced to vision and audition, they induce “change blindness”, a loss of attentional capture that results in a lack of awareness of change. Whereas “change blindness” is an aberration of vision and audition, the long inter-sniff interval renders “change anosmia” the norm in human olfaction. Therefore, attentional capture in olfaction is minimal, as is human olfactory awareness. All this, however, does not diminish the role of olfaction through sub-attentive mechanisms that allow subliminal smells a profound influence on human behavior and perception.
Affiliation(s)
- Lee Sela
- Department of Neurobiology, The Weizmann Institute of Science, Rehovot, 76100 Israel
- Noam Sobel
- Department of Neurobiology, The Weizmann Institute of Science, Rehovot, 76100 Israel
22
Van der Lubbe RHJ, Van Mierlo CM, Postma A. The Involvement of Occipital Cortex in the Early Blind in Auditory and Tactile Duration Discrimination Tasks. J Cogn Neurosci 2010; 22:1541-56. [DOI: 10.1162/jocn.2009.21285] [Citation(s) in RCA: 16] [Impact Index Per Article: 1.1] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/04/2022]
Abstract
Early blind participants outperform controls on several spatially oriented perceptual tasks such as sound localization and tactile orientation discrimination. Previous studies have suggested that the recruitment of occipital cortex in the blind is responsible for this improvement. For example, electroencephalographic studies showed an enlarged posterior negativity for the blind in these tasks compared to controls. In our study, the question was raised whether the early blind are also better at tasks in which the duration of auditory and tactile stimuli must be discriminated. The answer was affirmative. Our electroencephalographic data revealed an enlarged posterior negativity for the blind relative to controls. Source analyses showed comparable solutions in the case of auditory and tactile targets for the blind. These findings support the interpretation of these negativities in terms of a supramodal rather than a modality-specific process, although confirmation with more spatially sensitive methods seems necessary. We additionally examined whether the early blind are less affected by irrelevant tactile or auditory exogenous cues preceding auditory or tactile targets than controls. No differences in alerting and orienting effects of these cues were found between the blind and the controls. Together, our results support the view that major differences between early blind participants and sighted controls on auditory and tactile duration discrimination tasks relate to a late and likely supramodal process that takes place in occipital areas.
Affiliation(s)
- Rob H. J. Van der Lubbe
- Utrecht University, Utrecht, The Netherlands
- University of Twente, Enschede, The Netherlands
- Christa M. Van Mierlo
- Utrecht University, Utrecht, The Netherlands
- VU University, Amsterdam, The Netherlands
23
Dionne JK, Meehan SK, Legon W, Staines WR. Crossmodal influences in somatosensory cortex: Interaction of vision and touch. Hum Brain Mapp 2010; 31:14-25. [PMID: 19572308 DOI: 10.1002/hbm.20841] [Citation(s) in RCA: 16] [Impact Index Per Article: 1.1] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/06/2022] Open
Abstract
Previous research has shown that information from one sensory modality has the potential to influence activity in a different modality, and these crossmodal interactions can occur early in the cortical sensory processing stream within sensory-specific cortex. In addition, it has been shown that when sensory information is relevant to the performance of a task, there is an upregulation of sensory cortex. This study sought to investigate the effects of simultaneous bimodal (visual and vibrotactile) stimulation on the modulation of primary somatosensory cortex (SI), in the context of a delayed sensory-to-motor task when both stimuli are task-relevant. It was hypothesized that the requirement to combine visual and vibrotactile stimuli would be associated with an increase in SI activity compared to vibrotactile stimuli alone. Functional magnetic resonance imaging (fMRI) was performed on healthy subjects using a 3T scanner. During the scanning session, subjects performed a sensory-guided motor task while receiving visual, vibrotactile, or both types of stimuli. An event-related design was used to examine cortical activity related to the stimulus onset and the motor response. A region of interest (ROI) analysis was performed on right SI and revealed an increase in percent blood oxygenation level dependent signal change in the bimodal (visual + tactile) task compared to the unimodal tasks. Results of the whole-brain analysis revealed a common fronto-parietal network that was active across both the bimodal and unimodal task conditions, suggesting that these regions are sensitive to the attentional and motor-planning aspects of the task rather than the unimodal or bimodal nature of the stimuli.
24
Krause V, Pollok B, Schnitzler A. Perception in action: the impact of sensory information on sensorimotor synchronization in musicians and non-musicians. Acta Psychol (Amst) 2010; 133:28-37. [PMID: 19751937 DOI: 10.1016/j.actpsy.2009.08.003] [Citation(s) in RCA: 64] [Impact Index Per Article: 4.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/15/2008] [Revised: 08/11/2009] [Accepted: 08/13/2009] [Indexed: 11/17/2022] Open
Abstract
The present study aimed at investigating to what extent sensorimotor synchronization is related to (i) musical specialization, (ii) perceptual discrimination, and (iii) the movement's trajectory. To this end, musicians with different musical expertise (drummers, professional pianists, amateur pianists, singers, and non-musicians) performed an auditory and visual synchronization and a cross-modal temporal discrimination task. During auditory synchronization drummers performed less variably than amateur pianists, singers and non-musicians. In the cross-modal discrimination task drummers showed superior discrimination abilities which were correlated with synchronization variability as well as with the trajectory. These data suggest that (i) the type of specialized musical instrument affects synchronization abilities and (ii) synchronization accuracy is related to perceptual discrimination abilities as well as to (iii) the movement's trajectory. Since particularly synchronization variability was affected by musical expertise, the present data imply that the type of instrument improves accuracy of timekeeping mechanisms.
Affiliation(s)
- Vanessa Krause
- Institute for Clinical Neuroscience and Medical Psychology, Heinrich-Heine-University, Duesseldorf, Germany; Department of Neurology, University Hospital Duesseldorf, Germany
25
Forster B, Sambo CF, Pavone EF. ERP correlates of tactile spatial attention differ under intra- and intermodal conditions. Biol Psychol 2009; 82:227-33. [DOI: 10.1016/j.biopsycho.2009.08.001] [Citation(s) in RCA: 18] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/15/2008] [Revised: 05/20/2009] [Accepted: 08/02/2009] [Indexed: 10/20/2022]
26
Dalton P, Lavie N, Spence C. Short article: The role of working memory in tactile selective attention. Q J Exp Psychol (Hove) 2009; 62:635-44. [DOI: 10.1080/17470210802483503] [Citation(s) in RCA: 19] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/21/2022]
Abstract
Load theory suggests that working memory controls the extent to which irrelevant distractors are processed (e.g., Lavie, Hirst, De Fockert, & Viding, 2004). However, so far this proposal has only been tested in vision. Here, we examine the extent to which tactile selective attention also depends on working memory. In Experiment 1, participants focused their attention on continuous target vibrations while attempting to ignore pulsed distractor vibrations. In Experiment 2, targets were always presented to a particular hand, with distractors being presented to the other hand. In both experiments, a high (vs. low) load in a concurrent working memory task led to greater interference by the tactile distractors. These results establish the role of working memory in the control of tactile selective attention, demonstrating for the first time that the principles of load theory also apply to the tactile modality.
Affiliation(s)
- Polly Dalton
- Royal Holloway, University of London, Egham, Surrey, UK
- Nilli Lavie
- Institute of Cognitive Neuroscience, University College London, London, UK
- Charles Spence
- Department of Experimental Psychology, University of Oxford, Oxford, UK
27
Response requirements modulate tactile spatial congruency effects. Exp Brain Res 2008; 191:171-86. [DOI: 10.1007/s00221-008-1510-x] [Citation(s) in RCA: 32] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/27/2007] [Accepted: 07/18/2008] [Indexed: 10/21/2022]
28
Chen JYC, Terrence PI. Effects of tactile cueing on concurrent performance of military and robotics tasks in a simulated multitasking environment. ERGONOMICS 2008; 51:1137-1152. [PMID: 18608472 DOI: 10.1080/00140130802030706] [Citation(s) in RCA: 7] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/26/2023]
Abstract
This study examined the concurrent performance of military gunnery, robotics control and communication tasks in a simulated environment. More specifically, the study investigated how aided target recognition (AiTR) capabilities (delivered either through tactile or tactile + visual cueing) for the gunnery task might benefit overall performance. Results showed that AiTR benefited not only the gunnery task, but also the concurrent robotics and communication tasks. The participants' spatial ability was found to be a good indicator of their gunnery and robotics task performance. However, when AiTR was available to assist their gunnery task, those participants of lower spatial ability were able to perform their robotics tasks as well as those of higher spatial ability. Finally, participants' workload assessment was significantly higher when they teleoperated (i.e. remotely operated) a robot and when their gunnery task was unassisted. These results will further understanding of multitasking performance in military tasking environments. These results will also facilitate the implementation of robots in military settings and will provide useful data to military system designs.
Affiliation(s)
- J Y C Chen
- US Army Research Laboratory-Human Research & Engineering Directorate, PEO STRI Field Element, Orlando, Florida, USA.
29
Rohlman DS, Lucchini R, Anger WK, Bellinger DC, van Thriel C. Neurobehavioral testing in human risk assessment. Neurotoxicology 2008; 29:556-67. [PMID: 18539229 DOI: 10.1016/j.neuro.2008.04.003] [Citation(s) in RCA: 27] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/11/2008] [Revised: 04/09/2008] [Accepted: 04/10/2008] [Indexed: 02/06/2023]
Abstract
Neurobehavioral tests are being increasingly used in human risk assessment and there is a strong need for guidance. The field of neurobehavioral toxicology has evolved from research which initially focused on using traditional neuropsychological tests to identify "abnormal cases" to include methods used to detect sub-clinical deficits, to further incorporate the use of neurosensory assessment, and to expand testing from occupational populations to vulnerable populations including older adults and children. Even as exposures in the workplace are reduced, they have been increasing in the environment and research on exposure has now expanded to cross the entire lifetime. These neurobehavioral methods are applied in research and the findings used for regulatory purposes to develop preventative action for exposed populations. This paper reflects a summary of the talks presented at the Neurobehavioral Testing in Human Risk Assessment symposium presented at the 11th meeting of the International Neurotoxicology Association.
Affiliation(s)
- Diane S Rohlman
- Center for Research on Occupational and Environmental Toxicology, L606, Oregon Health & Science University, Portland, OR 97239, USA.
30
Gernsbacher MA, Stevenson JL, Khandakar S, Goldsmith HH. Why Does Joint Attention Look Atypical in Autism? CHILD DEVELOPMENT PERSPECTIVES 2008; 2:38-45. [PMID: 25520747 DOI: 10.1111/j.1750-8606.2008.00039.x] [Citation(s) in RCA: 55] [Impact Index Per Article: 3.4] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/30/2022]
Abstract
This essay answers the question of why autistic children are less likely to initiate joint attention (e.g., use their index finger to point to indicate interest in something) and why they are less likely to respond to bids for their joint attention (e.g., turn their heads to look at something to which another person points). It reviews empirical evidence that autistic toddlers, children, adolescents, and adults can attend covertly, even to social stimuli, such as the direction in which another person's eyes are gazing. It also reviews empirical evidence that autistics of various ages understand the intentionality of other persons' actions. The essay suggests that autistics' atypical resistance to distraction, atypical skill at parallel perception, and atypical execution of volitional actions underlie their atypical manifestations of joint attention.
31
Marshall CD, Kovacs KM, Lydersen C. Feeding kinematics, suction and hydraulic jetting capabilities in bearded seals (Erignathus barbatus). J Exp Biol 2008; 211:699-708. [DOI: 10.1242/jeb.009852] [Citation(s) in RCA: 66] [Impact Index Per Article: 4.1] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/20/2022]
Abstract
SUMMARY Feeding kinematics, suction and hydraulic jetting capabilities of bearded seals (Erignathus barbatus) were characterized during controlled feeding trials. Feeding trials were conducted both on land and in water, and allowed a choice between suction and biting, but food was also presented that could be ingested by suction alone. Four feeding phases (preparatory, jaw opening, hyoid depression, and jaw closing) were observed; the mean feeding cycle duration was 0.54±0.22 s, regardless of feeding mode (P>0.05). Subjects feeding on land used biting and suction 89.3% and 10.7% of the time, respectively. Subjects feeding in water used suction and hydraulic jetting 96.3% and 3.7% of the time, respectively. No biting behavior was observed underwater. Suction feeding was characterized by a small gape (2.7±0.85 cm), a small gape angle (24.4±8.13°), pursing of the rostral lips to form a circular aperture, and pursing of the lateral lips to occlude the lateral gape. Biting was characterized by a large gape (7.3±2.2 cm), a large gape angle (41.7±15.2°), and lip curling to expose the teeth. An excavation behavior in which suction and hydraulic jetting were alternated was used to extract food from recessed wells. The maximum subambient and supra-ambient pressures recorded were 91.2 and 53.4 kPa, respectively. The inclusion of suction data for phocids broadens the principle that suction feeding kinematics is conserved among aquatic vertebrates. Furthermore, bearded seals support predictions that mouth size, fluid flow speed, and elusiveness of prey consumed are among a suite of traits that determine the specific nature of suction feeding among species.
Affiliation(s)
- Christopher D. Marshall
- Texas A&M University at Galveston, Department of Marine Biology, 5007 Avenue U, Galveston, TX 77551, USA
32
Van Damme S, Crombez G, Lorenz J. Pain Draws Visual Attention to Its Location: Experimental Evidence for a Threat-Related Bias. THE JOURNAL OF PAIN 2007; 8:976-82. [PMID: 17822961 DOI: 10.1016/j.jpain.2007.07.005] [Citation(s) in RCA: 61] [Impact Index Per Article: 3.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 04/25/2007] [Revised: 06/29/2007] [Accepted: 07/06/2007] [Indexed: 11/26/2022]
Abstract
UNLABELLED It has often been demonstrated that pain interferes with the processing of other information. However, the initiation of protective behavior in response to pain also requires enhanced processing of potentially relevant information, such as stimuli sharing the same spatial coordinates. In this study we test whether pain draws visual attention to its location. We report 2 experiments in which healthy individuals detected visual stimuli at 2 possible locations. Each stimulus was preceded by painful stimulation at the corresponding (congruent trial) or noncorresponding (incongruent trial) location. Based on the probability ratio of congruent to incongruent trials, pain was either spatially informative (experiment 1) or uninformative (experiment 2) for visual target detection. The detection of visual stimuli was faster at the pain location than at the other location in both experiments, suggesting efficient spatially guided orienting and responding to potential sources of somatic threat. However, when pain was spatially uninformative, visual attention was only drawn to the pain location when the pain was perceived as threatening. This indicates that threatening pain prioritizes the processing of visual information at its location, even if the pain is irrelevant for the upcoming visual event. PERSPECTIVE In this study a threat-related processing bias of visual information at a painful body location was demonstrated. This finding advances our knowledge of how pain modulates attention. More particularly, it seems that interruption by pain is not absolute and that pain prioritizes the processing of other perceptual information that is spatially related to the pain.
Affiliation(s)
- Stefaan Van Damme
- Department of Experimental-Clinical and Health Psychology, Ghent University, Ghent, Belgium.
33
Moseley GL, Arntz A. The context of a noxious stimulus affects the pain it evokes. Pain 2007; 133:64-71. [PMID: 17449180 DOI: 10.1016/j.pain.2007.03.002] [Citation(s) in RCA: 92] [Impact Index Per Article: 5.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/01/2006] [Revised: 02/14/2007] [Accepted: 03/05/2007] [Indexed: 11/18/2022]
Abstract
The influence of contextual factors on the pain evoked by a noxious stimulus is not well defined. In this study, a -20 degrees C rod was placed on one hand for 500 ms while we manipulated the evaluative context (or 'meaning') of, warning about, and visual attention to, the stimulus. For meaning, a red (hot, more tissue damaging) or blue (cold, less tissue damaging) visual cue was used. For warning, the stimulus occurred after the cue or they occurred together. For visual attention, subjects looked towards the stimulus or away from it. Repeated measures ANCOVA was significant (alpha=0.0125). Stimuli associated with a red cue were rated as hot, with the blue cue as cold (difference on an 11 point scale approximately 5.5). The red cue also meant the pain was rated as more unpleasant (difference approximately 3.5) and more intense (difference approximately 3). For stimuli associated with the red cue only, the pain was more unpleasant when the stimulus occurred after the cue than when it didn't (difference approximately 1.1). Pain was rated as more intense, and the stimulus as hotter, when subjects looked at the red-cued stimulus than when they didn't (difference approximately 0.9 for pain intensity and approximately 2 for temperature). We conclude that meaning affects the experience a noxious stimulus evokes, and that warning and visual attention moderate the effects of meaning when the meaning is associated with tissue-damage. Different dimensions of the stimulus' context can have differential effects on sensory-discriminative and affective-emotional components of pain.
Affiliation(s)
- G Lorimer Moseley
- Department of Physiology, Anatomy & Genetics & fMRIB Centre, Le Gros Clark Building, Oxford University, South Parks Road, Oxford OX1 3QX, United Kingdom.
34
Işoğlu-Alkaç U, Kedzior K, Karamürsel S, Ermutlu N. Event-related potentials during auditory oddball, and combined auditory oddball-visual paradigms. Int J Neurosci 2007; 117:487-506. [PMID: 17380607 DOI: 10.1080/00207450600773509] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/20/2022]
Abstract
The purpose of the current study was to investigate the properties of a new modification of the classical auditory oddball paradigm (an auditory oddball paradigm combined with passive visual stimulation, AERPs + VEPs) and to compare the scalp topography obtained with the new paradigm and the classical auditory oddball paradigm (AERPs) in healthy humans. The responses to bimodal stimulation and to the classical oddball paradigm were similar to those reported in other studies in terms of the location, amplitudes, and latencies of P1, N1, P2, N2, and P300. The new modification of the oddball paradigm produced P300 at fronto-central locations, in contrast to the centro-parietal locations observed during the classical oddball paradigm. The P300 amplitudes were also significantly larger, and its latencies longer, during the new than during the classical paradigm. Furthermore, the amplitudes of N1 and P2, but not N2, were significantly higher and differed in location during the new paradigm in response to both target and standard stimuli. The latencies of all three waves were significantly longer, and the latency of P2 differed in location between the new and the classical paradigms, in response to only the standard stimuli. The results of this study suggest that the new modification of the classical oddball paradigm produces different neural responses than the classical oddball paradigm does. Therefore, this modification can be used to investigate dysfunctions in sensory and cognitive processing in clinical samples.
35
Isoğlu-Alkaç U, Kedzior K, Keskindemirci G, Ermutlu N, Karamursel S. Event-related potentials to visual, auditory, and bimodal (combined auditory-visual) stimuli. Int J Neurosci 2007; 117:259-73. [PMID: 17365112 DOI: 10.1080/00207450500534118] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/23/2022]
Abstract
The purpose of this study was to investigate the response properties of event-related potentials to unimodal and bimodal stimulations. The amplitudes of N1 and P2 were larger during bimodal evoked potentials (BEPs) than during auditory evoked potentials (AEPs) at the anterior sites, and the amplitudes of P1 were larger during BEPs than during visual evoked potentials (VEPs), especially at the parieto-occipital locations. Responses to bimodal stimulation had longer latencies than responses to unimodal stimulation. The N1 and P2 components were larger in amplitude and longer in latency during the bimodal paradigm and predominantly occurred at the anterior sites. Therefore, the current bimodal paradigm can be used to investigate the involvement and location of specific neural generators that contribute to higher processing of sensory information. Moreover, this paradigm may be a useful tool to investigate the level of sensory dysfunctions in clinical samples.
36
Krämer HH, Seddigh S, Lorimer Moseley G, Birklein F. Dysynchiria is not a common feature of neuropathic pain. Eur J Pain 2007; 12:128-31. [PMID: 17446100 DOI: 10.1016/j.ejpain.2007.02.005] [Citation(s) in RCA: 23] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/19/2006] [Revised: 12/01/2006] [Accepted: 02/21/2007] [Indexed: 11/25/2022]
Abstract
Patients with chronic neuropathic pain (non-CRPS) and brush-evoked allodynia watched a reflected image of their corresponding but opposite skin region being brushed in a mirror. Unlike complex regional pain syndrome Type 1, this process did not evoke any sensation at the affected area ('dysynchiria'). We conclude that central nociceptive sensitisation alone is not sufficient to cause dysynchiria in neuropathic pain. The results imply a difference in cortical pain processing between complex regional pain syndrome and other chronic neuropathic pain.
Affiliation(s)
- Heidrun H Krämer
- Department of Neurology, University of Mainz, Langenbeckstr. 1, 55101 Mainz, Germany
37
Somatoform dissociation and somatosensory amplification are differentially associated with attention to the tactile modality following exposure to body-related stimuli. J Psychosom Res 2007; 62:159-65. [PMID: 17270574 DOI: 10.1016/j.jpsychores.2006.08.008] [Citation(s) in RCA: 23] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 05/30/2006] [Revised: 07/17/2006] [Accepted: 08/24/2006] [Indexed: 02/09/2023]
Abstract
OBJECTIVE Body-focused attention is regarded as an important maintaining factor for somatoform illness, although there is limited empirical evidence pertaining to this hypothesis. This study was conducted to assess whether individual differences in somatoform dissociation and somatosensory amplification were associated with biased attention towards the tactile modality, particularly following exposure to threatening body-related stimuli. METHODS Forty-eight nonclinical participants completed the Somatoform Dissociation Questionnaire (SDQ-20; a proxy measure of somatoform symptomatology), the Somatosensory Amplification Scale (SSAS), and a modality bias task. The task consisted of a series of body-relevant or body-irrelevant (scene) picture stimuli, half of which were threatening and half were neutral, followed by target stimuli in either the visual or the tactile modality. Participants judged the location of each target stimulus, and performance data were used to calculate the degree to which participants were biased towards the tactile modality following each of the picture types. RESULTS Participants in the high SDQ-20 group (defined by median split) showed a significant increase in tactile bias when responding to targets occurring 250 ms after the presentation of threatening body-relevant stimuli only. This effect was not observed for the low SDQ-20 group. Scores on the SSAS correlated negatively with tactile bias for both threatening and neutral body-relevant stimuli at 250 ms. CONCLUSIONS Individuals with a tendency to experience somatoform symptoms focus more on stimuli in the tactile modality immediately following exposure to threatening body-relevant information. In contrast, self-reported somatosensory amplification appears to be associated with attention away from the tactile modality rather than with increased tactile focus.
38
Marshall DC, Lee JD, Austria RA. Alerts for in-vehicle information systems: annoyance, urgency, and appropriateness. HUMAN FACTORS 2007; 49:145-57. [PMID: 17315851 DOI: 10.1518/001872007779598145] [Citation(s) in RCA: 32] [Impact Index Per Article: 1.9] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/14/2023]
Abstract
OBJECTIVE This study assesses the influence of the auditory characteristics of alerts on perceived urgency and annoyance and whether these perceptions depend on the context in which the alert is received. BACKGROUND Alert parameters systematically affect perceived urgency, and mapping the urgency of a situation to the perceived urgency of an alert is a useful design consideration. Annoyance associated with environmental noise has been thoroughly studied, but little research has addressed whether alert parameters differentially affect annoyance and urgency. METHOD Three 2³ × 3 mixed within/between factorial experiments, with a total of 72 participants, investigated nine alert parameters in three driving contexts. These parameters were formant (similar to harmonic series), pulse duration, interpulse interval, alert onset and offset, burst duty cycle, alert duty cycle, interburst period, and sound type. Imagined collision warning, navigation alert, and E-mail notification scenarios defined the driving context. RESULTS All parameters influenced both perceived urgency and annoyance (p < .05), with pulse duration, interpulse interval, alert duty cycle, and sound type influencing urgency substantially more than annoyance. There was a strong relationship between perceived urgency and rated appropriateness for high-urgency driving scenarios and a strong relationship between annoyance and rated appropriateness for low-urgency driving scenarios. CONCLUSION Sound parameters differentially affect annoyance and urgency. Also, urgency and annoyance differentially affect perceived appropriateness of warnings. APPLICATION Annoyance may merit as much attention as urgency in the design of auditory warnings, particularly in systems that alert drivers to relatively low-urgency situations.
39
Négyessy L, Nepusz T, Kocsis L, Bazsó F. Prediction of the main cortical areas and connections involved in the tactile function of the visual cortex by network analysis. Eur J Neurosci 2006; 23:1919-30. [PMID: 16623848 DOI: 10.1111/j.1460-9568.2006.04678.x] [Citation(s) in RCA: 36] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/27/2022]
Abstract
We explored the cortical pathways from the primary somatosensory cortex to the primary visual cortex (V1) by analysing connectional data in the macaque monkey using graph-theoretical tools. Cluster analysis revealed the close relationship of the dorsal visual stream and the sensorimotor cortex. It was shown that prefrontal area 46 and parietal areas VIP and 7a occupy a central position between the different clusters in the visuo-tactile network. Among these structures all the shortest paths from primary somatosensory cortex (3a, 1 and 2) to V1 pass through VIP and then reach V1 via MT, V3 and PO. Comparison of the input and output fields suggested a larger specificity for the 3a/1-VIP-MT/V3-V1 pathways among the alternative routes. A reinforcement learning algorithm was used to evaluate the importance of the aforementioned pathways. The results suggest a higher role for V3 in relaying more direct sensorimotor information to V1. Analysing cliques, which identify areas with the strongest coupling in the network, supported the role of VIP, MT and V3 in visuo-tactile integration. These findings indicate that areas 3a, 1, VIP, MT and V3 play a major role in shaping the tactile information reaching V1 in both sighted and blind subjects. Our observations strongly support the findings of the experimental studies and provide a deeper insight into the network architecture underlying visuo-tactile integration in the primate cerebral cortex.
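The shortest-path analysis described above can be sketched with a breadth-first search over a directed area graph. The edge list below is a hypothetical fragment assembled from the pathways named in the abstract (3a/1/2 → VIP → MT/V3/PO → V1); it is not the actual macaque connectivity dataset, which contains many more areas and connections.

```python
from collections import deque

# Hypothetical toy fragment of the visuo-tactile network (directed edges).
EDGES = {
    "3a": ["VIP"], "1": ["VIP"], "2": ["VIP"],
    "VIP": ["MT", "V3", "PO"],
    "MT": ["V1"], "V3": ["V1"], "PO": ["V1"],
    "V1": [],
}

def shortest_paths(graph, src, dst):
    """Enumerate all shortest directed paths from src to dst via BFS."""
    queue, found, best = deque([[src]]), [], None
    while queue:
        path = queue.popleft()
        if best is not None and len(path) > best:
            break  # BFS order guarantees no shorter path remains
        node = path[-1]
        if node == dst:
            best = len(path)
            found.append(path)
            continue
        for nxt in graph.get(node, []):
            if nxt not in path:  # avoid revisiting areas (no cycles)
                queue.append(path + [nxt])
    return found

paths = shortest_paths(EDGES, "3a", "V1")
# In this toy graph, every shortest path from 3a to V1 passes through VIP,
# mirroring the bottleneck role the study attributes to that area.
```

On the full connectivity matrix one would typically use a graph library (e.g. NetworkX's `all_shortest_paths`) rather than hand-rolled BFS, but the logic is the same.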
Affiliation(s)
- László Négyessy
- Neurobiology Research Group, United Research Organization of the Hungarian Academy of Sciences and Semmelweis Medical University, Department of Anatomy, Tuzoltó u. 58, H-1094 Budapest, Hungary.
40
Poliakoff E, Ashworth S, Lowe C, Spence C. Vision and touch in ageing: Crossmodal selective attention and visuotactile spatial interactions. Neuropsychologia 2006; 44:507-17. [PMID: 16098997 DOI: 10.1016/j.neuropsychologia.2005.07.004] [Citation(s) in RCA: 72] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/17/2005] [Accepted: 07/07/2005] [Indexed: 11/23/2022]
Abstract
We investigated whether ageing affects crossmodal selective attention (the ability to focus on a relevant sensory modality and ignore an irrelevant modality) and the spatial constraints on such selective processing. Three groups of 24 participants were tested: Young (19-25 years), Young-Old (65-72 years) and Old-Old (76-92 years). The participants had to judge the elevation of vibrotactile targets (upper/index finger and lower/thumb), presented randomly to either hand while ignoring concurrent visual distractors. In a second task, the role of the target and distractor modalities was reversed. Crossmodal selective attention was assessed by comparing performance in the presence versus absence of distractors. Spatial constraints on selective attention were also investigated by comparing the effect of distractors presented on the same versus opposite side as the target. When attending to touch, the addition of visual distractors had a significantly larger effect on error rates in both of the older groups as compared to the Young group. This indicates that ageing has a detrimental effect on crossmodal selective attention. In all three age groups, performance was impaired when the target and distractor were presented at incongruent as compared to congruent elevations in both tasks. This congruency effect was modulated by the relative spatial location of the target and distractor in certain conditions for the Young and the Young-Old group. That is, participants in the two younger age groups found it harder to attend selectively to targets in one modality, when distractor stimuli came from the same side rather than from the opposite side. However, no significant spatial modulation was found in the Old-Old group. This suggests that ageing may also compromise spatial aspects of crossmodal selective attention.
Affiliation(s)
- E Poliakoff
- School of Psychological Sciences, University of Manchester, Manchester M13 9PL, UK.
41
Abstract
Synesthesia is a condition in which stimulation in one modality also gives rise to a perceptual experience in a second modality. In two recent studies we found that the condition is more common than previously reported; up to 5% of the population may experience at least one type of synesthesia. Although the condition has been traditionally viewed as an anomaly (e.g., breakdown in modularity), it seems that at least some of the mechanisms underlying synesthesia do reflect universal crossmodal mechanisms. We review here a number of examples of crossmodal correspondences found in both synesthetes and nonsynesthetes including pitch-lightness and vision-touch interaction, as well as cross-domain spatial-numeric interactions. Additionally, we discuss the common role of spatial attention in binding shape and color surface features (whether ordinary or synesthetic color). Consistently with behavioral and neuroimaging data showing that chromatic-graphemic (colored-letter) synesthesia is a genuine perceptual phenomenon implicating extrastriate cortex, we also present electrophysiological data showing modulation of visual evoked potentials by synesthetic color congruency.
Affiliation(s)
- Noam Sagiv
- Department of Psychology, University College London, 26 Bedford Way, London WC1H 0AP, UK.
42
Haggard P, Kitadono K, Press C, Taylor-Clarke M. The brain's fingers and hands. Exp Brain Res 2005; 172:94-102. [PMID: 16369787 DOI: 10.1007/s00221-005-0311-8] [Citation(s) in RCA: 49] [Impact Index Per Article: 2.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/04/2005] [Accepted: 11/24/2005] [Indexed: 10/25/2022]
Abstract
The brain keeps track of the changing positions of body parts in space using a spatial body schema. When subjects localise a tactile stimulus on the skin, they might either use a somatotopic body map, or use a body schema to identify the location of the stimulation in external space. Healthy subjects were touched on the fingertips, with the hands in one of two postures: either the right hand was vertically above the left, or the fingers of both hands were interwoven. Subjects made speeded verbal responses to identify either the finger or the hand that was touched. Interweaving the fingers significantly impaired hand identification across several experiments, but had no effect on finger identification. Our results suggest that identification of fingers occurs in a somatotopic representation or finger schema. Identification of hands uses a general body schema, and is influenced by external spatial location. This dissociation implies that touches on the finger can only be identified with a particular hand after a process of assigning fingers to hands. This assignment is based on external spatial location. Our results suggest a role of the body schema in the identification of structural body parts from touch.
Affiliation(s)
- Patrick Haggard
- Institute of Cognitive Neuroscience and Department of Psychology, University College London, Alexandra House, London, UK.
43
Weiss SJ. Haptic perception and the psychosocial functioning of preterm, low birth weight infants. Infant Behav Dev 2005. [DOI: 10.1016/j.infbeh.2005.05.006] [Citation(s) in RCA: 10] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/15/2022]
44
Abstract
Previous research on multisensory integration has demonstrated that viewing the stimulated body part enhances discrimination ability. Participants in this experiment watched a video showing a hand being touched by a stick and a second video showing the stick touching the space beneath the hand. Sensory thresholds of the index fingers were tested with von Frey filaments. We found significant enhancements of the sensory threshold after showing the video with the touched hand but not after showing the video with no touch of the hand. This enhancement was specific for the index finger shown in the video. The results link the visuotactile enhancement of this study to the observation of touch rather than to the simple depiction of the body part.
Affiliation(s)
- Michael Schaefer
- Human Cortical Physiology Section, National Institute of Neurological Disorders and Stroke, National Institutes of Health, Bethesda, MD 20892, USA.
45
Spence C, Pavani F, Maravita A, Holmes N. Multisensory contributions to the 3-D representation of visuotactile peripersonal space in humans: evidence from the crossmodal congruency task. ACTA ACUST UNITED AC 2005; 98:171-89. [PMID: 15477031 DOI: 10.1016/j.jphysparis.2004.03.008] [Citation(s) in RCA: 106] [Impact Index Per Article: 5.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/26/2022]
Abstract
In order to determine precisely the location of a tactile stimulus presented to the hand it is necessary to know not only which part of the body has been stimulated, but also where that part of the body lies in space. This involves the multisensory integration of visual, tactile, proprioceptive, and even auditory cues regarding limb position. In recent years, researchers have become increasingly interested in the question of how these various sensory cues are weighted and integrated in order to enable people to localize tactile stimuli, as well as to give rise to the 'felt' position of our limbs, and ultimately the multisensory representation of 3-D peripersonal space. We highlight recent research on this topic using the crossmodal congruency task, in which participants make speeded elevation discrimination responses to vibrotactile targets presented to the thumb or index finger, while simultaneously trying to ignore irrelevant visual distractors presented from either the same (i.e., congruent) or a different (i.e., incongruent) elevation. Crossmodal congruency effects (calculated as performance on incongruent trials minus performance on congruent trials) are greatest when visual and vibrotactile stimuli are presented from the same azimuthal location, thus providing an index of common position across different sensory modalities. The crossmodal congruency task has been used to investigate a number of questions related to the representation of space in both normal participants and brain-damaged patients. In this review, we detail the major findings from this research, and highlight areas of convergence with other cognitive neuroscience disciplines.
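The crossmodal congruency effect (CCE) defined in the abstract above is simple arithmetic: mean performance on incongruent trials minus mean performance on congruent trials. The sketch below computes it over reaction times; the trial data are invented for illustration, and real analyses also compute the effect over error rates or inverse-efficiency scores.

```python
def mean(xs):
    return sum(xs) / len(xs)

def congruency_effect(trials):
    """Crossmodal congruency effect: mean RT on incongruent trials minus
    mean RT on congruent trials. trials is a list of (condition, rt_ms)
    pairs with condition in {'congruent', 'incongruent'}."""
    congruent = [rt for cond, rt in trials if cond == "congruent"]
    incongruent = [rt for cond, rt in trials if cond == "incongruent"]
    return mean(incongruent) - mean(congruent)

# Invented example data: visual distractors slow incongruent responses.
trials = [("congruent", 480), ("congruent", 500),
          ("incongruent", 560), ("incongruent", 580)]
cce = congruency_effect(trials)  # larger values = greater crossmodal interference
```

Comparing the CCE across spatial arrangements (same vs. different azimuthal location of target and distractor) is what yields the "index of common position" the review describes.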
Affiliation(s)
- Charles Spence
- Department of Experimental Psychology, University of Oxford, South Parks Road, Oxford OX1 3UD, UK.
46
Coslett HB, Lie E. Bare hands and attention: evidence for a tactile representation of the human body. Neuropsychologia 2004; 42:1865-76. [PMID: 15381016 DOI: 10.1016/j.neuropsychologia.2004.06.002] [Citation(s) in RCA: 19] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/01/2003] [Revised: 05/27/2004] [Accepted: 06/21/2004] [Indexed: 11/17/2022]
Abstract
If brain lesions impair the allocation of attention to a representation of the body surface, and if the hand can serve as an attentional focus or "wand", one might expect that somatosensory deficits caused by cerebral lesions would be ameliorated by contact with the ipsilesional hand. To test this prediction, tactile detection tasks were administered to two subjects with right hemisphere lesions. Subject CB's left tactile extinction was investigated in conditions in which the degree of contact between the right and left hands and the spatial relationship between his hands was systematically varied. His left tactile extinction was significantly reduced by touch of the right hand. Similarly, extinction at the left knee was ameliorated by touch of the knee by the right hand; touch of the right foot had no effect. Subject NC's ability to detect a tactile stimulus delivered to the left side was systematically assessed in conditions in which the hands touched and the spatial relationship between the hands was varied. His ability to detect a touch on the left hand improved in conditions in which the left hand was touched by the right hand. This effect was not observed if direct contact between the two hands was prevented by inserting a thin cloth between the hands. For both subjects, placing the right hand in close proximity to the left hand or altering the spatial location of the hands relative to the body did not influence performance. These data demonstrate that the hand may serve as a conduit for attention and provide strong evidence for a distinct representation of the body surface that is at least in part independent of spatial representations.
Affiliation(s)
- H Branch Coslett
- Department of Neurology and Center for Cognitive Neuroscience, University of Pennsylvania, 3400 Spruce St., Philadelphia, PA 19104, USA.
47
Porro CA, Lui F, Facchin P, Maieron M, Baraldi P. Percept-related activity in the human somatosensory system: functional magnetic resonance imaging studies. Magn Reson Imaging 2004; 22:1539-48. [PMID: 15707803 DOI: 10.1016/j.mri.2004.10.003] [Citation(s) in RCA: 53] [Impact Index Per Article: 2.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/08/2004] [Accepted: 10/08/2004] [Indexed: 11/28/2022]
Abstract
In this paper, we review blood oxygenation level-dependent (BOLD) functional magnetic resonance imaging (fMRI) studies addressing the neural correlates of touch, thermosensation, pain and the mechanisms of their cognitive modulation in healthy human subjects. There is evidence that fMRI signal changes can be elicited in the parietal cortex by stimulation of single mechanoceptive afferent fibers at suprathreshold intensities for conscious perception. Positive linear relationships between the amplitude or the spatial extents of BOLD fMRI signal changes, stimulus intensity and the perceived touch or pain intensity have been described in different brain areas. Some recent fMRI studies addressed the role of cortical areas in somatosensory perception by comparing the time course of cortical activity evoked by different kinds of stimuli with the temporal features of touch, heat or pain perception. Moreover, parametric single-trial functional MRI designs have been adopted in order to disentangle subprocesses within the nociceptive system. Available evidence suggests that studies combining fMRI with psychophysical methods may provide a valuable approach for understanding complex perceptual mechanisms and top-down modulation of the somatosensory system by cognitive factors specifically related to selective attention and to anticipation. The brain networks underlying somatosensory perception are complex and highly distributed. A deeper understanding of perception-related brain mechanisms therefore requires new approaches suited to investigate the spatial and temporal dynamics of activation in different brain regions and their functional interaction.
Affiliation(s)
- Carlo Adolfo Porro
- Dip. Scienze e Tecnologie Biomediche, Univ. di Udine, P.le Kolbe 4, I-33100 Udine, Italy.
48
Diederich A, Colonius H. Bimodal and trimodal multisensory enhancement: Effects of stimulus onset and intensity on reaction time. ACTA ACUST UNITED AC 2004; 66:1388-404. [PMID: 15813202 DOI: 10.3758/bf03195006] [Citation(s) in RCA: 227] [Impact Index Per Article: 11.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/08/2022]
Abstract
Manual reaction times to visual, auditory, and tactile stimuli presented simultaneously, or with a delay, were measured to test for multisensory interaction effects in a simple detection task with redundant signals. Responses to trimodal stimulus combinations were faster than those to bimodal combinations, which in turn were faster than reactions to unimodal stimuli. Response enhancement increased with decreasing auditory and tactile stimulus intensity and was a U-shaped function of stimulus onset asynchrony. Distribution inequality tests indicated that the multisensory interaction effects were larger than predicted by separate activation models, including the difference between bimodal and trimodal response facilitation. The results are discussed with respect to previous findings in a focused attention task and are compared with multisensory integration rules observed in bimodal and trimodal superior colliculus neurons in the cat and monkey.
Affiliation(s)
- Adele Diederich
- School of Humanities and Social Sciences, International University Bremen, D-28725 Bremen, Germany.
49
Hötting K, Rösler F, Röder B. Altered auditory-tactile interactions in congenitally blind humans: an event-related potential study. Exp Brain Res 2004; 159:370-81. [PMID: 15241575 DOI: 10.1007/s00221-004-1965-3] [Citation(s) in RCA: 48] [Impact Index Per Article: 2.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/24/2003] [Accepted: 04/28/2004] [Indexed: 10/26/2022]
Abstract
It has been shown that stimuli of a task-irrelevant modality receive enhanced processing when they are presented at an attended location in space (crossmodal attention). The present study investigated the effects of visual deprivation on the interaction of the intact sensory systems. Random streams of tactile and auditory stimuli were presented at the left or right index finger of congenitally blind participants. They had to attend to one modality (auditory or tactile) of one side (left or right) and had to respond to deviant stimuli of the attended modality and side. While in a group of sighted participants, early event-related potentials (ERPs) were negatively displaced to stimuli presented at the attended position, compared to the unattended, for both the task-relevant and the task-irrelevant modality, starting as early as 80 ms after stimulus onset (unimodal and crossmodal spatial attention effects, respectively), corresponding crossmodal effects could not be detected in the blind. In the sighted, spatial attention effects after 200 ms were only significant for the task-relevant modality, whereas a crossmodal effect for this late time window was observed in the blind. This positive rather than negative effect possibly indicates an active suppression of task-irrelevant stimuli at an attended location in space. The present data suggest that developmental visual input is essential for the use of space to integrate input of the non-visual modalities, possibly because of its high spatial resolution. Alternatively, enhanced perceptual skills of the blind within the intact modalities may result in reduced multisensory interactions ("inverse effectiveness of multisensory integration").
Affiliation(s)
- Kirsten Hötting
- Department of Psychology, Philipps-University Marburg, Gutenbergstrasse 18, 35032 Marburg, Germany.
50
Abstract
Our understanding of the neural correlates of crossmodal binding in the human brain derives almost exclusively from studies of audition, vision and somatosensation. A new study by Gottfried and Dolan extends our understanding of multisensory integration by showing that facilitation of odor detection by visual cues depends on object congruency, as well as on enhanced activity in the superior temporal sulcus and a region of the orbitofrontal cortex that is adjacent to olfactory association cortex.
Affiliation(s)
- Dana A Small
- Cognitive Neurology and Alzheimer's Disease Center, Institute of Neuroscience, Department of Neurology, Northwestern Feinberg School of Medicine, Chicago, IL 60611, USA.