1
Sabek H, Heurley LP, Guerineau R, Dru V. The Simon effect under reversed visual feedback. Psychol Res 2024; 88:1141-1156. [PMID: 38451272] [DOI: 10.1007/s00426-024-01936-x]
Abstract
Our aim was to study the processes involved in the spatial coding of the body during actions producing multiple simultaneous effects. We specifically aimed to challenge the intentional-based account, which proposes that the effects used to code responses are those deemed relevant to the agent's goal. Accordingly, we used a Simon paradigm (widely recognized as a suitable method to investigate the spatial coding of responses) combined with a setup inducing a multimodal discrepancy between visual and tactile/proprioceptive effects (known to be crucial for body schema construction and action control). To be more precise, the setup allowed us to horizontally reverse the visual effects of the hands relative to the tactile/proprioceptive effects (e.g., the right hand was seen as being on the left). In Experiment 1, the visual effects were not reversed. In Experiment 2, the visual effects were reversed, and the task emphasized the relevance of these effects to the participants. In Experiment 3, the visual effects were also reversed, but the task emphasized the relevance of tactile/proprioceptive effects. A Simon effect based on the location of the tactile/proprioceptive effects was observed in Experiments 1 and 3. However, in Experiment 2, the Simon effect was partially driven by the location of the visual effects. These findings collectively support the view that the agent's intention plays a prominent role in the representation of their body during action. This work also suggests a promising avenue for research linking action and body representations.
Affiliation(s)
- Hamza Sabek
- Laboratoire Sur Les Interactions Cognition, Action, Émotion (LICAE), Université Paris Nanterre, 200 Av. de la République, Nanterre, 92000, France.
- Loïc P Heurley
- Laboratoire Sur Les Interactions Cognition, Action, Émotion (LICAE), Université Paris Nanterre, 200 Av. de la République, Nanterre, 92000, France
- Ronan Guerineau
- Laboratoire Sur Les Interactions Cognition, Action, Émotion (LICAE), Université Paris Nanterre, 200 Av. de la République, Nanterre, 92000, France
- Vincent Dru
- Laboratoire Sur Les Interactions Cognition, Action, Émotion (LICAE), Université Paris Nanterre, 200 Av. de la République, Nanterre, 92000, France
2
Kayser C, Heuer H. Multisensory perception depends on the reliability of the type of judgment. J Neurophysiol 2024; 131:723-737. [PMID: 38416720] [DOI: 10.1152/jn.00451.2023]
Abstract
The brain engages the processes of multisensory integration and recalibration to deal with discrepant multisensory signals. These processes consider the reliability of each sensory input, with the more reliable modality receiving the stronger weight. Sensory reliability is typically assessed via the variability of participants' judgments, yet these can be shaped by factors both external and internal to the nervous system. For example, motor noise and a participant's dexterity with the specific response method contribute to judgment variability, and different response methods applied to the same stimuli can result in different estimates of sensory reliabilities. Here we ask how such variations in reliability induced by variations in the response method affect multisensory integration and sensory recalibration, as well as motor adaptation, in a visuomotor paradigm. Participants performed center-out hand movements and were asked to judge the position of the hand or rotated visual feedback at the movement end points. We manipulated the variability, and thus the reliability, of repeated judgments by asking participants to respond using either a visual or a proprioceptive matching procedure. We find that the relative weights of visual and proprioceptive signals, and thus the asymmetry of multisensory integration and recalibration, depend on the reliability modulated by the judgment method. Motor adaptation, in contrast, was insensitive to this manipulation. Hence, the outcome of multisensory binding is shaped by the noise introduced by sensorimotor processing, in line with perception and action being intertwined.
NEW & NOTEWORTHY: Our brain tends to combine multisensory signals based on their respective reliability. This reliability depends on sensory noise in the environment, noise in the nervous system, and, as we show here, variability induced by the specific judgment procedure.
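The reliability-based weighting this abstract describes is conventionally formalized as minimum-variance (maximum-likelihood) cue combination. The following is a standard textbook sketch of that principle, not notation taken from the paper itself; the symbols (V for vision, P for proprioception) are chosen for illustration:

```latex
% Reliability-weighted combination of a visual estimate \hat{s}_V and a
% proprioceptive estimate \hat{s}_P; each weight is the estimate's
% reliability (inverse variance) normalized by the total reliability.
\hat{s} = w_V\,\hat{s}_V + w_P\,\hat{s}_P,
\qquad
w_V = \frac{1/\sigma_V^2}{1/\sigma_V^2 + 1/\sigma_P^2},
\quad
w_P = \frac{1/\sigma_P^2}{1/\sigma_V^2 + 1/\sigma_P^2}.
% The combined estimate is at least as precise as either input alone:
\sigma^2 = \frac{\sigma_V^2\,\sigma_P^2}{\sigma_V^2 + \sigma_P^2}
\le \min\!\left(\sigma_V^2,\ \sigma_P^2\right).
```

On this account, inflating the judgment variability of one modality (as the response-method manipulation here does) lowers that modality's weight, which is the kind of asymmetry the study reports.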
Affiliation(s)
- Christoph Kayser
- Department of Cognitive Neuroscience, Universität Bielefeld, Bielefeld, Germany
- Herbert Heuer
- Department of Cognitive Neuroscience, Universität Bielefeld, Bielefeld, Germany
- Leibniz Research Centre for Working Environment and Human Factors, Dortmund, Germany
3
Debats NB, Heuer H, Kayser C. Different time scales of common-cause evidence shape multisensory integration, recalibration and motor adaptation. Eur J Neurosci 2023; 58:3253-3269. [PMID: 37461244] [DOI: 10.1111/ejn.16095]
Abstract
Perceptual coherence in the face of discrepant multisensory signals is achieved via the processes of multisensory integration, recalibration and sometimes motor adaptation. These supposedly operate on different time scales, with integration reducing immediate sensory discrepancies and recalibration and motor adaptation reflecting the cumulative influence of their recent history. Importantly, whether discrepant signals are bound during perception is guided by the brain's inference of whether they originate from a common cause. When combined, these two notions lead to the hypothesis that the time scales on which integration and recalibration (or motor adaptation) operate are associated with different time scales of evidence about a common cause underlying two signals. We tested this prediction in a well-established visuomotor paradigm, in which human participants performed visually guided hand movements. The kinematic correlation between hand and cursor movements indicates their common origin, which allowed us to manipulate the common-cause evidence by titrating this correlation. Specifically, we dissociated hand and cursor signals during individual movements while preserving their correlation across the series of movement endpoints. Following our hypothesis, this manipulation reduced integration compared with a condition in which visual and proprioceptive signals were perfectly correlated. In contrast, recalibration and motor adaptation were not affected by this manipulation. This supports the notion that multisensory integration and recalibration deal with sensory discrepancies on different time scales guided by common-cause evidence: Integration is prompted by local common-cause evidence and reduces immediate discrepancies, whereas recalibration and motor adaptation are prompted by global common-cause evidence and reduce persistent discrepancies.
Affiliation(s)
- Nienke B Debats
- Department of Cognitive Neuroscience, Universität Bielefeld, Bielefeld, Germany
- Herbert Heuer
- Department of Cognitive Neuroscience, Universität Bielefeld, Bielefeld, Germany
- Leibniz Research Centre for Working Environment and Human Factors, Dortmund, Germany
- Christoph Kayser
- Department of Cognitive Neuroscience, Universität Bielefeld, Bielefeld, Germany
4
Kirsch W, Kunde W. On the Role of Interoception in Body and Object Perception: A Multisensory-Integration Account. Perspect Psychol Sci 2023; 18:321-339. [PMID: 35994810] [PMCID: PMC10018064] [DOI: 10.1177/17456916221096138]
Abstract
Various "embodied perception" phenomena suggest that what people sense of their body shapes what they perceive of the environment and that what they perceive of the environment shapes what they perceive of their bodies. For example, an observer's own hand can be felt where a fake hand is seen, events produced by own body movements seem to occur earlier than they did, and feeling a heavy weight at an observer's back may prompt hills to look steeper. Here we argue that these and various other phenomena are instances of multisensory integration of interoceptive signals from the body and exteroceptive signals from the environment. This overarching view provides a mechanistic description of what embodiment in perception means and how it works. It suggests new research questions while questioning a special role of the body itself and various phenomenon-specific explanations in terms of ownership, agency, or action-related scaling of visual information.
Affiliation(s)
- Wladimir Kirsch
- Department of Psychology, University of Würzburg, Würzburg, Germany
5
Debats NB, Heuer H, Kayser C. Short-term effects of visuomotor discrepancies on multisensory integration, proprioceptive recalibration, and motor adaptation. J Neurophysiol 2023; 129:465-478. [PMID: 36651909] [DOI: 10.1152/jn.00478.2022]
Abstract
Information about the position of our hand is provided by multisensory signals that are often not perfectly aligned. Discrepancies between the seen and felt hand position or its movement trajectory engage the processes of 1) multisensory integration, 2) sensory recalibration, and 3) motor adaptation, which adjust perception and behavioral responses to apparently discrepant signals. To foster our understanding of the coemergence of these three processes, we probed their short-term dependence on multisensory discrepancies in a visuomotor task that has previously served as a model for multisensory perception and motor control. We found that the well-established integration of discrepant visual and proprioceptive signals is tied to the immediate discrepancy and independent of the outcome of the integration of discrepant signals in immediately preceding trials. However, the strength of integration was context dependent, being stronger in an experiment featuring stimuli that covered a smaller range of visuomotor discrepancies (±15°) compared with one covering a larger range (±30°). Both sensory recalibration and motor adaptation for nonrepeated movement directions were absent after two bimodal trials with same or opposite visuomotor discrepancies. Hence our results suggest that short-term sensory recalibration and motor adaptation are not an obligatory consequence of the integration of preceding discrepant multisensory signals.
NEW & NOTEWORTHY: The functional relation between multisensory integration and recalibration remains debated. We here refute the notion that they coemerge in an obligatory manner and support the hypothesis that they serve distinct goals of perception.
Affiliation(s)
- Nienke B Debats
- Department of Cognitive Neuroscience, Universität Bielefeld, Bielefeld, Germany
- Herbert Heuer
- Department of Cognitive Neuroscience, Universität Bielefeld, Bielefeld, Germany
- Leibniz Research Centre for Working Environment and Human Factors, Dortmund, Germany
- Christoph Kayser
- Department of Cognitive Neuroscience, Universität Bielefeld, Bielefeld, Germany
- Department of Cognitive Neuroscience, Universität Bielefeld, Bielefeld, Germany
6
Kirsch W, Kunde W. Perceptual changes after learning of an arbitrary mapping between vision and hand movements. Sci Rep 2022; 12:11427. [PMID: 35794174] [PMCID: PMC9259624] [DOI: 10.1038/s41598-022-15579-8]
Abstract
The present study examined the perceptual consequences of learning arbitrary mappings between visual stimuli and hand movements. Participants moved a small cursor with their unseen hand twice to a large visual target object and then judged either the relative distance of the hand movements (Exp. 1) or the relative number of dots that appeared in the two consecutive target objects (Exp. 2) using a two-alternative forced choice method. During a learning phase, the numbers of dots that appeared in the target object were correlated with the hand movement distance. In Exp. 1, we observed that after the participants were trained to expect many dots with larger hand movements, they judged movements made to targets with many dots as being longer than the same movements made to targets with few dots. In Exp. 2, another group of participants who received the same training judged the same number of dots as smaller when larger rather than smaller hand movements were executed. When many dots were paired with smaller hand movements during the learning phase of both experiments, no significant changes in the perception of movements and of visual stimuli were observed. These results suggest that changes in the perception of body states and of external objects can arise when certain body characteristics co-occur with certain characteristics of the environment. They also indicate that the (dis)integration of multimodal perceptual signals depends not only on the physical or statistical relation between these signals, but also on which signal is currently attended.
Affiliation(s)
- Wladimir Kirsch
- Department of Psychology, University of Würzburg, Röntgenring 11, 97070, Würzburg, Germany.
- Wilfried Kunde
- Department of Psychology, University of Würzburg, Röntgenring 11, 97070, Würzburg, Germany
7
Debats NB, Heuer H, Kayser C. Visuo-proprioceptive integration and recalibration with multiple visual stimuli. Sci Rep 2021; 11:21640. [PMID: 34737371] [PMCID: PMC8569193] [DOI: 10.1038/s41598-021-00992-2]
Abstract
To organize the plethora of sensory signals from our environment into a coherent percept, our brain relies on the processes of multisensory integration and sensory recalibration. We here asked how visuo-proprioceptive integration and recalibration are shaped by the presence of more than one visual stimulus, hence paving the way to study multisensory perception under more naturalistic settings with multiple signals per sensory modality. We used a cursor-control task in which proprioceptive information on the endpoint of a reaching movement was complemented by two visual stimuli providing additional information on the movement endpoint. The visual stimuli were briefly shown, one synchronously with the hand reaching the movement endpoint, the other delayed. In Experiment 1, the judgments of hand movement endpoint revealed integration and recalibration biases oriented towards the position of the synchronous stimulus and away from the delayed one. In Experiment 2 we contrasted two alternative accounts: that only the temporally more proximal visual stimulus enters integration similar to a winner-takes-all process, or that the influences of both stimuli superpose. The proprioceptive biases revealed that integration—and likely also recalibration—are shaped by the superposed contributions of multiple stimuli rather than by only the most powerful individual one.
Affiliation(s)
- Nienke B Debats
- Department of Cognitive Neuroscience, Universität Bielefeld, Universitätsstrasse 25, 33615, Bielefeld, Germany
- Center for Cognitive Interaction Technology (CITEC), Universität Bielefeld, Bielefeld, Germany
- Herbert Heuer
- Department of Cognitive Neuroscience, Universität Bielefeld, Universitätsstrasse 25, 33615, Bielefeld, Germany
- Leibniz Research Centre for Working Environment and Human Factors, Dortmund, Germany
- Christoph Kayser
- Department of Cognitive Neuroscience, Universität Bielefeld, Universitätsstrasse 25, 33615, Bielefeld, Germany
- Center for Cognitive Interaction Technology (CITEC), Universität Bielefeld, Bielefeld, Germany
8
Impact of proprioception on the perceived size and distance of external objects in a virtual action task. Psychon Bull Rev 2021; 28:1191-1201. [PMID: 33782919] [PMCID: PMC8367880] [DOI: 10.3758/s13423-021-01915-y]
Abstract
Previous research has revealed changes in the perception of objects due to changes of object-oriented actions. In the present study, we varied the arm and finger postures in the context of a virtual reaching and grasping task and tested whether this manipulation can simultaneously affect the perceived size and distance of external objects. Participants manually controlled visual cursors, aiming at reaching and enclosing a distant target object, and judged the size and distance of this object. We observed that a visual-proprioceptive discrepancy introduced during the reaching part of the action simultaneously affected the judgments of target distance and of target size (Experiment 1). A related variation applied to the grasping part of the action affected the judgments of size, but not of distance of the target (Experiment 2). These results indicate that perceptual effects observed in the context of actions can arise directly through sensory integration of multimodal redundant signals and indirectly through perceptual constancy mechanisms.
9
Abstract
Spatial action-effect binding denotes the mutual attraction between the perceived position of an effector (e.g., one's own hand) and a distal object that is controlled by this effector. Such spatial binding can be construed as an implicit measure of object ownership, thus the belonging of a controlled object to the own body. The current study investigated how different transformations of hand movements (body-internal action component) into movements of a visual object (body-external action component) affect spatial action-effect binding, and thus implicit object ownership. In brief, participants had to bring a cursor on the computer screen into a predefined target position by moving their occluded hand on a tablet and had to estimate their final hand position. In Experiment 1, we found a significantly lower drift of the proprioceptive position of the hand towards the visual object when hand movements were transformed into laterally inverted cursor movements, rather than cursor movements in the same direction. Experiment 2 showed that this reduction reflected an elimination of spatial action-effect binding in the inverted condition. The results are discussed with respect to the prerequisites for an experience of ownership over artificial, noncorporeal objects. Our results show that predictability of an object movement alone is not a sufficient condition for ownership because, depending on the type of transformation, integration of the effector and a distal object can be fully abolished even under conditions of full controllability.
10
Debats NB, Heuer H. Exploring the time window for causal inference and the multisensory integration of actions and their visual effects. R Soc Open Sci 2020; 7:192056. [PMID: 32968497] [PMCID: PMC7481684] [DOI: 10.1098/rsos.192056]
Abstract
Successful computer use requires the operator to link the movement of the cursor to that of his or her hand. Previous studies suggest that the brain establishes this perceptual link through multisensory integration, whereby the causality evidence that drives the integration is provided by the correlated hand and cursor movement trajectories. Here, we explored the temporal window during which this causality evidence is effective. We used a basic cursor-control task, in which participants performed out-and-back reaching movements with their hand on a digitizer tablet. A corresponding cursor movement could be shown on a monitor, yet slightly rotated by an angle that varied from trial to trial. Upon completion of the backward movement, participants judged the endpoint of the outward hand or cursor movement. The mutually biased judgements that typically result reflect the integration of the proprioceptive information on hand endpoint with the visual information on cursor endpoint. We here manipulated the time period during which the cursor was visible, thereby selectively providing causality evidence either before or after sensory information regarding the to-be-judged movement endpoint was available. Specifically, the cursor was visible either during the outward or backward hand movement (conditions Out and Back, respectively). Our data revealed reduced integration in the condition Back compared with the condition Out, suggesting that causality evidence available before the to-be-judged movement endpoint is more powerful than later evidence in determining how strongly the brain integrates the endpoint information. This finding further suggests that sensory integration is not delayed until a judgement is requested.
Affiliation(s)
- Nienke B. Debats
- Department of Cognitive Neuroscience, Universität Bielefeld, Bielefeld, Germany
- Cognitive Interaction Technology Center of Excellence (CITEC), Universität Bielefeld, Bielefeld, Germany
- Herbert Heuer
- Department of Cognitive Neuroscience, Universität Bielefeld, Bielefeld, Germany
- Leibniz Research Centre for Working Environment and Human Factors, Dortmund, Germany
11
A condition that produces sensory recalibration and abolishes multisensory integration. Cognition 2020; 202:104326. [PMID: 32464344] [DOI: 10.1016/j.cognition.2020.104326]
Abstract
We examined the influence of extended exposure to a visuomotor rotation, which induces both motor adaptation and sensory recalibration, on (partial) multisensory integration in a cursor-control task. Participants adapted to a 30° (adaptation condition) or 0° (control condition) visuomotor rotation by making center-out movements to remembered targets. In subsequent test trials of sensory integration, they made center-out movements with variable visuomotor rotations and judged the position of hand or cursor at the end of these movements. Test trials were randomly embedded among twice the number of maintenance trials with 30° or 0° rotation. The biases of perceived hand (or cursor) position toward the cursor (or hand) position were measured. We found motor adaptation together with proprioceptive and visual recalibrations in the adaptation condition. Unexpectedly, multisensory integration was absent in both the adaptation and control condition. The absence stemmed from the extensive experience of constant visuomotor rotations of 30° or 0°, which probably produced highly precise predictions of the visual consequences of hand movements. The frequently confirmed predictions then dominated the estimate of the visual movement consequences, leaving no influence of the actual visuomotor rotations in the minority of test trials. Conversely, multisensory integration was present for sensed hand positions when these were indirectly assessed from movement characteristics, indicating that the relative weighting of discrepant estimates of hand position was different for motor control. The existence of a condition that abolishes multisensory integration while keeping sensory recalibration suggests that mechanisms that reduce sensory discrepancies (partly) differ between integration and recalibration.
12
Multisensory integration in virtual interactions with distant objects. Sci Rep 2019; 9:17362. [PMID: 31758046] [PMCID: PMC6874595] [DOI: 10.1038/s41598-019-53921-9]
Abstract
Statistically optimal integration of multimodal signals is known to take place in direct interactions with environmental objects. In the present study we tested whether the same mechanism is responsible for perceptual biases observed in a task in which participants enclose visual objects by manually controlled visual cursors. We manipulated the relative reliability of visual object information and measured the impact of body-related information on object perception as well as the perceptual variability. The results were qualitatively consistent with statistically optimal sensory integration. However, quantitatively, the observed bias and variability measures systematically differed from the model predictions. This outcome points to a compensatory mechanism, similar to the reliability-based weighting of multisensory signals, which could underlie the effects of action on visual perception reported across diverse contexts.
13
Rand MK, Heuer H. Visual and proprioceptive recalibrations after exposure to a visuomotor rotation. Eur J Neurosci 2019; 50:3296-3310. [PMID: 31077463] [DOI: 10.1111/ejn.14433]
Abstract
Adaptation to a visuomotor rotation in a cursor-control task is accompanied by proprioceptive recalibration, whereas the existence of visual recalibration is uncertain and has even been doubted. In the present study, we tested both visual and proprioceptive recalibration; proprioceptive recalibration was not only assessed by means of psychophysical judgments of the perceived position of the hand, but also by an indirect procedure based on movement characteristics. Participants adapted to a gradually introduced visuomotor rotation of 30° by making center-out movements to remembered targets. In subsequent test trials, they made center-out movements without visual feedback or observed center-out motions of a cursor without moving the hand. In each test trial, they judged the endpoint of hand or cursor by matching the position of the hand or of a visual marker, respectively, moving along a semicircular path. This path ran through all possible endpoints of the center-out movements. We observed proprioceptive recalibration of 7.3° (3.1° with the indirect procedure) and a smaller, but significant, visual recalibration of 1.3°. Total recalibration of 8.6° was about half as strong as motor adaptation, the adaptive shift of the movement direction. The evidence of both proprioceptive and visual recalibration was obtained with a judgment procedure that suggests that recalibration is restricted to the type of movement performed during exposure to a visuomotor rotation. Consequently, identical physical positions of the hand can be perceived differently depending on how they have been reached, and similarly identical positions of a cursor on a monitor can be perceived differently.
Affiliation(s)
- Miya K Rand
- Leibniz Research Centre for Working Environment and Human Factors, Dortmund, Germany
- Herbert Heuer
- Leibniz Research Centre for Working Environment and Human Factors, Dortmund, Germany
14
Rand MK, Heuer H. Effects of Hand and Hemispace on Multisensory Integration of Hand Position and Visual Feedback. Front Psychol 2019; 10:237. [PMID: 30809172] [PMCID: PMC6379332] [DOI: 10.3389/fpsyg.2019.00237]
Abstract
The brain generally integrates a multitude of sensory signals to form a unified percept. Even in cursor control tasks, such as reaching while looking at rotated visual feedback on a monitor, visual information on cursor position and proprioceptive information on hand position are partially integrated (sensory coupling), resulting in mutual biases of the perceived positions of cursor and hand. Previous studies showed that the strength of sensory coupling (sum of the mutual biases) depends on the experience of kinematic correlations between hand movements and cursor motions, whereas the asymmetry of sensory coupling (difference between the biases) depends on the relative reliabilities (inverse of variability) of hand-position and cursor-position estimates (reliability rule). Furthermore, the precision of movement control and perception of hand position are known to differ between hands (left, right) and workspaces (ipsilateral, contralateral), and so does the experience of kinematic correlations from daily life activities. Thus, in the present study, we tested whether strength and asymmetry of sensory coupling for the endpoints of reaches in a cursor control task differ between the right and left hand and between ipsilateral and contralateral hemispace. No differences were found in the strength of sensory coupling between hands or between hemispaces. However, asymmetry of sensory coupling was less in ipsilateral than in contralateral hemispace: in ipsilateral hemispace, the bias of the perceived hand position was reduced, which was accompanied by a smaller variability of the estimates. The variability of position estimates of the dominant right hand was also less than for the non-dominant left hand, but this difference was not accompanied by a difference in the asymmetry of sensory coupling – a violation of the reliability rule, probably due to a stronger influence of visual information on right-hand movements.
According to these results, the long-term effects of the experienced kinematic correlation between hand movements and cursor motions on the strength of sensory coupling are generic and not specific for hemispaces or hands, whereas the effects of relative reliabilities on the asymmetry of sensory coupling are specific for hemispaces but not for hands.
Affiliation(s)
- Miya K Rand
- Leibniz Research Centre for Working Environment and Human Factors, TU Dortmund (IfADo), Dortmund, Germany
- Herbert Heuer
- Leibniz Research Centre for Working Environment and Human Factors, TU Dortmund (IfADo), Dortmund, Germany