1
Butz MV. Resourceful Event-Predictive Inference: The Nature of Cognitive Effort. Front Psychol 2022;13:867328. [PMID: 35846607; PMCID: PMC9280204; DOI: 10.3389/fpsyg.2022.867328]
Abstract
Pursuing a precise, focused train of thought requires cognitive effort. Even more effort is necessary when more alternatives need to be considered or when the imagined situation becomes more complex. Cognitive resources available to us limit the cognitive effort we can spend. In line with previous work, an information-theoretic, Bayesian brain approach to cognitive effort is pursued: to solve tasks in our environment, our brain needs to invest information, that is, negative entropy, to impose structure, or focus, away from a uniform structure or other task-incompatible, latent structures. To get a more complete formalization of cognitive effort, a resourceful event-predictive inference model (REPI) is introduced, which offers computational and algorithmic explanations about the latent structure of our generative models, the active inference dynamics that unfold within, and the cognitive effort required to steer the dynamics: for example, to purposefully process sensory signals, decide on responses, and invoke their execution. REPI suggests that we invest cognitive resources to infer preparatory priors, activate responses, and anticipate action consequences. Due to our limited resources, though, the inference dynamics are prone to task-irrelevant distractions. For example, the task-irrelevant side of the imperative stimulus causes the Simon effect and, due to similar reasons, we fail to optimally switch between tasks. An actual model implementation simulates such task interactions and offers first estimates of the involved cognitive effort. The approach may be further studied and promises to offer deeper explanations about why we get quickly exhausted from multitasking, how we are influenced by irrelevant stimulus modalities, why we exhibit magnitude interference, and, during social interactions, why we often fail to take the perspective of others into account.
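The information-theoretic notion of effort sketched in this abstract, investing negative entropy to impose structure away from a uniform distribution, can be illustrated numerically. The following Python snippet is our own toy illustration, not part of REPI; the function name and the bits-based units are assumptions:

```python
import math

def structure_information(p):
    """Information (in bits) invested to impose the distribution p over
    len(p) alternatives, measured as the KL divergence from the uniform
    distribution: zero for no imposed structure, log2(n) for full focus
    on a single alternative."""
    n = len(p)
    return sum(pi * math.log2(pi * n) for pi in p if pi > 0)

# No focus at all costs nothing; full focus on 1 of 4 options costs 2 bits.
print(structure_information([0.25, 0.25, 0.25, 0.25]))  # 0.0
print(structure_information([1.0, 0.0, 0.0, 0.0]))      # 2.0
```

Considering more alternatives (larger n) or sharper focus both raise this cost, matching the intuition that precision and complexity demand more cognitive effort.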
Affiliation(s)
- Martin V. Butz
- Neuro-Cognitive Modeling Group, Department of Computer Science, University of Tübingen, Tübingen, Germany
- Department of Psychology, Faculty of Science, University of Tübingen, Tübingen, Germany
2
Rosenblum L, Grewe E, Churan J, Bremmer F. Influence of Tactile Flow on Visual Heading Perception. Multisens Res 2022;35:291-308. [PMID: 35263712; DOI: 10.1163/22134808-bja10071]
Abstract
The integration of information from different sensory modalities is crucial for successful navigation through an environment. Among others, self-motion induces distinct optic flow patterns on the retina, vestibular signals and tactile flow, which contribute to determining traveled distance (path integration) or movement direction (heading). While the processing of combined visual-vestibular information is subject to a growing body of literature, the processing of visuo-tactile signals in the context of self-motion has received comparatively little attention. Here, we investigated whether visual heading perception is influenced by behaviorally irrelevant tactile flow. In the visual modality, we simulated an observer's self-motion across a horizontal ground plane (optic flow). Tactile self-motion stimuli were delivered by air flow from head-mounted nozzles (tactile flow). In blocks of trials, we presented only visual or tactile stimuli and subjects had to report their perceived heading. In another block of trials, tactile and visual stimuli were presented simultaneously, with the tactile flow within ±40° of the visual heading (bimodal condition). Here, importantly, participants had to report their perceived visual heading. Perceived self-motion direction in all conditions revealed a centripetal bias, i.e., heading directions were perceived as compressed toward straight ahead. In the bimodal condition, we found a small but systematic influence of task-irrelevant tactile flow on visually perceived headings as a function of their directional offset. We conclude that tactile flow is more tightly linked to self-motion perception than previously thought.
Affiliation(s)
- Lisa Rosenblum
- Department of Neurophysics, Philipps-Universität Marburg, Karl-von-Frisch-Straße 8a, 35043 Marburg, Germany; Center for Mind, Brain and Behavior, Philipps-Universität Marburg and Justus-Liebig-Universität Giessen, 35032 Marburg, Germany
- Elisa Grewe
- Department of Neurophysics, Philipps-Universität Marburg, Karl-von-Frisch-Straße 8a, 35043 Marburg, Germany
- Jan Churan
- Department of Neurophysics, Philipps-Universität Marburg, Karl-von-Frisch-Straße 8a, 35043 Marburg, Germany; Center for Mind, Brain and Behavior, Philipps-Universität Marburg and Justus-Liebig-Universität Giessen, 35032 Marburg, Germany
- Frank Bremmer
- Department of Neurophysics, Philipps-Universität Marburg, Karl-von-Frisch-Straße 8a, 35043 Marburg, Germany; Center for Mind, Brain and Behavior, Philipps-Universität Marburg and Justus-Liebig-Universität Giessen, 35032 Marburg, Germany
3
Scotto CR, Moscatelli A, Pfeiffer T, Ernst MO. Visual pursuit biases tactile velocity perception. J Neurophysiol 2021;126:540-549. [PMID: 34259048; DOI: 10.1152/jn.00541.2020]
Abstract
During a smooth pursuit eye movement of a target stimulus, a briefly flashed stationary background appears to move in the direction opposite to the eye's motion, an effect known as the Filehne illusion. Similar illusions occur in audition, in the vestibular system, and in touch. Recently, we found that the movement of a surface perceived from tactile slip was biased if this surface was sensed with the moving hand. The analogy between these two illusions suggests similar mechanisms of motion processing between vision and touch. In the present study, we further assessed the interplay between these two sensory channels by investigating a novel paradigm that associated an eye pursuit of a visual target with a tactile motion over the skin of the fingertip. We showed that smooth pursuit eye movements can bias the perceived direction of motion in touch. As in the classical report of the Filehne illusion in vision, a static tactile surface was perceived as moving rightward with a leftward eye pursuit movement, and vice versa. However, this time the direction of surface motion was perceived from touch. The biasing effects of eye pursuit on tactile motion were modulated by the reliability of the tactile and visual stimuli, consistent with a Bayesian model of motion perception. Overall, these results support a modality- and effector-independent process with common representations for motion perception.
NEW & NOTEWORTHY: The study showed that smooth pursuit eye movement produces a bias in tactile motion perception. This phenomenon is modulated by the reliability of the tactile estimate and by the presence of a visual background, in line with the predictions of the Bayesian framework of motion perception. Overall, these results support the hypothesis of shared representations for motion perception.
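The reliability-dependent modulation reported here is the signature of Bayesian (maximum-likelihood) cue combination. A minimal Python sketch follows; it is our own illustration under a Gaussian-cue assumption, not the authors' model, and the function name is hypothetical:

```python
import math

def fuse_cues(mu_v, sigma_v, mu_t, sigma_t):
    """Maximum-likelihood fusion of two independent Gaussian motion cues.

    Each cue is weighted by its relative reliability (inverse variance):
    the noisier the visual cue, the more the fused estimate is pulled
    toward the tactile cue, and vice versa."""
    w_v = sigma_t**2 / (sigma_v**2 + sigma_t**2)   # weight of the visual cue
    mu = w_v * mu_v + (1.0 - w_v) * mu_t           # fused motion estimate
    sigma = math.sqrt((sigma_v**2 * sigma_t**2) / (sigma_v**2 + sigma_t**2))
    return mu, sigma

# Equally reliable cues average; the fused estimate is more precise than either.
print(fuse_cues(0.0, 1.0, 10.0, 1.0))  # (5.0, ~0.707)
```

Weakening one cue (raising its sigma) shifts the fused estimate toward the other cue, which is the qualitative pattern the abstract describes.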
Affiliation(s)
- Cécile R Scotto
- Centre de Recherches sur la Cognition et l'Apprentissage, Université de Poitiers, Université François Rabelais de Tours, Centre National de la Recherche Scientifique, Poitiers, France
- Alessandro Moscatelli
- Department of Systems Medicine and Centre of Space Bio-Medicine, University of Rome "Tor Vergata", Rome, Italy; Laboratory of Neuromotor Physiology, Istituto di Ricovero e Cura a Carattere Scientifico Santa Lucia Foundation, Rome, Italy
- Thies Pfeiffer
- Faculty of Technology and Cognitive Interaction Technology-Center of Excellence, Bielefeld University, Bielefeld, Germany
- Marc O Ernst
- Applied Cognitive Systems, Ulm University, Ulm, Germany
4
Abstract
There is an ongoing debate whether or not multisensory interactions require awareness of the sensory signals. Static visual and tactile stimuli have been shown to influence each other even in the absence of visual awareness. However, it is unclear if this finding generalizes to dynamic contexts. In the present study, we presented visual and tactile motion stimuli and induced fluctuations of visual awareness by means of binocular rivalry: two gratings which drifted in opposite directions were displayed, one to each eye. One visual motion stimulus dominated and reached awareness while the other visual stimulus was suppressed from awareness. Tactile motion stimuli were presented at random time points during the visual stimulation. The motion direction of a tactile stimulus always matched the direction of one of the concurrently presented visual stimuli. The visual gratings were differently tinted, and participants reported the color of the currently seen stimulus. Tactile motion delayed perceptual switches that ended dominance periods of congruently moving visual stimuli compared to switches during visual-only stimulation. In addition, tactile motion fostered the return to dominance of suppressed, congruently moving visual stimuli, but only if the tactile motion started at a late stage of the ongoing visual suppression period. At later stages, perceptual suppression is typically decreasing. These results suggest that visual awareness facilitates but does not gate multisensory interactions between visual and tactile motion signals.
5
Lohmann J, Rolke B, Butz MV. In touch with mental rotation: interactions between mental and tactile rotations and motor responses. Exp Brain Res 2017;235:1063-1079. [PMID: 28078359; DOI: 10.1007/s00221-016-4861-8]
Abstract
Although several process models have described the cognitive processing stages that are involved in mentally rotating objects, the exact nature of the rotation process itself remains elusive. According to embodied cognition, cognitive functions are deeply grounded in the sensorimotor system. We thus hypothesized that modal rotation perceptions should influence mental rotations. We conducted two studies in which participants had to judge if a rotated letter was visually presented canonically or mirrored. Concurrently, participants had to judge if a tactile rotation on their palm changed direction during the trial. The results show that tactile rotations can systematically influence mental rotation performance in that rotations in the same direction are favored. In addition, the results show that mental rotations produce a response compatibility effect: clockwise mental rotations facilitate responses to the right, while counterclockwise mental rotations facilitate responses to the left. We conclude that the execution of mental rotations activates cognitive mechanisms that are also used to perceive rotations in different modalities and that are associated with directional motor control processes.
Affiliation(s)
- Johannes Lohmann
- Cognitive Modeling, Department of Computer Science, University of Tübingen, Tübingen, Germany
- Bettina Rolke
- Evolutionary Cognition, Department of Psychology, University of Tübingen, Tübingen, Germany
- Martin V Butz
- Cognitive Modeling, Department of Computer Science, University of Tübingen, Tübingen, Germany
6
Amemiya T, Hirota K, Ikei Y. Tactile Apparent Motion on the Torso Modulates Perceived Forward Self-Motion Velocity. IEEE Trans Haptics 2016;9:474-482. [PMID: 27514066; DOI: 10.1109/toh.2016.2598332]
Abstract
The present study investigated whether a tactile flow created by a matrix of vibrators in a seat pan, presented simultaneously with an optical flow in peripheral vision, enhances the perceived forward velocity of self-motion. A brief tactile motion stimulus consisted of four successive rows of vibration, and the interstimulus onset between the tactile rows was varied to change the velocity of the tactile motion. The results show that the forward velocity of self-motion is significantly overestimated for rapid tactile flows and underestimated for slow ones, compared with optical flow alone or non-motion vibrotactile stimulation conditions. In addition, the effect of a temporal tactile rhythm that did not change stimulus location was smaller than that of spatiotemporal tactile motion presented at an interstimulus onset interval that elicits a clear sensation of tactile apparent motion. These findings suggest that spatiotemporal tactile motion is effective in inducing a change in the perceived forward velocity of self-motion.
7
Butz MV. Toward a Unified Sub-symbolic Computational Theory of Cognition. Front Psychol 2016;7:925. [PMID: 27445895; PMCID: PMC4915327; DOI: 10.3389/fpsyg.2016.00925]
Abstract
This paper proposes how various disciplinary theories of cognition may be combined into a unifying, sub-symbolic, computational theory of cognition. The following theories are considered for integration: psychological theories, including the theory of event coding, event segmentation theory, the theory of anticipatory behavioral control, and concept development; artificial intelligence and machine learning theories, including reinforcement learning and generative artificial neural networks; and theories from theoretical and computational neuroscience, including predictive coding and free energy-based inference. In the light of such a potential unification, it is discussed how abstract cognitive, conceptualized knowledge and understanding may be learned from actively gathered sensorimotor experiences. The unification rests on the free energy-based inference principle, which essentially implies that the brain builds a predictive, generative model of its environment. Neural activity-oriented inference causes the continuous adaptation of the currently active predictive encodings. Neural structure-oriented inference causes the longer term adaptation of the developing generative model as a whole. Finally, active inference strives for maintaining internal homeostasis, causing goal-directed motor behavior. To learn abstract, hierarchical encodings, however, it is proposed that free energy-based inference needs to be enhanced with structural priors, which bias cognitive development toward the formation of particular, behaviorally suitable encoding structures. As a result, it is hypothesized how abstract concepts can develop from, and thus how they are structured by and grounded in, sensorimotor experiences. Moreover, it is sketched out how symbol-like thought can be generated by a temporarily active set of predictive encodings, which constitute a distributed neural attractor in the form of an interactive free-energy minimum.
The activated, interactive network attractor essentially characterizes the semantics of a concept or a concept composition, such as an actual or imagined situation in our environment. Temporal successions of attractors then encode unfolding semantics, which may be generated by a behavioral or mental interaction with an actual or imagined situation in our environment. Implications, further predictions, possible verification, and falsifications, as well as potential enhancements into a fully spelled-out unified theory of cognition are discussed at the end of the paper.
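The free energy-based inference principle the paper builds on can be caricatured, for a single Gaussian latent variable with an identity generative mapping, as gradient descent on precision-weighted prediction errors. This toy Python sketch is our own simplification, not the paper's model; all names and parameter values are assumptions:

```python
def infer_latent(x_obs, prior_mu, prior_var, obs_var, steps=100, lr=0.1):
    """Toy predictive-coding inference for one Gaussian latent variable.

    The estimate v descends the gradient of the (negative log) free energy,
    balancing the bottom-up sensory prediction error against the top-down
    prior prediction error until the two agree."""
    v = prior_mu
    for _ in range(steps):
        eps_prior = (v - prior_mu) / prior_var   # top-down (prior) error
        eps_obs = (x_obs - v) / obs_var          # bottom-up (sensory) error
        v += lr * (eps_obs - eps_prior)          # identity mapping: dg/dv = 1
    return v

# With equal prior and sensory precision, v settles midway (posterior mean).
print(infer_latent(2.0, 0.0, 1.0, 1.0))  # ~1.0
```

Raising the sensory precision (lowering obs_var) pulls the settled estimate toward the observation, mirroring how active predictive encodings adapt to input.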
Affiliation(s)
- Martin V Butz
- Cognitive Modeling, Department of Computer Science and Department of Psychology, Eberhard Karls University of Tübingen, Tübingen, Germany
8
An invisible touch: Body-related multisensory conflicts modulate visual consciousness. Neuropsychologia 2015;88:131-139. [PMID: 26519553; DOI: 10.1016/j.neuropsychologia.2015.10.034]
Abstract
The majority of scientific studies on consciousness have focused on vision, exploring the cognitive and neural mechanisms of conscious access to visual stimuli. In parallel, studies on bodily consciousness have revealed that bodily (i.e. tactile, proprioceptive, visceral, vestibular) signals are the basis for the sense of self. However, the role of bodily signals in the formation of visual consciousness is not well understood. Here we investigated how body-related visuo-tactile stimulation modulates conscious access to visual stimuli. We used a robotic platform to apply controlled tactile stimulation to the participants' back while they viewed a dot moving either in synchrony or asynchrony with the touch on their back. Critically, the dot was rendered invisible through continuous flash suppression. Manipulating the visual context by presenting the dot moving on either a body form, or a non-bodily object we show that: (i) conflict induced by synchronous visuo-tactile stimulation in a body context is associated with a delayed conscious access compared to asynchronous visuo-tactile stimulation, (ii) this effect occurs only in the context of a visual body form, and (iii) is not due to detection or response biases. The results indicate that body-related visuo-tactile conflicts impact visual consciousness by facilitating access of non-conflicting visual information to awareness, and that these are sensitive to the visual context in which they are presented, highlighting the interplay between bodily signals and visual experience.
9
Keetels M, Stekelenburg JJ. Motor-induced visual motion: hand movements driving visual motion perception. Exp Brain Res 2014;232:2865-77. [PMID: 24820287; DOI: 10.1007/s00221-014-3959-0]
Abstract
Visual perception can be changed by co-occurring input from other sensory modalities. Here, we explored how self-generated finger movements (left-right or up-down key presses) affect visual motion perception. In Experiment 1, motion perception of a blinking bar was shifted in the direction of co-occurring hand motor movements, indicative of motor-induced visual motion (MIVM). In Experiment 2, moving and static blinking bars were combined with either directional moving or stationary hand motor movements. Results showed that the directional component in the hand movement was crucial for MIVM, as stationary motor movements even reduced visual motion perception. In Experiment 3, the role of response bias was excluded in a two-alternative forced-choice task that ruled out the effect of response strategies. All three experiments demonstrated that alternating key presses (either horizontally or vertically aligned) induce illusory visual motion and that stationary motor movements (without a vertical or horizontal direction) induce the opposite effect, namely a decline in visual motion (more static) perception.
Affiliation(s)
- Mirjam Keetels
- Department of Cognitive Neuropsychology, Tilburg University, Tilburg, The Netherlands
10
Tactile and visual motion direction processing in hMT+/V5. Neuroimage 2014;84:420-7. [DOI: 10.1016/j.neuroimage.2013.09.004]
11
Ehrenfeld S, Herbort O, Butz MV. Modular neuron-based body estimation: maintaining consistency over different limbs, modalities, and frames of reference. Front Comput Neurosci 2013;7:148. [PMID: 24191151; PMCID: PMC3808893; DOI: 10.3389/fncom.2013.00148]
Abstract
This paper addresses the question of how the brain maintains a probabilistic body state estimate over time from a modeling perspective. The neural Modular Modality Frame (nMMF) model simulates such a body state estimation process by continuously integrating redundant, multimodal body state information sources. The body state estimate itself is distributed over separate, but bidirectionally interacting modules. nMMF compares the incoming sensory and present body state information across the interacting modules and fuses the information sources accordingly. At the same time, nMMF enforces body state estimation consistency across the modules. nMMF is able to detect conflicting sensory information and to consequently decrease the influence of implausible sensor sources on the fly. In contrast to the previously published Modular Modality Frame (MMF) model, nMMF offers a biologically plausible neural implementation based on distributed, probabilistic population codes. Besides its neural plausibility, the neural encoding has the advantage of enabling (a) additional probabilistic information flow across the separate body state estimation modules and (b) the representation of arbitrary probability distributions of a body state. The results show that the neural estimates can detect and decrease the impact of false sensory information, can propagate conflicting information across modules, and can improve overall estimation accuracy due to additional module interactions. Even bodily illusions, such as the rubber hand illusion, can be simulated with nMMF. We conclude with an outlook on the potential of modeling human data and of invoking goal-directed behavioral control.
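The conflict detection and down-weighting that nMMF performs across modules can be caricatured with precision-weighted Gaussian fusion plus a z-score plausibility gate. This is a deliberately simplified sketch, not the model's neural population-code implementation; the function name, the threshold, and the inflation factor are our assumptions:

```python
import math

def reconcile(mu_a, var_a, mu_b, var_b, z_thresh=3.0):
    """Fuse two modality estimates of the same body state.

    If the estimates conflict (large z-score of their difference), the
    noisier source is treated as implausible and its variance is inflated,
    so it barely influences the fused result."""
    z = abs(mu_a - mu_b) / math.sqrt(var_a + var_b)
    if z > z_thresh:
        if var_a > var_b:
            var_a *= 100.0   # down-weight the implausible, noisier source
        else:
            var_b *= 100.0
    p_a, p_b = 1.0 / var_a, 1.0 / var_b        # precisions
    var = 1.0 / (p_a + p_b)
    mu = (p_a * mu_a + p_b * mu_b) * var       # precision-weighted mean
    return mu, var
```

Compatible estimates are simply averaged by precision (and the fused variance shrinks); a grossly conflicting, noisy estimate is effectively ignored, analogous to how nMMF decreases the influence of implausible sensor sources on the fly.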
Affiliation(s)
- Stephan Ehrenfeld
- Cognitive Modeling, Department of Computer Science, Eberhard Karls University of Tübingen, Tübingen, Germany
12
Pei YC, Chang TY, Lee TC, Saha S, Lai HY, Gomez-Ramirez M, Chou SW, Wong AMK. Cross-modal sensory integration of visual-tactile motion information: instrument design and human psychophysics. Sensors 2013;13:7212-23. [PMID: 23727955; PMCID: PMC3715219; DOI: 10.3390/s130607212]
Abstract
Information obtained from multiple sensory modalities, such as vision and touch, is integrated to yield a holistic percept. As a haptic approach usually involves cross-modal sensory experiences, it is necessary to develop an apparatus that can characterize how a biological system integrates visual-tactile sensory information as well as how a robotic device infers object information emanating from both vision and touch. In the present study, we develop a novel visual-tactile cross-modal integration stimulator that consists of an LED panel to present visual stimuli and a tactile stimulator with three degrees of freedom that can present tactile motion stimuli with arbitrary motion direction, speed, and indentation depth in the skin. The apparatus can present cross-modal stimuli in which the spatial locations of visual and tactile stimulations are perfectly aligned. We presented visual-tactile stimuli in which the visual and tactile directions were either congruent or incongruent, and human observers reported the perceived visual direction of motion. Results showed that perceived direction of visual motion can be biased by the direction of tactile motion when visual signals are weakened. The results also showed that the visual-tactile motion integration follows the rule of temporal congruency of multi-modal inputs, a fundamental property known for cross-modal integration.
Affiliation(s)
- Yu-Cheng Pei
- Department of Physical Medicine and Rehabilitation, Chang Gung Memorial Hospital at Linkou, No. 5, Fushing St, Taoyuan 333, Taiwan
- Healthy Aging Research Center, Chang Gung University, No. 259, Wen-Hwa 1st Road, Taoyuan 333, Taiwan
- School of Medicine, Chang Gung University, No. 259, Wen-Hwa 1st Road, Taoyuan 333, Taiwan
- Ting-Yu Chang
- Department of Physical Medicine and Rehabilitation, Chang Gung Memorial Hospital at Linkou, No. 5, Fushing St, Taoyuan 333, Taiwan
- Tsung-Chi Lee
- Department of Physical Medicine and Rehabilitation, Chang Gung Memorial Hospital at Linkou, No. 5, Fushing St, Taoyuan 333, Taiwan
- Sudipta Saha
- Department of Physical Medicine and Rehabilitation, Chang Gung Memorial Hospital at Linkou, No. 5, Fushing St, Taoyuan 333, Taiwan
- Hsin-Yi Lai
- Department of Physical Medicine and Rehabilitation, Chang Gung Memorial Hospital at Linkou, No. 5, Fushing St, Taoyuan 333, Taiwan
- Manuel Gomez-Ramirez
- The Zanvyl Krieger Mind/Brain Institute, Johns Hopkins University, 3400 N. Charles Street, 338 Krieger Hall, Baltimore, MD 21218, USA
- Shih-Wei Chou
- Department of Physical Medicine and Rehabilitation, Chang Gung Memorial Hospital at Linkou, No. 5, Fushing St, Taoyuan 333, Taiwan
- Alice M. K. Wong
- Department of Physical Medicine and Rehabilitation, Chang Gung Memorial Hospital at Linkou, No. 5, Fushing St, Taoyuan 333, Taiwan
13
Thomaschke R. Investigating ideomotor cognition with motorvisual priming paradigms: key findings, methodological challenges, and future directions. Front Psychol 2012. [PMID: 23189067; PMCID: PMC3505020; DOI: 10.3389/fpsyg.2012.00519]
Abstract
Ideomotor theory claims that perceptual representations of action-effects are functionally involved in the planning of actions. Strong evidence for this claim comes from a phenomenon called motorvisual priming. Motorvisual priming refers to the finding that action planning directly affects perception, and that the effects are selective for stimuli that share features with the planned action. Motorvisual priming studies have provided detailed insights into the processing of perceptual representations in action planning. One important finding is that such representations in action planning have a categorical format, whereas metric representations are not anticipated in planning. Further essential findings regard the processing mechanisms and the time course of ideomotor cognition. Perceptual representations of action-effects are first activated by action planning and then bound into a compound representation of the action plan. This compound representation is stabilized throughout the course of the action by the shielding of all involved representations from other cognitive processes. Despite a rapid growth in the number of motorvisual priming studies in the current literature, there are still many aspects of ideomotor cognition which have not yet been investigated. These aspects include the scope of ideomotor processing with regard to action types and stimulus types, as well as the exact nature of the binding and shielding mechanisms involved.
Affiliation(s)
- Roland Thomaschke
- Institut für Psychologie, Universität Regensburg, Regensburg, Germany
14
van Elk M, Blanke O. Balancing bistable perception during self-motion. Exp Brain Res 2012;222:219-28. [DOI: 10.1007/s00221-012-3209-2]