1. Yan L, Ma Y, Yang W, Xiang X, Nan W. Similarities of SNARC, cognitive Simon, and visuomotor Simon effects in terms of response time distributions, hand-stimulus proximity, and temporal dynamics. Psychol Res 2024; 88:607-620. [PMID: 37594569] [DOI: 10.1007/s00426-023-01866-0]
Abstract
The spatial-numerical association of response codes (SNARC) and Simon effects are attributed to the same type of conflict according to dimensional overlap (DO) theory: the congruency of task-irrelevant spatial information and the selected response (e.g., left or right). However, previous studies have yielded inconsistent results regarding the relationship between the two effects, with some studies reporting an interaction while others did not. This discrepancy may be attributed to the use of different types of Simon effects (visuomotor and cognitive Simon effects) in these studies, as the spatial codes associated with these two types of Simon effects are distinct (exogenous and endogenous, respectively). The aim of this study was to address these inconsistencies and gain a better understanding of the similarities and differences in spatial representations generated by spatial location, semantic information, and numerical information. We attempted to classify the relationships among the SNARC and Simon effects. Specifically, the visuomotor Simon, cognitive Simon, and SNARC effects were compared from three perspectives: the response time (RT) distribution, hand-stimulus proximity, and temporal dynamics (with the drift diffusion model; DDM). Regarding RTs, the results showed that the visuomotor Simon effect decreased with increased values of RT bins, while the cognitive Simon and SNARC effects increased. Additionally, the visuomotor Simon effect was the only effect influenced by hand-stimulus proximity, with a stronger effect observed in the hand-proximal condition than in the hand-distal condition. Regarding the DDM results, only the visuomotor Simon effect exhibited a higher drift rate and longer non-decision time in the incompatible condition than in the compatible condition. Conversely, both the SNARC and cognitive Simon effects exhibited an inverse pattern regarding the drift rate and no significant difference in non-decision time between the two conditions. 
These findings suggest that the SNARC effect is more similar to the cognitive Simon effect than the visuomotor Simon effect, indicating that the endogenous spatial-numerical representation of the SNARC effect might share an underlying processing mechanism with the endogenous spatial-semantic representation of the cognitive Simon effect but not with the exogenous location representation of the visuomotor Simon effect. Our results further demonstrate that the origin of spatial information could impact the classification of conflicts and supplement DO theory.
Affiliation(s)
- Lizhu Yan
- Department of Psychology and Center for Brain and Cognitive Sciences, School of Education, Guangzhou Higher Education Mega Center, Guangzhou University, 230 Wai Huan Xi Road, Guangzhou, 510006, China
- Yilin Ma
- Department of Psychology and Center for Brain and Cognitive Sciences, School of Education, Guangzhou Higher Education Mega Center, Guangzhou University, 230 Wai Huan Xi Road, Guangzhou, 510006, China
- Weibin Yang
- Department of Psychology and Center for Brain and Cognitive Sciences, School of Education, Guangzhou Higher Education Mega Center, Guangzhou University, 230 Wai Huan Xi Road, Guangzhou, 510006, China
- Xinrui Xiang
- Department of Psychology and Center for Brain and Cognitive Sciences, School of Education, Guangzhou Higher Education Mega Center, Guangzhou University, 230 Wai Huan Xi Road, Guangzhou, 510006, China
- Weizhi Nan
- Department of Psychology and Center for Brain and Cognitive Sciences, School of Education, Guangzhou Higher Education Mega Center, Guangzhou University, 230 Wai Huan Xi Road, Guangzhou, 510006, China
2. Monaco S, Menghi N, Crawford JD. Action-specific feature processing in the human cortex: An fMRI study. Neuropsychologia 2024; 194:108773. [PMID: 38142960] [DOI: 10.1016/j.neuropsychologia.2023.108773]
Abstract
Sensorimotor integration involves feedforward and reentrant processing of sensory input. Grasp-related motor activity precedes and is thought to influence visual object processing. Yet, while the importance of reentrant feedback is well established in perception, the top-down modulations for action and the neural circuits involved in this process have received less attention. Do action-specific intentions influence the processing of visual information in the human cortex? Using a cue-separation fMRI paradigm, we found that action-specific instruction processing (manual alignment vs. grasp) became apparent only after the visual presentation of oriented stimuli, and emerged as early as the primary visual cortex, extending to the dorsal visual stream and motor and premotor areas. Further, dorsal stream area aIPS, known to be involved in object manipulation, and the primary visual cortex showed task-related functional connectivity with frontal, parietal and temporal areas, consistent with the idea that reentrant feedback from dorsal and ventral visual stream areas modifies visual inputs to prepare for action. Importantly, both the task-dependent modulations and connections were linked specifically to the object presentation phase of the task, suggesting a role in processing the action goal. Our results show that intended manual actions have an early, pervasive, and differential influence on the cortical processing of vision.
Affiliation(s)
- Simona Monaco
- CIMeC - Center for Mind/Brain Sciences, University of Trento, Rovereto (TN), Italy
- Nicholas Menghi
- Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
- J Douglas Crawford
- Center for Vision Research, York University, Toronto, Ontario M3J 1P3, Canada; Vision: Science to Applications (VISTA) Program, Neuroscience Graduate Diploma Program and Departments of Psychology, Biology, and Kinesiology and Health Science, York University, Toronto, Ontario M3J 1P3, Canada
3. Bamford LE, Klassen NR, Karl JM. Faster recognition of graspable targets defined by orientation in a visual search task. Exp Brain Res 2020; 238:905-916. [PMID: 32170332] [DOI: 10.1007/s00221-020-05769-z]
Abstract
Peri-hand space is the area surrounding the hand. Objects within this space may be subject to increased visuospatial perception, increased attentional prioritization, and slower attentional disengagement compared to more distal objects. This may result from kinesthetic and visual feedback about the location of the hand that projects from the reach and grasp networks of the dorsal visual stream back to occipital visual areas, which, in turn, refines cortical visual processing that can subsequently guide skilled motor actions. Thus, we hypothesized that visual stimuli that afford action, which are known to potentiate activity in the dorsal visual stream, would be associated with greater alterations in visual processing when presented near the hand. To test this, participants held their right hand near or far from a touchscreen that presented a visual array containing a single target object that differed from 11 distractor objects by orientation only. The target objects and their accompanying distractors either strongly afforded grasping or did not. Participants identified the target among the distractors by reaching out and touching it with their left index finger while eye-tracking was used to measure visual search times, target recognition times, and search accuracy. The results failed to support the theory of enhanced visual processing of graspable objects near the hand, as participants were faster at recognizing graspable compared to non-graspable targets regardless of the position of the right hand. The results are discussed in relation to the idea that, in addition to potentiating appropriate motor responses, object affordances may also potentiate early visual processes necessary for object recognition.
Affiliation(s)
- Lindsay E Bamford
- Department of Psychology, Thompson Rivers University, Kamloops, BC, Canada
- Nikola R Klassen
- Department of Psychology, Thompson Rivers University, Kamloops, BC, Canada
- Jenni M Karl
- Department of Psychology, Thompson Rivers University, Kamloops, BC, Canada
4. Karl JM, Wilson AM, Bertoli ME, Shubear NS. Touch the table before the target: contact with an underlying surface may assist the development of precise visually controlled reach and grasp movements in human infants. Exp Brain Res 2018; 236:2185-2207. [PMID: 29797280] [DOI: 10.1007/s00221-018-5293-4]
Abstract
Multiple motor channel theory posits that skilled hand movements arise from the coordinated activation of separable neural circuits in parietofrontal cortex, each of which produces a distinct movement and responds to different sensory inputs. Prehension, the act of reaching to grasp an object, consists of at least two movements: a reach movement that transports the hand to a target location and a grasp movement that shapes and closes the hand for target acquisition. During early development, discrete pre-reach and pre-grasp movements are refined based on proprioceptive and tactile feedback, but are gradually coordinated together into a singular hand preshaping movement under feedforward visual control. The neural and behavioural factors that enable this transition are currently unknown. In an attempt to identify such factors, the present descriptive study used frame-by-frame video analysis to examine 9-, 12-, and 15-month-old infants, along with sighted and unsighted adults, as they reached to grasp small ring-shaped pieces of cereal (Cheerios) resting on a table. Compared to sighted adults, infants and unsighted adults were more likely to make initial contact with the underlying table before they contacted the target. The way in which they did so was also similar in that they generally contacted the table with the tip of the thumb and/or pinky finger, a relatively open hand, and poor reach accuracy. Despite this, infants were similar to sighted adults in that they tended to use a pincer digit, defined as the tip of the thumb or index finger, to subsequently contact the target. Only in infants was this ability related to their having made prior contact with the underlying table. The results are discussed in relation to the idea that initial contact with an underlying table or surface may assist infants in learning to use feedforward visual control to direct their digits towards a precise visual target.
Affiliation(s)
- Jenni M Karl
- Department of Psychology, Thompson Rivers University, 805 TRU Way, Kamloops, BC, V2C 0C8, Canada
- Alexis M Wilson
- Department of Psychology, Thompson Rivers University, 805 TRU Way, Kamloops, BC, V2C 0C8, Canada
- Marisa E Bertoli
- Department of Psychology, Thompson Rivers University, 805 TRU Way, Kamloops, BC, V2C 0C8, Canada
- Noor S Shubear
- Department of Psychology, Thompson Rivers University, 805 TRU Way, Kamloops, BC, V2C 0C8, Canada
5. Maiello G, Kwon M, Bex PJ. Three-dimensional binocular eye-hand coordination in normal vision and with simulated visual impairment. Exp Brain Res 2018; 236:691-709. [PMID: 29299642] [PMCID: PMC6693328] [DOI: 10.1007/s00221-017-5160-8]
Abstract
Sensorimotor coupling in healthy humans is demonstrated by the higher accuracy of visually tracking intrinsically rather than extrinsically generated hand movements in the fronto-parallel plane. It is unknown whether this coupling also facilitates vergence eye movements for tracking objects in depth, or can overcome symmetric or asymmetric binocular visual impairments. Human observers were therefore asked to track with their gaze a target moving horizontally or in depth. The movement of the target was either directly controlled by the observer's hand or followed hand movements executed by the observer in a previous trial. Visual impairments were simulated by blurring stimuli independently in each eye. Accuracy was higher for self-generated movements in all conditions, demonstrating that motor signals are employed by the oculomotor system to improve the accuracy of vergence as well as horizontal eye movements. Asymmetric monocular blur affected horizontal tracking less than symmetric binocular blur, but impaired tracking in depth as much as binocular blur. There was a critical blur level up to which pursuit and vergence eye movements maintained tracking accuracy independent of blur level. Hand-eye coordination may therefore help compensate for functional deficits associated with eye disease and may be employed to augment visual impairment rehabilitation.
Affiliation(s)
- Guido Maiello
- UCL Institute of Ophthalmology, University College London, 11-43 Bath Street, London, EC1V 9EL, UK
- Department of Experimental Psychology, Justus-Liebig University Giessen, Otto-Behaghel-Str. 10F, 35394, Giessen, Germany
- MiYoung Kwon
- Department of Ophthalmology, University of Alabama at Birmingham, 700 S. 18th Street, Birmingham, AL, 35294-0009, USA
- Peter J Bex
- Department of Psychology, Northeastern University, 360 Huntington Ave, Boston, MA, 02115, USA
6. Perry CJ, Fallah M. Effector-based attention systems. Ann N Y Acad Sci 2017; 1396:56-69. [PMID: 28548458] [DOI: 10.1111/nyas.13354]
Abstract
Visual processing is known to be enhanced at the end point of eye movements. Feedback within the oculomotor system has been shown to drive these alterations in visual processing. However, we do not simply view the world; we also reach out and interact using our hands. Consequently, it is not surprising that visual processing has also been shown to be altered in near-hand space. A growing body of work documents a myriad of alterations in near-hand visual processing, with little consensus on the neural underpinnings of the effect of the hand. Since movement of the eyes and hands is governed by parallel frontoparietal networks and since within the oculomotor system feedback from these motor control regions has been shown to drive enhanced visual processing at saccade end points, it is plausible that a similar feedback mechanism is at play in near-hand improvements in visual processing. Here, we compare and contrast oculomotor-driven and hand-driven changes in visual processing and provide support for the hypothesis that feedback within the reaching and grasping systems enhances visual processing near the hand in a novel way.
Affiliation(s)
- Carolyn J Perry
- Department of Biomedical and Molecular Sciences, Queen's University, Kingston, Canada
- Mazyar Fallah
- School of Kinesiology and Health Science, York University, Toronto, Canada; Centre for Vision Research, York University, Toronto, Canada; Canadian Action and Perception Network, Toronto, Canada; VISTA: Vision Science to Application, York University, Toronto, Canada
7. Slower attentional disengagement but faster perceptual processing near the hand. Acta Psychol (Amst) 2017; 174:40-47. [PMID: 28147264] [DOI: 10.1016/j.actpsy.2017.01.005]
Abstract
Many recent studies have reported altered visual processing near the hands. However, there is no definitive agreement about the mechanisms responsible for this effect. One viewpoint is that the effect is predominantly attentional, while others argue for the role of pre-attentive perceptual differences in the manifestation of the hand-proximity effect. However, in most studies, pre-attentive and attentional effects have been conflated. We argue that it is important to dissociate the effects of hand proximity on perception and attention to better theorize about and understand how visual processing is altered near the hands. We report two experiments in which participants completed a visual search task with their hands either on the monitor or on their lap. When on the monitor, the target could appear near the hand or farther away. In Experiment 1, a letter search task showed a steeper search slope near the hand, suggesting slower attentional disengagement. However, the intercept was smaller in the near-hand condition, suggesting faster perceptual processing. These results were replicated in Experiment 2 with a conjunction search task with target-present and target-absent conditions and four set sizes. The results suggest that there are dissociable effects of hand proximity on perception and attention. Importantly, the pre-attentive advantage of hand proximity does not translate into an attentional benefit, but rather a processing cost. The results of Experiment 2 additionally indicate that the steeper slope does not arise from any spatial biases in how search proceeds, but is instead an indicator of slower attentional processing near the hands. The results also suggest that the effect of hand proximity on attention is not spatially graded, whereas its effect on perceptuo-motor processes seems to be.
8. Perry CJ, Amarasooriya P, Fallah M. An Eye in the Palm of Your Hand: Alterations in Visual Processing Near the Hand, a Mini-Review. Front Comput Neurosci 2016; 10:37. [PMID: 27148034] [PMCID: PMC4834298] [DOI: 10.3389/fncom.2016.00037]
Abstract
Feedback within the oculomotor system improves visual processing at eye movement end points, also termed a visual grasp. We do not just view the world around us, however; we also reach out and grab things with our hands. A growing body of literature suggests that visual processing in near-hand space is altered. The control systems for moving either the eyes or the hands rely on parallel networks of fronto-parietal regions, which have feedback connections to visual areas. Since the oculomotor system's effects on visual processing occur through feedback, both from the motor plan and the motor efference copy, a parallel system in which reaching and/or grasping motor-related activity also affects visual processing is likely. Areas in the posterior parietal cortex, for example, receive proprioceptive and visual information used to guide actions, as well as motor efference signals. This trio of information channels is all that would be necessary to produce spatial allocation of reach-related visual attention. We review evidence from behavioral and neurophysiological studies that support the hypothesis that feedback from the reaching and/or grasping motor control networks affects visual processing, while noting ways in which it differs from that seen within the oculomotor system. We also suggest that object affordances may represent the neural mechanism through which certain object features are selected for preferential processing when stimuli are near the hand. Finally, we summarize the two effector-based feedback systems and discuss how having separate but parallel effector systems allows for efficient decoupling of eye and hand movements.
Affiliation(s)
- Carolyn J. Perry
- Visual Perception and Attention Laboratory, York University, Toronto, ON, Canada
- Centre for Vision Research, York University, Toronto, ON, Canada
- School of Kinesiology and Health Science, York University, Toronto, ON, Canada
- Prakash Amarasooriya
- Visual Perception and Attention Laboratory, York University, Toronto, ON, Canada
- Centre for Vision Research, York University, Toronto, ON, Canada
- Mazyar Fallah
- Visual Perception and Attention Laboratory, York University, Toronto, ON, Canada
- Centre for Vision Research, York University, Toronto, ON, Canada
- School of Kinesiology and Health Science, York University, Toronto, ON, Canada
- Canadian Action and Perception Network, York University, Toronto, ON, Canada
9. Multisensory Stimulation to Improve Low- and Higher-Level Sensory Deficits after Stroke: A Systematic Review. Neuropsychol Rev 2015; 26:73-91. [PMID: 26490254] [PMCID: PMC4762927] [DOI: 10.1007/s11065-015-9301-1]
Abstract
The aim of this systematic review was to integrate and assess evidence for the effectiveness of multisensory stimulation (i.e., stimulating at least two of the following sensory systems: visual, auditory, and somatosensory) as a possible rehabilitation method after stroke. Evidence was considered with a focus on low-level perceptual deficits (visual, auditory, and somatosensory) as well as higher-level cognitive sensory deficits. We searched the electronic databases Scopus and PubMed for articles published before May 2015. Studies were included if they evaluated the effects of multisensory stimulation on patients with low- or higher-level sensory deficits caused by stroke. Twenty-one studies were included in this review and their quality was assessed on eight elements: randomization, inclusion of a control patient group, blinding of participants, blinding of researchers, follow-up, group size, reporting of effect sizes, and reporting of time post-stroke. Twenty of the twenty-one included studies demonstrate beneficial effects on low- and/or higher-level sensory deficits after stroke. Notwithstanding these beneficial effects, the quality of the studies is insufficient to support the conclusion that multisensory stimulation can be successfully applied as an effective intervention. A valuable and necessary next step would be to set up well-designed randomized controlled trials to examine the effectiveness of multisensory stimulation as an intervention for low- and/or higher-level sensory deficits after stroke. Finally, we consider the potential mechanisms of multisensory stimulation for rehabilitation to guide this future research.
10. Suh J, Abrams RA. Reduced object-based perception in the near-hand space. Exp Brain Res 2015; 233:3403-3412. [PMID: 26289483] [DOI: 10.1007/s00221-015-4414-6]
Abstract
Previous studies have shown that hand proximity changes visual perception (Abrams et al. in Cognition 107(3):1035-1047, 2008). The present study examined the effects of hand proximity on object-based perception. In three experiments, participants viewed stimuli that were either near to or far from their hands. The target stimulus appeared, after a cue, in one of two rectangular objects: either at the location that had been previously cued, at the uncued end of the cued object, or in the uncued object. We found a significantly reduced same-object benefit in reaction time for stimuli near the hands in one experiment. Interestingly, we observed a same-object cost in sensitivity for stimuli near the hands in another experiment. The results reveal that object-based perception is disrupted in the near-hand space. This is consistent with previous findings revealing altered visual processing near the hands.
Affiliation(s)
- Jihyun Suh
- Department of Psychology, Washington University in St. Louis, One Brookings Drive, St. Louis, MO, 63130, USA
- Richard A Abrams
- Department of Psychology, Washington University in St. Louis, One Brookings Drive, St. Louis, MO, 63130, USA
11. Abed Rabbo F, Koch G, Lefèvre C, Seizeur R. Direct geniculo-extrastriate pathways: a review of the literature. Surg Radiol Anat 2015; 37:891-899. [DOI: 10.1007/s00276-015-1450-7]
12. Perry CJ, Sergio LE, Crawford JD, Fallah M. Hand placement near the visual stimulus improves orientation selectivity in V2 neurons. J Neurophysiol 2015; 113:2859-2870. [PMID: 25717165] [DOI: 10.1152/jn.00919.2013]
Abstract
Often, the brain receives more sensory input than it can process simultaneously. Spatial attention helps overcome this limitation by preferentially processing input from a behaviorally-relevant location. Recent neuropsychological and psychophysical studies suggest that attention is deployed to near-hand space much like how the oculomotor system can deploy attention to an upcoming gaze position. Here we provide the first neuronal evidence that the presence of a nearby hand enhances orientation selectivity in early visual processing area V2. When the hand was placed outside the receptive field, responses to the preferred orientation were significantly enhanced without a corresponding significant increase at the orthogonal orientation. Consequently, there was also a significant sharpening of orientation tuning. In addition, the presence of the hand reduced neuronal response variability. These results indicate that attention is automatically deployed to the space around a hand, improving orientation selectivity. Importantly, this appears to be optimal for motor control of the hand, as opposed to oculomotor mechanisms which enhance responses without sharpening orientation selectivity. Effector-based mechanisms for visual enhancement thus support not only the spatiotemporal dissociation of gaze and reach, but also the optimization of vision for their separate requirements for guiding movements.
Affiliation(s)
- Carolyn J Perry
- Visual Perception and Attention Laboratory, School of Kinesiology and Health Science, York University, Toronto, Ontario, Canada; Centre for Vision Research, York University, Toronto, Ontario, Canada
- Lauren E Sergio
- Centre for Vision Research, York University, Toronto, Ontario, Canada; School of Kinesiology and Health Science, York University, Toronto, Ontario, Canada
- J Douglas Crawford
- Centre for Vision Research, York University, Toronto, Ontario, Canada; Department of Psychology, York University, Toronto, Ontario, Canada; Canadian Action and Perception Network, York University, Toronto, Ontario, Canada
- Mazyar Fallah
- Visual Perception and Attention Laboratory, School of Kinesiology and Health Science, York University, Toronto, Ontario, Canada; Centre for Vision Research, York University, Toronto, Ontario, Canada; Canadian Action and Perception Network, York University, Toronto, Ontario, Canada
13. Whitwell RL, Milner AD, Goodale MA. The Two Visual Systems Hypothesis: New Challenges and Insights from Visual Form Agnosic Patient DF. Front Neurol 2014; 5:255. [PMID: 25538675] [PMCID: PMC4259122] [DOI: 10.3389/fneur.2014.00255]
Abstract
Patient DF, who developed visual form agnosia following carbon monoxide poisoning, is still able to use vision to adjust the configuration of her grasping hand to the geometry of a goal object. This striking dissociation between perception and action in DF provided a key piece of evidence for the formulation of Goodale and Milner's Two Visual Systems Hypothesis (TVSH). According to the TVSH, the ventral stream plays a critical role in constructing our visual percepts, whereas the dorsal stream mediates the visual control of action, such as visually guided grasping. In this review, we discuss recent studies of DF that provide new insights into the functional organization of the dorsal and ventral streams. We confirm recent evidence that DF has dorsal as well as ventral brain damage, and that her dorsal-stream lesions and surrounding atrophy have increased in size since her first published brain scan. We argue that the damage to DF's dorsal stream explains her deficits in directing actions at targets in the periphery. We then focus on DF's ability to accurately adjust her in-flight hand aperture to changes in the width of goal objects (grip scaling) whose dimensions she cannot explicitly report. An examination of several studies of DF's grip scaling under natural conditions reveals a modest though significant deficit. Importantly, however, she continues to show a robust dissociation between form vision for perception and form vision for action. We also review recent studies that explore the role of online visual feedback and terminal haptic feedback in the programming and control of her grasping. These studies make it clear that DF is no more reliant on visual or haptic feedback than are neurologically intact individuals. In short, we argue that her ability to grasp objects depends on visual feedforward processing carried out by visuomotor networks in her dorsal stream that function in much the same way as they do in neurologically intact individuals.
Affiliation(s)
- Robert L Whitwell
- Graduate Program in Neuroscience, The University of Western Ontario , London, ON , Canada ; Department of Psychology, The University of Western Ontario , London, ON , Canada ; Brain and Mind Institute, The University of Western Ontario , London, ON , Canada
| | - A David Milner
- Department of Psychology, Durham University , Durham , UK
| | - Melvyn A Goodale
- Department of Psychology, The University of Western Ontario , London, ON , Canada ; Brain and Mind Institute, The University of Western Ontario , London, ON , Canada ; Department of Physiology and Pharmacology, The University of Western Ontario , London, ON , Canada
| |
Collapse
14. Taylor JET, Gozli DG, Chan D, Huffman G, Pratt J. A touchy subject: advancing the modulated visual pathways account of altered vision near the hand. Transl Neurosci 2014; 6:1-7. [PMID: 28123785] [PMCID: PMC4936609] [DOI: 10.1515/tnsci-2015-0001]
Abstract
A growing body of evidence demonstrates that human vision operates differently in the space near and on the hands; for example, early findings in this literature reported that rapid onsets are detected faster near the hands and that objects are searched more thoroughly. These and many other effects were attributed to enhanced attention via the recruitment of bimodal visual-tactile neurons representing the hand and near-hand space. However, recent research supports an alternative account: stimuli near the hands are preferentially processed by the action-oriented magnocellular visual pathway at the expense of processing in the parvocellular pathway. This Modulated Visual Pathways (MVP) account of altered vision near the hands describes a hand-position-dependent trade-off between the two main retinal-cortical visual pathways connecting the eye and brain. The MVP account explains past findings and makes new predictions regarding near-hand vision that are supported by new research.
Affiliation(s):
- J Eric T Taylor, Davood G Gozli, David Chan, Greg Huffman, Jay Pratt: Department of Psychology, University of Toronto, Toronto, Ontario, Canada
15. Brown LE, Marlin MC, Morrow S. On the contributions of vision and proprioception to the representation of hand-near targets. J Neurophysiol 2014; 113:409-19. [PMID: 25339706] [DOI: 10.1152/jn.00005.2014]
Abstract
Performance is often improved when targets are presented in space near the hands rather than far from the hands. Performance in hand-near space may be improved because participants can use proprioception from the nearby limb and hand to provide a narrower, higher-resolution frame of reference. An equally compelling alternative is that targets appearing near the hand fall within the receptive fields of visual-tactile bimodal cells, recruiting them to assist in the visual representation of targets that appear near, but not far from, the hand. We distinguished between these two alternatives by capitalizing on research showing that vision and proprioception have differential effects on the precision of target representation (van Beers RJ, Sittig AC, Denier van der Gon JJ. Exp Brain Res 122: 367-377, 1998). Participants performed an in-to-center reaching task to an array of central target locations with their right hand, while their left hand rested near (beneath) or far from the target array. Reaching end-point accuracy, variability, time, and speed were assessed. We predicted that if proprioception contributes to the representation of hand-near targets, then error variability in depth would be smaller in the hand-near condition than in the hand-far condition. By contrast, if vision contributes to the representation of hand-near targets, then error variability along the lateral dimension would be smaller in the hand-near than in the hand-far condition. Our results showed that placement of the hand near the targets reduced end-point error variability along the lateral dimension only. The results suggest that hand-near targets are represented with greater visual resolution than far targets.
Affiliation(s):
- Liana E Brown, Matthew C Marlin, Sarah Morrow: Department of Psychology, Trent University, Peterborough, Ontario, Canada
16. Tseng P, Yu J, Tzeng OJL, Hung DL, Juan CH. Hand proximity facilitates spatial discrimination of auditory tones. Front Psychol 2014; 5:527. [PMID: 24966839] [PMCID: PMC4052199] [DOI: 10.3389/fpsyg.2014.00527]
Abstract
The effect of hand proximity on vision and visual attention has been well documented. In this study we tested whether such effect(s) would also be present in the auditory modality. With hands placed either near or away from the audio sources, participants performed an auditory-spatial discrimination (Experiment 1: left or right side), pitch discrimination (Experiment 2: high, med, or low tone), and spatial-plus-pitch (Experiment 3: left or right; high, med, or low) discrimination task. In Experiment 1, when hands were away from the audio source, participants consistently responded faster with their right hand regardless of stimulus location. This right-hand advantage, however, disappeared in the hands-near condition because of a significant improvement in the left hand's reaction time (RT). No effect of hand proximity was found in Experiments 2 or 3, where a choice RT task requiring pitch discrimination was used. Together, these results suggest that the perceptual and attentional effect of hand proximity is not limited to one specific modality, but applies to the entire “space” near the hands, including stimuli of different modalities (at least visual and auditory) within that space. While these findings provide evidence from auditory attention that supports the multimodal account originally proposed by Reed et al. (2006), we also discuss the possibility of a dual-mechanism hypothesis to reconcile findings from the multimodal and magno/parvocellular accounts.
Affiliation(s):
- Philip Tseng: Institute of Cognitive Neuroscience, National Central University, Jhongli City, Taiwan; Brain and Consciousness Research Center, Taipei Medical University-Shuang Ho Hospital, New Taipei City, Taiwan
- Jiaxin Yu: Institute of Cognitive Neuroscience, National Central University, Jhongli City, Taiwan; Institute of Neuroscience, National Yang Ming University, Taipei, Taiwan
- Ovid J L Tzeng: Institute of Cognitive Neuroscience, National Central University, Jhongli City, Taiwan; Institute of Neuroscience, National Yang Ming University, Taipei, Taiwan; Institute of Linguistics, Academia Sinica, Taipei, Taiwan
- Daisy L Hung: Institute of Cognitive Neuroscience, National Central University, Jhongli City, Taiwan; Institute of Neuroscience, National Yang Ming University, Taipei, Taiwan
- Chi-Hung Juan: Institute of Cognitive Neuroscience, National Central University, Jhongli City, Taiwan
17. Perceived size change induced by nonvisual signals in darkness: the relative contribution of vergence and proprioception. J Neurosci 2013; 33:16915-23. [PMID: 24155297] [DOI: 10.1523/jneurosci.0977-13.2013]
Abstract
Most of the time, the human visual system computes perceived size by scaling the size of an object on the retina with its perceived distance. There are instances, however, in which size-distance scaling is not based on visual inputs but on extraretinal cues. In the Taylor illusion, the perceived afterimage that is projected on an observer's hand will change in size depending on how far the limb is positioned from the eyes, even in complete darkness. In the dark, distance cues might derive from hand position signals either by an efference copy of the motor command to the moving hand or by proprioceptive input. Alternatively, there have been reports that vergence signals from the eyes might also be important. We performed a series of behavioral and eye-tracking experiments to tease apart how these different sources of distance information contribute to the Taylor illusion. We demonstrate that, with no visual information, perceived size changes mainly as a function of the vergence angle of the eyes, underscoring its importance in size-distance scaling. Interestingly, the strength of this relationship decreased when a mismatch between vergence and proprioception was introduced, indicating that proprioceptive feedback from the arm also affected size perception. By using afterimages, we provide strong evidence that the human visual system can benefit from sensory signals that originate from the hand when visual information about distance is unavailable.
18. Langerak RM, La Mantia CL, Brown LE. Global and local processing near the left and right hands. Front Psychol 2013; 4:793. [PMID: 24194725] [PMCID: PMC3810600] [DOI: 10.3389/fpsyg.2013.00793]
Abstract
Visual targets can be processed more quickly and reliably when a hand is placed near the target. Both unimodal and bimodal representations of hands are largely lateralized to the contralateral hemisphere, and since each hemisphere demonstrates specialized cognitive processing, it is possible that targets appearing near the left hand may be processed differently than targets appearing near the right hand. The purpose of this study was to determine whether visual processing near the left and right hands interacts with hemispheric specialization. We presented hierarchical-letter stimuli (e.g., small characters used as local elements to compose large characters at the global level) near the left or right hands separately and instructed participants to discriminate the presence of target letters (X and O) from non-target letters (T and U) at either the global or local levels as quickly as possible. Targets appeared at either the global or local level of the display, at both levels, or were absent from the display; participants made foot-press responses. When discriminating target presence at the global level, participants responded more quickly to stimuli presented near the left hand than near either the right hand or in the no-hand condition. Hand presence did not influence target discrimination at the local level. Our interpretation is that left-hand presence may help participants discriminate global information, a right hemisphere (RH) process, and that the left hand may influence visual processing in a way that is distinct from the right hand.
Affiliation(s):
- Robin M Langerak: Department of Psychology, Trent University, Peterborough, ON, Canada
19. Buetti S, Tamietto M, Hervais-Adelman A, Kerzel D, de Gelder B, Pegna AJ. Dissociation between goal-directed and discrete response localization in a patient with bilateral cortical blindness. J Cogn Neurosci 2013; 25:1769-75. [PMID: 23944840] [DOI: 10.1162/jocn_a_00404]
Abstract
We investigated localization performance of simple targets in patient TN, who suffered bilateral damage to his primary visual cortex and shows complete cortical blindness. Using a two-alternative forced-choice paradigm, TN was asked to guess the position of left-right targets with goal-directed and discrete manual responses. The results indicate a clear dissociation between goal-directed and discrete responses. TN pointed toward the correct target location in approximately 75% of the trials but was at chance level with discrete responses. This indicates that the residual ability to localize an unseen stimulus depends critically on the possibility of translating a visual signal into a goal-directed motor output, at least in certain forms of blindsight.
20. Bolognini N, Convento S, Rossetti A, Merabet LB. Multisensory processing after a brain damage: Clues on post-injury crossmodal plasticity from neuropsychology. Neurosci Biobehav Rev 2013; 37:269-78. [DOI: 10.1016/j.neubiorev.2012.12.006]
21. Carey D, Trevethan C, Weiskrantz L, Sahraie A. Does delay impair localisation in blindsight? Neuropsychologia 2012; 50:3673-80. [DOI: 10.1016/j.neuropsychologia.2012.08.018]
22. Hand position alters vision by biasing processing through different visual pathways. Cognition 2012; 124:244-50. [DOI: 10.1016/j.cognition.2012.04.008]
23. Brown LE, Doole R, Malfait N. The role of motor learning in spatial adaptation near a tool. PLoS One 2011; 6:e28999. [PMID: 22174944] [PMCID: PMC3236781] [DOI: 10.1371/journal.pone.0028999]
Abstract
Some visual-tactile (bimodal) cells have visual receptive fields (vRFs) that overlap and extend moderately beyond the skin of the hand. Neurophysiological evidence suggests, however, that a vRF will grow to encompass a hand-held tool following active tool use but not after passive holding. Why does active tool use, and not passive holding, lead to spatial adaptation near a tool? We asked whether spatial adaptation could be the result of motor or visual experience with the tool, and we distinguished between these alternatives by isolating motor from visual experience with the tool. Participants learned to use a novel, weighted tool. The active training group received both motor and visual experience with the tool, the passive training group received visual experience with the tool, but no motor experience, and finally, a no-training control group received neither visual nor motor experience using the tool. After training, we used a cueing paradigm to measure how quickly participants detected targets, varying whether the tool was placed near or far from the target display. Only the active training group detected targets more quickly when the tool was placed near, rather than far, from the target display. This effect of tool location was not present for either the passive-training or control groups. These results suggest that motor learning influences how visual space around the tool is represented.
Affiliation(s):
- Liana E Brown: Department of Psychology, Trent University, Peterborough, Ontario, Canada
24. Lloyd DM, Azañón E, Poliakoff E. Right hand presence modulates shifts of exogenous visuospatial attention in near perihand space. Brain Cogn 2010; 73:102-9. [PMID: 20403655] [DOI: 10.1016/j.bandc.2010.03.006]
Abstract
To investigate attentional shifting in perihand space, we measured performance on a covert visual orienting task under different hand positions. Participants discriminated visual shapes presented on a screen and responded using footpedals placed under their right foot. With the right hand positioned by the right side of the screen, mean cueing effects were significantly greater for targets presented on the right compared to the left side, at the shortest stimulus onset asynchrony. The right hand still affected attention when the left foot was used to respond and when the right hand was crossed over the midline, indicating that this effect is not restricted to the right hemifield and cannot be accounted for by greater stimulus-response compatibility with the right (responding) foot. These experiments provide preliminary evidence that the presence of the right hand can modulate shifts of visual attention but emphasise the importance of stimulus-response compatibility effects in such investigations.
Affiliation(s):
- Donna M Lloyd: School of Psychological Sciences, The University of Manchester, Manchester M13 9PL, UK
25. Bandpass characteristics of high-frequency sensitivity and visual experience in blindsight. Conscious Cogn 2010; 19:144-51. [PMID: 20129798] [DOI: 10.1016/j.concog.2010.01.005]
Abstract
Patient RP suffers from a unilateral right homonymous quadrantanopia but demonstrates better-than-chance discrimination for stimuli presented in the blind field at temporal frequencies between 33 and 47 Hz (all significant at p < .05, binomial). Examination of her reports of visual experience during blind-field discrimination suggests a more complex picture, in which experiences particular to correct discrimination are not found at low-mid-gamma frequencies but are significantly more likely than average (76%, p < .001) at a lower frequency (22 Hz) at which blindsight is not observed. We believe that visual experience may serve to support blindsight if discrimination tasks are generally impaired at frequencies outside the low-mid-gamma band. If this is so, then although it is generally experienced as non-specific and unstructured light, the visual experience that accompanies discrimination performance must be based on a neural representation that includes information about the visual features present in the stimulus.
26. Kao KLC, Goodale MA. Enhanced detection of visual targets on the hand and familiar tools. Neuropsychologia 2009; 47:2454-63. [DOI: 10.1016/j.neuropsychologia.2009.04.016]
27.
28. Vakalopoulos C. A hand in the blindsight paradox: A subcortical pathway? Neuropsychologia 2008; 46:3239-40. [DOI: 10.1016/j.neuropsychologia.2008.09.002]
29. Brown LE, Goodale MA. Koniocellular projections and hand-assisted blindsight. Neuropsychologia 2008; 46:3241-2. [DOI: 10.1016/j.neuropsychologia.2008.09.001]
30. Does localisation blindsight extend to two-dimensional targets? Neuropsychologia 2008; 46:3053-60. [DOI: 10.1016/j.neuropsychologia.2008.06.015]
31. Thura D, Boussaoud D, Meunier M. Hand position affects saccadic reaction times in monkeys and humans. J Neurophysiol 2008; 99:2194-202. [PMID: 18337364] [DOI: 10.1152/jn.01271.2007]
Abstract
In daily life, activities requiring the hand and eye to work separately are as frequent as activities requiring tight eye-hand coordination, and we effortlessly switch from one type of activity to the other. Such flexibility is unlikely to be achieved without each effector "knowing" where the other one is at all times, even when it is static. Here, we provide behavioral evidence that the mere position of the static hand affects one eye movement parameter: saccadic reaction time. Two monkeys were trained, and 11 humans instructed, to perform nondelayed or delayed visually guided saccades to either a right or a left target while holding their hand at a location either near or far from the eye target. From trial to trial, target locations and hand positions varied pseudorandomly. Subjects were tested both when they could and when they could not see their hand. The main findings are: 1) the presence of the static hand in the workspace did affect saccade initiation; 2) this interaction persisted when the hand was invisible; 3) it was strongly influenced by the delay duration: hand-target proximity retarded immediate saccades, whereas it could hasten delayed saccades; and 4) this held true both for humans and for each of the two monkeys. We propose that both visual and nonvisual hand position signals are used by the primate oculomotor system for the planning and execution of saccades, and that this may result in a hand-eye competition for spatial attentional resources that explains the delay-dependent reversal observed.
Affiliation(s):
- David Thura: Institut de Neurosciences Cognitives de la Méditerranée, UMR 6193, Centre National de la Recherche Scientifique, 31 Chemin Joseph Aiguier, Marseille Cedex 20, France