1. Jeschke M, Zoeller AC, Drewing K. Humans flexibly use visual priors to optimize their haptic exploratory behavior. Sci Rep 2024;14:14906. PMID: 38942980. PMCID: PMC11213930. DOI: 10.1038/s41598-024-65958-6.
Abstract
Humans can use prior information to optimize their haptic exploratory behavior. Here, we investigated the use of visual priors, the mechanisms that enable their use, and how this use is affected by information quality. Participants explored different grating textures and discriminated their spatial frequency. Visual priors on texture orientation were given on each trial, with qualities randomly varying from high to no informational value. Adjustments of the initial exploratory movement direction orthogonal to the textures' orientation served as an indicator of prior usage. Participants indeed used visual priors, the more so the higher the priors' quality (Experiment 1). Higher task demands did not increase the direct usage of visual priors (Experiment 2), but possibly fostered the establishment of adjustment behavior. In Experiment 3, we decreased the proportion of high-quality priors presented during the session, thereby reducing the contingency between high-quality priors and haptic information. In consequence, even priors of high quality ceased to evoke movement adjustments. We conclude that the establishment of adjustment behavior results from rather implicit contingency learning. Overall, it became evident that humans can autonomously learn to use rather abstract visual priors to optimize haptic exploration, with the learning process and direct usage depending substantially on the priors' quality.
Affiliations
- Michaela Jeschke: Experimental Psychology, Justus-Liebig University, 35390 Gießen, Germany
- Aaron C Zoeller: Experimental Psychology, Justus-Liebig University, 35390 Gießen, Germany
- Knut Drewing: Experimental Psychology, Justus-Liebig University, 35390 Gießen, Germany

2. Ahissar E, Nelinger G, Assa E, Karp O, Saraf-Sinik I. Thalamocortical loops as temporal demodulators across senses. Commun Biol 2023;6:562. PMID: 37237075. DOI: 10.1038/s42003-023-04881-4.
Abstract
Sensory information is coded in space and in time. The organization of neuronal activity in space maintains straightforward relationships with the spatial organization of the perceived environment. In contrast, the temporal organization of neuronal activity is not trivially related to external features, due to sensor motion. Still, the temporal organization shares similar principles across sensory modalities. Likewise, thalamocortical circuits exhibit common features across senses. Focusing on touch, vision, and audition, we review their shared coding principles and suggest that thalamocortical systems include circuits that allow analogous recoding mechanisms in all three senses. These thalamocortical circuits constitute oscillation-based phase-locked loops that translate temporally coded sensory information into rate-coded cortical signals, signals that can integrate information across sensory and motor modalities. The loop also allows predictive locking to the onset of future modulations of the sensory signal. The paper thus suggests a theoretical framework in which a common thalamocortical mechanism implements temporal demodulation across senses.
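As a concrete handle on the phase-locked-loop idea described above, here is a minimal numerical sketch of how an oscillation-based loop can demodulate a temporally coded event train into a rate-like signal; the function, its parameters, and the simulated input are illustrative assumptions, not the authors' model.

```python
# Minimal sketch: an internal oscillator predicts the next input event; the timing
# (phase) error is read out as a rate-like signal and also corrects the oscillator,
# so the loop locks onto the input's temporal modulation.
import numpy as np

def pll_demodulate(event_times, t0=0.0, period0=0.1, gain=0.3):
    """Return per-event phase errors (a rate-like readout) and the tracked periods."""
    expected = t0 + period0      # predicted time of the next input event
    period = period0             # current estimate of the input period
    errors, periods = [], []
    for t in event_times:
        err = t - expected       # timing (phase) error: the demodulated signal
        period += gain * err     # loop correction: adapt the internal oscillator
        expected = t + period    # predict the next event
        errors.append(err)
        periods.append(period)
    return np.array(errors), np.array(periods)

# Temporally coded input: inter-event intervals slowly modulated around 100 ms.
intervals = 0.1 + 0.02 * np.sin(np.linspace(0, 4 * np.pi, 80))
events = np.cumsum(intervals)

errors, periods = pll_demodulate(events)
print("final tracked period:", round(periods[-1], 4))
print("mean |phase error| over last 20 events:", round(np.abs(errors[-20:]).mean(), 4))
```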
Affiliations
- Ehud Ahissar: Department of Brain Sciences, Weizmann Institute, Rehovot 76100, Israel
- Guy Nelinger: Department of Brain Sciences, Weizmann Institute, Rehovot 76100, Israel
- Eldad Assa: Department of Brain Sciences, Weizmann Institute, Rehovot 76100, Israel
- Ofer Karp: Department of Brain Sciences, Weizmann Institute, Rehovot 76100, Israel
- Inbar Saraf-Sinik: Department of Brain Sciences, Weizmann Institute, Rehovot 76100, Israel

3. Idiosyncratic selection of active touch for shape perception. Sci Rep 2022;12:2922. PMID: 35190603. PMCID: PMC8861104. DOI: 10.1038/s41598-022-06807-2.
Abstract
Hand movements are essential for tactile perception of objects. However, the specific functions served by active touch strategies, and their dependence on physiological parameters, are unclear and understudied. Focusing on planar shape perception, we tracked at high resolution the hands of 11 participants during a shape recognition task. Two dominant hand movement strategies were identified: contour following and scanning. Contour-following movements were either tangential to the contour or oscillating perpendicular to it. Scanning movements crossed between distant parts of the shapes' contour. Both strategies exhibited non-uniform coverage of the shapes' contours. Idiosyncratic movement patterns were specific to the sensed object. In a second experiment, we measured the participants' spatial and temporal tactile thresholds. Significant portions of the variations in hand speed and in oscillation patterns could be explained by the idiosyncratic thresholds. Using data-driven simulations, we show how specific strategy choices may affect receptor activation. These results suggest that motion strategies of active touch adapt to both the sensed object and the perceiver's physiological parameters.

4. Brown J, Oldenburg IA, Telian GI, Griffin S, Voges M, Jain V, Adesnik H. Spatial integration during active tactile sensation drives orientation perception. Neuron 2021;109:1707-1720.e7. PMID: 33826906. PMCID: PMC8944414. DOI: 10.1016/j.neuron.2021.03.020.
Abstract
Active haptic sensation is critical for object identification, but its neural circuit basis is poorly understood. We combined optogenetics, two-photon imaging, and high-speed behavioral tracking in mice solving a whisker-based object orientation discrimination task. We found that orientation discrimination required animals to summate input from multiple whiskers specifically along the whisker arc. Animals discriminated the orientation of the stimulus per se as their performance was invariant to the location of the presented stimulus. Populations of barrel cortex neurons summated across whiskers to encode each orientation. Finally, acute optogenetic inactivation of the barrel cortex and cell-type-specific optogenetic suppression of layer 4 excitatory neurons degraded performance, implying that infragranular layers alone are not sufficient to solve the task. These data suggest that spatial summation over an active haptic array generates representations of an object's orientation, which may facilitate encoding of complex three-dimensional objects during active exploration.
Affiliations
- Jennifer Brown: Department of Molecular and Cell Biology, University of California, Berkeley, Berkeley, CA, USA
- Ian Antón Oldenburg: Department of Molecular and Cell Biology, University of California, Berkeley, Berkeley, CA, USA
- Gregory I Telian: The Helen Wills Neuroscience Institute, University of California, Berkeley, Berkeley, CA, USA
- Sandon Griffin: Department of Molecular and Cell Biology, University of California, Berkeley, Berkeley, CA, USA
- Mieke Voges: Department of Molecular and Cell Biology, University of California, Berkeley, Berkeley, CA, USA
- Vedant Jain: Department of Molecular and Cell Biology, University of California, Berkeley, Berkeley, CA, USA
- Hillel Adesnik: Department of Molecular and Cell Biology, University of California, Berkeley, Berkeley, CA, USA; The Helen Wills Neuroscience Institute, University of California, Berkeley, Berkeley, CA, USA

5. Buchs G, Haimler B, Kerem M, Maidenbaum S, Braun L, Amedi A. A self-training program for sensory substitution devices. PLoS One 2021;16:e0250281. PMID: 33905446. PMCID: PMC8078811. DOI: 10.1371/journal.pone.0250281.
Abstract
Sensory Substitution Devices (SSDs) convey visual information through audition or touch, targeting blind and visually impaired individuals. One bottleneck to adopting SSDs in everyday life by blind users is the constant dependency on sighted instructors throughout the learning process. Here, we present a proof-of-concept for the efficacy of an online self-training program developed for learning the basics of the EyeMusic visual-to-auditory SSD, tested on sighted blindfolded participants. Additionally, aiming to identify the best training strategy to be later re-adapted for the blind, we compared multisensory vs. unisensory as well as perceptual vs. descriptive feedback approaches. To these aims, sighted participants performed identical SSD-stimuli identification tests before and after ~75 minutes of self-training on the EyeMusic algorithm. Participants were divided into five groups, differing by the feedback delivered during training: auditory-descriptive, audio-visual textual description, audio-visual perceptual simultaneous and interleaved, and a control group which had no training. At baseline, before any EyeMusic training, participants' identification of SSD-encoded objects was significantly above chance, highlighting the algorithm's intuitiveness. Furthermore, self-training led to a significant improvement in accuracy between pre- and post-training tests in each of the four feedback groups versus control, though no significant difference emerged among those groups. Nonetheless, significant correlations between individual post-training success rates and various learning measures acquired during training suggest a trend for an advantage of multisensory vs. unisensory feedback strategies, while no trend emerged for perceptual vs. descriptive strategies. The success at baseline strengthens the conclusion that cross-modal correspondences facilitate learning, given that SSD algorithms are based on such correspondences. Additionally, and crucially, the results highlight the feasibility of self-training for the first stages of SSD learning, and suggest that for these initial stages, unisensory training, easily implemented also for blind and visually impaired individuals, may suffice. Together, these findings will potentially boost the use of SSDs for rehabilitation.
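To make the notion of a visual-to-auditory SSD concrete, the following is a deliberately generic column-sweep mapping sketched in Python; it only illustrates the general idea (image scanned left to right, row mapped to pitch, brightness to loudness) and is not the actual EyeMusic algorithm, whose musical-scale, timbre, and color conventions are specific to that device.

```python
# Generic image-to-sound sweep: each image column becomes a time slice of audio,
# row index maps to pitch (log-spaced like musical intervals), brightness to loudness.
import numpy as np

def image_to_audio(image, sweep_time=2.0, fs=22050, f_lo=220.0, f_hi=1760.0):
    """image: grayscale array (rows x cols) with values in [0, 1]."""
    n_rows, n_cols = image.shape
    samples_per_col = int(sweep_time * fs / n_cols)
    t = np.arange(samples_per_col) / fs
    freqs = np.geomspace(f_hi, f_lo, n_rows)   # top rows -> higher pitch
    audio = []
    for col in range(n_cols):
        slice_ = sum(image[r, col] * np.sin(2 * np.pi * freqs[r] * t)
                     for r in range(n_rows))
        audio.append(slice_)
    audio = np.concatenate(audio)
    peak = np.abs(audio).max()
    return audio / peak if peak > 0 else audio  # normalize to [-1, 1]

demo = np.zeros((8, 16))
demo[2, :] = 1.0          # a bright horizontal line -> a steady tone during the sweep
waveform = image_to_audio(demo)
print(waveform.shape)
```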
Affiliations
- Galit Buchs: The Baruch Ivcher Institute For Brain, Cognition & Technology, The Baruch Ivcher School of Psychology, Interdisciplinary Center (IDC), Herzeliya, Israel; Department of Cognitive Science, Faculty of Humanities, Hebrew University of Jerusalem, Jerusalem, Israel
- Benedetta Haimler: The Baruch Ivcher Institute For Brain, Cognition & Technology, The Baruch Ivcher School of Psychology, Interdisciplinary Center (IDC), Herzeliya, Israel; Center of Advanced Technologies in Rehabilitation (CATR), The Chaim Sheba Medical Center, Ramat Gan, Israel
- Menachem Kerem: The Baruch Ivcher Institute For Brain, Cognition & Technology, The Baruch Ivcher School of Psychology, Interdisciplinary Center (IDC), Herzeliya, Israel
- Shachar Maidenbaum: The Baruch Ivcher Institute For Brain, Cognition & Technology, The Baruch Ivcher School of Psychology, Interdisciplinary Center (IDC), Herzeliya, Israel; Department of Biomedical Engineering, Ben Gurion University, Beersheba, Israel
- Liraz Braun: The Baruch Ivcher Institute For Brain, Cognition & Technology, The Baruch Ivcher School of Psychology, Interdisciplinary Center (IDC), Herzeliya, Israel; Hebrew University of Jerusalem, Jerusalem, Israel
- Amir Amedi: The Baruch Ivcher Institute For Brain, Cognition & Technology, The Baruch Ivcher School of Psychology, Interdisciplinary Center (IDC), Herzeliya, Israel

6. Aizawa T, Iizima H, Abe K, Tadakuma K, Tadakuma R. Study on portable haptic guide device with omnidirectional driving gear. Adv Robot 2021. DOI: 10.1080/01691864.2021.1888796.
Affiliations
- Tetsuya Aizawa: Department of Mechanical Systems Engineering, Yamagata University, Yamagata, Japan
- Haruhiko Iizima: Department of Mechanical Systems Engineering, Yamagata University, Yamagata, Japan
- Kazuki Abe: Department of Mechanical Systems Engineering, Yamagata University, Yamagata, Japan
- Kenjiro Tadakuma: Graduate School of Information Sciences, Tohoku University, Miyagi, Japan
- Riichiro Tadakuma: Department of Mechanical Systems Engineering, Yamagata University, Yamagata, Japan

7. Zilbershtain-Kra Y, Graffi S, Ahissar E, Arieli A. Active sensory substitution allows fast learning via effective motor-sensory strategies. iScience 2021;24:101918. PMID: 33392481. PMCID: PMC7773576. DOI: 10.1016/j.isci.2020.101918.
Abstract
We examined the development of new sensing abilities in adults by training participants to perceive remote objects through their fingers. Using an Active-Sensing based sensory Substitution device (ASenSub), participants quickly learned to perceive fast via the new modality and preserved their high performance for more than 20 months. Both sighted and blind participants exhibited almost complete transfer of performance from 2D images to novel 3D physical objects. Perceptual accuracy and speed using the ASenSub were, on average, 300% and 600% better than previous reports for 2D images and 3D objects. This improvement is attributed to the ability of the participants to employ their own motor-sensory strategies. Sighted participants' dominant strategy was based on motor-sensory convergence on the most informative regions of objects, similar to fixation patterns in vision. Congenitally blind participants did not show such a tendency, and many of their exploratory procedures resembled those observed with natural touch.
Affiliations
- Yael Zilbershtain-Kra: Department of Neurobiology, Weizmann Institute of Science, 234 Herzl Street, Rehovot 76100, Israel
- Shmuel Graffi: Department of Neurobiology, Weizmann Institute of Science, 234 Herzl Street, Rehovot 76100, Israel
- Ehud Ahissar: Department of Neurobiology, Weizmann Institute of Science, 234 Herzl Street, Rehovot 76100, Israel
- Amos Arieli: Department of Neurobiology, Weizmann Institute of Science, 234 Herzl Street, Rehovot 76100, Israel

8. A Systematic Comparison of Perceptual Performance in Softness Discrimination with Different Fingers. Atten Percept Psychophys 2020;82:3696-3709. PMID: 32686066. PMCID: PMC7536162. DOI: 10.3758/s13414-020-02100-4.
Abstract
In studies investigating haptic softness perception, participants are typically instructed to explore soft objects by indenting them with their index finger. In contrast, performance with other fingers has rarely been investigated. We wondered which fingers are used in spontaneous exploration and whether performance differences between fingers can explain spontaneous usage. In Experiment 1 participants discriminated the softness of two rubber stimuli with hardly any constraints on finger movements. Results indicate that humans use successive phases of different fingers and finger combinations during an exploration, preferring the index, middle, and (to a lesser extent) ring finger. In Experiment 2 we compared discrimination thresholds between conditions in which participants used one of the four fingers of the dominant hand. Participants compared the softness of rubber stimuli in a two-interval forced-choice discrimination task. Performance with the index and middle fingers was better than with the ring and little fingers, and the little finger was the worst. In Experiment 3 we again compared discrimination thresholds, but participants were told to use constant peak force. Performance with the little finger was worst, whereas performance for the other fingers did not differ. We conclude that in spontaneous exploration the preference for combinations of the index, middle, and partly the ring finger seems to be well chosen, as indicated by improved performance with the spontaneously used fingers. Better performance seems to be based on both different motor abilities to produce force, mainly linked to using the index and middle finger, and different sensory sensitivities, mainly linked to avoiding the little finger.

9. Gruber LZ, Ahissar E. Closed loop motor-sensory dynamics in human vision. PLoS One 2020;15:e0240660. PMID: 33057398. PMCID: PMC7561174. DOI: 10.1371/journal.pone.0240660.
Abstract
Vision is obtained with a continuous motion of the eyes. The kinematic analysis of eye motion, during any visual or ocular task, typically reveals two (kinematic) components: saccades, which quickly replace the visual content in the retinal fovea, and drifts, which slowly scan the image after each saccade. While the saccadic exchange of regions of interest (ROIs) is commonly considered to be included in motor-sensory closed-loops, it is commonly assumed that drifts function in an open-loop manner, that is, independent of the concurrent visual input. Accordingly, visual perception is assumed to be based on a sequence of open-loop processes, each initiated by a saccade-triggered retinal snapshot. Here we directly challenged this assumption by testing the dependency of drift kinematics on concurrent visual inputs using real-time gaze-contingent-display. Our results demonstrate a dependency of the trajectory on the concurrent visual input, convergence of speed to condition-specific values and maintenance of selected drift-related motor-sensory controlled variables, all strongly indicative of drifts being included in a closed-loop brain-world process, and thus suggesting that vision is inherently a closed-loop process.
Affiliations
- Ehud Ahissar: Department of Neurobiology, Weizmann Institute of Science, Rehovot, Israel

10. Miller LE, Fabio C, Ravenda V, Bahmad S, Koun E, Salemme R, Luauté J, Bolognini N, Hayward V, Farnè A. Somatosensory Cortex Efficiently Processes Touch Located Beyond the Body. Curr Biol 2019;29:4276-4283.e5. PMID: 31813607. DOI: 10.1016/j.cub.2019.10.043.
Abstract
The extent to which a tool is an extension of its user is a question that has fascinated writers and philosophers for centuries [1]. Despite two decades of research [2-7], it remains unknown how this could be instantiated at the neural level. To this aim, the present study combined behavior, electrophysiology and neuronal modeling to characterize how the human brain could treat a tool like an extended sensory "organ." As with the body, participants localize touches on a hand-held tool with near-perfect accuracy [7]. This behavior is owed to the ability of the somatosensory system to rapidly and efficiently use the tool as a tactile extension of the body. Using electroencephalography (EEG), we found that where a hand-held tool was touched was immediately coded in the neural dynamics of primary somatosensory and posterior parietal cortices of healthy participants. We found similar neural responses in a proprioceptively deafferented patient with spared touch perception, suggesting that location information is extracted from the rod's vibrational patterns. Simulations of mechanoreceptor responses [8] suggested that the speed at which these patterns are processed is highly efficient. A second EEG experiment showed that touches on the tool and arm surfaces were localized by similar stages of cortical processing. Multivariate decoding algorithms and cortical source reconstruction provided further evidence that early limb-based processes were repurposed to map touch on a tool. We propose that an elementary strategy the human brain uses to sense with tools is to recruit primary somatosensory dynamics otherwise devoted to the body.
Affiliations
- Luke E Miller: Integrative Multisensory Perception Action & Cognition Team-ImpAct, Lyon Neuroscience Research Center, INSERM U1028, CNRS U5292, 16 Avenue Doyen Lépine, Bron 69676, France; University of Lyon 1, 43 Boulevard du 11 Novembre 1918, Villeurbanne 69100, France; Hospices Civils de Lyon, Neuro-immersion, 16 Avenue Doyen Lépine, Bron 69676, France
- Cécile Fabio: Integrative Multisensory Perception Action & Cognition Team-ImpAct, Lyon Neuroscience Research Center, INSERM U1028, CNRS U5292, 16 Avenue Doyen Lépine, Bron 69676, France; University of Lyon 1, 43 Boulevard du 11 Novembre 1918, Villeurbanne 69100, France
- Valeria Ravenda: Integrative Multisensory Perception Action & Cognition Team-ImpAct, Lyon Neuroscience Research Center, INSERM U1028, CNRS U5292, 16 Avenue Doyen Lépine, Bron 69676, France; University of Lyon 1, 43 Boulevard du 11 Novembre 1918, Villeurbanne 69100, France; Department of Psychology & Milan Center for Neuroscience-NeuroMi, University of Milano Bicocca, Building U6, 1 Piazza dell'Ateneo Nuovo, Milan 20126, Italy
- Salam Bahmad: Integrative Multisensory Perception Action & Cognition Team-ImpAct, Lyon Neuroscience Research Center, INSERM U1028, CNRS U5292, 16 Avenue Doyen Lépine, Bron 69676, France; University of Lyon 1, 43 Boulevard du 11 Novembre 1918, Villeurbanne 69100, France
- Eric Koun: Integrative Multisensory Perception Action & Cognition Team-ImpAct, Lyon Neuroscience Research Center, INSERM U1028, CNRS U5292, 16 Avenue Doyen Lépine, Bron 69676, France; University of Lyon 1, 43 Boulevard du 11 Novembre 1918, Villeurbanne 69100, France; Hospices Civils de Lyon, Neuro-immersion, 16 Avenue Doyen Lépine, Bron 69676, France
- Romeo Salemme: Integrative Multisensory Perception Action & Cognition Team-ImpAct, Lyon Neuroscience Research Center, INSERM U1028, CNRS U5292, 16 Avenue Doyen Lépine, Bron 69676, France; University of Lyon 1, 43 Boulevard du 11 Novembre 1918, Villeurbanne 69100, France; Hospices Civils de Lyon, Neuro-immersion, 16 Avenue Doyen Lépine, Bron 69676, France
- Jacques Luauté: Integrative Multisensory Perception Action & Cognition Team-ImpAct, Lyon Neuroscience Research Center, INSERM U1028, CNRS U5292, 16 Avenue Doyen Lépine, Bron 69676, France; University of Lyon 1, 43 Boulevard du 11 Novembre 1918, Villeurbanne 69100, France; Hospices Civils de Lyon, Neuro-immersion, 16 Avenue Doyen Lépine, Bron 69676, France
- Nadia Bolognini: Department of Psychology & Milan Center for Neuroscience-NeuroMi, University of Milano Bicocca, Building U6, 1 Piazza dell'Ateneo Nuovo, Milan 20126, Italy; Laboratory of Neuropsychology, IRCSS Istituto Auxologico Italiano, 28 Via G. Mercalli, Milan 20122, Italy
- Vincent Hayward: Sorbonne Université, Institut des Systèmes Intelligents et de Robotique (ISIR), 4 Place Jussieu, Paris 75005, France; Centre for the Study of the Senses, School of Advanced Study, University of London, Senate House, Malet Street, London WC1E 7HU, UK
- Alessandro Farnè: Integrative Multisensory Perception Action & Cognition Team-ImpAct, Lyon Neuroscience Research Center, INSERM U1028, CNRS U5292, 16 Avenue Doyen Lépine, Bron 69676, France; University of Lyon 1, 43 Boulevard du 11 Novembre 1918, Villeurbanne 69100, France; Hospices Civils de Lyon, Neuro-immersion, 16 Avenue Doyen Lépine, Bron 69676, France; Center for Mind/Brain Sciences, University of Trento, 31 Corso Bettini, Rovereto 38068, Italy

11. Zoeller AC, Lezkan A, Paulun VC, Fleming RW, Drewing K. Integration of prior knowledge during haptic exploration depends on information type. J Vis 2019;19:20. PMID: 30998830. DOI: 10.1167/19.4.20.
Abstract
When haptically exploring softness, humans use higher peak forces when indenting harder versus softer objects. Here, we investigated the influence of different channels and types of prior knowledge on initial peak forces. Participants explored two stimuli (hard vs. soft) and judged which was softer. In Experiment 1 participants received either semantic information (the words "hard" and "soft"), visual information (a video of an indentation), or prior information from recurring presentation (blocks of harder or softer pairs only). In a control condition no prior information was given (randomized presentation). In the recurring condition participants used higher initial forces when exploring harder stimuli. No effects were found in the control and semantic conditions. With visual prior information, participants used less force for harder objects. We speculate that these findings reflect differences between implicit knowledge induced by recurring presentation and explicit knowledge induced by visual and semantic information. To test this hypothesis, we investigated in Experiment 2 whether explicit prior information interferes with implicit information. Two groups of participants discriminated the softness of harder or softer stimuli in two conditions (blocked and randomized). The interference group received additional explicit information during the blocked condition; the implicit-only group did not. Implicit prior information was only used for force adaptation when no additional explicit information was given, whereas explicit information interfered with movement adaptation. The integration of prior knowledge only seems possible when implicit prior knowledge is induced, not with explicit knowledge.
Affiliations
- Aaron C Zoeller: Department of General Psychology, Giessen University, Gießen, Germany
- Alexandra Lezkan: Department of General Psychology, Giessen University, Gießen, Germany
- Vivian C Paulun: Department of General Psychology, Giessen University, Gießen, Germany
- Roland W Fleming: Department of General Psychology, Giessen University, Gießen, Germany
- Knut Drewing: Department of General Psychology, Giessen University, Gießen, Germany

12. Abraham C, Farah N, Gerbi-Zarfati L, Harpaz Y, Zalvesky Z, Mandel Y. Active photonic sensing for super-resolved reading performance in simulated prosthetic vision. Biomed Opt Express 2019;10:1081-1096. PMID: 30891331. PMCID: PMC6420299. DOI: 10.1364/boe.10.001081.
Abstract
In this work, we study the enhancement of simulated prosthetic reading performance through "active photonic sensing" in normally sighted subjects. Three sensing paradigms were implemented: active sensing, in which the subject actively scanned the presented words using the computer mouse, with an option to control text size; passive scanning, produced by software-initiated horizontal movements of words; and no scanning. Our findings reveal a 30% increase in word recognition rate with active scanning as compared to no or passive scanning, and up to a 14-fold increase with zooming. These results highlight the importance of a patient-interactive interface and shed light on techniques that can greatly enhance prosthetic vision quality.
Affiliations
- Chen Abraham (contributed equally): Faculty of Engineering and the Nanotechnology Center, Bar Ilan University, Ramat-Gan, Israel
- Nairouz Farah (contributed equally): Faculty of Life Sciences, School of Optometry and Vision Science, Bar Ilan University, Ramat Gan, Israel; Bar-Ilan Institute for Nanotechnology and Advanced Materials (BINA), Bar Ilan University, Ramat Gan, Israel
- Liron Gerbi-Zarfati: Faculty of Engineering and the Nanotechnology Center, Bar Ilan University, Ramat-Gan, Israel
- Yuval Harpaz: Gonda Brain Research Center, Bar-Ilan University, Ramat Gan, Israel
- Zeev Zalvesky: Faculty of Engineering and the Nanotechnology Center, Bar Ilan University, Ramat-Gan, Israel; Bar-Ilan Institute for Nanotechnology and Advanced Materials (BINA), Bar Ilan University, Ramat Gan, Israel
- Yossi Mandel: Faculty of Life Sciences, School of Optometry and Vision Science, Bar Ilan University, Ramat Gan, Israel; Bar-Ilan Institute for Nanotechnology and Advanced Materials (BINA), Bar Ilan University, Ramat Gan, Israel

13. Harrison LA, Kats A, Williams ME, Aziz-Zadeh L. The Importance of Sensory Processing in Mental Health: A Proposed Addition to the Research Domain Criteria (RDoC) and Suggestions for RDoC 2.0. Front Psychol 2019;10:103. PMID: 30804830. PMCID: PMC6370662. DOI: 10.3389/fpsyg.2019.00103.
Abstract
The time is ripe to integrate burgeoning evidence of the important role of sensory and motor functioning in mental health within the National Institute of Mental Health's [NIMH] Research Domain Criteria [RDoC] framework (National Institute of Mental Health, n.d.a), a multi-dimensional method of characterizing mental functioning in health and disease across all neurobiological levels of analysis ranging from genetic to behavioral. As the importance of motor processing in psychopathology has been recognized (Bernard and Mittal, 2015; Garvey and Cuthbert, 2017; National Institute of Mental Health, 2019), here we focus on sensory processing. First, we review the current design of the RDoC matrix, noting sensory features missing despite their prevalence in multiple mental illnesses. We identify two missing classes of sensory symptoms that we widely define as (1) sensory processing, including sensory sensitivity and active sensing, and (2) domains of perceptual signaling, including interoception and proprioception, which are currently absent or underdeveloped in the perception construct of the cognitive systems domain. Then, we describe the neurobiological basis of these psychological constructs and examine why these sensory features are important for understanding psychopathology. Where appropriate, we examine links between sensory processing and the domains currently included in the RDoC matrix. Throughout, we emphasize how the addition of these sensory features to the RDoC matrix is important for understanding a range of mental health disorders. We conclude with the suggestion that a separate sensation and perception domain can enhance the current RDoC framework, while discussing what we see as important principles and promising directions for the future development and use of the RDoC.
Affiliations
- Laura A. Harrison: USC Chan Division of Occupational Science and Occupational Therapy, University of Southern California, Los Angeles, CA, United States; Brain and Creativity Institute, University of Southern California, Los Angeles, CA, United States
- Anastasiya Kats: Brain and Creativity Institute, University of Southern California, Los Angeles, CA, United States
- Marian E. Williams: Children's Hospital Los Angeles, University of Southern California, Los Angeles, CA, United States
- Lisa Aziz-Zadeh: USC Chan Division of Occupational Science and Occupational Therapy, University of Southern California, Los Angeles, CA, United States; Brain and Creativity Institute, University of Southern California, Los Angeles, CA, United States

14. Interdependences between finger movement direction and haptic perception of oriented textures. PLoS One 2018;13:e0208988. PMID: 30550578. PMCID: PMC6294351. DOI: 10.1371/journal.pone.0208988.
Abstract
Although the natural haptic perception of textures includes active finger movements, it is unclear how closely perception and movements are linked. Here we investigated this question using oriented textures. Textures that are composed of periodically repeating grooves have a clear orientation defined by the grooves. The direction of finger movement relative to texture orientation determines the availability of temporal cues to the spatial period of the texture. These cues are absent during movements directed in line with texture orientation, whereas movements orthogonal to texture orientation maximize the temporal frequency of stimulation. This may optimize temporal cues. In Experiment 1 we tested whether texture perception gets more precise the more orthogonal the movement direction is to the texture. We systematically varied the movement direction within a 2IFC spatial period discrimination task. As expected, perception was more precise (lower discrimination thresholds) when finger movements were directed closer towards the texture orthogonal as compared to in parallel to the texture. In Experiment 2 we investigated whether people adjust movement directions to the texture orthogonal in free exploration. We recorded movement directions during free exploration of standard and comparison gratings. The standard gratings were clearly oriented. The comparison gratings did not have a clear orientation defined by grooves. Participants adjusted movement directions to the texture orthogonal only for clearly oriented textures (standards). The adjustment to texture orthogonal was present in the final movement but not in the first movement. This suggests that movement adjustment is based on sensory signals for texture orientation that were gathered over the course of exploration. In Experiment 3 we assessed whether the perception of texture orientation and movement adjustments are based on shared sensory signals. We determined perceptual thresholds for orientation discrimination and computed 'movometric' thresholds from the stroke-by-stroke adjustment of movement direction. Perception and movements were influenced by a common factor, the spatial period, suggesting that the same sensory signals for texture orientation contribute to both. We conclude that people optimize texture perception by adjusting their movements in directions that maximize temporal cue frequency. Adjustments are performed on the basis of sensory signals that are also used for perception.
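The geometric relation implied by this abstract can be made explicit with a short worked example (illustrative numbers, not values from the study): for a grating of spatial period p explored at finger speed v, moving at angle theta away from the texture orthogonal yields a temporal stimulation frequency f = v|cos(theta)|/p, which is maximal for orthogonal movement and zero for movement along the grooves.

```python
# Temporal cue frequency as a function of movement direction relative to a grating.
import numpy as np

def temporal_frequency(speed_mm_s, period_mm, angle_to_orthogonal_deg):
    """f = v * |cos(theta)| / p, with theta measured from the texture orthogonal."""
    theta = np.radians(angle_to_orthogonal_deg)
    return speed_mm_s * np.abs(np.cos(theta)) / period_mm

for angle in (0, 30, 60, 90):
    f = temporal_frequency(speed_mm_s=60.0, period_mm=1.5, angle_to_orthogonal_deg=angle)
    print(f"{angle:2d} deg from orthogonal -> {f:5.1f} Hz")
```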

15. Biswas D, Arend LA, Stamper SA, Vágvölgyi BP, Fortune ES, Cowan NJ. Closed-Loop Control of Active Sensing Movements Regulates Sensory Slip. Curr Biol 2018;28:4029-4036.e4. PMID: 30503617. DOI: 10.1016/j.cub.2018.11.002.
Abstract
Active sensing involves the production of motor signals for the purpose of acquiring sensory information [1-3]. The most common form of active sensing, found across animal taxa and behaviors, involves the generation of movements, e.g., whisking [4-6], touching [7, 8], sniffing [9, 10], and eye movements [11]. Active sensing movements profoundly affect the information carried by sensory feedback pathways [12-15] and are modulated by both top-down goals (e.g., measuring weight versus texture [1, 16]) and bottom-up stimuli (e.g., lights on or off [12]), but it remains unclear whether and how these movements are controlled in relation to the ongoing feedback they generate. To investigate the control of movements for active sensing, we created an experimental apparatus for freely swimming weakly electric fish, Eigenmannia virescens, that modulates the gain of reafferent feedback by adjusting the position of a refuge based on real-time videographic measurements of fish position. We discovered that fish robustly regulate sensory slip via closed-loop control of active sensing movements. Specifically, as fish performed the task of maintaining position inside the refuge [17-22], they dramatically up- or downregulated fore-aft active sensing movements in relation to a 4-fold change of experimentally modulated reafferent gain. These changes in swimming movements served to maintain a constant magnitude of sensory slip. The magnitude of sensory slip depended on the presence or absence of visual cues. These results indicate that fish use two controllers: one that controls the acquisition of information by regulating feedback from active sensing movements and another that maintains position in the refuge, a control structure that may be ubiquitous in animals [23, 24].
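The slip-regulation result lends itself to a compact illustration. The loop below is a minimal sketch under simplifying assumptions (a scalar "movement amplitude" and a gain-normalized corrective update), not the authors' published model or apparatus: the simulated animal rescales its fore-aft movement amplitude so that reafferent gain times amplitude stays near a set point across a four-fold gain change.

```python
# Regulating sensory slip under experimenter-imposed changes of reafferent gain.
def regulate_slip(gains, target_slip=1.0, amplitude=1.0, lr=0.5, steps_per_gain=50):
    history = []
    for g in gains:                       # imposed reafferent gain (e.g., 1.0 -> 0.25)
        for _ in range(steps_per_gain):
            slip = g * amplitude          # slip magnitude the animal experiences
            amplitude += lr * (target_slip - slip) / g   # corrective update, scaled by gain
        history.append((g, round(amplitude, 3), round(g * amplitude, 3)))
    return history

for gain, amp, slip in regulate_slip(gains=[1.0, 0.25, 1.0]):
    print(f"gain={gain:4.2f}  movement amplitude={amp:5.2f}  resulting slip={slip:4.2f}")
```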
Affiliations
- Debojyoti Biswas: Department of Electrical and Computer Engineering, Johns Hopkins University, 3400 N. Charles Street, Baltimore, MD 21218, USA
- Luke A Arend: Laboratory for Computational Sensing and Robotics, Johns Hopkins University, 3400 N. Charles Street, Baltimore, MD 21218, USA
- Sarah A Stamper: Department of Mechanical Engineering, Johns Hopkins University, 3400 N. Charles Street, Baltimore, MD 21218, USA
- Balázs P Vágvölgyi: Laboratory for Computational Sensing and Robotics, Johns Hopkins University, 3400 N. Charles Street, Baltimore, MD 21218, USA
- Eric S Fortune: Federated Department of Biological Sciences, New Jersey Institute of Technology, 323 Dr. Martin Luther King Jr. Boulevard, Newark, NJ 07102, USA
- Noah J Cowan: Department of Electrical and Computer Engineering, Laboratory for Computational Sensing and Robotics, and Department of Mechanical Engineering, Johns Hopkins University, 3400 N. Charles Street, Baltimore, MD 21218, USA

16. Lezkan A, Metzger A, Drewing K. Active Haptic Exploration of Softness: Indentation Force Is Systematically Related to Prediction, Sensation and Motivation. Front Integr Neurosci 2018;12:59. PMID: 30555306. PMCID: PMC6281961. DOI: 10.3389/fnint.2018.00059.
Abstract
Active finger movements play a crucial role in natural haptic perception. For the perception of different haptic properties people use different well-chosen movement schemes (Lederman and Klatzky, 1987). The haptic property of softness is stereotypically judged by repeatedly pressing one's finger against an object's surface, actively indenting the object. It has been shown that people adjust the peak indentation forces of their pressing movements to the expected stimulus' softness in order to improve perception (Kaim and Drewing, 2011). Here, we aim to clarify the mechanisms underlying such adjustments. We disentangle how people modulate executed peak indentation forces depending on predictive vs. sensory signals to softness, and investigate the influence of the participants' motivational state on movement adjustments. In Experiment 1, participants performed a two-alternative forced-choice (2AFC) softness discrimination task for stimulus pairs from one of four softness categories. We manipulated the predictability of the softness category. Either all stimuli of the same category were presented in a blocked fashion, which allowed predicting the softness category of the upcoming pair (predictive signals high), or stimuli from different categories were randomly intermixed, which made prediction impossible (predictive signals low). Sensory signals about the softness category of the two stimuli in a pair are gathered during exploration. We contrasted the first indentation (sensory signals low) and the last indentation (sensory signals high) in order to examine the effect of sensory signals. The results demonstrate that participants systematically apply lower forces when softer objects (as compared to harder objects) are indicated by predictive signals. Notably, sensory signals seemed to be not as relevant as predictive signals. However, in Experiment 2, we manipulated participant motivation by introducing rewards for good performance, and showed that the use of sensory information for movement adjustments can be fostered by high motivation. Overall, the present study demonstrates that exploratory movements are adjusted to the actual perceptual situation and that in the process of fine-tuning, closed- and open-loop mechanisms interact, with varying contributions depending on the observer's motivation.
Affiliations
- Alexandra Lezkan: Department of General Psychology, Justus-Liebig University Giessen, Giessen, Germany
- Anna Metzger: Department of General Psychology, Justus-Liebig University Giessen, Giessen, Germany
- Knut Drewing: Department of General Psychology, Justus-Liebig University Giessen, Giessen, Germany

17. Miller LE, Montroni L, Koun E, Salemme R, Hayward V, Farnè A. Sensing with tools extends somatosensory processing beyond the body. Nature 2018;561:239-242. DOI: 10.1038/s41586-018-0460-0.

18. Buchs G, Simon N, Maidenbaum S, Amedi A. Waist-up protection for blind individuals using the EyeCane as a primary and secondary mobility aid. Restor Neurol Neurosci 2018;35:225-235. PMID: 28157111. PMCID: PMC5366249. DOI: 10.3233/rnn-160686.
Abstract
Background: One of the most stirring statistics in relation to the mobility of blind individuals is the high rate of upper-body injuries, even when using the white cane. Objective: We here addressed a rehabilitation-oriented challenge of providing a reliable tool for blind people to avoid waist-up obstacles, namely one of the impediments to their successful mobility using currently available methods (e.g., the white cane). Methods: We used the EyeCane, a device we developed which translates distances from several angles to haptic and auditory cues in an intuitive and unobtrusive manner, serving both as a primary and secondary mobility aid. We investigated the rehabilitation potential of such a device in facilitating visionless waist-up body protection. Results: After ~5 minutes of training with the EyeCane, blind participants were able to successfully detect and avoid obstacles waist-high and up. This was significantly higher than their success when using the white cane alone. As avoidance of obstacles required participants to perform an additional cognitive process after their detection, the avoidance rate was significantly lower than the detection rate. Conclusion: Our work has demonstrated that the EyeCane has the potential to extend the sensory world of blind individuals by expanding their currently accessible inputs, and has offered them a new practical rehabilitation tool.
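As a purely illustrative sketch of the kind of distance-to-cue mapping the abstract describes (a hypothetical mapping with made-up parameters, not the EyeCane's actual firmware or calibration): nearer obstacles along each sensed direction produce faster haptic or auditory pulse rates.

```python
# Hypothetical distance-to-pulse-rate mapping: closer obstacle -> faster pulses.
def distance_to_pulse_rate_hz(distance_m, max_range_m=5.0, min_rate=1.0, max_rate=30.0):
    """Return a pulse rate in Hz; obstacles beyond max_range_m produce no pulses."""
    if distance_m >= max_range_m:
        return 0.0
    proximity = 1.0 - distance_m / max_range_m      # 0 (far) .. 1 (touching)
    return min_rate + proximity * (max_rate - min_rate)

beams = {"head-height beam": 0.8, "waist-height beam": 2.5, "ground beam": 4.9}
for direction, d in beams.items():
    print(f"{direction}: obstacle at {d} m -> {distance_to_pulse_rate_hz(d):.1f} Hz pulses")
```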
Affiliations
- Galit Buchs: Department of Cognitive Science, Faculty of Humanities, Hebrew University of Jerusalem, Hadassah Ein-Kerem, Jerusalem, Israel
- Noa Simon: The Edmond and Lily Safra Center for Brain Research, Hebrew University of Jerusalem, Hadassah Ein-Kerem, Jerusalem, Israel
- Shachar Maidenbaum: Department of Medical Neurobiology, Institute for Medical Research Israel-Canada, Faculty of Medicine, Hebrew University of Jerusalem, Hadassah Ein-Kerem, Jerusalem, Israel
- Amir Amedi: Department of Cognitive Science, Faculty of Humanities, Hebrew University of Jerusalem, Hadassah Ein-Kerem, Jerusalem, Israel; The Edmond and Lily Safra Center for Brain Research, Hebrew University of Jerusalem, Hadassah Ein-Kerem, Jerusalem, Israel; Department of Medical Neurobiology, Institute for Medical Research Israel-Canada, Faculty of Medicine, Hebrew University of Jerusalem, Hadassah Ein-Kerem, Jerusalem, Israel; Sorbonne Universités UPMC Univ Paris 06, Institut de la Vision, Paris, France

19. Attention Robustly Gates a Closed-Loop Touch Reflex. Curr Biol 2017;27:1836-1843.e7. DOI: 10.1016/j.cub.2017.05.058.

20.
Abstract
Perception of external objects involves sensory acquisition via the relevant sensory organs. A widely-accepted assumption is that the sensory organ is the first station in a serial chain of processing circuits leading to an internal circuit in which a percept emerges. This open-loop scheme, in which the interaction between the sensory organ and the environment is not affected by its concurrent downstream neuronal processing, is strongly challenged by behavioral and anatomical data. We present here a hypothesis in which the perception of external objects is a closed-loop dynamical process encompassing loops that integrate the organism and its environment and converging towards organism-environment steady-states. We discuss the consistency of closed-loop perception (CLP) with empirical data and show that it can be synthesized in a robotic setup. Testable predictions are proposed for empirical distinction between open and closed loop schemes of perception.
Affiliations
- Ehud Ahissar: Department of Neurobiology, Weizmann Institute of Science, Rehovot, Israel
- Eldad Assa: Department of Neurobiology, Weizmann Institute of Science, Rehovot, Israel

21. Hobbs JA, Towal RB, Hartmann MJZ. Spatiotemporal Patterns of Contact Across the Rat Vibrissal Array During Exploratory Behavior. Front Behav Neurosci 2016;9:356. PMID: 26778990. PMCID: PMC4700281. DOI: 10.3389/fnbeh.2015.00356.
Abstract
The rat vibrissal system is an important model for the study of somatosensation, but the small size and rapid speed of the vibrissae have precluded measuring precise vibrissal-object contact sequences during behavior. We used a laser light sheet to quantify, with 1 ms resolution, the spatiotemporal structure of whisker-surface contact as five naïve rats freely explored a flat, vertical glass wall. Consistent with previous work, we show that the whisk cycle cannot be uniquely defined because different whiskers often move asynchronously, but that quasi-periodic (~8 Hz) variations in head velocity represent a distinct temporal feature on which to lock analysis. Around times of minimum head velocity, whiskers protract to make contact with the surface, and then sustain contact with the surface for extended durations (~25-60 ms) before detaching. This behavior results in discrete temporal windows in which large numbers of whiskers are in contact with the surface. These "sustained collective contact intervals" (SCCIs) were observed on 100% of whisks for all five rats. The overall spatiotemporal structure of the SCCIs can be qualitatively predicted based on information about head pose and the average whisk cycle. In contrast, precise sequences of whisker-surface contact depend on detailed head and whisker kinematics. Sequences of vibrissal contact were highly variable, equally likely to propagate in all directions across the array. Somewhat more structure was found when sequences of contacts were examined on a row-wise basis. In striking contrast to the high variability associated with contact sequences, a consistent feature of each SCCI was that the contact locations of the whiskers on the glass converged and moved more slowly on the sheet. Together, these findings lead us to propose that the rat uses a strategy of "windowed sampling" to extract an object's spatial features: specifically, the rat spatially integrates quasi-static mechanical signals across whiskers during the period of sustained contact, resembling an "enclosing" haptic procedure.
Affiliations
- Jennifer A Hobbs: Department of Physics and Astronomy, Northwestern University, Evanston, IL, USA
- R Blythe Towal: Department of Biomedical Engineering, Northwestern University, Evanston, IL, USA
- Mitra J Z Hartmann: Department of Biomedical Engineering, Northwestern University, Evanston, IL, USA; Department of Mechanical Engineering, Northwestern University, Evanston, IL, USA

22. Buchs G, Maidenbaum S, Levy-Tzedek S, Amedi A. Integration and binding in rehabilitative sensory substitution: Increasing resolution using a new Zooming-in approach. Restor Neurol Neurosci 2016;34:97-105. PMID: 26518671. PMCID: PMC4927841. DOI: 10.3233/rnn-150592.
Abstract
Purpose: To visually perceive our surroundings we constantly move our eyes and focus on particular details, and then integrate them into a combined whole. Current visual rehabilitation methods, both invasive, like bionic eyes, and non-invasive, like Sensory Substitution Devices (SSDs), down-sample visual stimuli into low-resolution images. Zooming in to sub-parts of the scene could potentially improve detail perception. Can congenitally blind individuals integrate a 'visual' scene when offered this information via different sensory modalities, such as audition? Can they integrate visual information, perceived in parts, into larger percepts despite never having had any visual experience? Methods: We explored these questions using a zooming-in functionality embedded in the EyeMusic visual-to-auditory SSD. Eight blind participants were tasked with identifying cartoon faces by integrating their individual components recognized via the EyeMusic's zooming mechanism. Results: After specialized training of just 6-10 hours, blind participants successfully and actively integrated facial features into cartooned identities in 79±18% of the trials, a highly significant result (chance level 10%; rank-sum P < 1.55E-04). Conclusions: These findings show that even users who lacked any previous visual experience whatsoever can indeed integrate this visual information with increased resolution. This potentially has important practical visual rehabilitation implications for both invasive and non-invasive methods.
Affiliations
- Galit Buchs: Department of Cognitive Science, Faculty of Humanities, Hebrew University of Jerusalem, Hadassah Ein-Kerem, Jerusalem, Israel; The Edmond and Lily Safra Center for Brain Research, Hebrew University of Jerusalem, Hadassah Ein-Kerem, Jerusalem, Israel
- Shachar Maidenbaum: The Edmond and Lily Safra Center for Brain Research, Hebrew University of Jerusalem, Hadassah Ein-Kerem, Jerusalem, Israel; Department of Medical Neurobiology, Institute for Medical Research Israel-Canada, Faculty of Medicine, Hebrew University of Jerusalem, Hadassah Ein-Kerem, Jerusalem, Israel
- Shelly Levy-Tzedek: Recanati School for Community Health Professions, Department of Physical Therapy, Ben Gurion University of the Negev, Beer-Sheva, Israel; Zlotowski Center for Neuroscience, Ben Gurion University of the Negev, Beer-Sheva, Israel
- Amir Amedi: Department of Cognitive Science, Faculty of Humanities, Hebrew University of Jerusalem, Hadassah Ein-Kerem, Jerusalem, Israel; The Edmond and Lily Safra Center for Brain Research, Hebrew University of Jerusalem, Hadassah Ein-Kerem, Jerusalem, Israel; Department of Medical Neurobiology, Institute for Medical Research Israel-Canada, Faculty of Medicine, Hebrew University of Jerusalem, Hadassah Ein-Kerem, Jerusalem, Israel; Sorbonne Universités UPMC Univ Paris 06, Institut de la Vision, Paris, France

23. Ahissar E, Ozana S, Arieli A. 1-D Vision: Encoding of Eye Movements by Simple Receptive Fields. Perception 2015;44:986-94. PMID: 26562913. DOI: 10.1177/0301006615594946.
Abstract
Eye movements (eyeM) are an essential component of visual perception. They allow the sampling and scanning of stationary scenes at various spatial scales, primarily at the scene level, via saccades, and at the local level, via fixational eyeM. Given the constant motion of visual images on the retina, a crucial factor in resolving spatial ambiguities related to the external scene is the exact trajectory of eyeM. We show here that the trajectory of eyeM can be encoded at high resolution by simple retinal receptive fields of the symmetrical type. We also show that such encoding can account for motion illusions such as the Ouchi illusion. In addition, encoding of motion projections along horizontal and vertical symmetrical simple retinal receptive fields entails a kind of Cartesian decomposition of the 2-D image into two 1-D projections.
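The "Cartesian decomposition" mentioned at the end of this abstract can be illustrated in a few lines of Python under simplifying assumptions (plain row and column sums standing in for symmetric receptive fields, not the paper's model): the two 1-D projections of an image separate horizontal from vertical displacements.

```python
# Two 1-D projections of a 2-D image: a horizontal shift moves only the column-sum
# profile, while the row-sum profile is unaffected.
import numpy as np

def projections(image):
    return image.sum(axis=0), image.sum(axis=1)   # (horizontal profile, vertical profile)

rng = np.random.default_rng(0)
img = rng.random((32, 32))
shifted = np.roll(img, 3, axis=1)                 # a purely horizontal, eye-movement-like shift

h0, v0 = projections(img)
h1, v1 = projections(shifted)
print("horizontal profile shifted by 3 px:", np.allclose(np.roll(h0, 3), h1))
print("vertical profile unchanged:        ", np.allclose(v0, v1))
```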
Affiliations
- Ehud Ahissar: Department of Neurobiology, Weizmann Institute of Science, Rehovot, Israel
- Shira Ozana: Department of Neurobiology, Weizmann Institute of Science, Rehovot, Israel
- Amos Arieli: Department of Neurobiology, Weizmann Institute of Science, Rehovot, Israel

24. Bui TV, Stifani N, Panek I, Farah C. Genetically identified spinal interneurons integrating tactile afferents for motor control. J Neurophysiol 2015;114:3050-63. PMID: 26445867. DOI: 10.1152/jn.00522.2015.
Abstract
Our movements are shaped by our perception of the world as communicated by our senses. Perception of sensory information has been largely attributed to cortical activity. However, a prior level of sensory processing occurs in the spinal cord. Indeed, sensory inputs directly project to many spinal circuits, some of which communicate with motor circuits within the spinal cord. Therefore, the processing of sensory information for the purpose of ensuring proper movements is distributed between spinal and supraspinal circuits. The mechanisms underlying the integration of sensory information for motor control at the level of the spinal cord have yet to be fully described. Recent research has led to the characterization of spinal neuron populations that share common molecular identities. Identification of molecular markers that define specific populations of spinal neurons is a prerequisite to the application of genetic techniques devised to both delineate the function of these spinal neurons and their connectivity. This strategy has been used in the study of spinal neurons that receive tactile inputs from sensory neurons innervating the skin. As a result, the circuits that include these spinal neurons have been revealed to play important roles in specific aspects of motor function. We describe these genetically identified spinal neurons that integrate tactile information and the contribution of these studies to our understanding of how tactile information shapes motor output. Furthermore, we describe future opportunities that these circuits present for shedding light on the neural mechanisms of tactile processing.
Affiliation(s)
- Tuan V Bui
- Department of Biology, University of Ottawa, Ottawa, Ontario, Canada; Center for Neural Dynamics, University of Ottawa, Ottawa, Ontario, Canada; and
| | - Nicolas Stifani
- Department of Medical Neuroscience, Dalhousie University, Halifax, Nova Scotia, Canada
| | - Izabela Panek
- Department of Medical Neuroscience, Dalhousie University, Halifax, Nova Scotia, Canada
| | - Carl Farah
- Department of Biology, University of Ottawa, Ottawa, Ontario, Canada
25
Motion makes sense: an adaptive motor-sensory strategy underlies the perception of object location in rats. J Neurosci 2015; 35:8777-89. [PMID: 26063912 DOI: 10.1523/jneurosci.4149-14.2015] [Citation(s) in RCA: 33] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/21/2022] Open
Abstract
Tactile perception is obtained by coordinated motor-sensory processes. We studied the processes underlying the perception of object location in freely moving rats. We trained rats to identify the relative location of two vertical poles placed in front of them and measured at high resolution the motor and sensory variables (19 and 2 variables, respectively) associated with this whiskers-based perceptual process. We found that the rats developed stereotypic head and whisker movements to solve this task, in a manner that can be described by several distinct behavioral phases. During two of these phases, the rats' whiskers coded object position by first temporal and then angular coding schemes. We then introduced wind (in two opposite directions) and remeasured their perceptual performance and motor-sensory variables. Our rats continued to perceive object location in a consistent manner under wind perturbations while maintaining all behavioral phases and relatively constant sensory coding. Constant sensory coding was achieved by keeping one group of motor variables (the "controlled variables") constant, despite the perturbing wind, at the cost of strongly modulating another group of motor variables (the "modulated variables"). The controlled variables included coding-relevant variables, such as head azimuth and whisker velocity. These results indicate that consistent perception of location in the rat is obtained actively, via a selective control of perception-relevant motor variables.
26
Brownstone RM, Bui TV, Stifani N. Spinal circuits for motor learning. Curr Opin Neurobiol 2015; 33:166-73. [PMID: 25978563 DOI: 10.1016/j.conb.2015.04.007] [Citation(s) in RCA: 27] [Impact Index Per Article: 2.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/01/2015] [Revised: 04/25/2015] [Accepted: 04/28/2015] [Indexed: 12/11/2022]
Abstract
Studies of motor learning have largely focussed on the cerebellum, and have provided key concepts about neural circuits required. However, other parts of the nervous system are involved in learning, as demonstrated by the capacity to 'train' spinal circuits to produce locomotion following spinal cord injury. While somatosensory feedback is necessary for spinal motor learning, feed forward circuits within the spinal cord must also contribute. In fact, motoneurons themselves could act as comparators that integrate feed forward and feedback inputs, and thus contribute to motor learning. Application of cerebellar-derived principles to spinal circuitry leads to testable predictions of spinal organization required for motor learning.
Affiliation(s)
- Robert M Brownstone
- Department of Surgery (Neurosurgery), Dalhousie University, Halifax, Nova Scotia, Canada B3H 4R2; Department of Medical Neuroscience, Dalhousie University, Halifax, Nova Scotia, Canada B3H 4R2.
| | - Tuan V Bui
- Department of Biology, University of Ottawa, Ottawa, Ontario, Canada K1N 6N5; Centre for Neural Dynamics, University of Ottawa, Ottawa, Ontario, Canada K1N 6N5
| | - Nicolas Stifani
- Department of Medical Neuroscience, Dalhousie University, Halifax, Nova Scotia, Canada B3H 4R2
27
Fonio E, Gordon G, Barak N, Winetraub Y, Oram TB, Haidarliu S, Kimchi T, Ahissar E. Coordination of sniffing and whisking depends on the mode of interaction with the environment. Isr J Ecol Evol 2015. [DOI: 10.1080/15659801.2015.1124656] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 09/30/2022]
Abstract
Smell and touch convey most of the information that nocturnal rodents collect in their natural environments, each via its own complex network of muscles, receptors and neurons. Being active senses, a critical factor determining the integration of their sensations relates to the degree of their coordination. While it has been known for nearly 50 years that sniffing and whisking can be coordinated, the dynamics of such coordination and its dependency on behavioral and environmental conditions are not yet understood. Here we introduce a novel non-invasive method to track sniffing along with whisking and locomotion using high-resolution video recordings of mice, during free exploration of an open arena. Active sensing parameters in each modality showed significant dependency on exploratory modes (“Outbound”, “Exploration” and “Inbound”) and locomotion speed. Surprisingly, the correlation between sniffing and whisking was often as high as the bilateral inter-whisker correlation. Both inter-whisker and inter-modal coordination switched between distinct high-correlation and low-correlation states. The fraction of time with high-correlation states was higher in the Outbound and Exploration modes compared with the Inbound mode. Overall, these data indicate that sniffing–whisking coordination is a complex dynamic process, likely to be controlled by multiple-level inter-modal coordinated loops of motor-sensory networks.
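The switching between high- and low-correlation states can be quantified with a sliding-window correlation between the sniffing and whisking signals. The sketch below uses invented toy signals, window length, and threshold; it illustrates the analysis idea rather than the paper's exact method.

```python
import numpy as np

def sliding_correlation(x, y, win):
    """Pearson correlation between two signals computed in a sliding window."""
    n = len(x) - win + 1
    r = np.empty(n)
    for i in range(n):
        r[i] = np.corrcoef(x[i:i + win], y[i:i + win])[0, 1]
    return r

# Toy signals: a shared rhythm (coordinated epochs) plus independent noise.
t = np.arange(0, 20, 0.01)
shared = np.sin(2 * np.pi * 8 * t)                          # ~8 Hz "whisking-like" rhythm
sniff = shared + 0.3 * np.random.randn(t.size)
whisk = np.where(t < 10, shared, np.random.randn(t.size))   # decoupled in the second half
r = sliding_correlation(sniff, whisk, win=200)              # 2-s windows
high_corr_fraction = np.mean(r > 0.7)                       # time spent in the high-correlation state
```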
Affiliation(s)
- Ehud Fonio
- Department of Physics of Complex Systems, Weizmann Institute of Science
| | - Goren Gordon
- Department of Industrial Engineering, Tel-Aviv University
| | - Noy Barak
- Department of Neurobiology, Weizmann Institute of Science
| | | | | | | | - Tali Kimchi
- Department of Neurobiology, Weizmann Institute of Science
| | - Ehud Ahissar
- Department of Neurobiology, Weizmann Institute of Science
28
Abstract
When encountering novel environments, animals perform complex yet structured exploratory behaviors. Despite their typical structuring, the principles underlying exploratory patterns are still not sufficiently understood. Here we analyzed exploratory behavioral data from two modalities: whisking and locomotion in rats and mice. We found that these rodents maximized novelty signal-to-noise ratio during each exploration episode, where novelty is defined as the accumulated information gain. We further found that these rodents maximized novelty during outbound exploration, used novelty-triggered withdrawal-like retreat behavior, and explored the environment in a novelty-descending sequence. We applied a hierarchical curiosity model, which incorporates these principles, to both modalities. We show that the model captures the major components of exploratory behavior in multiple timescales: single excursions, exploratory episodes, and developmental timeline. The model predicted that novelty is managed across exploratory modalities. Using a novel experimental setup in which mice encountered a novel object for the first time in their life, we tested and validated this prediction. Further predictions, related to the development of brain circuitry, are described. This study demonstrates that rodents select exploratory actions according to a novelty management framework and suggests a plausible mechanism by which mammalian exploration primitives can be learned during development and integrated in adult exploration of complex environments.
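Novelty is defined here as accumulated information gain. For a single binary feature this can be sketched as the cumulative entropy drop of a Beta belief updated by each observation; the prior, the Bernoulli feature, and the use of entropy drop as the gain are illustrative simplifications, not the authors' hierarchical model.

```python
import numpy as np
from scipy.stats import beta

def accumulated_information_gain(observations, a0=1.0, b0=1.0):
    """Cumulative information gain about a single Bernoulli feature.

    The belief about the feature is a Beta(a, b) distribution; the entropy drop
    caused by each observation serves as a simple proxy for information gain,
    and the running sum plays the role of novelty consumed so far.
    """
    a, b = a0, b0
    total, trace = 0.0, []
    for obs in observations:
        before = beta(a, b).entropy()
        a, b = a + obs, b + (1 - obs)           # Bayesian update of the Beta belief
        total += before - beta(a, b).entropy()  # gain contributed by this sample
        trace.append(total)
    return np.array(trace)

gains = accumulated_information_gain(np.random.binomial(1, 0.3, size=50))
# Early samples contribute large gains; later ones add progressively less
# as the feature becomes familiar.
```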
29
Learning and control of exploration primitives. J Comput Neurosci 2014; 37:259-80. [PMID: 24796479 DOI: 10.1007/s10827-014-0500-1] [Citation(s) in RCA: 9] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/25/2013] [Revised: 02/11/2014] [Accepted: 03/12/2014] [Indexed: 12/17/2022]
Abstract
Animals explore novel environments in a cautious manner, exhibiting alternation between curiosity-driven behavior and retreats. We present a detailed formal framework for exploration behavior, which generates behavior that maintains a constant level of novelty. Similar to other types of complex behaviors, the resulting exploratory behavior is composed of exploration motor primitives. These primitives can be learned during a developmental period, wherein the agent experiences repeated interactions with environments that share common traits, thus allowing transference of motor learning to novel environments. The emergence of exploration motor primitives is the result of reinforcement learning in which information gain serves as intrinsic reward. Furthermore, actors and critics are local and ego-centric, thus enabling transference to other environments. Novelty control, i.e. the principle which governs the maintenance of constant novelty, is implemented by a central action-selection mechanism, which switches between the emergent exploration primitives and a retreat policy, based on the currently-experienced novelty. The framework has only a few parameters, wherein time-scales, learning rates and thresholds are adaptive, and can thus be easily applied to many scenarios. We implement it by modeling the rodent's whisking system and show that it can explain characteristic observed behaviors. A detailed discussion of the framework's merits and flaws, as compared to other related models, concludes the paper.
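The central novelty-control mechanism, which switches between emergent exploration primitives and a retreat policy according to the currently-experienced novelty, can be sketched as follows. The primitives, threshold, and toy novelty dynamics are invented placeholders; the paper's framework additionally learns the primitives by reinforcement with information gain as intrinsic reward.

```python
import random

def novelty_controlled_step(novelty, threshold, primitives, retreat):
    """Central action selection: explore while novelty is tolerable, retreat otherwise.

    `novelty` stands for the currently-experienced novelty (e.g., recent information
    gain); the threshold, primitives, and retreat policy here are placeholders.
    """
    if novelty > threshold:
        return retreat                  # novelty-triggered withdrawal
    return random.choice(primitives)    # pick an exploration primitive

# Toy run: novelty decays as the environment becomes familiar.
primitives = ["protract_whisk", "head_scan", "advance"]
novelty, threshold = 1.0, 0.6
for step in range(10):
    action = novelty_controlled_step(novelty, threshold, primitives, retreat="retreat")
    novelty *= 0.5 if action == "retreat" else 0.8   # retreats reduce novelty faster (toy dynamics)
    print(step, action, round(novelty, 2))
```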
30
Weber’s law in tactile grasping and manual estimation: Feedback-dependent evidence for functionally distinct processing streams. Brain Cogn 2014; 86:32-41. [DOI: 10.1016/j.bandc.2014.01.014] [Citation(s) in RCA: 19] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/21/2013] [Revised: 01/26/2014] [Accepted: 01/27/2014] [Indexed: 11/21/2022]
31
Kato S, Xu Y, Cho CE, Abbott LF, Bargmann CI. Temporal responses of C. elegans chemosensory neurons are preserved in behavioral dynamics. Neuron 2014; 81:616-28. [PMID: 24440227 DOI: 10.1016/j.neuron.2013.11.020] [Citation(s) in RCA: 70] [Impact Index Per Article: 6.4] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 11/06/2013] [Indexed: 12/20/2022]
Abstract
Animals track fluctuating stimuli over multiple timescales during natural olfactory behaviors. Here, we define mechanisms underlying these computations in Caenorhabditis elegans. By characterizing neuronal calcium responses to rapidly fluctuating odor sequences, we show that sensory neurons reliably track stimulus fluctuations relevant to behavior. AWC olfactory neurons respond to multiple odors with subsecond precision required for chemotaxis, whereas ASH nociceptive neurons integrate noxious cues over several seconds to reach a threshold for avoidance behavior. Each neuron's response to fluctuating stimuli is largely linear and can be described by a biphasic temporal filter and dynamical model. A calcium channel mutation alters temporal filtering and avoidance behaviors initiated by ASH on similar timescales. A sensory G-alpha protein mutation affects temporal filtering in AWC and alters steering behavior in a way that supports an active sensing model for chemotaxis. Thus, temporal features of sensory neurons can be propagated across circuits to specify behavioral dynamics.
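The "largely linear" description amounts to convolving the odor time course with a biphasic temporal filter. A minimal sketch, with an assumed difference-of-exponentials kernel whose time constants are invented rather than fitted to the paper's data:

```python
import numpy as np

def biphasic_filter(t, tau_fast=0.3, tau_slow=1.5):
    """Difference-of-exponentials kernel: a fast positive lobe followed by a
    slower negative lobe (time constants are invented, not fitted)."""
    return np.exp(-t / tau_fast) / tau_fast - np.exp(-t / tau_slow) / tau_slow

dt = 0.05                                 # 50-ms time step (assumed)
t = np.arange(0, 6, dt)
kernel = biphasic_filter(t)

# A rapidly fluctuating odor sequence (on/off) and the predicted linear response;
# a static output nonlinearity or threshold could be added on top of this.
stimulus = (np.random.rand(400) > 0.5).astype(float)
response = np.convolve(stimulus, kernel)[:stimulus.size] * dt
```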
Affiliation(s)
- Saul Kato
- Department of Neuroscience and Department of Physiology and Cellular Biophysics, Columbia University College of Physicians and Surgeons, New York, NY 10032, USA
| | - Yifan Xu
- Howard Hughes Medical Institute, The Rockefeller University, New York, NY 10065, USA
| | - Christine E Cho
- Howard Hughes Medical Institute, The Rockefeller University, New York, NY 10065, USA
| | - L F Abbott
- Department of Neuroscience and Department of Physiology and Cellular Biophysics, Columbia University College of Physicians and Surgeons, New York, NY 10032, USA.
| | - Cornelia I Bargmann
- Howard Hughes Medical Institute, The Rockefeller University, New York, NY 10065, USA.
32
Morrison I, Perini I, Dunham J. Facets and mechanisms of adaptive pain behavior: predictive regulation and action. Front Hum Neurosci 2013; 7:755. [PMID: 24348358 PMCID: PMC3842910 DOI: 10.3389/fnhum.2013.00755] [Citation(s) in RCA: 46] [Impact Index Per Article: 3.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/21/2013] [Accepted: 10/21/2013] [Indexed: 12/30/2022] Open
Abstract
Neural mechanisms underlying nociception and pain perception are considered to serve the ultimate goal of limiting tissue damage. However, since pain usually occurs in complex environments and situations that call for elaborate control over behavior, simple avoidance is insufficient to explain a range of mammalian pain responses, especially in the presence of competing goals. In this integrative review we propose a Predictive Regulation and Action (PRA) model of acute pain processing. It emphasizes evidence that the nervous system is organized to anticipate potential pain and to adjust behavior before the risk of tissue damage becomes critical. Regulatory processes occur on many levels, and can be dynamically influenced by local interactions or by modulation from other brain areas in the network. The PRA model centers on neural substrates supporting the predictive nature of pain processing, as well as on finely-calibrated yet versatile regulatory processes that ultimately affect behavior. We outline several operational categories of pain behavior, from spinally-mediated reflexes to adaptive voluntary action, situated at various neural levels. An implication is that neural processes that track potential tissue damage in terms of behavioral consequences are an integral part of pain perception.
Affiliation(s)
- India Morrison
- Department of Clinical Neurophysiology, Sahlgrenska University Hospital, Gothenburg, Sweden; Institute of Neuroscience and Physiology, University of Gothenburg, Gothenburg, Sweden; Department of Cognitive Neuroscience and Philosophy, University of Skövde, Skövde, Sweden
| | - Irene Perini
- Department of Clinical Neurophysiology, Sahlgrenska University Hospital, Gothenburg, Sweden; Institute of Neuroscience and Physiology, University of Gothenburg, Gothenburg, Sweden
| | - James Dunham
- Department of Clinical Neurophysiology, Sahlgrenska University Hospital, Gothenburg, Sweden
33
Yu C, Horev G, Rubin N, Derdikman D, Haidarliu S, Ahissar E. Coding of object location in the vibrissal thalamocortical system. Cereb Cortex 2015; 25:563-77. [PMID: 24062318 DOI: 10.1093/cercor/bht241] [Citation(s) in RCA: 24] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/14/2022]
Abstract
In whisking rodents, object location is encoded at the receptor level by a combination of motor and sensory related signals. Recoding of the encoded signals can result in various forms of internal representations. Here, we examined the coding schemes occurring at the first forebrain level that receives inputs necessary for generating such internal representations--the thalamocortical network. Single units were recorded in 8 thalamic and cortical stations in artificially whisking anesthetized rats. Neuronal representations of object location generated across these stations and expressed in response latency and magnitude were classified based on graded and binary coding schemes. Both graded and binary coding schemes occurred across the entire thalamocortical network, with a general tendency of graded-to-binary transformation from thalamus to cortex. Overall, 63% of the neurons of the thalamocortical network coded object position in their firing. Thalamocortical responses exhibited a slow dynamics during which the amount of coded information increased across 4-5 whisking cycles and then stabilized. Taken together, the results indicate that the thalamocortical network contains dynamic mechanisms that can converge over time on multiple coding schemes of object location, schemes which essentially transform temporal coding to rate coding and gradual to labeled-line coding.
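The graded-versus-binary distinction can be illustrated with a crude rule: sort a cell's mean responses across object positions, split them at the largest gap, and call the coding binary when the two resulting levels account for nearly all of the variance. The threshold and the rule itself are illustrative assumptions, not the paper's classification procedure.

```python
import numpy as np

def classify_coding(responses, ratio_threshold=0.15):
    """Crudely label a position tuning curve as 'binary' or 'graded'.

    Split the sorted responses at the largest gap; if within-group variance is
    small relative to total variance, treat the cell as binary (two response
    levels), otherwise as graded. The threshold is an illustrative choice.
    """
    r = np.sort(np.asarray(responses, dtype=float))
    split = np.argmax(np.diff(r)) + 1                 # largest jump separates the two levels
    within = np.var(r[:split]) * split + np.var(r[split:]) * (r.size - split)
    total = np.var(r) * r.size
    return "binary" if within / total < ratio_threshold else "graded"

print(classify_coding([0.1, 0.1, 0.9, 0.9, 0.9]))     # -> binary
print(classify_coding([0.1, 0.3, 0.5, 0.7, 0.9]))     # -> graded
```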
Affiliation(s)
- Chunxiu Yu
- Current address: Department of Psychology and Neuroscience, Center for Cognitive Neuroscience, Duke University, Durham, NC 27708, USA
| | - Guy Horev
- Current address: Cold Spring Harbor Laboratory, Cold Spring Harbor, NY 11724, USA
| | - Naama Rubin
- Department of Neurobiology, Weizmann Institute of Science, Rehovot 76100, Israel
| | - Dori Derdikman
- Department of Neurobiology, Weizmann Institute of Science, Rehovot 76100, Israel
| | - Sebastian Haidarliu
- Department of Neurobiology, Weizmann Institute of Science, Rehovot 76100, Israel
| | - Ehud Ahissar
- Department of Neurobiology, Weizmann Institute of Science, Rehovot 76100, Israel
34
Pre-neuronal morphological processing of object location by individual whiskers. Nat Neurosci 2013; 16:622-31. [PMID: 23563582 DOI: 10.1038/nn.3378] [Citation(s) in RCA: 76] [Impact Index Per Article: 6.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/15/2012] [Accepted: 03/11/2013] [Indexed: 11/08/2022]
Abstract
In the vibrissal system, touch information is conveyed by a receptorless whisker hair to follicle mechanoreceptors, which then provide input to the brain. We examined whether any processing, that is, meaningful transformation, occurs in the whisker itself. Using high-speed videography and tracking the movements of whiskers in anesthetized and behaving rats, we found that whisker-related morphological phase planes, based on angular and curvature variables, can represent the coordinates of object position after contact in a reliable manner, consistent with theoretical predictions. By tracking exposed follicles, we found that the follicle-whisker junction is rigid, which enables direct readout of whisker morphological coding by mechanoreceptors. Finally, we found that our behaving rats pushed their whiskers against objects during localization in a way that induced meaningful morphological coding and, in parallel, improved their localization performance, which suggests a role for pre-neuronal morphological computation in active vibrissal touch.
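The claim that the (angle, curvature) phase plane represents object coordinates suggests a simple readout: store labeled contact points from calibration contacts and decode a new contact by its nearest neighbor in the phase plane. The sketch below illustrates this readout idea with invented numbers; it is not the paper's mechanical model, and in practice the two axes would need to be normalized to comparable scales.

```python
import numpy as np

def fit_phase_plane_decoder(angles, curvatures, radial_distances):
    """Store labeled (angle, curvature) contact points for nearest-neighbor readout."""
    pts = np.column_stack([angles, curvatures])
    return pts, np.asarray(radial_distances)

def decode(pts, labels, angle, curvature):
    """Read out object distance from where a new contact falls in the phase plane."""
    d2 = np.sum((pts - np.array([angle, curvature])) ** 2, axis=1)
    return labels[np.argmin(d2)]

# Toy calibration contacts (angle in deg, curvature change in 1/mm, distance in mm).
pts, labels = fit_phase_plane_decoder([10, 10, 20, 20], [0.02, 0.08, 0.04, 0.16], [40, 10, 40, 10])
print(decode(pts, labels, angle=19, curvature=0.15))   # -> 10 (contact near the whisker base)
```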
35
Hsu B, Hsieh CH, Yu SN, Ahissar E, Arieli A, Zilbershtain-Kra Y. A tactile vision substitution system for the study of active sensing. Annu Int Conf IEEE Eng Med Biol Soc 2013; 2013:3206-3209. [PMID: 24110410 DOI: 10.1109/embc.2013.6610223] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/02/2023]
Abstract
This paper presents a tactile vision substitution system (TVSS) for the study of active sensing. Two algorithms, image processing and trajectory tracking, were developed to enhance the capability of conventional TVSS. The image-processing algorithm reduces artifacts, extracts important features from the active camera's images, and converts this information into tactile stimuli at a much lower resolution. A fixed camera records the movement of the active camera, and the trajectory-tracking algorithm analyzes the active sensing strategy that TVSS users apply when exploring the environment. The image-processing subsystem improved the extraction of object features and thereby recognition. The trajectory-tracking subsystem accurately located the portion of the scene pointed at by the active camera, providing rich information for studying the active sensing strategies of TVSS users.
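The trajectory-tracking step, locating the portion of the fixed camera's scene that the active camera is pointing at, can be sketched as an exhaustive normalized cross-correlation template match. This is a generic stand-in for the paper's tracking algorithm; frame sizes and the matching criterion are assumptions.

```python
import numpy as np

def locate_view(fixed_frame, active_frame):
    """Find where the active camera's (smaller) view lies inside the fixed camera's
    frame by exhaustive normalized cross-correlation."""
    fh, fw = fixed_frame.shape
    ah, aw = active_frame.shape
    a = (active_frame - active_frame.mean()) / (active_frame.std() + 1e-9)
    best, best_pos = -np.inf, (0, 0)
    for y in range(fh - ah + 1):
        for x in range(fw - aw + 1):
            patch = fixed_frame[y:y + ah, x:x + aw]
            p = (patch - patch.mean()) / (patch.std() + 1e-9)
            score = np.mean(a * p)          # correlation of the normalized patches
            if score > best:
                best, best_pos = score, (y, x)
    return best_pos                          # top-left corner of the matched region

scene = np.random.rand(60, 80)
print(locate_view(scene, scene[20:36, 30:54]))   # -> (20, 30)
```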
36
Ahissar E, Arieli A. Seeing via Miniature Eye Movements: A Dynamic Hypothesis for Vision. Front Comput Neurosci 2012; 6:89. [PMID: 23162458 PMCID: PMC3492788 DOI: 10.3389/fncom.2012.00089] [Citation(s) in RCA: 63] [Impact Index Per Article: 4.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/09/2012] [Accepted: 10/05/2012] [Indexed: 11/20/2022] Open
Abstract
During natural viewing, the eyes are never still. Even during fixation, miniature movements of the eyes move the retinal image across tens of foveal photoreceptors. Most theories of vision implicitly assume that the visual system ignores these movements and somehow overcomes the resulting smearing. However, evidence has accumulated to indicate that fixational eye movements cannot be ignored by the visual system if fine spatial details are to be resolved. We argue that the only way the visual system can achieve its high resolution given its fixational movements is by seeing via these movements. Seeing via eye movements also eliminates the instability of the image, which would be induced by them otherwise. Here we present a hypothesis for vision, in which coarse details are spatially encoded in gaze-related coordinates, and fine spatial details are temporally encoded in relative retinal coordinates. The temporal encoding presented here achieves its highest resolution by encoding along the elongated axes of simple-cell receptive fields and not across these axes as suggested by spatial models of vision. According to our hypothesis, fine details of shape are encoded by inter-receptor temporal phases, texture by instantaneous intra-burst rates of individual receptors, and motion by inter-burst temporal frequencies. We further describe the ability of the visual system to readout the encoded information and recode it internally. We show how reading out of retinal signals can be facilitated by neuronal phase-locked loops (NPLLs), which lock to the retinal jitter; this locking enables recoding of motion information and temporal framing of shape and texture processing. A possible implementation of this locking-and-recoding process by specific thalamocortical loops is suggested. Overall it is suggested that high-acuity vision is based primarily on temporal mechanisms of the sort presented here and low-acuity vision is based primarily on spatial mechanisms.
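The proposed locking of neuronal phase-locked loops to retinal jitter can be illustrated with a generic discrete-time PLL: predict the next input event, and use the timing error both to correct the prediction and to adapt the internal period. The gain, the event statistics, and the scalar error signal below are illustrative assumptions, not the specific NPLL circuitry proposed here.

```python
import numpy as np

def phase_locked_loop(event_times, period_guess, gain=0.3):
    """Minimal discrete phase-locked loop.

    Predict the next input event; the timing error (a rate-like correction
    signal) adapts the internal period, and the prediction is re-anchored to
    each observed event, giving predictive locking to future modulations.
    """
    period = period_guess
    prediction = event_times[0] + period
    errors = []
    for t in event_times[1:]:
        err = t - prediction          # how early or late the input arrived
        period += gain * err          # adapt the internal oscillator's period
        prediction = t + period       # predict the next modulation
        errors.append(err)
    return np.array(errors), period

# Jittery ~100-ms rhythm (e.g., a drift-induced modulation; numbers invented).
times = np.cumsum(0.1 + 0.005 * np.random.randn(50))
errors, locked_period = phase_locked_loop(times, period_guess=0.08)
# |errors| shrinks over the first few cycles as the loop locks;
# locked_period converges to about 0.1 s.
```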
Affiliation(s)
- Ehud Ahissar
- Department of Neurobiology, Weizmann Institute of Science, Rehovot, Israel