1. Wang J, Lin S, Liu A. Bioinspired Perception and Navigation of Service Robots in Indoor Environments: A Review. Biomimetics (Basel) 2023; 8:350. PMID: 37622955; PMCID: PMC10452487; DOI: 10.3390/biomimetics8040350.
Abstract
Biological principles attract attention in service robotics because robots face challenges similar to those of animals when performing various tasks. Bioinspired perception, modeled on how animals sense their environment, is significant for robotic perception. This paper reviews the bioinspired perception and navigation of service robots in indoor environments, a popular application of civilian robotics. The navigation approaches are classified by perception type: vision-based, remote-sensing, tactile, olfactory, sound-based, inertial, and multimodal navigation. State-of-the-art techniques are moving towards multimodal navigation, which combines several of these approaches. The main challenges in indoor navigation are precise localization and coping with dynamic, complex environments containing moving objects and people.
Affiliation(s)
- Jianguo Wang
- Faculty of Engineering and Information Technology, University of Technology Sydney, Sydney, NSW 2007, Australia
- Shiwei Lin
- Faculty of Engineering and Information Technology, University of Technology Sydney, Sydney, NSW 2007, Australia
- Ang Liu
- Faculty of Engineering and Information Technology, University of Technology Sydney, Sydney, NSW 2007, Australia

2. Skelton PSM, Finn A, Brinkworth RSA. Contrast independent biologically inspired translational optic flow estimation. Biological Cybernetics 2022; 116:635-660. PMID: 36303043; PMCID: PMC9691503; DOI: 10.1007/s00422-022-00948-3.
Abstract
The visual systems of insects are relatively simple compared to those of humans, yet they enable navigation through complex environments in which insects achieve exceptional levels of obstacle avoidance. Biology uses two separable modes of optic flow to achieve this: rapid gaze fixation (rotational motion, known as saccades) and inter-saccadic translational motion. The fundamental process underlying insect optic flow estimation has been known since the 1950s, and so has its dependence on contrast; the surrounding visual pathways used to overcome such environmental dependencies are less well understood. Previous work has shown promise for low-speed rotational motion estimation, but a gap remained in the estimation of translational motion, in particular the estimation of time to impact. To consistently estimate time to impact during inter-saccadic translatory motion, the fundamental limitation of contrast dependence must be overcome. By adapting an elaborated rotational velocity estimator from the literature to translational motion, this paper proposes a novel algorithm that overcomes the contrast dependence of time-to-impact estimation using nonlinear spatio-temporal feedforward filtering. By applying bioinspired processes, approximately 15 points per decade of statistical discrimination were achieved when estimating the time to impact to a target across 360 combinations of background, distance, and velocity: a 17-fold increase over the fundamental process. These results show that the contrast dependence of time-to-impact estimation can be overcome in a biologically plausible manner. Combined with previous results for low-speed rotational motion estimation, this allows contrast-invariant computational models designed on the principles found in the biological visual system, paving the way for future visually guided systems.
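The geometric core of time-to-impact estimation can be stated independently of the elaborated filtering stages the paper develops. Below is a minimal Python sketch (an idealized, noise-free illustration with made-up numbers, not the authors' algorithm) of the classic "tau" relation: under pure translation towards a surface, the bearing of a feature relative to the focus of expansion, divided by its angular velocity, approximates the time to impact.

```python
import numpy as np

# Idealized 'tau' estimate from translational optic flow (small-angle
# approximation; hypothetical scenario, not the paper's filtering method).
v, d0, x, dt = 2.0, 10.0, 1.0, 0.01   # speed, start distance, lateral offset
t = np.arange(0.0, 2.0, dt)
d = d0 - v * t                        # remaining distance to the surface
theta = np.arctan2(x, d)              # bearing off the focus of expansion
theta_dot = np.gradient(theta, dt)    # angular velocity (the local optic flow)
tau_est = theta / theta_dot           # estimated time to impact
tau_true = d / v                      # ground truth
print(tau_est[50], tau_true[50])      # estimate tracks truth while theta is small
```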
Affiliation(s)
- Phillip S. M. Skelton
- Centre for Defence Engineering Research and Training, College of Science and Engineering, Flinders University, 1284 South Road, Tonsley, South Australia 5042 Australia
- Anthony Finn
- Science, Technology, Engineering, and Mathematics, University of South Australia, 1 Mawson Lakes Boulevard, Mawson Lakes, South Australia 5095 Australia
- Russell S. A. Brinkworth
- Centre for Defence Engineering Research and Training, College of Science and Engineering, Flinders University, 1284 South Road, Tonsley, South Australia 5042 Australia

3. Li J, Niemeier M, Kern R, Egelhaaf M. Disentangling of Local and Wide-Field Motion Adaptation. Front Neural Circuits 2021; 15:713285. PMID: 34531728; PMCID: PMC8438216; DOI: 10.3389/fncir.2021.713285.
Abstract
In flying insects, motion adaptation has been attributed a pivotal functional role in spatial vision based on optic flow. Ongoing motion enhances the representation of spatial discontinuities in the visual pathway; these manifest themselves as velocity discontinuities in the retinal optic flow pattern during translational locomotion. There is evidence for different spatial scales of motion adaptation at the different visual processing stages. Motion adaptation is thought to take place, on the one hand, on a retinotopic basis at the level of local motion-detecting neurons and, on the other hand, at the level of wide-field neurons pooling the output of many of these local motion detectors. So far, local and wide-field adaptation could not be analyzed separately, since conventional motion stimuli jointly affect both adaptive processes. We therefore designed a novel stimulus paradigm based on two types of motion stimuli that had the same overall strength but differed in that one led to local motion adaptation while the other did not. We recorded intracellularly the activity of a particular wide-field motion-sensitive neuron, the horizontal system equatorial cell (HSE), in blowflies. The experimental data were interpreted based on a computational model of the visual motion pathway that included the spatially pooling HSE cell. By comparing the differences between the recorded and modeled HSE-cell responses induced by the two types of motion adaptation, the major characteristics of local and wide-field adaptation could be pinpointed. Wide-field adaptation was shown to depend strongly on the activation level of the cell and, thus, on the direction of motion. In contrast, the response gain is reduced by local motion adaptation to a similar extent regardless of the direction of motion. This direction-independent adaptation differs fundamentally from the well-known adaptive adjustment of response gain to the prevailing overall stimulus level, which is considered essential for an efficient signal representation by neurons with a limited operating range. Direction-independent adaptation is argued to result from the joint activity of local motion-sensitive neurons with different preferred directions, leading to a representation of the local motion direction that is independent of the overall direction of global motion.
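Why conventional stimuli confound the two processes can be made concrete with a toy cascade: any sustained motion input drives both a retinotopic adaptation state in each local detector and an activation-dependent state in the pooling wide-field unit, so their effects superimpose in the recorded response. A minimal Python sketch under these assumptions (illustrative only; not the paper's fitted model of the HSE cell):

```python
import numpy as np

# Toy two-stage adaptation cascade (illustrative; not the paper's fitted
# HSE-cell model). Sustained stimulation drives both stages at once,
# which is why conventional stimuli cannot separate them.

def run(stimulus, tau_local=50.0, tau_wide=200.0, dt=1.0):
    n_steps, n_detectors = stimulus.shape
    a_local = np.zeros(n_detectors)   # per-detector (retinotopic) adaptation
    a_wide = 0.0                      # adaptation of the pooling unit
    out = np.zeros(n_steps)
    for i in range(n_steps):
        local = stimulus[i] / (1.0 + a_local)                  # adapted signals
        a_local += dt / tau_local * (np.abs(local) - a_local)  # sign-independent
        pooled = local.sum()                                   # spatial pooling
        out[i] = pooled / (1.0 + a_wide)
        # the wide-field state tracks the cell's own signed activation,
        # making this stage direction-dependent, unlike the local one
        a_wide += dt / tau_wide * (max(pooled, 0.0) - a_wide)
    return out

rng = np.random.default_rng(0)
stim = rng.random((1000, 20))   # sustained wide-field motion input
resp = run(stim)
print(resp[:3], resp[-3:])      # response declines as both stages adapt
```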
Affiliation(s)
- Jinglin Li
- Neurobiology, Bielefeld University, Bielefeld, Germany
- Roland Kern
- Neurobiology, Bielefeld University, Bielefeld, Germany

4. Meyer HG, Klimeck D, Paskarbeit J, Rückert U, Egelhaaf M, Porrmann M, Schneider A. Resource-efficient bio-inspired visual processing on the hexapod walking robot HECTOR. PLoS One 2020; 15:e0230620. PMID: 32236111; PMCID: PMC7112198; DOI: 10.1371/journal.pone.0230620.
Abstract
Emulating the highly resource-efficient processing of visual motion information in the brains of flying insects, a bio-inspired controller for collision avoidance and navigation was implemented on a novel, integrated System-on-Chip-based hardware module. The hardware module is used to control the visually guided navigation behavior of the stick-insect-like hexapod robot HECTOR. By leveraging highly parallelized bio-inspired algorithms to extract nearness information from visual motion in dynamically reconfigurable logic, HECTOR is able to navigate to predefined goal positions without colliding with obstacles. The system drastically outperforms CPU- and graphics-card-based implementations in terms of speed and resource efficiency, making it suitable also for fast-moving robots, such as flying drones.
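As a rough illustration of how nearness extracted from optic flow can drive steering, here is a minimal Python sketch of a common bio-inspired scheme (hypothetical parameters and gains; not HECTOR's actual controller): during translation, apparent image motion scales with nearness, so the azimuthal distribution of flow magnitudes serves as a proximity map to steer away from.

```python
import numpy as np

def steering_command(flow_mag, azimuth, goal_dir, k_avoid=1.5):
    """Blend goal attraction with obstacle repulsion.
    flow_mag: per-azimuth optic-flow magnitude (nearness proxy under translation)
    azimuth:  viewing directions in radians (0 = straight ahead)
    goal_dir: goal direction in radians; returns a turn command in radians."""
    weights = flow_mag / (flow_mag.sum() + 1e-9)
    # center of mass of nearness: the average direction of close objects
    obstacle_dir = np.arctan2((weights * np.sin(azimuth)).sum(),
                              (weights * np.cos(azimuth)).sum())
    # steer towards the goal, away from obstacles, scaled by overall nearness
    return goal_dir - k_avoid * flow_mag.mean() * obstacle_dir

azimuth = np.linspace(-np.pi / 2, np.pi / 2, 61)
flow = np.exp(-((azimuth - 0.4) ** 2) / 0.05)     # obstacle ahead and to the right
print(steering_command(flow, azimuth, goal_dir=0.0))  # negative command: veer left
```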
Affiliation(s)
- Hanno Gerd Meyer
- Research Group Biomechatronics, CITEC, Bielefeld University, Bielefeld, Germany
- Department of Neurobiology and CITEC, Bielefeld University, Bielefeld, Germany
- Biomechatronics and Embedded Systems Group, Faculty of Engineering and Mathematics, University of Applied Sciences, Bielefeld, Germany
- Daniel Klimeck
- Cognitronics and Sensor Systems Group, CITEC, Bielefeld University, Bielefeld, Germany
- Jan Paskarbeit
- Research Group Biomechatronics, CITEC, Bielefeld University, Bielefeld, Germany
- Ulrich Rückert
- Cognitronics and Sensor Systems Group, CITEC, Bielefeld University, Bielefeld, Germany
- Martin Egelhaaf
- Department of Neurobiology and CITEC, Bielefeld University, Bielefeld, Germany
- Mario Porrmann
- Computer Engineering Group, Osnabrück University, Osnabrück, Germany
- Axel Schneider
- Research Group Biomechatronics, CITEC, Bielefeld University, Bielefeld, Germany
- Biomechatronics and Embedded Systems Group, Faculty of Engineering and Mathematics, University of Applied Sciences, Bielefeld, Germany

5. Spatial Encoding of Translational Optic Flow in Planar Scenes by Elementary Motion Detector Arrays. Sci Rep 2018; 8:5821. PMID: 29643402; PMCID: PMC5895815; DOI: 10.1038/s41598-018-24162-z.
Abstract
Elementary Motion Detectors (EMDs) are well-established models of visual motion estimation in insects. The responses of EMDs are tuned to specific temporal and spatial frequencies of the input stimuli, which matches the behavioural response of insects to wide-field image rotation, called the optomotor response. However, other behaviours, such as speed and position control, cannot be fully accounted for by EMDs, because these behaviours are largely unaffected by image properties and appear to be controlled by the ratio between flight speed and the distance to an object, defined here as relative nearness. We present a method that resolves this inconsistency by extracting an unambiguous estimate of relative nearness from the output of an EMD array. Our method is suited to estimating relative nearness in planar scenes, such as when flying above the ground or beside large flat objects. We demonstrate closed-loop control of the lateral position and forward velocity of a simulated agent flying in a corridor. This finding may explain how insects can measure relative nearness and control their flight despite the frequency tuning of EMDs. Our method also provides engineers with a relative nearness estimation technique that benefits from the low computational cost of EMDs.
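For readers unfamiliar with the model class, a minimal correlation-type EMD (Hassenstein-Reichardt detector) over a 1D receptor array can be sketched in a few lines of Python (an illustration of the detector model only, not the paper's nearness-extraction method):

```python
import numpy as np

def emd_array(frames, tau=5.0, dt=1.0):
    """Correlation-type EMDs over a 1D photoreceptor array.
    frames: (time, n_receptors) luminance samples.
    Returns (time, n_receptors - 1) opponent detector responses."""
    alpha = dt / (tau + dt)                  # first-order low-pass (the 'delay')
    lp = np.zeros(frames.shape[1])
    out = np.zeros((frames.shape[0], frames.shape[1] - 1))
    for i, f in enumerate(frames):
        lp += alpha * (f - lp)               # delayed copy of every input
        # preferred-direction correlation minus null-direction correlation
        out[i] = lp[:-1] * f[1:] - lp[1:] * f[:-1]
    return out

# A sinusoidal grating drifting rightward across 64 receptors:
n, steps = 64, 400
x = np.arange(n)
frames = np.array([np.sin(2 * np.pi * (x - 0.3 * i) / 16.0) for i in range(steps)])
resp = emd_array(frames)
print(resp[100:].mean())   # positive mean: rightward motion detected
```

Because each opponent output correlates a delayed signal with its neighbour, the mean response is direction-selective but tuned to temporal frequency rather than velocity, which is exactly the ambiguity the paper's relative-nearness method resolves.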

6. Li J, Lindemann JP, Egelhaaf M. Local motion adaptation enhances the representation of spatial structure at EMD arrays. PLoS Comput Biol 2017; 13:e1005919. PMID: 29281631; PMCID: PMC5760083; DOI: 10.1371/journal.pcbi.1005919.
Abstract
Neuronal representation and extraction of spatial information are essential for behavioral control. For flying insects, a plausible way to gain spatial information is to exploit the distance-dependent optic flow that is generated during translational self-motion. Optic flow is computed by arrays of local motion detectors retinotopically arranged in the second neuropile layer of the insect visual system. These motion detectors have adaptive response characteristics: their responses to motion with a constant or only slowly changing velocity decrease, while their sensitivity to rapid velocity changes is maintained or even increases. Using a modeling approach, we analyzed how motion adaptation affects signal representation at the output of arrays of motion detectors during simulated flight in artificial and natural 3D environments. We focused on translational flight, because spatial information is contained only in the optic flow induced by translational locomotion. Indeed, flies, bees and other insects segregate their flight into relatively long intersaccadic translational flight sections interspersed with brief and rapid saccadic turns, presumably to maximize the periods of translation (80% of the flight). With a novel adaptive model of the insect visual motion pathway, we show that the motion detector responses to background structures of cluttered environments are largely attenuated as a consequence of motion adaptation, while responses to foreground objects stay constant or even increase. This conclusion holds even under the dynamic flight conditions of insects.
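The attenuation effect the model demonstrates can be caricatured with a single adaptive gain stage downstream of each local detector (a toy illustration, not the paper's adaptive EMD model): sustained responses to the background adapt down, while a transient response to a suddenly nearby object retains most of its gain.

```python
import numpy as np

def adapt(signal, tau=100.0, dt=1.0):
    """Divisive adaptive gain: the state tracks sustained activity."""
    state, out = 0.0, np.zeros_like(signal)
    for i, s in enumerate(signal):
        out[i] = s / (1.0 + state)             # adapted detector output
        state += dt / tau * (abs(s) - state)   # slow activity estimate
    return out

background = np.ones(1000)   # sustained constant-velocity background response
scene = background.copy()
scene[600:650] += 2.0        # nearby object: brief nearness-driven transient
print(adapt(background)[600:650].mean(), adapt(scene)[600:650].mean())
# The steady background response has adapted down, while the transient
# passes at nearly full gain, enhancing the object's relative salience.
```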
Affiliation(s)
- Jinglin Li
- Department of Neurobiology and Cluster of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, Bielefeld, Germany
- Jens P. Lindemann
- Department of Neurobiology and Cluster of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, Bielefeld, Germany
- Martin Egelhaaf
- Department of Neurobiology and Cluster of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, Bielefeld, Germany