1. Skelton PSM, Finn A, Brinkworth RSA. Contrast independent biologically inspired translational optic flow estimation. Biological Cybernetics 2022; 116:635-660. [PMID: 36303043] [PMCID: PMC9691503] [DOI: 10.1007/s00422-022-00948-3] [Received: 10/19/2021] [Accepted: 10/11/2022]
Abstract
The visual systems of insects are relatively simple compared to those of humans, yet they enable navigation through complex environments in which insects achieve exceptional levels of obstacle avoidance. Biology uses two separable modes of optic flow to achieve this: rapid gaze shifts (rotational motion known as saccades) and inter-saccadic translational motion. While the fundamental process of insect optic flow estimation has been known since the 1950s, so too has its dependence on contrast. The surrounding visual pathways used to overcome environmental dependencies are less well understood. Previous work has shown promise for low-speed rotational motion estimation, but a gap remained in the estimation of translational motion, in particular the estimation of time to impact. To consistently estimate time to impact during inter-saccadic translatory motion, the fundamental limitation of contrast dependence must be overcome. By adapting an elaborated rotational velocity estimator from the literature to translational motion, this paper proposes a novel algorithm that overcomes the contrast dependence of time-to-impact estimation using nonlinear spatio-temporal feedforward filtering. By applying bioinspired processes, approximately 15 points per decade of statistical discrimination were achieved when estimating the time to impact with a target across 360 background, distance, and velocity combinations: a 17-fold increase over the fundamental process. These results show that the contrast dependence of time-to-impact estimation can be overcome in a biologically plausible manner. Combined with previous results for low-speed rotational motion estimation, this allows for contrast-invariant computational models designed on the principles found in the biological visual system, paving the way for future visually guided systems.
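The contrast dependence referred to above stems from the correlation-type elementary motion detector (the Hassenstein-Reichardt model) underlying the "fundamental process" of insect motion vision. A minimal sketch (signal shapes and delay parameter are illustrative assumptions, not this paper's implementation) shows why: the correlator's output grows with the square of stimulus contrast.

```python
import numpy as np

def emd_response(signal_a, signal_b, delay=1):
    """Correlation-type elementary motion detector (Hassenstein-Reichardt).

    signal_a, signal_b: luminance time series from two neighbouring
    photoreceptors; delay: delay-filter lag in samples (illustrative).
    Returns the time-averaged opponent correlation. Note the output
    scales with the *square* of stimulus contrast, which is the
    contrast dependence discussed in the abstract.
    """
    a_delayed = np.roll(signal_a, delay)
    b_delayed = np.roll(signal_b, delay)
    # Opponent subtraction of the two mirror-symmetric correlators
    response = a_delayed * signal_b - b_delayed * signal_a
    # Drop the wrapped-around samples introduced by np.roll
    return response[delay:].mean()

# A drifting sinusoid seen by two receptors with a small phase offset:
t = np.linspace(0, 10 * np.pi, 1000)
for contrast in (0.5, 1.0):
    r = emd_response(contrast * np.sin(t), contrast * np.sin(t - 0.3))
    print(contrast, r)  # doubling contrast roughly quadruples the response
```

Reversing the two inputs flips the sign of the response, which is how the detector encodes motion direction.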
Affiliation(s)
- Phillip S. M. Skelton
- Centre for Defence Engineering Research and Training, College of Science and Engineering, Flinders University, 1284 South Road, Tonsley, South Australia 5042, Australia
- Anthony Finn
- Science, Technology, Engineering, and Mathematics, University of South Australia, 1 Mawson Lakes Boulevard, Mawson Lakes, South Australia 5095, Australia
- Russell S. A. Brinkworth
- Centre for Defence Engineering Research and Training, College of Science and Engineering, Flinders University, 1284 South Road, Tonsley, South Australia 5042, Australia
2. Ohradzansky MT, Humbert JS. Lidar-Based Navigation of Subterranean Environments Using Bio-Inspired Wide-Field Integration of Nearness. Sensors (Basel) 2022; 22:849. [PMID: 35161595] [PMCID: PMC8840438] [DOI: 10.3390/s22030849] [Received: 10/31/2021] [Revised: 01/17/2022] [Accepted: 01/18/2022]
Abstract
Navigating unknown environments is an ongoing challenge in robotics. Processing large amounts of sensor data to maintain localization, maps of the environment, and sensible paths can result in high compute loads and lower maximum vehicle speeds. This paper presents a bio-inspired algorithm for efficiently processing depth measurements to achieve fast navigation of unknown subterranean environments. Animals have developed efficient sensorimotor convergence approaches, allowing for rapid processing of large numbers of spatially distributed measurements into signals relevant for the different behavioral responses necessary to their survival. Using a spatial inner product to model this sensorimotor convergence principle, environmentally relative states critical to navigation are extracted from spatially distributed depth measurements using derived weighting functions. These states are then applied as feedback to control a simulated quadrotor platform, enabling autonomous navigation in subterranean environments. The resulting outer-loop velocity controller is demonstrated both in a generalized subterranean environment, represented by an infinite cylinder, and in nongeneralized environments such as tunnels and caves.
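The spatial inner-product idea can be sketched in a few lines: a ring of nearness (inverse-distance) measurements is projected onto a weighting function, collapsing hundreds of depth readings into a single scalar feedback state. The cylinder scene model and the first-harmonic sine weight below are illustrative assumptions, not the paper's derived weighting functions.

```python
import numpy as np

theta = np.linspace(-np.pi, np.pi, 720, endpoint=False)  # lidar bearings

def wfi_state(nearness, weight_values):
    """Wide-field integration: spatial inner product of a ring of
    nearness measurements (1/distance at each bearing) with a
    weighting function, yielding one scalar state for feedback."""
    return np.sum(nearness * weight_values) * (theta[1] - theta[0])

def nearness_in_cylinder(offset, radius=2.0):
    # Small-offset model of an infinite cylinder: displacing the
    # vehicle by `offset` metres toward theta = +pi/2 shortens the
    # range on that side and lengthens it on the opposite side.
    return 1.0 / (radius - offset * np.sin(theta))

# An odd (sine) weighting extracts a lateral-offset state; choosing
# the first harmonic here is an assumption for illustration.
w_lateral = np.sin(theta)

print(wfi_state(nearness_in_cylinder(0.0), w_lateral))  # ~0 when centred
print(wfi_state(nearness_in_cylinder(0.5), w_lateral))  # > 0 when displaced
```

The sign and magnitude of the extracted state can then drive a velocity controller back toward the corridor centreline.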
Affiliation(s)
- Michael T. Ohradzansky
- Department of Aerospace Engineering Sciences, University of Colorado Boulder, 3775 Discovery Drive, Boulder, CO 80303, USA
- J. Sean Humbert
- Department of Mechanical Engineering, University of Colorado Boulder, 427 UCB, 1111 Engineering Dr, Boulder, CO 80309, USA
3. Bertrand OJN, Doussot C, Siesenop T, Ravi S, Egelhaaf M. Visual and movement memories steer foraging bumblebees along habitual routes. J Exp Biol 2021; 224:269087. [PMID: 34115117] [DOI: 10.1242/jeb.237867] [Received: 09/17/2020] [Accepted: 04/06/2021]
Abstract
One persistent question in animal navigation is how animals follow habitual routes between their home and a food source. Our current understanding of insect navigation suggests an interplay between visual memories, collision avoidance and path integration, the continuous integration of distance and direction travelled. However, these behavioural modules have to be continuously updated with instantaneous visual information. In order to alleviate this need, the insect could learn and replicate habitual movements ('movement memories') around objects (e.g. a bent trajectory around an object) to reach its destination. We investigated whether bumblebees, Bombus terrestris, learn and use movement memories en route to their home. Using a novel experimental paradigm, we habituated bumblebees to establish a habitual route in a flight tunnel containing 'invisible' obstacles. We then confronted them with conflicting cues leading to different choice directions depending on whether they rely on movement or visual memories. The results suggest that they use movement memories to navigate, but also rely on visual memories to solve conflicting situations. We investigated whether the observed behaviour was due to other guidance systems, such as path integration or optic flow-based flight control, and found that neither of these systems was sufficient to explain the behaviour.
Affiliation(s)
- Olivier J N Bertrand
- Department of Neurobiology and Cognitive Interaction Technology Center of Excellence (CITEC), Bielefeld University, D-33501 Bielefeld, Germany
- Charlotte Doussot
- Department of Neurobiology and Cognitive Interaction Technology Center of Excellence (CITEC), Bielefeld University, D-33501 Bielefeld, Germany
- Tim Siesenop
- Department of Neurobiology and Cognitive Interaction Technology Center of Excellence (CITEC), Bielefeld University, D-33501 Bielefeld, Germany
- Sridhar Ravi
- Department of Neurobiology and Cognitive Interaction Technology Center of Excellence (CITEC), Bielefeld University, D-33501 Bielefeld, Germany; School of Engineering, RMIT University, Melbourne, VIC 3083, Australia
- Martin Egelhaaf
- Department of Neurobiology and Cognitive Interaction Technology Center of Excellence (CITEC), Bielefeld University, D-33501 Bielefeld, Germany
4. Ruiz C, Theobald JC. Stabilizing responses to sideslip disturbances in Drosophila melanogaster are modulated by the density of moving elements on the ground. Biol Lett 2021; 17:20200748. [PMID: 33653094] [DOI: 10.1098/rsbl.2020.0748]
Abstract
Stabilizing responses to sideslip disturbances are a critical part of the flight control system in flies. While strongly mediated by mechanoreception, much of the final response results from the wide-field motion detection system associated with vision. In order to be effective, these responses must match the disturbance they are aimed to correct. To do this, flies must estimate the velocity of the disturbance, although it is not known how they accomplish this task when presented with natural images or dot fields. The recent finding, that motion parallax in dot fields can modulate stabilizing responses only if perceived below the fly, raises the question of whether other image statistics are also processed differently between eye regions. One such parameter is the density of elements moving in translational optic flow. Depending on the habitat, there might be strong differences in the density of elements providing information about self-motion above and below the fly, which in turn could act as selective pressures tuning the visual system to process this parameter on a regional basis. By presenting laterally moving dot fields of different densities we found that, in Drosophila melanogaster, the amplitude of the stabilizing response is significantly affected by the number of elements in the field of view. Flies countersteer strongly within a relatively low and narrow range of element densities. But this effect is exclusive to the ventral region of the eye, and dorsal stimuli elicit an unaltered and stereotypical response regardless of the density of elements in the flow. This highlights local specialization of the eye and suggests the lower region may play a more critical role in translational flight stabilization.
Affiliation(s)
- Carlos Ruiz
- Department of Biological Sciences, Florida International University, Miami, FL 33199, USA
- Jamie C Theobald
- Department of Biological Sciences, Florida International University, Miami, FL 33199, USA
5. Baird E, Boeddeker N, Srinivasan MV. The effect of optic flow cues on honeybee flight control in wind. Proc Biol Sci 2021; 288:20203051. [PMID: 33468001] [DOI: 10.1098/rspb.2020.3051]
Abstract
To minimize the risk of colliding with the ground or other obstacles, flying animals need to control both their ground speed and ground height. This task is particularly challenging in wind, where head winds require an animal to increase its airspeed to maintain a constant ground speed and tail winds may generate negative airspeeds, rendering flight more difficult to control. In this study, we investigate how head and tail winds affect flight control in the honeybee Apis mellifera, which is known to rely on the pattern of visual motion generated across the eye-known as optic flow-to maintain constant ground speeds and heights. We find that, when provided with both longitudinal and transverse optic flow cues (in or perpendicular to the direction of flight, respectively), honeybees maintain a constant ground speed but fly lower in head winds and higher in tail winds, a response that is also observed when longitudinal optic flow cues are minimized. When the transverse component of optic flow is minimized, or when all optic flow cues are minimized, the effect of wind on ground height is abolished. We propose that the regular sidewards oscillations that the bees make as they fly may be used to extract information about the distance to the ground, independently of the longitudinal optic flow that they use for ground speed control. This computationally simple strategy could have potential uses in the development of lightweight and robust systems for guiding autonomous flying vehicles in natural environments.
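The proposed oscillation-based strategy can be illustrated with a toy calculation: since the bee's sideways velocity v(t) is self-generated and therefore known, the transverse optic flow from the ground is ω(t) = v(t)/h, so ground height follows as h = v/ω regardless of forward speed. All numbers below are made up for illustration.

```python
import numpy as np

# Sketch of height estimation from sideways oscillation (our reading
# of the abstract; amplitude, frequency, and height are invented):
t = np.linspace(0, 2, 500)                   # seconds
height = 1.5                                 # metres (ground truth)
v_lateral = 0.4 * np.cos(2 * np.pi * 3 * t)  # commanded oscillation, m/s

# Transverse optic flow from the ground seen during the oscillation:
omega = v_lateral / height                   # rad/s

# Recover height where the flow signal is reliable (away from the
# zero crossings of the oscillation), independent of forward speed:
mask = np.abs(omega) > 0.05
h_est = np.median(v_lateral[mask] / omega[mask])
print(h_est)  # recovers ~1.5 m
```

In a real system ω would be measured visually and corrupted by noise, which is why a robust statistic such as the median is used here rather than a single-sample ratio.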
Affiliation(s)
- Emily Baird
- Department of Zoology, Stockholm University, Stockholm, Sweden
- Norbert Boeddeker
- Department of Cognitive Neuroscience, Bielefeld University, Bielefeld, Germany
6. Meyer HG, Klimeck D, Paskarbeit J, Rückert U, Egelhaaf M, Porrmann M, Schneider A. Resource-efficient bio-inspired visual processing on the hexapod walking robot HECTOR. PLoS One 2020; 15:e0230620. [PMID: 32236111] [PMCID: PMC7112198] [DOI: 10.1371/journal.pone.0230620] [Received: 08/21/2019] [Accepted: 03/04/2020]
Abstract
Emulating the highly resource-efficient processing of visual motion information in the brains of flying insects, a bio-inspired controller for collision avoidance and navigation was implemented on a novel, integrated System-on-Chip-based hardware module. The hardware module is used to control the visually guided navigation behavior of the stick-insect-like hexapod robot HECTOR. By leveraging highly parallelized bio-inspired algorithms to extract nearness information from visual motion in dynamically reconfigurable logic, HECTOR is able to navigate to predefined goal positions without colliding with obstacles. The system drastically outperforms CPU- and graphics-card-based implementations in terms of speed and resource efficiency, making it suitable for deployment on fast-moving robots such as flying drones.
Affiliation(s)
- Hanno Gerd Meyer
- Research Group Biomechatronics, CITEC, Bielefeld University, Bielefeld, Germany
- Department of Neurobiology and CITEC, Bielefeld University, Bielefeld, Germany
- Biomechatronics and Embedded Systems Group, Faculty of Engineering and Mathematics, University of Applied Sciences, Bielefeld, Germany
- Daniel Klimeck
- Cognitronics and Sensor Systems Group, CITEC, Bielefeld University, Bielefeld, Germany
- Jan Paskarbeit
- Research Group Biomechatronics, CITEC, Bielefeld University, Bielefeld, Germany
- Ulrich Rückert
- Cognitronics and Sensor Systems Group, CITEC, Bielefeld University, Bielefeld, Germany
- Martin Egelhaaf
- Department of Neurobiology and CITEC, Bielefeld University, Bielefeld, Germany
- Mario Porrmann
- Computer Engineering Group, Osnabrück University, Osnabrück, Germany
- Axel Schneider
- Research Group Biomechatronics, CITEC, Bielefeld University, Bielefeld, Germany
- Biomechatronics and Embedded Systems Group, Faculty of Engineering and Mathematics, University of Applied Sciences, Bielefeld, Germany
7. Huang JV, Wei Y, Krapp HG. A biohybrid fly-robot interface system that performs active collision avoidance. Bioinspiration & Biomimetics 2019; 14:065001. [PMID: 31412322] [DOI: 10.1088/1748-3190/ab3b23]
Abstract
We have designed a biohybrid fly-robot interface (FRI) to study sensorimotor control in insects. The FRI consists of a miniaturized recording platform mounted on a two-wheeled robot and is controlled by the neuronal spiking activity of an identified visual interneuron, the blowfly H1-cell. For a given turning radius of the robot, we found a proportional relationship between the spike rate of the H1-cell and the relative distance of the FRI from the patterned wall of an experimental arena. Under closed-loop conditions, during oscillatory forward movements biased towards the wall, collision avoidance manoeuvres were triggered whenever the H1-cell spike rate exceeded a certain threshold value. We also investigated the FRI's behaviour in the corners of the arena. The ultimate goal is to enable autonomous and energy-efficient manoeuvring of the FRI within arbitrary visual environments.
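The closed-loop rule described above amounts to a simple threshold controller on the H1-cell spike rate. The sketch below is a paraphrase with invented threshold and gain values, not the paper's calibrated parameters.

```python
def avoidance_command(spike_rate, threshold=80.0, gain=0.02):
    """Threshold rule on the H1-cell spike rate (threshold and gain
    here are illustrative, not the FRI's actual calibration): when
    wide-field motion from the nearby patterned wall drives the spike
    rate above threshold, emit a turn-away command; otherwise return
    0.0 and let the forward oscillation continue.
    """
    if spike_rate > threshold:
        # Saturating turn command, stronger the further above threshold
        return min(1.0, gain * (spike_rate - threshold))
    return 0.0

print(avoidance_command(60.0))   # below threshold: no avoidance
print(avoidance_command(130.0))  # well above threshold: saturated turn
```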
Affiliation(s)
- Jiaqi V Huang
- Department of Bioengineering, Imperial College London, London SW7 2AZ, United Kingdom
8. Lecoeur J, Dacke M, Floreano D, Baird E. The role of optic flow pooling in insect flight control in cluttered environments. Sci Rep 2019; 9:7707. [PMID: 31118454] [PMCID: PMC6531491] [DOI: 10.1038/s41598-019-44187-2] [Received: 10/23/2018] [Accepted: 05/07/2019]
Abstract
Flight through cluttered environments, such as forests, poses great challenges for animals and machines alike because even small changes in flight path may lead to collisions with nearby obstacles. When flying along narrow corridors, insects use the magnitude of visual motion experienced in each eye to control their position, height, and speed but it is unclear how this strategy would work when the environment contains nearby obstacles against a distant background. To minimise the risk of collisions, we would expect animals to rely on the visual motion generated by only the nearby obstacles but is this the case? To answer this, we combine behavioural experiments with numerical simulations and provide the first evidence that bumblebees extract the maximum rate of image motion in the frontal visual field to steer away from obstacles. Our findings also suggest that bumblebees use different optic flow calculations to control lateral position, speed, and height.
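The max-pooling strategy the authors identify can be sketched in a few lines: because a nearby obstacle generates much faster image motion than the distant background, taking the maximum flow magnitude in each frontal half-field isolates the obstacle, and the difference between sides yields a steering command. The gain and sign convention below are assumptions.

```python
import numpy as np

def steering_from_flow(flow_left, flow_right, gain=1.0):
    """Max-pooling steering sketch (gain and sign convention assumed):
    take the maximum optic-flow magnitude in each frontal half-field,
    which is dominated by the nearest obstacle rather than the distant
    background, and steer away from the side with the larger maximum.
    """
    left_max = np.max(np.abs(flow_left))
    right_max = np.max(np.abs(flow_right))
    # Positive command = turn right, away from stronger left-side flow
    return gain * (left_max - right_max)

# A nearby obstacle on the left produces one strong local flow peak
# against weak background flow on both sides:
left = np.array([0.1, 0.1, 2.5, 0.1])   # rad/s, peak from close obstacle
right = np.array([0.1, 0.2, 0.1, 0.1])
print(steering_from_flow(left, right))  # positive -> steer right
```

Averaging instead of max-pooling would dilute the obstacle's peak into the background flow, which is exactly the failure mode the abstract argues against.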
Affiliation(s)
- Julien Lecoeur
- Laboratory of Intelligent Systems, Institute of Microengineering, École Polytechnique Fédérale de Lausanne, Lausanne, CH-1015, Switzerland
- Marie Dacke
- Lund Vision Group, Department of Biology, Lund University, Lund, SE-22362, Sweden
- Dario Floreano
- Laboratory of Intelligent Systems, Institute of Microengineering, École Polytechnique Fédérale de Lausanne, Lausanne, CH-1015, Switzerland
- Emily Baird
- Lund Vision Group, Department of Biology, Lund University, Lund, SE-22362, Sweden; Division of Functional Morphology, Department of Zoology, Stockholm University, Stockholm, SE-10691, Sweden
9. Shi C, Dong Z, Pundlik S, Luo G. A Hardware-Friendly Optical Flow-Based Time-to-Collision Estimation Algorithm. Sensors 2019; 19:807. [PMID: 30781489] [PMCID: PMC6412735] [DOI: 10.3390/s19040807] [Received: 12/24/2018] [Revised: 02/03/2019] [Accepted: 02/13/2019]
Abstract
This work proposes a hardware-friendly, dense optical flow-based time-to-collision (TTC) estimation algorithm intended to be deployed on smart video sensors for collision avoidance. The algorithm, optimized for hardware, first extracts biological visual motion features (motion energies) and then utilizes a Random Forests regressor to predict robust and dense optical flow. Finally, TTC is reliably estimated from the divergence of the optical flow field. The algorithm involves only feed-forward data flows with simple pixel-level operations, and hence has inherent parallelism for hardware acceleration. It offers good scalability, allowing for flexible tradeoffs among estimation accuracy, processing speed, and hardware resources. Experimental evaluation shows that the accuracy of the optical flow estimation is improved by the use of Random Forests compared with existing voting-based approaches. Furthermore, the estimated TTC values closely follow the ground truth. The specifics of the hardware design needed to implement the algorithm on a real-time embedded system are laid out.
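The final TTC step rests on a standard property of expanding flow fields: for pure translation toward a fronto-parallel surface, the flow is (u, v) = (x, y)/TTC, so the field's divergence du/dx + dv/dy equals 2/TTC. A sketch of that step only (the motion-energy features and Random Forests stages described above are omitted):

```python
import numpy as np

def ttc_from_flow(u, v, dx=1.0):
    """Estimate time-to-collision from a dense optical flow field.

    For pure translation toward a fronto-parallel surface the flow is
    (u, v) = (x, y) / TTC, whose divergence du/dx + dv/dy = 2 / TTC;
    inverting the mean divergence recovers TTC, in whatever time unit
    the flow is expressed in (e.g. frames).
    """
    du_dx = np.gradient(u, dx, axis=1)
    dv_dy = np.gradient(v, dx, axis=0)
    divergence = np.mean(du_dx + dv_dy)
    return 2.0 / divergence

# Synthetic expanding flow for an approach with TTC = 40 frames:
ttc_true = 40.0
y, x = np.mgrid[-32:32, -32:32].astype(float)
u, v = x / ttc_true, y / ttc_true
print(ttc_from_flow(u, v))  # recovers ~40.0
```

Averaging the divergence over the whole field, as here, is what makes the estimate tolerant of noise in individual flow vectors.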
Affiliation(s)
- Cong Shi
- School of Microelectronics and Communication Engineering, Chongqing University, Chongqing 400044, China
- Schepens Eye Research Institute, Massachusetts Eye and Ear, Harvard Medical School, Boston, MA 02114, USA
- Zhuoran Dong
- Viterbi School of Engineering, University of Southern California, Los Angeles, CA 90089, USA
- Shrinivas Pundlik
- Schepens Eye Research Institute, Massachusetts Eye and Ear, Harvard Medical School, Boston, MA 02114, USA
- Gang Luo
- Schepens Eye Research Institute, Massachusetts Eye and Ear, Harvard Medical School, Boston, MA 02114, USA