1. Goyal P, Baird E, Srinivasan MV, Muijres FT. Visual guidance of honeybees approaching a vertical landing surface. J Exp Biol 2023;226:jeb245956. PMID: 37589414; PMCID: PMC10482386; DOI: 10.1242/jeb.245956. Received 04/11/2023; accepted 08/08/2023.
Abstract
Landing is a critical phase for flying animals, many of which rely on visual cues to perform a controlled touchdown. Foraging honeybees depend on regular landings on flowers to collect the food crucial for colony survival and reproduction. Here, we explored how honeybees utilize optical expansion cues to regulate approach flight speed when landing on vertical surfaces. Three sensory-motor control models have been proposed for landings of natural flyers. Landing honeybees maintain a constant optical expansion rate set-point, resulting in a gradual decrease in approach velocity and a gentle touchdown. Bumblebees exhibit a similar strategy, but they regularly switch to a new constant optical expansion rate set-point. In contrast, landing birds fly at a constant time to contact to achieve faster landings. Here, we re-examined the landing strategy of honeybees by fitting the three models to individual approach flights of honeybees landing on platforms with varying optical expansion cues. Surprisingly, the landing model identified in bumblebees proved to be the most suitable for these honeybees. This reveals that honeybees adjust their optical expansion rate set-point in a stepwise manner. Bees flying at low optical expansion rates tend to increase their set-point stepwise, whereas those flying at high optical expansion rates tend to decrease it stepwise. This modular landing control system enables honeybees to land rapidly and reliably under a wide range of initial flight conditions and visual landing platform patterns. The remarkable similarity between the landing strategies of honeybees and bumblebees suggests that this strategy may also be prevalent among other flying insects. Furthermore, these findings hold promising potential for bioinspired guidance systems in flying robots.
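The constant-expansion-rate strategy this abstract describes has a simple numerical consequence: if the relative rate of expansion r = v/d is held constant, approach speed decays exponentially with distance, giving a smooth deceleration toward touchdown. The sketch below is illustrative only, assuming a discrete-time controller with hypothetical parameter values (it is not the authors' fitted model).

```python
# Illustrative sketch: approach speed commanded to hold a constant relative
# rate of expansion r = v / d (units: 1/s). Parameter values are hypothetical.

def simulate_landing(d0=1.0, r_set=2.0, dt=0.001, d_stop=0.01):
    """Integrate an approach where speed is commanded as v = r_set * d."""
    d, t = d0, 0.0
    while d > d_stop:
        v = r_set * d          # speed command derived from the expansion set-point
        d -= v * dt            # distance to the landing surface shrinks
        t += dt
    return t, r_set * d        # touchdown time and residual approach speed

t_land, v_final = simulate_landing()
```

Because v = r_set * d, the residual speed at any stop threshold is proportional to the remaining distance, so touchdown is gentle. The stepwise set-point updates reported for bumblebees (and here for honeybees) would correspond to occasionally replacing r_set with a new constant during the approach.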
Affiliation(s)
- Pulkit Goyal: Experimental Zoology Group, Wageningen University & Research, 6708WD Wageningen, The Netherlands
- Emily Baird: Department of Zoology, Stockholm University, 114 18 Stockholm, Sweden
- Mandyam V. Srinivasan: Queensland Brain Institute, University of Queensland, St. Lucia, QLD 4072, Australia
- Florian T. Muijres: Experimental Zoology Group, Wageningen University & Research, 6708WD Wageningen, The Netherlands
2. Fu Q. Motion perception based on ON/OFF channels: A survey. Neural Netw 2023;165:1-18. PMID: 37263088; DOI: 10.1016/j.neunet.2023.05.031. Received 11/05/2022; revised 04/02/2023; accepted 05/17/2023.
Abstract
Motion perception is an essential ability for animals and artificially intelligent systems to interact effectively and safely with surrounding objects and environments. Biological visual systems, which have evolved over hundreds of millions of years, are efficient and robust at motion perception, whereas artificial vision systems remain far from such capability. This paper argues that the gap can be significantly reduced by the formulation of ON/OFF channels in motion perception models, which encode luminance increment (ON) and decrement (OFF) responses within the receptive field separately. Such a signal-bifurcating structure has been found in the neural systems of many animal species, indicating that early motion signals are split and processed in segregated pathways. However, the corresponding biological substrates and the necessity for artificial vision systems have never been elucidated together, leaving open questions about the uniqueness and advantages of ON/OFF channels for building dynamic vision systems that address real-world challenges. This paper highlights the importance of ON/OFF channels in motion perception by surveying current progress across both neuroscience and computational modelling, together with applications. Compared to the related literature, this paper for the first time provides insights into the implementation of different selectivities to looming, translating, and small-target motion based on ON/OFF channels, in keeping with the soundness and robustness of the underlying biological principles. Finally, existing challenges and future trends of this bio-plausible computational structure for visual perception are discussed in connection with hotspots of machine learning and advanced vision sensors such as event-driven cameras.
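The ON/OFF split the survey discusses can be stated compactly: the temporal luminance change at each photoreceptor is half-wave rectified into a brightening (ON) signal and a dimming (OFF) signal that are then processed in parallel. A minimal sketch of that rectification step, with illustrative array names and shapes:

```python
import numpy as np

def on_off_channels(frames):
    """Split a sequence of luminance frames into ON/OFF channel responses.

    frames: array of shape (T, H, W). Returns (on, off), each of shape
    (T-1, H, W), holding half-wave-rectified temporal derivatives.
    """
    diff = np.diff(frames.astype(float), axis=0)  # luminance change per pixel
    on = np.maximum(diff, 0.0)    # ON channel: brightening only
    off = np.maximum(-diff, 0.0)  # OFF channel: dimming only
    return on, off

# One pixel brightens while its neighbour dims between two frames:
frames = np.array([[[0.0, 1.0]], [[1.0, 0.0]]])
on, off = on_off_channels(frames)
```

Each channel carries only one polarity of contrast change, which is what lets downstream motion detectors build the distinct looming, translation, and small-target selectivities the survey reviews.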
Affiliation(s)
- Qinbing Fu: Machine Life and Intelligence Research Centre, School of Mathematics and Information Science, Guangzhou University, Guangzhou 510006, China
3. Borboni A, Pagani R, Sandrini S, Carbone G, Pellegrini N. Role of Reference Frames for a Safe Human-Robot Interaction. Sensors (Basel) 2023;23:5762. PMID: 37420924; DOI: 10.3390/s23125762. Received 05/23/2023; revised 06/13/2023; accepted 06/17/2023.
Abstract
Safety plays a key role in human-robot interaction in collaborative robot (cobot) applications. This paper provides a general procedure to guarantee safe workstations that accommodate human operations, robot contributions, a dynamic environment, and time-variant objects across a set of collaborative robotic tasks. The proposed methodology focuses on the contribution and mapping of reference frames. Multiple reference-frame representation agents are defined simultaneously by considering egocentric, allocentric, and route-centric perspectives. The agents are processed to provide a minimal and effective assessment of the ongoing human-robot interactions. The proposed formulation is based on the generalization and proper synthesis of multiple cooperating reference-frame agents at the same time. Accordingly, it is possible to achieve a real-time assessment of the safety-related implications through the implementation and fast calculation of suitable quantitative safety indices. This allows the controlling parameters of the involved cobot to be defined and promptly regulated without the velocity limitations that are recognized as the main disadvantage of existing approaches. A set of experiments was carried out to demonstrate the feasibility and effectiveness of the approach, using a seven-DOF anthropomorphic arm in combination with a psychometric test. The results agree with the current literature in terms of kinematic, position, and velocity aspects; use measurement methods based on tests administered to the operator; and introduce novel features of work-cell arrangement, including the use of virtual instrumentation. Finally, the associated analytical-topological treatments have enabled the development of a safe and comfortable measure of the human-robot relation, with satisfactory experimental results compared to previous research.
Nevertheless, research on robot posture, human perception, and learning technologies will have to draw on multidisciplinary fields such as psychology, gesture, communication, and the social sciences in order to prepare these methods for real-world applications, which pose new challenges for cobots.
Affiliation(s)
- Alberto Borboni: Mechanical and Industrial Engineering Department, Università degli Studi di Brescia, Via Branze 38, 25123 Brescia, Italy
- Roberto Pagani: Mechanical and Industrial Engineering Department, Università degli Studi di Brescia, Via Branze 38, 25123 Brescia, Italy
- Samuele Sandrini: STIIMA-CNR, Institute of Intelligent Industrial Technologies and Systems, National Research Council of Italy, 00185 Roma, Italy
- Giuseppe Carbone: Department of Mechanical, Energy and Management Engineering, Università della Calabria, Via P. Bucci, Edificio Cubo 46C, Arcavacata di Rende, 87036 Rende, Italy
- Nicola Pellegrini: Mechanical and Industrial Engineering Department, Università degli Studi di Brescia, Via Branze 38, 25123 Brescia, Italy
4. Accommodating unobservability to control flight attitude with optic flow. Nature 2022;610:485-490. PMID: 36261554; PMCID: PMC9581779; DOI: 10.1038/s41586-022-05182-2. Received 01/05/2022; accepted 08/02/2022.
Abstract
Attitude control is an essential flight capability. Whereas flying robots commonly rely on accelerometers [1] for estimating attitude, flying insects lack an unambiguous sense of gravity [2,3]. Despite the established role of several sense organs in attitude stabilization [3-5], the dependence of flying insects on an internal estimate of the gravity direction remains unclear. Here we show how attitude can be extracted from optic flow when combined with a motion model that relates attitude to acceleration direction. Although there are conditions, such as hover, in which the attitude is unobservable, we prove that the ensuing control system is still stable, continuously moving into and out of these conditions. Flying robot experiments confirm that accommodating unobservability in this manner leads to stable, but slightly oscillatory, attitude control. Moreover, experiments with a bio-inspired flapping-wing robot show that residual, high-frequency attitude oscillations from the flapping motion improve observability. The presented approach holds promise for robotics, with accelerometer-less autopilots paving the road for insect-scale autonomous flying robots [6]. Finally, it forms a hypothesis on insect attitude estimation and control, with the potential to provide further insight into known biological phenomena [5,7,8] and to generate new predictions, such as reduced head and body attitude variance at higher flight speeds [9].
5. Embrace wobble to level flight without a horizon. Nature 2022;610:455-457. PMID: 36261545; DOI: 10.1038/d41586-022-03217-2.
6.
Abstract
Autonomous robots are expected to perform a wide range of sophisticated tasks in complex, unknown environments. However, available onboard computing capabilities and algorithms represent a considerable obstacle to reaching higher levels of autonomy, especially as robots get smaller and the end of Moore's law approaches. Here, we argue that inspiration from insect intelligence is a promising alternative to classic methods in robotics for the artificial intelligence (AI) needed for the autonomy of small, mobile robots. The advantage of insect intelligence stems from its resource efficiency (or parsimony) especially in terms of power and mass. First, we discuss the main aspects of insect intelligence underlying this parsimony: embodiment, sensory-motor coordination, and swarming. Then, we take stock of where insect-inspired AI stands as an alternative to other approaches to important robotic tasks such as navigation and identify open challenges on the road to its more widespread adoption. Last, we reflect on the types of processors that are suitable for implementing insect-inspired AI, from more traditional ones such as microcontrollers and field-programmable gate arrays to unconventional neuromorphic processors. We argue that even for neuromorphic processors, one should not simply apply existing AI algorithms but exploit insights from natural insect intelligence to get maximally efficient AI for robot autonomy.
Affiliation(s)
- G C H E de Croon: Micro Air Vehicle Laboratory, Faculty of Aerospace Engineering, TU Delft, Delft, Netherlands
- J J G Dupeyroux: Micro Air Vehicle Laboratory, Faculty of Aerospace Engineering, TU Delft, Delft, Netherlands
- S B Fuller: Autonomous Insect Robotics Laboratory, Department of Mechanical Engineering and Paul G. Allen School of Computer Science, University of Washington, Seattle, WA, USA
- J A R Marshall: Opteran Technologies, Sheffield, UK; Complex Systems Modeling Group, Department of Computer Science, University of Sheffield, Sheffield, UK
7. Fabian ST, Sumner ME, Wardill TJ, Gonzalez-Bellido PT. Avoiding obstacles while intercepting a moving target: a miniature fly's solution. J Exp Biol 2022;225:274211. PMID: 35168251; PMCID: PMC8920034; DOI: 10.1242/jeb.243568. Received 10/06/2021; accepted 12/14/2021.
Abstract
The miniature robber fly Holcocephala fusca intercepts its targets with behaviour that is well approximated by the proportional navigation guidance law. During predatory trials, we challenged the interception performance of H. fusca by placing a large object in its potential flight path. In response, H. fusca deviated from the path predicted by pure proportional navigation, but in many cases still eventually contacted the target. We show that such flight deviations can be explained as the output of two competing navigational systems: pure proportional navigation and a simple obstacle-avoidance algorithm. Obstacle avoidance by H. fusca is described here by a simple feedback loop that uses the visual expansion of the approaching obstacle to mediate the magnitude of the turning-away response. We name the integration of this steering law with proportional navigation 'combined guidance'. The results demonstrate that predatory intent does not hold a monopoly on the fly's steering when attacking a target, and that simple guidance combinations can explain obstacle avoidance during interceptive tasks.
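The proportional navigation law the authors fit commands a turn rate proportional to the rotation rate of the line of sight to the target. A minimal 2-D pursuit sketch of that law follows; the gain N, speeds, geometry, and the finite-difference line-of-sight rate are all illustrative assumptions, not the fly's fitted parameters.

```python
import math

def pn_intercept(px, py, tx, ty, tvx, tvy,
                 speed=2.0, N=3.0, dt=0.01, steps=2000, capture=0.05):
    """2-D proportional navigation: heading rate = N * line-of-sight rate."""
    heading = math.atan2(ty - py, tx - px)  # start pointed at the target
    lam = heading                           # initial line-of-sight (LOS) angle
    for _ in range(steps):
        new_lam = math.atan2(ty - py, tx - px)
        lam_dot = (new_lam - lam) / dt      # LOS rotation rate (finite difference)
        lam = new_lam
        heading += N * lam_dot * dt         # PN steering command
        px += speed * math.cos(heading) * dt
        py += speed * math.sin(heading) * dt
        tx += tvx * dt
        ty += tvy * dt
        if math.hypot(tx - px, ty - py) < capture:
            return True
    return False

# Pursuer at the origin chases a slower target crossing from (1, 1):
caught = pn_intercept(0.0, 0.0, 1.0, 1.0, -0.5, 0.0)
```

The 'combined guidance' of the paper would add a second term to the heading command that turns the pursuer away from an obstacle in proportion to the obstacle's visual expansion.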
Affiliation(s)
- Samuel T Fabian: Department of Physiology, Development, and Neuroscience, University of Cambridge, Cambridge CB2 3EG, UK; Department of Bioengineering, Imperial College London, London SW7 2AZ, UK
- Mary E Sumner: Department of Ecology, Evolution and Behaviour, University of Minnesota, Saint Paul, MN 55108, USA
- Trevor J Wardill: Department of Ecology, Evolution and Behaviour, University of Minnesota, Saint Paul, MN 55108, USA
8. Goulard R, Verbe A, Vercher JL, Viollet S. Role of the light source position in freely falling hoverflies' stabilization performances. Biol Lett 2019;14:rsbl.2018.0051. PMID: 29794004; DOI: 10.1098/rsbl.2018.0051. Received 01/22/2018; accepted 04/30/2018.
Abstract
The stabilization of plummeting hoverflies was filmed and analysed in terms of their wingbeat initiation times as well as the crash and stabilization rates. The flies experienced near-weightlessness for a period of time that depended on their ability to counteract the free fall by triggering their wingbeats. In this paradigm, hoverflies' flight stabilization strategies were investigated here for the first time under two different positions of the light source (overhead and bottom lighting). The crash rates were higher in bottom lighting conditions than with top lighting. In addition, adding a texture to the walls reduced the crash rates only in the overhead lighting condition. The position of the lighting also significantly affected both the stabilization rates and the time taken by the flies to stabilize, which decreased and increased under bottom lighting conditions, respectively, whereas textured walls increased the stabilization rates under both lighting conditions. These results support the idea that flies may mainly base their flight control strategy on visual cues and particularly that the light distribution in the visual field may provide reliable, efficient cues for estimating their orientation with respect to an allocentric reference frame. In addition, the finding that the hoverflies' optic flow-based motion detection ability is affected by the position of the light source in their visual field suggests the occurrence of interactions between movement perception and this visual vertical perception process.
Affiliation(s)
- Roman Goulard: Aix-Marseille Université, CNRS, ISM UMR 7287, Marseille 13009, France
- Anna Verbe: Aix-Marseille Université, CNRS, ISM UMR 7287, Marseille 13009, France
- Stéphane Viollet: Aix-Marseille Université, CNRS, ISM UMR 7287, Marseille 13009, France
9. Cheng Y, Cao J, Zhang Y, Hao Q. Review of state-of-the-art artificial compound eye imaging systems. Bioinspir Biomim 2019;14:031002. PMID: 30654337; DOI: 10.1088/1748-3190/aaffb5.
Abstract
The natural compound eye has received much attention in recent years due to its remarkable properties, such as its large field of view (FOV), compact structure, and high sensitivity to moving objects. Many studies have been devoted to mimicking the imaging system of the natural compound eye. This paper reviews state-of-the-art artificial compound eye imaging systems. Firstly, we introduce the imaging principles of the three types of natural compound eye. Then, we divide current artificial compound eye imaging systems into four categories according to their structural composition, so that readers can easily grasp, from this structural perspective, how to build such a system. Moreover, we compare the imaging performance of state-of-the-art artificial compound eye imaging systems, which provides a reference for readers when designing system parameters. Next, we present applications of artificial compound eye imaging systems, including imaging with a large FOV, imaging with high resolution, object-distance detection, medical imaging, egomotion estimation, and navigation. Finally, an outlook for the artificial compound eye imaging system is given.
Affiliation(s)
- Yang Cheng: Key Laboratory of Biomimetic Robots and Systems, Ministry of Education, Beijing Institute of Technology, Beijing, People's Republic of China
10. Fu Q, Wang H, Hu C, Yue S. Towards Computational Models and Applications of Insect Visual Systems for Motion Perception: A Review. Artif Life 2019;25:263-311. PMID: 31397604; DOI: 10.1162/artl_a_00297.
Abstract
Motion perception is a critical capability that determines many aspects of insects' lives, including predator avoidance and foraging. A good number of motion detectors have been identified in insect visual pathways. Computational modelling of these motion detectors has not only provided effective solutions for artificial intelligence but has also benefited our understanding of complicated biological visual systems. These biological mechanisms, shaped through millions of years of evolution, form solid modules for constructing dynamic vision systems for future intelligent machines. This article reviews the computational motion perception models in the literature that originate from biological research on insect visual systems. These motion perception models or neural networks comprise the looming-sensitive neuronal models of lobula giant movement detectors (LGMDs) in locusts, the translation-sensitive neural systems of direction-selective neurons (DSNs) in fruit flies, bees, and locusts, and the small-target motion detectors (STMDs) in dragonflies and hoverflies. We also review the applications of these models to robots and vehicles. Through these modelling studies, we summarize the methodologies that generate different direction and size selectivity in motion perception. Finally, we discuss multiple-system integration and the hardware realization of these bio-inspired motion perception models.
Affiliation(s)
- Qinbing Fu: Guangzhou University, School of Mechanical and Electrical Engineering, Machine Life and Intelligence Research Centre; University of Lincoln, Computational Intelligence Lab, School of Computer Science, Lincoln Centre for Autonomous Systems
- Hongxin Wang: University of Lincoln, Computational Intelligence Lab, School of Computer Science, Lincoln Centre for Autonomous Systems
- Cheng Hu: Guangzhou University, School of Mechanical and Electrical Engineering, Machine Life and Intelligence Research Centre; University of Lincoln, Computational Intelligence Lab, School of Computer Science, Lincoln Centre for Autonomous Systems
- Shigang Yue: Guangzhou University, School of Mechanical and Electrical Engineering, Machine Life and Intelligence Research Centre; University of Lincoln, Computational Intelligence Lab, School of Computer Science, Lincoln Centre for Autonomous Systems
11. Serres JR, Viollet S. Insect-inspired vision for autonomous vehicles. Curr Opin Insect Sci 2018;30:46-51. PMID: 30553484; DOI: 10.1016/j.cois.2018.09.005. Received 06/15/2018; revised 09/11/2018; accepted 09/14/2018.
Abstract
Flying insects are nowadays studied as if they were agile micro air vehicles fitted with smart sensors, requiring very few brain resources. The findings obtained on these natural fliers have proved extremely valuable for designing compact, low-weight artificial optical sensors capable of performing visual processing tasks robustly under various environmental conditions (light, clouds, contrast). Here, we review some outstanding bio-inspired visual sensors, which can be used for detecting motion in the visible spectrum, for celestial navigation in the ultraviolet spectrum, and for attitude stabilisation. Biologically inspired visual sensors do not need to comprise a very large number of pixels: they perform both short- and long-range navigation tasks surprisingly well with just a few pixels and coarse resolution.
12. Li L, Hao Y, Xu J, Liu F, Lu J. The Design and Positioning Method of a Flexible Zoom Artificial Compound Eye. Micromachines 2018;9:E319. PMID: 30424252; PMCID: PMC6082292; DOI: 10.3390/mi9070319. Received 05/22/2018; revised 06/13/2018; accepted 06/19/2018.
Abstract
The focal lengths of the sub-eyes in a single-layer uniform curved compound eye are all the same, resulting in poor imaging quality. A non-uniform curved compound eye can effectively solve the problem of poor edge-imaging quality; however, it suffers from large spherical aberration and cannot achieve zoom imaging. To solve these problems, a new type of aspherical artificial compound eye structure with variable focal length is proposed in this paper. The structure divides the curved compound eye surface into three fan-shaped areas, with different microlens focal lengths in different areas, which allows the artificial compound eye to zoom within a certain range. The focal length and size of each microlens are determined by the area it belongs to and its location within that area. The aspherical optimization of the microlenses was calculated, reducing the spherical aberration in each area to one percent of its initial value. Simulation analysis shows that the designed artificial compound eye structure realizes focal-length adjustment and effectively mitigates the poor imaging quality at the edge of a curved compound eye. An aspherical artificial compound eye sample, with n = 61 sub-eyes and a base diameter of Φ = 8.66 mm, was then fabricated by a molding method. Additionally, the mutual relationship between the sub-eyes was calibrated, and a mathematical model for the simultaneous identification of multiple sub-eyes was established. An experimental artificial compound eye positioning system was set up and, by solving for the coordinates of target points captured through a number of microlenses, achieved a positioning error of less than 10%.
Affiliation(s)
- Lun Li: Department of Equipment Engineering, Shenyang Ligong University, Shenyang 110159, China; Technology Center of Computer-aided Design & Manufacturing (CAD/CAM), Shenyang Ligong University, Shenyang 110159, China
- Yongping Hao: Department of Equipment Engineering, Shenyang Ligong University, Shenyang 110159, China; Technology Center of Computer-aided Design & Manufacturing (CAD/CAM), Shenyang Ligong University, Shenyang 110159, China
- Jiulong Xu: Department of Equipment Engineering, Shenyang Ligong University, Shenyang 110159, China
- Fengli Liu: Technology Center of Computer-aided Design & Manufacturing (CAD/CAM), Shenyang Ligong University, Shenyang 110159, China; Department of Mechanical Engineering, Shenyang Ligong University, Shenyang 110159, China
- Jiang Lu: Fuyao Glass Industry Group Co., Ltd., Shenyang 110159, China
13. Schroeder TBH, Houghtaling J, Wilts BD, Mayer M. It's Not a Bug, It's a Feature: Functional Materials in Insects. Adv Mater 2018;30:e1705322. PMID: 29517829; DOI: 10.1002/adma.201705322. Received 09/15/2017; revised 11/15/2017.
Abstract
Over the course of their wildly successful proliferation across the earth, the insects as a taxon have evolved enviable adaptations to their diverse habitats, which include adhesives, locomotor systems, hydrophobic surfaces, and sensors and actuators that transduce mechanical, acoustic, optical, thermal, and chemical signals. Insect-inspired designs currently appear in a range of contexts, including antireflective coatings, optical displays, and computing algorithms. However, as over one million distinct and highly specialized species of insects have colonized nearly all habitable regions on the planet, they still provide a largely untapped pool of unique problem-solving strategies. With the intent of providing materials scientists and engineers with a muse for the next generation of bioinspired materials, here, a selection of some of the most spectacular adaptations that insects have evolved is assembled and organized by function. The insects presented display dazzling optical properties as a result of natural photonic crystals, precise hierarchical patterns that span length scales from nanometers to millimeters, and formidable defense mechanisms that deploy an arsenal of chemical weaponry. Successful mimicry of these adaptations may facilitate technological solutions to as wide a range of problems as they solve in the insects that originated them.
Affiliation(s)
- Thomas B H Schroeder: Department of Chemical Engineering, University of Michigan, 2300 Hayward Street, Ann Arbor, MI 48109, USA; Adolphe Merkle Institute, University of Fribourg, Chemin des Verdiers 4, 1700 Fribourg, Switzerland
- Jared Houghtaling: Adolphe Merkle Institute, University of Fribourg, Chemin des Verdiers 4, 1700 Fribourg, Switzerland; Department of Biomedical Engineering, University of Michigan, 2200 Bonisteel Boulevard, Ann Arbor, MI 48109, USA
- Bodo D Wilts: Adolphe Merkle Institute, University of Fribourg, Chemin des Verdiers 4, 1700 Fribourg, Switzerland
- Michael Mayer: Adolphe Merkle Institute, University of Fribourg, Chemin des Verdiers 4, 1700 Fribourg, Switzerland
14. Spatial Encoding of Translational Optic Flow in Planar Scenes by Elementary Motion Detector Arrays. Sci Rep 2018;8:5821. PMID: 29643402; PMCID: PMC5895815; DOI: 10.1038/s41598-018-24162-z. Received 10/09/2017; accepted 03/28/2018.
Abstract
Elementary Motion Detectors (EMDs) are well-established models of visual motion estimation in insects. The responses of EMDs are tuned to specific temporal and spatial frequencies of the input stimuli, which matches the behavioural response of insects to wide-field image rotation, called the optomotor response. However, other behaviours, such as speed and position control, cannot be fully accounted for by EMDs, because these behaviours are largely unaffected by image properties and appear to be controlled by the ratio between the flight speed and the distance to an object, defined here as relative nearness. We present a method that resolves this inconsistency by extracting an unambiguous estimate of relative nearness from the output of an EMD array. Our method is suitable for estimating relative nearness in planar scenes, such as when flying above the ground or beside large flat objects. We demonstrate closed-loop control of the lateral position and forward velocity of a simulated agent flying in a corridor. This finding may explain how insects can measure relative nearness and control their flight despite the frequency tuning of EMDs. Our method also provides engineers with a relative nearness estimation technique that benefits from the low computational cost of EMDs.
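The EMD underlying this work is commonly modelled as a Hassenstein-Reichardt correlator: each unit low-pass filters (delays) one photoreceptor's signal, multiplies it with the undelayed signal of its neighbour, and subtracts the mirror-symmetric product. A minimal discrete-time sketch, with an illustrative filter constant and stimulus (not the paper's array or parameters):

```python
import numpy as np

def hr_emd(signals, alpha=0.1):
    """Hassenstein-Reichardt correlator array.

    signals: array (T, N) of photoreceptor time series. Returns a (T, N-1)
    array of opponent responses; positive values indicate motion from
    receptor i toward receptor i+1.
    """
    T, N = signals.shape
    delayed = np.zeros_like(signals, dtype=float)
    for t in range(1, T):  # first-order low-pass acts as the delay filter
        delayed[t] = delayed[t - 1] + alpha * (signals[t] - delayed[t - 1])
    # Correlate each delayed arm with its undelayed neighbour, then subtract
    # the mirror-symmetric correlation (opponency).
    return delayed[:, :-1] * signals[:, 1:] - signals[:, :-1] * delayed[:, 1:]

# A bright edge sweeping rightward across four receptors:
stim = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 1, 0],
                 [0, 0, 0, 1]], dtype=float)
response = hr_emd(stim)  # positive entries along the edge's path
```

Because the product depends on contrast and spatiotemporal frequency as well as velocity, a single EMD output is ambiguous; the paper's contribution is a way to combine responses across such an array so that the speed/distance ratio (relative nearness) can be read out unambiguously in planar scenes.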
15. Goulard R, Vercher JL, Viollet S. Modeling visual-based pitch, lift and speed control strategies in hoverflies. PLoS Comput Biol 2018;14:e1005894. PMID: 29361632; PMCID: PMC5780187; DOI: 10.1371/journal.pcbi.1005894. Received 04/28/2017; accepted 11/12/2017.
Abstract
To avoid crashing onto the floor, a free falling fly needs to trigger its wingbeats quickly and control the orientation of its thrust accurately and swiftly to stabilize its pitch and hence its speed. Behavioural data have suggested that the vertical optic flow produced by the fall and crossing the visual field plays a key role in this anti-crash response. Free fall behavior analyses have also suggested that flying insect may not rely on graviception to stabilize their flight. Based on these two assumptions, we have developed a model which accounts for hoverflies´ position and pitch orientation recorded in 3D with a fast stereo camera during experimental free falls. Our dynamic model shows that optic flow-based control combined with closed-loop control of the pitch suffice to stabilize the flight properly. In addition, our model sheds a new light on the visual-based feedback control of fly´s pitch, lift and thrust. Since graviceptive cues are possibly not used by flying insects, the use of a vertical reference to control the pitch is discussed, based on the results obtained on a complete dynamic model of a virtual fly falling in a textured corridor. This model would provide a useful tool for understanding more clearly how insects may or not estimate their absolute attitude. On the basis of vision-based feedback control of optic flow occurring during insects’ flight, we developed a dynamic model that accounts for the pitch orientation and speed in plummeting flies. We compared the hoverflies’ responses with our model and showed that an optic-flow based control strategy can be used to correct the initial pitch misorientation caused by the free fall situation. To complete the model, we combined the closed-loop control of the vertical optic flow with an additional feedback control loop based on the value of the absolute pitch orientation. The need for this measurement to stabilize the pitch orientation raises the question as whether this is also the case in dipterans. 
After ruling out the possibility that insects use gravity acceleration cues to control their flight, for which no experimental evidence has been found so far, we discussed the three main sensory processes possibly involved in their ability to control their attitude. Our model provides a useful tool for studying the various sensory processes possibly involved in dipterans' flight stabilization abilities, as well as the interactions between these processes.
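The control scheme described above can be illustrated with a minimal sketch: a falling agent regulates its ventral optic flow (descent speed divided by height) through a thrust loop, while a separate proportional-derivative loop drives pitch back toward hover attitude. This is not the authors' fitted model; all gains, set-points and initial conditions below are invented for illustration.

```python
import math

def simulate_fall(h0=2.0, v0=0.0, pitch0=0.4, dt=0.001, t_end=4.0,
                  g=9.81, omega_set=3.0, k_flow=2.0,
                  kp_pitch=40.0, kd_pitch=8.0):
    """Free fall with optic-flow-based thrust control and PD pitch control."""
    h, v, pitch, pitch_rate = h0, v0, pitch0, 0.0
    t = 0.0
    while t < t_end and h > 0.01:
        omega = v / h                                # ventral optic flow (1/s)
        # thrust loop: hold the vertical optic flow at its set-point
        a_thrust = max(g + k_flow * (omega - omega_set), 0.0)
        # pitch loop: PD control toward hover attitude (assumes attitude feedback)
        pitch_acc = -kp_pitch * pitch - kd_pitch * pitch_rate
        pitch_rate += pitch_acc * dt
        pitch += pitch_rate * dt
        # vertical dynamics: thrust acts along the tilted body axis
        v += (g - a_thrust * math.cos(pitch)) * dt
        h -= v * dt
        t += dt
    return h, v, pitch

h, v, pitch = simulate_fall()
print(f"final height {h:.3f} m, descent speed {v:.3f} m/s, pitch {pitch:.4f} rad")
```

The sketch shows the two feedback loops interacting: the initial pitch misorientation decays under the PD loop while the optic-flow loop brakes the fall, so the agent reaches the ground slowly and nearly level.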
16
Serres JR, Ruffier F. Optic flow-based collision-free strategies: From insects to robots. ARTHROPOD STRUCTURE & DEVELOPMENT 2017; 46:703-717. [PMID: 28655645 DOI: 10.1016/j.asd.2017.06.003] [Citation(s) in RCA: 64] [Impact Index Per Article: 9.1] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 07/07/2016] [Revised: 06/19/2017] [Accepted: 06/19/2017] [Indexed: 06/07/2023]
Abstract
Flying insects are able to fly smartly in unpredictable environments. Their tiny brains contain neurons that are sensitive to visual motion, also called optic flow. Consequently, flying insects rely mainly on visual motion during flight maneuvers such as takeoff, landing, terrain following, tunnel crossing, lateral and frontal obstacle avoidance, and adjusting flight speed in a cluttered environment. Optic flow can be defined as the vector field of the apparent motion of objects, surfaces and edges in a visual scene generated by the relative motion between an observer (an eye or a camera) and the scene. Translational optic flow is particularly interesting for short-range navigation because it depends only on the ratio between (i) the relative linear speed of the visual scene with respect to the observer and (ii) the distance of the observer from obstacles in the surrounding environment; it can therefore be exploited without any direct measurement of either speed or distance. In flying insects, the roll stabilization reflex and yaw saccades attenuate rotation at the eye level in roll and yaw, respectively (i.e. they cancel rotational optic flow), ensuring purely translational optic flow between two successive saccades. Our survey focuses on the feedback loops based on translational optic flow that insects employ for collision-free navigation. Over the next decade, optic flow is likely to be one of the most important visual cues for explaining flying insects' behavior during short-range navigation maneuvers in complex tunnels. Conversely, the biorobotic approach can help to develop innovative flight control systems for flying robots, with the aim of mimicking flying insects' abilities and better understanding their flight.
17
Altitude control in honeybees: joint vision-based learning and guidance. Sci Rep 2017; 7:9231. [PMID: 28835634 PMCID: PMC5569062 DOI: 10.1038/s41598-017-09112-5] [Citation(s) in RCA: 14] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/01/2016] [Accepted: 07/24/2017] [Indexed: 11/15/2022] Open
Abstract
Studies on insects' visual guidance systems have shed little light on how learning contributes to their altitude control. In this study, honeybees were trained to fly along a double-roofed tunnel after entering it near either the ceiling or the floor. The honeybees trained to hug the ceiling therefore encountered a sudden change in the tunnel configuration midway: a "dorsal ditch", i.e. a sudden increase in the distance to the ceiling, corresponding to a strong change in the visual cues available in their dorsal field of view. These honeybees reacted by rising quickly and hugging the new, higher ceiling, keeping a forward speed, distance to the ceiling and dorsal optic flow similar to those observed during training, whereas bees trained to follow the floor kept following the floor regardless of the change in ceiling height. When trained honeybees entered the tunnel via the opposite entry (lower or upper) to the one used during training, they quickly changed their altitude and hugged the surface they had previously learned to follow. These findings clearly show that trained honeybees control their altitude based on visual cues memorized during training. The memorized visual cues generated by the followed surfaces form a complex optic flow pattern: trained honeybees may attempt to match the visual cues they perceive with this memorized optic flow pattern by controlling their altitude.
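A hedged sketch of the behaviour described: an agent that memorized a dorsal optic-flow set-point during training climbs after a sudden ceiling rise so as to restore the learned flow, and hence the learned ceiling clearance. The numbers below are illustrative, not fitted to the honeybee data.

```python
def follow_ceiling(ceiling_heights, v_forward=1.0, omega_learned=4.0,
                   z0=0.75, gain=0.02):
    """Hold the dorsal optic flow v / (ceiling - z) at a memorized set-point."""
    z = z0                                        # flight altitude (m)
    altitudes = []
    for c in ceiling_heights:                     # c: current ceiling altitude (m)
        clearance = max(c - z, 1e-6)
        omega_dorsal = v_forward / clearance      # dorsal translational flow (rad/s)
        # climb when the flow is weaker than memorized (ceiling too far),
        # descend when it is stronger (ceiling too close)
        z += gain * (omega_learned - omega_dorsal)
        altitudes.append(z)
    return altitudes

# Ceiling steps up from 1.0 m to 2.0 m halfway along (the "dorsal ditch"):
# the agent rises so that the clearance returns to the trained value.
ceiling = [1.0] * 200 + [2.0] * 200
alts = follow_ceiling(ceiling)
print(f"clearance before step ~{1.0 - alts[199]:.2f} m, after ~{2.0 - alts[-1]:.2f} m")
```

Matching a memorized flow value is enough to reproduce the ceiling-hugging response: the learned set-point implicitly encodes the trained clearance at a given forward speed.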
18
Weibel D, Lawrence D, Palo S. Optical Beacon Sensor for Small Unmanned Aerial System State Estimation. J FIELD ROBOT 2017. [DOI: 10.1002/rob.21648] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/08/2022]
Affiliation(s)
- Scott Palo
- University of Colorado, Boulder, Colorado 80309
19
Vanhoutte E, Mafrica S, Ruffier F, Bootsma RJ, Serres J. Time-of-Travel Methods for Measuring Optical Flow on Board a Micro Flying Robot. SENSORS 2017; 17:s17030571. [PMID: 28287484 PMCID: PMC5375857 DOI: 10.3390/s17030571] [Citation(s) in RCA: 13] [Impact Index Per Article: 1.9] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 12/30/2016] [Revised: 03/08/2017] [Accepted: 03/09/2017] [Indexed: 11/22/2022]
Abstract
For use in autonomous micro air vehicles, visual sensors must not only be small, lightweight and insensitive to light variations; on-board autopilots also require fast and accurate optical flow measurements over a wide range of speeds. Using an auto-adaptive bio-inspired Michaelis–Menten Auto-adaptive Pixel (M2APix) analog silicon retina, in this article, we present comparative tests of two optical flow calculation algorithms operating under lighting conditions from 6×10⁻⁷ to 1.6×10⁻² W·cm⁻² (i.e., from 0.2 to 12,000 lux for human vision). Contrast "time of travel" between two adjacent light-sensitive pixels was determined by thresholding and by cross-correlating the two pixels' signals, with a measurement frequency up to 5 kHz for the 10 local motion sensors of the M2APix sensor. While both algorithms adequately measured optical flow between 25°/s and 1000°/s, thresholding gave rise to a lower precision, especially due to a larger number of outliers at higher speeds. Compared to thresholding, cross-correlation also allowed for a higher rate of optical flow output (99 Hz and 1195 Hz for thresholding and cross-correlation, respectively) but required substantially more computational resources.
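The "time of travel" principle above can be sketched as follows: the same moving contrast is seen by two neighbouring pixels separated by a known inter-pixel angle, and the optic flow is that angle divided by the delay between the two signals. Here the delay is recovered by cross-correlating two synthetic, noisy pixel signals; the pixel geometry, sample rate and noise level are invented for the example and do not reproduce the M2APix hardware.

```python
import math, random

def time_of_travel_cc(sig_a, sig_b, fs, max_lag):
    """Return the delay (s) of sig_b relative to sig_a via cross-correlation."""
    best_lag, best_score = 0, float("-inf")
    n = len(sig_a)
    for lag in range(max_lag + 1):
        score = sum(sig_a[i] * sig_b[i + lag] for i in range(n - max_lag))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag / fs

fs = 2000.0                        # sample rate (Hz), illustrative
true_delay = 0.010                 # 10 ms travel time between the two pixels
delta_phi_deg = 4.0                # inter-pixel angle (deg), illustrative

random.seed(0)
t = [i / fs for i in range(400)]
pulse = lambda x: math.exp(-((x - 0.05) ** 2) / (2 * 0.005 ** 2))  # passing edge
sig_a = [pulse(x) + random.gauss(0, 0.01) for x in t]
sig_b = [pulse(x - true_delay) + random.gauss(0, 0.01) for x in t]

delay = time_of_travel_cc(sig_a, sig_b, fs, max_lag=60)
optic_flow = delta_phi_deg / delay          # deg/s
print(f"estimated delay {delay*1e3:.1f} ms -> optic flow {optic_flow:.0f} deg/s")
```

Thresholding would instead time-stamp the moment each signal crosses a fixed level; correlating the full waveforms uses all samples, which is what buys the extra precision at extra computational cost.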
Affiliation(s)
- Erik Vanhoutte
- Aix-Marseille Université, CNRS, ISM UMR7287, 13288 Marseille Cedex 09, France.
- Stefano Mafrica
- Aix-Marseille Université, CNRS, ISM UMR7287, 13288 Marseille Cedex 09, France.
- Franck Ruffier
- Aix-Marseille Université, CNRS, ISM UMR7287, 13288 Marseille Cedex 09, France.
- Reinoud J Bootsma
- Aix-Marseille Université, CNRS, ISM UMR7287, 13288 Marseille Cedex 09, France.
- Julien Serres
- Aix-Marseille Université, CNRS, ISM UMR7287, 13288 Marseille Cedex 09, France.
20
Campos ISG, Nascimento ER, Freitas GM, Chaimowicz L. A Height Estimation Approach for Terrain Following Flights from Monocular Vision. SENSORS 2016; 16:s16122071. [PMID: 27929424 PMCID: PMC5191052 DOI: 10.3390/s16122071] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 10/05/2016] [Revised: 11/22/2016] [Accepted: 11/29/2016] [Indexed: 11/16/2022]
Abstract
In this paper, we present a monocular vision-based height estimation algorithm for terrain-following flights. The impressive growth of Unmanned Aerial Vehicle (UAV) usage, notably in mapping applications, will soon require new technologies that enable these systems to better perceive their surroundings. Specifically, we chose to tackle the terrain-following problem, as it remains unresolved for consumer-available systems. Virtually every mapping aircraft carries a camera; we therefore exploit this presently available hardware to extract the height information needed for terrain-following flights. The proposed methodology uses optical flow to track features in videos obtained by the UAV, combined with the vehicle's motion information, to estimate the flying height. To determine whether a height estimate is reliable, we trained a decision tree that takes the optical flow information as input and classifies the output as trustworthy or not. The classifier achieved accuracies of 80% for positives and 90% for negatives, while the height estimation algorithm showed good accuracy.
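The geometric core of such an approach, in miniature: for a downward-looking camera translating at known ground speed v, the translational optic flow of ground features is ω = v/h, so height is recoverable as h = v/ω. The reliability gate below is a crude stand-in for the paper's decision tree, in spirit only; the threshold and flow values are invented.

```python
def estimate_height(v, flow_samples, max_spread=0.3):
    """Estimate height from ground speed and optic-flow magnitudes (rad/s)."""
    mean_flow = sum(flow_samples) / len(flow_samples)
    spread = max(flow_samples) - min(flow_samples)
    # heuristic gate, not the paper's trained classifier: reject the estimate
    # when the tracked flow vectors disagree too much among themselves
    reliable = spread <= max_spread * mean_flow
    height = v / mean_flow if mean_flow > 0 else float("inf")
    return height, reliable

# UAV at 12 m/s; consistent flow ~0.8 rad/s gives ~15 m altitude, flagged reliable.
h, ok = estimate_height(12.0, [0.78, 0.80, 0.82, 0.79])
print(f"height {h:.1f} m, reliable={ok}")

# Inconsistent flow (e.g. failed tracks or moving shadows) is flagged unreliable.
h2, ok2 = estimate_height(12.0, [0.2, 0.9, 1.6])
print(f"height {h2:.1f} m, reliable={ok2}")
```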
Affiliation(s)
- Igor S G Campos
- Department of Computer Science, Federal University of Minas Gerais, Belo Horizonte 31270-901, Brazil.
- Erickson R Nascimento
- Department of Computer Science, Federal University of Minas Gerais, Belo Horizonte 31270-901, Brazil.
- Gustavo M Freitas
- Department of Automation and Process Integration, Vale Institute of Technology, Ouro Preto 35400-000, Brazil.
- Luiz Chaimowicz
- Department of Computer Science, Federal University of Minas Gerais, Belo Horizonte 31270-901, Brazil.
21
Mafrica S, Servel A, Ruffier F. Minimalistic optic flow sensors applied to indoor and outdoor visual guidance and odometry on a car-like robot. BIOINSPIRATION & BIOMIMETICS 2016; 11:066007. [PMID: 27831937 DOI: 10.1088/1748-3190/11/6/066007] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/06/2023]
Abstract
Here we present a novel bio-inspired optic flow (OF) sensor and its application to visual guidance and odometry on a low-cost car-like robot called BioCarBot. The minimalistic OF sensor was robust to high-dynamic-range lighting conditions and to the various visual patterns encountered, thanks to its M2APix auto-adaptive pixels and the new cross-correlation OF algorithm implemented. The low-cost car-like robot estimated its velocity and steering angle, and therefore its position and orientation, via an extended Kalman filter (EKF) using only two downward-facing OF sensors and the Ackermann steering model. Indoor and outdoor experiments were carried out in which the robot was driven in closed loop based on the velocity and steering-angle estimates. The experimental results show that our novel OF sensor can deliver high-frequency measurements ([Formula: see text]) in a wide OF range (1.5-[Formula: see text]) and in a 7-decade high-dynamic light-level range. The OF resolution was constant and could be adjusted as required (up to [Formula: see text]), and the OF precision obtained was relatively high (standard deviation of [Formula: see text] with an average OF of [Formula: see text], under the most demanding lighting conditions). An EKF-based algorithm gave the robot's position and orientation with relatively high accuracy (maximum errors outdoors at a very low light level: [Formula: see text] and [Formula: see text] over about [Formula: see text] and [Formula: see text]) despite the low-resolution control systems of the steering servo and the DC motor, as well as a simplified model identification and calibration. Finally, the minimalistic OF-based odometry results were compared to those obtained using an inertial measurement unit (IMU) and a motor speed sensor.
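The kinematic core of such an odometry scheme can be sketched as a bicycle/Ackermann model that propagates pose from velocity and steering-angle estimates (which the robot derives from its two downward-facing OF sensors; here they are supplied directly). The full system wraps this prediction step in an EKF; the wheelbase and drive inputs below are invented for illustration.

```python
import math

def ackermann_step(x, y, theta, v, steer, wheelbase, dt):
    """One dead-reckoning update of the planar pose (x, y, heading)."""
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += (v / wheelbase) * math.tan(steer) * dt   # Ackermann turn rate
    return x, y, theta

x = y = theta = 0.0
wheelbase, dt = 0.25, 0.01
# drive straight for 1 s, then hold a constant left steering angle for 1 s
for _ in range(100):
    x, y, theta = ackermann_step(x, y, theta, v=0.5, steer=0.0,
                                 wheelbase=wheelbase, dt=dt)
for _ in range(100):
    x, y, theta = ackermann_step(x, y, theta, v=0.5, steer=0.3,
                                 wheelbase=wheelbase, dt=dt)
print(f"pose: x={x:.2f} m, y={y:.2f} m, heading={math.degrees(theta):.1f} deg")
```

In the EKF the same equations serve as the process model, with the OF-derived velocity and steering angle entering as measurements that correct the predicted pose.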
Affiliation(s)
- Stefano Mafrica
- PSA Peugeot Citroën, 78140 Vélizy-Villacoublay, France. Aix-Marseille Univ., CNRS, ISM, Inst. Movement Sci., Marseille, France
22
Bio-Inspired Principles Applied to the Guidance, Navigation and Control of UAS. AEROSPACE 2016. [DOI: 10.3390/aerospace3030021] [Citation(s) in RCA: 14] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/17/2022]
23
Pericet-Camara R, Dobrzynski MK, Juston R, Viollet S, Leitel R, Mallot HA, Floreano D. An artificial elementary eye with optic flow detection and compositional properties. J R Soc Interface 2016. [PMID: 26202684 DOI: 10.1098/rsif.2015.0414] [Citation(s) in RCA: 13] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/12/2022] Open
Abstract
We describe a 2 mg artificial elementary eye whose structure and functionality is inspired by compound eye ommatidia. Its optical sensitivity and electronic architecture are sufficient to generate the required signals for the measurement of local optic flow vectors in multiple directions. Multiple elementary eyes can be assembled to create a compound vision system of desired shape and curvature spanning large fields of view. The system configurability is validated with the fabrication of a flexible linear array of artificial elementary eyes capable of extracting optic flow over multiple visual directions.
Affiliation(s)
- Ramon Pericet-Camara
- Laboratory of Intelligent Systems, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
- Michal K Dobrzynski
- Laboratory of Intelligent Systems, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
- Raphaël Juston
- Aix-Marseille Université, CNRS, ISM UMR7287, 13288, Marseille CEDEX 09, France
- Stéphane Viollet
- Aix-Marseille Université, CNRS, ISM UMR7287, 13288, Marseille CEDEX 09, France
- Robert Leitel
- Fraunhofer Institute for Applied Optics and Precision Engineering, Jena, Germany
- Hanspeter A Mallot
- Laboratory of Cognitive Systems, Department of Biology, University of Tübingen, Tübingen, Germany
- Dario Floreano
- Laboratory of Intelligent Systems, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
24
Shyy W, Kang CK, Chirarattananon P, Ravi S, Liu H. Aerodynamics, sensing and control of insect-scale flapping-wing flight. Proc Math Phys Eng Sci 2016; 472:20150712. [PMID: 27118897 PMCID: PMC4841661 DOI: 10.1098/rspa.2015.0712] [Citation(s) in RCA: 37] [Impact Index Per Article: 4.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/15/2015] [Accepted: 01/04/2016] [Indexed: 11/12/2022] Open
Abstract
There are nearly a million known species of flying insects and 13,000 species of flying warm-blooded vertebrates, including mammals, birds and bats. While in flight, their wings not only move forward relative to the air but also flap up and down, plunge and sweep, so that both lift and thrust can be generated and balanced, uncertain surrounding environments can be accommodated, and superior flight stability and dynamics can be maintained over highly varied speeds and missions. As the size of a flyer is reduced, the wing-to-body mass ratio tends to decrease as well. Furthermore, these flyers use an integrated system consisting of wings to generate aerodynamic forces, muscles to move the wings, and sensing and control systems to guide and manoeuvre them. In this article, recent advances in insect-scale flapping-wing aerodynamics, flexible wing structures, unsteady flight environments, sensing, stability and control are reviewed, with perspectives offered. In particular, the special features of low-Reynolds-number flyers are highlighted: small sizes, thin and light structures, slow flight at speeds comparable to wind gusts, bioinspired fabrication of wing structures, neuron-based sensing, and adaptive control.
Affiliation(s)
- Wei Shyy
- Department of Mechanical and Aerospace Engineering, Hong Kong University of Science and Technology, Clear Water Bay, Hong Kong
- Chang-kwon Kang
- Department of Mechanical and Aerospace Engineering, University of Alabama in Huntsville, Huntsville, AL, USA
- Pakpong Chirarattananon
- Department of Mechanical and Biomedical Engineering, City University of Hong Kong, Kowloon Tong, Hong Kong
- Sridhar Ravi
- Graduate School of Engineering, Chiba University, Chiba, Japan
- School of Aerospace, Mechanical and Manufacturing Engineering, RMIT University, Melbourne, Victoria, Australia
- Hao Liu
- School of Aerospace, Mechanical and Manufacturing Engineering, RMIT University, Melbourne, Victoria, Australia
- Shanghai Jiao Tong University and Chiba University International Cooperative Research Centre (SJTU-CU ICRC), Minhang, Shanghai, China
25
de Croon GCHE. Monocular distance estimation with optical flow maneuvers and efference copies: a stability-based strategy. BIOINSPIRATION & BIOMIMETICS 2016; 11:016004. [PMID: 26740501 DOI: 10.1088/1748-3190/11/1/016004] [Citation(s) in RCA: 12] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/05/2023]
Abstract
The visual cue of optical flow plays an important role in the navigation of flying insects, and is increasingly studied for use by small flying robots as well. A major problem is that successful optical flow control seems to require distance estimates, while optical flow is known to provide only the ratio of velocity to distance. In this article, a novel, stability-based strategy is proposed for monocular distance estimation, relying on optical flow maneuvers and knowledge of the control inputs (efference copies). It is shown analytically that given a fixed control gain, the stability of a constant divergence control loop only depends on the distance to the approached surface. At close distances, the control loop starts to exhibit self-induced oscillations. The robot can detect these oscillations and hence be aware of the distance to the surface. The proposed stability-based strategy for estimating distances has two main attractive characteristics. First, self-induced oscillations can be detected robustly by the robot and are hardly influenced by wind. Second, the distance can be estimated during a zero divergence maneuver, i.e., around hover. The stability-based strategy is implemented and tested both in simulation and on board a Parrot AR drone 2.0. It is shown that the strategy can be used to: (1) trigger a final approach response during a constant divergence landing with fixed gain, (2) estimate the distance in hover, and (3) estimate distances during an entire landing if the robot uses adaptive gain control to continuously stay on the 'edge of oscillation.'
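A minimal sketch of the constant-divergence landing loop discussed above (illustrative gains, not the article's controller): thrust is adjusted so the measured divergence D = v/h stays at a set-point, which makes height decay roughly exponentially. The article's key observation, that a fixed gain becomes oscillatory below some height, which the robot can exploit to infer distance, is noted in the stability comment below rather than simulated.

```python
def constant_divergence_landing(h0=10.0, v0=0.0, D_set=0.5, gain=4.0,
                                dt=0.005, t_end=8.0, g=9.81):
    """Land by holding the divergence v/h at D_set via thrust corrections."""
    h, v = h0, v0                        # height (m), descent speed (m/s, down +)
    for _ in range(int(t_end / dt)):
        D = v / max(h, 1e-6)             # observed divergence (1/s)
        # fixed-gain loop about hover thrust; note the effective loop gain
        # scales as gain/h, so a controller that is well damped far from the
        # surface grows oscillatory close to it (the article's distance cue)
        a_cmd = g + gain * (D - D_set)
        v += (g - a_cmd) * dt
        h -= v * dt
        if h <= 0.05:                    # stop just above touchdown
            break
    return h, v

h, v = constant_divergence_landing()
print(f"height {h:.2f} m, descent speed {v:.2f} m/s at end of maneuver")
```

Because v settles near D_set·h, both height and speed shrink together toward touchdown, which is why detecting the onset of oscillation at a known gain reveals the remaining distance.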
Affiliation(s)
- Guido C H E de Croon
- Micro Air Vehicle Laboratory, Control and Simulation, Faculty of Aerospace Engineering, TU Delft, the Netherlands