1. Treidel LA, Deem KD, Salcedo MK, Dickinson MH, Bruce HS, Darveau CA, Dickerson BH, Ellers O, Glass JR, Gordon CM, Harrison JF, Hedrick TL, Johnson MG, Lebenzon JE, Marden JH, Niitepõld K, Sane SP, Sponberg S, Talal S, Williams CM, Wold ES. Insect Flight: State of the Field and Future Directions. Integr Comp Biol 2024; 64:icae106. PMID: 38982327; PMCID: PMC11406162; DOI: 10.1093/icb/icae106.
Abstract
The evolution of flight in an early winged insect ancestral lineage is recognized as a key adaptation explaining the unparalleled success and diversification of insects. Subsequent transitions and modifications to flight machinery, including secondary reductions and losses, also play a central role in shaping the impacts of insects on broadscale geographic and ecological processes and patterns in the present and future. Given the importance of insect flight, there has been a centuries-long history of research and debate on the evolutionary origins and biological mechanisms of flight. Here, we revisit this history from an interdisciplinary perspective, discussing recent discoveries regarding the developmental origins, physiology, biomechanics, and neurobiology and sensory control of flight in a diverse set of insect models. We also identify major outstanding questions yet to be addressed and provide recommendations for overcoming current methodological challenges faced when studying insect flight, which will allow the field to continue to move forward in new and exciting directions. By integrating mechanistic work into ecological and evolutionary contexts, we hope that this synthesis promotes and stimulates new interdisciplinary research efforts necessary to close the many existing gaps about the causes and consequences of insect flight evolution.
Affiliation(s)
- Lisa A Treidel
- School of Biological Sciences, University of Nebraska-Lincoln, Lincoln, NE 68588, USA
- Kevin D Deem
- Department of Biology, University of Rochester, Rochester, NY 14627, USA
- Mary K Salcedo
- Department of Biological and Environmental Engineering, Cornell University, Ithaca, NY 14853, USA
- Michael H Dickinson
- Department of Bioengineering, California Institute of Technology, Pasadena, CA 91125, USA
- Charles-A Darveau
- Department of Biology, University of Ottawa, Ottawa, Ontario K1N 6N5, Canada
- Bradley H Dickerson
- Princeton Neuroscience Institute, Princeton University, Princeton, NJ 08544, USA
- Olaf Ellers
- Biology Department, Bowdoin College, Brunswick, ME 04011, USA
- Jordan R Glass
- Department of Zoology & Physiology, University of Wyoming, Laramie, WY 82070, USA
- Caleb M Gordon
- Department of Earth and Planetary Sciences, Yale University, New Haven, CT 06520-8109, USA
- Jon F Harrison
- School of Life Sciences, Arizona State University, Tempe, AZ 85287-4501, USA
- Tyson L Hedrick
- Department of Biology, University of North Carolina at Chapel Hill, Chapel Hill, NC 27599, USA
- Meredith G Johnson
- School of Life Sciences, Arizona State University, Tempe, AZ 85287-4501, USA
- Jacqueline E Lebenzon
- Department of Integrative Biology, University of California, Berkeley, Berkeley, CA 94720, USA
- James H Marden
- Department of Biology, Pennsylvania State University, University Park, PA 16803, USA
- Sanjay P Sane
- National Center for Biological Sciences, Tata Institute of Fundamental Research, Bangalore 560065, India
- Simon Sponberg
- School of Biological Sciences, Georgia Institute of Technology, Atlanta, GA 30332, USA
- Stav Talal
- School of Life Sciences, Arizona State University, Tempe, AZ 85287-4501, USA
- Caroline M Williams
- Department of Integrative Biology, University of California, Berkeley, Berkeley, CA 94720, USA
- Ethan S Wold
- School of Biological Sciences, Georgia Institute of Technology, Atlanta, GA 30332, USA
2. Bae B, Lee D, Park M, Mu Y, Baek Y, Sim I, Shen C, Lee K. Stereoscopic artificial compound eyes for spatiotemporal perception in three-dimensional space. Sci Robot 2024; 9:eadl3606. PMID: 38748779; DOI: 10.1126/scirobotics.adl3606.
Abstract
Arthropod eyes are effective biological vision systems for object tracking over a wide field of view because of their structural uniqueness; however, unlike mammalian eyes, they can hardly acquire depth information about a static object because they provide only monocular cues. Most arthropods therefore rely on motion parallax to track objects in three-dimensional (3D) space. Uniquely, the praying mantis (Mantodea) combines compound eyes with a form of stereopsis and is capable of object recognition in 3D space. Here, by mimicking the vision system of the praying mantis using stereoscopically coupled artificial compound eyes, we demonstrated spatiotemporal object sensing and tracking in 3D space with a wide field of view. Furthermore, to achieve a fast response with minimal latency, data storage/transportation, and power consumption, we processed the visual information at the edge of the system using a synaptic device and a federated split learning algorithm. The designed and fabricated stereoscopic artificial compound eye provides energy-efficient and accurate spatiotemporal object sensing and optical flow tracking, exhibiting a root mean square error of 0.3 centimeters while consuming only approximately 4 millijoules for sensing and tracking, more than 400 times less than conventional complementary metal-oxide semiconductor-based imaging systems. Our biomimetic imager shows the potential of integrating nature's unique design using hardware-software codesigned technology toward edge computing and sensing capabilities.
Affiliation(s)
- Byungjoon Bae
- Department of Electrical and Computer Engineering, University of Virginia, Charlottesville, VA 22904, USA
- Doeon Lee
- Department of Electrical and Computer Engineering, University of Virginia, Charlottesville, VA 22904, USA
- Minseong Park
- Department of Electrical and Computer Engineering, University of Virginia, Charlottesville, VA 22904, USA
- Yujia Mu
- Department of Electrical and Computer Engineering, University of Virginia, Charlottesville, VA 22904, USA
- Yongmin Baek
- Department of Electrical and Computer Engineering, University of Virginia, Charlottesville, VA 22904, USA
- Inbo Sim
- Department of Electrical and Computer Engineering, University of Virginia, Charlottesville, VA 22904, USA
- Cong Shen
- Department of Electrical and Computer Engineering, University of Virginia, Charlottesville, VA 22904, USA
- Kyusang Lee
- Department of Electrical and Computer Engineering, University of Virginia, Charlottesville, VA 22904, USA
- Department of Material Science and Engineering, University of Virginia, Charlottesville, VA 22904, USA
3. Goyal P, Baird E, Srinivasan MV, Muijres FT. Visual guidance of honeybees approaching a vertical landing surface. J Exp Biol 2023; 226:jeb245956. PMID: 37589414; PMCID: PMC10482386; DOI: 10.1242/jeb.245956.
Abstract
Landing is a critical phase for flying animals, many of which rely on visual cues to perform a controlled touchdown. Foraging honeybees rely on regular landings on flowers to collect food crucial for colony survival and reproduction. Here, we explored how honeybees utilize optical expansion cues to regulate approach flight speed when landing on vertical surfaces. Three sensory-motor control models have been proposed for landings of natural flyers. Landing honeybees maintain a constant optical expansion rate set-point, resulting in a gradual decrease in approach velocity and a gentle touchdown. Bumblebees exhibit a similar strategy, but they regularly switch to a new constant optical expansion rate set-point. In contrast, landing birds fly at a constant time to contact to achieve faster landings. Here, we re-examined the landing strategy of honeybees by fitting the three models to individual approach flights of honeybees landing on platforms with varying optical expansion cues. Surprisingly, the landing model identified in bumblebees proved to be the most suitable for these honeybees. This reveals that honeybees adjust their optical expansion rate in a stepwise manner: bees flying at low optical expansion rates tend to increase their set-point stepwise, while those flying at high optical expansion rates tend to decrease it stepwise. This modular landing control system enables honeybees to land rapidly and reliably under a wide range of initial flight conditions and visual landing platform patterns. The remarkable similarity between the landing strategies of honeybees and bumblebees suggests that this strategy may also be prevalent among other flying insects. Furthermore, these findings hold promising potential for bioinspired guidance systems in flying robots.
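The expansion-rate strategy described in this abstract can be illustrated with a toy simulation (this is not the authors' fitted model; the set-point values, the switch condition, and the time step are invented for the demo). Holding the relative expansion rate r = v/d constant makes approach speed decay in proportion to distance, guaranteeing a slow touchdown; a stepwise set-point increase near the surface speeds up the final approach.

```python
# Illustrative sketch of landing by regulating the relative rate of
# optical expansion r = v/d, with one stepwise set-point switch.
# All numeric values are arbitrary assumptions for the demo.

def simulate_landing(d0=2.0, r_star=1.0, step_gain=1.5, switch_d=0.5,
                     dt=0.001, d_touch=0.01):
    """Approach a surface at distance d by commanding v = r* * d.

    Holding r = v/d constant makes v decay exponentially with d
    (gentle touchdown); a stepwise set-point increase at switch_d
    mimics the modular strategy described in the abstract."""
    d, t, trace = d0, 0.0, []
    r = r_star
    while d > d_touch:
        if d < switch_d and r == r_star:   # one stepwise set-point switch
            r = step_gain * r_star
        v = r * d                          # commanded approach speed
        d -= v * dt
        t += dt
        trace.append((t, d, v))
    return trace

trace = simulate_landing()
# The approach speed falls continuously with distance, so the bee
# arrives at the surface far slower than it started.
```

In the paper the set-point switches are identified statistically from recorded flight tracks; this sketch only shows why the strategy yields both a fast overall approach and a slow contact speed.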
Affiliation(s)
- Pulkit Goyal
- Experimental Zoology Group, Wageningen University & Research, 6708WD Wageningen, The Netherlands
- Emily Baird
- Department of Zoology, Stockholm University, 114 18 Stockholm, Sweden
- Mandyam V. Srinivasan
- Queensland Brain Institute, University of Queensland, St. Lucia, QLD 4072, Australia
- Florian T. Muijres
- Experimental Zoology Group, Wageningen University & Research, 6708WD Wageningen, The Netherlands
4. Fuller S, Yu Z, Talwekar YP. A gyroscope-free visual-inertial flight control and wind sensing system for 10-mg robots. Sci Robot 2022; 7:eabq8184. DOI: 10.1126/scirobotics.abq8184.
Abstract
Tiny "gnat robots," weighing just a few milligrams, were first conjectured in the 1980s, but how to stabilize one if it were to hover like a small insect has remained unanswered. The challenge is that sensors must be both low-mass and high-bandwidth; silicon-micromachined rate gyroscopes, for instance, are too heavy, and the smallest robot to perform controlled hovering to date uses a sensor suite weighing hundreds of milligrams. Here, we demonstrate that an accelerometer represents perhaps the most direct way to stabilize flight while satisfying the extreme size, speed, weight, and power constraints of a flying robot, even as it scales down to just a few milligrams. As aircraft scale is reduced, scaling physics dictates that the ratio of aerodynamic drag to mass increases, which reduces the noise in an accelerometer's airspeed measurement. We show through simulation and experiment on a 30-gram robot that a 2-milligram off-the-shelf accelerometer is able in principle to stabilize a 10-milligram robot despite high noise in the sensor itself. Inspired by wind-vision sensory fusion in the flight controller of the fruit fly Drosophila melanogaster, we then added a tiny camera and efficient, fly-inspired autocorrelation-based visual processing to allow the robot to estimate and reject wind as well as control its attitude and flight velocity using a Kalman filter. Our biology-inspired approach, validated on a small flying helicopter, has a wind gust response comparable to that of the fruit fly and is small and efficient enough for a 10-milligram flying vehicle (weighing less than a grain of rice).
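The drag-based velocity cue that makes the accelerometer useful at this scale can be sketched numerically. This is an illustrative toy, not the authors' controller: the drag-to-mass ratio `b_over_m`, the noise levels, and the filter tuning are all invented. Near hover, lateral drag produces a specific force of roughly -(b/m)·v, so the accelerometer reading doubles as a scaled airspeed measurement that a scalar Kalman filter can fuse with a vision-based velocity measurement.

```python
# Toy scalar Kalman filter fusing an accelerometer-derived airspeed cue
# with a vision-based ground-velocity cue (illustrative assumptions only).
import random

def kalman_fuse(v_true=0.8, b_over_m=5.0, n_steps=500,
                accel_noise=2.0, vision_noise=0.1, seed=0):
    """Estimate a (constant) flight velocity from two noisy cues."""
    random.seed(seed)
    x, p = 0.0, 1.0          # velocity estimate and its variance
    q = 1e-6                 # small process noise: velocity ~ constant
    for _ in range(n_steps):
        p += q
        # Accelerometer: specific force f = -(b/m)*v + noise, so the
        # reading converts to a velocity measurement whose noise is
        # divided by b/m -- the benefit of a large drag-to-mass ratio
        # at small scale.
        z_acc = v_true + random.gauss(0.0, accel_noise) / b_over_m
        k = p / (p + (accel_noise / b_over_m) ** 2)
        x, p = x + k * (z_acc - x), (1 - k) * p
        # Camera: direct but independently noisy velocity measurement.
        z_vis = v_true + random.gauss(0.0, vision_noise)
        k = p / (p + vision_noise ** 2)
        x, p = x + k * (z_vis - x), (1 - k) * p
    return x

v_hat = kalman_fuse()   # converges close to the true velocity of 0.8
```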
Affiliation(s)
- Sawyer Fuller
- Department of Mechanical Engineering, University of Washington, Seattle, WA, USA
- Paul G. Allen School of Computer Science, Seattle, WA, USA
- Zhitao Yu
- Department of Mechanical Engineering, University of Washington, Seattle, WA, USA
- Yash P. Talwekar
- Department of Mechanical Engineering, University of Washington, Seattle, WA, USA
5. Berger Dauxère A, Montagne G, Serres JR. An experimental setup for decoupling optical invariants in honeybees' altitude control. J Insect Physiol 2022; 143:104451. PMID: 36374736; DOI: 10.1016/j.jinsphys.2022.104451.
Abstract
Bees outperform pilots in navigational tasks, despite having 100,000 times fewer neurons. It is commonly accepted in the literature that optic flow is a key parameter used by flying insects to control their altitude. The ambition of the present work was to design an innovative experimental setup that would make it possible to determine whether bees can rely simultaneously on several optical invariants, as pilots do. We designed a flight tunnel that enables manipulation of an optical invariant, the splay angle rate of change (SARC), and restriction of the optical speed rate of change (OSRC) in the optic flow. It allows us to determine whether bees use the SARC to control their altitude and to identify the integration process combining these two optical invariants. Access to the OSRC can be restricted by using different textures, and the SARC can be biased by means of motorized rods. This device allows bees' trajectories to be recorded in different visual configurations, including impoverished conditions and conditions containing contradictory information. The comparative analysis of the recorded trajectories provides the first evidence of SARC use in a ground-following task by a non-human animal. This new tunnel allows precise experimental control of the visual environment under ecological experimental conditions and could therefore pave the way for a new type of ecologically based studies examining the simultaneous use of several information sources for navigation by flying insects.
6. Jansen W, Laurijssen D, Steckel J. Real-Time Sonar Fusion for Layered Navigation Controller. Sensors 2022; 22:3109. PMID: 35590798; PMCID: PMC9102793; DOI: 10.3390/s22093109.
Abstract
Navigation in varied and dynamic indoor environments remains a complex task for autonomous mobile platforms, especially when conditions worsen and typical sensor modalities fail to operate optimally, providing unsuitable input for safe navigation control. In this study, we present an approach for navigating a dynamic indoor environment with a mobile platform carrying one or several sonar sensors, using a layered control system. These sensors can operate in conditions such as rain, fog, dust, or dirt. The different control layers, such as collision avoidance and corridor-following behavior, are activated based on acoustic flow cues in the fusion of the sonar images. The novelty of this work lies in allowing these sensors to be freely positioned on the mobile platform and in providing a framework for designing the optimal navigational outcome based on a zoning system around the mobile platform. We present the acoustic flow model used, as well as the design of the layered controller. Next to validation in simulation, an implementation is presented and validated in a real office environment using a real mobile platform with one, two, or three sonar sensors running in real time with 2D navigation. Multiple sensor layouts were validated in both simulation and real experiments to demonstrate that the modular approach to the controller and sensor fusion works as intended. The results show stable and safe navigation of indoor environments with dynamic objects.
Affiliation(s)
- Wouter Jansen
- Cosys-Lab, Faculty of Applied Engineering, University of Antwerp, 2020 Antwerpen, Belgium
- Flanders Make Strategic Research Centre, 3920 Lommel, Belgium
- Dennis Laurijssen
- Cosys-Lab, Faculty of Applied Engineering, University of Antwerp, 2020 Antwerpen, Belgium
- Flanders Make Strategic Research Centre, 3920 Lommel, Belgium
- Jan Steckel
- Cosys-Lab, Faculty of Applied Engineering, University of Antwerp, 2020 Antwerpen, Belgium
- Flanders Make Strategic Research Centre, 3920 Lommel, Belgium
7. Serres JR, Morice AHP, Blary C, Miot R, Montagne G, Ruffier F. Floor and ceiling mirror configurations to study altitude control in honeybees. Biol Lett 2022; 18:20210534. PMID: 35317623; PMCID: PMC8941389; DOI: 10.1098/rsbl.2021.0534.
Abstract
To investigate altitude control in honeybees, an optical configuration was designed to manipulate or cancel the optic flow. It has been widely accepted that honeybees rely on the optic flow generated by the ground to control their altitude. Here, we create an optical configuration enabling a better understanding of the mechanism of altitude control in honeybees. This optical configuration aims to mimic some of the conditions that honeybees experience over a natural water body. An optical manipulation, based on a pair of opposed horizontal mirrors, was designed to remove any visual information coming from the floor and ceiling. Such an optical manipulation allowed us to get closer to the seminal experiment of Heran & Lindauer (1963, Zeitschrift für vergleichende Physiologie 47, 39-55; doi:10.1007/BF00342890). Our results confirmed that a reduction or an absence of ventral optic flow in honeybees leads to a loss in altitude, and eventually a collision with the floor.
Affiliation(s)
- Constance Blary
- Aix Marseille Univ, CNRS, ISM, Marseille, France
- CEFE UMR 5175, CNRS, Université de Montpellier, Université Paul-Valéry Montpellier, EPHE, 1919 route de Mende, 34293 Montpellier cedex 5, France
- Romain Miot
- Aix Marseille Univ, CNRS, ISM, Marseille, France
- XTIM SAS, 77 rue de Lyon, 13015 Marseille, France
8. Fabian ST, Sumner ME, Wardill TJ, Gonzalez-Bellido PT. Avoiding obstacles while intercepting a moving target: a miniature fly's solution. J Exp Biol 2022; 225:jeb243568. PMID: 35168251; PMCID: PMC8920034; DOI: 10.1242/jeb.243568.
Abstract
The miniature robber fly Holcocephala fusca intercepts its targets with behaviour that is well approximated by the proportional navigation guidance law. During predatory trials, we challenged the interception performance of H. fusca by placing a large object in its potential flight path. In response, H. fusca deviated from the path predicted by pure proportional navigation, but in many cases still eventually contacted the target. We show that such flight deviations can be explained as the output of two competing navigational systems: pure proportional navigation and a simple obstacle avoidance algorithm. Obstacle avoidance by H. fusca is here described by a simple feedback loop that uses the visual expansion of the approaching obstacle to mediate the magnitude of the turning-away response. We name the integration of this steering law with proportional navigation 'combined guidance'. The results demonstrate that predatory intent does not operate a monopoly on the fly's steering when attacking a target, and that simple guidance combinations can explain obstacle avoidance during interceptive tasks.
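The "combined guidance" idea — summing a proportional-navigation command with a looming-driven avoidance command — can be sketched in 2D. This is an illustrative toy, not the authors' fitted model: the gains, the looming approximation, and the test geometry are all assumptions.

```python
# 2D sketch of combined guidance: proportional navigation (turn rate
# proportional to line-of-sight rotation rate) plus a turning-away
# command scaled by the obstacle's angular expansion. Gains invented.
import math

def steer(pos, vel, target, obstacle, obs_radius,
          pn_gain=3.0, avoid_gain=4.0):
    """Return a turn-rate command for a pursuer at pos with velocity vel."""
    # Proportional navigation: line-of-sight rate to a (static) target.
    rx, ry = target[0] - pos[0], target[1] - pos[1]
    los_rate = (ry * vel[0] - rx * vel[1]) / max(rx * rx + ry * ry, 1e-9)
    # Looming cue: angular size theta ~ 2R/range, so its growth rate is
    # d(theta)/dt ~ 2R * closing_speed / range^2 (zero when receding).
    dx, dy = obstacle[0] - pos[0], obstacle[1] - pos[1]
    rng = math.hypot(dx, dy)
    closing = (dx * vel[0] + dy * vel[1]) / max(rng, 1e-9)
    expansion = max(0.0, 2 * obs_radius * closing / max(rng ** 2, 1e-9))
    # Turn away from the side the obstacle is on, scaled by its expansion.
    heading = math.atan2(vel[1], vel[0])
    bearing = math.remainder(math.atan2(dy, dx) - heading, math.tau)
    avoid = -math.copysign(avoid_gain * expansion, bearing)
    return pn_gain * los_rate + avoid

# Target dead ahead but an obstacle looming slightly left of the path:
# the avoidance term dominates and commands a turn away (negative).
cmd = steer(pos=(0, 0), vel=(1, 0), target=(10, 0),
            obstacle=(2, 0.1), obs_radius=0.5)
```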
Affiliation(s)
- Samuel T Fabian
- Department of Physiology, Development, and Neuroscience, University of Cambridge, Cambridge CB2 3EG, UK
- Department of Bioengineering, Imperial College London, London SW7 2AZ, UK
- Mary E Sumner
- Department of Ecology, Evolution and Behaviour, University of Minnesota, Saint Paul, MN 55108, USA
- Trevor J Wardill
- Department of Ecology, Evolution and Behaviour, University of Minnesota, Saint Paul, MN 55108, USA
9. Berger Dauxère A, Serres JR, Montagne G. Ecological Entomology: How Is Gibson's Framework Useful? Insects 2021; 12:1075. PMID: 34940163; PMCID: PMC8703479; DOI: 10.3390/insects12121075.
Abstract
To date, numerous studies have demonstrated the fundamental role played by optic flow in the control of goal-directed displacement tasks in insects. Optic flow was first introduced by Gibson as part of his ecological approach to perception and action. While this theoretical approach (as a whole) has been demonstrated to be particularly suitable for the study of goal-directed displacements in humans, its usefulness in carrying out entomological field studies remains to be established. In this review we aim to demonstrate that the ecological approach to perception and action could be relevant for the entomologist community in its future investigations. This approach could provide a conceptual and methodological framework for the community in order to: (i) take a critical look at the research carried out to date, (ii) develop rigorous and innovative experimental protocols, and (iii) define scientific issues that push the boundaries of the current scientific field. After a concise literature review about the perceptual control of displacement in insects, we present the framework proposed by Gibson and suggest its added value for carrying out research in the field of behavioral ecology in insects.
Affiliation(s)
- Aimie Berger Dauxère
- The Institute of Movement Sciences, Aix Marseille University, CNRS, ISM, CEDEX 07, 13284 Marseille, France
10. Motion cues from the background influence associative color learning of honey bees in a virtual-reality scenario. Sci Rep 2021; 11:21127. PMID: 34702914; PMCID: PMC8548521; DOI: 10.1038/s41598-021-00630-x.
Abstract
Honey bees exhibit remarkable visual learning capacities, which can be studied using virtual reality (VR) landscapes in laboratory conditions. Existing VR environments for bees are imperfect as they provide either open-loop conditions or 2D displays. Here we achieved a true 3D environment in which walking bees learned to discriminate a rewarded from a punished virtual stimulus based on color differences. We included ventral or frontal background cues, which were also subjected to 3D updating based on the bee movements. We thus studied if and how the presence of such motion cues affected visual discrimination in our VR landscape. Our results showed that the presence of frontal, and to a lesser extent, of ventral background motion cues impaired the bees' performance. Whenever these cues were suppressed, color discrimination learning became possible. We analyzed the specific contribution of foreground and background cues and discussed the role of attentional interference and differences in stimulus salience in the VR environment to account for these results. Overall, we show how background and target cues may interact at the perceptual level and influence associative learning in bees. In addition, we identify issues that may affect decision-making in VR landscapes, which require specific control by experimenters.
11. Saito K, Nagai H, Suto K, Ogawa N, Seong YA, Tachi T, Niiyama R, Kawahara Y. Insect wing 3D printing. Sci Rep 2021; 11:18631. PMID: 34650126; PMCID: PMC8516917; DOI: 10.1038/s41598-021-98242-y.
Abstract
Insects have acquired various types of wings over the course of their evolution and have become the most successful terrestrial animals. To clarify the basis of their excellent environmental adaptability and locomotive ability, a simple and versatile method to artificially reproduce the complex structures and varied functions of these innumerable types of wings is necessary. This study presents a simple integral forming method for an insect-wing-type composite structure, in which wing frames are 3D printed directly onto thin films. An artificial venation generation algorithm based on the centroidal Voronoi diagram, a pattern observable in the wings of dragonflies, was used to design the complex mechanical properties of the artificial wings. Furthermore, we implemented two representative functions found in actual insect wings: folding and coupling. The proposed crease pattern design software, developed based on a beetle hindwing, enables the 3D printing of foldable wings of any shape. In coupling-type wings, the forewing and hindwing are connected to form a single large wing during flight; these wings can be stored compactly by disconnecting and stacking them, as in cicada wings.
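A centroidal Voronoi diagram, the geometric basis of the venation-generation algorithm mentioned in this abstract, can be computed with Lloyd's algorithm. The sketch below uses a discrete grid approximation; the seed count, grid resolution, and iteration count are arbitrary choices, and the actual wing-design pipeline in the paper is considerably more involved (the converged cell boundaries would serve as candidate vein positions).

```python
# Lloyd's algorithm on a grid: repeatedly assign sample points to their
# nearest seed, then move each seed to its cell centroid. At convergence
# every seed sits at the centroid of its own Voronoi cell.
import random

def lloyd(n_seeds=8, grid=40, iters=50, seed=1):
    random.seed(seed)
    pts = [(random.random(), random.random()) for _ in range(n_seeds)]
    samples = [((i + 0.5) / grid, (j + 0.5) / grid)
               for i in range(grid) for j in range(grid)]
    for _ in range(iters):
        cells = [[] for _ in pts]
        for s in samples:   # assign each grid sample to its nearest seed
            k = min(range(len(pts)),
                    key=lambda m: (pts[m][0] - s[0]) ** 2
                                + (pts[m][1] - s[1]) ** 2)
            cells[k].append(s)
        # move each seed to the centroid of its Voronoi cell
        pts = [(sum(x for x, _ in c) / len(c), sum(y for _, y in c) / len(c))
               if c else p
               for c, p in zip(cells, pts)]
    return pts

seeds = lloyd()   # seeds end up evenly spread over the unit square
```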
Affiliation(s)
- Kazuya Saito
- Faculty of Design, Kyushu University, Fukuoka 815-8540, Japan
- Hiroto Nagai
- Graduate School of Engineering, Nagasaki University, Nagasaki 852-8521, Japan
- Kai Suto
- Graduate School of Arts and Sciences, The University of Tokyo, Tokyo 153-8902, Japan
- Nature Architects Inc., Tokyo 107-0052, Japan
- Naoki Ogawa
- Tokyo University of Agriculture, Kanagawa 243-0034, Japan
- Young Ah Seong
- Graduate School of Information Science and Technology, The University of Tokyo, Tokyo 113-8654, Japan
- Tomohiro Tachi
- Graduate School of Arts and Sciences, The University of Tokyo, Tokyo 153-8902, Japan
- Ryuma Niiyama
- Graduate School of Information Science and Technology, The University of Tokyo, Tokyo 113-8654, Japan
- Yoshihiro Kawahara
- Graduate School of Engineering, The University of Tokyo, Tokyo 113-8654, Japan
12. Bergantin L, Harbaoui N, Raharijaona T, Ruffier F. Oscillations make a self-scaled model for honeybees' visual odometer reliable regardless of flight trajectory. J R Soc Interface 2021; 18:20210567. PMID: 34493092; PMCID: PMC8424324; DOI: 10.1098/rsif.2021.0567.
Abstract
Honeybees foraging and recruiting nest-mates by performing the waggle dance need to be able to gauge the flight distance to the food source regardless of the wind and terrain conditions. Previous authors have hypothesized that the foragers’ visual odometer mathematically integrates the angular velocity of the ground image sweeping backward across their ventral viewfield, known as translational optic flow. The question arises as to how mathematical integration of optic flow (usually expressed in radians/s) can reliably encode distances, regardless of the height and speed of flight. The vertical self-oscillatory movements observed in honeybees trigger expansions and contractions of the optic flow vector field, yielding an additional visual cue called optic flow divergence. We have developed a self-scaled model for the visual odometer in which the translational optic flow is scaled by the visually estimated current clearance from the ground. In simulation, this model, which we have called SOFIa, was found to be reliable in a large range of flight trajectories, terrains and wind conditions. It reduced the statistical dispersion of the estimated flight distances approximately 10-fold in comparison with the mathematically integrated raw optic flow model. The SOFIa model can be directly implemented in robotic applications based on minimalistic visual equipment.
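The core idea of the self-scaled odometer — integrating optic flow only after scaling it by the current ground clearance — can be checked numerically. The sketch below is illustrative, not the SOFIa model itself: it uses the true clearance h where the model uses a visually estimated clearance derived from optic flow divergence during the bee's self-oscillations, and the altitude profile and parameters are invented.

```python
# Raw integration of translational optic flow (omega = v/h, in rad/s)
# depends on flight height, whereas scaling omega by the current ground
# clearance recovers the true distance travelled in metres.
import math

def odometers(v=2.0, h0=1.5, amp=0.5, T=10.0, dt=0.001):
    raw, scaled, t = 0.0, 0.0, 0.0
    while t < T:
        h = h0 + amp * math.sin(2 * math.pi * t / 2.0)  # oscillating clearance
        omega = v / h             # ventral translational optic flow (rad/s)
        raw += omega * dt         # raw integrated optic flow (rad)
        scaled += omega * h * dt  # optic flow scaled by clearance (m)
        t += dt
    return raw, scaled, v * T     # v*T is the true distance flown

raw, scaled, true_dist = odometers()
# 'scaled' matches the true distance; 'raw' is in radians and its value
# depends on the altitude profile, so it cannot encode distance by itself.
```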
Affiliation(s)
| | - Nesrine Harbaoui
- Aix-Marseille University, CNRS, ISM, Marseille, France.,CRIStAL Laboratory, CNRS UMR, 9189, University of Lille, 59650 Lille, France
| | - Thibaut Raharijaona
- Aix-Marseille University, CNRS, ISM, Marseille, France.,Université de Lorraine, Arts et Métiers Institute of Technology, LCFC, HESAM Université, 57070 Metz, France
| | | |
13. Lingenfelter B, Nag A, van Breugel F. Insect inspired vision-based velocity estimation through spatial pooling of optic flow during linear motion. Bioinspir Biomim 2021; 16. PMID: 34412040; PMCID: PMC10561965; DOI: 10.1088/1748-3190/ac1f7b.
Abstract
Insects rely on the perception of image motion, or optic flow, to estimate their velocity relative to nearby objects. This information provides important sensory input for avoiding obstacles. However, certain behaviors, such as estimating the absolute distance to a landing target, accurately measuring absolute distance traveled, and estimating the ambient wind speed require decoupling optic flow into its component parts: absolute ground velocity and distance to nearby objects. Behavioral experiments suggest that insects perform these calculations, but their mechanism for doing so remains unknown. Here we present a novel algorithm that combines the geometry of dynamic forward motion with known features of insect visual processing to provide a hypothesis for how insects might directly estimate absolute ground velocity from a combination of optic flow and acceleration information. Our robotics-inspired-biology approach reveals three critical requirements. First, absolute ground velocity can only be directly estimated from optic flow during times of active acceleration and deceleration. Second, spatial pooling of optic flow across a receptive field helps to alleviate the effects of noise and/or low-resolution visual systems. Third, averaging velocity estimates from multiple receptive fields further helps to reject noise. Our algorithm provides a hypothesis for how insects might estimate absolute velocity from vision during active maneuvers, and also provides a theoretical framework for designing fast analog circuitry for efficient state estimation that can be applied to insect-sized robots.
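The identity at the heart of this hypothesis can be sketched directly: for a feature at locally constant distance d, optic flow is ω = v/d, so during a known acceleration a the flow rate obeys dω/dt = a/d, giving v = a·ω/(dω/dt) with no knowledge of d. The sketch below is illustrative only (all parameters, the two-sample finite difference, and the pooling size are invented); it also shows why spatial pooling suppresses per-receptor noise.

```python
# Velocity from optic flow during a known acceleration, with spatial
# pooling across a simulated receptive field. Note that the estimate is
# only available while accelerating, matching the paper's first finding.
import random

def estimate_velocity(v=1.0, a=0.5, d=3.0, dt=0.5, noise=0.02,
                      n_pool=400, seed=0):
    random.seed(seed)
    def pooled_flow(speed):
        # spatial pooling: average many noisy per-receptor flow readings
        return sum(speed / d + random.gauss(0.0, noise)
                   for _ in range(n_pool)) / n_pool
    w0 = pooled_flow(v)            # pooled flow before accelerating
    w1 = pooled_flow(v + a * dt)   # pooled flow after accelerating for dt
    wdot = (w1 - w0) / dt          # finite-difference flow rate ~ a/d
    return a * w0 / wdot           # v = a * omega / omega_dot; d cancels

v_hat = estimate_velocity()   # close to the true initial velocity of 1.0
```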
Affiliation(s)
- Bryson Lingenfelter
- Department of Computer Science and Engineering, University of Nevada, Reno, United States of America
- Arunava Nag
- Department of Mechanical Engineering, University of Nevada, Reno, United States of America
- Floris van Breugel
- Department of Mechanical Engineering, University of Nevada, Reno, United States of America
Collapse
|
14
|
Murayama Y, Nakata T, Liu H. Flexible Flaps Inspired by Avian Feathers Can Enhance Aerodynamic Robustness in low Reynolds Number Airfoils. Front Bioeng Biotechnol 2021; 9:612182. [PMID: 34026737 PMCID: PMC8137910 DOI: 10.3389/fbioe.2021.612182] [Citation(s) in RCA: 6] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/30/2020] [Accepted: 04/15/2021] [Indexed: 11/13/2022] Open
Abstract
Unlike the rigid rotors of drones, bird wings are composed of flexible feathers that can passively deform while achieving remarkable aerodynamic robustness in response to wind gusts. Here, we experimentally investigate the effects of flexible flaps, inspired by the covert feathers of bird wings, on the aerodynamic characteristics of fixed wings in disturbed flow. Through force measurements and flow visualization in a low-speed wind tunnel, we find that the flexible flaps can suppress large-scale vortex shedding and hence reduce fluctuations of the aerodynamic forces in a disturbed flow behind an oscillating plate. Our results demonstrate that the stiffness of the flaps strongly affects aerodynamic performance, and that the force fluctuations are reduced when the flap deformation synchronizes with the strong vortex generation. These findings indicate that simply attaching flexible flaps to the upper surface of a wing is an effective method, providing a novel biomimetic design to improve the aerodynamic robustness of small-scale fixed-wing drones operating in unpredictable aerial environments.
Collapse
Affiliation(s)
- Yuta Murayama
- Graduate School of Science and Engineering, Chiba University, Chiba, Japan
| | | | - Hao Liu
- Graduate School of Engineering, Chiba University, Chiba, Japan
| |
Collapse
|
15
|
Wang H, Fu Q, Wang H, Baxter P, Peng J, Yue S. A bioinspired angular velocity decoding neural network model for visually guided flights. Neural Netw 2021; 136:180-193. [PMID: 33494035 DOI: 10.1016/j.neunet.2020.12.008] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/08/2020] [Revised: 12/03/2020] [Accepted: 12/07/2020] [Indexed: 11/17/2022]
Abstract
Efficient and robust motion perception systems are important prerequisites for achieving visually guided flight in future micro air vehicles. As a source of inspiration, the visual neural networks of flying insects such as the honeybee and Drosophila provide ideal examples on which to base artificial motion perception models. In this paper, we use this approach to develop a novel method that solves the fundamental problem of estimating angular velocity for visually guided flight. Compared with previous models, our elementary motion detector (EMD) based model uses a separate texture estimation pathway to effectively decode angular velocity, and demonstrates considerable independence from the spatial frequency and contrast of the gratings. Using the Unity development platform, the model is further tested in tunnel-centering and terrain-following paradigms in order to reproduce the visually guided flight behaviors of honeybees. In a series of controlled trials, the virtual bee uses the proposed angular velocity control schemes to accurately navigate through a patterned tunnel and to maintain a suitable distance from undulating textured terrain. The results are consistent with both neuron spike recordings and behavioral path recordings of real honeybees, demonstrating the model's potential for implementation in micro air vehicles equipped only with visual sensors.
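As background for the EMD-based model described above, here is a minimal Hassenstein-Reichardt correlator sketch (a textbook construction, not the authors' network; the stimulus parameters are arbitrary):

```python
import numpy as np

def emd_response(left, right, delay):
    """Hassenstein-Reichardt elementary motion detector (discrete sketch).

    Each arm correlates a delayed copy of one photoreceptor signal with the
    undelayed neighbour; subtracting the mirror-symmetric arm yields a
    direction-selective response.  Its magnitude also varies with spatial
    frequency and contrast, the confound the model above addresses.
    """
    return np.roll(left, delay) * right - np.roll(right, delay) * left

# A sinusoidal grating drifting so that the right receptor lags the left.
t = np.arange(200)
omega = 2 * np.pi / 40            # temporal frequency (rad/sample)
left = np.sin(omega * t)
right = np.sin(omega * (t - 5))   # 5-sample phase lag

resp = emd_response(left, right, delay=3)[20:]   # drop roll wrap-around
print(np.mean(resp) > 0)   # True: positive mean = preferred direction
```

Reversing the drift direction flips the sign of the mean response, which is the direction selectivity the correlator is known for.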
Collapse
Affiliation(s)
- Huatian Wang
- Computational Intelligence Laboratory (CIL), University of Lincoln, Lincoln, UK; Machine Life and Intelligence Research Center, Guangzhou University, Guangzhou, China
| | - Qinbing Fu
- Machine Life and Intelligence Research Center, Guangzhou University, Guangzhou, China; Computational Intelligence Laboratory (CIL), University of Lincoln, Lincoln, UK
| | - Hongxin Wang
- Machine Life and Intelligence Research Center, Guangzhou University, Guangzhou, China; Computational Intelligence Laboratory (CIL), University of Lincoln, Lincoln, UK
| | - Paul Baxter
- Computational Intelligence Laboratory (CIL), University of Lincoln, Lincoln, UK
| | - Jigen Peng
- School of Mathematics and Information Science, Guangzhou University, Guangzhou, China; Machine Life and Intelligence Research Center, Guangzhou University, Guangzhou, China.
| | - Shigang Yue
- Machine Life and Intelligence Research Center, Guangzhou University, Guangzhou, China; Computational Intelligence Laboratory (CIL), University of Lincoln, Lincoln, UK.
| |
Collapse
|
16
|
Kim JJ, Liu H, Ousati Ashtiani A, Jiang H. Biologically inspired artificial eyes and photonics. REPORTS ON PROGRESS IN PHYSICS. PHYSICAL SOCIETY (GREAT BRITAIN) 2020; 83:047101. [PMID: 31923911 PMCID: PMC7195211 DOI: 10.1088/1361-6633/ab6a42] [Citation(s) in RCA: 13] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/30/2023]
Abstract
Natural visual systems have inspired scientists and engineers to mimic their intriguing features for the development of advanced photonic devices that can provide better solutions than conventional ones. Among the various kinds of natural eyes, researchers have taken an intensive interest in mammal eyes and compound eyes because of their advantageous optical properties, such as focal length tunability, high-resolution imaging, light intensity modulation, wide field of view, high light sensitivity, and efficient light management. A variety of approaches across the broad field of science and technology have successfully duplicated the functions of natural eyes and yielded bioinspired photonic devices for various applications. In this review, we present a comprehensive overview of bioinspired artificial eyes and photonic devices that mimic the functions of natural eyes. After briefly introducing visual systems in nature, we discuss optical components inspired by mammal eyes, including tunable lenses actuated by different mechanisms, curved image sensors with low aberration, and light intensity modulators. Next, compound-eye-inspired photonic devices are presented, such as microlens and micromirror arrays, imaging sensor arrays on curved surfaces, self-written waveguides with microlens arrays, and antireflective nanostructures (ARS). Subsequently, compound eyes with focal length tunability, photosensitivity enhancers, and polarization imaging sensors are described.
Collapse
Affiliation(s)
- Jae-Jun Kim
- Electrical and Computer Engineering, University of Wisconsin-Madison, Madison, WI 53706, United States of America
| | | | | | | |
Collapse
|
17
|
Serres JR, Evans TJ, Åkesson S, Duriez O, Shamoun-Baranes J, Ruffier F, Hedenström A. Optic flow cues help explain altitude control over sea in freely flying gulls. J R Soc Interface 2019; 16:20190486. [PMID: 31594521 DOI: 10.1098/rsif.2019.0486] [Citation(s) in RCA: 10] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/14/2023] Open
Abstract
For studies of how birds control their altitude, seabirds are of particular interest because they forage offshore, where the visual environment can be modelled simply as a flat world textured by waves, generating only ventral visual cues. This study suggests that optic flow, i.e. the rate at which the sea moves across the eye's retina, can explain gulls' altitude control over the sea. In particular, a new flight model that includes both energy and optical invariants helps explain the gulls' trajectories during offshore takeoff and cruising flight. A linear mixed model applied to 352 flights from 16 individual lesser black-backed gulls (Larus fuscus) revealed a statistically significant optic flow set-point of ca 25° s-1. Thereafter, an optic flow-based flight model was applied to 18 offshore takeoff flights from nine individual gulls. By introducing an upper limit on climb rate into the elevation dynamics, coupled with an optic flow set-point, the predicted altitude fits the GPS data with an optimized fit factor of 63% on average (range 30-83%). We conclude that the optic flow regulation principle helps gulls adjust their altitude over the sea without having to measure their current altitude directly.
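The optic-flow set-point idea above can be made concrete with a small sketch (our own illustration; only the 25° s-1 set-point is taken from the abstract): since ventral optic flow over a flat sea is ω = v/h, holding ω at a set-point predicts an altitude proportional to ground speed.

```python
import math

def predicted_altitude(ground_speed, of_setpoint_deg=25.0):
    """Altitude implied by holding ventral optic flow at a set-point.

    Over a flat sea, ventral optic flow is omega = v / h (rad/s), so a
    gull regulating omega at omega_set flies at h = v / omega_set:
    altitude scales linearly with ground speed.  The 25 deg/s default is
    the set-point reported in the abstract; the speed below is arbitrary.
    """
    return ground_speed / math.radians(of_setpoint_deg)

# At a 10 m/s ground speed, a 25 deg/s set-point predicts ~22.9 m altitude.
print(round(predicted_altitude(10.0), 1))
```

The linear speed-altitude relationship is the signature one would look for in the GPS tracks.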
Collapse
Affiliation(s)
| | - Thomas J Evans
- Department of Biology, CAnMove, Lund University, Ecology Building, 223 62 Lund, Sweden; Marine Scotland Science, Marine Laboratory, 375 Victoria Road, Aberdeen AB11 9DB, UK
| | - Susanne Åkesson
- Department of Biology, CAnMove, Lund University, Ecology Building, 223 62 Lund, Sweden
| | - Olivier Duriez
- CEFE UMR 5175, CNRS - Université de Montpellier - Université Paul-Valéry Montpellier - EPHE - 1919 route de Mende, 34293 Montpellier cedex 5, France
| | - Judy Shamoun-Baranes
- Theoretical and Computational Ecology, Institute of Biodiversity and Ecosystem Dynamics, University of Amsterdam, PO Box 94 248, 1090 GE Amsterdam, The Netherlands
| | | | - Anders Hedenström
- Department of Biology, CAnMove, Lund University, Ecology Building, 223 62 Lund, Sweden
| |
Collapse
|
18
|
Fu Q, Wang H, Hu C, Yue S. Towards Computational Models and Applications of Insect Visual Systems for Motion Perception: A Review. ARTIFICIAL LIFE 2019; 25:263-311. [PMID: 31397604 DOI: 10.1162/artl_a_00297] [Citation(s) in RCA: 23] [Impact Index Per Article: 4.6] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/10/2023]
Abstract
Motion perception is a critical capability underlying many aspects of insect life, including predator avoidance, foraging, and so forth. A good number of motion detectors have been identified in insect visual pathways. Computational modeling of these motion detectors has not only provided effective solutions for artificial intelligence, but has also benefited our understanding of complicated biological visual systems. These biological mechanisms, shaped by millions of years of evolution, form solid modules for constructing dynamic vision systems for future intelligent machines. This article reviews the computational motion perception models in the literature that originate from biological research on insect visual systems. These models or neural networks comprise the looming-sensitive neuronal models of lobula giant movement detectors (LGMDs) in locusts, the translation-sensitive neural systems of direction-selective neurons (DSNs) in fruit flies, bees, and locusts, and the small-target motion detectors (STMDs) in dragonflies and hoverflies. We also review the applications of these models to robots and vehicles. Through these modeling studies, we summarize the methodologies that generate different direction and size selectivity in motion perception. Finally, we discuss multi-system integration and hardware realization of these bio-inspired motion perception models.
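As a flavour of the looming-sensitive LGMD models reviewed above, a deliberately minimal sketch (not any specific published model; image sizes and radii are arbitrary): summed rectified frame differences grow as a dark object expands in the visual field.

```python
import numpy as np

def looming_response(frames):
    """Minimal LGMD-flavoured looming cue: summed rectified frame differences.

    Real LGMD models add lateral inhibition and adaptation; this sketch only
    shows that excitation grows as an edge expands in the visual field.
    """
    return [float(np.abs(b - a).sum()) for a, b in zip(frames, frames[1:])]

# A dark disc expanding against a bright background (an approaching object).
size = 64
yy, xx = np.mgrid[:size, :size]
frames = [(((yy - 32) ** 2 + (xx - 32) ** 2) > r ** 2).astype(float)
          for r in (5, 8, 12, 18, 27)]

resp = looming_response(frames)
print(resp == sorted(resp))   # True: response rises monotonically
```

A receding or translating object would not produce this monotonically rising excitation, which is the basis of the collision selectivity.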
Collapse
Affiliation(s)
- Qinbing Fu
- Guangzhou University, School of Mechanical and Electrical Engineering; Machine Life and Intelligence Research Centre
- University of Lincoln, Computational Intelligence Lab, School of Computer Science; Lincoln Centre for Autonomous Systems.
| | - Hongxin Wang
- University of Lincoln, Computational Intelligence Lab, School of Computer Science; Lincoln Centre for Autonomous Systems.
| | - Cheng Hu
- Guangzhou University, School of Mechanical and Electrical Engineering; Machine Life and Intelligence Research Centre
- University of Lincoln, Computational Intelligence Lab, School of Computer Science; Lincoln Centre for Autonomous Systems.
| | - Shigang Yue
- Guangzhou University, School of Mechanical and Electrical Engineering; Machine Life and Intelligence Research Centre
- University of Lincoln, Computational Intelligence Lab, School of Computer Science; Lincoln Centre for Autonomous Systems.
| |
Collapse
|
19
|
Karásek M, Muijres FT, De Wagter C, Remes BDW, de Croon GCHE. A tailless aerial robotic flapper reveals that flies use torque coupling in rapid banked turns. Science 2018; 361:1089-1094. [DOI: 10.1126/science.aat0350] [Citation(s) in RCA: 176] [Impact Index Per Article: 29.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/19/2018] [Revised: 05/04/2018] [Accepted: 07/20/2018] [Indexed: 02/06/2023]
Abstract
Insects are among the most agile natural flyers. Hypotheses on their flight control cannot always be validated by experiments with animals or tethered robots. To this end, we developed a programmable and agile autonomous free-flying robot controlled through bio-inspired motion changes of its flapping wings. Despite being 55 times the size of a fruit fly, the robot can accurately mimic the rapid escape maneuvers of flies, including a correcting yaw rotation toward the escape heading. With the robot's yaw control turned off, we showed that these yaw rotations result from passive, translation-induced aerodynamic coupling between the yaw torque and the roll and pitch torques produced throughout the maneuver. The robot enables new methods for studying animal flight, and its flight characteristics allow for real-world flight missions.
Collapse
Affiliation(s)
- Matěj Karásek
- Micro Air Vehicle Laboratory, Control and Simulation, Delft University of Technology, Delft, Netherlands
| | - Florian T. Muijres
- Experimental Zoology Group, Wageningen University and Research, Wageningen, Netherlands
| | - Christophe De Wagter
- Micro Air Vehicle Laboratory, Control and Simulation, Delft University of Technology, Delft, Netherlands
| | - Bart D. W. Remes
- Micro Air Vehicle Laboratory, Control and Simulation, Delft University of Technology, Delft, Netherlands
| | - Guido C. H. E. de Croon
- Micro Air Vehicle Laboratory, Control and Simulation, Delft University of Technology, Delft, Netherlands
| |
Collapse
|
20
|
Linander N, Dacke M, Baird E, Hempel de Ibarra N. The role of spatial texture in visual control of bumblebee learning flights. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2018; 204:737-745. [PMID: 29980840 PMCID: PMC6096632 DOI: 10.1007/s00359-018-1274-0] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/18/2018] [Revised: 06/07/2018] [Accepted: 06/19/2018] [Indexed: 11/29/2022]
Abstract
When leaving the nest for the first time, bees and wasps perform elaborate learning flights, during which the location of the nest is memorised. These flights are characterised by a succession of arcs or loops of increasing radius centred around the nest, with an incremental increase in ground speed, which requires precise control of the flight manoeuvres by the insect. Here, we investigated the role of optic flow cues in the control of learning flights by manipulating spatial texture in the ventral and panoramic visual field. We measured height, lateral displacement relative to the nest and ground speed during learning flights in bumblebees when ventral and panoramic optic flow cues were present or minimised, or features of the ground texture varied in size. Our observations show that ventral optic flow cues were required for the smooth execution of learning flights. We also found that bumblebees adjusted their flight height in response to variations of the visual texture on the ground. However, the presence or absence of panoramic optic flow did not have a substantial effect on flight performance. Our findings suggest that bumblebees mainly rely on optic flow information from the ventral visual field to control their learning flights.
Collapse
Affiliation(s)
- Nellie Linander
- Lund Vision Group, Department of Biology, Lund University, Lund, Sweden; Centre for Research in Animal Behaviour, Psychology, University of Exeter, Exeter, EX4 4QG, UK
| | - Marie Dacke
- Lund Vision Group, Department of Biology, Lund University, Lund, Sweden
| | - Emily Baird
- Lund Vision Group, Department of Biology, Lund University, Lund, Sweden
| | | |
Collapse
|
21
|
Serres JR, Ruffier F. Optic flow-based collision-free strategies: From insects to robots. ARTHROPOD STRUCTURE & DEVELOPMENT 2017; 46:703-717. [PMID: 28655645 DOI: 10.1016/j.asd.2017.06.003] [Citation(s) in RCA: 64] [Impact Index Per Article: 9.1] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 07/07/2016] [Revised: 06/19/2017] [Accepted: 06/19/2017] [Indexed: 06/07/2023]
Abstract
Flying insects are able to fly smartly in unpredictable environments. Their tiny brains contain neurons that are sensitive to visual motion, also called optic flow. Consequently, flying insects rely mainly on visual motion during flight maneuvers such as takeoff and landing, terrain following, tunnel crossing, lateral and frontal obstacle avoidance, and adjusting flight speed in cluttered environments. Optic flow can be defined as the vector field of the apparent motion of objects, surfaces, and edges in a visual scene generated by the relative motion between an observer (an eye or a camera) and the scene. Translational optic flow is particularly interesting for short-range navigation because it depends on the ratio between (i) the relative linear speed of the visual scene with respect to the observer and (ii) the distance of the observer from obstacles in the surrounding environment, without requiring any direct measurement of either speed or distance. In flying insects, the roll stabilization reflex and yaw saccades attenuate any rotation at the eye level in roll and yaw, respectively (i.e., they cancel rotational optic flow), ensuring purely translational optic flow between two successive saccades. Our survey focuses on the feedback loops using translational optic flow that insects employ for collision-free navigation. Over the next decade, optic flow is likely to be one of the most important visual cues for explaining flying insects' short-range navigation maneuvers in complex tunnels. Conversely, the biorobotic approach can help to develop innovative flight control systems for flying robots, with the aim of mimicking flying insects' abilities and better understanding their flight.
Collapse
|
22
|
Altitude control in honeybees: joint vision-based learning and guidance. Sci Rep 2017; 7:9231. [PMID: 28835634 PMCID: PMC5569062 DOI: 10.1038/s41598-017-09112-5] [Citation(s) in RCA: 14] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/01/2016] [Accepted: 07/24/2017] [Indexed: 11/15/2022] Open
Abstract
Studies on insects' visual guidance systems have shed little light on how learning contributes to insects' altitude control system. In this study, honeybees were trained to fly along a double-roofed tunnel after entering it near either the ceiling or the floor of the tunnel. The honeybees trained to hug the ceiling therefore encountered a sudden change in the tunnel configuration midway, i.e. a "dorsal ditch": the trained honeybees met a sudden increase in the distance to the ceiling, corresponding to a strong change in the visual cues available in their dorsal field of view. Honeybees reacted by rising quickly and hugging the new, higher ceiling, keeping a forward speed, distance to the ceiling and dorsal optic flow similar to those observed during the training step; whereas bees trained to follow the floor kept following the floor regardless of the change in the ceiling height. When trained honeybees entered the tunnel via the entry opposite to that used during the training step (the lower or upper entry), they quickly changed their altitude and hugged the surface they had previously learned to follow. These findings clearly show that trained honeybees control their altitude based on visual cues memorized during training. The memorized visual cues generated by the surfaces followed form a complex optic flow pattern: trained honeybees may attempt to match the visual cues they perceive with this memorized optic flow pattern by controlling their altitude.
Collapse
|
23
|
Chaotic metaheuristic algorithms for learning and reproduction of robot motion trajectories. Neural Comput Appl 2016. [DOI: 10.1007/s00521-016-2717-6] [Citation(s) in RCA: 14] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/01/2022]
|
24
|
Mafrica S, Servel A, Ruffier F. Minimalistic optic flow sensors applied to indoor and outdoor visual guidance and odometry on a car-like robot. BIOINSPIRATION & BIOMIMETICS 2016; 11:066007. [PMID: 27831937 DOI: 10.1088/1748-3190/11/6/066007] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/06/2023]
Abstract
Here we present a novel bio-inspired optic flow (OF) sensor and its application to visual guidance and odometry on a low-cost car-like robot called BioCarBot. The minimalistic OF sensor was robust to high-dynamic-range lighting conditions and to the various visual patterns encountered, thanks to its M2APIX auto-adaptive pixels and the new cross-correlation OF algorithm implemented. The low-cost car-like robot estimated its velocity and steering angle, and therefore its position and orientation, via an extended Kalman filter (EKF) using only two downward-facing OF sensors and the Ackermann steering model. Indoor and outdoor experiments were carried out in which the robot was driven in closed-loop mode based on the velocity and steering angle estimates. The experimental results show that our novel OF sensor can deliver high-frequency measurements ([Formula: see text]) in a wide OF range (1.5-[Formula: see text]) and in a 7-decade light level range. The OF resolution was constant and could be adjusted as required (up to [Formula: see text]), and the OF precision obtained was relatively high (standard deviation of [Formula: see text] with an average OF of [Formula: see text], under the most demanding lighting conditions). An EKF-based algorithm gave the robot's position and orientation with relatively high accuracy (maximum errors outdoors at a very low light level: [Formula: see text] and [Formula: see text] over about [Formula: see text] and [Formula: see text]) despite the low-resolution control systems of the steering servo and the DC motor, as well as a simplified model identification and calibration. Finally, the minimalistic OF-based odometry results were compared to those obtained using an inertial measurement unit (IMU) and a motor speed sensor.
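The Ackermann steering model mentioned above can be sketched as a simple dead-reckoning step (a standard bicycle-model approximation, not the paper's full EKF; in the paper the velocity and steering-angle inputs are estimated from the two optic flow sensors, and the wheelbase value below is an arbitrary illustration):

```python
import math

def ackermann_step(state, v, steer, wheelbase, dt):
    """One dead-reckoning step of the Ackermann (bicycle) approximation.

    state is (x, y, theta); v is forward speed, steer the steering angle.
    In the paper, v and steer come from an EKF fusing the two
    downward-facing optic flow sensors; here they are supplied directly.
    """
    x, y, theta = state
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += v / wheelbase * math.tan(steer) * dt
    return (x, y, theta)

# Integrate 1 s of straight driving at 1 m/s with zero steering angle.
state = (0.0, 0.0, 0.0)
for _ in range(100):
    state = ackermann_step(state, v=1.0, steer=0.0, wheelbase=0.25, dt=0.01)
print(round(state[0], 2), round(state[1], 2))   # 1.0 0.0
```

An EKF would wrap this prediction step with a measurement update from the optic flow sensors; the kinematics above are only the process model.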
Collapse
Affiliation(s)
- Stefano Mafrica
- PSA Peugeot Citroën, 78140 Vélizy-Villacoublay, France. Aix-Marseille Univ., CNRS, ISM, Inst. Movement Sci., Marseille, France
| | | | | |
Collapse
|
25
|
Abstract
In the attempt to build adaptive and intelligent machines, roboticists have looked to neuroscience for more than half a century as a source of inspiration for perception and control. More recently, neuroscientists have resorted to robots for testing hypotheses and validating models of biological nervous systems. Here, we give an overview of the work at the intersection of robotics and neuroscience and highlight the most promising approaches and areas where interactions between the two fields have generated significant new insights. We organize the work into three sections: invertebrate, vertebrate, and primate neuroscience. We argue that robots generate valuable insight into the function of nervous systems, which is intimately linked to behaviour and embodiment, and that brain-inspired algorithms and devices give robots life-like capabilities.
Collapse
Affiliation(s)
- Dario Floreano
- Laboratory of Intelligent Systems, Ecole Polytechnique Fédérale de Lausanne, Station 11, Lausanne, CH 1015, Switzerland.
| | - Auke Jan Ijspeert
- Biorobotics Laboratory, Ecole Polytechnique Fédérale de Lausanne, Station 14, Lausanne, CH 1015, Switzerland
| | - Stefan Schaal
- Max-Planck-Institute for Intelligent Systems, Spemannstrasse 41, 72076 Tübingen, Germany, & University of Southern California, Ronald Tutor Hall RTH 401, 3710 S. McClintock Avenue, Los Angeles, CA 90089-2905, USA
| |
Collapse
|
26
|
Stewart FJ, Kinoshita M, Arikawa K. The roles of visual parallax and edge attraction in the foraging behaviour of the butterfly Papilio xuthus. J Exp Biol 2015; 218:1725-32. [PMID: 25883380 DOI: 10.1242/jeb.115063] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/07/2014] [Accepted: 04/07/2015] [Indexed: 11/20/2022]
Abstract
Several examples of insects using visual motion to measure distance have been documented, from locusts peering to gauge the proximity of prey, to honeybees performing visual odometry en route between the hive and a flower patch. However, whether the use of parallax information is confined to specialised behaviours like these or represents a more general-purpose sensory capability is an open question. We investigate this issue in the foraging swallowtail butterfly Papilio xuthus, which we trained to associate a target presented on a monitor with a food reward. We then tracked the animal's flight in real time, allowing us to manipulate the size and/or position of the target in a closed-loop manner to create the illusion that it was situated either above or below the monitor surface. Butterflies are less attracted to (i.e. slower to approach) targets that appear, based on motion parallax, to be more distant. Furthermore, we found that the number of abortive descent manoeuvres performed prior to the first successful target approach varies according to the depth of the virtual target, with expansion and parallax cues having effects of opposing polarity. However, we found no evidence that Papilio modulate the kinematic parameters of their descents according to the apparent distance of the target. Thus, we argue that motion parallax is used to identify a proximal target object, but that the subsequent process of approaching it is based on stabilising its edge in the 2D space of the retina, without estimating its distance.
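The motion parallax cue underlying this experiment reduces, in the small-angle limit, to a simple relation (a generic illustration, not the authors' analysis; the numbers are arbitrary): a stationary target at distance d shifts by about b/d radians when the observer translates laterally by a baseline b.

```python
def parallax_distance(baseline, angular_shift):
    """Distance from motion parallax (small-angle approximation).

    A stationary target roughly abeam of the observer shifts by about
    baseline / distance radians when the observer translates laterally by
    `baseline`, so distance = baseline / angular_shift.
    """
    return baseline / angular_shift

# A 2 cm lateral displacement producing a 0.001 rad shift implies ~20 m.
print(round(parallax_distance(0.02, 0.001), 1))
```

In the closed-loop experiment, shifting the on-screen target with the butterfly's own motion manipulates exactly this angular shift, creating virtual depth.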
Collapse
Affiliation(s)
- Finlay J Stewart
- Department of Evolutionary Studies of Biosystems, School of Advanced Sciences, Sokendai (The Graduate University for Advanced Studies), Shonan Village, Hayama, Kanagawa 240-0193, Japan
| | - Michiyo Kinoshita
- Department of Evolutionary Studies of Biosystems, School of Advanced Sciences, Sokendai (The Graduate University for Advanced Studies), Shonan Village, Hayama, Kanagawa 240-0193, Japan
| | - Kentaro Arikawa
- Department of Evolutionary Studies of Biosystems, School of Advanced Sciences, Sokendai (The Graduate University for Advanced Studies), Shonan Village, Hayama, Kanagawa 240-0193, Japan
| |
Collapse
|
27
|
Fuller SB, Karpelson M, Censi A, Ma KY, Wood RJ. Controlling free flight of a robotic fly using an onboard vision sensor inspired by insect ocelli. J R Soc Interface 2015; 11:20140281. [PMID: 24942846 DOI: 10.1098/rsif.2014.0281] [Citation(s) in RCA: 90] [Impact Index Per Article: 10.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/12/2022] Open
Abstract
Scaling a flying robot down to the size of a fly or bee requires advances in manufacturing, sensing and control, and will provide insights into mechanisms used by their biological counterparts. Controlled flight at this scale has previously required external cameras to provide the feedback to regulate the continuous corrective manoeuvres necessary to keep the unstable robot from tumbling. One stabilization mechanism used by flying insects may be to sense the horizon or Sun using the ocelli, a set of three light sensors distinct from the compound eyes. Here, we present an ocelli-inspired visual sensor and use it to stabilize a fly-sized robot. We propose a feedback controller that applies torque in proportion to the angular velocity of the source of light estimated by the ocelli. We demonstrate theoretically and empirically that this is sufficient to stabilize the robot's upright orientation. This constitutes the first known use of onboard sensors at this scale. Dipteran flies use halteres to provide gyroscopic velocity feedback, but it is unknown how other insects such as honeybees stabilize flight without these sensory organs. Our results, using a vehicle of similar size and dynamics to the honeybee, suggest how the ocelli could serve this role.
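The control law described above (torque proportional to the estimated angular velocity of the light source) can be illustrated with a toy rigid-body simulation (our own sketch; the real robot's upright stability also involves aerodynamic damping and translational coupling, and the gain and inertia values are arbitrary):

```python
def simulate_rate_feedback(theta0, omega0, k=5.0, inertia=1.0,
                           dt=0.001, steps=5000):
    """Toy attitude simulation with ocelli-style rate feedback.

    Control torque tau = -k * omega, where omega stands in for the angular
    velocity of the light source estimated by the ocelli.  This is a bare
    rigid-body caricature of the feedback principle, not the robot's
    actual dynamics.
    """
    theta, omega = theta0, omega0
    for _ in range(steps):
        tau = -k * omega              # torque proportional to angular velocity
        omega += tau / inertia * dt
        theta += omega * dt
    return theta, omega

theta, omega = simulate_rate_feedback(theta0=0.0, omega0=2.0)
print(abs(omega) < 1e-3)   # True: the tumbling rate has been damped out
```

Pure rate feedback damps rotation without any absolute attitude measurement, which is why a light-direction velocity estimate from the ocelli can substitute for a gyroscope-like signal.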
Collapse
Affiliation(s)
- Sawyer B Fuller
- School of Engineering and Applied Sciences and the Wyss Institute for Biologically Inspired Engineering, Harvard University, Cambridge, MA 02138, USA
| | - Michael Karpelson
- School of Engineering and Applied Sciences and the Wyss Institute for Biologically Inspired Engineering, Harvard University, Cambridge, MA 02138, USA
| | - Andrea Censi
- Laboratory for Information and Decision Systems, Massachusetts Institute of Technology, Cambridge, MA 02138, USA
| | - Kevin Y Ma
- School of Engineering and Applied Sciences and the Wyss Institute for Biologically Inspired Engineering, Harvard University, Cambridge, MA 02138, USA
| | - Robert J Wood
- School of Engineering and Applied Sciences and the Wyss Institute for Biologically Inspired Engineering, Harvard University, Cambridge, MA 02138, USA
| |
Collapse
|
28
|
Laurent É. Multiscale Enaction Model (MEM): the case of complexity and "context-sensitivity" in vision. Front Psychol 2014; 5:1425. [PMID: 25566115 PMCID: PMC4271595 DOI: 10.3389/fpsyg.2014.01425] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/01/2014] [Accepted: 11/22/2014] [Indexed: 11/23/2022] Open
Abstract
I review the data on human visual perception that reveal the critical role played by non-visual contextual factors influencing visual activity. The global perspective that progressively emerges reveals that vision is sensitive to multiple couplings with other systems whose nature and levels of abstraction in science are highly variable. Contrary to views in which vision is encapsulated in hard-wired modules, largely independent of higher-level or non-cognitive processes, the converging data gathered in this article suggest that visual perception can be theorized in the larger context of the biological, physical, and social systems with which it is coupled, and through which it is enacted. Therefore, any attempt to model complexity and multiscale couplings, or to develop a complex synthesis in the fields of mind, brain, and behavior, shall involve a systematic empirical study of both the connectedness between systems or subsystems and the embodied, multiscale and flexible teleology of subsystems. The conceptual model (Multiscale Enaction Model [MEM]) introduced in this paper finally relates empirical evidence gathered from psychology to biocomputational data concerning the human brain. Both psychological and biocomputational descriptions of MEM are proposed in order to help fill the gap between scales of scientific analysis and to provide an account for both the autopoiesis-driven search for information and emerging perception.
Affiliation(s)
- Éric Laurent
- Laboratoire de Psychologie EA 3188, Unité de Formation et de Recherche Sciences du Langage de l’Homme et de la Société, University of Franche-Comté, Besançon, France
- Maison des Sciences de l’Homme et de l’Environnement Claude Nicolas Ledoux, UMSR 3124, CNRS and University of Franche-Comté, Besançon, France
29
Roubieu FL, Serres JR, Colonnier F, Franceschini N, Viollet S, Ruffier F. A biomimetic vision-based hovercraft accounts for bees' complex behaviour in various corridors. Bioinspir Biomim 2014; 9:036003. [PMID: 24615558 DOI: 10.1088/1748-3182/9/3/036003] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.6]
Abstract
Here we present the first systematic comparison between the visual guidance behaviour of a biomimetic robot and that of honeybees flying in similar environments. We built a miniature hovercraft which can travel safely along corridors with various configurations. For the first time, we implemented on a real physical robot the 'lateral optic flow regulation autopilot', which we had previously studied in computer simulations. This autopilot, inspired by the results of experiments on various species of hymenoptera, consists of two intertwined feedback loops, the speed and lateral control loops, each of which has its own optic flow (OF) set-point. A heading-lock system makes the robot move straight ahead as fast as 69 cm s(-1) with a clearance from one wall as small as 31 cm, giving an unusually high translational OF value (125° s(-1)). Our biomimetic robot was found to navigate safely along straight, tapered and bent corridors, and to react appropriately to perturbations such as the lack of texture on one wall, the presence of a tapering or non-stationary section of the corridor and even a sloping terrain equivalent to a wind disturbance. The front end of the visual system consists of only two local motion sensors (LMSs), one on each side. This minimalistic visual system, which measures the lateral OF, suffices to control both the robot's forward speed and its clearance from the walls without ever measuring any speeds or distances. We added two further LMSs oriented at ±45° to improve the robot's performance in strongly tapered corridors. This simple control system accounts for worker bees' ability to navigate safely in six challenging environments: straight corridors, single walls, tapered corridors, straight corridors with part of one wall moving or missing, as well as in the presence of wind.
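The translational optic flow quoted in this abstract can be checked from the reported speed and clearance, and the regulation idea can be sketched in a few lines. This is an illustrative sketch, not the authors' controller: the function names and the proportional-control form are our assumptions.

```python
import math

def translational_optic_flow_deg(speed_m_s, clearance_m):
    """Optic flow generated by pure translation past a wall seen abeam:
    omega = v / d (rad/s), returned here in deg/s."""
    return math.degrees(speed_m_s / clearance_m)

def of_regulator_step(measured_of, of_set_point, gain, command):
    """One step of a proportional optic-flow regulator (hypothetical form):
    the motor command is nudged so the measured OF is driven toward its
    set-point, with no speed or distance sensing at all."""
    return command + gain * (of_set_point - measured_of)
```

At the reported 69 cm/s with a 31 cm clearance, the first function gives 0.69/0.31 rad/s ≈ 128° s(-1), consistent with the roughly 125° s(-1) quoted in the abstract.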
Affiliation(s)
- Frédéric L Roubieu
- Aix-Marseille Université, CNRS, ISM UMR 7287, 13288, Marseille cedex 09, France
30
Ruffier F, Franceschini N. Optic Flow Regulation in Unsteady Environments: A Tethered MAV Achieves Terrain Following and Targeted Landing Over a Moving Platform. J Intell Robot Syst 2014. [DOI: 10.1007/s10846-014-0062-5] [Citation(s) in RCA: 27] [Impact Index Per Article: 2.7]
31
Keshavan J, Gremillion G, Escobar-Alvarez H, Humbert JS. A μ analysis-based, controller-synthesis framework for robust bioinspired visual navigation in less-structured environments. Bioinspir Biomim 2014; 9:025011. [PMID: 24852145 DOI: 10.1088/1748-3182/9/2/025011] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.3]
Abstract
Safe, autonomous navigation by aerial microsystems in less-structured environments is a difficult challenge to overcome with current technology. This paper presents a novel visual-navigation approach that combines bioinspired wide-field processing of optic flow information with control-theoretic tools for synthesis of closed loop systems, resulting in robustness and performance guarantees. Structured singular value analysis is used to synthesize a dynamic controller that provides good tracking performance in uncertain environments without resorting to explicit pose estimation or extraction of a detailed environmental depth map. Experimental results with a quadrotor demonstrate the vehicle's robust obstacle-avoidance behaviour in a straight line corridor, an S-shaped corridor and a corridor with obstacles distributed in the vehicle's path. The computational efficiency and simplicity of the current approach offers a promising alternative to satisfying the payload, power and bandwidth constraints imposed by aerial microsystems.
Affiliation(s)
- J Keshavan
- Autonomous Vehicles Laboratory, Department of Aerospace Engineering, University of Maryland, College Park 20742, USA
32
Abstract
Most experiments on the flight behavior of Drosophila melanogaster have been performed within confined laboratory chambers, yet the natural history of these animals involves dispersal that takes place on a much larger spatial scale. Thirty years ago, a group of population geneticists performed a series of mark-and-recapture experiments on Drosophila flies, which demonstrated that even cosmopolitan species are capable of covering 10 km of open desert, probably in just a few hours and without the possibility of feeding along the way. In this review I revisit these fascinating and informative experiments and attempt to explain how-from takeoff to landing-the flies might have made these journeys based on our current knowledge of flight behavior. This exercise provides insight into how animals generate long behavioral sequences using sensory-motor modules that may have an ancient evolutionary origin.
33
Hofmann V, Sanguinetti-Scheck JI, Künzel S, Geurten B, Gómez-Sena L, Engelmann J. Sensory flow shaped by active sensing: sensorimotor strategies in electric fish. J Exp Biol 2013; 216:2487-500. [DOI: 10.1242/jeb.082420] [Citation(s) in RCA: 48] [Impact Index Per Article: 4.4]
Abstract
Goal-directed behavior is in most cases composed of a sequential order of elementary motor patterns shaped by sensorimotor contingencies. The sensory information thus acquired is structured in both space and time. Here we review the role of motion during the generation of sensory flow, focusing on how animals actively shape information by behavioral strategies. We use the well-studied examples of vision in insects and echolocation in bats to describe commonalities of sensory-related behavioral strategies across sensory systems, and evaluate what is currently known about comparable active sensing strategies in the electroreception of electric fish. In this sensory system the sensors are dispersed across the animal's body, and the carrier source emitting the energy used for sensing, the electric organ, is moved while the animal moves. Thus ego-motion strongly influences sensory dynamics. We present, for the first time, data on the electric flow during natural probing behavior in Gnathonemus petersii (Mormyridae), which provide evidence for this influence. These data reveal a complex interdependency between the physical input to the receptors and the animal's movements, posture and objects in its environment. Although research on spatiotemporal dynamics in electrolocation is still in its infancy, the emerging field of dynamical sensory systems analysis in electric fish is a promising approach to the study of the link between movement and the acquisition of sensory information.
Affiliation(s)
- Volker Hofmann
- Bielefeld University, Faculty of Biology/CITEC, AG Active Sensing, Universitätsstraße 25, 33615 Bielefeld, Germany
- Juan I. Sanguinetti-Scheck
- Universidad de la Republica, Facultad de Ciencias, Laboratorio de Neurociencias, Igua 4225, Montevideo, Uruguay
- Silke Künzel
- Bielefeld University, Faculty of Biology/CITEC, AG Active Sensing, Universitätsstraße 25, 33615 Bielefeld, Germany
- Bart Geurten
- Göttingen University, Abt. Zelluläre Neurobiologie, Schwann-Schleiden Forschungszentrum, Julia-Lermontowa-Weg 3, 37077 Göttingen, Germany
- Leonel Gómez-Sena
- Universidad de la Republica, Facultad de Ciencias, Laboratorio de Neurociencias, Igua 4225, Montevideo, Uruguay
- Jacob Engelmann
- Bielefeld University, Faculty of Biology/CITEC, AG Active Sensing, Universitätsstraße 25, 33615 Bielefeld, Germany
34
Floreano D, Pericet-Camara R, Viollet S, Ruffier F, Brückner A, Leitel R, Buss W, Menouni M, Expert F, Juston R, Dobrzynski MK, L'Eplattenier G, Recktenwald F, Mallot HA, Franceschini N. Miniature curved artificial compound eyes. Proc Natl Acad Sci U S A 2013; 110:9267-72. [PMID: 23690574 PMCID: PMC3677439 DOI: 10.1073/pnas.1219068110] [Citation(s) in RCA: 126] [Impact Index Per Article: 11.5]
Abstract
In most animal species, vision is mediated by compound eyes, which offer lower resolution than vertebrate single-lens eyes, but significantly larger fields of view with negligible distortion and spherical aberration, as well as high temporal resolution in a tiny package. Compound eyes are ideally suited for fast panoramic motion perception. Engineering a miniature artificial compound eye is challenging because it requires accurate alignment of photoreceptive and optical components on a curved surface. Here, we describe a unique design method for biomimetic compound eyes featuring a panoramic, undistorted field of view in a very thin package. The design consists of three planar layers of separately produced arrays, namely, a microlens array, a neuromorphic photodetector array, and a flexible printed circuit board that are stacked, cut, and curved to produce a mechanically flexible imager. Following this method, we have prototyped and characterized an artificial compound eye bearing a hemispherical field of view with embedded and programmable low-power signal processing, high temporal resolution, and local adaptation to illumination. The prototyped artificial compound eye possesses several characteristics similar to the eye of the fruit fly Drosophila and other arthropod species. This design method opens up additional vistas for a broad range of applications in which wide field motion detection is at a premium, such as collision-free navigation of terrestrial and aerospace vehicles, and for the experimental testing of insect vision theories.
Affiliation(s)
- Dario Floreano
- Laboratory of Intelligent Systems, École Polytechnique Fédérale de Lausanne, CH-1015 Lausanne, Switzerland.
35
Cancar L, Díaz A, Barrientos A, Travieso D, Jacobs DM. Tactile-Sight: A Sensory Substitution Device Based on Distance-Related Vibrotactile Flow. Int J Adv Robot Syst 2013. [DOI: 10.5772/56235] [Citation(s) in RCA: 17] [Impact Index Per Article: 1.5]
Abstract
Sensory substitution is a research field of increasing interest with regard to technical, applied and theoretical issues. Among the latter, it is of central interest to understand the form in which humans perceive the environment. Ecological psychology, among other approaches, proposes that we can detect higher-order informational variables (in the sense that they are defined over substantial spatial and temporal intervals) that specify our interaction with the environment. When using a vibrotactile sensory substitution device, it is reasonable to ask if stimulation on the skin may be exploitable to detect higher-order variables. Motivated by this question, a portable vibrotactile sensory substitution device was built, using distance-based information as a source and driving a large number of vibrotactile actuators (72 in the reported version, 120 max). The portable device was designed to explore real environments, allowing natural unrestricted movement for the user while providing contingent real-time vibrotactile information. Two preliminary experiments were performed. In the first one, participants were asked to detect the time to contact of an approaching ball in a simulated (desktop) environment. Reasonable performance was observed in all experimental conditions, including the one with only tactile stimulation. In the second experiment, a portable version of the device was used in a real environment, where participants were asked to hit an approaching ball. Participants were able to coordinate their arm movements with vibrotactile stimulation in appropriate timing. We conclude that vibrotactile flow can be generated by distance-based activation of the actuators and that this stimulation on the skin allows users to perceive time-to-contact related environmental properties.
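The higher-order variable most relevant to the ball-hitting tasks in this abstract is time-to-contact, which can be obtained either from distance and speed or, as ecological psychology emphasizes, from optical quantities alone (tau = θ/θ̇). A minimal sketch with hypothetical function names, not the authors' implementation:

```python
def time_to_contact(distance_m, approach_speed_m_s):
    """First-order time-to-contact for a constant approach speed."""
    return distance_m / approach_speed_m_s

def optical_tau(theta_rad, theta_dot_rad_s):
    """Tau from optical variables alone: the angular size of the
    approaching object divided by its rate of expansion. No distance
    or speed measurement is needed."""
    return theta_rad / theta_dot_rad_s
```

For a ball of radius r at distance d approached at speed v, θ ≈ 2r/d and θ̇ ≈ 2rv/d², so θ/θ̇ = d/v: the optical estimate agrees with the distance-based one, which is why a distance-driven vibrotactile array can carry time-to-contact information.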
Affiliation(s)
- Leandro Cancar
- Madrid Polytechnic University UPM, Department of Automation and Robotics, Spain
- Alex Díaz
- Madrid Autonomous University UAM, Perception and Action Group, Spain
- Antonio Barrientos
- Madrid Polytechnic University UPM, Department of Automation and Robotics, Spain
- David Travieso
- Madrid Autonomous University UAM, Perception and Action Group, Spain
- David M. Jacobs
- Madrid Autonomous University UAM, Perception and Action Group, Spain
36
Reiser MB, Dickinson MH. Visual motion speed determines a behavioral switch from forward flight to expansion avoidance in Drosophila. J Exp Biol 2012. [PMID: 23197097 DOI: 10.1242/jeb.074732] [Citation(s) in RCA: 17] [Impact Index Per Article: 1.4]
Abstract
As an animal translates through the world, its eyes will experience a radiating pattern of optic flow in which there is a focus of expansion directly in front and a focus of contraction behind. For flying fruit flies, recent experiments indicate that flies actively steer away from patterns of expansion. Whereas such a reflex makes sense for avoiding obstacles, it presents a paradox of sorts because an insect could not navigate stably through a visual scene unless it tolerated flight towards a focus of expansion during episodes of forward translation. One possible solution to this paradox is that a fly's behavior might change such that it steers away from strong expansion, but actively steers towards weak expansion. In this study, we use a tethered flight arena to investigate the influence of stimulus strength on the magnitude and direction of turning responses to visual expansion in flies. These experiments indicate that the expansion-avoidance behavior is speed dependent. At slower speeds of expansion, flies exhibit an attraction to the focus of expansion, whereas the behavior transforms to expansion avoidance at higher speeds. Open-loop experiments indicate that this inversion of the expansion-avoidance response depends on whether or not the head is fixed to the thorax. The inversion of the expansion-avoidance response with stimulus strength has a clear manifestation under closed-loop conditions. Flies will actively orient towards a focus of expansion at low temporal frequency but steer away from it at high temporal frequency. The change in the response with temporal frequency does not require motion stimuli directly in front or behind the fly. Animals in which the stimulus was presented within 120 deg sectors on each side consistently steered towards expansion at low temporal frequency and towards contraction at high temporal frequency. A simple model based on an array of Hassenstein-Reichardt type elementary movement detectors suggests that the inversion of the expansion-avoidance reflex can explain the spatial distribution of straight flight segments and collision-avoidance saccades when flies fly freely within an open circular arena.
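The Hassenstein-Reichardt detector invoked in this abstract is a two-input correlator: each half multiplies one photoreceptor signal by a delayed copy of its neighbour's, and the two mirror-symmetric halves are subtracted to give a signed, direction-selective output. A minimal discrete-time sketch (the function name and the one-sample delay are our simplifications):

```python
def hassenstein_reichardt(s1, s2, delay=1):
    """Discrete Hassenstein-Reichardt elementary motion detector for two
    neighbouring photoreceptor signals s1 and s2. Each half-detector
    correlates one input with the delayed other; their difference is
    positive for motion from s1 toward s2 and negative the other way."""
    out = []
    for t in range(delay, min(len(s1), len(s2))):
        out.append(s1[t - delay] * s2[t] - s2[t - delay] * s1[t])
    return out
```

Feeding the detector a pattern that reaches s2 one sample after s1 yields a net positive response; reversing the direction flips the sign, which is the direction selectivity the model relies on.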
Affiliation(s)
- Michael B Reiser
- HHMI Janelia Farm Research Campus, 19700 Helix Drive, Ashburn, VA 20147, USA.
37
Ohzono T, Monobe H. Microwrinkles: Shape-tunability and applications. J Colloid Interface Sci 2012; 368:1-8. [DOI: 10.1016/j.jcis.2011.11.075] [Citation(s) in RCA: 53] [Impact Index Per Article: 4.4]
38
Algorithms in nature: the convergence of systems biology and computational thinking. Mol Syst Biol 2011; 7:546. [PMID: 22068329 PMCID: PMC3261700 DOI: 10.1038/msb.2011.78] [Citation(s) in RCA: 73] [Impact Index Per Article: 5.6]
Abstract
Computer science and biology have enjoyed a long and fruitful relationship for decades. Biologists rely on computational methods to analyze and integrate large data sets, while several computational methods were inspired by the high-level design principles of biological systems. Recently, these two directions have been converging. In this review, we argue that thinking computationally about biological processes may lead to more accurate models, which in turn can be used to improve the design of algorithms. We discuss the similar mechanisms and requirements shared by computational and biological processes and then present several recent studies that apply this joint analysis strategy to problems related to coordination, network analysis, and tracking and vision. We also discuss additional biological processes that can be studied in a similar manner and link them to potential computational problems. With the rapid accumulation of data detailing the inner workings of biological systems, we expect this direction of coupling biological and computational studies to greatly expand in the future.
39
Srinivasan MV. Honeybees as a model for the study of visually guided flight, navigation, and biologically inspired robotics. Physiol Rev 2011; 91:413-60. [PMID: 21527730 DOI: 10.1152/physrev.00005.2010] [Citation(s) in RCA: 103] [Impact Index Per Article: 7.9]
Abstract
Research over the past century has revealed the impressive capacities of the honeybee, Apis mellifera, in relation to visual perception, flight guidance, navigation, and learning and memory. These observations, coupled with the relative ease with which these creatures can be trained, and the relative simplicity of their nervous systems, have made honeybees an attractive model in which to pursue general principles of sensorimotor function in a variety of contexts, many of which pertain not just to honeybees, but several other animal species, including humans. This review begins by describing the principles of visual guidance that underlie perception of the world in three dimensions, obstacle avoidance, control of flight speed, and orchestrating smooth landings. We then consider how navigation over long distances is accomplished, with particular reference to how bees use information from the celestial compass to determine their flight bearing, and information from the movement of the environment in their eyes to gauge how far they have flown. Finally, we illustrate how some of the principles gleaned from these studies are now being used to design novel, biologically inspired algorithms for the guidance of unmanned aerial vehicles.
Affiliation(s)
- Mandyam V Srinivasan
- Queensland Brain Institute and School of Information Technology and Electrical Engineering, University of Queensland, and ARC Center of Excellence in Vision Science, St. Lucia, Australia.
40
Expert F, Viollet S, Ruffier F. Outdoor field performances of insect-based visual motion sensors. J Field Robot 2011. [DOI: 10.1002/rob.20398] [Citation(s) in RCA: 33] [Impact Index Per Article: 2.5]
41
Roberts B, Lind R, Chatterjee S. Flight dynamics of a pterosaur-inspired aircraft utilizing a variable-placement vertical tail. Bioinspir Biomim 2011; 6:026010. [PMID: 21558603 DOI: 10.1088/1748-3182/6/2/026010] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.2]
Abstract
Mission performance for small aircraft is often dependent on the turn radius. Various biologically inspired concepts have demonstrated that performance can be improved by morphing the wings in a manner similar to birds and bats; however, the morphing of the vertical tail has received less attention since neither birds nor bats have an appreciable vertical tail. This paper investigates a design that incorporates the morphing of the vertical tail based on the cranial crest of a pterosaur. The aerodynamics demonstrate a reduction in the turn radius of 14% when placing the tail over the nose in comparison to a traditional aft-placed vertical tail. The flight dynamics associated with this configuration have unique characteristics such as a Dutch-roll mode with excessive roll motion and a skid divergence that replaces the roll convergence.
Affiliation(s)
- Brian Roberts
- Department of Mechanical and Aerospace Engineering, University of Florida, Gainesville, FL 32611, USA
42
Portelli G, Ruffier F, Roubieu FL, Franceschini N. Honeybees' speed depends on dorsal as well as lateral, ventral and frontal optic flows. PLoS One 2011; 6:e19486. [PMID: 21589861 PMCID: PMC3093387 DOI: 10.1371/journal.pone.0019486] [Citation(s) in RCA: 46] [Impact Index Per Article: 3.5]
Abstract
Flying insects use the optic flow to navigate safely in unfamiliar environments, especially by adjusting their speed and their clearance from surrounding objects. It has not yet been established, however, which specific parts of the optical flow field insects use to control their speed. With a view to answering this question, freely flying honeybees were trained to fly along a specially designed tunnel including two successive tapering parts: the first part was tapered in the vertical plane and the second one, in the horizontal plane. The honeybees were found to adjust their speed on the basis of the optic flow they perceived not only in the lateral and ventral parts of their visual field, but also in the dorsal part. More specifically, the honeybees' speed varied monotonically, depending on the minimum cross-section of the tunnel, regardless of whether the narrowing occurred in the horizontal or vertical plane. The honeybees' speed decreased or increased whenever the minimum cross-section decreased or increased. In other words, the larger sum of the two opposite optic flows in the horizontal and vertical planes was kept practically constant thanks to the speed control performed by the honeybees upon encountering a narrowing of the tunnel. The previously described ALIS ("AutopiLot using an Insect-based vision System") model nicely matches the present behavioral findings. The ALIS model is based on a feedback control scheme that explains how honeybees may keep their speed proportional to the minimum local cross-section of a tunnel, based solely on optic flow processing, without any need for speedometers or rangefinders. The present behavioral findings suggest how flying insects may succeed in adjusting their speed in their complex foraging environments, while at the same time adjusting their distance not only from lateral and ventral objects but also from those located in their dorsal visual field.
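The abstract's claim that speed tracks the minimum cross-section follows directly from holding the sum of the two opposite optic flows at a constant set-point. A sketch of that steady-state relation (the variable names and the deg/s convention are ours, not the ALIS implementation):

```python
import math

def steady_state_speed(of_sum_set_point_deg_s, d1_m, d2_m):
    """Speed at which the sum of the two opposite translational optic
    flows, v/d1 + v/d2, equals the regulator's set-point. d1 and d2 are
    the clearances from the two facing surfaces (walls, or floor/roof)."""
    omega = math.radians(of_sum_set_point_deg_s)
    return omega / (1.0 / d1_m + 1.0 / d2_m)
```

For flight centred in a tunnel of cross-section D (d1 = d2 = D/2) this reduces to v = ω·D/4, i.e. speed proportional to the local cross-section, matching the monotonic speed changes reported for the bees as the tunnel narrows or widens.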
Affiliation(s)
- Geoffrey Portelli
- Biorobotic Department, Institute of Movement Science, CNRS/Aix-Marseille II University, Marseille, France
- Franck Ruffier
- Biorobotic Department, Institute of Movement Science, CNRS/Aix-Marseille II University, Marseille, France
- Frédéric L. Roubieu
- Biorobotic Department, Institute of Movement Science, CNRS/Aix-Marseille II University, Marseille, France
- Nicolas Franceschini
- Biorobotic Department, Institute of Movement Science, CNRS/Aix-Marseille II University, Marseille, France
43
Graetzel CF, Nelson BJ, Fry SN. Frequency response of lift control in Drosophila. J R Soc Interface 2010; 7:1603-16. [PMID: 20462877 DOI: 10.1098/rsif.2010.0040] [Citation(s) in RCA: 16] [Impact Index Per Article: 1.1]
Abstract
The flight control responses of the fruitfly represent a powerful model system to explore neuromotor control mechanisms, whose system level control properties can be suitably characterized with a frequency response analysis. We characterized the lift response dynamics of tethered flying Drosophila in the presence of vertically oscillating visual patterns, whose oscillation frequency we varied between 0.1 and 13 Hz. We justified these measurements by showing that the amplitude gain and phase response are invariant to the pattern oscillation amplitude and spatial frequency within a broad dynamic range. We also showed that lift responses are largely linear and time invariant (LTI), a necessary condition for a meaningful analysis of frequency responses and a remarkable characteristic given their nonlinear constituents. The flies responded to increasing oscillation frequencies with a roughly linear decrease in response gain, which dropped to background noise levels at about 6 Hz. The phase lag decreased linearly, consistent with a constant reaction delay of 75 ms. Next, we estimated the free-flight response of the fly to generate a Bode diagram of the lift response. The limitation of lift control to frequencies below 6 Hz is explained by inertial body damping, which becomes dominant at higher frequencies. Our work provides the detailed background and techniques that allow optomotor lift responses of Drosophila to be measured with comparatively simple, affordable and commercially available techniques. The identification of an LTI, pattern velocity dependent, lift control strategy is relevant to the underlying motion computation mechanisms and serves a broader understanding of insects' flight control strategies. The relevance and potential pitfalls of applying system identification techniques in tethered preparations are discussed.
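The reported 75 ms reaction delay is exactly what a linearly decreasing phase implies: a pure transport delay T contributes a phase of -360°·f·T at frequency f. A sketch of recovering T from measured phase data (hypothetical helper names; a least-squares fit constrained through the origin):

```python
def phase_lag_deg(freq_hz, delay_s):
    """Phase contribution of a pure transport delay: linear in frequency."""
    return -360.0 * freq_hz * delay_s

def delay_from_phase(freqs_hz, phases_deg):
    """Estimate the delay from (frequency, phase) pairs by fitting the
    line phi = -360*f*T through the origin in the least-squares sense:
    T = -sum(f*phi) / (360 * sum(f^2))."""
    num = sum(f * p for f, p in zip(freqs_hz, phases_deg))
    den = sum(f * f for f in freqs_hz)
    return -num / (360.0 * den)
```

With noise-free synthetic phases generated for a 75 ms delay, the estimator returns 0.075 s, which is the consistency check the linear phase trend in the paper rests on.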
Affiliation(s)
- Chauncey F Graetzel
- Institute of Robotics and Intelligent Systems, ETH Zurich, Zurich, Switzerland
44
45
Straw AD, Lee S, Dickinson MH. Visual Control of Altitude in Flying Drosophila. Curr Biol 2010; 20:1550-6. [PMID: 20727759 DOI: 10.1016/j.cub.2010.07.025] [Citation(s) in RCA: 58] [Impact Index Per Article: 4.1]
Affiliation(s)
- Andrew D Straw
- Bioengineering, California Institute of Technology, Pasadena, CA 91125, USA
| | | | | |
46
Rohrseitz N, Fry SN. Behavioural system identification of visual flight speed control in Drosophila melanogaster. J R Soc Interface 2010; 8:171-85. [PMID: 20525744 DOI: 10.1098/rsif.2010.0225] [Citation(s) in RCA: 29] [Impact Index Per Article: 2.1]
Abstract
Behavioural control in many animals involves complex mechanisms with intricate sensory-motor feedback loops. Modelling allows functional aspects to be captured without relying on a description of the underlying complex, and often unknown, mechanisms. A wide range of engineering techniques are available for modelling, but their ability to describe time-continuous processes is rarely exploited to describe sensory-motor control mechanisms in biological systems. We performed a system identification of visual flight speed control in the fruitfly Drosophila, based on an extensive dataset of open-loop responses previously measured under free flight conditions. We identified a second-order under-damped control model with just six free parameters that well describes both the transient and steady-state characteristics of the open-loop data. We then used the identified control model to predict flight speed responses after a visual perturbation under closed-loop conditions and validated the model with behavioural measurements performed in free-flying flies under the same closed-loop conditions. Our system identification of the fruitfly's flight speed response uncovers the high-level control strategy of a fundamental flight control reflex without depending on assumptions about the underlying physiological mechanisms. The results are relevant for future investigations of the underlying neuromotor processing mechanisms, as well as for the design of biomimetic robots, such as micro-air vehicles.
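The identified model class, a second-order under-damped linear system, can be simulated directly to reproduce the transient-plus-steady-state behaviour the authors fit. This forward-Euler sketch is ours; the paper's six identified parameter values are not reproduced here, and the constants below are placeholders.

```python
def second_order_step_response(gain, zeta, wn, t_end=5.0, dt=1e-3):
    """Step response of  x'' + 2*zeta*wn*x' + wn^2 * x = gain * wn^2 * u
    with u = 1, integrated with semi-implicit Euler. For zeta < 1 the
    system is under-damped: the response overshoots, rings, and settles
    at the DC gain."""
    x, v = 0.0, 0.0
    trace = []
    for _ in range(int(t_end / dt)):
        a = gain * wn * wn - 2.0 * zeta * wn * v - wn * wn * x
        v += a * dt
        x += v * dt
        trace.append(x)
    return trace
```

Fitting just gain, damping ratio and natural frequency (plus a delay and offsets, as in the six-parameter model described) captures both the transient ringing and the steady-state speed without any claim about the underlying neuromotor machinery, which is the point of the behavioural system-identification approach.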
Affiliation(s)
- Nicola Rohrseitz
- Institute of Neuroinformatics, University of Zurich and ETH Zurich, Winterthurerstrasse 190, 8057 Zurich, Switzerland
47
Honeybees change their height to restore their optic flow. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2010; 196:307-13. [DOI: 10.1007/s00359-010-0510-z] [Citation(s) in RCA: 31] [Impact Index Per Article: 2.2]
48
Oudeyer PY. On the Impact of Robotics in Behavioral and Cognitive Sciences: From Insect Navigation to Human Cognitive Development. IEEE Trans Auton Ment Dev 2010. [DOI: 10.1109/tamd.2009.2039057] [Citation(s) in RCA: 16] [Impact Index Per Article: 1.1]
49
Portelli G, Serres J, Ruffier F, Franceschini N. Modelling honeybee visual guidance in a 3-D environment. J Physiol Paris 2009; 104:27-39. [PMID: 19909808 DOI: 10.1016/j.jphysparis.2009.11.011] [Citation(s) in RCA: 26] [Impact Index Per Article: 1.7]
Abstract
In view of the behavioral findings published on bees during the last two decades, it was proposed to decipher the principles underlying bees' autopilot system, focusing in particular on these insects' use of the optic flow (OF). Based on computer-simulated experiments, we developed a vision-based autopilot that enables a "simulated bee" to travel along a tunnel, controlling both its speed and its clearance from the right wall, left wall, ground, and roof. The flying agent thus equipped enjoys three translational degrees of freedom on the surge (x), sway (y), and heave (z) axes, which are uncoupled. This visuo-motor control system, which is called ALIS (AutopiLot using an Insect based vision System), is a dual OF regulator consisting of two interdependent feedback loops, each of which has its own OF set-point. The experiments presented here showed that the simulated bee was able to navigate safely along a straight or tapered tunnel and to react appropriately to any untoward OF perturbations, such as those resulting from the occasional lack of texture on one wall or the tapering of the tunnel. The minimalistic visual system used here (involving only eight pixels) suffices to jointly control both the clearance from the four walls and the forward speed, without having to measure any speeds or distances. The OF sensors and the simple visuo-motor control system we have developed account well for the results of ethological studies performed on honeybees flying freely along straight and tapered corridors.
Affiliation(s)
- G Portelli
- The Institute of Movement Sciences, UMR CNRS, Aix-Marseille University II, France
50
Reynolds AM, Reynolds DR, Smith AD, Chapman JW. A single wind-mediated mechanism explains high-altitude 'non-goal oriented' headings and layering of nocturnally migrating insects. Proc Biol Sci 2009; 277:765-72. [PMID: 19889697 DOI: 10.1098/rspb.2009.1221] [Citation(s) in RCA: 42] [Impact Index Per Article: 2.8]
Abstract
Studies made with both entomological and meteorological radars over the last 40 years have frequently reported the occurrence of insect layers, and that the individuals forming these layers often show a considerable degree of uniformity in their headings, a behaviour known as 'common orientation'. The environmental cues used by nocturnal migrants to select and maintain common headings, while flying at low illumination levels at great heights above the ground, and the adaptive benefits of this behaviour have long remained a mystery. Here we show how a wind-mediated mechanism accounts for the common orientation patterns of 'medium-sized' nocturnal insects. Our theory posits a mechanism by which migrants are able to align themselves with the direction of the flow using a turbulence cue, thus adding their air speed to the wind speed and significantly increasing their migration distance. Our mechanism also predicts that insects flying in the Northern Hemisphere will typically be offset to the right of the mean wind line when the atmosphere is stably stratified, with the Ekman spiral in full effect. We report on the first evidence for such offsets, and show that they have significant implications for the accurate prediction of the flight trajectories of migrating nocturnal insects.