1.
Coppola CM, Strong JB, O'Reilly L, Dalesman S, Akanyeti O. Robot Programming from Fish Demonstrations. Biomimetics (Basel) 2023; 8:248. [PMID: 37366843] [DOI: 10.3390/biomimetics8020248]
Abstract
Fish are capable of learning complex relations found in their surroundings, and harnessing their knowledge may help to improve the autonomy and adaptability of robots. Here, we propose a novel learning from demonstration framework to generate fish-inspired robot control programs with as little human intervention as possible. The framework consists of six core modules: (1) task demonstration, (2) fish tracking, (3) analysis of fish trajectories, (4) acquisition of robot training data, (5) generating a perception-action controller, and (6) performance evaluation. We first describe these modules and highlight the key challenges pertaining to each one. We then present an artificial neural network for automatic fish tracking. The network detected fish successfully in 85% of the frames, and in these frames, its average pose estimation error was less than 0.04 body lengths. We finally demonstrate how the framework works through a case study focusing on a cue-based navigation task. Two low-level perception-action controllers were generated through the framework. Their performance was measured using two-dimensional particle simulations and compared against two benchmark controllers, which were programmed manually by a researcher. The fish-inspired controllers had excellent performance when the robot was started from the initial conditions used in fish demonstrations (>96% success rate), outperforming the benchmark controllers by at least 3%. One of them also had an excellent generalisation performance when the robot was started from random initial conditions covering a wider range of starting positions and heading angles (>98% success rate), again outperforming the benchmark controllers by 12%. The positive results highlight the utility of the framework as a research tool to form biological hypotheses on how fish navigate in complex environments and design better robot controllers on the basis of biological findings.
Affiliation(s)
- Lissa O'Reilly
- Department of Life Sciences, Aberystwyth University, Ceredigion SY23 3DA, UK
- Sarah Dalesman
- Department of Life Sciences, Aberystwyth University, Ceredigion SY23 3DA, UK
- Otar Akanyeti
- Department of Computer Science, Aberystwyth University, Ceredigion SY23 3DB, UK
2.
Yadipour M, Billah MA, Faruque IA. Optic flow enrichment via Drosophila head and retina motions to support inflight position regulation. J Theor Biol 2023; 562:111416. [PMID: 36681182] [DOI: 10.1016/j.jtbi.2023.111416]
Abstract
Developing a functional description of the neural control circuits and visual feedback paths underlying insect flight behaviors is an active research area. Feedback controllers incorporating engineering models of the insect visual system outputs have described some flight behaviors, yet they do not explain how insects are able to stabilize their body position relative to nearby targets such as neighbors or forage sources, especially in challenging environments in which optic flow is poor. The insect experimental community is simultaneously recording a growing library of in-flight head and eye motions that may be linked to increased perception. This study develops a quantitative model of the optic flow experienced by a flying insect or robot during head yawing rotations (distinct from lateral peering motions in previous work) with a single other target in view. It then applies a model of insect visuomotor feedback to show via analysis and simulation of five species that these head motions sufficiently enrich the optic flow and that the output feedback can provide position regulation relative to the single target (asymptotic stability). In the simplifying case of pure rotation relative to the body, theoretical analysis provides a stronger stability guarantee. The results are shown to be robust to anatomical neck angle limits and body vibrations, persist with more detailed Drosophila lateral-directional flight dynamics simulations, and generalize to recent retinal motion studies. Together, these results suggest that the optic flow enrichment provided by head or pseudopupil rotation could be used in an insect's neural processing circuit to enable position regulation.
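The core geometric point of the abstract — that translation alone produces almost no optic flow from a target dead ahead, while an added head yaw contributes a known, depth-independent component — can be sketched with a one-point flow model. This is a minimal illustration, not the paper's model; the function name, numbers, and sign convention are assumptions.

```python
import math

def point_target_flow(v, d, theta, omega):
    """Apparent angular velocity (rad/s) of a single point target.

    v     : observer forward speed (m/s)
    d     : distance to the target (m)
    theta : target bearing relative to the gaze axis (rad)
    omega : yaw rate of the gaze (head/eye), rad/s

    Translation contributes (v/d)*sin(theta), a depth-bearing term that
    vanishes for a target dead ahead; rotation contributes a
    depth-independent -omega across the whole visual field.
    """
    return (v / d) * math.sin(theta) - omega

# Target nearly dead ahead: pure translation yields almost no flow.
weak = point_target_flow(v=0.3, d=1.0, theta=0.01, omega=0.0)

# A small head yaw adds a known rotational component that the animal can
# subtract, leaving a measurable residual -- the "enrichment" idea.
enriched = point_target_flow(v=0.3, d=1.0, theta=0.01, omega=0.5)
```

Under this toy model the rotational term dominates whenever the target sits near the gaze axis, which is exactly the regime the abstract describes as optic-flow-poor.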
Affiliation(s)
- Mehdi Yadipour
- School of Mechanical and Aerospace Engineering, Oklahoma State University, Stillwater, OK 74078, USA
- Md Arif Billah
- School of Mechanical and Aerospace Engineering, Oklahoma State University, Stillwater, OK 74078, USA
- Imraan A Faruque
- School of Mechanical and Aerospace Engineering, Oklahoma State University, Stillwater, OK 74078, USA
3.
Huang X, Qiao H, Li H, Jiang Z. Bioinspired approach-sensitive neural network for collision detection in cluttered and dynamic backgrounds. Appl Soft Comput 2022. [DOI: 10.1016/j.asoc.2022.108782]
4.
Ohradzansky MT, Humbert JS. Lidar-Based Navigation of Subterranean Environments Using Bio-Inspired Wide-Field Integration of Nearness. Sensors (Basel) 2022; 22:849. [PMID: 35161595] [PMCID: PMC8840438] [DOI: 10.3390/s22030849]
Abstract
Navigating unknown environments is an ongoing challenge in robotics. Processing large amounts of sensor data to maintain localization, maps of the environment, and sensible paths can result in high compute loads and lower maximum vehicle speeds. This paper presents a bio-inspired algorithm for efficiently processing depth measurements to achieve fast navigation of unknown subterranean environments. Animals have developed efficient sensorimotor convergence approaches, allowing for rapid processing of large numbers of spatially distributed measurements into signals relevant for different behavioral responses necessary to their survival. Using a spatial inner-product to model this sensorimotor convergence principle, environmentally relative states critical to navigation are extracted from spatially distributed depth measurements using derived weighting functions. These states are then applied as feedback to control a simulated quadrotor platform, enabling autonomous navigation in subterranean environments. The resulting outer-loop velocity controller is demonstrated in both a generalized subterranean environment, represented by an infinite cylinder, and nongeneralized environments like tunnels and caves.
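The spatial inner-product idea in this abstract — collapsing a ring of depth measurements into a few navigation-relevant states via fixed weighting functions — can be sketched as Fourier-mode projections of the nearness function. The weighting functions, scan model, and sign conventions below are illustrative assumptions, not the paper's derived functions.

```python
import numpy as np

def nearness_projections(depth_scan, angles):
    """Project a ring of depth measurements onto low-order spatial modes.

    depth_scan : range measurements d(gamma) around the body (uniform grid)
    angles     : corresponding body-relative bearings gamma (rad)

    Wide-field integration models sensorimotor convergence as spatial
    inner products of the nearness mu = 1/d with fixed weighting
    functions; low-order Fourier coefficients then approximate states
    such as lateral offset and relative heading in a tunnel.
    """
    mu = 1.0 / np.asarray(depth_scan)            # nearness
    dgamma = angles[1] - angles[0]               # uniform spacing assumed
    y_cue = np.sum(mu * np.sin(angles)) * dgamma       # ~ lateral-offset cue
    psi_cue = np.sum(mu * np.sin(2 * angles)) * dgamma # ~ heading cue
    return y_cue, psi_cue

# Illustrative scan: robot displaced toward the right wall of a tunnel,
# so nearness is larger on one side and the sin(gamma) projection is nonzero.
angles = np.linspace(-np.pi, np.pi, 360, endpoint=False)
scan = np.where(np.sin(angles) >= 0, 2.0, 1.0)   # left wall far, right wall near
y_cue, psi_cue = nearness_projections(scan, angles)
```

A centered robot produces symmetric nearness and near-zero projections, so the cues only activate when a correction is needed — which is what makes this scheme so cheap as an outer-loop feedback signal.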
Affiliation(s)
- Michael T. Ohradzansky
- Department of Aerospace Engineering Sciences, University of Colorado Boulder, 3775 Discovery Drive, Boulder, CO 80303, USA
- J. Sean Humbert
- Department of Mechanical Engineering, University of Colorado Boulder, 427 UCB, 1111 Engineering Dr, Boulder, CO 80309, USA
5.
6.
Billah MA, Faruque IA. Bioinspired Visuomotor Feedback in a Multiagent Group/Swarm Context. IEEE Trans Robot 2021. [DOI: 10.1109/tro.2020.3033703]
7.
Wang H, Fu Q, Wang H, Baxter P, Peng J, Yue S. A bioinspired angular velocity decoding neural network model for visually guided flights. Neural Netw 2021; 136:180-193. [PMID: 33494035] [DOI: 10.1016/j.neunet.2020.12.008]
Abstract
Efficient and robust motion perception systems are important pre-requisites for achieving visually guided flights in future micro air vehicles. As a source of inspiration, the visual neural networks of flying insects such as honeybee and Drosophila provide ideal examples on which to base artificial motion perception models. In this paper, we have used this approach to develop a novel method that solves the fundamental problem of estimating angular velocity for visually guided flights. Compared with previous models, our elementary motion detector (EMD) based model uses a separate texture estimation pathway to effectively decode angular velocity, and demonstrates considerable independence from the spatial frequency and contrast of the gratings. Using the Unity development platform, the model is further tested for tunnel centering and terrain following paradigms in order to reproduce the visually guided flight behaviors of honeybees. In a series of controlled trials, the virtual bee utilizes the proposed angular velocity control schemes to accurately navigate through a patterned tunnel, maintaining a suitable distance from the undulating textured terrain. The results are consistent with both neuron spike recordings and behavioral path recordings of real honeybees, thereby demonstrating the model's potential for implementation in micro air vehicles which have only visual sensors.
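The EMD building block this abstract refers to is the classic Hassenstein–Reichardt correlator: delay one photoreceptor's signal, correlate it with its neighbour's, and subtract the mirror-image half-detector to get direction selectivity. The sketch below is a textbook version with a crude sample delay, not the paper's model; the stimulus numbers are assumptions.

```python
import numpy as np

def hassenstein_reichardt(left, right, delay=1):
    """Minimal elementary motion detector (EMD) sketch.

    left, right : luminance time series from two neighbouring photoreceptors
    delay       : delay-line length in samples (stand-in for a low-pass filter)

    Each half-detector correlates the delayed signal of one photoreceptor
    with the undelayed signal of its neighbour; subtracting the two halves
    yields a direction-selective output, positive here for left-to-right motion.
    """
    l, r = np.asarray(left, float), np.asarray(right, float)
    l_d, r_d = np.roll(l, delay), np.roll(r, delay)  # delayed copies
    l_d[:delay] = 0.0
    r_d[:delay] = 0.0
    return l_d * r - r_d * l   # opponent subtraction of the two half-EMDs

# A pattern drifting left-to-right reaches the left photoreceptor first,
# so the right signal lags and the mean EMD output is positive.
t = np.arange(200)
left = np.sin(0.2 * t)
right = np.sin(0.2 * t - 0.5)   # same signal, phase-lagged at the right input
mean_out = hassenstein_reichardt(left, right).mean()
```

The well-known limitation motivating the paper is visible in this form: the output depends on the stimulus spatial frequency and contrast, which is why the model adds a separate texture-estimation pathway to recover angular velocity proper.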
Affiliation(s)
- Huatian Wang
- Computational Intelligence Laboratory (CIL), University of Lincoln, Lincoln, UK; Machine Life and Intelligence Research Center, Guangzhou University, Guangzhou, China
- Qinbing Fu
- Machine Life and Intelligence Research Center, Guangzhou University, Guangzhou, China; Computational Intelligence Laboratory (CIL), University of Lincoln, Lincoln, UK
- Hongxin Wang
- Machine Life and Intelligence Research Center, Guangzhou University, Guangzhou, China; Computational Intelligence Laboratory (CIL), University of Lincoln, Lincoln, UK
- Paul Baxter
- Computational Intelligence Laboratory (CIL), University of Lincoln, Lincoln, UK
- Jigen Peng
- School of Mathematics and Information Science, Guangzhou University, Guangzhou, China; Machine Life and Intelligence Research Center, Guangzhou University, Guangzhou, China
- Shigang Yue
- Machine Life and Intelligence Research Center, Guangzhou University, Guangzhou, China; Computational Intelligence Laboratory (CIL), University of Lincoln, Lincoln, UK
8.
Escobar-Alvarez HD, Ohradzansky M, Keshavan J, Ranganathan BN, Humbert JS. Bioinspired Approaches for Autonomous Small-Object Detection and Avoidance. IEEE Trans Robot 2019. [DOI: 10.1109/tro.2019.2922472]
9.
Huang JV, Wei Y, Krapp HG. A biohybrid fly-robot interface system that performs active collision avoidance. Bioinspir Biomim 2019; 14:065001. [PMID: 31412322] [DOI: 10.1088/1748-3190/ab3b23]
Abstract
We have designed a bio-hybrid fly-robot interface (FRI) to study sensorimotor control in insects. The FRI consists of a miniaturized recording platform mounted on a two-wheeled robot and is controlled by the neuronal spiking activity of an identified visual interneuron, the blowfly H1-cell. For a given turning radius of the robot, we found a proportional relationship between the spike rate of the H1-cell and the relative distance of the FRI from the patterned wall of an experimental arena. Under closed-loop conditions during oscillatory forward movements biased towards the wall, collision avoidance manoeuvres were triggered whenever the H1-cell spike rate exceeded a certain threshold value. We also investigated the FRI behaviour in corners of the arena. The ultimate goal is to enable autonomous and energy-efficient manoeuvring of the FRI within arbitrary visual environments.
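The control rule described here is attractively simple: the H1-cell's spike rate rises as the patterned wall approaches, and an avoidance turn fires once the rate crosses a threshold. A minimal sketch of that rule follows; the threshold, turn rate, and sign convention are placeholders, not the paper's calibrated values.

```python
def avoidance_command(spike_rate_hz, threshold_hz=150.0, turn_rate=1.0):
    """Threshold-triggered avoidance in the spirit of the fly-robot interface.

    The H1-cell fires faster as wide-field motion grows (i.e. as the
    patterned wall gets closer), so a fixed spike-rate threshold converts
    the neural signal into a binary collision-avoidance turn.
    """
    if spike_rate_hz > threshold_hz:
        return -turn_rate   # steer away from the wall
    return 0.0              # continue the biased forward oscillation

cmd_far = avoidance_command(60.0)    # low rate: wall still distant, no turn
cmd_near = avoidance_command(220.0)  # high rate: trigger the avoidance turn
```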
Affiliation(s)
- Jiaqi V Huang
- Department of Bioengineering, Imperial College London, London SW7 2AZ, United Kingdom
10.
Escobar-Alvarez HD, Johnson N, Hebble T, Klingebiel K, Quintero SAP, Regenstein J, Browning NA. R-ADVANCE: Rapid Adaptive Prediction for Vision-based Autonomous Navigation, Control, and Evasion. J Field Robot 2017. [DOI: 10.1002/rob.21744]
Affiliation(s)
- Neil Johnson
- Scientific Systems Company Inc., Woburn, Massachusetts 01801
- Tom Hebble
- Scientific Systems Company Inc., Woburn, Massachusetts 01801
11.
Serres JR, Ruffier F. Optic flow-based collision-free strategies: From insects to robots. Arthropod Struct Dev 2017; 46:703-717. [PMID: 28655645] [DOI: 10.1016/j.asd.2017.06.003]
Abstract
Flying insects are able to fly smartly in an unpredictable environment. It has been found that flying insects have smart neurons inside their tiny brains that are sensitive to visual motion, also called optic flow. Consequently, flying insects rely mainly on visual motion during flight maneuvers such as takeoff or landing, terrain following, tunnel crossing, lateral and frontal obstacle avoidance, and adjusting flight speed in a cluttered environment. Optic flow can be defined as the vector field of the apparent motion of objects, surfaces, and edges in a visual scene generated by the relative motion between an observer (an eye or a camera) and the scene. Translational optic flow is particularly interesting for short-range navigation because it depends on the ratio between (i) the relative linear speed of the visual scene with respect to the observer and (ii) the distance of the observer from obstacles in the surrounding environment, without any direct measurement of either speed or distance. In flying insects, the roll stabilization reflex and yaw saccades attenuate any rotation at the eye level in roll and yaw respectively (i.e. they cancel any rotational optic flow) in order to ensure pure translational optic flow between two successive saccades. Our survey focuses on feedback loops which use the translational optic flow that insects employ for collision-free navigation. Optic flow is likely, over the next decade, to be one of the most important visual cues that can explain flying insects' behaviors for short-range navigation maneuvers in complex tunnels. Conversely, the biorobotic approach can help to develop innovative flight control systems for flying robots with the aim of mimicking flying insects' abilities and better understanding their flight.
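The review's central property — translational optic flow depends only on the speed-to-distance ratio, with neither quantity measured separately — is worth making concrete. The sketch below is a standard textbook formula, not code from the review; the bearing convention is an assumption.

```python
import math

def translational_of(v, d, theta=math.pi / 2):
    """Translational optic flow (rad/s) of an obstacle at bearing theta.

    v : observer speed relative to the scene (m/s)
    d : distance from the observer to the obstacle (m)

    The magnitude (v/d)*sin(theta) depends only on the ratio v/d, never
    on v or d individually.
    """
    return (v / d) * math.sin(theta)

# The scale ambiguity the review stresses: a slow insect near a wall and a
# fast insect twice as far away experience identical optic flow.
of_slow_near = translational_of(v=1.0, d=2.0)
of_fast_far = translational_of(v=2.0, d=4.0)
```

This ambiguity is precisely why optic-flow *regulation* works: holding the flow at a set-point automatically scales clearance with speed, with no rangefinder and no speedometer.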
12.
Roubieu FL, Serres JR, Colonnier F, Franceschini N, Viollet S, Ruffier F. A biomimetic vision-based hovercraft accounts for bees' complex behaviour in various corridors. Bioinspir Biomim 2014; 9:036003. [PMID: 24615558] [DOI: 10.1088/1748-3182/9/3/036003]
Abstract
Here we present the first systematic comparison between the visual guidance behaviour of a biomimetic robot and those of honeybees flying in similar environments. We built a miniature hovercraft which can travel safely along corridors with various configurations. For the first time, we implemented on a real physical robot the 'lateral optic flow regulation autopilot', which we previously studied in computer simulations. This autopilot, inspired by the results of experiments on various species of hymenoptera, consists of two intertwined feedback loops, the speed and lateral control loops, each of which has its own optic flow (OF) set-point. A heading-lock system makes the robot move straight ahead as fast as 69 cm/s with a clearance from one wall as small as 31 cm, giving an unusually high translational OF value (125°/s). Our biomimetic robot was found to navigate safely along straight, tapered and bent corridors, and to react appropriately to perturbations such as the lack of texture on one wall, the presence of a tapering or non-stationary section of the corridor and even a sloping terrain equivalent to a wind disturbance. The front end of the visual system consists of only two local motion sensors (LMS), one on each side. This minimalistic visual system measuring the lateral OF suffices to control both the robot's forward speed and its clearance from the walls without ever measuring any speeds or distances. We added two additional LMSs oriented at +/-45° to improve the robot's performance in stiffly tapered corridors. The simple control system accounts for worker bees' ability to navigate safely in six challenging environments: straight corridors, single walls, tapered corridors, straight corridors with part of one wall moving or missing, as well as in the presence of wind.
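The two intertwined loops can be sketched as one control step: a side loop that regulates the dominant unilateral optic flow to its set-point, and a speed loop that regulates the summed bilateral flow to a second set-point. This is a schematic reading of the autopilot, not its implementation; gains, set-points, and sign conventions below are placeholders.

```python
def of_regulation_autopilot(of_left, of_right,
                            setpoint_side=2.2, setpoint_fwd=5.4,
                            k_side=0.1, k_fwd=0.1):
    """One step of a dual optic-flow-regulation autopilot (sketch).

    of_left, of_right : lateral optic flow measured on each side (rad/s)
    Returns (side_cmd, fwd_cmd): positive side_cmd pushes right,
    positive fwd_cmd speeds up.
    """
    # Lateral loop: regulate whichever side produces the larger OF.
    dominant = max(of_left, of_right)
    direction = 1.0 if of_left > of_right else -1.0  # move away from nearer wall
    side_cmd = direction * k_side * (dominant - setpoint_side)

    # Speed loop: regulate the sum of the two lateral OFs.
    fwd_cmd = -k_fwd * ((of_left + of_right) - setpoint_fwd)
    return side_cmd, fwd_cmd

# Left wall close (large left OF): push right and ease off slightly.
side_cmd, fwd_cmd = of_regulation_autopilot(4.0, 1.0)
```

Because both loops act only on OF errors, the vehicle needs no speed or distance measurement at any point, matching the minimalistic two-sensor front end described above.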
Affiliation(s)
- Frédéric L Roubieu
- Aix-Marseille Université, CNRS, ISM UMR 7287, 13288 Marseille cedex 09, France
13.
Keshavan J, Gremillion G, Escobar-Alvarez H, Humbert JS. A μ analysis-based, controller-synthesis framework for robust bioinspired visual navigation in less-structured environments. Bioinspir Biomim 2014; 9:025011. [PMID: 24852145] [DOI: 10.1088/1748-3182/9/2/025011]
Abstract
Safe, autonomous navigation by aerial microsystems in less-structured environments is a difficult challenge to overcome with current technology. This paper presents a novel visual-navigation approach that combines bioinspired wide-field processing of optic flow information with control-theoretic tools for synthesis of closed loop systems, resulting in robustness and performance guarantees. Structured singular value analysis is used to synthesize a dynamic controller that provides good tracking performance in uncertain environments without resorting to explicit pose estimation or extraction of a detailed environmental depth map. Experimental results with a quadrotor demonstrate the vehicle's robust obstacle-avoidance behaviour in a straight line corridor, an S-shaped corridor and a corridor with obstacles distributed in the vehicle's path. The computational efficiency and simplicity of the current approach offer a promising means of satisfying the payload, power and bandwidth constraints imposed by aerial microsystems.
Affiliation(s)
- J Keshavan
- Autonomous Vehicles Laboratory, Department of Aerospace Engineering, University of Maryland, College Park 20742, USA
14.
Dimble KD, Faddy JM, Humbert JS. Electrolocation-based underwater obstacle avoidance using wide-field integration methods. Bioinspir Biomim 2014; 9:016012. [PMID: 24451219] [DOI: 10.1088/1748-3182/9/1/016012]
Abstract
Weakly electric fish are capable of efficiently performing obstacle avoidance in dark and navigationally challenging aquatic environments using electrosensory information. This sensory modality enables extraction of relevant proximity information about surrounding obstacles by interpretation of perturbations induced in the fish's self-generated electric field. In this paper, reflexive obstacle avoidance is demonstrated by extracting relative proximity information using spatial decompositions of the perturbation signal, also called an electric image. Electrostatics equations were formulated to mathematically express the electric images induced by a straight tunnel in the field generated by a planar electro-sensor model. These equations were further used to design a wide-field integration based static output feedback controller. The controller was implemented in quasi-static simulations for environments with complicated geometries modelled using finite element methods to demonstrate sense and avoid behaviours. The simulation results were confirmed by performing experiments using a computer operated gantry system in environments lined with either conductive or non-conductive objects acting as global stimuli to the field of the electro-sensor. The proposed approach is computationally inexpensive and readily implementable, making real-time underwater autonomous navigation feasible.
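The same wide-field-integration recipe is applied here to a different modality: the 1-D electric image along the body is collapsed onto a few spatial modes, and a static output-feedback gain maps the projections to steering. The odd-mode weighting, gain, and perturbation profile below are illustrative assumptions, not the paper's derived functions.

```python
import numpy as np

def electric_image_feedback(image, gain=0.5):
    """Steering from a 1-D electric image via spatial decomposition (sketch).

    image : field-perturbation amplitudes sampled at electroreceptor sites
            along the body, ordered left tip -> right tip

    A nearby conductive object locally increases the perturbation on its
    side of the body; projecting the image onto an odd (antisymmetric)
    weighting collapses it to a single left/right imbalance, which a
    static output-feedback gain turns into an avoidance yaw command.
    """
    n = len(image)
    s = np.linspace(-1.0, 1.0, n)       # normalised body coordinate
    weight = np.sin(np.pi * s)          # odd spatial mode
    imbalance = np.dot(weight, image) / n
    return -gain * imbalance            # yaw away from the stronger side

# Hypothetical localized perturbation on the right half of the body:
# positive imbalance, hence a leftward (negative) yaw command.
image = np.zeros(64)
image[40:50] = 1.0
cmd = electric_image_feedback(image)
```

As with the optic-flow versions of WFI, the controller never reconstructs object positions; the inner product alone carries enough proximity information for reflexive avoidance.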
15.
Visual control of navigation in insects and its relevance for robotics. Curr Opin Neurobiol 2012; 21:535-43. [PMID: 21689925] [DOI: 10.1016/j.conb.2011.05.020]
Abstract
Flying insects display remarkable agility, despite their diminutive eyes and brains. This review describes our growing understanding of how these creatures use visual information to stabilize flight, avoid collisions with objects, regulate flight speed, detect and intercept other flying insects such as mates or prey, navigate to a distant food source, and orchestrate flawless landings. It also outlines the ways in which these insights are now being used to develop novel, biologically inspired strategies for the guidance of autonomous, airborne vehicles.
16.
Expert F, Viollet S, Ruffier F. Outdoor field performances of insect-based visual motion sensors. J Field Robot 2011. [DOI: 10.1002/rob.20398]
17.
Shoemaker PA, Hyslop AM, Humbert JS. Optic flow estimation on trajectories generated by bio-inspired closed-loop flight. Biol Cybern 2011; 104:339-350. [PMID: 21626306] [DOI: 10.1007/s00422-011-0436-8]
Abstract
We generated panoramic imagery by simulating a fly-like robot carrying an imaging sensor, moving in free flight through a virtual arena bounded by walls, and containing obstructions. Flight was conducted under closed-loop control by a bio-inspired algorithm for visual guidance with feedback signals corresponding to the true optic flow that would be induced on an imager (computed by known kinematics and position of the robot relative to the environment). The robot had dynamics representative of a housefly-sized organism, although simplified to two-degree-of-freedom flight to generate uniaxial (azimuthal) optic flow on the retina in the plane of travel. Surfaces in the environment contained images of natural and man-made scenes that were captured by the moving sensor. Two bio-inspired motion detection algorithms and two computational optic flow estimation algorithms were applied to sequences of image data, and their performance as optic flow estimators was evaluated by estimating the mutual information between outputs and true optic flow in an equatorial section of the visual field. Mutual information for individual estimators at particular locations within the visual field was surprisingly low (less than 1 bit in all cases) and considerably poorer for the bio-inspired algorithms than for the man-made computational algorithms. However, mutual information between weighted sums of these signals and comparable sums of the true optic flow showed significant increases for the bio-inspired algorithms, whereas such improvement did not occur for the computational algorithms. Such summation is representative of the spatial integration performed by wide-field motion-sensitive neurons in the third optic ganglia of flies.
18.
Xu P, Humbert JS, Abshire P. Analog VLSI Implementation of Wide-field Integration Methods. J Intell Robot Syst 2011. [DOI: 10.1007/s10846-011-9549-5]
19.
Hyslop A, Krapp HG, Humbert JS. Control theoretic interpretation of directional motion preferences in optic flow processing interneurons. Biol Cybern 2010; 103:353-364. [PMID: 20694561] [DOI: 10.1007/s00422-010-0404-8]
Abstract
In this article, we formalize the processing of optic flow in identified fly lobula plate tangential cells and develop a control theoretic framework that suggests how the signals of these cells may be combined and used to achieve reflex-like navigation behavior. We show that this feedback gain synthesis task can be cast as a combined static state estimation and linear feedback control problem. Our framework allows us to analyze and determine the relationship between optic flow measurements and actuator commands, which greatly simplifies the implementation of biologically inspired control architectures on terrestrial and aerial robotic platforms.
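The "combined static state estimation and linear feedback" framing has a compact numerical form: measured optic flow is modelled as y = Cx plus noise, where x holds ego-motion states, the estimate is the least-squares projection of y, and the actuator command is a linear feedback on that estimate. The observation model, state values, and gains below are illustrative placeholders, not values from the article.

```python
import numpy as np

rng = np.random.default_rng(0)

# Observation model: optic flow sampled at 60 viewing angles, generated by
# two ego-motion states [yaw_rate, side_slip]. Rotation produces uniform
# flow; lateral translation produces a sinusoidal flow pattern.
gammas = np.linspace(0, 2 * np.pi, 60, endpoint=False)
C = np.column_stack([-np.ones_like(gammas),   # rotation column
                     np.sin(gammas)])         # translation column

x_true = np.array([0.3, -0.1])                # hypothetical state
y = C @ x_true + 0.01 * rng.standard_normal(len(gammas))  # noisy flow field

# Static state estimation: least-squares projection of the flow pattern,
# analogous to the fixed receptive-field weightings of tangential cells.
x_hat = np.linalg.pinv(C) @ y

# Linear feedback on the estimate gives the reflex-like actuator command.
K = np.array([[2.0, 0.0], [0.0, 1.5]])        # illustrative gains
u = -K @ x_hat
```

Because both the estimation and the feedback are fixed linear maps, the whole pipeline collapses to one matrix multiply per frame, which is what makes the architecture attractive for small robotic platforms.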
Affiliation(s)
- Andrew Hyslop
- Department of Aerospace Engineering, University of Maryland, College Park, MD 20742, USA
20.
Hérissé B, Hamel T, Mahony R, Russotto FX. A terrain-following control approach for a VTOL Unmanned Aerial Vehicle using average optical flow. Auton Robots 2010. [DOI: 10.1007/s10514-010-9208-x]
21.
Faruque I, Humbert JS. Dipteran insect flight dynamics. Part 2: Lateral-directional motion about hover. J Theor Biol 2010; 265:306-13. [PMID: 20470783] [DOI: 10.1016/j.jtbi.2010.05.003]
Abstract
The purpose of this study is to determine computationally tractable models describing the lateral-directional motion of a Drosophila-like dipteran insect, which may then be used to estimate the requirements for flight control and stabilization. This study continues the work begun in Faruque and Humbert (2010) to extend the quasi-steady aerodynamics model via inclusion of perturbations from the hover state. The aerodynamics model is considered as forcing upon rigid body dynamics, and frequency-based system identification tools are used to derive the models. The analysis indicates two stable real poles, and two very lightly damped and nearly unstable complex poles, describing a decoupling of roll/sideslip oscillatory motion from a first-order yaw subsidence behavior. The results are presented with uncertainty variation for both a smaller male and larger female phenotype.
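The reported pole pattern — two stable real poles plus a very lightly damped, nearly unstable complex pair — is interpreted through the standard damping-ratio and natural-frequency decomposition of a complex pole. The numbers below are illustrative stand-ins with that qualitative pattern, not the paper's identified values; the point is how the quantities are read off the poles.

```python
import numpy as np

# Illustrative pole set: two stable real poles and a lightly damped,
# nearly unstable complex pair (placeholder values, not identified ones).
poles = np.array([-12.0, -2.5, -0.05 + 3.0j, -0.05 - 3.0j])

# For a complex pole s = -zeta*wn +/- j*wn*sqrt(1 - zeta^2):
complex_pole = poles[np.iscomplex(poles)][0]
wn = abs(complex_pole)            # natural frequency (rad/s)
zeta = -complex_pole.real / wn    # damping ratio

stable = np.all(poles.real < 0)   # all poles in the open left half-plane
lightly_damped = zeta < 0.05      # "very lightly damped" regime
```

A pair like this sits just inside the stability boundary, which is why the abstract's conclusion is that active feedback is needed to stabilize the roll/sideslip oscillation while the yaw mode decays on its own.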
Affiliation(s)
- Imraan Faruque
- University of Maryland, Department of Aerospace Engineering, College Park, MD 20742, USA