1. Jesusanmi OO, Amin AA, Domcsek N, Knight JC, Philippides A, Nowotny T, Graham P. Investigating visual navigation using spiking neural network models of the insect mushroom bodies. Front Physiol 2024; 15:1379977. PMID: 38841209; PMCID: PMC11151298; DOI: 10.3389/fphys.2024.1379977.
Abstract
Ants are capable of learning long visually guided foraging routes with limited neural resources. The visual scene memory needed for this behaviour is mediated by the mushroom bodies, an insect brain region important for learning and memory. In a visual navigation context, the mushroom bodies are theorised to act as familiarity detectors, guiding ants to views that are similar to those previously learned when first travelling along a foraging route. Evidence from behavioural experiments, computational studies and brain lesions all supports this idea. Here we further investigate the role of mushroom bodies in visual navigation with a spiking neural network model learning complex natural scenes. By implementing these networks in GeNN, a library for building GPU-accelerated spiking neural networks, we were able to test these models offline on an image database representing navigation through a complex outdoor natural environment, and also online, embodied on a robot. The mushroom body model successfully learnt a large series of visual scenes (400 scenes corresponding to a 27 m route) and used these memories to choose accurate heading directions during route recapitulation in both complex environments. By analysing our model's Kenyon cell (KC) activity, we demonstrate that KC activity is directly related to the novelty of input images. A parameter search showed a non-linear dependence between the optimal KC to visual projection neuron (VPN) connection sparsity and the length of time the model is presented with an image stimulus. The search also showed that training the model on lower proportions of a route generally produced better accuracy when testing on the entire route. We embodied the mushroom body model and comparator visual navigation algorithms on a Quanser Q-car robot with all processing running on an Nvidia Jetson TX2. On a 6.5 m route, the mushroom body model had a mean distance to the training route (error) of 0.144 ± 0.088 m over 5 trials, performance comparable to standard visual-only navigation algorithms. Thus, we have demonstrated that a biologically plausible model of the ant mushroom body can navigate complex environments both in simulation and in the real world. Understanding the neural basis of this behaviour will provide insight into how neural circuits are tuned to rapidly learn behaviourally relevant information from complex environments, and provide inspiration for creating bio-mimetic computer and robotic systems that can learn rapidly with low energy requirements.
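The familiarity-detection mechanism summarised in this abstract can be caricatured in a few lines of non-spiking code: a fixed sparse random projection onto a large Kenyon cell layer, winner-take-all sparsification, and depression of the output weights of KCs active for learned views, so that the summed output signals novelty. This is a toy sketch of our own, not the authors' GeNN implementation; all sizes and rates are arbitrary stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)
N_VPN, N_KC, FAN_IN = 100, 2000, 10     # layer sizes are arbitrary stand-ins
N_ACTIVE = 100                          # ~5% KC sparsity per view

# Fixed sparse random connectivity: each KC samples FAN_IN visual
# projection neurons (VPNs)
W_in = np.zeros((N_KC, N_VPN))
for i in range(N_KC):
    W_in[i, rng.choice(N_VPN, size=FAN_IN, replace=False)] = 1.0

w_out = np.ones(N_KC)                   # KC -> output weights, depressed by learning

def kc_code(view):
    # Winner-take-all sparsification: only the most strongly driven KCs fire
    drive = W_in @ view
    code = np.zeros(N_KC)
    code[np.argsort(drive)[-N_ACTIVE:]] = 1.0
    return code

def novelty(view):
    # Summed output: large for unfamiliar views, small for learned ones
    return float(w_out @ kc_code(view))

def learn(view, rate=0.9):
    # Depress the output weights of KCs active for a learned view
    global w_out
    w_out = w_out * (1.0 - rate * kc_code(view))

route_view = rng.random(N_VPN)
before = novelty(route_view)            # all weights still at 1
learn(route_view)
after = novelty(route_view)             # the same view now reads as familiar
```

A route follower built on this signal would simply scan candidate headings and move in the direction whose view yields the lowest novelty.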
Affiliation(s)
- Amany Azevedo Amin
- Sussex AI, School of Engineering and Informatics, University of Sussex, Brighton, United Kingdom
- Norbert Domcsek
- Sussex AI, School of Engineering and Informatics, University of Sussex, Brighton, United Kingdom
- James C. Knight
- Sussex AI, School of Engineering and Informatics, University of Sussex, Brighton, United Kingdom
- Andrew Philippides
- Sussex AI, School of Engineering and Informatics, University of Sussex, Brighton, United Kingdom
- Thomas Nowotny
- Sussex AI, School of Engineering and Informatics, University of Sussex, Brighton, United Kingdom
- Paul Graham
- Sussex Neuroscience, School of Life Sciences, University of Sussex, Brighton, United Kingdom
2. Luo J, Miras K, Tomczak J, Eiben AE. Enhancing robot evolution through Lamarckian principles. Sci Rep 2023; 13:21109. PMID: 38036589; PMCID: PMC10689460; DOI: 10.1038/s41598-023-48338-4.
Abstract
Evolutionary robot systems offer two principal advantages: an advanced way of developing robots through evolutionary optimization and a special research platform to conduct what-if experiments regarding questions about evolution. Our study sits at the intersection of these. We investigate the question "What if the 18th-century biologist Lamarck was not completely wrong and individual traits learned during a lifetime could be passed on to offspring through inheritance?" We research this issue through simulations with an evolutionary robot framework where morphologies (bodies) and controllers (brains) of robots are evolvable and robots also can improve their controllers through learning during their lifetime. Within this framework, we compare a Lamarckian system, where learned bits of the brain are inheritable, with a Darwinian system, where they are not. Analyzing simulations based on these systems, we obtain new insights about Lamarckian evolution dynamics and the interaction between evolution and learning. Specifically, we show that Lamarckism amplifies the emergence of 'morphological intelligence', the ability of a given robot body to acquire a good brain by learning, and identify the source of this success: newborn robots have a higher fitness because their inherited brains match their bodies better than those in a Darwinian system.
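A minimal caricature of the Lamarckian/Darwinian comparison described in this abstract: each robot is a (brain, body) pair, lifetime learning hill-climbs the brain toward its body, and the only difference between the two regimes is whether the learned or the inherited brain is passed on. This one-dimensional toy is our own illustration with arbitrary parameters, not the authors' framework.

```python
import random

random.seed(1)

def fitness(brain, body):
    # Toy fitness: best when the brain parameter matches the body parameter
    return -abs(brain - body)

def learn(brain, body, steps=60, step=0.1):
    # Lifetime learning: random hill-climbing of the brain toward its body
    for _ in range(steps):
        candidate = brain + random.uniform(-step, step)
        if fitness(candidate, body) > fitness(brain, body):
            brain = candidate
    return brain

def evolve(lamarckian, generations=30, pop=20):
    # Each robot is a (brain, body) pair; both are evolvable
    population = [(random.uniform(-1, 1), random.uniform(-1, 1)) for _ in range(pop)]
    for _ in range(generations):
        scored = []
        for brain, body in population:
            learned = learn(brain, body)
            heritable = learned if lamarckian else brain   # the only difference
            scored.append((fitness(learned, body), heritable, body))
        scored.sort(key=lambda t: t[0], reverse=True)
        parents = scored[:pop // 2]
        population = [(b + random.gauss(0, 0.05), body + random.gauss(0, 0.05))
                      for _, b, body in parents for _ in range(2)]
    # Mean fitness of newborn robots *before* any learning
    return sum(fitness(brain, body) for brain, body in population) / pop

lam = evolve(lamarckian=True)
dar = evolve(lamarckian=False)
# Lamarckian newborns inherit brains that already match their bodies,
# so their pre-learning fitness should be higher
```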
Affiliation(s)
- Jie Luo
- Vrije Universiteit Amsterdam, Amsterdam, The Netherlands
- Karine Miras
- Vrije Universiteit Amsterdam, Amsterdam, The Netherlands
- Jakub Tomczak
- Eindhoven University of Technology, Eindhoven, The Netherlands
3. Damiano L, Stano P. Explorative Synthetic Biology in AI: Criteria of Relevance and a Taxonomy for Synthetic Models of Living and Cognitive Processes. Artif Life 2023; 29:367-387. PMID: 37490711; DOI: 10.1162/artl_a_00411.
Abstract
This article tackles the topic of the special issue "Biology in AI: New Frontiers in Hardware, Software and Wetware Modeling of Cognition" in two ways. It addresses the problem of the relevance of hardware, software, and wetware models for the scientific understanding of biological cognition, and it clarifies the contributions that synthetic biology, construed as the synthetic exploration of cognition, can offer to artificial intelligence (AI). The work proposed here is based on the idea that the relevance of hardware, software, and wetware models of biological and cognitive processes (that is, the concrete contribution these models can make to the scientific understanding of life and cognition) is still unclear, mainly because of the lack of explicit criteria to assess in what ways synthetic models can support the experimental exploration of biological and cognitive phenomena. We draw on elements from cybernetic and autopoietic epistemology to define a framework of reference for the synthetic study of life and cognition, one capable of generating a set of assessment criteria and a classification of forms of relevance for synthetic models, able to overcome the sterile, traditional polarization of their evaluation between mere imitation and full reproduction of the target processes. On the basis of these tools, we tentatively map the forms of relevance characterizing the wetware models of living and cognitive processes that synthetic biology can produce, and outline a programmatic direction for the development of "organizationally relevant approaches" applying synthetic biology techniques to the investigative field of (embodied) AI.
Affiliation(s)
- Luisa Damiano
- IULM University, Research Group on the Epistemology of the Sciences of the Artificial, Department of Communication, Arts, and Media
- Pasquale Stano
- University of Salento, Department of Biological and Environmental Sciences and Technologies
4. Flammang BE. Bioinspired Design in Research: Evolution as Beta-Testing. Integr Comp Biol 2022; 62:icac134. PMID: 35933125; DOI: 10.1093/icb/icac134.
Abstract
Modern fishes represent over 400 million years of evolutionary processes that, in many cases, resulted in selection for phenotypes with particular performance advantages. While this certainly occurred without a trajectory for optimization, it cannot be denied that some morphologies allow organisms to be more effective than others at tasks like evading predation, securing food, and ultimately passing on their genes. In this way, evolution generates a series of iterative prototypes with varying but measurable success in accomplishing objectives. Therefore, careful analysis of the fundamental properties underlying biological phenomena allows us to fast-track the development of bioinspired technologies aiming to accomplish similar objectives. At the same time, bioinspired designs can be a way to explore evolutionary processes, by better understanding the performance space within which a given morphology operates. Through strong interdisciplinary collaborations, we can develop novel bioinspired technologies that not only excel as robotic devices but also teach us something about biology and the rules of life in the process.
Affiliation(s)
- Brooke E Flammang
- Department of Biological Sciences, New Jersey Institute of Technology, 323 Dr. Martin Luther King, Jr. Blvd., 07102, NJ, USA
5. Datteri E. The creation of phenomena in interactive biorobotics. Biol Cybern 2021; 115:629-642. PMID: 34714419; PMCID: PMC8642366; DOI: 10.1007/s00422-021-00900-x.
Abstract
In so-called interactive biorobotics, robotic models of living systems interact with animals in controlled experimental settings. By observing how the focal animal reacts to the stimuli delivered by the robot, one tests hypotheses concerning the determinants of animal behaviour in social contexts. Building on previous methodological reconstructions of interactive biorobotics, this article reflects on the claim, made by several authors in the field, that this strategy may enable one to explain social phenomena in animals. The answer offered here will be negative: interactive biorobotics does not contribute to the explanation of social phenomena. However, it may greatly contribute to the study of animal behaviour by creating social phenomena in the sense discussed by Ian Hacking, i.e. by precisely defining new phenomena to be explained. It will also be suggested that interactive biorobotics can be combined with more classical robot-based approaches to the study of living systems, leading to a so-called simulation-interactive strategy for the mechanistic explanation of social behaviour in animals.
Affiliation(s)
- Edoardo Datteri
- RobotiCSS Lab, Laboratory of Robotics for the Cognitive and Social Sciences, Department of Human Sciences for Education, University of Milano-Bicocca, Milano, Italy.
6.
Abstract
In recent studies, robots are used to stimulate living systems in controlled experimental settings. This research strategy is here called interactive biorobotics, to distinguish it from classical biorobotics, in which robots are used to simulate, rather than to stimulate, living system behavior. This article offers a methodological analysis of interactive biorobotics and has two goals. The first is to argue that interactive biorobotics is methodologically different, in some important respects, from classical biorobotics and from countless instances of model-based science. It will be shown that interactive biorobotics does not conform to the so-called "understanding by building" approach or synthetic method, and that it illustrates a novel use of models in science. The second goal is to reflect on the logic of interactive biorobotics. A distinction will be made between two classes of studies, which will be called "proximal" and "distal." In proximal studies, experiments involving robot-animal interaction are brought to bear on theoretical hypotheses on robot-animal interaction. In distal studies, experiments involving robot-animal interaction are brought to bear on theoretical hypotheses on animal-animal interaction. Distal studies involve logical steps which may be particularly hard to justify. This distinction, together with a methodological reflection on the relationship between the context in which the experiments are carried out and the context in which the conclusions are expected to hold, will lead to a checklist of questions which may be useful to justify and evaluate the validity of interactive biorobotics studies. The reconstruction of the logic of interactive biorobotics made here, though preliminary, may contribute to justifying the important role that robots, as tools for stimulating living systems, can play in the contemporary life sciences.
Affiliation(s)
- Edoardo Datteri
- RobotiCSS Lab - Laboratory of Robotics for the Cognitive and Social Sciences, Department of Human Sciences for Education, University of Milano-Bicocca, Milan, Italy
7. Roberts SF, Koditschek DE, Miracchi LJ. Examples of Gibsonian Affordances in Legged Robotics Research Using an Empirical, Generative Framework. Front Neurorobot 2020; 14:12. PMID: 32153382; PMCID: PMC7044146; DOI: 10.3389/fnbot.2020.00012.
Abstract
Evidence from the empirical literature suggests that explainable complex behaviors can be built from structured compositions of explainable component behaviors with known properties. Such component behaviors can be built to directly perceive and exploit affordances. Using six examples of recent research in legged robot locomotion, we suggest that robots can be programmed to effectively exploit affordances without developing explicit internal models of them. We use a generative framework to discuss the examples, because it helps us to separate, and thus clarify the relationship between, the description of affordance exploitation and the description of the internal representations used by the robot in that exploitation. Under this framework, details of the architecture and environment are related to the emergent behavior of the system via a generative explanation. For example, the specific method of information processing a robot uses might be related to the affordance the robot is designed to exploit via a formal analysis of its control policy. By considering the mutuality of the agent-environment system during robot behavior design, roboticists can thus develop robust architectures which implicitly exploit affordances. The manner of this exploitation is made explicit by a well-constructed generative explanation.
Affiliation(s)
- Sonia F Roberts
- Department of Electrical and Systems Engineering, University of Pennsylvania, Philadelphia, PA, United States
- Daniel E Koditschek
- Department of Electrical and Systems Engineering, University of Pennsylvania, Philadelphia, PA, United States
- Lisa J Miracchi
- Department of Philosophy, University of Pennsylvania, Philadelphia, PA, United States
8.
9. Serres JR, Ruffier F. Optic flow-based collision-free strategies: From insects to robots. Arthropod Struct Dev 2017; 46:703-717. PMID: 28655645; DOI: 10.1016/j.asd.2017.06.003.
Abstract
Flying insects are able to fly smartly in unpredictable environments. Their tiny brains contain neurons that are sensitive to visual motion, also called optic flow. Consequently, flying insects rely mainly on visual motion during flight maneuvers such as takeoff or landing, terrain following, tunnel crossing, lateral and frontal obstacle avoidance, and adjusting flight speed in a cluttered environment. Optic flow can be defined as the vector field of the apparent motion of objects, surfaces, and edges in a visual scene generated by the relative motion between an observer (an eye or a camera) and the scene. Translational optic flow is particularly interesting for short-range navigation because it depends on the ratio between (i) the relative linear speed of the visual scene with respect to the observer and (ii) the distance of the observer from obstacles in the surrounding environment, without any direct measurement of either speed or distance. In flying insects, the roll stabilization reflex and yaw saccades attenuate any rotation at the eye level in roll and yaw respectively (i.e. they cancel rotational optic flow) in order to ensure purely translational optic flow between two successive saccades. Our survey focuses on the feedback loops through which insects use translational optic flow for collision-free navigation. Over the next decade, optic flow is likely to be one of the most important visual cues for explaining flying insects' behaviour during short-range navigation maneuvers in complex tunnels. Conversely, the biorobotic approach can help to develop innovative flight control systems for flying robots, with the aim of mimicking flying insects' abilities and better understanding their flight.
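The translational optic flow cue described in this abstract, the ratio of relative speed to distance, can be put to work in a classic flow-balance centering rule: steer away from the side with the larger flow. The following sketch is our own illustration with arbitrary gains, not a model taken from the survey.

```python
def translational_flow(v, d):
    # omega = v / d: apparent angular speed (rad/s) of a surface abeam of the
    # agent; available from vision without measuring v or d separately
    return v / d

def centering_step(y, v=1.0, width=2.0, gain=0.05):
    # y: lateral offset from the tunnel midline (positive = toward right wall)
    d_left, d_right = width / 2 + y, width / 2 - y
    # steer away from the side with the larger translational flow
    imbalance = translational_flow(v, d_right) - translational_flow(v, d_left)
    return y - gain * imbalance

y = 0.6                      # start well off-centre in the tunnel
for _ in range(200):
    y = centering_step(y)
# the flow-balance rule drives y back toward the midline
```

Balancing left and right flow recovers the tunnel-centering behaviour without the agent ever knowing its speed or the wall distances individually.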
10. Blustein D, Rosenthal N, Ayers J. Designing and implementing nervous system simulations on LEGO robots. J Vis Exp 2013:e50519. PMID: 23728477; DOI: 10.3791/50519.
Abstract
We present a method to use the commercially available LEGO Mindstorms NXT robotics platform to test systems level neuroscience hypotheses. The first step of the method is to develop a nervous system simulation of specific reflexive behaviors of an appropriate model organism; here we use the American Lobster. Exteroceptive reflexes mediated by decussating (crossing) neural connections can explain an animal's taxis towards or away from a stimulus as described by Braitenberg and are particularly well suited for investigation using the NXT platform.(1) The nervous system simulation is programmed using LabVIEW software on the LEGO Mindstorms platform. Once the nervous system is tuned properly, behavioral experiments are run on the robot and on the animal under identical environmental conditions. By controlling the sensory milieu experienced by the specimens, differences in behavioral outputs can be observed. These differences may point to specific deficiencies in the nervous system model and serve to inform the iteration of the model for the particular behavior under study. This method allows for the experimental manipulation of electronic nervous systems and serves as a way to explore neuroscience hypotheses specifically regarding the neurophysiological basis of simple innate reflexive behaviors. The LEGO Mindstorms NXT kit provides an affordable and efficient platform on which to test preliminary biomimetic robot control schemes. The approach is also well suited for the high school classroom to serve as the foundation for a hands-on inquiry-based biorobotics curriculum.
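The decussating (crossed) exteroceptive reflex described in this abstract can be sketched as a Braitenberg-style controller: with crossed excitatory connections, the stronger-stimulated sensor drives the opposite motor, turning the robot toward the stimulus. This is a generic illustration with hypothetical values, not the authors' LabVIEW program.

```python
def braitenberg_step(sensor_left, sensor_right, crossed=True, base=0.2):
    # Crossed (decussating) excitatory connections: the left sensor drives
    # the right motor and vice versa, producing taxis TOWARD the stimulus.
    if crossed:
        motor_left, motor_right = base + sensor_right, base + sensor_left
    else:
        # Uncrossed connections produce taxis AWAY from the stimulus
        motor_left, motor_right = base + sensor_left, base + sensor_right
    # Differential drive: positive turn value means the robot turns right
    return motor_left - motor_right

# Stimulus on the left (left sensor more strongly excited):
toward = braitenberg_step(0.8, 0.1, crossed=True)    # negative: turns left
away = braitenberg_step(0.8, 0.1, crossed=False)     # positive: turns right
```

Flipping the single `crossed` flag switches the robot between positive and negative taxis, which is the kind of nervous-system manipulation the abstract describes testing against animal behaviour.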
11. Laschi C, Patanè F, Maini ES, Manfredi L, Teti G, Zollo L, Guglielmelli E, Dario P. An Anthropomorphic Robotic Head for Investigating Gaze Control. Adv Robot 2012. DOI: 10.1163/156855308x291845.
Affiliation(s)
- Cecilia Laschi
- ARTS Lab (Advanced Robotics Technology and Systems Laboratory), Scuola Superiore Sant'Anna, Piazza Martiri della Libertà 33, 56127 Pisa, Italy
- Francesco Patanè
- ARTS Lab (Advanced Robotics Technology and Systems Laboratory), Scuola Superiore Sant'Anna, Piazza Martiri della Libertà 33, 56127 Pisa, Italy
- Eliseo Stefano Maini
- ARTS Lab (Advanced Robotics Technology and Systems Laboratory), Scuola Superiore Sant'Anna, Piazza Martiri della Libertà 33, 56127 Pisa, Italy
- Luigi Manfredi
- PhD School of Biorobotics Science and Engineering, IMT Institute of Advanced Studies, Via San Micheletto 3, 55100 Lucca, Italy
- Giancarlo Teti
- ARTS Lab (Advanced Robotics Technology and Systems Laboratory), Scuola Superiore Sant'Anna, Piazza Martiri della Libertà 33, 56127 Pisa, Italy; current address: RoboTech srl, Peccioli (Pisa), Italy
- Loredana Zollo
- Laboratory of Biomedical Robotics & EMC, Campus Bio-Medico University, Via Longoni 83, 00155 Rome, Italy
- Eugenio Guglielmelli
- Laboratory of Biomedical Robotics & EMC, Campus Bio-Medico University, Via Longoni 83, 00155 Rome, Italy
- Paolo Dario
- ARTS Lab (Advanced Robotics Technology and Systems Laboratory), Scuola Superiore Sant'Anna, Piazza Martiri della Libertà 33, 56127 Pisa, Italy
12. Amudha M, Ahamed Khan MKA, Elamvazuthi I, Aliah Abd Jamil, Vasant P, Ganesan T. Development of a modular general purpose controller board for biologically inspired robot. In: 2011 IEEE International Conference on Control System, Computing and Engineering. DOI: 10.1109/iccsce.2011.6190582.
13. Petrou G, Webb B. Detailed tracking of body and leg movements of a freely walking female cricket during phonotaxis. J Neurosci Methods 2011; 203:56-68. PMID: 21951620; DOI: 10.1016/j.jneumeth.2011.09.011.
Abstract
We describe a semi-automated tracking system for insect motion based on commercially available high-speed video cameras and freely available software. We use it to collect detailed three-dimensional kinematic information from female crickets performing free walking phonotaxis towards a calling song stimulus. We mark the insect's joints with small dots of paint and record the movements from underneath with a pair of cameras following the insect as it walks on the transparent floor of an arena. Tracking is done offline, utilizing a kinematic model to constrain the processing. We can obtain the positions and angles of all joints of all legs and six additional body joints, synchronised with stance-swing transitions and the sound pattern, at a 300 Hz frame rate. This data will be used in the further development of models of neural control of phonotaxis.
Affiliation(s)
- Georgios Petrou
- Institute of Perception, Action and Behaviour, School of Informatics, University of Edinburgh, 10 Crichton Street, Edinburgh, EH8 9AB, UK.
14. Long JH, Krenitsky NM, Roberts SF, Hirokawa J, de Leeuw J, Porter ME. Testing Biomimetic Structures in Bioinspired Robots: How Vertebrae Control the Stiffness of the Body and the Behavior of Fish-Like Swimmers. Integr Comp Biol 2011; 51:158-75. DOI: 10.1093/icb/icr020.
15. Witney AG, Hedwig B. Kinematics of phonotactic steering in the walking cricket Gryllus bimaculatus (de Geer). J Exp Biol 2011; 214:69-79. PMID: 21147970; DOI: 10.1242/jeb.044800.
Abstract
Female crickets, Gryllus bimaculatus, are attracted by the male calling song and approach singing males, a behaviour known as phonotaxis. Even tethered females walking on a trackball steer towards a computer-generated male song presented from their left or right side. High-speed video analysis showed how this auditory-evoked steering was integrated with walking. Typically, all the front and middle legs showed kinematic adjustments during steering, with their trajectories tilted towards the side of acoustic stimulation. Furthermore, the average speed of the tarsi contralateral to the song increased relative to the ipsilateral tarsi. Kinematic changes of the hind legs were small and may be a consequence of the front- and middle-leg adjustments. Although phonotactic steering generally led to stereotyped adjustments, there were differences in the specific combination of kinematic changes in leg trajectories. The most reliable kinematic steering response was by the contralateral front leg: during its swing phase, the tarsus moved towards the side of acoustic stimulation through an increased forward rotation of the femur and an increased extension of the tibia. Relating the changes in tarsal positioning of each leg to the steering velocity of the animal indicated that typically the front and middle legs contralateral to the song generated the turning forces. Phonotactic steering was integrated into forward walking without changes to the walking motor cycle.
Affiliation(s)
- Alice G Witney
- Department of Physiology, Trinity College Institute of Neuroscience and Trinity Centre for Bioengineering, Trinity College Dublin, Dublin 2, Ireland
16. Guerra RDS, Aonuma H, Hosoda K, Asada M. Semi-automatic behavior analysis using robot/insect mixed society and video tracking. J Neurosci Methods 2010; 191:138-44. PMID: 20600321; DOI: 10.1016/j.jneumeth.2010.06.013.
Abstract
This paper proposes a novel robot/insect mixed society setup which enhances the possibilities for insect behavioral research and can be used as a powerful tool for interdisciplinary studies on insect behavior. Micro-robots are equipped with decoys so as to allow a controlled dynamic interaction with crickets, Gryllus bimaculatus. A camera records the interaction and the video is later processed for the automatic tracking of each encounter between cricket and robot. A novelty of our method lies in using the robots as tools for the controlled evoking of specific insect behaviors rather than trying to build an insect-like robot. The possibility for performing controlled repeatable movements allows the stimulation of certain insect behaviors that are usually difficult to trigger using insects alone, allowing consistent behavioral research. A set of experiments were performed in order to validate the proposed setup. We also demonstrate the use of our setup for stimulating agonistic behavior during an electromyography recording session.
17. Mhatre N, Balakrishnan R. Predicting acoustic orientation in complex real-world environments. J Exp Biol 2008; 211:2779-85. PMID: 18723535; DOI: 10.1242/jeb.017756.
Abstract
Animals have to accomplish several tasks in their lifetime, such as finding food and mates and avoiding predators. Animals that locate these using sound need to detect, recognize and localize appropriate acoustic objects in their environment, typically in noisy, non-ideal conditions. Quantitative models attempting to explain or predict animal behaviour should be able to accurately simulate behaviour in such complex, real-world conditions. Female crickets locate potential mates in choruses of simultaneously calling males. In the present study, we have tested field cricket acoustic orientation behaviour in complex acoustic conditions in the field and also successfully predicted female orientation and paths under these conditions using a simulation model based on auditory physiology. Such simulation models can provide powerful tools to predict and dissect patterns of behaviour in complex, natural environments.
Affiliation(s)
- Natasha Mhatre
- Centre for Ecological Sciences, Indian Institute of Science, Bangalore, 560012, India
18. Webb B. Chapter 1: Using Robots to Understand Animal Behavior. Adv Study Behav 2008. DOI: 10.1016/s0065-3454(08)00001-6.
19. Gopal V, Hartmann MJZ. Using hardware models to quantify sensory data acquisition across the rat vibrissal array. Bioinspir Biomim 2007; 2:S135-S145. PMID: 18037723; DOI: 10.1088/1748-3182/2/4/s03.
Abstract
Our laboratory investigates how animals acquire sensory data to understand the neural computations that permit complex sensorimotor behaviors. We use the rat whisker system as a model to study active tactile sensing; our aim is to quantitatively describe the spatiotemporal structure of incoming sensory information to place constraints on subsequent neural encoding and processing. In the first part of this paper we describe the steps in the development of a hardware model (a 'sensobot') of the rat whisker array that can perform object feature extraction. We show how this model provides insights into the neurophysiology and behavior of the real animal. In the second part of this paper, we suggest that sensory data acquisition across the whisker array can be quantified using the complete derivative. We use the example of wall-following behavior to illustrate that computing the appropriate spatial gradients across a sensor array would enable an animal or mobile robot to predict the sensory data that will be acquired at the next time step.
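The complete-derivative idea in this abstract, predicting the sensory data acquired at the next time step from spatial gradients measured across the array, can be sketched as follows. For a static sensory field s sampled by a moving array, dm/dt = v · grad(s), so the next centre reading is predictable from across-array differences. The field and the four-sensor array here are toy constructions of our own, not the authors' sensobot.

```python
import numpy as np

def field(x, y):
    # Toy static sensory field (e.g. a distance-like signal), with a
    # feature at the origin standing in for a wall corner
    return np.hypot(x, y)

# Offsets of four vibrissa-like sensors from the array centre
offsets = np.array([[0.1, 0.0], [-0.1, 0.0], [0.0, 0.1], [0.0, -0.1]])

def read_array(pos):
    return np.array([field(*(pos + o)) for o in offsets])

def predict_next(pos, vel, dt):
    """Predict the centre reading at t+dt via the complete derivative:
    dm/dt = v . grad(s), with grad(s) estimated by central differences
    across the sensor array."""
    s = read_array(pos)
    grad = np.array([(s[0] - s[1]) / 0.2, (s[2] - s[3]) / 0.2])
    return field(*pos) + dt * vel @ grad

pos = np.array([2.0, 1.0])      # array centre position
vel = np.array([0.3, -0.1])     # array velocity
dt = 0.05
pred = predict_next(pos, vel, dt)
actual = field(*(pos + vel * dt))
# pred and actual agree to first order in dt
```

The mismatch between prediction and the next actual reading is exactly the kind of signal a wall-following robot (or animal) could use to detect unexpected changes in its surroundings.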
Affiliation(s)
- Venkatesh Gopal
- Department of Biomedical Engineering, 2145 Sheridan Road, Northwestern University, Evanston, IL 60208, USA.