1
Schoepe T, Janotte E, Milde MB, Bertrand OJN, Egelhaaf M, Chicca E. Finding the gap: neuromorphic motion-vision in dense environments. Nat Commun 2024; 15:817. PMID: 38280859. PMCID: PMC10821932. DOI: 10.1038/s41467-024-45063-y.
Abstract
Animals have evolved mechanisms to travel safely and efficiently within different habitats. On a journey through dense terrain, animals avoid collisions and cross narrow passages while maintaining an overall course. Multiple hypotheses address how animals solve the challenges faced during such travel. Here we show that a single mechanism enables safe and efficient travel. We developed an insect-inspired robot with remarkable capabilities for travelling in dense terrain: it avoids collisions, crosses gaps and selects safe passages. These capabilities are accomplished by a neuromorphic network that steers the robot toward regions of low apparent motion. Our system leverages knowledge about vision processing and obstacle avoidance in insects. Our results demonstrate how insects might travel safely through diverse habitats. We anticipate that our system will serve as a working hypothesis for studying insects' travels in dense terrain. Furthermore, it illustrates that novel hardware systems can be designed by understanding the mechanisms underlying behaviour.
Affiliation(s)
- Thorben Schoepe
- Peter Grünberg Institut 15, Forschungszentrum Jülich, Aachen, Germany
- Faculty of Technology and Cognitive Interaction Technology Center of Excellence (CITEC), Bielefeld University, Bielefeld, Germany
- Bio-Inspired Circuits and Systems (BICS) Lab, Zernike Institute for Advanced Materials, University of Groningen, Groningen, Netherlands
- CogniGron (Groningen Cognitive Systems and Materials Center), University of Groningen, Groningen, Netherlands
- Ella Janotte
- Event Driven Perception for Robotics, Italian Institute of Technology, iCub facility, Genoa, Italy
- Moritz B Milde
- International Centre for Neuromorphic Systems, MARCS Institute, Western Sydney University, Penrith, Australia
- Martin Egelhaaf
- Neurobiology, Faculty of Biology, Bielefeld University, Bielefeld, Germany
- Elisabetta Chicca
- Faculty of Technology and Cognitive Interaction Technology Center of Excellence (CITEC), Bielefeld University, Bielefeld, Germany
- Bio-Inspired Circuits and Systems (BICS) Lab, Zernike Institute for Advanced Materials, University of Groningen, Groningen, Netherlands
- CogniGron (Groningen Cognitive Systems and Materials Center), University of Groningen, Groningen, Netherlands
2
Müller MM, Scherer J, Unterbrink P, Bertrand OJN, Egelhaaf M, Boeddeker N. The Virtual Navigation Toolbox: Providing tools for virtual navigation experiments. PLoS One 2023; 18:e0293536. PMID: 37943845. PMCID: PMC10635524. DOI: 10.1371/journal.pone.0293536.
Abstract
Spatial navigation research in humans increasingly relies on experiments using virtual reality (VR) tools, which allow the creation of highly flexible and immersive study environments that can react to participant interaction in real time. Despite the popularity of VR, tools that simplify the creation and data management of such experiments are rare and often restricted to a specific scope, limiting usability and comparability. To overcome these limitations, we introduce the Virtual Navigation Toolbox (VNT), a collection of interchangeable and independent tools for developing spatial navigation VR experiments with the popular Unity game engine. The VNT's features are packaged in loosely coupled, reusable modules, facilitating convenient implementation of diverse experimental designs. Here, we show how the VNT fulfils the feature requirements of different VR environments and experiments, and walk through the implementation and execution of a showcase study using the toolbox. The showcase study reveals that homing performance in a classic triangle completion task is invariant to the translation velocity of the participant's avatar, but highly sensitive to the number of landmarks. The VNT is freely available under a Creative Commons license, and we invite researchers to contribute, extending and improving its tools via the provided repository.
Affiliation(s)
- Martin M. Müller
- Department of Neurobiology, Bielefeld University, Bielefeld, NRW, Germany
- Jonas Scherer
- Department of Neurobiology, Bielefeld University, Bielefeld, NRW, Germany
- Patrick Unterbrink
- Department of Neurobiology, Bielefeld University, Bielefeld, NRW, Germany
- Martin Egelhaaf
- Department of Neurobiology, Bielefeld University, Bielefeld, NRW, Germany
- Norbert Boeddeker
- Department of Cognitive Neuroscience, Bielefeld University, Bielefeld, NRW, Germany
3
Bertrand OJN, Sonntag A. The potential underlying mechanisms during learning flights. J Comp Physiol A Neuroethol Sens Neural Behav Physiol 2023. PMID: 37204434. DOI: 10.1007/s00359-023-01637-7.
Abstract
Hymenopterans, such as bees and wasps, have long fascinated researchers with the sinuous movements they perform at novel locations. These movements, such as loops, arcs, or zigzags, help the insects learn their surroundings at important locations and allow them to explore and orient themselves in their environment. Once they have gained experience with their environment, the insects fly along optimized paths guided by several strategies, such as path integration, local homing, and route following, which together form a navigational toolkit. Whereas experienced insects combine these strategies efficiently, naive insects must learn about their surroundings and tune the navigational toolkit. We will see that the structure of the movements performed during learning flights leverages the robustness of certain strategies at a given scale to tune other strategies that are more efficient at a larger scale. An insect can thus explore its environment incrementally without risking the loss of essential locations.
Affiliation(s)
- Olivier J N Bertrand
- Neurobiology, Bielefeld University, Universitätstr. 25, 33615 Bielefeld, NRW, Germany
- Annkathrin Sonntag
- Neurobiology, Bielefeld University, Universitätstr. 25, 33615 Bielefeld, NRW, Germany
4
Bertrand OJN, Doussot C, Siesenop T, Ravi S, Egelhaaf M. Visual and movement memories steer foraging bumblebees along habitual routes. J Exp Biol 2021; 224:269087. PMID: 34115117. DOI: 10.1242/jeb.237867.
Abstract
One persistent question in animal navigation is how animals follow habitual routes between their home and a food source. Our current understanding of insect navigation suggests an interplay between visual memories, collision avoidance and path integration, i.e. the continuous integration of distance and direction travelled. However, these behavioural modules have to be continuously updated with instantaneous visual information. To alleviate this need, an insect could learn and replicate habitual movements ('movement memories') around objects (e.g. a bent trajectory around an object) to reach its destination. We investigated whether bumblebees, Bombus terrestris, learn and use movement memories en route to their home. Using a novel experimental paradigm, we trained bumblebees to establish a habitual route in a flight tunnel containing 'invisible' obstacles. We then confronted them with conflicting cues leading to different choice directions depending on whether they relied on movement or visual memories. The results suggest that the bees use movement memories to navigate, but also rely on visual memories to resolve conflicting situations. We investigated whether the observed behaviour could be explained by other guidance systems, such as path integration or optic flow-based flight control, and found that neither was sufficient.
Affiliation(s)
- Olivier J N Bertrand
- Department of Neurobiology and Cognitive Interaction Technology Center of Excellence (CITEC), Bielefeld University, D-33501 Bielefeld, Germany
- Charlotte Doussot
- Department of Neurobiology and Cognitive Interaction Technology Center of Excellence (CITEC), Bielefeld University, D-33501 Bielefeld, Germany
- Tim Siesenop
- Department of Neurobiology and Cognitive Interaction Technology Center of Excellence (CITEC), Bielefeld University, D-33501 Bielefeld, Germany
- Sridhar Ravi
- Department of Neurobiology and Cognitive Interaction Technology Center of Excellence (CITEC), Bielefeld University, D-33501 Bielefeld, Germany; School of Engineering, RMIT University, Melbourne, VIC 3083, Australia
- Martin Egelhaaf
- Department of Neurobiology and Cognitive Interaction Technology Center of Excellence (CITEC), Bielefeld University, D-33501 Bielefeld, Germany
5
Gonsek A, Jeschke M, Rönnau S, Bertrand OJN. From Paths to Routes: A Method for Path Classification. Front Behav Neurosci 2021; 14:610560. PMID: 33551764. PMCID: PMC7859641. DOI: 10.3389/fnbeh.2020.610560.
Abstract
Many animals establish, learn and optimize routes between locations to commute efficiently. One step in understanding route following is defining measures of similarity between the paths taken by the animals. Paths have commonly been compared using several descriptors (e.g., speed, distance travelled, or the amount of meandering), or have been classified visually into categories by the experimenters. However, similar values of such descriptors do not guarantee similar paths, and qualitative classification by experimenters is prone to observer bias. Here we propose a novel method to classify paths by their similarity, combining different distance functions and clustering algorithms, and demonstrate it on the trajectories of bumblebees flying through a cluttered environment. We established a method based on two distance functions, Dynamic Time Warping and the Fréchet distance. For every pair of trajectories, the distance was calculated with each measure. Based on these distance values, we grouped similar trajectories by applying the Monte Carlo Reference-Based Consensus Clustering algorithm. Our procedure provides new options for trajectory analysis based on path similarities in a variety of experimental paradigms.
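One of the two distance functions named in the abstract, Dynamic Time Warping, can be sketched in a few lines. The function below is an illustrative textbook implementation of DTW over 2-D positions (the function name and the Euclidean local cost are our choices, not taken from the paper); the Fréchet distance and the consensus clustering step are not shown.

```python
import numpy as np

def dtw_distance(path_a, path_b):
    """Dynamic Time Warping distance between two 2-D trajectories.

    path_a, path_b: arrays of shape (n, 2) and (m, 2) of x/y positions.
    Returns the accumulated cost of the optimal alignment.
    """
    n, m = len(path_a), len(path_b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(path_a[i - 1] - path_b[j - 1])
            # the optimal alignment may match, stretch, or compress samples
            cost[i, j] = d + min(cost[i - 1, j],
                                 cost[i, j - 1],
                                 cost[i - 1, j - 1])
    return cost[n, m]
```

Identical trajectories yield distance 0; a trajectory shifted rigidly by one unit yields a distance of one unit per matched sample.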
6
Doussot C, Bertrand OJN, Egelhaaf M. The Critical Role of Head Movements for Spatial Representation During Bumblebees Learning Flight. Front Behav Neurosci 2021; 14:606590. PMID: 33542681. PMCID: PMC7852487. DOI: 10.3389/fnbeh.2020.606590.
Abstract
Bumblebees perform complex flight maneuvers around the barely visible entrance of their nest upon their first departures. During these flights, bees learn visual information about the surroundings, possibly including their spatial layout, and rely on this information to return home. Depth information can be derived from the apparent motion of the scenery on the bees' retina. This motion is shaped by the animal's flight and orientation: bees employ a saccadic flight and gaze strategy, in which rapid turns of the head (saccades) alternate with flight segments of apparently constant gaze direction (intersaccades). When the gaze direction is kept relatively constant during intersaccades, the apparent motion contains information about the distance of the animal to environmental objects, i.e., in an egocentric reference frame. Alternatively, when the gaze direction rotates around a fixed point in space, the animal perceives the depth structure relative to this pivot point, i.e., in an allocentric reference frame. If the pivot point lies at the nest hole, the information is nest-centric. Here, we investigate in which reference frames bumblebees perceive depth information during their learning flights. By precisely tracking head orientation, we found that the head appears to pivot actively about half of the time. However, only a few of the corresponding pivot points lie close to the nest entrance. Our results indicate that bumblebees perceive visual information in several reference frames when learning about the surroundings of a behaviorally relevant location.
Affiliation(s)
- Charlotte Doussot
- Department of Neurobiology, University of Bielefeld, Bielefeld, Germany
7
Odenthal L, Doussot C, Meyer S, Bertrand OJN. Analysing Head-Thorax Choreography During Free-Flights in Bumblebees. Front Behav Neurosci 2021; 14:610029. PMID: 33510626. PMCID: PMC7835495. DOI: 10.3389/fnbeh.2020.610029.
Abstract
Animals coordinate their various body parts, sometimes in elaborate ways, to swim, walk, climb, fly, and navigate their environment. The coordination of body parts is essential to behaviors such as chasing, escaping, landing, and the extraction of relevant information. For example, by shaping the movement of the head and body in an active and controlled manner, flying insects structure their flights to facilitate the acquisition of distance information: they condense their turns into short periods of time (saccades) interspaced by relatively long translations (intersaccades). However, due to technological limitations, the precise coordination of the head and thorax during insects' free flight remains unclear. Here, we propose methods to analyse the orientation of the head and thorax of bumblebees, Bombus terrestris, to segregate the trajectories of flying insects into saccades and intersaccades using supervised machine learning (ML), and to analyse the coordination between head and thorax using artificial neural networks (ANN). Segregating flights into saccades and intersaccades by ML, based on thorax angular velocities, decreased misclassification by 12% compared with classically used methods. Our results demonstrate how machine learning techniques can improve the analysis of insect flight structure and shed light on the complexity of head-body coordination. We anticipate our assay to be a starting point for more sophisticated experiments and analyses on freely flying insects. For example, the coordination of head and body movements during collision avoidance, chasing behavior, or the negotiation of gaps could be investigated by monitoring the head and thorax orientation of freely flying insects within and across behavioral tasks, and in different species.
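For context, the "classically used methods" against which the abstract compares its ML classifier are typically simple angular-velocity thresholds. The sketch below shows such a baseline segmentation; the threshold value, sampling interval, and function name are illustrative assumptions, not values from the paper, whose actual contribution is the supervised classifier that improves on this baseline.

```python
import numpy as np

def segment_saccades(yaw_deg, dt, threshold_deg_s=300.0):
    """Label each sample of a yaw time series as saccade (True) or
    intersaccade (False) by thresholding absolute angular velocity.

    yaw_deg: 1-D array of thorax (or head) yaw angles in degrees.
    dt: sampling interval in seconds.
    threshold_deg_s: angular-velocity threshold (illustrative value).
    """
    yaw = np.unwrap(np.deg2rad(yaw_deg))       # avoid 360-degree wrap jumps
    omega = np.rad2deg(np.gradient(yaw, dt))   # angular velocity, deg/s
    return np.abs(omega) > threshold_deg_s
```

A flat yaw trace is labelled entirely intersaccadic, while a rapid turn embedded in it is labelled as a saccade.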
Affiliation(s)
- Stefan Meyer
- Department of Informatics, University of Sussex, Brighton, United Kingdom
8
Doussot C, Bertrand OJN, Egelhaaf M. Visually guided homing of bumblebees in ambiguous situations: A behavioural and modelling study. PLoS Comput Biol 2020; 16:e1008272. PMID: 33048938. PMCID: PMC7553325. DOI: 10.1371/journal.pcbi.1008272.
Abstract
Returning home is a crucial task accomplished daily by many animals, including humans. Because of their tiny brains, insects such as bees and ants are good models for studying efficient navigation strategies. Bees and ants are known to rely mainly on learned visual information about the nest surroundings to pinpoint their barely visible nest entrance. During the return, when the insect's current view matches the learned information, the insect is easily guided home. Occasionally, however, the visual environment may change while the insect is on a foraging trip. Here, we addressed the ecologically relevant question of how bumblebees' homing is affected by such a situation. In an artificial setting, we habituated bees to be guided to their nest by two constellations of visual cues. After habituation, these cues were displaced during foraging trips to create a conflict situation. We recorded the bumblebees' return flights in these circumstances and investigated where they searched for their nest entrance as a function of the degree of displacement between the two cue constellations. Bumblebees mostly searched at the fictive nest location indicated by one cue constellation or the other, but never at a compromise location between them. We compared these experimental results with the predictions of different types of homing models. We found that models guiding an agent by a single holistic view of the nest surroundings could not account for the bumblebees' search behaviour in cue-conflict situations; homing models relying on multiple views, by contrast, were sufficient. We could further show that homing models required fewer views and were more robust to height changes if optic flow-based spatial information, rather than just brightness information, was encoded and learned.

Returning home sounds trivial, but returning to a concealed underground location such as a burrow is less easy. For buff-tailed bumblebees, this task is routine. After collecting pollen in gardens or flowering meadows, the bees must return to their underground nest to feed the queen's larvae. The nest entrance is almost invisible to a returning bee, which therefore guides its flight by information about the surrounding visual environment. Since the seminal work of Tinbergen, many experiments have examined how visual information guides foraging insects back home. In these experiments, returning foragers were confronted with a coherent displacement of the entire nest surroundings, leading the bees to a single new location. In nature, however, the objects constituting the visual environment may be displaced in a disorderly manner, as they respond differently to factors such as wind. In our study, we moved objects so as to create two fictitious nest entrances. The bees searched at the fictitious nest entrances, but never in between, and the distance between the fictitious nests affected the bees' search. Finally, we could predict the search location using bio-inspired homing models that are potentially interesting for implementation in autonomous robots.
Affiliation(s)
- Charlotte Doussot
- Neurobiology, Faculty of Biology, Universität Bielefeld, Germany
- Martin Egelhaaf
- Neurobiology, Faculty of Biology, Universität Bielefeld, Germany
9
Abstract
Apparent motion of the surroundings on an agent's retina can be used to navigate through cluttered environments, avoid collisions with obstacles, or track targets of interest. The pattern of apparent motion of objects (i.e., the optic flow) contains spatial information about the surrounding environment. For a small, fast-moving agent, as used in search and rescue missions, it is crucial to quickly estimate the distance to nearby objects in order to avoid collisions. This estimation cannot be done by conventional methods, such as frame-based optic flow estimation, given the size, power, and latency constraints of the necessary hardware. A practical alternative makes use of event-based vision sensors. Contrary to the frame-based approach, they produce so-called events only when there are changes in the visual scene. We propose a novel asynchronous circuit, the spiking elementary motion detector (sEMD), composed of a single silicon neuron and synapse, to detect elementary motion from an event-based vision sensor. The sEMD encodes the time an object's image needs to travel across the retina into a burst of spikes. The number of spikes within the burst is proportional to the speed of events across the retina. A fast but imprecise estimate of the time-to-travel can already be obtained from the first two spikes of a burst and refined by subsequent interspike intervals. The latter encoding scheme is possible due to an adaptive nonlinear synaptic efficacy scaling. We show that the sEMD can be used to compute a collision avoidance direction in the context of robotic navigation in a cluttered outdoor environment, and we compare this direction with the output of a frame-based algorithm. The proposed computational principle constitutes a generic spiking temporal correlation detector that can be applied to other sensory modalities (e.g., sound localization), and it provides a novel perspective on gating information in spiking neural networks.
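The time-to-travel encoding described above can be illustrated with a minimal software sketch. This is not the silicon circuit: the facilitation window, the parameter values, and the linear spike-count mapping are illustrative assumptions that capture only the qualitative principle (a shorter time-to-travel, i.e. faster motion, yields more spikes in the burst).

```python
def semd_response(t_first, t_second, tau=0.01, max_spikes=10):
    """Software sketch of the sEMD encoding principle.

    An event at the first pixel opens a 'facilitation' window of length
    tau (seconds); an event at the neighbouring pixel within that window
    triggers a burst whose spike count grows as the time-to-travel
    shrinks. tau and max_spikes are illustrative, not circuit values.
    Returns the number of spikes in the burst (0 if no correlation).
    """
    dt = t_second - t_first
    if dt <= 0 or dt > tau:
        return 0   # wrong event order, or outside the facilitation window
    # faster motion (smaller dt) -> more spikes in the burst
    return max(1, round(max_spikes * (1.0 - dt / tau)))
```

With these assumptions, fast motion across adjacent pixels produces a larger burst than slow motion, and uncorrelated or reversed event pairs produce no burst at all.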
Affiliation(s)
- M B Milde
- Institute of Neuroinformatics, University of Zurich and ETH Zurich, 8057 Zurich, Switzerland
- O J N Bertrand
- Neurobiology, Faculty of Biology, Bielefeld University, 33615 Bielefeld, and Cognitive Interaction Technology Center of Excellence, Bielefeld University, 33501 Bielefeld, Germany
- H Ramachandran
- Faculty of Technology, Bielefeld University, 33615 Bielefeld, and Cognitive Interaction Technology Center of Excellence, Bielefeld University, 33501 Bielefeld, Germany
- M Egelhaaf
- Neurobiology, Faculty of Biology, Bielefeld University, 33615 Bielefeld, and Cognitive Interaction Technology Center of Excellence, Bielefeld University, 33501 Bielefeld, Germany
- E Chicca
- Faculty of Technology, Bielefeld University, 33615 Bielefeld, and Cognitive Interaction Technology Center of Excellence, Bielefeld University, 33501 Bielefeld, Germany
10
Abstract
Navigation in cluttered environments is an important challenge for animals and robots alike and has been the subject of many studies trying to explain and mimic animal navigational abilities. However, the question of selecting an appropriate home location has so far received little attention. This is surprising, since the choice of a home location might greatly influence an animal's navigation performance. To address the question of home choice in cluttered environments, we performed a systematic analysis of homing trajectories in computer simulations using a skyline-based local homing method. Our analysis reveals that homing performance strongly depends on the location of the home in the environment. Furthermore, it appears that by assessing homing success in the immediate vicinity of the home, an animal might be able to predict its overall success in returning to it from within a much larger area.
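A skyline-based local homing method of the kind used in these simulations compares the currently visible skyline panorama with a panorama stored at the home location. The helper below is a hedged sketch of one ingredient, a rotational image difference over skyline elevations; the function name and the RMSE cost are our assumptions, not the paper's exact pipeline.

```python
import numpy as np

def skyline_mismatch(current, memory):
    """Rotational image difference between two skyline-height panoramas.

    current, memory: 1-D arrays of skyline elevations sampled at equally
    spaced azimuths. Returns (best_shift, min_rmse): the azimuthal
    rotation (in samples) that best aligns the current view with the
    stored home view, and the residual mismatch at that rotation.
    """
    errors = [np.sqrt(np.mean((np.roll(current, s) - memory) ** 2))
              for s in range(len(current))]
    best = int(np.argmin(errors))
    return best, errors[best]
```

At the home location the residual mismatch is zero once the best rotation is applied; away from home, the residual mismatch grows and can serve as a homing signal.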
Affiliation(s)
- Martin M. Müller
- Department of Neurobiology, Faculty of Biology, and Cluster of Excellence ‘Cognitive Interaction Technology’ (CITEC), Bielefeld University, Bielefeld, Germany
- Olivier J. N. Bertrand
- Department of Neurobiology, Faculty of Biology, and Cluster of Excellence ‘Cognitive Interaction Technology’ (CITEC), Bielefeld University, Bielefeld, Germany
- Dario Differt
- Computer Engineering Group, Faculty of Technology, Bielefeld University, Bielefeld, Germany
- Martin Egelhaaf
- Department of Neurobiology, Faculty of Biology, and Cluster of Excellence ‘Cognitive Interaction Technology’ (CITEC), Bielefeld University, Bielefeld, Germany
11
Bertrand OJN, Lindemann JP, Egelhaaf M. A Bio-inspired Collision Avoidance Model Based on Spatial Information Derived from Motion Detectors Leads to Common Routes. PLoS Comput Biol 2015; 11:e1004339. PMID: 26583771. PMCID: PMC4652890. DOI: 10.1371/journal.pcbi.1004339.
Abstract
Avoiding collisions is one of the most basic needs of any mobile agent, biological or technical, whether searching around or aiming toward a goal. We propose a model of collision avoidance inspired by behavioural experiments on insects and by the properties of optic flow experienced on a spherical eye during translation, and we test the interaction of this model with goal-driven behaviour. Insects, such as flies and bees, actively separate the rotational and translational optic flow components through their behaviour, i.e. by employing a saccadic strategy of flight and gaze control. Optic flow experienced during translation, i.e. during intersaccadic phases, contains information on the depth structure of the environment, but this information is entangled with that on self-motion. Here, we propose a simple model to extract the depth structure from translational optic flow by using local properties of a spherical eye. On this basis, a motion direction of the agent is computed that ensures collision avoidance. Flying insects are thought to measure optic flow with correlation-type elementary motion detectors, whose responses depend not only on velocity but also on the texture and contrast of objects and thus do not measure object velocity veridically. Therefore, we initially used geometrically determined optic flow as input to a collision avoidance algorithm to show that depth information inferred from optic flow is sufficient to account for collision avoidance under closed-loop conditions. The collision avoidance algorithm was then tested with bio-inspired correlation-type elementary motion detectors as its input. Even then, the algorithm successfully led to collision avoidance and, in addition, replicated the characteristics of the collision avoidance behaviour of insects. Finally, the collision avoidance algorithm was combined with a goal direction and tested in cluttered environments. The simulated agent then showed goal-directed behaviour reminiscent of components of insect navigation behaviour.
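The depth-from-translational-optic-flow step described above rests on a simple geometric relation: for a spherical eye translating at speed v, the flow magnitude in a viewing direction at angle θ from the motion axis is v·sin(θ)/d, where d is the distance to the object in that direction. The sketch below simply inverts this relation; it assumes purely translational, noise-free flow and is an illustration of the geometry, not the paper's implementation.

```python
import numpy as np

def relative_nearness(flow_mag, azimuth, speed):
    """Estimate relative nearness (1/distance) from translational optic flow.

    For a spherical eye translating at `speed` along azimuth 0, the flow
    magnitude in viewing direction `azimuth` (radians from the motion
    axis) is speed * sin(azimuth) / distance, so nearness follows by
    inversion. Directions near the motion axis carry little information
    (sin -> 0) and are masked out as unreliable.
    """
    s = np.sin(np.asarray(azimuth, dtype=float))
    with np.errstate(divide="ignore", invalid="ignore"):
        nearness = np.asarray(flow_mag, dtype=float) / (speed * s)
    nearness[np.abs(s) < 1e-3] = np.nan   # unreliable near the focus of expansion
    return nearness
```

For example, an object 2 m away viewed perpendicular to a 1 m/s translation produces a flow magnitude of 0.5 rad/s, which the inversion maps back to a nearness of 0.5 m⁻¹, while the direction straight ahead is masked.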
Affiliation(s)
- Martin Egelhaaf
- Neurobiologie & CITEC, Bielefeld University, Bielefeld, Germany