Wallach A, Harvey-Girard E, Jun JJ, Longtin A, Maler L. A time-stamp mechanism may provide temporal information necessary for egocentric to allocentric spatial transformations.
eLife 2018;7:e36769. [PMID: 30465523; PMCID: PMC6264071; DOI: 10.7554/eLife.36769]
[Received: 03/18/2018; Accepted: 11/12/2018]
Abstract
Learning the spatial organization of the environment is essential for most animals’ survival. This requires the animal to derive allocentric spatial information from egocentric sensory and motor experience. The neural mechanisms underlying this transformation are mostly unknown. We addressed this problem in electric fish, which can precisely navigate in complete darkness and whose brain circuitry is relatively simple. We conducted the first neural recordings in the preglomerular complex, the thalamic region exclusively connecting the optic tectum with the spatial learning circuits in the dorsolateral pallium. While tectal topographic information was mostly eliminated in preglomerular neurons, the time-intervals between object encounters were precisely encoded. We show that this reliable temporal information, combined with a speed signal, can permit accurate estimation of the distance between encounters, a necessary component of path-integration that enables computing allocentric spatial relations. Our results suggest that similar mechanisms are involved in sequential spatial learning in all vertebrates.
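The core computational claim of the abstract is that the interval between two object encounters, combined with a speed signal, yields the distance travelled between them. A minimal sketch of that idea is given below, assuming a hypothetical speed trace and encounter time-stamps; it simply integrates speed over each inter-encounter interval and is an illustration of the distance-from-interval computation, not the authors' decoding model or analysis code.

```python
# Illustrative sketch (not the authors' model): estimating inter-encounter
# distances from encounter time-stamps and a swim-speed signal, the quantity
# the abstract identifies as a building block of path integration.
import numpy as np

def inter_encounter_distances(encounter_times, t, speed):
    """Integrate the speed signal between consecutive encounter time-stamps.

    encounter_times : 1-D array of times at which objects were encountered (s)
    t, speed        : time base (s) and swim-speed signal (cm/s), same length
    Returns one distance estimate (cm) per inter-encounter interval.
    """
    distances = []
    for t0, t1 in zip(encounter_times[:-1], encounter_times[1:]):
        mask = (t >= t0) & (t < t1)
        # distance ≈ ∫ speed dt over the interval between the two encounters
        distances.append(np.trapz(speed[mask], t[mask]))
    return np.array(distances)

# Toy usage: constant 10 cm/s swim speed, encounters 2 s apart -> ~20 cm each
t = np.linspace(0, 10, 1001)
speed = np.full_like(t, 10.0)
print(inter_encounter_distances(np.array([1.0, 3.0, 5.0]), t, speed))
```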
Finding their way around is an essential part of survival for many animals and helps them to locate food, mates and shelter. Animals have evolved the ability to form a 'map' or representation of their surroundings. For example, the electric fish Apteronotus leptorhynchus is able to precisely learn the location of food and navigate there. It can do this in complete darkness by generating a weak electric field. As it swims, every object it encounters generates an ‘electric image’ that is detected on the skin and processed in the brain.
However, all the cues the fish comes across are from its own point of view – the information about its environment is processed with respect to its location. And yet, the map that it generates needs to be independent of the fish’s position – it has to work regardless of where the animal is. The way animals translate ‘self-centered’ experiences to form a general representation of their surroundings is not yet fully understood.
Now, Wallach et al. studied how internal brain maps are generated in A. leptorhynchus. Information about the fish's environment passes through a structure in the brain called the preglomerular complex. Measuring the activity of this region revealed that the preglomerular complex does not process much self-centered information. Instead, whenever the fish passed any object – regardless of where it was in relation to the fish – the event triggered a brief burst of preglomerular activity. The intensity of the activity depended on how recently the fish had encountered another object. This information, combined with the dynamics of the fish's movement, could be what allows the fish to convert a sequence of encounters into a general spatial map.
These findings could help to inform research on learning and navigation. Further research could also reveal whether other species, including humans, generate their mental maps in a similar way. This may be relevant for people with diseases such as Alzheimer’s, in which the sense of orientation becomes impaired.