1
Interactions between rodent visual and spatial systems during navigation. Nat Rev Neurosci 2023. PMID: 37380885. DOI: 10.1038/s41583-023-00716-7.
Abstract
Many behaviours that are critical for animals to survive and thrive rely on spatial navigation. Spatial navigation, in turn, relies on internal representations about one's spatial location, one's orientation or heading direction and the distance to objects in the environment. Although the importance of vision in guiding such internal representations has long been recognized, emerging evidence suggests that spatial signals can also modulate neural responses in the central visual pathway. Here, we review the bidirectional influences between visual and navigational signals in the rodent brain. Specifically, we discuss reciprocal interactions between vision and the internal representations of spatial position, explore the effects of vision on representations of an animal's heading direction and vice versa, and examine how the visual and navigational systems work together to assess the relative distances of objects and other features. Throughout, we consider how technological advances and novel ethological paradigms that probe rodent visuo-spatial behaviours allow us to advance our understanding of how brain areas of the central visual pathway and the spatial systems interact and enable complex behaviours.
2
Redefining sensorimotor mismatch selectivity in the visual cortex. Cell Rep 2023; 42:112098. PMID: 36821444. PMCID: PMC10632662. DOI: 10.1016/j.celrep.2023.112098.
Abstract
This Matters Arising Response contains our commentary on the response written by Vasilevskaya et al. (2023), published concurrently in Cell Reports, to our recent article "Feature selectivity can explain mismatch signals in mouse visual cortex." We find that the results in their response reinforce many of our findings, and, supported further by their new results, we argue that sensorimotor mismatch selectivity in the mouse visual system needs to be redefined.
3
Walking humans and running mice: perception and neural encoding of optic flow during self-motion. Philos Trans R Soc Lond B Biol Sci 2023; 378:20210450. PMID: 36511417. PMCID: PMC9745880. DOI: 10.1098/rstb.2021.0450.
Abstract
Locomotion produces full-field optic flow that often dominates the visual motion inputs to an observer. The perception of optic flow is in turn important for animals to guide their heading and interact with moving objects. Understanding how locomotion influences optic flow processing and perception is therefore essential to understand how animals successfully interact with their environment. Here, we review research investigating how perception and neural encoding of optic flow are altered during self-motion, focusing on locomotion. Self-motion has been found to influence estimation and sensitivity for optic flow speed and direction. Nonvisual self-motion signals also increase compensation for self-driven optic flow when parsing the visual motion of moving objects. The integration of visual and nonvisual self-motion signals largely follows principles of Bayesian inference and can improve the precision and accuracy of self-motion perception. The calibration of visual and nonvisual self-motion signals is dynamic, reflecting the changing visuomotor contingencies across different environmental contexts. Throughout this review, we consider experimental research using humans, non-human primates and mice. We highlight experimental challenges and opportunities afforded by each of these species and draw parallels between experimental findings. These findings reveal a profound influence of locomotion on optic flow processing and perception across species. This article is part of a discussion meeting issue 'New approaches to 3D vision'.
4
Altered low-frequency brain rhythms precede changes in gamma power during tauopathy. iScience 2022; 25:105232. PMID: 36274955. PMCID: PMC9579020. DOI: 10.1016/j.isci.2022.105232.
Abstract
Neurodegenerative disorders are associated with widespread disruption to brain activity and brain rhythms. Some disorders are linked to dysfunction of the microtubule-associated protein tau. Here, we ask how brain rhythms are affected in the rTg4510 mouse model of tauopathy, at an early stage of tauopathy (5 months) and at a more advanced stage (8 months). We measured brain rhythms in primary visual cortex in the presence or absence of visual stimulation, while monitoring pupil diameter and locomotion to establish behavioral state. At 5 months, we found increased low-frequency rhythms during the resting state in tauopathic animals, associated with periods of abnormally increased neural synchronization. At 8 months, this increase in low-frequency rhythms was accompanied by a reduction of power in the gamma range. Our results therefore show that slower rhythms are impaired earlier than gamma rhythms in this model of tauopathy, and suggest that electrophysiological measurements can track the progression of tauopathic neurodegeneration.
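The low-frequency versus gamma comparison described in this abstract reduces to estimating band-limited power from an LFP trace. The following is an illustrative sketch with standard spectral tools, not the authors' analysis code; the synthetic signal, sampling rate, and band edges are invented for the example:

```python
import numpy as np
from scipy.signal import welch

def band_power(lfp, fs, band):
    """Integrate the Welch power spectral density over a frequency band (Hz)."""
    freqs, psd = welch(lfp, fs=fs, nperseg=2 * fs)
    lo, hi = band
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[mask].sum() * (freqs[1] - freqs[0])

# Synthetic 10 s "LFP": a strong 4 Hz rhythm plus weaker gamma (60 Hz) and noise
fs = 1000
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)
lfp = (2.0 * np.sin(2 * np.pi * 4 * t)
       + 0.5 * np.sin(2 * np.pi * 60 * t)
       + rng.normal(0, 0.5, t.size))

low = band_power(lfp, fs, (1, 10))     # low-frequency range
gamma = band_power(lfp, fs, (30, 80))  # gamma range
print(low > gamma)  # True: the slow rhythm dominates this synthetic trace
```

On real recordings these measures would be computed per behavioral state (rest versus locomotion) and compared between genotypes and ages.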
5
Abstract
The superior colliculus (SC) is a highly conserved area of the mammalian midbrain that is widely implicated in the organisation and control of behaviour. SC receives input from, and provides output to, a large number of brain areas. This convergence and divergence of anatomical connections with different areas and systems makes it challenging to understand how SC contributes to behaviour. Recent work in mouse has provided large anatomical datasets, and a wealth of new data from experiments that identify and manipulate different cells within SC, and their inputs and outputs, during simple behaviours. These data offer an opportunity to better understand the roles that SC plays in these behaviours. However, some of the observations appear, at first sight, to be contradictory. Here we review this recent work and propose a simple framework, requiring only a small change to previous models, that can capture these observations. Specifically, the functional organisation of SC can be explained by supposing that three largely distinct circuits support three largely distinct classes of simple behaviour: arrest, turning towards, and the triggering of escape or capture. These behaviours are hypothesised to be supported by the optic, intermediate and deep layers, respectively.
6
Abstract
Alzheimer's disease and other dementias are thought to involve a progressive impairment of neural plasticity. Previous work in mouse models of Alzheimer's disease shows pronounced changes in artificially induced plasticity in hippocampus, perirhinal and prefrontal cortex. However, it is not known how degeneration disrupts intrinsic forms of brain plasticity. Here we characterised the impact of tauopathy on a simple form of intrinsic plasticity in the visual system, which allowed us to track plasticity at both long (days) and short (minutes) timescales. We studied rTg4510 transgenic mice at an early stage of tauopathy (5 months) and a more advanced stage (8 months). We recorded local field potentials in the primary visual cortex while animals were repeatedly exposed to a stimulus over 9 days. We found that both short- and long-term visual plasticity were already disrupted at the early stage of tauopathy in mice expressing mutant tau, and were further reduced, to the point of being abolished, in older animals. Additionally, visually evoked behaviours were disrupted in both younger and older mice expressing mutant tau. Our results show that visual cortical plasticity and visually evoked behaviours are disrupted in the rTg4510 model of tauopathy. This simple measure of plasticity may help us understand how tauopathy disrupts neural circuits, and offers a translatable platform for detection and tracking of the disease.
7
Feature selectivity can explain mismatch signals in mouse visual cortex. Cell Rep 2021; 37:109772. PMID: 34610298. PMCID: PMC8655498. DOI: 10.1016/j.celrep.2021.109772.
Abstract
Sensory experience often depends on one's own actions, including self-motion. Theories of predictive coding postulate that actions are regulated by calculating prediction error, the difference between sensory experience and the expectation based on self-generated actions. Signals consistent with prediction error have been reported in the mouse visual cortex (V1) when visual flow coupled to running was unexpectedly stopped. Here, we show that such signals can also be elicited by visual stimuli that are uncoupled from the animal's running. We record V1 neurons while presenting drifting gratings that unexpectedly stop. We find strong responses to visual perturbations, which are enhanced during running. Perturbation responses are strongest in the preferred orientation of individual neurons, and perturbation-responsive neurons are more likely to prefer slow visual speeds. Our results indicate that prediction error signals can be explained by the convergence of known motor and sensory signals, providing a purely sensory and motor explanation for purported mismatch signals.
8
Spatial modulation of dark versus bright stimulus responses in the mouse visual system. Curr Biol 2021; 31:4172-4179.e6. PMID: 34314675. PMCID: PMC8478832. DOI: 10.1016/j.cub.2021.06.094.
Abstract
A fundamental task of the visual system is to respond to both increases and decreases of luminance with action potentials (ON and OFF responses [1-4]). OFF responses are stronger, faster, and more salient than ON responses in primary visual cortex (V1) of both cats [5, 6] and primates [7, 8], but in ferrets [9] and mice [10], ON responses can be stronger, weaker [11], or balanced [12] in comparison to OFF responses. These discrepancies could arise from differences in species, experimental techniques, or stimulus properties, particularly retinotopic location in the visual field, as has been speculated [9]; however, the role of retinotopy for ON/OFF dominance has not been systematically tested across multiple scales of neural activity within species. Here, we measured OFF versus ON responses across large portions of visual space with silicon probe and whole-cell patch-clamp recordings in mouse V1 and lateral geniculate nucleus (LGN). We found that OFF responses dominated in the central visual field, whereas ON and OFF responses were more balanced in the periphery. These findings were consistent across local field potential (LFP), spikes, and subthreshold membrane potential in V1, and were aligned with spatial biases in ON and OFF responses in LGN. Our findings reveal that retinotopy may provide a common organizing principle for spatial modulation of OFF versus ON processing in mammalian visual systems.
9
Organization of feedback projections to mouse primary visual cortex. iScience 2021; 24:102450. PMID: 34113813. PMCID: PMC8169797. DOI: 10.1016/j.isci.2021.102450.
Abstract
Top-down, context-dependent modulation of visual processing has been a topic of wide interest, including in mouse primary visual cortex (V1). However, the organization of feedback projections to V1 is relatively unknown. Here, we investigated inputs to mouse V1 by injecting retrograde tracers. We developed a software pipeline that maps labeled cell bodies to corresponding brain areas in the Allen Reference Atlas. We identified more than 24 brain areas that provide inputs to V1 and quantified the relative strength of their projections. We also assessed the organization of the projections, based on either the organization of cell bodies in the source area (topography) or the distribution of projections across V1 (bias). Projections from most higher visual and some nonvisual areas to V1 showed both topography and bias. Such organization of feedback projections to V1 suggests that parts of the visual field are differentially modulated by context, which can be ethologically relevant for a navigating animal.
10
Abstract
Real-time rendering of closed-loop visual environments is important for next-generation understanding of brain function and behaviour, but is often prohibitively difficult for non-experts to implement and is limited to a few laboratories worldwide. We developed BonVision, easy-to-use open-source software for the display of virtual or augmented reality, as well as standard visual stimuli. BonVision has been tested on humans and mice, and is capable of supporting new experimental designs in other animal models of vision. Because the architecture is based on the open-source Bonsai graphical programming language, BonVision benefits from native integration with experimental hardware. BonVision therefore enables easy implementation of closed-loop experiments, including real-time interaction with deep neural networks, and communication with behavioural and physiological measurement and manipulation devices.
11
Spatial modulation of visual responses arises in cortex with active navigation. eLife 2021; 10:e63705. PMID: 33538692. PMCID: PMC7861612. DOI: 10.7554/elife.63705.
Abstract
During navigation, the visual responses of neurons in mouse primary visual cortex (V1) are modulated by the animal's spatial position. Here we show that this spatial modulation is similarly present across multiple higher visual areas but negligible in the main thalamic pathway into V1. Similar to hippocampus, spatial modulation in visual cortex strengthens with experience and with active behavior. Active navigation in a familiar environment, therefore, enhances the spatial modulation of visual signals starting in the cortex.
12
Mouse Visual Cortex Is Modulated by Distance Traveled and by Theta Oscillations. Curr Biol 2020; 30:3811-3817.e6. PMID: 32763173. PMCID: PMC7544510. DOI: 10.1016/j.cub.2020.07.006.
Abstract
The visual responses of neurons in the primary visual cortex (V1) are influenced by the animal's position in the environment [1-5]. V1 responses encode positions that co-fluctuate with those encoded by place cells in hippocampal area CA1 [2, 5]. This correlation might reflect a common influence of non-visual spatial signals on both areas. Place cells in CA1, indeed, do not rely only on vision; their place preference depends on the physical distance traveled [6-11] and on the phase of the 6-9 Hz theta oscillation [12, 13]. Are V1 responses similarly influenced by these non-visual factors? We recorded V1 and CA1 neurons simultaneously while mice performed a spatial task in a virtual corridor by running on a wheel and licking at a reward location. By changing the gain that couples the wheel movement to the virtual environment, we found that ∼20% of V1 neurons were influenced by the physical distance traveled, as were ∼40% of CA1 place cells. Moreover, the firing rate of ∼24% of V1 neurons was modulated by the phase of theta oscillations recorded in CA1 and the response profiles of ∼7% of V1 neurons shifted spatially across the theta cycle, analogous to the phase precession observed in ∼37% of CA1 place cells. The influence of theta oscillations on V1 responses was more prominent in putative layer 6. These results reveal that, in a familiar environment, sensory processing in V1 is modulated by the key non-visual signals that influence spatial coding in the hippocampus.
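The gain manipulation at the heart of this experiment can be summarised in a few lines: virtual position is the integral of wheel movement scaled by a gain, so changing the gain dissociates physical distance travelled from position in the virtual corridor. A schematic sketch with invented numbers, not the study's actual task code:

```python
def virtual_positions(wheel_steps, gain):
    """Integrate wheel movement into virtual-corridor position under a given gain."""
    pos = 0.0
    out = []
    for step in wheel_steps:
        pos += gain * step
        out.append(pos)
    return out

# Same physical running under two gains: the animal reaches a given virtual
# position after different physical distances, dissociating the two variables.
steps = [1.0] * 100                       # 100 cm of physical running
baseline = virtual_positions(steps, 1.0)  # virtual position == physical distance
low_gain = virtual_positions(steps, 0.5)  # twice the running per virtual cm
print(baseline[-1], low_gain[-1])  # 100.0 50.0
```

A neuron whose firing tracks physical distance keeps its field at the same wheel displacement across gains; a neuron tracking the visual scene keeps its field at the same virtual position.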
13
Two stream hypothesis of visual processing for navigation in mouse. Curr Opin Neurobiol 2020; 64:70-78. PMID: 32294570. DOI: 10.1016/j.conb.2020.03.009.
Abstract
Vision has traditionally been studied in stationary subjects observing stimuli, and rarely during navigation. Recent research using virtual reality environments for mice has revealed that responses even in the primary visual cortex are modulated by spatial context: identical scenes presented in different positions of a room can elicit different responses. Here, we review these results and discuss how information from visual areas can reach navigational areas of the brain. Based on the observation that mouse higher visual areas cover different parts of the visual field, we propose that spatial signals are processed along two streams defined by visual field coverage. Specifically, this hypothesis suggests that landmark-related signals are processed by areas biased to the central field, and self-motion-related signals are processed by areas biased to the peripheral field.
14
Coherent encoding of subjective spatial position in visual cortex and hippocampus. Nature 2018; 562:124-127. PMID: 30202092. PMCID: PMC6309439. DOI: 10.1038/s41586-018-0516-1.
Abstract
A major role of vision is to guide navigation, and navigation is strongly driven by vision [1-4]. Indeed, the brain's visual and navigational systems are known to interact [5, 6], and signals related to position in the environment have been suggested to appear as early as in the visual cortex [6, 7]. Here, to establish the nature of these signals, we recorded in the primary visual cortex (V1) and hippocampal area CA1 while mice traversed a corridor in virtual reality. The corridor contained identical visual landmarks in two positions, so that a purely visual neuron would respond similarly at those positions. Most V1 neurons, however, responded solely or more strongly to the landmarks in one position rather than the other. This modulation of visual responses by spatial location was not explained by factors such as running speed. To assess whether the modulation is related to navigational signals and to the animal's subjective estimate of position, we trained the mice to lick for a water reward upon reaching a reward zone in the corridor. Neuronal populations in both CA1 and V1 encoded the animal's position along the corridor, and the errors in their representations were correlated. Moreover, both representations reflected the animal's subjective estimate of position, inferred from the animal's licks, better than its actual position. When animals licked in a given location, whether correctly or incorrectly, neural populations in both V1 and CA1 placed the animal in the reward zone. We conclude that visual responses in V1 are controlled by navigational signals, which are coherent with those encoded in hippocampus and reflect the animal's subjective position. The presence of such navigational signals as early as a primary sensory area suggests that they permeate sensory processing in the cortex.
15
Hippocampal place cells construct reward related sequences through unexplored space. eLife 2015; 4:e06063. PMID: 26112828. PMCID: PMC4479790. DOI: 10.7554/elife.06063.
Abstract
Dominant theories of hippocampal function propose that place cell representations are formed during an animal's first encounter with a novel environment and are subsequently replayed during off-line states to support consolidation and future behaviour. Here we report that viewing the delivery of food to an unvisited portion of an environment leads to off-line pre-activation of place cell sequences corresponding to that space. Such 'preplay' was not observed for an unrewarded but otherwise similar portion of the environment. These results suggest that a hippocampal representation of a visible, yet unexplored environment can be formed if the environment is of motivational relevance to the animal. We hypothesise such goal-biased preplay may support preparation for future experiences in novel environments.
As an animal explores an area, part of the brain called the hippocampus creates a mental map of the space. When the animal is in one location, a few neurons called 'place cells' will fire. If the animal moves to a new spot, other place cells fire instead. Each time the animal returns to that spot, the same place cells will fire. Thus, as the animal moves, a place-specific pattern of firing emerges that scientists can view by recording the cells' activity and which can be used to reconstruct the animal's position. After exploring a space, the hippocampus may replay the new place-specific pattern of activity during sleep. By doing so, the brain consolidates the memory of the space for return visits. Recent evidence now suggests that these mental rehearsals, or internal simulations of the space, may begin even before a new space has been explored. Now, Ólafsdóttir, Barry et al. report that whether an animal's brain simulates a first visit to a new space depends on whether the animal anticipates a reward. In the experiments, rats were allowed to run up to the junction in a T-shaped track. The animals could see into each of the arms, but not enter them. Food was then placed in one of the inaccessible arms. Ólafsdóttir, Barry et al. recorded the firing of place cells in the brain of the animals when they were on the track and during a rest period afterwards. The rats were then allowed onto the inaccessible arms, and again their brain activity was recorded. In the rest period after the rats first viewed the inaccessible arms, the place cell pattern that would later form the mental map of a journey to and from the food-containing arm was pre-activated. However, the place cell pattern that would become the mental map of the other inaccessible arm was not activated before the rat explored that area. Therefore, Ólafsdóttir, Barry et al. suggest that the perception of reward influences which place cell pattern is simulated during rest. An implication of these findings is that the brain preferentially simulates past or future experiences that are deemed to be functionally significant, such as those associated with reward. A future challenge will be to determine whether this goal-related simulation of unvisited spaces predicts and is needed for behaviour such as successful navigation to a goal.
16
Integration of visual motion and locomotion in mouse visual cortex. Nat Neurosci 2013; 16:1864-9. PMID: 24185423. PMCID: PMC3926520. DOI: 10.1038/nn.3567.
Abstract
Successful navigation through the world requires accurate estimation of one's own speed. To derive this estimate, animals integrate visual speed gauged from optic flow and run speed gauged from proprioceptive and locomotor systems. The primary visual cortex (V1) carries signals related to visual speed, and its responses are also affected by run speed. To study how V1 combines these signals during navigation, we recorded from mice that traversed a virtual environment. Nearly half of the V1 neurons were reliably driven by combinations of visual speed and run speed. These neurons performed a weighted sum of the two speeds. The weights were diverse across neurons, and typically positive. As a population, V1 neurons predicted a linear combination of visual and run speeds better than either visual or run speeds alone. These data indicate that V1 in the mouse participates in a multimodal processing system that integrates visual motion and locomotion during navigation.
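The weighted-sum model described above is, for a single neuron, a linear regression of firing rate on visual speed and run speed. A minimal sketch with a simulated neuron; the weights (0.6 and 0.3), baseline, and noise level are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
visual = rng.uniform(0, 30, n)   # visual speed (cm/s)
run = rng.uniform(0, 30, n)      # run speed (cm/s)

# Hypothetical neuron: rate = 0.6*visual + 0.3*run + baseline + noise
rate = 0.6 * visual + 0.3 * run + 2.0 + rng.normal(0, 0.5, n)

# Recover the weights with ordinary least squares
X = np.column_stack([visual, run, np.ones(n)])
w, *_ = np.linalg.lstsq(X, rate, rcond=None)
print(np.round(w, 1))  # approximately [0.6, 0.3, 2.0]
```

Fitting each recorded neuron this way yields a distribution of weights; the abstract's finding corresponds to weights that are diverse across neurons but typically positive.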
17
Locomotion controls spatial integration in mouse visual cortex. Curr Biol 2013; 23:890-4. PMID: 23664971. PMCID: PMC3661981. DOI: 10.1016/j.cub.2013.04.012.
Abstract
Growing evidence indicates that responses in sensory cortex are modulated by factors beyond direct sensory stimulation [1–8]. In primary visual cortex (V1), for instance, responses increase with locomotion [9, 10]. Here we show that this increase is accompanied by a profound change in spatial integration. We recorded from V1 neurons in head-fixed mice placed on a spherical treadmill. We characterized spatial integration and found that the responses of most neurons were suppressed by large stimuli. As in primates [11, 12], this surround suppression increased with stimulus contrast. These effects were captured by a divisive normalization model [13, 14], where the numerator originates from a central region driving the neuron and the denominator originates from a larger suppressive field. We then studied the effects of locomotion and found that it markedly reduced surround suppression, allowing V1 neurons to integrate over larger regions of visual space. Locomotion had two main effects: it increased spontaneous activity, and it weakened the suppressive signals mediating normalization, relative to the driving signals. We conclude that a fundamental aspect of visual processing, spatial integration, is controlled by an apparently unrelated factor, locomotion. This control might operate through the mechanisms that are in place to deliver surround suppression.
Highlights:
- Spatial integration in neurons of mouse visual cortex depends on locomotion
- Locomotion increases the region of visual space that V1 neurons integrate
- Locomotion reduces surround suppression
- Locomotion could achieve this by adjusting the strength of divisive normalization
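The divisive normalization account sketched in this abstract can be written as a toy model in which a center drive is divided by a surround-derived suppressive signal; weakening the suppressive weight (as locomotion is proposed to do) shifts the preferred stimulus size upward. All functional forms and constants below are invented for illustration, not taken from the paper:

```python
import numpy as np

def response(contrast, size, sigma=1.0, w_surround=1.0):
    """Toy divisive-normalization response to a grating of given contrast and size.

    The center drive saturates quickly with size; the suppressive field grows
    over a larger spatial range, producing surround suppression for big stimuli.
    """
    drive = contrast * (1 - np.exp(-size / 5.0))                  # center drive
    suppress = w_surround * contrast * (1 - np.exp(-size / 20.0))  # surround
    return drive / (sigma + suppress)

sizes = np.linspace(1, 60, 200)
still = response(0.8, sizes, w_surround=1.0)    # strong normalization
running = response(0.8, sizes, w_surround=0.3)  # locomotion weakens suppression

# With weaker suppression the peak (preferred) size is larger and suppression
# of big stimuli is reduced
print(sizes[still.argmax()] < sizes[running.argmax()])  # True
```

The same comparison at several contrasts would reproduce the abstract's observation that suppression scales with contrast, since both numerator and denominator grow with it.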
18
Bimodal optomotor response to plaids in blowflies: mechanisms of component selectivity and evidence for pattern selectivity. J Neurosci 2012; 32:1634-42. PMID: 22302805. PMCID: PMC6703340. DOI: 10.1523/jneurosci.4940-11.2012.
Abstract
Many animals estimate their self-motion and the movement of external objects by exploiting panoramic patterns of visual motion. To probe how visual systems process compound motion patterns, superimposed visual gratings moving in different directions, plaid stimuli, have been successfully used in vertebrates. Surprisingly, nothing is known about how visually guided insects process plaids. Here, we explored in the blowfly how the well characterized yaw optomotor reflex and the activity of identified visual interneurons depend on plaid stimuli. We show that contrary to previous expectations, the yaw optomotor reflex shows a bimodal directional tuning for certain plaid stimuli. To understand the neural correlates of this behavior, we recorded the responses of a visual interneuron supporting the reflex, the H1 cell, which was also bimodally tuned to the plaid direction. Using a computational model, we identified the essential neural processing steps required to capture the observed response properties. These processing steps have functional parallels with mechanisms found in the primate visual system, despite different biophysical implementations. By characterizing other visual neurons supporting visually guided behaviors, we found responses that ranged from being bimodally tuned to the stimulus direction (component-selective), to responses that appear to be tuned to the direction of the global pattern (pattern-selective). Our results extend the current understanding of neural mechanisms of motion processing in insects, and indicate that the fly employs a wider range of behavioral responses to multiple motion cues than previously reported.
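A plaid of the kind used in these experiments is simply the superposition of two drifting sinusoidal gratings whose directions straddle the pattern direction. A minimal construction; spatial frequency, temporal frequency, contrast, and component angles are arbitrary example values, not the study's stimulus parameters:

```python
import numpy as np

def grating(x, y, t, direction_deg, sf=0.05, tf=2.0, contrast=0.5):
    """Drifting sinusoidal grating: direction in degrees, sf in cycles/pixel, tf in Hz."""
    theta = np.deg2rad(direction_deg)
    phase = 2 * np.pi * (sf * (x * np.cos(theta) + y * np.sin(theta)) - tf * t)
    return contrast * np.sin(phase)

# Pattern direction 0 degrees, components at +/-60 degrees: their sum is the plaid
y, x = np.mgrid[0:128, 0:128]
plaid = grating(x, y, 0.0, 60) + grating(x, y, 0.0, -60)
print(plaid.shape)  # (128, 128)
```

A component-selective response peaks when either grating alone moves in the preferred direction (bimodal tuning across plaid direction), whereas a pattern-selective response follows the direction of the combined pattern.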
19
Methods for predicting cortical UP and DOWN states from the phase of deep layer local field potentials. J Comput Neurosci 2010; 29:49-62. PMID: 20225075. DOI: 10.1007/s10827-010-0228-5.
Abstract
During anesthesia, slow-wave sleep and quiet wakefulness, neuronal membrane potentials collectively switch between de- and hyperpolarized levels, the cortical UP and DOWN states. Previous studies have shown that these cortical UP/DOWN states affect the excitability of individual neurons in response to sensory stimuli, indicating that a significant amount of the trial-to-trial variability in neuronal responses can be attributed to ongoing fluctuations in network activity. However, as intracellular recordings are frequently not available, it is important to be able to estimate their occurrence purely from extracellular data. Here, we combine in vivo whole cell recordings from single neurons with multi-site extracellular microelectrode recordings, to quantify the performance of various approaches to predicting UP/DOWN states from the deep-layer local field potential (LFP). We find that UP/DOWN states in deep cortical layers of rat primary auditory cortex (A1) are predictable from the phase of LFP at low frequencies (< 4 Hz), and that the likelihood of a given state varies sinusoidally with the phase of LFP at these frequencies. We introduce a novel method of detecting cortical state by combining information concerning the phase of the LFP and ongoing multi-unit activity.
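The phase-based prediction described here can be sketched as: low-pass filter the LFP below 4 Hz, take the instantaneous phase via the Hilbert transform, and relate state probability to that phase. An illustrative reconstruction on synthetic data, not the authors' code; the oscillation frequency, noise level, and phase window are invented:

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def lowfreq_phase(lfp, fs, cutoff=4.0):
    """Instantaneous phase of the < 4 Hz component of an LFP trace."""
    b, a = butter(2, cutoff / (fs / 2), btype="low")
    slow = filtfilt(b, a, lfp)
    return np.angle(hilbert(slow))

# Synthetic slow oscillation: by construction, "UP" states occupy one half of
# each cycle, so P(UP) varies sinusoidally with the extracted phase.
fs = 1000
t = np.arange(0, 20, 1 / fs)
rng = np.random.default_rng(2)
lfp = np.sin(2 * np.pi * 1.0 * t) + rng.normal(0, 0.3, t.size)

phase = lowfreq_phase(lfp, fs)
up = np.sin(2 * np.pi * 1.0 * t) > 0             # ground-truth state labels
p_up_near_zero = up[np.abs(phase) < 0.5].mean()  # phase near 0 -> mostly UP
print(p_up_near_zero > 0.9)
```

On real data one would estimate P(UP | phase) from intracellularly labelled states, as the paper does, rather than from a known generative oscillation.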
21
Receptive field characterization by spike-triggered independent component analysis. J Vis 2008; 8:2.1-16. PMID: 19146332. DOI: 10.1167/8.13.2.
Abstract
The spikes generated by a neuron in response to stimuli provide information about the nature of the stimuli and also about the functional organization of the circuit in which the neuron is embedded. Spike-triggered analysis techniques such as spike-triggered covariance (STC) have been proposed to characterize the receptive field properties of neurons. So far, they have been able to provide only limited information about the functional organization of neural circuitry; in particular, STC tends to generate subfields that are mixed observations of independent processes. We address this problem by adding a criterion that sources are independent, resulting in an approach we call spike-triggered independent component analysis (ST-ICA). The method exploits the central limit theorem to find the directions in the high-dimensional stimulus space of spike-triggered data that are most independent. We demonstrate the improvement of the ST-ICA method over STC analysis using simulated neurons. When tested on data obtained from the H1 neuron in the fly visual system, it predicts a spatial arrangement of functional subunits with adjacent receptive fields. The properties of these subunits strongly resemble the known properties of elementary movement detector inputs to the H1 neuron. Using the ST-ICA method, we derive a model that captures functional and physiological properties of fly motion vision.
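The setting the method operates in can be sketched with a simulated two-subunit neuron. The code below computes only the classical moments of the spike-triggered ensemble (STA and STC); ST-ICA would instead search that ensemble for maximally independent directions, which is what separates mixed subfields into subunits. The filters, firing rates, and the eigendecomposition readout are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
T, D = 20000, 20                 # time bins, stimulus dimensions
stim = rng.normal(size=(T, D))   # Gaussian white-noise stimulus frames

# Hypothetical neuron driven by two independent rectifying subunits f1, f2
f1 = np.zeros(D); f1[3:7] = 1.0
f2 = np.zeros(D); f2[10:14] = 1.0
drive = np.maximum(stim @ f1, 0) ** 2 + np.maximum(stim @ f2, 0) ** 2
spikes = rng.poisson(0.05 * drive)

# Spike-triggered average and spike-triggered covariance: the first two
# moments of the spike-triggered ensemble (prior covariance is the identity
# for white noise, so it is subtracted directly)
sta = spikes @ stim / spikes.sum()
centered = stim - sta
stc = (centered.T * spikes) @ centered / spikes.sum() - np.eye(D)

# The leading STC direction falls inside the subspace spanned by f1 and f2
evals, evecs = np.linalg.eigh(stc)
lead = evecs[:, np.argmax(np.abs(evals))]
basis = np.column_stack([f1 / 2, f2 / 2])  # orthonormal, since |f1| = |f2| = 2
print(np.linalg.norm(basis.T @ lead))      # close to 1: lead lies in the span
```

Note that the leading STC direction lies in the span of f1 and f2 but mixes them; recovering f1 and f2 individually is exactly the step where the independence criterion of ST-ICA improves on STC.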