151
Buckley CL, Nowotny T. Transient dynamics between displaced fixed points: an alternate nonlinear dynamical framework for olfaction. BMC Neurosci 2011. [PMCID: PMC3240342] [DOI: 10.1186/1471-2202-12-s1-p237]
152
Katori Y, Sakamoto K, Saito N, Tanji J, Mushiake H, Aihara K. Representational switching by dynamical reorganization of attractor structure in a network model of the prefrontal cortex. PLoS Comput Biol 2011; 7:e1002266. [PMID: 22102803] [PMCID: PMC3213170] [DOI: 10.1371/journal.pcbi.1002266]
Abstract
The prefrontal cortex (PFC) plays a crucial role in flexible cognitive behavior by representing task relevant information with its working memory. The working memory with sustained neural activity is described as a neural dynamical system composed of multiple attractors, each of which corresponds to an active state of a cell assembly representing a fragment of information. Recent studies have revealed that the PFC not only represents multiple sets of information but also switches between multiple representations and transforms a set of information into another set depending on a given task context. This representational switching between different sets of information is possibly generated endogenously by flexible network dynamics, but the details of the underlying mechanisms are unclear. Here we propose a dynamically reorganizable attractor network model based on certain internal changes in synaptic connectivity, or short-term plasticity. We construct a network model based on a spiking neuron model with dynamical synapses, which can qualitatively reproduce experimentally demonstrated representational switching in the PFC when a monkey was performing a goal-oriented action-planning task. The model holds multiple sets of information that are required for action planning before and after representational switching by reconfiguration of functional cell assemblies. Furthermore, we analyzed the population dynamics of this model with a mean field model and show that the changes in the cell assemblies' configuration correspond to changes in attractor structure that can be viewed as a bifurcation process of the dynamical system. This dynamical reorganization of a neural network could be a key to uncovering the mechanism of flexible information processing in the PFC.

The prefrontal cortex plays a highly flexible role in various cognitive tasks, e.g., decision making and action planning. Neurons in the prefrontal cortex exhibit flexible representation of, or selectivity for, task relevant information and are involved in working memory with sustained activity, which can be modeled as attractor dynamics. Moreover, recent experiments revealed that prefrontal neurons not only represent parametric or discrete sets of information but also switch the representation and transform a set of information into another set in order to match the context of the required task. However, the underlying mechanisms of this flexible representational switching are unknown. Here we propose a dynamically reorganizable attractor network model in which short-term modulation of the synaptic connections reconfigures the structure of neural attractors by assembly and disassembly of a network of cells to produce flexible attractor dynamics. On the basis of computer simulation as well as theoretical analysis, we show that this model reproduces experimentally demonstrated representational switching, and that switching along certain characteristic axes defining the neural dynamics captures the essence of the representational switching. This model has the potential to provide unique insights into flexible information representation and processing in the cortical network.
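The short-term plasticity invoked here can be illustrated with a minimal sketch of a depressing synapse in the style of the Tsodyks-Markram model. This is not the paper's spiking network; the parameter values and the square-pulse rate profile are illustrative assumptions only. Sustained firing of one assembly depletes its synaptic resources, which is the kind of internal change that can reconfigure which attractor dominates.

```python
import numpy as np

def tsodyks_markram(rate_fn, t_end=2.0, dt=1e-3, tau_rec=0.5, U=0.4):
    """Short-term depression: available resources x recover with time
    constant tau_rec and are consumed in proportion to the presynaptic rate."""
    n = int(t_end / dt)
    x, eff = 1.0, []
    for i in range(n):
        r = rate_fn(i * dt)
        x += dt * ((1.0 - x) / tau_rec - U * x * r)
        eff.append(U * x)  # effective synaptic efficacy U * x
    return np.array(eff)

# A 40 Hz burst between 0.5 s and 1.5 s depresses the synapse, then
# efficacy recovers once the assembly falls silent.
eff = tsodyks_markram(lambda t: 40.0 if 0.5 <= t < 1.5 else 0.0)
print(eff[0], eff.min(), eff[-1])
```

During the burst the efficacy settles near U/(1 + U * r * tau_rec), so a strongly active assembly transiently weakens its own recurrent support.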
Affiliation(s)
- Yuichi Katori
- FIRST, Aihara Innovative Mathematical Modelling Project, JST, Kawaguchi, Japan.
153
Perdikis D, Huys R, Jirsa VK. Time scale hierarchies in the functional organization of complex behaviors. PLoS Comput Biol 2011; 7:e1002198. [PMID: 21980278] [PMCID: PMC3182871] [DOI: 10.1371/journal.pcbi.1002198]
Abstract
Traditional approaches to cognitive modelling generally portray cognitive events in terms of ‘discrete’ states (point attractor dynamics) rather than in terms of processes, thereby neglecting the time structure of cognition. In contrast, more recent approaches explicitly address this temporal dimension, but typically provide no entry points into cognitive categorization of events and experiences. With the aim of incorporating both these aspects, we propose a framework for functional architectures. Our approach is grounded in the notion that arbitrarily complex (human) behaviour is decomposable into functional modes (elementary units), which we conceptualize as low-dimensional dynamical objects (structured flows on manifolds). The ensemble of modes at an agent’s disposal constitutes his/her functional repertoire. The modes may be subjected to additional dynamics (termed operational signals), in particular, instantaneous inputs, and a mechanism that sequentially selects a mode so that it temporarily dominates the functional dynamics. The inputs and selection mechanisms act on faster and slower time scales, respectively, than those inherent to the modes. The dynamics across the three time scales are coupled via feedback, rendering the entire architecture autonomous. We illustrate the functional architecture in the context of serial behaviour, namely cursive handwriting. Subsequently, we investigate the possibility of recovering the contributions of functional modes and operational signals from the output, which appears to be possible only when examining the output phase flow (i.e., not from trajectories in phase space or time).

In most established approaches to cognitive modelling, cognitive events are treated as ‘discrete’ states, thus passing over the continuous nature of cognitive processes. In contrast, some novel approaches explicitly acknowledge cognition’s temporal structure but provide no entry points into cognitive categorization of events and experiences. We attempt to incorporate both aspects in a new framework, which departs from the established idea that complex (human) behaviour is made up of elementary functional ‘building blocks’, referred to as modes. We model these as mathematical objects that are inherently dynamic (i.e., account for change over time). A mechanism sequentially selects the modes required and binds them together to compose complex behaviours. These modes may be subjected to brief inputs. The ensemble of these three ingredients, which influence one another and operate on different time scales, constitutes a functional architecture. We illustrate the architecture via cursive handwriting simulations, and investigate the possibility of recovering the contributions of the architecture from the written word. This appears possible only when focussing on the dynamic modes.
Affiliation(s)
- Dionysios Perdikis
- Theoretical Neuroscience Group, UMR6233, Institut Science du Mouvement, University of the Mediterranean, Marseille, France
- Raoul Huys
- Theoretical Neuroscience Group, UMR6233, Institut Science du Mouvement, University of the Mediterranean, Marseille, France
- Viktor K. Jirsa
- Theoretical Neuroscience Group, UMR6233, Institut Science du Mouvement, University of the Mediterranean, Marseille, France
154
Graziano M, Polosecki P, Shalom DE, Sigman M. Parsing a perceptual decision into a sequence of moments of thought. Front Integr Neurosci 2011; 5:45. [PMID: 21941470] [PMCID: PMC3170920] [DOI: 10.3389/fnint.2011.00045]
Abstract
Theoretical, computational, and experimental studies have converged on a model of decision-making in which sensory evidence is stochastically integrated to a threshold, implementing a shift from an analog to a discrete form of computation. Understanding how this process can be chained and sequenced – as virtually all real-life tasks involve a sequence of decisions – remains an open question in neuroscience. We reasoned that incorporating a virtual continuum of possible behavioral outcomes in a simple decision task – a fundamental ingredient of real-life decision-making – should result in a progressive sequential approximation to the correct response. We used real-time tracking of motor action in a decision task as a measure of cognitive states reflecting an internal decision process. We found that response trajectories were spontaneously segmented into a discrete sequence of explorations separated by brief stops (about 200 ms) of which the participants remained unaware. The characteristics of these stops were indicative of a decision process – a “moment of thought”: their duration correlated with the difficulty of the decision and with the efficiency of the subsequent exploration. Our findings suggest that simple navigation in an abstract space involves a discrete sequence of explorations and stops and, moreover, that these stops reveal a fingerprint of moments of thought.
Affiliation(s)
- Martín Graziano
- Laboratorio de Neurociencia Integrativa, Departamento de Física, Facultad de Ciencias Exactas y Naturales - Universidad de Buenos Aires and Instituto de Física de Buenos Aires, Consejo Nacional de Investigaciones Científicas y Técnicas Buenos Aires, Argentina
155
Buckley CL, Nowotny T. Transient dynamics between displaced fixed points: an alternate nonlinear dynamical framework for olfaction. Brain Res 2011; 1434:62-72. [PMID: 21840510] [DOI: 10.1016/j.brainres.2011.07.032]
Abstract
Significant insights into the dynamics of neuronal populations have been gained in the olfactory system where rich spatio-temporal dynamics is observed during, and following, exposure to odours. It is now widely accepted that odour identity is represented in terms of stimulus-specific rate patterning observed in the cells of the antennal lobe (AL). Here we describe a nonlinear dynamical framework inspired by recent experimental findings which provides a compelling account of both the origin and the function of these dynamics. We start by analytically reducing a biologically plausible conductance based model of the AL to a quantitatively equivalent rate model and construct conditions such that the rate dynamics are well described by a single globally stable fixed point (FP). We then describe the AL's response to an odour stimulus as rich transient trajectories between this stable baseline state (the single FP in absence of odour stimulation) and the odour-specific position of the single FP during odour stimulation. We show how this framework can account for three phenomena that are observed experimentally. First, for an inhibitory period often observed immediately after an odour stimulus is removed. Second, for the qualitative differences between the dynamics in the presence and the absence of odour. Lastly, we show how it can account for the invariance of a representation of odour identity to both the duration and intensity of an odour stimulus. We compare and contrast this framework with the currently prevalent nonlinear dynamical framework of 'winnerless competition' which describes AL dynamics in terms of heteroclinic orbits. This article is part of a Special Issue entitled "Neural Coding".
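The core picture — a single globally stable fixed point that an odour input displaces, with rich transients between the baseline and displaced states — can be sketched numerically. This is a generic rate network with arbitrary weak random coupling and an arbitrary "odour" input vector, not the authors' reduced antennal-lobe model.

```python
import numpy as np

def simulate_rate_model(W, I, r0, dt=0.01, steps=2000, tau=1.0):
    """Integrate tau * dr/dt = -r + tanh(W @ r + I) with forward Euler."""
    r = np.array(r0, dtype=float)
    traj = [r.copy()]
    for _ in range(steps):
        r = r + dt / tau * (-r + np.tanh(W @ r + I))
        traj.append(r.copy())
    return np.array(traj)

rng = np.random.default_rng(0)
n = 10
# Weak random coupling keeps the network in the single-fixed-point regime.
W = rng.normal(0.0, 0.2 / np.sqrt(n), (n, n))

baseline = simulate_rate_model(W, I=np.zeros(n), r0=np.zeros(n))
odour = simulate_rate_model(W, I=rng.normal(0.0, 1.0, n), r0=baseline[-1])

# The odour input displaces the fixed point; the trajectory from baseline
# to the displaced fixed point is the stimulus-specific transient.
fp_baseline, fp_odour = baseline[-1], odour[-1]
print(np.linalg.norm(fp_odour - fp_baseline))
```

Because the displaced fixed point depends on the input vector but the relaxation outlasts changes in stimulus duration and scales gracefully with intensity, this toy already hints at the invariance properties discussed in the abstract.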
Affiliation(s)
- Christopher L Buckley
- Centre for Computational Neuroscience and Robotics, University of Sussex, Falmer, Brighton BN1 9QJ, UK.
156
Rabinovich MI, Varona P. Robust transient dynamics and brain functions. Front Comput Neurosci 2011; 5:24. [PMID: 21716642] [PMCID: PMC3116137] [DOI: 10.3389/fncom.2011.00024]
Abstract
In the last few decades several concepts of dynamical systems theory (DST) have guided psychologists, cognitive scientists, and neuroscientists to rethink sensorimotor behavior and embodied cognition. A critical step in the progress of DST application to the brain (supported by modern methods of brain imaging and multi-electrode recording techniques) has been the transfer of its initial success in motor behavior to mental function, i.e., perception, emotion, and cognition. Open questions from research in genetics, ecology, brain sciences, etc., have changed DST itself and led to the discovery of a new dynamical phenomenon, i.e., reproducible and robust transients that are at the same time sensitive to informational signals. The goal of this review is to describe a new mathematical framework - heteroclinic sequential dynamics - to understand self-organized activity in the brain that can explain certain aspects of robust itinerant behavior. Specifically, we discuss a hierarchy of coarse-grain models of mental dynamics in the form of kinetic equations of modes. These modes compete for resources at three levels: (i) within the same modality, (ii) among different modalities from the same family (like perception), and (iii) among modalities from different families (like emotion and cognition). The analysis of the conditions for robustness, i.e., the structural stability of transient (sequential) dynamics, makes it possible to explain phenomena like the finite capacity of our sequential working memory - a vital cognitive function - and to find specific dynamical signatures - different kinds of instabilities - of several brain functions and mental diseases.
157
Kurikawa T, Kaneko K. Learning shapes spontaneous activity itinerating over memorized states. PLoS One 2011; 6:e17432. [PMID: 21408170] [PMCID: PMC3050897] [DOI: 10.1371/journal.pone.0017432]
Abstract
Learning is a process that helps create neural dynamical systems so that an appropriate output pattern is generated for a given input. Often, such a memory is considered to be embedded in one of the attractors of the neural dynamical system, reached depending on the initial neural state specified by an input. Neither the neural activity observed in the absence of inputs nor the changes caused in the neural activity when an input is provided were studied extensively in the past. However, recent experimental studies have reported the existence of structured spontaneous neural activity and its changes when an input is provided. With this background, we propose that memory recall occurs when the spontaneous neural activity changes to an appropriate output activity upon the application of an input, a phenomenon known as bifurcation in dynamical systems theory. We introduce a reinforcement-learning-based layered neural network model with two synaptic time scales; in this network, I/O relations are successively memorized when the difference between the time scales is appropriate. After the learning process is complete, the neural dynamics are shaped so that they change appropriately with each input. As the number of memorized patterns is increased, the spontaneous neural activity generated after learning shows itineration over the previously learned output patterns. This theoretical finding shows remarkable agreement with recent experimental reports in which spontaneous neural activity in the visual cortex, in the absence of stimuli, itinerates over patterns evoked by previously applied signals. Our results suggest that itinerant spontaneous activity can be a natural outcome of successive learning of several patterns, and that it facilitates bifurcation of the network when an input is provided.
Affiliation(s)
- Tomoki Kurikawa
- Department of Basic Science, University of Tokyo, Tokyo, Japan.
158
Perdikis D, Huys R, Jirsa V. Complex processes from dynamical architectures with time-scale hierarchy. PLoS One 2011; 6:e16589. [PMID: 21347363] [PMCID: PMC3037373] [DOI: 10.1371/journal.pone.0016589]
Abstract
The idea that complex motor, perceptual, and cognitive behaviors are composed of smaller units, which are somehow brought into a meaningful relation, permeates the biological and life sciences. However, no principled framework defining the constituent elementary processes has been developed to date. Consequently, functional configurations (or architectures) relating elementary processes and external influences are mostly piecemeal formulations suitable to particular instances only. Here, we develop a general dynamical framework for distinct functional architectures characterized by the time-scale separation of their constituents and evaluate their efficiency. To this end, we build on the (phase) flow of a system, which prescribes the temporal evolution of its state variables. The phase flow topology allows for the unambiguous classification of qualitatively distinct processes, which we consider to represent the functional units or modes within the dynamical architecture. Using the example of a composite movement we illustrate how different architectures can be characterized by their degree of time scale separation between the internal elements of the architecture (i.e. the functional modes) and external interventions. We reveal a tradeoff in the interactions between internal and external influences, which offers a theoretical justification for the efficient composition of complex processes out of non-trivial elementary processes or functional modes.
Affiliation(s)
- Dionysios Perdikis
- Theoretical Neuroscience Group, UMR6233 Institut Science du Mouvement, University of the Mediterranean, Marseille, France.
159
Building Neurocognitive Networks with a Distributed Functional Architecture. Adv Exp Med Biol 2011; 718:101-9. [DOI: 10.1007/978-1-4614-0164-3_9]
160
History-dependent excitability as a single-cell substrate of transient memory for information discrimination. PLoS One 2010; 5:e15023. [PMID: 21203387] [PMCID: PMC3010997] [DOI: 10.1371/journal.pone.0015023]
Abstract
Neurons react differently to incoming stimuli depending upon their previous history of stimulation. This property can be considered as a single-cell substrate for transient memory, or context-dependent information processing: depending upon the current context that the neuron "sees" through the subset of the network impinging on it in the immediate past, the same synaptic event can evoke a postsynaptic spike or just a subthreshold depolarization. We propose a formal definition of History-Dependent Excitability (HDE) as a measure of the propensity to firing in any moment in time, linking the subthreshold history-dependent dynamics with spike generation. This definition allows the quantitative assessment of the intrinsic memory for different single-neuron dynamics and input statistics. We illustrate the concept of HDE by considering two general dynamical mechanisms: the passive behavior of an Integrate and Fire (IF) neuron, and the inductive behavior of a Generalized Integrate and Fire (GIF) neuron with subthreshold damped oscillations. This framework allows us to characterize the sensitivity of different model neurons to the detailed temporal structure of incoming stimuli. While a neuron with intrinsic oscillations discriminates equally well between input trains with the same or different frequency, a passive neuron discriminates better between inputs with different frequencies. This suggests that passive neurons are better suited to rate-based computation, while neurons with subthreshold oscillations are advantageous in a temporal coding scheme. We also address the influence of intrinsic properties in single-cell processing as a function of input statistics, and show that intrinsic oscillations enhance discrimination sensitivity at high input rates. Finally, we discuss how the recognition of these cell-specific discrimination properties might further our understanding of neuronal network computations and their relationships to the distribution and functional connectivity of different neuronal types.
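The basic phenomenon — the same synaptic event either evoking a spike or only a subthreshold depolarization depending on recent history — can be sketched with a plain leaky integrate-and-fire neuron. This is a generic toy, not the paper's IF/GIF comparison or its formal HDE measure; all parameter values (tau, threshold, pulse weight) are illustrative assumptions.

```python
def lif_response(pulse_times, t_end=100.0, dt=0.1, tau=20.0,
                 v_thresh=1.0, v_reset=0.0, w=0.55):
    """Leaky integrate-and-fire: tau * dV/dt = -V; each input pulse at the
    given times (ms) adds w to V; a spike is emitted when V crosses threshold."""
    n = int(t_end / dt)
    v, spikes = 0.0, []
    pulses = {int(round(t / dt)) for t in pulse_times}
    for i in range(n):
        v += dt / tau * (-v)       # passive leak
        if i in pulses:
            v += w                 # synaptic pulse
        if v >= v_thresh:
            spikes.append(i * dt)
            v = v_reset
    return spikes

# The same pulse at t = 30 ms evokes a spike only when recent history has
# depolarized the cell: a single-cell transient memory of the input.
alone = lif_response([30.0])             # no prior context -> subthreshold
primed = lif_response([10.0, 20.0, 30.0])  # recent pulses -> spike
print(len(alone), len(primed))
```

Replacing the passive leak with a resonant (damped-oscillation) subthreshold current would make the response depend on pulse timing as well as pulse count, which is the IF-versus-GIF contrast the abstract draws.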
161
Abstract
A widely discussed hypothesis in neuroscience is that transiently active ensembles of neurons, known as "cell assemblies," underlie numerous operations of the brain, from encoding memories to reasoning. However, the mechanisms responsible for the formation and disbanding of cell assemblies and temporal evolution of cell assembly sequences are not well understood. I introduce and review three interconnected topics, which could facilitate progress in defining cell assemblies, identifying their neuronal organization, and revealing causal relationships between assembly organization and behavior. First, I hypothesize that cell assemblies are best understood in light of their output product, as detected by "reader-actuator" mechanisms. Second, I suggest that the hierarchical organization of cell assemblies may be regarded as a neural syntax. Third, constituents of the neural syntax are linked together by dynamically changing constellations of synaptic weights ("synapsembles"). The existing support for this tripartite framework is reviewed and strategies for experimental testing of its predictions are discussed.
Affiliation(s)
- György Buzsáki
- Center for Molecular and Behavioral Neuroscience, Rutgers, The State University of New Jersey, 197 University Avenue, Newark, NJ 07102, USA.
162
Pascanu R, Jaeger H. A neurodynamical model for working memory. Neural Netw 2010; 24:199-207. [PMID: 21036537] [DOI: 10.1016/j.neunet.2010.10.003]
Abstract
Neurodynamical models of working memory (WM) should provide mechanisms for storing, maintaining, retrieving, and deleting information. Many models address only a subset of these aspects. Here we present a rather simple WM model in which all of these performance modes are trained into a recurrent neural network (RNN) of the echo state network (ESN) type. The model is demonstrated on a bracket level parsing task with a stream of rich and noisy graphical script input. In terms of nonlinear dynamics, memory states correspond, intuitively, to attractors in an input-driven system. As a supplementary contribution, the article proposes a rigorous formal framework to describe such attractors, generalizing from the standard definition of attractors in autonomous (input-free) dynamical systems.
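The intuition that memory states correspond to attractors in an input-driven system can be sketched with a hand-built two-unit flip-flop: a transient input pulse writes a bit, and attractor dynamics maintain it through input-free periods. This toy is not the trained echo state network of the paper; the weights, gain, and input amplitude are arbitrary assumptions chosen only to make the bistability work.

```python
import numpy as np

def flipflop(inputs, dt=0.1, tau=1.0, steps_per_symbol=50):
    """Two mutually inhibiting rate units; which one is active stores a bit.
    Each symbol in `inputs` drives the network for steps_per_symbol steps."""
    w_self, w_cross, amp, theta = 2.0, -2.0, 3.0, 0.5
    f = lambda x: 1.0 / (1.0 + np.exp(-8.0 * (x - theta)))  # steep sigmoid
    r = np.zeros(2)
    states = []
    for inp in inputs:  # +1 sets the bit, -1 clears it, 0 means "hold"
        I = amp * np.array([max(inp, 0.0), max(-inp, 0.0)])
        for _ in range(steps_per_symbol):
            r = r + dt / tau * (-r + f(w_self * r + w_cross * r[::-1] + I))
        states.append(int(r[0] > r[1]))
    return states

# A bit written by a transient pulse persists through input-free periods
# (storing), is overwritten by the opposite pulse (deleting/rewriting):
print(flipflop([+1, 0, 0, 0, -1, 0, 0, +1, 0]))
```

In the paper's terms, the two self-sustaining activity patterns play the role of attractors of the input-driven system, and set/clear pulses move the state between their basins.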
163
Afraimovich V, Young T, Muezzinoglu MK, Rabinovich MI. Nonlinear dynamics of emotion-cognition interaction: when emotion does not destroy cognition? Bull Math Biol 2010; 73:266-84. [PMID: 20821062] [DOI: 10.1007/s11538-010-9572-x]
Abstract
Emotion (i.e., spontaneous motivation and subsequent implementation of a behavior) and cognition (i.e., problem solving by information processing) are essential to how we, as humans, respond to changes in our environment. Recent studies in cognitive science suggest that emotion and cognition are subserved by different, although heavily integrated, neural systems. Understanding the time-varying relationship of emotion and cognition is a challenging goal with important implications for neuroscience. Here we formulate a dynamical model of emotion-cognition interaction that is based on the following principles: (1) the temporal evolution of cognitive and emotional modes is driven by incoming stimuli and by competition within and among the modes (competition principle); (2) metastable states exist in the unified emotion-cognition phase space; and (3) the brain processes information with robust and reproducible transients through a sequence of metastable states. Such a model can take advantage of the often ignored temporal structure of the emotion-cognition interaction to provide a robust and generalizable method for understanding the relationship between brain activation and complex human behavior. The mathematical images of such robust and reproducible transient dynamics are the Stable Heteroclinic Sequence (SHS) and the Stable Heteroclinic Channel (SHC), which have been hypothesized to be possible mechanisms underlying the sequential transient behavior observed in networks. We investigate the modularity of SHCs: given an SHS and an SHC supported in one part of a network, we study the conditions under which the SHC pertaining to cognition continues to function in the presence of interfering activity from other parts of the network, i.e., emotion.
164
Jirsa VK, Stefanescu RA. Neural population modes capture biologically realistic large scale network dynamics. Bull Math Biol 2010; 73:325-43. [PMID: 20821061] [DOI: 10.1007/s11538-010-9573-9]
Abstract
Large scale brain networks are now understood to underlie the emergence of cognitive functions, though the detailed mechanisms remain unknown. The challenges in the study of large scale brain networks include their high dimensionality, which requires significant computational effort; the complex connectivity across brain areas and the associated transmission delays; and the stochastic nature of neuronal processes. To decrease the computational effort, neurons are clustered into neural masses, which are then approximated by reduced descriptions of population dynamics. Here, we implement a neural population mode approach (Assisi et al. in Phys. Rev. Lett. 94(1):018106, 2005; Stefanescu and Jirsa in PLoS Comput. Biol. 4(11):e1000219, 2008), which parsimoniously captures various types of population behavior. We numerically demonstrate that the reduced population mode system favorably captures the high-dimensional dynamics of neuron networks with an architecture involving homogeneous local connectivity and a large-scale, fiber-like connection with time delay.
Affiliation(s)
- Viktor K Jirsa
- Theoretical Neuroscience Group, Institute Sciences de Mouvement, UMR6233 CNRS, Marseille, France.
165
Sequentially switching cell assemblies in random inhibitory networks of spiking neurons in the striatum. J Neurosci 2010; 30:5894-911. [PMID: 20427650] [DOI: 10.1523/jneurosci.5540-09.2010]
Abstract
The striatum is composed of GABAergic medium spiny neurons with inhibitory collaterals forming a sparse random asymmetric network and receiving an excitatory glutamatergic cortical projection. Because the inhibitory collaterals are sparse and weak, their role in striatal network dynamics is puzzling. However, here we show by simulation of a striatal inhibitory network model composed of spiking neurons that cells form assemblies that fire in sequential coherent episodes and display complex identity-temporal spiking patterns even when cortical excitation is simply constant or fluctuating noisily. Strongly correlated large-scale firing rate fluctuations on slow behaviorally relevant timescales of hundreds of milliseconds are shown by members of the same assembly whereas members of different assemblies show strong negative correlation, and we show how randomly connected spiking networks can generate this activity. Cells display highly irregular spiking with high coefficients of variation, broadly distributed low firing rates, and interspike interval distributions that are consistent with exponentially tailed power laws. Although firing rates vary coherently on slow timescales, precise spiking synchronization is absent in general. Our model only requires the minimal but striatally realistic assumptions of sparse to intermediate random connectivity, weak inhibitory synapses, and sufficient cortical excitation so that some cells are depolarized above the firing threshold during up states. Our results are in good qualitative agreement with experimental studies, consistent with recently determined striatal anatomy and physiology, and support a new view of endogenously generated metastable state switching dynamics of the striatal network underlying its information processing operations.
166
Bick C, Rabinovich MI. Dynamical origin of the effective storage capacity in the brain's working memory. Phys Rev Lett 2009; 103:218101. [PMID: 20366069] [DOI: 10.1103/physrevlett.103.218101]
Abstract
The capacity of working memory (WM), a short-term buffer for information in the brain, is limited. We suggest a model for sequential WM that is based upon winnerless competition amongst representations of available informational items. Analytical results for the underlying mathematical model relate WM capacity and relative lateral inhibition in the corresponding neural network. This implies an upper bound for WM capacity, which is, under reasonable neurobiological assumptions, close to the "magical number seven."
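The winnerless-competition mechanism invoked here can be sketched with a generalized Lotka-Volterra network: asymmetric lateral inhibition turns each item's state into a saddle whose unstable direction points to the next item, so the items are recalled in a fixed order. This three-item toy with hand-picked inhibition values illustrates the sequential dynamics only, not the paper's capacity bound.

```python
import numpy as np

def glv_winners(rho, a0, dt=0.01, steps=6000):
    """Generalized Lotka-Volterra: da_i/dt = a_i * (1 - a_i - sum_j rho_ij a_j),
    with rho_ii = 0. Returns the index of the dominant item at each step."""
    a = np.array(a0, dtype=float)
    winners = []
    for _ in range(steps):
        a = a + dt * a * (1.0 - a - rho @ a)
        a = np.maximum(a, 1e-12)  # keep activities non-negative
        winners.append(int(np.argmax(a)))
    return winners

# rho[i][j] is the inhibition item j exerts on item i. Each item inhibits
# its successor weakly (rho < 1, so the successor can invade) and its
# predecessor strongly (rho > 1), producing the sequence 0 -> 1 -> 2.
rho = np.array([[0.0, 2.0, 0.5],
                [0.5, 0.0, 2.0],
                [2.0, 0.5, 0.0]])
winners = glv_winners(rho, a0=[1.0, 0.01, 0.0001])
order = [w for i, w in enumerate(winners) if i == 0 or w != winners[i - 1]]
print(order)
```

The paper's capacity result concerns how many such items can be chained before the required ratio of strong to weak inhibition becomes neurobiologically implausible, which bounds the sequence length near the "magical number seven."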
Affiliation(s)
- Christian Bick
- BioCircuits Institute, University of California, San Diego, 9500 Gilman Drive, La Jolla, California 92093-0402, USA.
|
167
|
Abstract
The neural basis of olfactory information processing and olfactory percept formation is a topic of intense investigation as new genetic, optical, and psychophysical tools are brought to bear to identify the sites and interaction modes of cortical areas involved in the central processing of olfactory information. New methods for recording cellular interactions and network events in the awake, behaving brain during olfactory processing and odor-based decision making have shown remarkable new properties of neuromodulation and synaptic interactions distinct from those observed in anesthetized brains. Psychophysical, imaging, and computational studies point to the orbitofrontal cortex as the likely locus of odor percept formation in mammals, but further work is needed to identify a causal link between perceptual and neural events in this area.
Affiliation(s)
- Alan Gelperin
- Monell Chemical Senses Center, Philadelphia, Pennsylvania 19104, USA.
|
168
|
Kiebel SJ, von Kriegstein K, Daunizeau J, Friston KJ. Recognizing sequences of sequences. PLoS Comput Biol 2009; 5:e1000464. [PMID: 19680429 PMCID: PMC2714976 DOI: 10.1371/journal.pcbi.1000464] [Citation(s) in RCA: 89] [Impact Index Per Article: 5.9] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/18/2009] [Accepted: 07/10/2009] [Indexed: 11/17/2022] Open
Abstract
The brain's decoding of fast sensory streams is currently impossible to emulate, even approximately, with artificial agents. For example, robust speech recognition is relatively easy for humans but exceptionally difficult for artificial speech recognition systems. In this paper, we propose that recognition can be simplified with an internal model of how sensory input is generated, when formulated in a Bayesian framework. We show that a plausible candidate for an internal or generative model is a hierarchy of ‘stable heteroclinic channels’. This model describes continuous dynamics in the environment as a hierarchy of sequences, where slower sequences cause faster sequences. Under this model, online recognition corresponds to the dynamic decoding of causal sequences, giving a representation of the environment with predictive power on several timescales. We illustrate the ensuing decoding or recognition scheme using synthetic sequences of syllables, where syllables are sequences of phonemes and phonemes are sequences of sound-wave modulations. By presenting anomalous stimuli, we find that the resulting recognition dynamics disclose inference at multiple time scales and are reminiscent of neuronal dynamics seen in the real brain. Despite tremendous advances in neuroscience, we cannot yet build machines that recognize the world as effortlessly as we do. One reason might be that there are computational approaches to recognition that have not yet been exploited. Here, we demonstrate that the ability to recognize temporal sequences might play an important part. We show that an artificial decoding device can extract natural speech sounds from sound waves if speech is generated as dynamic and transient sequences of sequences. In principle, this means that artificial recognition can be implemented robustly and online using dynamic systems theory and Bayesian inference.
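The "sequences of sequences" idea can be caricatured with two winnerless-competition (Lotka-Volterra) layers, where the slow layer's current winner biases the fast layer's growth rates, so a slow "syllable" shapes the speed of a fast "phoneme" cycle. The coupling scheme and all parameters below are illustrative assumptions, not the authors' generative model or their Bayesian decoding scheme:

```python
import numpy as np

rng = np.random.default_rng(4)

# Asymmetric competition matrix giving a stable heteroclinic (winnerless) cycle
rho = np.array([[1.0, 2.0, 0.5],
                [0.5, 1.0, 2.0],
                [2.0, 0.5, 1.0]])

def wlc_step(a, sigma, dt, noise):
    """One Euler step of a 3-species Lotka-Volterra system with small positive noise."""
    da = a * (sigma - rho @ a)
    return np.clip(a + dt * da + noise * rng.random(3), 1e-12, None)

dt, steps = 0.01, 40000
slow = np.array([1.0, 0.02, 0.01])   # slow level: 'syllables'
fast = np.array([0.5, 0.3, 0.2])     # fast level: 'phonemes'
slow_seq, fast_seq = [], []
for _ in range(steps):
    slow = wlc_step(slow, np.ones(3), dt / 10.0, 1e-8)   # 10x slower timescale
    # the currently winning slow item biases the fast level's growth rates
    bias = np.full(3, 0.8)
    bias[np.argmax(slow)] = 1.2
    fast = wlc_step(fast, bias, dt, 1e-6)
    slow_seq.append(int(np.argmax(slow)))
    fast_seq.append(int(np.argmax(fast)))
```

The fast winner cycles through its items repeatedly while the slow winner changes rarely, which is the hierarchy-of-timescales structure the abstract's generative model formalizes with stable heteroclinic channels.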
|
169
|
Sauer TD, Schiff SJ. Data assimilation for heterogeneous networks: the consensus set. Phys Rev E 2009; 79:051909. [PMID: 19518482 PMCID: PMC2951269 DOI: 10.1103/physreve.79.051909] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 07/07/2008] [Revised: 03/11/2009] [Indexed: 05/12/2023]
Abstract
Data assimilation in dynamical networks is intrinsically challenging. A method is introduced for the tracking of heterogeneous networks of oscillators or excitable cells in a nonstationary environment, using a homogeneous model network to expedite the accurate reconstruction of parameters and unobserved variables. An implementation using ensemble Kalman filtering to track the states of the heterogeneous network is demonstrated on simulated data and applied to a mammalian brain network experiment. The approach has broad applicability for the prediction and control of biological and physical networks.
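The ensemble Kalman filtering mentioned in the abstract can be sketched on a toy damped oscillator observed through noisy position measurements. This is a generic perturbed-observation EnKF with assumed parameters, not the paper's heterogeneous-network construction:

```python
import numpy as np

rng = np.random.default_rng(1)

# True system: damped harmonic oscillator (position, velocity), Euler-discretized
dt, omega, gamma = 0.05, 2.0, 0.5
F = np.array([[1.0, dt],
              [-omega**2 * dt, 1.0 - gamma * dt]])
H = np.array([[1.0, 0.0]])   # observe position only
R, Q = 0.05, 1e-4            # observation / process noise variances

steps, M = 400, 50           # time steps, ensemble size
x_true = np.array([1.0, 0.0])
ens = rng.normal(0.0, 1.0, (M, 2))   # poorly initialized ensemble

errs = []
for _ in range(steps):
    x_true = F @ x_true + rng.normal(0.0, np.sqrt(Q), 2)
    y = H @ x_true + rng.normal(0.0, np.sqrt(R))
    # Forecast: propagate every ensemble member with process noise
    ens = ens @ F.T + rng.normal(0.0, np.sqrt(Q), (M, 2))
    # Analysis: perturbed-observation EnKF update
    X = ens - ens.mean(axis=0)
    P = X.T @ X / (M - 1)            # sample state covariance (2x2)
    S = H @ P @ H.T + R              # innovation covariance (1x1)
    K = (P @ H.T) / S                # Kalman gain (2x1)
    y_pert = y + rng.normal(0.0, np.sqrt(R), M)
    ens = ens + (y_pert[:, None] - ens @ H.T) @ K.T
    errs.append(float(np.linalg.norm(ens.mean(axis=0) - x_true)))
```

The ensemble mean tracks both the observed position and the unobserved velocity, which is the core idea the paper extends to reconstruct parameters and hidden variables across a heterogeneous network.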
Affiliation(s)
- Timothy D Sauer
- Department of Mathematical Sciences, George Mason University, Fairfax, Virginia 22030, USA.
|
170
|
Oullier O, Kirman AP, Kelso JAS. The coordination dynamics of economic decision making: a multilevel approach to social neuroeconomics. IEEE Trans Neural Syst Rehabil Eng 2009; 16:557-71. [PMID: 19144588 DOI: 10.1109/tnsre.2008.2009960] [Citation(s) in RCA: 19] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/06/2022]
Abstract
The basic reciprocity between individual parts and collective organization constitutes a key scientific question spanning the biological and social sciences. Such reciprocity is accompanied by the absence of direct linkages between levels of description, giving rise to what is often referred to as the aggregation or nonequivalence problem between levels of analysis. This issue is encountered in both neuroscience and economics. So far, in spite of being identified and extensively discussed in various other scientific fields, the problem of understanding the nature of the interactions and coordination dynamics between individual (neuron ≈ agent) and collective (neural network ≈ population of humans) behaviors has received little, if any, attention in the growing field of neuroeconomics. The present contribution focuses on bringing a theoretical perspective to the interpretation of experiments recently published in this field and on addressing how the concepts and methods of coordination dynamics may impact future research. First, we very briefly discuss the links between biology and economics. Second, we address the nonequivalence problem between different levels of analysis and the concept of reciprocal causality. Third, neuroeconomics studies that investigate the neural underpinnings of social decision making in the context of two economic games (trust and ultimatum) are reviewed to highlight issues that arise when experimental results exist at multiple scales of observation and description. Finally, in the last two sections, we discuss how coordination dynamics might provide novel routes to studying and modelling the relation between brain activity and decision making.
Affiliation(s)
- Olivier Oullier
- Human Neurobiology Laboratory, Aix-Marseille University and CNRS, F-13331 Marseille, France.
|
171
|
Canalization of gene expression and domain shifts in the Drosophila blastoderm by dynamical attractors. PLoS Comput Biol 2009; 5:e1000303. [PMID: 19282965 PMCID: PMC2646127 DOI: 10.1371/journal.pcbi.1000303] [Citation(s) in RCA: 166] [Impact Index Per Article: 11.1] [Reference Citation Analysis] [Abstract] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/30/2008] [Accepted: 01/23/2009] [Indexed: 12/25/2022] Open
Abstract
The variation in the expression patterns of the gap genes in the blastoderm of the fruit fly Drosophila melanogaster reduces over time as a result of cross regulation between these genes, a fact that we have demonstrated in an accompanying article in PLoS Biology (see Manu et al., doi:10.1371/journal.pbio.1000049). This biologically essential process is an example of the phenomenon known as canalization. It has been suggested that the developmental trajectory of a wild-type organism is inherently stable, and that canalization is a manifestation of this property. Although the role of gap genes in the canalization process was established by correctly predicting the response of the system to particular perturbations, the stability of the developmental trajectory remains to be investigated. For many years, it has been speculated that stability against perturbations during development can be described by dynamical systems having attracting sets that drive reductions of volume in phase space. In this paper, we show that both the reduction in variability of gap gene expression as well as shifts in the position of posterior gap gene domains are the result of the actions of attractors in the gap gene dynamical system. Two biologically distinct dynamical regions exist in the early embryo, separated by a bifurcation at 53% egg length. In the anterior region, reduction in variation occurs because of stability induced by point attractors, while in the posterior, the stability of the developmental trajectory arises from a one-dimensional attracting manifold. This manifold also controls a previously characterized anterior shift of posterior region gap domains. Our analysis shows that the complex phenomena of canalization and pattern formation in the Drosophila blastoderm can be understood in terms of the qualitative features of the dynamical system. The result confirms the idea that attractors are important for developmental stability and shows a richer variety of dynamical attractors in developmental systems than has been previously recognized.
|
172
|
Komarov MA, Osipov GV, Suykens JAK, Rabinovich MI. Numerical studies of slow rhythms emergence in neural microcircuits: bifurcations and stability. Chaos 2009; 19:015107. [PMID: 19335011 DOI: 10.1063/1.3096412] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/27/2023]
Abstract
There is a growing body of evidence that slow brain rhythms are generated by simple inhibitory neural networks. Sequential switching of tonic spiking activity is a widespread phenomenon underlying such rhythms. A realistic generative model explaining such reproducible switching is a dynamical system that employs a closed stable heteroclinic channel (SHC) in its phase space. Despite strong evidence for the existence of SHCs, the conditions for their emergence in a spiking network are unclear. In this paper, we analyze a minimal, reciprocally connected circuit of three spiking units and explore all possible dynamical regimes and transitions between them. We show that the SHC arises due to a Neimark-Sacker bifurcation of an unstable cycle.
Affiliation(s)
- M A Komarov
- Department of Control Theory, Nizhny Novgorod University, Nizhny Novgorod, Russia
|
173
|
beim Graben P, Potthast R. Inverse problems in dynamic cognitive modeling. Chaos 2009; 19:015103. [PMID: 19335007 DOI: 10.1063/1.3097067] [Citation(s) in RCA: 21] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/27/2023]
Abstract
Inverse problems for dynamical system models of cognitive processes comprise the determination of synaptic weight matrices or kernel functions for neural networks or neural/dynamic field models, respectively. We introduce dynamic cognitive modeling as a three-tier top-down approach in which cognitive processes are first described as algorithms that operate on complex symbolic data structures. Second, symbolic expressions and operations are represented by states and transformations in abstract vector spaces. Third, prescribed trajectories through representation space are implemented in neurodynamical systems. We discuss the Amari equation for a neural/dynamic field theory as a special case and show that the kernel construction problem is particularly ill-posed. We suggest a Tikhonov-Hebbian learning method as a regularization technique and demonstrate its validity and robustness for basic examples of cognitive computations.
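A finite-dimensional analogue makes the Tikhonov-Hebbian idea concrete: given prescribed state transitions X → Y, the ridge-regularized least-squares weight matrix is W = Y Xᵀ(X Xᵀ + λI)⁻¹. The discrete vector setting and the parameter values below are assumptions for illustration; the paper works with kernel functions of the continuous Amari field equation:

```python
import numpy as np

rng = np.random.default_rng(2)

n, m, lam = 50, 10, 1e-2   # units, training transitions, Tikhonov parameter

# Prescribed trajectory: m random states and their desired successors
X = rng.normal(size=(n, m))          # states x_k (columns)
Y = rng.normal(size=(n, m))          # desired successors x_{k+1}

# Tikhonov-regularized Hebbian construction:
#   minimize ||W X - Y||^2 + lam ||W||^2   =>   W = Y X^T (X X^T + lam I)^{-1}
W = Y @ X.T @ np.linalg.inv(X @ X.T + lam * np.eye(n))

# The learned map reproduces the prescribed transitions to high accuracy
err = np.linalg.norm(W @ X - Y) / np.linalg.norm(Y)
```

The regularizer λ trades reproduction accuracy against the norm of W, which is what makes the otherwise ill-posed construction problem well behaved.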
Affiliation(s)
- Peter beim Graben
- School of Psychology and Clinical Language Sciences, University of Reading, Reading, Berkshire, United Kingdom.
|
174
|
Arecchi FT, Kurths J. Introduction to focus issue: nonlinear dynamics in cognitive and neural systems. Chaos 2009; 19:015101. [PMID: 19335005 DOI: 10.1063/1.3106111] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/27/2023]
Abstract
In this Focus Issue, two interrelated concepts, namely, deterministic chaos and cognitive abilities, are discussed.
Affiliation(s)
- F Tito Arecchi
- CNR-Istituto Nazionale di Ottica Applicata, Firenze, Italy.
|
175
|
|
176
|
Afraimovich V, Tristan I, Huerta R, Rabinovich MI. Winnerless competition principle and prediction of the transient dynamics in a Lotka-Volterra model. Chaos 2008; 18:043103. [PMID: 19123613 DOI: 10.1063/1.2991108] [Citation(s) in RCA: 30] [Impact Index Per Article: 1.9] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/27/2023]
Abstract
Predicting the evolution of multispecies ecological systems is an intriguing problem. A sufficiently complex model with the necessary predicting power requires solutions that are structurally stable. Small variations of the system parameters should not qualitatively perturb its solutions. When one is interested in just asymptotic results of evolution (as time goes to infinity), then the problem has a straightforward mathematical image involving simple attractors (fixed points or limit cycles) of a dynamical system. However, for an accurate prediction of evolution, the analysis of transient solutions is critical. In this paper, in the framework of the traditional Lotka-Volterra model (generalized in some sense), we show that the transient solution representing multispecies sequential competition can be reproducible and predictable with high probability.
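A minimal generalized Lotka-Volterra simulation shows the sequential winnerless competition the abstract refers to: the trajectory visits saddle points in a fixed, reproducible order. The asymmetric competition matrix below is a standard textbook choice that yields a stable heteroclinic cycle, not a parameterization taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(3)

# Growth rates and asymmetric competition matrix (rho[i, j] = effect of j on i).
# rho[i+1, i] < 1 < rho[i-1, i] gives a heteroclinic cycle 0 -> 1 -> 2 -> 0.
sigma = np.ones(3)
rho = np.array([[1.0, 2.0, 0.5],
                [0.5, 1.0, 2.0],
                [2.0, 0.5, 1.0]])

dt, steps = 0.01, 60000
a = np.array([1.0, 0.02, 0.01])
winners = []
for _ in range(steps):
    da = a * (sigma - rho @ a)                          # Lotka-Volterra dynamics
    a = np.clip(a + dt * da + 1e-8 * rng.random(3), 1e-12, None)
    winners.append(int(np.argmax(a)))

# Each species transiently dominates, in a reproducible order
visited = set(winners)
n_switches = sum(1 for x, y in zip(winners, winners[1:]) if x != y)
```

The small additive noise keeps the trajectory off the invariant coordinate planes, so the system keeps switching; the switching order is set by the structure of `rho`, which is the structural stability of the transient that the paper analyzes.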
Affiliation(s)
- Valentin Afraimovich
- Instituto de Investigacion en Comunicacion Optica, Universidad Autonoma de San Luis Potosi, Karakorum 1470, Lomas 4a, 78220 San Luis Potosi, S.L.P., Mexico
|
177
|
|