1
Dinh C, Samuelsson JG, Hunold A, Hämäläinen MS, Khan S. Contextual MEG and EEG Source Estimates Using Spatiotemporal LSTM Networks. Front Neurosci 2021; 15:552666. [PMID: 33767606] [PMCID: PMC7985163] [DOI: 10.3389/fnins.2021.552666] [Citation(s) in RCA: 3]
Abstract
Most magneto- and electroencephalography (M/EEG) based source estimation techniques derive their estimates sample by sample, independently across time. However, neuronal assemblies are intricately interconnected, constraining the temporal evolution of neural activity that is detected by MEG and EEG; the observed neural currents must thus be highly context dependent. Here, we use a network of Long Short-Term Memory (LSTM) cells where the input is a sequence of past source estimates and the output is a prediction of the following estimate, which is then used to correct that estimate. In this study, we applied this technique to noise-normalized minimum norm estimates (MNE). Because the correction is derived from past activity (context), we call this implementation Contextual MNE (CMNE), although the technique can be used in conjunction with any source estimation method. We tested CMNE on simulated epileptiform activity and on recorded auditory steady-state response (ASSR) data, showing that the CMNE estimates exhibit a higher degree of spatial fidelity than the unfiltered estimates in the tested cases.
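As an editorial aside, the prediction-correction loop this abstract describes can be sketched in a few lines of NumPy. The context length, the mixing weight `alpha`, and the running-mean stand-in for the trained LSTM are illustrative assumptions, not details taken from the paper:

```python
import numpy as np

def contextual_correction(estimates, predict, context=5, alpha=0.5):
    """Correct sample-wise source estimates using a prediction from past context.

    estimates : (T, n_sources) array of per-sample estimates (e.g. MNE frames).
    predict   : callable mapping a (context, n_sources) window to a predicted
                next frame; in CMNE this is a trained LSTM network, here any
                stand-in predictor can be plugged in.
    context, alpha : window length and mixing weight -- illustrative values.
    """
    corrected = np.array(estimates, dtype=float)
    for t in range(context, len(estimates)):
        predicted = predict(corrected[t - context:t])
        # Blend the instantaneous estimate with its context-based prediction.
        corrected[t] = (1.0 - alpha) * estimates[t] + alpha * predicted
    return corrected

# Toy usage: a running-mean "predictor" stands in for the LSTM.
rng = np.random.default_rng(0)
frames = np.cumsum(rng.normal(size=(50, 8)), axis=0)
smoothed = contextual_correction(frames, lambda win: win.mean(axis=0))
```

In the actual CMNE pipeline, `predict` would be the trained LSTM and `estimates` the noise-normalized MNE time courses.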
Affiliation(s)
- Christoph Dinh
- Athinoula A. Martinos Center for Biomedical Imaging, Massachusetts General Hospital, Charlestown, MA, United States; Department of Radiology, Massachusetts General Hospital (MGH), Charlestown, MA, United States; Institute for Medical Engineering, Research Campus STIMULATE, Otto-von-Guericke University, Magdeburg, Germany
- John G Samuelsson
- Athinoula A. Martinos Center for Biomedical Imaging, Massachusetts General Hospital, Charlestown, MA, United States; Harvard Medical School, Boston, MA, United States; Harvard-MIT Division of Health Sciences and Technology (HST), Massachusetts Institute of Technology, Cambridge, MA, United States
- Alexander Hunold
- Institute of Biomedical Engineering and Informatics, Technische Universität Ilmenau, Ilmenau, Germany
- Matti S Hämäläinen
- Athinoula A. Martinos Center for Biomedical Imaging, Massachusetts General Hospital, Charlestown, MA, United States; Department of Radiology, Massachusetts General Hospital (MGH), Charlestown, MA, United States; Harvard Medical School, Boston, MA, United States
- Sheraz Khan
- Athinoula A. Martinos Center for Biomedical Imaging, Massachusetts General Hospital, Charlestown, MA, United States; Department of Radiology, Massachusetts General Hospital (MGH), Charlestown, MA, United States; Harvard Medical School, Boston, MA, United States
2
Fasoli D, Cattani A, Panzeri S. Transitions between asynchronous and synchronous states: a theory of correlations in small neural circuits. J Comput Neurosci 2017; 44:25-43. [PMID: 29124505] [PMCID: PMC5770155] [DOI: 10.1007/s10827-017-0667-3] [Citation(s) in RCA: 3]
Abstract
The study of correlations in neural circuits of different size, from the small size of cortical microcolumns to the large-scale organization of distributed networks studied with functional imaging, is a topic of central importance to systems neuroscience. However, a theory that explains how the parameters of mesoscopic networks composed of a few tens of neurons affect the underlying correlation structure is still missing. Here we consider a theory that can be applied to networks of arbitrary size with multiple populations of homogeneous fully-connected neurons, and we focus our analysis on the case of two small populations. We combine the analysis of local bifurcations of the dynamics of these networks with the analytical calculation of their cross-correlations. We study the correlation structure in different regimes, showing that a variation of the external stimuli causes the network to switch from asynchronous states, characterized by weak correlations and low variability, to synchronous states, characterized by strong correlations and wide temporal fluctuations. We show that asynchronous states are generated by strong stimuli, while synchronous states occur through critical slowing down when the stimulus moves the network close to a local bifurcation. In particular, strongly positive correlations occur at the saddle-node and Andronov-Hopf bifurcations of the network, while strongly negative correlations occur when the network undergoes spontaneous symmetry-breaking at the branching-point bifurcations. These results show how the correlation structure of firing-rate network models is strongly modulated by the external stimuli, even when the anatomical connections are kept fixed. They also suggest an effective mechanism through which biological networks may dynamically modulate the encoding and integration of sensory information.
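The stimulus dependence of correlations described here can be illustrated with a deliberately crude toy model: two reciprocally coupled stochastic rate units, whose zero-lag correlation grows as the coupling pushes the linearized system toward a bifurcation (critical slowing down). All parameter values are illustrative; the paper itself treats homogeneous fully-connected populations analytically:

```python
import numpy as np

def simulate_pair(w_cross, w_self=0.3, stim=0.0, T=50_000, dt=0.01,
                  sigma=0.1, seed=1):
    """Euler-Maruyama simulation of two reciprocally coupled rate units.

    A crude stand-in for the paper's two-population networks. Increasing
    w_cross moves the slow eigenmode of the linearized dynamics toward
    instability, producing critical slowing down.
    """
    rng = np.random.default_rng(seed)
    x = np.zeros((T, 2))
    for t in range(1, T):
        r = np.tanh(x[t - 1])
        drive = w_self * r + w_cross * r[::-1] + stim
        x[t] = (x[t - 1] + dt * (-x[t - 1] + drive)
                + np.sqrt(dt) * sigma * rng.normal(size=2))
    return x

def zero_lag_corr(x):
    return np.corrcoef(x[:, 0], x[:, 1])[0, 1]

weak = zero_lag_corr(simulate_pair(w_cross=0.05))   # far from bifurcation
strong = zero_lag_corr(simulate_pair(w_cross=0.6))  # close to bifurcation
```

With the stronger cross-coupling the slow eigenmode dominates the fluctuations, so the two units co-vary far more strongly, mirroring the asynchronous-to-synchronous transition described in the abstract.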
Affiliation(s)
- Diego Fasoli
- Laboratory of Neural Computation, Center for Neuroscience and Cognitive Systems @UniTn, Istituto Italiano di Tecnologia, 38068, Rovereto, Italy
- Center for Brain and Cognition, Computational Neuroscience Group, Universitat Pompeu Fabra, 08002, Barcelona, Spain
- Anna Cattani
- Laboratory of Neural Computation, Center for Neuroscience and Cognitive Systems @UniTn, Istituto Italiano di Tecnologia, 38068, Rovereto, Italy
- Department of Biomedical and Clinical Sciences "L. Sacco", University of Milan, Milan, Italy
- Stefano Panzeri
- Laboratory of Neural Computation, Center for Neuroscience and Cognitive Systems @UniTn, Istituto Italiano di Tecnologia, 38068, Rovereto, Italy
3
Fasoli D, Cattani A, Panzeri S. The Complexity of Dynamics in Small Neural Circuits. PLoS Comput Biol 2016; 12:e1004992. [PMID: 27494737] [PMCID: PMC4975407] [DOI: 10.1371/journal.pcbi.1004992] [Citation(s) in RCA: 9]
Abstract
Mean-field approximations are a powerful tool for studying large neural networks. However, they do not describe well the behavior of networks composed of a small number of neurons, and in this case major differences between the mean-field approximation and the real behavior of the network can arise. Yet, many interesting problems in neuroscience involve the study of mesoscopic networks composed of a few tens of neurons, and mathematical methods that correctly describe networks of small size are still rare; this prevents us from making progress in understanding neural dynamics at these intermediate scales. Here we develop a novel systematic analysis of the dynamics of arbitrarily small networks composed of homogeneous populations of excitatory and inhibitory firing-rate neurons. We study the local bifurcations of their neural activity with an approach that is largely analytically tractable, and we numerically determine the global bifurcations. We find that for strong inhibition these networks give rise to very complex dynamics, caused by the formation of multiple branching solutions of the neural dynamics equations that emerge through spontaneous symmetry-breaking. This qualitative change of the neural dynamics is a finite-size effect of the network that reveals qualitative and previously unexplored differences between mesoscopic cortical circuits and their mean-field approximation. The most important consequence of spontaneous symmetry-breaking is the ability of mesoscopic networks to regulate their degree of functional heterogeneity, which is thought to help reduce the detrimental effect of noise correlations on cortical information processing. The mesoscopic level of brain organization, describing the organization and dynamics of small circuits comprising from a few tens to a few thousand neurons, has recently received considerable experimental attention.
It is useful for describing small neural systems of invertebrates, and in mammalian neural systems it is often seen as a middle ground that is fundamental for linking single-neuron activity to complex functions and behavior. However, and somewhat counter-intuitively, the behavior of neural networks of small and intermediate size can be much more difficult to study mathematically than that of large networks, and appropriate mathematical methods to study the dynamics of such networks have not yet been developed. Here we consider a model of a network of firing-rate neurons with arbitrary finite size, and we study its local bifurcations using an analytical approach. This analysis, complemented by numerical studies of both the local and global bifurcations, shows the emergence of strong and previously unexplored finite-size effects that are particularly hard to detect in large networks. This study advances the tools available for the comprehension of finite-size neural circuits, going beyond the insights provided by the mean-field approximation and current techniques for the quantification of finite-size effects.
Affiliation(s)
- Diego Fasoli
- Laboratory of Neural Computation, Center for Neuroscience and Cognitive Systems @UniTn, Istituto Italiano di Tecnologia, Rovereto, Italy
- Anna Cattani
- Laboratory of Neural Computation, Center for Neuroscience and Cognitive Systems @UniTn, Istituto Italiano di Tecnologia, Rovereto, Italy
- Stefano Panzeri
- Laboratory of Neural Computation, Center for Neuroscience and Cognitive Systems @UniTn, Istituto Italiano di Tecnologia, Rovereto, Italy
4
Wright JJ, Bourke PD. On the dynamics of cortical development: synchrony and synaptic self-organization. Front Comput Neurosci 2013; 7:4. [PMID: 23596410] [PMCID: PMC3573321] [DOI: 10.3389/fncom.2013.00004] [Citation(s) in RCA: 16]
Abstract
We describe a model for cortical development that resolves long-standing difficulties of earlier models. It is proposed that, during embryonic development, synchronous firing of neurons and their competition for limited metabolic resources leads to selection of an array of neurons with ultra-small-world characteristics. Consequently, in the visual cortex, macrocolumns linked by superficial patchy connections emerge in anatomically realistic patterns, with an ante-natal arrangement which projects signals from the surrounding cortex onto each macrocolumn in a form analogous to the projection of a Euclidean plane onto a Möbius strip. This configuration reproduces typical cortical response maps, and simulations of signal flow explain cortical responses to moving lines as functions of stimulus velocity, length, and orientation. With the introduction of direct visual inputs, under the operation of Hebbian learning, development of mature selective response “tuning” to stimuli of given orientation, spatial frequency, and temporal frequency would then take place, overwriting the earlier ante-natal configuration. The model is provisionally extended to hierarchical interactions of the visual cortex with higher centers, and a general principle for cortical processing of spatio-temporal images is sketched.
Affiliation(s)
- James Joseph Wright
- Department of Psychological Medicine, Faculty of Medicine, The University of Auckland, Auckland, New Zealand; Liggins Institute, The University of Auckland, Auckland, New Zealand
5
The laminar cortex model: a new continuum cortex model incorporating laminar architecture. PLoS Comput Biol 2012; 8:e1002733. [PMID: 23093925] [PMCID: PMC3475685] [DOI: 10.1371/journal.pcbi.1002733] [Citation(s) in RCA: 9]
Abstract
Local field potentials (LFPs) are widely used to study the function of local networks in the brain. They are also closely correlated with the blood-oxygen-level-dependent signal, the predominant contrast mechanism in functional magnetic resonance imaging. We developed a new laminar cortex model (LCM) to simulate the amplitude and frequency of LFPs. Our model combines the laminar architecture of the cerebral cortex with multiple continuum models to simulate the collective activity of cortical neurons. The five cortical layers (layers I, II/III, IV, V, and VI) are simulated as separate continuum models between which there are synaptic connections. The LCM was used to simulate the dynamics of the visual cortex under different conditions of visual stimulation. LFPs are reported for two kinds of visual stimulation: general visual stimulation and intermittent light stimulation. The power spectra of LFPs were calculated and compared with existing empirical data. The LCM was able to produce spontaneous LFPs exhibiting inverse-frequency (1/f) power spectrum behaviour. Laminar profiles of current source density showed similarities to experimental data. General stimulation enhanced the oscillation of LFPs at gamma frequencies. During simulated intermittent light stimulation, the LCM captured the fundamental as well as higher-order harmonics, as previously reported. The power spectrum expected with a reduction in layer IV neurons, often observed in focal cortical dysplasias associated with epilepsy, was also simulated.
Local field potentials (LFPs) are low-frequency fluctuations of the electric fields produced by the brain. They have been widely studied to understand brain function and activity. LFPs reflect the activity of neurons within a few square millimeters of the cerebral cortex, an area containing more than 10,000 neurons. To avoid the complexity of simulating such a large number of individual neurons, the continuum cortex model was devised to simulate the collective activity of groups of neurons generating cortical LFPs. However, the continuum cortex model assumes that the cortex is two-dimensional and does not take into account the laminar architecture of the cerebral cortex. We developed a three-dimensional laminar cortex model (LCM) by combining laminar architecture with the continuum cortex model. This expansion enables the LCM to simulate the detailed three-dimensional distribution of the LFP within the cortex. We used the LCM to simulate LFPs within the visual cortex under different conditions of visual stimulation. The LCM reproduced the key features of LFPs observed in electrophysiological experiments. We conclude that the LCM is a potentially useful tool to investigate the underlying mechanisms of LFPs.
6
Lamus C, Hämäläinen MS, Temereanca S, Brown EN, Purdon PL. A spatiotemporal dynamic distributed solution to the MEG inverse problem. Neuroimage 2011; 63:894-909. [PMID: 22155043] [DOI: 10.1016/j.neuroimage.2011.11.020] [Citation(s) in RCA: 51]
Abstract
MEG/EEG are non-invasive imaging techniques that record brain activity with high temporal resolution. However, estimation of brain source currents from surface recordings requires solving an ill-conditioned inverse problem. Converging lines of evidence in neuroscience, from neuronal network models to resting-state imaging and neurophysiology, suggest that cortical activation is a distributed spatiotemporal dynamic process, supported by both local and long-distance neuroanatomic connections. Because spatiotemporal dynamics of this kind are central to brain physiology, inverse solutions could be improved by incorporating models of these dynamics. In this article, we present a model for cortical activity based on nearest-neighbor autoregression that incorporates local spatiotemporal interactions between distributed sources in a manner consistent with neurophysiology and neuroanatomy. We develop a dynamic maximum a posteriori expectation-maximization (dMAP-EM) source localization algorithm for estimation of cortical sources and model parameters based on the Kalman Filter, the Fixed Interval Smoother, and the EM algorithms. We apply the dMAP-EM algorithm to simulated experiments as well as to human experimental data. Furthermore, we derive expressions to relate our dynamic estimation formulas to those of standard static models, and show how dynamic methods optimally assimilate past and future data. Our results establish the feasibility of spatiotemporal dynamic estimation in large-scale distributed source spaces with several thousand source locations and hundreds of sensors, with resulting inverse solutions that provide substantial performance improvements over static methods.
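The filtering-smoothing machinery at the heart of such dynamic estimators can be shown in the scalar case. This is a minimal sketch of a Kalman filter followed by a fixed-interval (Rauch-Tung-Striebel) smoother, not the paper's dMAP-EM algorithm, which operates on a high-dimensional source space with a nearest-neighbor autoregressive model and embeds these passes in an EM loop:

```python
import numpy as np

def kalman_rts(y, A=1.0, C=1.0, Q=1.0, R=1.0, x0=0.0, P0=1.0):
    """Scalar Kalman filter plus fixed-interval (RTS) smoother.

    State model x_t = A x_{t-1} + w_t, observation y_t = C x_t + v_t,
    with w ~ N(0, Q) and v ~ N(0, R). Returns the smoothed state sequence,
    which assimilates both past and future data.
    """
    T = len(y)
    xp = np.zeros(T); Pp = np.zeros(T)   # one-step predictions
    xf = np.zeros(T); Pf = np.zeros(T)   # filtered estimates
    x, P = x0, P0
    for t in range(T):
        xp[t] = A * x                            # predict
        Pp[t] = A * P * A + Q
        K = Pp[t] * C / (C * Pp[t] * C + R)      # Kalman gain
        x = xp[t] + K * (y[t] - C * xp[t])       # measurement update
        P = (1.0 - K * C) * Pp[t]
        xf[t], Pf[t] = x, P
    xs = xf.copy()
    for t in range(T - 2, -1, -1):               # backward RTS pass
        J = Pf[t] * A / Pp[t + 1]
        xs[t] = xf[t] + J * (xs[t + 1] - xp[t + 1])
    return xs

# Toy usage: smooth a noisy sinusoid with a random-walk state model.
rng = np.random.default_rng(2)
y = np.sin(np.linspace(0.0, 3.0, 200)) + 0.3 * rng.normal(size=200)
x_smooth = kalman_rts(y, Q=0.01, R=0.1)
```

In the paper this machinery is vector-valued over thousands of source locations, and the model parameters themselves are estimated within the EM iterations.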
Affiliation(s)
- Camilo Lamus
- Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, USA
7
Wright JJ. Cortical phase transitions: properties demonstrated in continuum simulations at mesoscopic and macroscopic scales. ACTA ACUST UNITED AC 2011. [DOI: 10.1142/s1793005709001210] [Citation(s) in RCA: 3]
Abstract
Continuum simulations of cortical dynamics permit consistent simulations to be performed at different spatial scales, using scale-adjusted parameter values. Properties of the simulations described here accord with Freeman's experimental and theoretical findings on gamma synchrony, phase transition, phase cones, and null spikes. State equations include effects of retrograde action potential propagation into dendritic trees, and kinetics of AMPA, GABA, and NMDA receptors. Realistic field potentials and pulse rates, gamma resonance and oscillation, and 1/f² background activity are obtained. Zero-lag synchrony and traveling waves occur as complementary aspects of cortical transmission, and lead/lag relations between excitatory and inhibitory cell populations vary systematically around transition to autonomous gamma oscillation. Autonomous gamma is initiated by focal excitation of excitatory cells and suppressed by laterally spreading trans-cortical excitation. By implication, patches of cortex excited to gamma oscillation can mutually synchronize into larger fields, self-organized into sequences by mutual negative feedback relations, while the sequence of synchronous fields is regulated both by cortical/subcortical interactions and by traveling waves in the cortex — the latter observable as phase cones. At a critical level of cortical excitation, just before transition to autonomous gamma, patches of cortex exhibit selective sensitivity to action potential pulse trains modulated in the gamma band, while autonomous gamma releases pulse trains modulated in the same band, implying coupling of input and output modes. Transition between input and output modes may be heralded by phase slips and null spikes. Synaptic segregation by retrograde action potential propagation implies state-specific synaptic information storage.
Affiliation(s)
- J. J. Wright
- Liggins Institute, and Department of Psychological Medicine, Faculty of Medicine and Health Sciences, University of Auckland, Grafton Road, Auckland, New Zealand
- Brain Dynamics Centre, University of Sydney, Acacia House, Westmead Hospital, Hawkesbury Road, Westmead NSW 2145, Australia
8
Barutta J, Aravena P, Ibáñez A. The Machine Paradigm and Alternative Approaches in Cognitive Science. Integr Psychol Behav Sci 2010; 44:176-83. [DOI: 10.1007/s12124-010-9116-9] [Citation(s) in RCA: 7]
9
Ursino M, Cona F, Zavaglia M. The generation of rhythms within a cortical region: analysis of a neural mass model. Neuroimage 2010; 52:1080-94. [PMID: 20045071] [DOI: 10.1016/j.neuroimage.2009.12.084] [Citation(s) in RCA: 66]
Abstract
Rhythms in brain electrical activity are assumed to play a significant role in many cognitive and perceptual processes. It is thus of great value to analyze these rhythms and their mutual relationships in large scale models of cortical regions. In the present work, we modified the neural mass model by Wendling et al. (Eur. J. Neurosci. 15 (2002) 1499-1508) by including a new inhibitory self-loop among GABA_A,fast interneurons. A theoretical analysis was performed to demonstrate that, thanks to this loop, GABA_A,fast interneurons can produce a gamma rhythm in the power spectral density (PSD) even without the participation of the other neural populations. Then, the model of a whole cortical region, built upon four interconnected neural populations (pyramidal cells, excitatory, GABA_A,slow and GABA_A,fast interneurons) was investigated by changing the internal connectivity parameters. Results show that different rhythm combinations (beta and gamma, alpha and gamma, or a wide spectrum) can be obtained within the same region by simply altering connectivity values, without the need to change synaptic kinetics. Finally, two or three cortical regions were connected by using different topologies of long range connections. Results show that long-range connections directed from pyramidal neurons to GABA_A,fast interneurons are the most efficient to transmit rhythms from one region to another. In this way, PSD with three or four peaks can be obtained using simple connectivity patterns. The model can be of value to gain a deeper insight into the mechanisms involved in the generation of gamma rhythms and provide a better understanding of cortical EEG spectra.
Affiliation(s)
- Mauro Ursino
- Department of Electronics, Computer Science and Systems, University of Bologna, Bologna, Italy
10
Spiegler A, Kiebel SJ, Atay FM, Knösche TR. Bifurcation analysis of neural mass models: Impact of extrinsic inputs and dendritic time constants. Neuroimage 2010; 52:1041-58. [PMID: 20045068] [DOI: 10.1016/j.neuroimage.2009.12.081] [Citation(s) in RCA: 83]
Abstract
Neural mass models (NMMs) explain dynamics of neuronal populations and were designed to strike a balance between mathematical simplicity and biological plausibility. They are currently widely used as generative models for noninvasive electrophysiological brain measurements, that is, magneto- and electroencephalography (M/EEG). Here, we systematically describe the oscillatory regimes which an NMM of a single cortical source, with extrinsic input from other cortical and subcortical areas to each subpopulation, can explain. For this purpose, we used bifurcation analysis to describe qualitative changes in system behavior in response to quantitative input changes. This approach allowed us to describe sequences of oscillatory regimes, given some specific input trajectory. We systematically classified these sequential phenomena and mapped them into parameter space. Our analysis suggests a principled scheme of how complex M/EEG phenomena can be modeled parsimoniously on two time scales: while the system displays fast oscillations, it slowly traverses phase space to another qualitatively different oscillatory regime, depending on the input dynamics. The resulting scheme is useful for applications where one needs to model an ordered sequence of switching between qualitatively different oscillatory regimes, for example, in pharmacological interventions, epilepsy, sleep, or context-induced state changes.
Affiliation(s)
- Andreas Spiegler
- Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany
11
Mizraji E, Pomi A, Valle-Lisboa JC. Dynamic searching in the brain. Cogn Neurodyn 2009; 3:401-14. [PMID: 19496023] [PMCID: PMC2777191] [DOI: 10.1007/s11571-009-9084-2] [Citation(s) in RCA: 17]
Abstract
Cognitive functions rely on the extensive use of information stored in the brain, and searching for the information relevant to a given problem is a very complex task. Human cognition largely uses biological search engines, and we assume that to study cognitive function we need to understand how these brain search engines work. The approach we favor is to study multi-modular network models able to solve particular problems that involve searching for information. The building blocks of these multi-modular networks are the context-dependent memory models we have been using for almost 20 years. These models work by associating an output with the Kronecker product of an input and a context. Input, context and output are vectors that represent cognitive variables. Our models constitute a natural extension of the traditional linear associator. We show that coding the information in vectors that are processed through association matrices allows for direct contact between these memory models and some procedures that are now classical in the Information Retrieval field. One essential feature of context-dependent models is that they are based on the thematic packing of information, whereby each context points to a particular set of related concepts. The thematic packing can be extended to multi-modular networks involving input-output contexts, in order to accomplish more complex tasks. Contexts act as passwords that elicit the appropriate memory to deal with a query. We also show toy versions of several 'neuromimetic' devices that solve cognitive tasks as diverse as decision making and word sense disambiguation. The functioning of these multi-modular networks can be described as dynamical systems at the level of cognitive variables.
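The core association mechanism is easy to make concrete: a Hebbian-style matrix memory over Kronecker products of input and context vectors. The toy basis-vector patterns below are illustrative; the cited models build multi-modular networks on top of this primitive:

```python
import numpy as np

def train(pairs):
    """Hebbian-style associator: M = sum_k out_k (in_k (x) ctx_k)^T."""
    i0, c0, o0 = pairs[0]
    M = np.zeros((o0.size, i0.size * c0.size))
    for inp, ctx, out in pairs:
        M += np.outer(out, np.kron(inp, ctx))
    return M

def recall(M, inp, ctx):
    """The context selects which stored association the input retrieves."""
    return M @ np.kron(inp, ctx)

# The same input paired with two different contexts stores two associations.
e = np.eye(3)
ctx_a, ctx_b = np.eye(2)
M = train([(e[0], ctx_a, e[1]),    # input e0 in context A -> output e1
           (e[0], ctx_b, e[2])])   # input e0 in context B -> output e2
```

Because the two Kronecker products are orthogonal, the same input retrieves a different output under each context, which is the "contexts act as passwords" behavior described above.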
Affiliation(s)
- Eduardo Mizraji
- Group of Cognitive Systems Modeling, Biophysical Section, Facultad de Ciencias, Universidad de la República, Iguá 4225, Montevideo, 11400 Uruguay
- Andrés Pomi
- Group of Cognitive Systems Modeling, Biophysical Section, Facultad de Ciencias, Universidad de la República, Iguá 4225, Montevideo, 11400 Uruguay
- Juan C. Valle-Lisboa
- Group of Cognitive Systems Modeling, Biophysical Section, Facultad de Ciencias, Universidad de la República, Iguá 4225, Montevideo, 11400 Uruguay
12
beim Graben P, Potthast R. Inverse problems in dynamic cognitive modeling. Chaos (Woodbury, N.Y.) 2009; 19:015103. [PMID: 19335007] [DOI: 10.1063/1.3097067] [Citation(s) in RCA: 21]
Abstract
Inverse problems for dynamical system models of cognitive processes comprise the determination of synaptic weight matrices or kernel functions for neural networks or neural/dynamic field models, respectively. We introduce dynamic cognitive modeling as a three tier top-down approach where cognitive processes are first described as algorithms that operate on complex symbolic data structures. Second, symbolic expressions and operations are represented by states and transformations in abstract vector spaces. Third, prescribed trajectories through representation space are implemented in neurodynamical systems. We discuss the Amari equation for a neural/dynamic field theory as a special case and show that the kernel construction problem is particularly ill-posed. We suggest a Tikhonov-Hebbian learning method as regularization technique and demonstrate its validity and robustness for basic examples of cognitive computations.
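The regularized kernel-construction step can be sketched as ridge (Tikhonov) regression on discretized field states. This is a schematic of the idea under an assumed discretization, not the authors' exact Tikhonov-Hebbian construction:

```python
import numpy as np

def tikhonov_kernel(X, Y, eps=1e-3):
    """Estimate a discretized connectivity kernel W such that W @ X ~= Y.

    X : (n, m) matrix whose columns are sampled field states.
    Y : (n, m) matrix of the desired successor states.
    eps : Tikhonov regularization weight; the kernel construction problem is
          ill-posed, so the unregularized normal equations are unstable.
    """
    n = X.shape[0]
    return Y @ X.T @ np.linalg.inv(X @ X.T + eps * np.eye(n))

# Sanity check on synthetic data generated by a known kernel.
rng = np.random.default_rng(0)
W_true = rng.normal(size=(6, 6)) / 6
X = rng.normal(size=(6, 40))
W_hat = tikhonov_kernel(X, W_true @ X, eps=1e-8)
```

For small `eps` and well-conditioned data the generating kernel is recovered; in genuinely ill-posed settings `eps` trades reconstruction error against stability, which is the role regularization plays in the paper.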
Affiliation(s)
- Peter beim Graben
- School of Psychology and Clinical Language Sciences, University of Reading, Reading, Berkshire, United Kingdom
13
Wright JJ. Generation and control of cortical gamma: findings from simulation at two scales. Neural Netw 2008; 22:373-84. [PMID: 19095406] [DOI: 10.1016/j.neunet.2008.11.001] [Citation(s) in RCA: 16]
Abstract
A continuum model of electrocortical activity was applied separately at centimetric and macrocolumnar scales, permitting analysis of interaction between scales. State equations included effects of retrograde action potential propagation in dendritic trees, and kinetics of AMPA, GABA and NMDA receptors. Parameter values were provided from independent physiological and anatomical estimates. Realistic field potentials and pulse rates were obtained, including resonances in the alpha/theta and gamma ranges, 1/f² background activity, and autonomous gamma activity. Zero-lag synchrony and travelling waves occurred as complementary aspects of cortical transmission, and lead/lag relations between excitatory and inhibitory cell populations varied systematically around transition to autonomous gamma oscillation. Properties of the simulations can account for generation and control of gamma activity. All factors acting on excitatory/inhibitory balance controlled the onset and offset of gamma oscillation. Autonomous gamma was initiated by focal excitation of excitatory cells, and suppressed by laterally spreading trans-cortical excitation, which acted on both excitatory and inhibitory cell populations. Consequently, although spatially extensive non-specific reticular activation tended to suppress autonomous gamma, spatial variation of reticular activation could preferentially select fields of synchrony.
Affiliation(s)
- J J Wright
- Liggins Institute, and Department of Psychological Medicine, University of Auckland, Auckland, New Zealand
14
Marreiros AC, Kiebel SJ, Daunizeau J, Harrison LM, Friston KJ. Population dynamics under the Laplace assumption. Neuroimage 2008; 44:701-14. [PMID: 19013532] [DOI: 10.1016/j.neuroimage.2008.10.008] [Citation(s) in RCA: 65]
Abstract
In this paper, we describe a generic approach to modelling dynamics in neuronal populations. This approach models a full density on the states of neuronal populations but finesses this high-dimensional problem by re-formulating density dynamics in terms of ordinary differential equations on the sufficient statistics of the densities considered (cf. the method of moments). The particular form for the population density we adopt is a Gaussian density (cf. the Laplace assumption). This means population dynamics are described by equations governing the evolution of the population's mean and covariance. We derive these equations from the Fokker-Planck formalism and illustrate their application to a conductance-based model of neuronal exchanges. One interesting aspect of this formulation is that we can uncouple the mean and covariance to furnish a neural-mass model, which rests only on the population's mean. This enables us to compare equivalent mean-field and neural-mass models of the same populations and evaluate, quantitatively, the contribution of population variance to the expected dynamics. The mean-field model presented here will form the basis of a dynamic causal model of observed electromagnetic signals in future work.
Affiliation(s)
- André C Marreiros
- The Wellcome Trust Centre for Neuroimaging, Institute of Neurology, University College London, London, UK.
15
Abstract
There is growing evidence in favor of the temporal-coding hypothesis that temporal correlation of neuronal discharges may serve to bind distributed neuronal activity into unique representations and, in particular, that theta (3.5-7.5 Hz) and delta (0.5-3.5 Hz) oscillations facilitate information coding. The theta- and delta-rhythms are shown to be involved in various sleep stages, and during anesthesia, they undergo changes with the depth of anesthesia. We introduce a thalamocortical model of interacting neuronal ensembles to describe phase relationships between theta- and delta-oscillations, especially during deep and light anesthesia. Asymmetric and long-range interactions among the thalamocortical neuronal oscillators are taken into account. The model results are compared with experimental observations. The delta- and theta-activities are found to be separately generated and are governed by the thalamus and cortex, respectively. Changes in the degree of intraensemble and interensemble synchrony imply that the neuronal ensembles inhibit information coding during deep anesthesia and facilitate it during light anesthesia.
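The thalamocortical model itself is not reproduced here, but the notion of ensemble synchrony it manipulates can be illustrated with a minimal Kuramoto-type phase-oscillator sketch; the function names and all parameter values below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def order_parameter(phases):
    """Kuramoto order parameter |<e^{i*theta}>|: 0 = incoherent, 1 = full synchrony."""
    return np.abs(np.exp(1j * phases).mean())

def simulate(n=50, coupling=2.0, n_steps=5000, dt=1e-2, seed=0):
    """Euler-integrate n phase oscillators pulled toward the ensemble mean phase."""
    rng = np.random.default_rng(seed)
    omega = rng.normal(2 * np.pi, 0.5, n)      # heterogeneous natural frequencies
    theta = rng.uniform(0, 2 * np.pi, n)       # random initial phases
    for _ in range(n_steps):
        mean_field = np.exp(1j * theta).mean()
        # each oscillator is attracted to the ensemble phase, scaled by coherence
        theta += (omega + coupling * np.abs(mean_field)
                  * np.sin(np.angle(mean_field) - theta)) * dt
    return order_parameter(theta)

r_weak = simulate(coupling=0.2)    # below critical coupling: incoherent
r_strong = simulate(coupling=3.0)  # above critical coupling: synchronized
```

Sweeping the coupling strength moves the ensemble between incoherent and synchronized regimes, a crude analogue of the synchrony changes across anesthetic depth discussed above.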
16
Wennekers T. Tuned solutions in dynamic neural fields as building blocks for extended EEG models. Cogn Neurodyn 2008; 2:137-46. [PMID: 19003480 DOI: 10.1007/s11571-008-9045-1] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/02/2008] [Accepted: 03/26/2008] [Indexed: 11/29/2022] Open
Abstract
The most prominent functional property of cortical neurons in sensory areas is their tuned receptive fields, which provide specific responses of the neurons to external stimuli. Tuned neural firing indeed reflects the most basic and best worked out level of cognitive representations. Tuning properties can be dynamic on a short time-scale of fractions of a second. Such dynamic effects have been modeled by localised solutions (also called "bumps" or "peaks") in dynamic neural fields. In the present work we develop an approximation method to reduce the dynamics of localised activation peaks in systems of n coupled nonlinear d-dimensional neural fields with transmission delays to a small set of delay differential equations for the peak amplitudes and widths only. The method considerably simplifies the analysis of peaked solutions as demonstrated for a two-dimensional example model of neural feature selectivity in the brain. The reduced equations describe the effective interaction between pools of local neurons of several (n) classes that participate in shaping the dynamic receptive field responses. To lowest order they resemble neural mass models as they often form the basis of EEG models. Thereby they provide a link between functional small-scale receptive field models and more coarse-grained EEG models. More specifically, they connect the dynamics in feature-selective cortical microcircuits to the more abstract local elements used in coarse-grained models. However, besides amplitudes the reduced equations also reflect the sharpness of tuning of the activity in a d-dimensional feature space in response to localised stimuli.
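The abstract notes that, to lowest order, the reduced peak-amplitude equations resemble neural mass models with transmission delays. A generic form of such delay differential equations (symbols are generic, not the paper's notation) is:

```latex
\tau_i\,\dot{a}_i(t) = -a_i(t) + \sum_{j=1}^{n} w_{ij}\, g\big(a_j(t - d_{ij})\big),
\qquad i = 1, \dots, n
```

Here a_i is the amplitude of the localised peak in field i, w_ij and d_ij are inter-field coupling weights and transmission delays, and g is a sigmoidal gain function.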
Affiliation(s)
- Thomas Wennekers
- Centre for Theoretical and Computational Neuroscience, University of Plymouth, Drake Circus, PL4 8AA, UK.
17
beim Graben P, Kurths J. Simulating global properties of electroencephalograms with minimal random neural networks. Neurocomputing 2008. [DOI: 10.1016/j.neucom.2007.02.007] [Citation(s) in RCA: 17] [Impact Index Per Article: 1.1] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/28/2022]
18
Moran RJ, Kiebel SJ, Stephan KE, Reilly RB, Daunizeau J, Friston KJ. A neural mass model of spectral responses in electrophysiology. Neuroimage 2007; 37:706-20. [PMID: 17632015 PMCID: PMC2644418 DOI: 10.1016/j.neuroimage.2007.05.032] [Citation(s) in RCA: 138] [Impact Index Per Article: 8.1] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/18/2006] [Revised: 05/01/2007] [Accepted: 05/07/2007] [Indexed: 11/29/2022] Open
Abstract
We present a neural mass model of steady-state membrane potentials measured with local field potentials or electroencephalography in the frequency domain. This model is an extended version of previous dynamic causal models for investigating event-related potentials in the time-domain. In this paper, we augment the previous formulation with parameters that mediate spike-rate adaptation and recurrent intrinsic inhibitory connections. We then use linear systems analysis to show how the model's spectral response changes with its neurophysiological parameters. We demonstrate that much of the interesting behaviour depends on the non-linearity which couples mean membrane potential to mean spiking rate. This non-linearity is analogous, at the population level, to the firing rate–input curves often used to characterize single-cell responses. This function depends on the model's gain and adaptation currents which, neurobiologically, are influenced by the activity of modulatory neurotransmitters. The key contribution of this paper is to show how neuromodulatory effects can be modelled by adding adaptation currents to a simple phenomenological model of EEG. Critically, we show that these effects are expressed in a systematic way in the spectral density of EEG recordings. Inversion of the model, given such non-invasive recordings, should allow one to quantify pharmacologically induced changes in adaptation currents. In short, this work establishes a forward or generative model of electrophysiological recordings for psychopharmacological studies.
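As a minimal illustration of the linear-systems view taken in this abstract, consider the standard second-order synaptic kernel h(t) = H*kappa*t*exp(-kappa*t) used in neural mass models of this family, whose transfer function is H(s) = H*kappa/(s + kappa)^2, so the spectral response can be read off directly. The parameter values below are illustrative defaults, not the paper's fitted estimates:

```python
import numpy as np

def alpha_kernel_spectrum(freqs_hz, H=3.25, kappa=100.0):
    """Power spectrum |H(jw)|^2 of the second-order synaptic kernel
    h(t) = H*kappa*t*exp(-kappa*t), evaluated at s = j*2*pi*f."""
    w = 2 * np.pi * np.asarray(freqs_hz)
    tf = H * kappa / (1j * w + kappa) ** 2   # Laplace transform on the imaginary axis
    return np.abs(tf) ** 2

freqs = np.linspace(1, 100, 200)
spec = alpha_kernel_spectrum(freqs)
# The kernel is a low-pass filter: power falls monotonically with frequency.
# Changing kappa (the rate constant) reshapes the spectrum, which is the kind
# of parameter-to-spectrum mapping the paper exploits for model inversion.
```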
Affiliation(s)
- R J Moran
- The Wellcome Trust Centre for Neuroimaging, Institute of Neurology, University College London, 12 Queen Square, London, WC1N 3BG, UK.
19
Wright JJ, Alexander DM, Bourke PD. Contribution of lateral interactions in V1 to organization of response properties. Vision Res 2006; 46:2703-20. [PMID: 16600322 DOI: 10.1016/j.visres.2006.02.017] [Citation(s) in RCA: 15] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/15/2005] [Revised: 12/04/2005] [Accepted: 02/14/2006] [Indexed: 11/30/2022]
Abstract
We propose a model of self-organization of synaptic connections in V1, emphasizing lateral interactions. Subject to Hebbian learning with decay, evolution of synaptic strengths proceeds to a stable state in which all synapses are either saturated, or have minimum pre/post-synaptic coincidence. The most stable configuration gives rise to anatomically realistic "local maps", each of macro-columnar size, and each organized as Möbius projections of retinotopic space. A tiling of V1, constructed of approximately mirror-image reflections of each local map by its neighbors, is formed, accounting for orientation-preference singularities, linear zones, and saddle points, with each map linked by connections between sites of common orientation preference. Ocular dominance columns are partly explained as a special case of the same process. The occurrence of direction preference fractures always in odd numbers around singularities is a specific feature explained by the Möbius configuration of the local map. Effects of stimulus velocity, orientation relative to direction of motion, and extension, upon orientation preference, which are not accounted for by spatial filtering, are explained by interactions between the classic receptive field and global V1.
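The fixed-point structure claimed above (synapses either saturate or settle at a low equilibrium set by pre/post coincidence) can be sketched with a clipped Hebbian rule with decay; the rule and all parameter values are illustrative assumptions, not the paper's learning equations:

```python
import numpy as np

def hebb_with_decay(corr, n_steps=2000, eta=0.5, decay=0.05,
                    w_min=0.0, w_max=1.0, seed=0):
    """Iterate dw = eta*corr - decay*w with clipping to [w_min, w_max].
    If eta*corr > decay*w_max, the weight's unclipped fixed point lies
    above w_max and the synapse saturates; otherwise it settles at the
    low equilibrium eta*corr/decay."""
    rng = np.random.default_rng(seed)
    w = rng.uniform(w_min, w_max, size=np.asarray(corr).shape)
    for _ in range(n_steps):
        w = np.clip(w + eta * np.asarray(corr) - decay * w, w_min, w_max)
    return w

corr = np.array([0.2, 0.01])   # high vs low pre/post-synaptic coincidence
w = hebb_with_decay(corr)      # first weight saturates, second stays small
```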
Affiliation(s)
- J J Wright
- Liggins Institute, University of Auckland, Auckland, New Zealand.
20
Abstract
Cortical activity is the product of interactions among neuronal populations. Macroscopic electrophysiological phenomena are generated by these interactions. In principle, the mechanisms of these interactions afford constraints on biologically plausible models of electrophysiological responses. In other words, the macroscopic features of cortical activity can be modelled in terms of the microscopic behaviour of neurons. An evoked response potential (ERP) is the mean electrical potential measured from an electrode on the scalp, in response to some event. The purpose of this paper is to outline a population density approach to modelling ERPs. We propose a biologically plausible model of neuronal activity that enables the estimation of physiologically meaningful parameters from electrophysiological data. The model encompasses four basic characteristics of neuronal activity and organization: (i) neurons are dynamic units, (ii) driven by stochastic forces, (iii) organized into populations with similar biophysical properties and response characteristics and (iv) multiple populations interact to form functional networks. This leads to a formulation of population dynamics in terms of the Fokker-Planck equation. The solution of this equation is the temporal evolution of a probability density over state-space, representing the distribution of an ensemble of trajectories. Each trajectory corresponds to the changing state of a neuron. Measurements can be modelled by taking expectations over this density, e.g. mean membrane potential, firing rate or energy consumption per neuron. The key motivation behind our approach is that ERPs represent an average response over many neurons. This means it is sufficient to model the probability density over neurons, because this implicitly models their average state. Although the dynamics of each neuron can be highly stochastic, the dynamics of the density is not. 
This means we can use Bayesian inference and estimation tools that have already been established for deterministic systems. The potential importance of modelling density dynamics (as opposed to more conventional neural mass models) is that they include interactions among the moments of neuronal states (e.g. the mean depolarization may depend on the variance of synaptic currents through nonlinear mechanisms). Here, we formulate a population model, based on biologically informed model-neurons with spike-rate adaptation and synaptic dynamics. Neuronal sub-populations are coupled to form an observation model, with the aim of estimating and making inferences about coupling among sub-populations using real data. We approximate the time-dependent solution of the system using a bi-orthogonal set and first-order perturbation expansion. For didactic purposes, the model is developed first in the context of deterministic input, and then extended to include stochastic effects. The approach is demonstrated using synthetic data, where model parameters are identified using a Bayesian estimation scheme we have described previously.
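The population density approach described in this abstract rests on the Fokker-Planck equation for the density p(x, t) over neuronal states, with measurements modelled as expectations over that density. In generic symbols (flow f, diffusion matrix D, observable g):

```latex
\frac{\partial p(x,t)}{\partial t}
= -\nabla \cdot \big(f(x)\,p(x,t)\big)
+ \tfrac{1}{2} \sum_{i,j} \frac{\partial^{2}}{\partial x_i\,\partial x_j}\big(D_{ij}\,p(x,t)\big),
\qquad
\langle g \rangle(t) = \int g(x)\,p(x,t)\,\mathrm{d}x
```

The first (drift) term transports probability along each neuron's deterministic flow; the second (diffusion) term spreads it under the stochastic forces. Mean membrane potential or firing rate then follow by choosing the appropriate g.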
Affiliation(s)
- L M Harrison
- The Wellcome Department of Imaging Neuroscience, Institute of Neurology, UCL, 12 Queen Square, London WC1N 3BG, UK.
21
Burke DP, de Paor AM. A stochastic limit cycle oscillator model of the EEG. Biol Cybern 2004; 91:221-230. [PMID: 15378376 DOI: 10.1007/s00422-004-0509-z] [Citation(s) in RCA: 7] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 04/23/2003] [Accepted: 07/15/2004] [Indexed: 05/24/2023]
Abstract
We present an empirical model of the electroencephalogram (EEG) signal based on the construction of a stochastic limit cycle oscillator using Ito calculus. This formulation, where the noise influences actually interact with the dynamics, is substantially different from the usual definition of measurement noise. Analysis of model data is compared with actual EEG data using both traditional methods and modern techniques from nonlinear time series analysis. The model produces patterns and statistics that are visually similar to actual EEG data. In addition, the nonlinear mechanisms underlying the dynamics of the model do not manifest themselves in nonlinear time series analysis, paralleling the situation with real, non-pathological EEG data. This modeling exercise suggests that the EEG is optimally described by stochastic limit cycle behavior.
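The paper's exact Ito formulation is not reproduced here, but the key idea, noise entering the state equations rather than the measurement, can be sketched with Euler-Maruyama integration of a noisy Hopf-type limit cycle oscillator; the model form and all parameter values below are illustrative assumptions:

```python
import numpy as np

def stochastic_limit_cycle(n_steps=20000, dt=1e-3, lam=1.0,
                           omega=2 * np.pi * 10, sigma=0.1, seed=0):
    """Euler-Maruyama integration of a noisy Hopf oscillator.
    The Wiener increments dW enter the state updates directly, so the
    noise interacts with the nonlinear dynamics instead of being added
    to the output as measurement noise."""
    rng = np.random.default_rng(seed)
    x, y = 1.0, 0.0
    xs = np.empty(n_steps)
    for k in range(n_steps):
        r2 = x * x + y * y                     # squared distance from the origin
        dW = rng.normal(0.0, np.sqrt(dt), size=2)
        x += ((lam - r2) * x - omega * y) * dt + sigma * dW[0]
        y += ((lam - r2) * y + omega * x) * dt + sigma * dW[1]
        xs[k] = x
    return xs

trace = stochastic_limit_cycle()
# The deterministic part attracts trajectories to a cycle of radius
# sqrt(lam); the noise perturbs the trajectory on the cycle itself.
```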
Affiliation(s)
- D P Burke
- Department of Electrical and Electronic Engineering, National University of Ireland, Dublin, Dublin 4, Belfield, Ireland.