1
Labay-Mora A, García-Beni J, Giorgi GL, Soriano MC, Zambrini R. Neural networks with quantum states of light. Philos Trans A Math Phys Eng Sci 2024; 382:20230346. PMID: 39717979. DOI: 10.1098/rsta.2023.0346.
Abstract
Quantum optical networks are instrumental in addressing fundamental questions and enabling applications ranging from communication to computation and, more recently, machine learning (ML). In particular, photonic artificial neural networks (ANNs) offer the opportunity to exploit the advantages of both classical and quantum optics. Photonic neuro-inspired computation and ML have been successfully demonstrated in classical settings, while quantum optical networks have triggered breakthrough applications such as teleportation, quantum key distribution and quantum computing. We present a perspective on the state of the art in quantum optical ML and the potential advantages of ANNs in circuit designs and beyond, in more general, analogue settings characterized by recurrent and coherent complex interactions. We consider two analogue neuro-inspired applications, namely quantum reservoir computing and quantum associative memories, and discuss the enhanced capabilities offered by quantum substrates, highlighting the specific role of light squeezing in this context. This article is part of the theme issue 'The quantum theory of light'.
Affiliation(s)
- Adrià Labay-Mora: Institute for Cross-Disciplinary Physics and Complex Systems (IFISC) UIB-CSIC, Campus Universitat Illes Balears, Palma de Mallorca 07122, Spain
- Jorge García-Beni: Institute for Cross-Disciplinary Physics and Complex Systems (IFISC) UIB-CSIC, Campus Universitat Illes Balears, Palma de Mallorca 07122, Spain
- Gian Luca Giorgi: Institute for Cross-Disciplinary Physics and Complex Systems (IFISC) UIB-CSIC, Campus Universitat Illes Balears, Palma de Mallorca 07122, Spain
- Miguel C Soriano: Institute for Cross-Disciplinary Physics and Complex Systems (IFISC) UIB-CSIC, Campus Universitat Illes Balears, Palma de Mallorca 07122, Spain
- Roberta Zambrini: Institute for Cross-Disciplinary Physics and Complex Systems (IFISC) UIB-CSIC, Campus Universitat Illes Balears, Palma de Mallorca 07122, Spain
2
Bardella G, Franchini S, Pani P, Ferraina S. Lattice physics approaches for neural networks. iScience 2024; 27:111390. PMID: 39679297. PMCID: PMC11638618. DOI: 10.1016/j.isci.2024.111390.
Abstract
Modern neuroscience has evolved into a frontier field that draws on numerous disciplines, resulting in the flourishing of novel conceptual frames primarily inspired by physics and complex systems science. Contributing in this direction, we recently introduced a mathematical framework to describe the spatiotemporal interactions of systems of neurons using lattice field theory, the reference paradigm for theoretical particle physics. In this note, we provide a concise summary of the basics of the theory, aiming to be intuitive to the interdisciplinary neuroscience community. We contextualize our methods, illustrating how to readily connect the parameters of our formulation to experimental variables using well-known renormalization procedures. This synopsis yields the key concepts needed to describe neural networks using lattice physics. Such classes of methods are attention-worthy in an era of blistering improvements in numerical computations, as they can facilitate relating the observation of neural activity to generative models underpinned by physical principles.
Affiliation(s)
- Giampiero Bardella: Department of Physiology and Pharmacology, Sapienza University of Rome, Rome, Italy
- Simone Franchini: Department of Physiology and Pharmacology, Sapienza University of Rome, Rome, Italy
- Pierpaolo Pani: Department of Physiology and Pharmacology, Sapienza University of Rome, Rome, Italy
- Stefano Ferraina: Department of Physiology and Pharmacology, Sapienza University of Rome, Rome, Italy
3
Paliwal S, Ocker GK, Brinkman BAW. Metastability in networks of nonlinear stochastic integrate-and-fire neurons. arXiv [Preprint] 2024: arXiv:2406.07445v2. PMID: 38947936. PMCID: PMC11213153.
Abstract
Neurons in the brain continuously process the barrage of sensory inputs they receive from the environment. A wide array of experimental work has shown that the collective activity of neural populations encodes and processes this constant bombardment of information. How these collective patterns of activity depend on single-neuron properties is often unclear. Single-neuron recordings have shown that individual neurons' responses to inputs are nonlinear, which prevents a straightforward extrapolation from single-neuron features to emergent collective states. Here, we use a field-theoretic formulation of a stochastic leaky integrate-and-fire model to study the impact of single-neuron nonlinearities on macroscopic network activity. In this model, a neuron integrates spiking output from other neurons in its membrane voltage and emits spikes stochastically with an intensity depending on the membrane voltage, after which the voltage resets. We show that the interplay between nonlinear spike intensity functions and membrane potential resets can (i) give rise to metastable active firing rate states in recurrent networks and (ii) enhance or suppress mean firing rates and membrane potentials in the same or paradoxically opposite directions.
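To make the model class concrete, below is a minimal Python sketch (not the authors' code; all parameter values are illustrative assumptions) of a recurrent network of stochastic leaky integrate-and-fire neurons with an escape-noise spike mechanism: each neuron's voltage integrates recurrent spikes, spikes are emitted stochastically with a nonlinear voltage-dependent intensity, and the voltage resets after a spike.

```python
import numpy as np

rng = np.random.default_rng(0)

# Network and single-neuron parameters (illustrative assumptions)
N, dt, steps = 200, 1e-3, 2000          # neurons, time step (s), simulation steps
tau, v_reset, mu = 0.02, 0.0, 0.8       # membrane time constant (s), reset, drive
J = rng.normal(0.0, 2.0 / np.sqrt(N), size=(N, N))   # random recurrent couplings

def intensity(v, gain=50.0, theta=1.0, width=0.1):
    # Nonlinear spike intensity (hazard) as a function of membrane voltage
    return gain / (1.0 + np.exp(-(v - theta) / width))

v = np.zeros(N)                          # membrane voltages
spk = np.zeros(N)                        # spike indicators from the previous step
rates = []
for _ in range(steps):
    # Leaky integration of the external drive plus recurrent spike input
    v += dt * (-(v - mu) / tau) + J @ spk
    # Stochastic spike emission with probability intensity(v) * dt
    spk = (rng.random(N) < intensity(v) * dt).astype(float)
    v[spk > 0] = v_reset                 # membrane potential reset after a spike
    rates.append(spk.mean() / dt)        # instantaneous population rate (Hz)

print("mean population rate (Hz):", round(float(np.mean(rates)), 1))
```

The metastable switching analyzed in the paper arises from the interplay of the nonlinear intensity and the reset; this sketch only reproduces the basic simulation loop.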
Affiliation(s)
- Siddharth Paliwal: Graduate Program in Neuroscience, Stony Brook University, Stony Brook, NY, 11794, USA
- Gabriel Koch Ocker: Department of Mathematics and Statistics, Boston University, Boston, MA, 02215, USA
- Braden A. W. Brinkman: Department of Neurobiology and Behavior, Stony Brook University, Stony Brook, NY, 11794, USA
4
Hancock F, Rosas FE, Luppi AI, Zhang M, Mediano PAM, Cabral J, Deco G, Kringelbach ML, Breakspear M, Kelso JAS, Turkheimer FE. Metastability demystified - the foundational past, the pragmatic present and the promising future. Nat Rev Neurosci 2024. PMID: 39663408. DOI: 10.1038/s41583-024-00883-1.
Abstract
Healthy brain function depends on balancing stable integration between brain areas for effective coordinated functioning, with coexisting segregation that allows subsystems to express their functional specialization. Metastability, a concept from the dynamical systems literature, has been proposed as a key signature that characterizes this balance. Building on this principle, the neuroscience literature has leveraged the phenomenon of metastability to investigate various aspects of brain function in health and disease. However, this body of work often uses the notion of metastability heuristically, and sometimes inaccurately, making it difficult to navigate the vast literature, interpret findings and foster further development of theoretical and experimental methodologies. Here, we provide a comprehensive review of metastability and its applications in neuroscience, covering its scientific and historical foundations and the practical measures used to assess it in empirical data. We also provide a critical analysis of recent theoretical developments, clarifying common misconceptions and paving the road for future developments.
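As a concrete point of reference for the practical measures mentioned above, the sketch below computes one widely used empirical index of metastability, the standard deviation over time of the Kuramoto order parameter. This is only one of several measures discussed in the review, and the surrogate data here are placeholders.

```python
import numpy as np
from scipy.signal import hilbert

def metastability_index(signals):
    """Std over time of the Kuramoto order parameter R(t), computed from the
    instantaneous Hilbert phases of (ideally band-pass-filtered) regional
    signals of shape (n_regions, n_timepoints). Returns (mean R, std of R)."""
    phases = np.angle(hilbert(signals, axis=1))
    r_t = np.abs(np.mean(np.exp(1j * phases), axis=0))   # synchrony at each time point
    return float(r_t.mean()), float(r_t.std())

# Toy usage with placeholder surrogate data (10 regions x 5000 time points)
rng = np.random.default_rng(1)
x = np.cumsum(rng.standard_normal((10, 5000)), axis=1)
sync, meta = metastability_index(x)
print(f"mean synchrony R = {sync:.3f}, metastability (std of R) = {meta:.3f}")
```

A large std of R(t) indicates that the system dwells in, and switches between, epochs of high and low synchrony rather than settling into either extreme.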
Affiliation(s)
- Fran Hancock: Department of Neuroimaging, Institute of Psychiatry, Psychology and Neuroscience, King's College London, London, UK
- Fernando E Rosas: Department of Informatics, University of Sussex, Brighton, UK; Sussex Centre for Consciousness Science, University of Sussex, Brighton, UK; Centre for Psychedelic Research, Department of Brain Science, Imperial College London, London, UK; Centre for Eudaimonia and Human Flourishing, University of Oxford, Oxford, UK; Sussex AI, University of Sussex, Brighton, UK; Centre for Complexity Science, Department of Brain Science, Imperial College London, London, UK
- Andrea I Luppi: Centre for Eudaimonia and Human Flourishing, University of Oxford, Oxford, UK; St John's College, University of Cambridge, Cambridge, UK; Department of Psychiatry, University of Oxford, Oxford, UK
- Mengsen Zhang: Department of Computational Mathematics, Science and Engineering, Michigan State University, East Lansing, MI, USA
- Pedro A M Mediano: Department of Computing, Imperial College London, London, UK; Division of Psychology and Language Sciences, University College London, London, UK
- Joana Cabral: Centre for Eudaimonia and Human Flourishing, University of Oxford, Oxford, UK; Life and Health Sciences Research Institute School of Medicine, University of Minho, Braga, Portugal
- Gustavo Deco: Computational Neuroscience Group, Center for Brain and Cognition, Department of Information and Communication Technologies, Universitat Pompeu Fabra, Barcelona, Spain; Institución Catalana de la Recerca i Estudis Avancats (ICREA), Barcelona, Spain; Department of Neuropsychology, Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany; School of Psychological Sciences, Monash University Clayton, Melbourne, Victoria, Australia
- Morten L Kringelbach: Centre for Eudaimonia and Human Flourishing, University of Oxford, Oxford, UK; Center for Music in the Brain, Department of Clinical Medicine, Aarhus University, Aarhus, Denmark
- Michael Breakspear: School of Psychological Sciences, College of Engineering, Science and the Environment, University of Newcastle, Newcastle, New South Wales, Australia
- J A Scott Kelso: Center for Complex Systems and Brain Sciences, Florida Atlantic University, Boca Raton, FL, USA; Intelligent Systems Research Centre, Ulster University, Derry~Londonderry, Northern Ireland; The Bath Institute for the Augmented Human, University of Bath, Bath, UK
- Federico E Turkheimer: Department of Neuroimaging, Institute of Psychiatry, Psychology and Neuroscience, King's College London, London, UK; The Institute for Human and Synthetic Minds, King's College London, London, UK
5
Morozov A, Feudel U, Hastings A, Abbott KC, Cuddington K, Heggerud CM, Petrovskii S. Long-living transients in ecological models: Recent progress, new challenges, and open questions. Phys Life Rev 2024; 51:423-441. PMID: 39581175. DOI: 10.1016/j.plrev.2024.11.004.
Abstract
Traditionally, mathematical models in ecology placed an emphasis on asymptotic, long-term dynamics. However, a large number of recent studies have highlighted the importance of transient dynamics in ecological and eco-evolutionary systems, in particular 'long transients' that can last for hundreds of generations or even longer. Many models as well as empirical studies indicate that a system can function for a long time in a certain state or regime (a 'metastable regime') but later exhibit an abrupt transition to another regime not preceded by any parameter change (or following a change that occurred long before the transition). This scenario, in which tipping occurs without any apparent source of a regime shift, is also referred to as 'metastability'. Despite considerable evidence of the presence of long transients in real-world systems as well as models, until recently research into long-living transients in ecology remained in its infancy, largely lacking systematisation. Within the past decade, however, substantial progress has been made in creating a unifying theory of long transients in deterministic as well as stochastic systems. This has considerably accelerated further studies on long transients, in particular on those characterised by more complicated patterns and/or underlying mechanisms. The main goal of this review is to provide an overview of recent research on long transients and related regime shifts in models of ecological dynamics. We pay special attention to the role of environmental stochasticity, the effect of multiple timescales (slow-fast systems), transient spatial patterns, and the relation between transients and spatial synchronisation. We also discuss current challenges and open questions in understanding transients, with applications to ecosystem dynamics.
Affiliation(s)
- Andrew Morozov: School of Computing and Mathematical Sciences, University of Leicester, LE1 7RH, UK; Institute of Ecology and Evolution, Russian Academy of Sciences, Moscow, Russia
- Ulrike Feudel: Institute for Chemistry and Biology of the Marine Environment, Carl von Ossietzky University Oldenburg, Oldenburg, Germany
- Alan Hastings: Department of Environmental Science and Policy, University of California, Davis, USA; Santa Fe Institute, Santa Fe, New Mexico, USA
- Karen C Abbott: Department of Biology, Case Western Reserve University, Cleveland, OH 44106, USA
- Kim Cuddington: Department of Biology, University of Waterloo, Waterloo, Ontario, Canada
- Sergei Petrovskii: School of Computing and Mathematical Sciences, Institute for Environmental Futures, University of Leicester, LE1 7RH, UK; Peoples Friendship University of Russia (RUDN University), 6 Miklukho-Maklaya Str., 117198 Moscow, Russia
6
White SM, Morningstar MD, De Falco E, Linsenbardt DN, Ma B, Parks MA, Czachowski CL, Lapish CC. Impulsive Choices Emerge When the Anterior Cingulate Cortex Fails to Encode Deliberative Strategies. eNeuro 2024; 11:ENEURO.0379-24.2024. PMID: 39557563. PMCID: PMC11573491. DOI: 10.1523/eneuro.0379-24.2024.
Abstract
Impulsive individuals excessively discount the value of delayed rewards, and this is thought to reflect deficits in brain regions critical for impulse control such as the anterior cingulate cortex (ACC). Delay discounting (DD) is an established measure of cognitive impulsivity, referring to the devaluation of rewards delayed in time. This study used male Wistar rats performing a DD task to test the hypothesis that neural activity states in ACC ensembles encode strategies that guide decision-making. Optogenetic silencing of ACC neurons exclusively increased impulsive choices at the 8 s delay by increasing the number of consecutive low-value, immediate choices. In contrast to shorter delays, where animals preferred the delay option, no immediate or delay preference was detected at 8 s. These data suggest that ACC was critical for decisions requiring more deliberation between choice options. To address the role of ACC in this process, large-scale multiple single-unit recordings were performed and revealed that the 4 and 8 s delays were associated with procedural versus deliberative neural encoding mechanisms, respectively. The two delays also differed in the encoding of strategy corresponding to immediate and delay run termination. Specifically, neural ensemble states at the 4 s delay were relatively stable throughout the choice, whereas during the 8 s delay they exhibited a temporal evolution in state space during the choice epoch that resembled ramping. Collectively, these findings indicate that ensemble states in ACC facilitate strategies that guide decision-making, and impulsivity increases with disruptions of deliberative encoding mechanisms.
Affiliation(s)
- Shelby M White: Psychology Department, Indiana University-Purdue University, Indianapolis, Indiana 46202
- Mitchell D Morningstar: Psychology Department, Indiana University-Purdue University, Indianapolis, Indiana 46202
- Emanuela De Falco: Neuroscience, EPFL Center for Neuroprosthetics, Lausanne 1015, Switzerland
- David N Linsenbardt: Department of Neurosciences, University of New Mexico, Albuquerque, New Mexico 87131
- Baofeng Ma: Department of Anatomy, Cell Biology, and Physiology, Stark Neuroscience Institute, Indianapolis, Indiana 46202
- Macedonia A Parks: Psychology Department, Indiana University-Purdue University, Indianapolis, Indiana 46202
- Cristine L Czachowski: Psychology Department, Indiana University-Purdue University, Indianapolis, Indiana 46202
- Christopher C Lapish: Psychology Department, Indiana University-Purdue University, Indianapolis, Indiana 46202; Department of Anatomy, Cell Biology, and Physiology, Stark Neuroscience Institute, Indianapolis, Indiana 46202
7
Igamberdiev AU. Reflexive neural circuits and the origin of language and music codes. Biosystems 2024; 246:105346. PMID: 39349135. DOI: 10.1016/j.biosystems.2024.105346.
Abstract
Conscious activity is grounded in reflexive self-awareness in sense perception, through which the codes signifying sensual perceptive events operate and constrain human behavior. These codes grow via the creative generation of hypertextual statements. We apply the model of Vladimir Lefebvre (Lefebvre, V.A., 1987, J. Soc. Biol. Struct. 10, 129-175) to reveal the underlying structures on which the perception and creative development of language and music codes are based. According to this model, the reflexive structure of the conscious subject is grounded in three thermodynamic cycles united by the control of the basic functional cycle by the second one, resulting in an internal action that in turn is perceived by the third cycle, which evaluates this action. In this arrangement, the generative language structures are formed and the frequencies of sounds that form musical phrases and patterns are selected. We discuss the participation of certain neural brain structures and the establishment of reflexive neural circuits in the ad hoc transformation of perceptive signals, and show the similarities between the processes of perception and of biological self-maintenance and morphogenesis. We trace the peculiarities of the temporal encoding of emotions in music and musical creativity, as well as the principles of sharing musical information between the performing and the perceiving individuals.
Affiliation(s)
- Abir U Igamberdiev: Department of Biology, Memorial University of Newfoundland, St. John's, NL A1C 5S7, Canada
8
Cihak HL, Kilpatrick ZP. Robustly encoding certainty in a metastable neural circuit model. Phys Rev E 2024; 110:034404. PMID: 39425424. DOI: 10.1103/physreve.110.034404.
Abstract
Localized persistent neural activity can encode delayed estimates of continuous variables. Common experiments require that subjects store and report the feature value (e.g., orientation) of a particular cue (e.g., oriented bar on a screen) after a delay. Visualizing recorded activity of neurons along their feature tuning reveals activity bumps whose centers wander stochastically, degrading the estimate over time. Bump position therefore represents the remembered estimate. Recent work suggests bump amplitude may represent estimate certainty reflecting a probabilistic population code for a Bayesian posterior. Idealized models of this type are fragile due to the fine tuning common to constructed continuum attractors in dynamical systems. Here we propose an alternative metastable model for robustly supporting multiple bump amplitudes by extending neural circuit models to include quantized nonlinearities. Asymptotic projections of circuit activity produce low-dimensional evolution equations for the amplitude and position of bump solutions in response to external stimuli and noise perturbations. Analysis of reduced equations accurately characterizes phase variance and the dynamics of amplitude transitions between stable discrete values. More salient cues generate bumps of higher amplitude which wander less, consistent with experiments showing certainty correlates with more accurate memories.
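The sketch below is a deliberately simplified illustration of the bump-attractor setting described above: an Amari-style ring network with a single step (Heaviside) nonlinearity, in which a localized activity bump persists and its position wanders under noise. The paper's model additionally uses quantized, multi-step nonlinearities to support several discrete bump amplitudes; all parameter values here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Ring of N neurons with cosine (excitatory-center) connectivity and a step nonlinearity
N = 120
x = np.linspace(-np.pi, np.pi, N, endpoint=False)
dx = 2 * np.pi / N
W = np.cos(x[:, None] - x[None, :])      # translation-invariant coupling kernel
theta, tau, dt, sigma = 0.4, 0.01, 1e-3, 0.1

u = np.exp(-(x / 0.3) ** 2)              # localized initial condition seeds the bump
centers = []
for _ in range(5000):
    f = (u > theta).astype(float)        # step (Heaviside) firing-rate nonlinearity
    noise = sigma * np.sqrt(dt) * rng.standard_normal(N)
    u += (dt / tau) * (-u + (W @ f) * dx) + noise
    # Bump position read out as the circular mean of supra-threshold activity
    centers.append(float(np.angle(np.sum(f * np.exp(1j * x)))))

print("bump position start vs end (rad):", round(centers[0], 3), round(centers[-1], 3))
```

The slow drift of the printed bump position is the stochastic wandering that degrades the stored estimate over time in such models.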
9
Naik BR, Chandran Y, Rohini K, Verma D, Ramanathan S, Balakrishnan V. Origin of discrete resistive switching in chemically heterogeneous vanadium oxide crystals. Mater Horiz 2024; 11:4086-4093. PMID: 38894698. DOI: 10.1039/d4mh00034j.
Abstract
Phase changes in oxide materials such as VO2 offer a foundational platform for designing novel solid-state devices. Tuning the V : O stoichiometry offers a vast electronic phase space with non-trivial collective properties. Here, we report the observation of discrete threshold switching voltages (Vth) with constant ΔVth between cycles in vanadium oxide crystals. The observed threshold fields over 10 000 cycles are ∼100× lower than that noted for stoichiometric VO2 and show unique discrete behaviour with constant ΔVth. We correlate the observed discrete memristor behaviour with the valence change mechanism and fluctuations in the chemical composition of spatially distributed VO2-VnO2n-1 complex oxide phases that can synergistically co-operate with the insulator-metal transition resulting in sharp current jumps. The design of chemical heterogeneity in oxide crystals, therefore, offers an intriguing path to realizing low-energy neuromorphic devices.
Affiliation(s)
- B Raju Naik: School of Mechanical and Materials Engineering, Indian Institute of Technology, Mandi, Himachal Pradesh, 175075, India
- Yadu Chandran: School of Mechanical and Materials Engineering, Indian Institute of Technology, Mandi, Himachal Pradesh, 175075, India
- Kakunuri Rohini: School of Mechanical and Materials Engineering, Indian Institute of Technology, Mandi, Himachal Pradesh, 175075, India
- Divya Verma: School of Mechanical and Materials Engineering, Indian Institute of Technology, Mandi, Himachal Pradesh, 175075, India
- Shriram Ramanathan: Department of Electrical and Computer Engineering, Rutgers State University, New Jersey, Piscataway, NJ 08854, USA
- Viswanath Balakrishnan: School of Mechanical and Materials Engineering, Indian Institute of Technology, Mandi, Himachal Pradesh, 175075, India
10
Li T, La Camera G. A sticky Poisson Hidden Markov Model for spike data. bioRxiv [Preprint] 2024: 2024.08.07.606969. PMID: 39149270. PMCID: PMC11326216. DOI: 10.1101/2024.08.07.606969.
Abstract
Fitting a hidden Markov Model (HMM) to neural data is a powerful method to segment a spatiotemporal stream of neural activity into sequences of discrete hidden states. Application of HMMs has made it possible to uncover hidden states and signatures of neural dynamics that seem relevant for sensory and cognitive processes, especially in datasets comprising ensembles of simultaneously recorded cortical spike trains. However, the HMM analysis of spike data is involved and requires careful handling of model selection. Two main issues are: (i) the cross-validated likelihood function typically increases with the number of hidden states; (ii) decoding the data with an HMM can lead to very rapid state switching due to fast oscillations in state probabilities. The first problem is related to the phenomenon of over-segmentation and leads to overfitting. The second problem is at odds with the empirical fact that hidden states in cortex tend to last from hundreds of milliseconds to seconds. Here, we show that we can alleviate both problems by regularizing a Poisson-HMM during training so as to enforce large self-transition probabilities. We call this algorithm the 'sticky Poisson-HMM' (sPHMM). When used together with the Bayesian Information Criterion for model selection, the sPHMM successfully eliminates rapid state switching, outperforming an alternative strategy based on an HMM with a large prior on the self-transition probabilities. The sPHMM also captures the ground truth in surrogate datasets built to resemble the statistical properties of the experimental data.
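For orientation, the snippet below illustrates the generic "sticky" idea in its simplest form: adding a pseudocount kappa to the diagonal of the expected transition counts in the M-step, which inflates self-transition probabilities. Note that this simple prior-based regularization corresponds to the baseline that the sPHMM is compared against, not to the sPHMM training procedure itself; it is included only to make the notion of stickiness concrete.

```python
import numpy as np

def sticky_transition_mstep(xi_counts, kappa=50.0):
    """Regularized M-step for an HMM transition matrix: add a pseudocount
    kappa to the diagonal of the expected transition counts (from the E-step)
    so that self-transition probabilities dominate and decoded state sequences
    do not switch on every time bin. xi_counts: (K, K) array."""
    counts = xi_counts + kappa * np.eye(xi_counts.shape[0])
    return counts / counts.sum(axis=1, keepdims=True)

# Toy usage: noisy expected transition counts for a 3-state model
rng = np.random.default_rng(0)
xi = rng.uniform(0.0, 5.0, size=(3, 3))
A = sticky_transition_mstep(xi, kappa=50.0)
print(np.round(A, 3))   # rows sum to 1, diagonal entries dominate
```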
Affiliation(s)
- Tianshu Li: Department of Neurobiology & Behavior, Stony Brook University; Graduate Program in Neuroscience, Stony Brook University; Center for Neural Circuit Dynamics, Stony Brook University
- Giancarlo La Camera: Department of Neurobiology & Behavior, Stony Brook University; Graduate Program in Neuroscience, Stony Brook University; Center for Neural Circuit Dynamics, Stony Brook University
11
Yang X, La Camera G. Co-existence of synaptic plasticity and metastable dynamics in a spiking model of cortical circuits. PLoS Comput Biol 2024; 20:e1012220. PMID: 38950068. PMCID: PMC11244818. DOI: 10.1371/journal.pcbi.1012220.
Abstract
Evidence for metastable dynamics and its role in brain function is emerging at a fast pace and is changing our understanding of neural coding by putting an emphasis on hidden states of transient activity. Clustered networks of spiking neurons have enhanced synaptic connections among groups of neurons forming structures called cell assemblies; such networks are capable of producing metastable dynamics that is in agreement with many experimental results. However, it is unclear how a clustered network structure producing metastable dynamics may emerge from a fully local plasticity rule, i.e., a plasticity rule where each synapse has only access to the activity of the neurons it connects (as opposed to the activity of other neurons or other synapses). Here, we propose a local plasticity rule producing ongoing metastable dynamics in a deterministic, recurrent network of spiking neurons. The metastable dynamics co-exists with ongoing plasticity and is the consequence of a self-tuning mechanism that keeps the synaptic weights close to the instability line where memories are spontaneously reactivated. In turn, the synaptic structure is stable to ongoing dynamics and random perturbations, yet it remains sufficiently plastic to remap sensory representations to encode new sets of stimuli. Both the plasticity rule and the metastable dynamics scale well with network size, with synaptic stability increasing with the number of neurons. Overall, our results show that it is possible to generate metastable dynamics over meaningful hidden states using a simple but biologically plausible plasticity rule which co-exists with ongoing neural dynamics.
Affiliation(s)
- Xiaoyu Yang: Graduate Program in Physics and Astronomy, Stony Brook University, Stony Brook, New York, United States of America; Department of Neurobiology and Behavior, Stony Brook University, Stony Brook, New York, United States of America; Center for Neural Circuit Dynamics, Stony Brook University, Stony Brook, New York, United States of America
- Giancarlo La Camera: Department of Neurobiology and Behavior, Stony Brook University, Stony Brook, New York, United States of America; Center for Neural Circuit Dynamics, Stony Brook University, Stony Brook, New York, United States of America
12
Yang X, La Camera G. Co-existence of synaptic plasticity and metastable dynamics in a spiking model of cortical circuits. bioRxiv [Preprint] 2024: 2023.12.07.570692. PMID: 38106233. PMCID: PMC10723399. DOI: 10.1101/2023.12.07.570692.
Abstract
Evidence for metastable dynamics and its role in brain function is emerging at a fast pace and is changing our understanding of neural coding by putting an emphasis on hidden states of transient activity. Clustered networks of spiking neurons have enhanced synaptic connections among groups of neurons forming structures called cell assemblies; such networks are capable of producing metastable dynamics that is in agreement with many experimental results. However, it is unclear how a clustered network structure producing metastable dynamics may emerge from a fully local plasticity rule, i.e., a plasticity rule where each synapse has only access to the activity of the neurons it connects (as opposed to the activity of other neurons or other synapses). Here, we propose a local plasticity rule producing ongoing metastable dynamics in a deterministic, recurrent network of spiking neurons. The metastable dynamics co-exists with ongoing plasticity and is the consequence of a self-tuning mechanism that keeps the synaptic weights close to the instability line where memories are spontaneously reactivated. In turn, the synaptic structure is stable to ongoing dynamics and random perturbations, yet it remains sufficiently plastic to remap sensory representations to encode new sets of stimuli. Both the plasticity rule and the metastable dynamics scale well with network size, with synaptic stability increasing with the number of neurons. Overall, our results show that it is possible to generate metastable dynamics over meaningful hidden states using a simple but biologically plausible plasticity rule which co-exists with ongoing neural dynamics.
Affiliation(s)
- Xiaoyu Yang: Graduate Program in Physics and Astronomy, Stony Brook University; Department of Neurobiology & Behavior, Stony Brook University; Center for Neural Circuit Dynamics, Stony Brook University
- Giancarlo La Camera: Department of Neurobiology & Behavior, Stony Brook University; Center for Neural Circuit Dynamics, Stony Brook University
13
Kogan JF, Fontanini A. Learning enhances representations of taste-guided decisions in the mouse gustatory insular cortex. Curr Biol 2024; 34:1880-1892.e5. PMID: 38631343. PMCID: PMC11188718. DOI: 10.1016/j.cub.2024.03.034.
Abstract
Learning to discriminate overlapping gustatory stimuli that predict distinct outcomes-a feat known as discrimination learning-can mean the difference between ingesting a poison or a nutritive meal. Despite the obvious importance of this process, very little is known about the neural basis of taste discrimination learning. In other sensory modalities, this form of learning can be mediated by either the sharpening of sensory representations or the enhanced ability of "decision-making" circuits to interpret sensory information. Given the dual role of the gustatory insular cortex (GC) in encoding both sensory and decision-related variables, this region represents an ideal site for investigating how neural activity changes as animals learn a novel taste discrimination. Here, we present results from experiments relying on two-photon calcium imaging of GC neural activity in mice performing a taste-guided mixture discrimination task. The task allows for the recording of neural activity before and after learning induced by training mice to discriminate increasingly similar pairs of taste mixtures. Single-neuron and population analyses show a time-varying pattern of activity, with early sensory responses emerging after taste delivery and binary, choice-encoding responses emerging later in the delay before a decision is made. Our results demonstrate that, while both sensory and decision-related information is encoded by GC in the context of a taste mixture discrimination task, learning and improved performance are associated with a specific enhancement of decision-related responses.
Affiliation(s)
- Joshua F Kogan: Graduate Program in Neuroscience, Stony Brook University, Stony Brook, NY 11794, USA; Medical Scientist Training Program, Stony Brook University, Stony Brook, NY 11794, USA; Department of Neurobiology and Behavior, Stony Brook University, Stony Brook, NY 11794, USA
- Alfredo Fontanini: Graduate Program in Neuroscience, Stony Brook University, Stony Brook, NY 11794, USA; Medical Scientist Training Program, Stony Brook University, Stony Brook, NY 11794, USA; Department of Neurobiology and Behavior, Stony Brook University, Stony Brook, NY 11794, USA
14
Cotteret M, Greatorex H, Ziegler M, Chicca E. Vector Symbolic Finite State Machines in Attractor Neural Networks. Neural Comput 2024; 36:549-595. PMID: 38457766. DOI: 10.1162/neco_a_01638.
Abstract
Hopfield attractor networks are robust distributed models of human memory, but they lack a general mechanism for effecting state-dependent attractor transitions in response to input. We propose construction rules such that an attractor network may implement an arbitrary finite state machine (FSM), where states and stimuli are represented by high-dimensional random vectors and all state transitions are enacted by the attractor network's dynamics. Numerical simulations show the capacity of the model, in terms of the maximum size of implementable FSM, to be linear in the size of the attractor network for dense bipolar state vectors and approximately quadratic for sparse binary state vectors. We show that the model is robust to imprecise and noisy weights, and so a prime candidate for implementation with high-density but unreliable devices. By endowing attractor networks with the ability to emulate arbitrary FSMs, we propose a plausible path by which FSMs could exist as a distributed computational primitive in biological neural networks.
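As background for the construction described above, the sketch below shows the standard Hopfield ingredient the paper builds on: dense bipolar random vectors stored as attractors via an outer-product rule and recalled by sign-threshold dynamics. The paper's FSM construction adds stimulus-conditioned transition terms on top of this, which are not reproduced here; dimensions and noise levels are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
D, K = 1000, 10                              # vector dimension, number of stored states

# Dense bipolar random vectors standing in for FSM states
states = rng.choice([-1, 1], size=(K, D))

# Hebbian outer-product weights store the states as fixed-point attractors
W = (states.T @ states).astype(float) / D
np.fill_diagonal(W, 0.0)

def recall(v, steps=20):
    # Synchronous sign-threshold dynamics settle into the nearest attractor
    for _ in range(steps):
        v = np.sign(W @ v)
        v[v == 0] = 1.0
    return v

# Corrupt a stored state and verify that the attractor dynamics clean it up
noisy = states[3].astype(float).copy()
flip = rng.choice(D, size=200, replace=False)
noisy[flip] *= -1.0
print("overlap with stored state after recall:",
      round(float(recall(noisy) @ states[3]) / D, 3))   # close to 1.0
```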
Affiliation(s)
- Madison Cotteret: Micro- and Nanoelectronic Systems, Institute of Micro- and Nanotechnologies (IMN) MacroNano, Technische Universität Ilmenau, 98693 Ilmenau, Germany; Bio-Inspired Circuits and Systems Lab, Zernike Institute for Advanced Materials, and Groningen Cognitive Systems and Materials Center, University of Groningen, 9747 AG Groningen, Netherlands
- Hugh Greatorex: Bio-Inspired Circuits and Systems Lab, Zernike Institute for Advanced Materials, and Groningen Cognitive Systems and Materials Center, University of Groningen, 9747 AG Groningen, Netherlands
- Martin Ziegler: Micro- and Nanoelectronic Systems, Institute of Micro- and Nanotechnologies (IMN) MacroNano, Technische Universität Ilmenau, 98693 Ilmenau, Germany
- Elisabetta Chicca: Bio-Inspired Circuits and Systems Lab, Zernike Institute for Advanced Materials, and Groningen Cognitive Systems and Materials Center, University of Groningen, 9747 AG Groningen, Netherlands
15
Venkadesh S, Shaikh A, Shakeri H, Barreto E, Van Horn JD. Biophysical modulation and robustness of itinerant complexity in neuronal networks. Front Netw Physiol 2024; 4:1302499. PMID: 38516614. PMCID: PMC10954887. DOI: 10.3389/fnetp.2024.1302499.
Abstract
Transient synchronization of bursting activity in neuronal networks, which occurs in patterns of metastable itinerant phase relationships between neurons, is a notable feature of network dynamics observed in vivo. However, the mechanisms that contribute to this dynamical complexity in neuronal circuits are not well understood. Local circuits in cortical regions consist of populations of neurons with diverse intrinsic oscillatory features. In this study, we numerically show that the phenomenon of transient synchronization, also referred to as metastability, can emerge in an inhibitory neuronal population when the neurons' intrinsic fast-spiking dynamics are appropriately modulated by slower inputs from an excitatory neuronal population. Using a compact model of a mesoscopic-scale network consisting of excitatory pyramidal and inhibitory fast-spiking neurons, our work demonstrates a relationship between the frequency of pyramidal population oscillations and the features of emergent metastability in the inhibitory population. In addition, we introduce a method to characterize collective transitions in metastable networks. Finally, we discuss potential applications of this study in mechanistically understanding cortical network dynamics.
Affiliation(s)
- Siva Venkadesh: Department of Psychology, University of Virginia, Charlottesville, VA, United States
- Asmir Shaikh: Department of Computer Science, University of Virginia, Charlottesville, VA, United States
- Heman Shakeri: School of Data Science, University of Virginia, Charlottesville, VA, United States; Biomedical Engineering, University of Virginia, Charlottesville, VA, United States
- Ernest Barreto: Department of Physics and Astronomy and the Interdisciplinary Program in Neuroscience, George Mason University, Fairfax, VA, United States
- John Darrell Van Horn: Department of Psychology, University of Virginia, Charlottesville, VA, United States; School of Data Science, University of Virginia, Charlottesville, VA, United States
16
Kogan JF, Fontanini A. Learning enhances representations of taste-guided decisions in the mouse gustatory insular cortex. bioRxiv [Preprint] 2023: 2023.10.16.562605. PMID: 37905010. PMCID: PMC10614904. DOI: 10.1101/2023.10.16.562605.
Abstract
Learning to discriminate overlapping gustatory stimuli that predict distinct outcomes - a feat known as discrimination learning - can mean the difference between ingesting a poison or a nutritive meal. Despite the obvious importance of this process, very little is known about the neural basis of taste discrimination learning. In other sensory modalities, this form of learning can be mediated either by sharpening of sensory representations or by an enhanced ability of "decision-making" circuits to interpret sensory information. Given the dual role of the gustatory insular cortex (GC) in encoding both sensory and decision-related variables, this region represents an ideal site for investigating how neural activity changes as animals learn a novel taste discrimination. Here we present results from experiments relying on two-photon calcium imaging of GC neural activity in mice performing a taste-guided mixture discrimination task. The task allows for recording of neural activity before and after learning induced by training mice to discriminate increasingly similar pairs of taste mixtures. Single-neuron and population analyses show a time-varying pattern of activity, with early sensory responses emerging after taste delivery and binary, choice-encoding responses emerging later in the delay before a decision is made. Our results demonstrate that while both sensory and decision-related information is encoded by GC in the context of a taste mixture discrimination task, learning and improved performance are associated with a specific enhancement of decision-related responses.
17
Zhang Q, Wan G, Starchenko V, Hu G, Dufresne EM, Zhou H, Jeen H, Almazan IC, Dong Y, Liu H, Sandy AR, Sterbinsky GE, Lee HN, Ganesh P, Fong DD. Intermittent Defect Fluctuations in Oxide Heterostructures. Adv Mater 2023; 35:e2305383. PMID: 37578079. DOI: 10.1002/adma.202305383.
Abstract
The heterogeneous nature, local presence, and dynamic evolution of defects typically govern the ionic and electronic properties of a wide variety of functional materials. While the last 50 years have seen considerable effort devoted to developing new methods to identify the nature of defects in complex materials, such as the perovskite oxides, very little is known about defect dynamics and their influence on the functionality of a material. Here, the discovery of intermittent behavior of point defects (oxygen vacancies) in oxide heterostructures, observed using X-ray photon correlation spectroscopy, is reported. Local fluctuations between two ordered phases in strained SrCoOx with different degrees of stability of the oxygen vacancies are observed. Ab-initio-informed phase-field modeling reveals that fluctuations between the competing ordered phases are modulated by the oxygen ion/vacancy interaction energy and epitaxial strain. The results demonstrate how defect dynamics, evidenced by measurement and modeling of their temporal fluctuations, give rise to stochastic properties that can now be fully characterized using coherent X-rays, coupled for the first time to multiscale modeling in functional complex oxide heterostructures. The study and its findings open new avenues for engineering the dynamical response of functional materials used in neuromorphic and electrochemical applications.
Affiliation(s)
- Qingteng Zhang: X-Ray Science Division, Argonne National Laboratory, Lemont, IL, 60439, USA
- Gang Wan: Material Science Division, Argonne National Laboratory, Lemont, IL, 60439, USA
- Vitalii Starchenko: Chemical Sciences Division, Oak Ridge National Laboratory, Oak Ridge, TN, 37831, USA
- Guoxiang Hu: Center for Nanophase Materials Sciences, Oak Ridge National Laboratory, Oak Ridge, TN, 37831, USA
- Eric M Dufresne: X-Ray Science Division, Argonne National Laboratory, Lemont, IL, 60439, USA
- Hua Zhou: X-Ray Science Division, Argonne National Laboratory, Lemont, IL, 60439, USA
- Hyoungjeen Jeen: Department of Physics, Pusan National University, Busan, 46241, South Korea
- Irene Calvo Almazan: Material Science Division, Argonne National Laboratory, Lemont, IL, 60439, USA
- Yongqi Dong: X-Ray Science Division, Argonne National Laboratory, Lemont, IL, 60439, USA
- Huajun Liu: Material Science Division, Argonne National Laboratory, Lemont, IL, 60439, USA
- Alec R Sandy: X-Ray Science Division, Argonne National Laboratory, Lemont, IL, 60439, USA
- Ho Nyung Lee: Materials Science and Technology Division, Oak Ridge National Laboratory, Oak Ridge, TN, 37831, USA
- P Ganesh: Center for Nanophase Materials Sciences, Oak Ridge National Laboratory, Oak Ridge, TN, 37831, USA
- Dillon D Fong: Material Science Division, Argonne National Laboratory, Lemont, IL, 60439, USA
18
Ye L, Feng J, Li C. Controlling brain dynamics: Landscape and transition path for working memory. PLoS Comput Biol 2023; 19:e1011446. PMID: 37669311. PMCID: PMC10503743. DOI: 10.1371/journal.pcbi.1011446.
Abstract
Understanding the underlying dynamical mechanisms of the brain, and controlling its dynamics, is a crucial issue in brain science. The energy landscape and transition path approach provides a possible route to address these challenges. Here, taking working memory as an example, we quantified its landscape based on a large-scale macaque model. Working memory function is governed by changes in the landscape and by brain-wide state switching in response to task demands. The kinetic transition path reveals that information flow follows the direction of the hierarchical structure. Importantly, we propose a landscape control approach to manipulate brain state transitions by modulating external stimulation or inter-areal connectivity, demonstrating the crucial roles of associative areas, especially prefrontal and parietal cortical areas, in working memory performance. Our findings provide new insights into the dynamical mechanisms of cognitive function, and the landscape control approach helps to develop therapeutic strategies for brain disorders.
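To make the landscape idea concrete, here is a toy sketch (not the paper's large-scale method) that estimates a quasi-potential U = -ln P from a sampled trajectory of a projected state variable, so that metastable states appear as basins separated by barriers; the double-well Langevin data below are a stand-in for working-memory-like bistability.

```python
import numpy as np

def effective_landscape(samples, bins=60):
    """Estimate a quasi-potential U(x) = -ln P(x) over a one-dimensional
    projection of the state; attractors appear as basins of U."""
    hist, edges = np.histogram(samples, bins=bins, density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    with np.errstate(divide="ignore"):
        U = -np.log(hist)
    return centers, U

# Synthetic bistable trajectory (double-well Langevin dynamics) as a stand-in
# for a projected working-memory state variable
rng = np.random.default_rng(2)
x, traj = -1.0, []
for _ in range(100000):
    x += 0.01 * (x - x**3) + 0.08 * rng.standard_normal()
    traj.append(x)

centers, U = effective_landscape(np.array(traj))
left = centers[centers < 0][np.argmin(U[centers < 0])]
right = centers[centers >= 0][np.argmin(U[centers >= 0])]
print("basin minima near x =", round(float(left), 2), "and x =", round(float(right), 2))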
Affiliation(s)
- Leijun Ye: Institute of Science and Technology for Brain-Inspired Intelligence, Fudan University, Shanghai, China; Shanghai Center for Mathematical Sciences, Fudan University, Shanghai, China
- Jianfeng Feng: Institute of Science and Technology for Brain-Inspired Intelligence, Fudan University, Shanghai, China; Shanghai Center for Mathematical Sciences, Fudan University, Shanghai, China; Department of Computer Science, University of Warwick, Coventry, United Kingdom
- Chunhe Li: Institute of Science and Technology for Brain-Inspired Intelligence, Fudan University, Shanghai, China; Shanghai Center for Mathematical Sciences, Fudan University, Shanghai, China; School of Mathematical Sciences and MOE Frontiers Center for Brain Science, Fudan University, Shanghai, China
19
Labay-Mora A, Zambrini R, Giorgi GL. Quantum Associative Memory with a Single Driven-Dissipative Nonlinear Oscillator. Phys Rev Lett 2023; 130:190602. PMID: 37243658. DOI: 10.1103/physrevlett.130.190602.
Abstract
Algorithms for associative memory typically rely on a network of many connected units. The prototypical example is the Hopfield model, whose generalizations to the quantum realm are mainly based on open quantum Ising models. We propose a realization of associative memory with a single driven-dissipative quantum oscillator exploiting its infinite degrees of freedom in phase space. The model can improve the storage capacity of discrete neuron-based systems in a large regime and we prove successful state discrimination between n coherent states, which represent the stored patterns of the system. These can be tuned continuously by modifying the driving strength, constituting a modified learning rule. We show that the associative-memory capability is inherently related to the existence of a spectral separation in the Liouvillian superoperator, which results in a long timescale separation in the dynamics corresponding to a metastable phase.
Affiliation(s)
- Adrià Labay-Mora: Institute for Cross Disciplinary Physics and Complex Systems (IFISC) UIB-CSIC, Campus Universitat Illes Balears, Palma de Mallorca, Spain
- Roberta Zambrini: Institute for Cross Disciplinary Physics and Complex Systems (IFISC) UIB-CSIC, Campus Universitat Illes Balears, Palma de Mallorca, Spain
- Gian Luca Giorgi: Institute for Cross Disciplinary Physics and Complex Systems (IFISC) UIB-CSIC, Campus Universitat Illes Balears, Palma de Mallorca, Spain
20
Ji P, Wang Y, Peron T, Li C, Nagler J, Du J. Structure and function in artificial, zebrafish and human neural networks. Phys Life Rev 2023; 45:74-111. PMID: 37182376. DOI: 10.1016/j.plrev.2023.04.004.
Abstract
Network science provides a set of tools for characterizing the structure and functional behavior of complex systems. Yet a major problem is to quantify how the structural domain is related to the dynamical one: how can the diversity of dynamical states of a system be predicted from the static network structure? Or, the reverse problem: starting from a set of signals derived from experimental recordings, how can one discover the network connections or the causal relations behind the observed dynamics? Despite the advances achieved over the last two decades, many challenges remain concerning the study of the structure-dynamics interplay of complex systems. In neuroscience, progress is typically constrained by the low spatio-temporal resolution of experiments and by the lack of a universal inference framework for empirical systems. To address these issues, applications of network science and artificial intelligence to neural data have been growing rapidly. In this article, we review important recent applications of methods from those fields to the study of the interplay between structure and functional dynamics in the human and zebrafish brain. We cover the selection of topological features for the characterization of brain networks, inference of functional connections, and dynamical modeling, and close with applications to both the human and zebrafish brain. This review is intended for neuroscientists who want to become acquainted with techniques from network science, as well as for researchers from the latter field who are interested in exploring novel application scenarios in neuroscience.
Affiliation(s)
- Peng Ji: Institute of Science and Technology for Brain-Inspired Intelligence, Fudan University, Shanghai 200433, China; Key Laboratory of Computational Neuroscience and Brain-Inspired Intelligence, Ministry of Education, Shanghai 200433, China; MOE Frontiers Center for Brain Science, Fudan University, Shanghai 200433, China
- Yufan Wang: Institute of Neuroscience, State Key Laboratory of Neuroscience, Center for Excellence in Brain Science and Intelligence Technology, Chinese Academy of Sciences, 320 Yue-Yang Road, Shanghai 200031, China
- Thomas Peron: Institute of Mathematics and Computer Science, University of São Paulo, São Carlos 13566-590, São Paulo, Brazil
- Chunhe Li: Shanghai Center for Mathematical Sciences and School of Mathematical Sciences, Fudan University, Shanghai 200433, China; Institute of Science and Technology for Brain-Inspired Intelligence and MOE Frontiers Center for Brain Science, Fudan University, Shanghai 200433, China
- Jan Nagler: Deep Dynamics, Frankfurt School of Finance & Management, Frankfurt, Germany; Centre for Human and Machine Intelligence, Frankfurt School of Finance & Management, Frankfurt, Germany
- Jiulin Du: Institute of Neuroscience, State Key Laboratory of Neuroscience, Center for Excellence in Brain Science and Intelligence Technology, Chinese Academy of Sciences, 320 Yue-Yang Road, Shanghai 200031, China
21
Vinci GV, Benzi R, Mattia M. Self-Consistent Stochastic Dynamics for Finite-Size Networks of Spiking Neurons. Phys Rev Lett 2023; 130:097402. PMID: 36930929. DOI: 10.1103/physrevlett.130.097402.
Abstract
Despite the huge number of neurons composing a brain network, the ongoing activity of local cell assemblies is intrinsically stochastic. Fluctuations in their instantaneous rate of spike firing ν(t) scale with the size of the assembly and persist in isolated networks, i.e., in the absence of external sources of noise. Although deterministic chaos due to the quenched disorder of the synaptic couplings underlies this seemingly stochastic dynamics, an effective theory for the network dynamics of a finite assembly of spiking neurons is lacking. Here, we fill this gap by extending the so-called population density approach, including an activity- and size-dependent stochastic source in the Fokker-Planck equation for the membrane potential density. The finite-size noise embedded in this stochastic partial differential equation is analytically characterized, leading to a self-consistent and nonperturbative description of ν(t) valid for a wide class of spiking neuron networks. Power spectra of ν(t) are found in excellent agreement with those from detailed simulations, both in the linear regime and across a synchronization phase transition, when a size-dependent smearing of the critical dynamics emerges.
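The following toy sketch illustrates only the most basic ingredient behind the theory, the 1/sqrt(N) scaling of population-rate fluctuations for a finite assembly (here, independent Poisson neurons with no recurrence); the paper's contribution is embedding such finite-size noise self-consistently in the Fokker-Planck description.

```python
import numpy as np

rng = np.random.default_rng(0)

def rate_std(n_neurons, nu=10.0, dt=1e-3, steps=2000):
    # Std of the instantaneous population rate of n_neurons independent
    # Poisson neurons firing at nu Hz, binned at resolution dt
    spikes = rng.random((steps, n_neurons)) < nu * dt
    return float((spikes.mean(axis=1) / dt).std())

# Fluctuations of the population rate shrink as 1/sqrt(N)
for n in (100, 1000, 10000):
    print(n, "neurons: measured std =", round(rate_std(n), 2),
          "Hz, expected ~", round(float(np.sqrt(10.0 / (n * 1e-3))), 2), "Hz")
```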
Affiliation(s)
- Gianni V Vinci: Natl. Center for Radiation Protection and Computational Physics, Istituto Superiore di Sanità, 00161 Roma, Italy; PhD Program in Physics, Dept. of Physics, "Tor Vergata" University of Rome, 00133 Roma, Italy
- Roberto Benzi: Dept. of Physics and INFN, "Tor Vergata" University of Rome, 00133 Roma, Italy; Centro Ricerche "E. Fermi," 00184, Roma, Italy
- Maurizio Mattia: Natl. Center for Radiation Protection and Computational Physics, Istituto Superiore di Sanità, 00161 Roma, Italy
22
Temporal progression along discrete coding states during decision-making in the mouse gustatory cortex. PLoS Comput Biol 2023; 19:e1010865. PMID: 36749734. PMCID: PMC9904478. DOI: 10.1371/journal.pcbi.1010865.
Abstract
The mouse gustatory cortex (GC) is involved in taste-guided decision-making in addition to sensory processing. Rodent GC exhibits metastable neural dynamics during ongoing and stimulus-evoked activity, but how these dynamics evolve in the context of a taste-based decision-making task remains unclear. Here we employ analytical and modeling approaches to i) extract metastable dynamics in ensemble spiking activity recorded from the GC of mice performing a perceptual decision-making task; ii) investigate the computational mechanisms underlying GC metastability in this task; and iii) establish a relationship between GC dynamics and behavioral performance. Our results show that activity in GC during perceptual decision-making is metastable and that this metastability may serve as a substrate for sequentially encoding sensory, abstract cue, and decision information over time. Perturbations of the model's metastable dynamics indicate that boosting inhibition in different coding epochs differentially impacts network performance, explaining a counterintuitive effect of GC optogenetic silencing on mouse behavior.
23
Controllable branching of robust response patterns in nonlinear mechanical resonators. Nat Commun 2023; 14:161. PMID: 36631442. PMCID: PMC9834403. DOI: 10.1038/s41467-022-35685-5.
Abstract
In lieu of continuous-time active feedback control in complex systems, nonlinear dynamics offers a means to generate desired long-term responses using short-time control signals. This type of control has been proposed for resonators that exhibit a plethora of complex dynamic behaviors resulting from energy exchange between modes. However, the dynamic response of these systems and, ultimately, the ability to control it remain poorly understood. Here, we show that a micromechanical resonator can generate diverse, robust dynamical responses that occur on a timescale five orders of magnitude longer than that of the external harmonic driving, and that these responses can be selected by inserting small pulses at specific branching points. We develop a theoretical model and experimentally demonstrate the ability to control these response patterns. Hence, such mechanical resonators may represent a simple physical platform for developing springboard concepts for the nonlinear, flexible, yet robust dynamics found in other areas of physics, chemistry, and biology.
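As a generic illustration, not the paper's micromechanical model, the sketch below integrates a driven Duffing-type oscillator in Python and applies one short pulse during the transient; depending on the assumed parameters, the pulsed and unpulsed runs may settle onto different coexisting response branches, mimicking the pulse-based branch selection described above. All parameter values and helper names are hypothetical.

# Minimal sketch (a generic bistable Duffing oscillator, not the paper's
# device): a short pulse applied during the transient can select which
# coexisting long-term response branch the resonator settles on.
import numpy as np

def duffing_step(state, t, pulse, dt=1e-3, gamma=0.1, alpha=1.0, beta=0.04,
                 F=2.5, omega=1.2):
    x, v = state
    a = -gamma * v - alpha * x - beta * x**3 + F * np.cos(omega * t) + pulse
    return np.array([x + v * dt, v + a * dt])

def run(pulse_on, t_pulse=30.0, pulse_len=0.5, pulse_amp=3.0, T=400.0, dt=1e-3):
    state, t, amps = np.array([0.0, 0.0]), 0.0, []
    while t < T:
        pulse = pulse_amp if (pulse_on and t_pulse <= t < t_pulse + pulse_len) else 0.0
        state = duffing_step(state, t, pulse, dt)
        t += dt
        if t > 0.75 * T:                 # record late-time amplitude only
            amps.append(abs(state[0]))
    return max(amps)

# Depending on the assumed parameters, the two runs may end up on different
# branches of the nonlinear response (high- vs low-amplitude solution).
print("no pulse :", round(run(False), 3))
print("pulse    :", round(run(True), 3))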
Collapse
|
24
|
Xing Y, Zan C, Liu L. Recent advances in understanding neuronal diversity and neural circuit complexity across different brain regions using single-cell sequencing. Front Neural Circuits 2023; 17:1007755. [PMID: 37063385 PMCID: PMC10097998 DOI: 10.3389/fncir.2023.1007755] [Citation(s) in RCA: 4] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/30/2022] [Accepted: 02/16/2023] [Indexed: 04/18/2023] Open
Abstract
Neural circuits are networks of neurons interconnected by synapses. Changes in gene expression and/or in the function of neurons and synaptic connections can give rise to aberrant neural circuits, which is recognized as a crucial pathological mechanism underlying the onset of many neurological diseases. Advances in single-cell sequencing, with its high throughput and increased resolution for live cells, have helped us understand neuronal diversity across diverse brain regions and have transformed our knowledge of the cellular building blocks of neural circuits by revealing numerous molecular signatures. Published transcriptomic studies have delineated various neuronal subpopulations and their distributions across the prefrontal cortex, hippocampus, hypothalamus, dorsal root ganglion, and other regions. Better characterization of brain region-specific circuits may shed light on new pathological mechanisms and assist in selecting potential targets for the prevention and treatment of specific neurological disorders based on their established roles. Given the diversity of neuronal populations across brain regions, we aim to give a brief sketch of current progress in understanding neuronal diversity and neural circuit complexity according to their locations. With a special focus on the application of single-cell sequencing, we summarize relevant region-specific findings. Considering the importance of spatial context and connectivity in neural circuits, we also discuss a few published results obtained by spatial transcriptomics. Taken together, these single-cell sequencing data may lay a mechanistic basis for the functional identification of brain circuit components, linking their molecular signatures to anatomical regions, connectivity, morphology, and physiology. Furthermore, the comprehensive characterization of neuron subtypes, their distributions, and their connectivity patterns via single-cell sequencing is critical for understanding neural circuit properties and how they generate region-dependent interactions in different contexts.
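As a schematic of the kind of analysis such transcriptomic studies rely on, the snippet below clusters a synthetic gene-expression matrix with PCA followed by k-means using scikit-learn. It is a minimal sketch of subpopulation identification on fabricated data, not a reproduction of any study cited here; the matrix dimensions, preprocessing choices, and cluster count are arbitrary assumptions.

# Minimal sketch: identify putative cell subpopulations in a synthetic
# single-cell expression matrix (cells x genes) with PCA + k-means.
# The data are randomly generated; no biological conclusion is implied.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
n_cells, n_genes, n_types = 600, 2000, 4

# Fabricate cells drawn from a few "types", each with its own mean expression.
type_means = rng.gamma(shape=2.0, scale=1.0, size=(n_types, n_genes))
labels_true = rng.integers(0, n_types, size=n_cells)
counts = rng.poisson(type_means[labels_true])

# Standard preprocessing sketch: log-transform, reduce dimension, then cluster.
log_counts = np.log1p(counts)
embedding = PCA(n_components=10, random_state=0).fit_transform(log_counts)
labels_pred = KMeans(n_clusters=n_types, n_init=10, random_state=0).fit_predict(embedding)

for k in range(n_types):
    print(f"cluster {k}: {np.sum(labels_pred == k)} cells")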
Collapse
Affiliation(s)
- Yu Xing
- Department of Neurology, Beidahuang Industry Group General Hospital, Harbin, China
| | - Chunfang Zan
- Institute for Stroke and Dementia Research (ISD), LMU Klinikum, Ludwig-Maximilian-University (LMU), Munich, Germany
| | - Lu Liu
- Munich Medical Research School (MMRS), LMU Klinikum, Ludwig-Maximilian-University (LMU), Munich, Germany
- *Correspondence: Lu Liu
| |
Collapse
|
25
|
Pietras B, Schmutz V, Schwalger T. Mesoscopic description of hippocampal replay and metastability in spiking neural networks with short-term plasticity. PLoS Comput Biol 2022; 18:e1010809. [PMID: 36548392 PMCID: PMC9822116 DOI: 10.1371/journal.pcbi.1010809] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/29/2022] [Revised: 01/06/2023] [Accepted: 12/11/2022] [Indexed: 12/24/2022] Open
Abstract
Bottom-up models of functionally relevant patterns of neural activity provide an explicit link between neuronal dynamics and computation. A prime example of functional activity patterns are propagating bursts of place-cell activities called hippocampal replay, which is critical for memory consolidation. The sudden and repeated occurrences of these burst states during ongoing neural activity suggest metastable neural circuit dynamics. As metastability has been attributed to noise and/or slow fatigue mechanisms, we propose a concise mesoscopic model which accounts for both. Crucially, our model is bottom-up: it is analytically derived from the dynamics of finite-size networks of Linear-Nonlinear Poisson neurons with short-term synaptic depression. As such, noise is explicitly linked to stochastic spiking and network size, and fatigue is explicitly linked to synaptic dynamics. To derive the mesoscopic model, we first consider a homogeneous spiking neural network and follow the temporal coarse-graining approach of Gillespie to obtain a "chemical Langevin equation", which can be naturally interpreted as a stochastic neural mass model. The Langevin equation is computationally inexpensive to simulate and enables a thorough study of metastable dynamics in classical setups (population spikes and Up-Down-states dynamics) by means of phase-plane analysis. An extension of the Langevin equation for small network sizes is also presented. The stochastic neural mass model constitutes the basic component of our mesoscopic model for replay. We show that the mesoscopic model faithfully captures the statistical structure of individual replayed trajectories in microscopic simulations and in previously reported experimental data. Moreover, compared to the deterministic Romani-Tsodyks model of place-cell dynamics, it exhibits a higher level of variability regarding order, direction and timing of replayed trajectories, which seems biologically more plausible and could be functionally desirable. This variability is the product of a new dynamical regime where metastability emerges from a complex interplay between finite-size fluctuations and local fatigue.
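To make the flavor of such a mesoscopic description concrete, here is a minimal Python sketch assuming a generic stochastic neural mass model with Tsodyks-Markram-style short-term depression and Poisson-like finite-size noise; it is not the chemical Langevin equation actually derived in the paper, and the transfer function, parameters, and noise form are illustrative assumptions.

# Minimal sketch (assumed form, not the paper's derived equations): a stochastic
# neural mass model with short-term depression, with a finite-size noise term
# whose amplitude scales as 1/sqrt(N), in the spirit of a chemical Langevin equation.
import numpy as np

def phi(h, gain=1.0, thresh=1.0, r_max=100.0):
    """Simple threshold-linear transfer function (Hz)."""
    return np.minimum(r_max, gain * np.maximum(h - thresh, 0.0))

def simulate(N=500, J=0.08, tau=0.01, U=0.2, tau_d=0.25, T=5.0, dt=1e-4, seed=2):
    rng = np.random.default_rng(seed)
    steps = int(T / dt)
    h, x = 0.0, 1.0                      # input potential and synaptic resources
    rates = np.empty(steps)
    for k in range(steps):
        r = phi(h + 5.0)                 # constant external drive (assumed)
        # finite-size fluctuations: Poisson-like noise on the per-bin population rate
        noise = np.sqrt(max(r, 0.0) / (N * dt)) * rng.standard_normal()
        h += dt / tau * (-h + J * x * (r + noise))
        x += dt * ((1.0 - x) / tau_d - U * x * r)
        x = min(max(x, 0.0), 1.0)
        rates[k] = r
    return rates

r = simulate()
print(f"mean rate {r.mean():.1f} Hz, fluctuations (std) {r.std():.1f} Hz")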
Collapse
Affiliation(s)
- Bastian Pietras
- Institute for Mathematics, Technische Universität Berlin, Berlin, Germany
- Bernstein Center for Computational Neuroscience, Berlin, Germany
- Department of Information and Communication Technologies, Universitat Pompeu Fabra, Barcelona, Spain
| | - Valentin Schmutz
- Brain Mind Institute, School of Computer and Communication Sciences and School of Life Sciences, École Polytechnique Fédérale de Lausanne (EPFL), Lausanne, Switzerland
| | - Tilo Schwalger
- Institute for Mathematics, Technische Universität Berlin, Berlin, Germany
- Bernstein Center for Computational Neuroscience, Berlin, Germany
| |
Collapse
|
26
|
Mazzucato L. Neural mechanisms underlying the temporal organization of naturalistic animal behavior. eLife 2022; 11:e76577. [PMID: 35792884 PMCID: PMC9259028 DOI: 10.7554/elife.76577] [Citation(s) in RCA: 12] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/11/2022] [Accepted: 06/07/2022] [Indexed: 12/17/2022] Open
Abstract
Naturalistic animal behavior exhibits a strikingly complex organization in the temporal domain, with variability arising from at least three sources: hierarchical, contextual, and stochastic. What neural mechanisms and computational principles underlie such intricate temporal features? In this review, we provide a critical assessment of the existing behavioral and neurophysiological evidence for these sources of temporal variability in naturalistic behavior. Recent research converges on an emergent mechanistic theory of temporal variability based on attractor neural networks and metastable dynamics, arising via coordinated interactions between mesoscopic neural circuits. We highlight the crucial role played by structural heterogeneities as well as noise from mesoscopic feedback loops in regulating flexible behavior. We assess the shortcomings and missing links in the current theoretical and experimental literature and propose new directions of investigation to fill these gaps.
Collapse
Affiliation(s)
- Luca Mazzucato
- Institute of Neuroscience, Departments of Biology, Mathematics and Physics, University of Oregon, Eugene, United States
| |
Collapse
|
27
|
Wang X, Liu Y, Cheng H, Ouyang X. Surface Wettability for Skin-Interfaced Sensors and Devices. ADVANCED FUNCTIONAL MATERIALS 2022; 32:2200260. [PMID: 36176721 PMCID: PMC9514151 DOI: 10.1002/adfm.202200260] [Citation(s) in RCA: 38] [Impact Index Per Article: 12.7] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 01/08/2022] [Indexed: 05/05/2023]
Abstract
The practical application of skin-interfaced sensors and devices in daily life hinges on the rational design of surface wettability to maintain device integrity and achieve improved sensing performance under complex hydrated conditions. Various bio-inspired strategies have been implemented to engineer the desired surface wettability for varying hydrated conditions. Although bodily fluids can negatively affect device performance, they also provide a rich reservoir of health-relevant information and sustained energy for next-generation stretchable self-powered devices. As a result, the design and manipulation of surface wettability are critical to effectively control liquid behavior on the device surface for enhanced performance. Sensors and devices with engineered surface wettability can collect and analyze health biomarkers while being minimally affected by bodily fluids or ambient humid environments. Energy harvesters also benefit from surface wettability design, achieving enhanced performance for powering on-body electronics. In this review, we first summarize the approaches commonly used to tune surface wettability for target applications toward stretchable self-powered devices. Considering the existing challenges, we also discuss a subset of the opportunities for future development that could lead to a new class of skin-interfaced devices for use in digital health and personalized medicine.
Collapse
Affiliation(s)
- Xiufeng Wang
- School of Materials Science and Engineering, Xiangtan University, Xiangtan, Hunan 411105, China
| | - Yangchengyi Liu
- School of Materials Science and Engineering, Xiangtan University, Xiangtan, Hunan 411105, China
| | - Huanyu Cheng
- Department of Engineering Science and Mechanics, The Pennsylvania State University, University Park, PA 16802, USA
| | - Xiaoping Ouyang
- School of Materials Science and Engineering, Xiangtan University, Xiangtan, Hunan 411105, China
| |
Collapse
|