1
Brückner DB, Broedersz CP. Learning dynamical models of single and collective cell migration: a review. Rep Prog Phys 2024; 87:056601. [PMID: 38518358 DOI: 10.1088/1361-6633/ad36d2]
Abstract
Single and collective cell migration are fundamental processes critical for physiological phenomena ranging from embryonic development and immune response to wound healing and cancer metastasis. To understand cell migration from a physical perspective, a broad variety of models for the underlying physical mechanisms that govern cell motility have been developed. A key challenge in the development of such models is how to connect them to experimental observations, which often exhibit complex stochastic behaviours. In this review, we discuss recent advances in data-driven theoretical approaches that directly connect with experimental data to infer dynamical models of stochastic cell migration. Leveraging advances in nanofabrication, image analysis, and tracking technology, experimental studies now provide unprecedented large datasets on cellular dynamics. In parallel, theoretical efforts have been directed towards integrating such datasets into physical models from the single cell to the tissue scale with the aim of conceptualising the emergent behaviour of cells. We first review how this inference problem has been addressed in both freely migrating and confined cells. Next, we discuss why these dynamics typically take the form of underdamped stochastic equations of motion, and how such equations can be inferred from data. We then review applications of data-driven inference and machine learning approaches to heterogeneity in cell behaviour, subcellular degrees of freedom, and to the collective dynamics of multicellular systems. Across these applications, we emphasise how data-driven methods can be integrated with physical active matter models of migrating cells, and help reveal how underlying molecular mechanisms control cell behaviour. Together, these data-driven approaches are a promising avenue for building physical models of cell migration directly from experimental data, and for providing conceptual links between different length-scales of description.
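In the simplest fully observed setting, the underdamped stochastic equations of motion discussed in the review can be inferred by regressing observed accelerations on position and velocity. A minimal numpy sketch on synthetic 1D data; the linear form F(x, v) = -k·x - γ·v and all parameter values are illustrative assumptions, not a model from the review:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a 1D underdamped stochastic equation of motion,
#   dx/dt = v,   dv/dt = -k*x - gamma*v + sigma*xi(t),
# as a stand-in for a tracked cell trajectory (parameters are hypothetical).
k, gamma, sigma, dt, n = 1.0, 0.5, 0.3, 0.01, 200_000
noise = sigma * np.sqrt(dt) * rng.standard_normal(n - 1)
x, v = np.empty(n), np.empty(n)
x[0], v[0] = 1.0, 0.0
for t in range(n - 1):
    x[t + 1] = x[t] + v[t] * dt
    v[t + 1] = v[t] + (-k * x[t] - gamma * v[t]) * dt + noise[t]

# Infer F(x, v) by least-squares regression of the observed acceleration
# dv/dt on (x, v); the fitted coefficients recover -k and -gamma.
a = np.diff(v) / dt
A = np.column_stack([x[:-1], v[:-1]])
coef, *_ = np.linalg.lstsq(A, a, rcond=None)
k_hat, gamma_hat = -coef[0], -coef[1]
print(k_hat, gamma_hat)  # close to the true k = 1.0 and gamma = 0.5
```

In practice F(x, v) is typically estimated nonparametrically (e.g. on a grid of position-velocity bins), and the inference must correct for measurement noise and time discretization; the review surveys these refinements.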
Affiliation(s)
- David B Brückner
- Institute of Science and Technology Austria, Am Campus 1, 3400 Klosterneuburg, Austria
- Chase P Broedersz
- Department of Physics and Astronomy, Vrije Universiteit Amsterdam, 1081 HV Amsterdam, The Netherlands
- Arnold Sommerfeld Center for Theoretical Physics and Center for NanoScience, Department of Physics, Ludwig-Maximilian-University Munich, Theresienstr. 37, D-80333 Munich, Germany
2
Bourantas C, Torii R, Karabasov S, Krams R. Editorial: Computational modelling of cardiovascular hemodynamics and machine learning. Front Cardiovasc Med 2024; 11:1355843. [PMID: 38455721 PMCID: PMC10917996 DOI: 10.3389/fcvm.2024.1355843]
Affiliation(s)
- Christos Bourantas
- Department of Cardiology, Bart’s Heart Centre, Barts Health NHS Trust, London, United Kingdom
- Device and Innovation Centre, William Harvey Research Institute, Queen Mary University, London, United Kingdom
- Ryo Torii
- Department of Mechanical Engineering, University College London, United Kingdom
- Sergey Karabasov
- School for Science and Engineering, Queen Mary University, London, United Kingdom
- Rob Krams
- School for Science and Engineering, Queen Mary University, London, United Kingdom
3
Wang JH, Tsin D, Engel TA. Predictive variational autoencoder for learning robust representations of time-series data. arXiv 2023; arXiv:2312.06932v1. [PMID: 38168462 PMCID: PMC10760197]
Abstract
Variational autoencoders (VAEs) have been used extensively to discover low-dimensional latent factors governing neural activity and animal behavior. However, without careful model selection, the uncovered latent factors may reflect noise in the data rather than true underlying features, rendering such representations unsuitable for scientific interpretation. Existing solutions to this problem involve introducing additional measured variables or data augmentations specific to a particular data type. We propose a VAE architecture that predicts the next point in time and show that it mitigates the learning of spurious features. In addition, we introduce a model selection metric based on smoothness over time in the latent space. We show that, together, these two constraints encourage VAEs to be smooth over time, producing robust latent representations that faithfully recover latent factors on synthetic datasets.
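The smoothness-over-time idea can be illustrated with a toy score on latent trajectories. The specific formula below (ratio of step-to-step variance to total variance) is a hypothetical stand-in for the paper's metric, not its exact definition:

```python
import numpy as np

def smoothness_score(z):
    """Smoothness of a latent trajectory z with shape (time, dims):
    variance of one-step differences relative to total variance.
    Lower values mean smoother, more slowly varying latents.
    (Illustrative formula, not the paper's exact metric.)"""
    dz = np.diff(z, axis=0)
    return dz.var() / z.var()

rng = np.random.default_rng(1)
noisy = rng.standard_normal((500, 2))                  # white-noise latents
smooth = np.cumsum(rng.standard_normal((500, 2)), 0)   # slowly drifting latents
print(smoothness_score(noisy), smoothness_score(smooth))
```

A model-selection loop would compute such a score for each candidate VAE and prefer the one whose latents vary most smoothly in time.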
Affiliation(s)
- Julia H Wang
- School of Biological Sciences, Cold Spring Harbor Laboratory, Cold Spring Harbor, New York, USA
- Dexter Tsin
- Princeton Neuroscience Institute, Princeton University, Princeton, New Jersey, USA
- Tatiana A Engel
- Princeton Neuroscience Institute, Princeton University, Princeton, New Jersey, USA
4
Genkin M, Shenoy KV, Chandrasekaran C, Engel TA. The dynamics and geometry of choice in premotor cortex. bioRxiv 2023; 2023.07.22.550183. [PMID: 37546748 PMCID: PMC10401920 DOI: 10.1101/2023.07.22.550183]
Abstract
The brain represents sensory variables in the coordinated activity of neural populations, in which tuning curves of single neurons define the geometry of the population code. Whether the same coding principle holds for dynamic cognitive variables remains unknown because internal cognitive processes unfold with a unique time course on single trials observed only in the irregular spiking of heterogeneous neural populations. Here we show the existence of such a population code for the dynamics of choice formation in the primate premotor cortex. We developed an approach to simultaneously infer population dynamics and tuning functions of single neurons to the population state. Applied to spike data recorded during decision-making, our model revealed that populations of neurons encoded the same dynamic variable predicting choices, and heterogeneous firing rates resulted from the diverse tuning of single neurons to this decision variable. The inferred dynamics indicated an attractor mechanism for decision computation. Our results reveal a common geometric principle for neural encoding of sensory and dynamic cognitive variables.
Affiliation(s)
- Krishna V Shenoy
- Howard Hughes Medical Institute, Stanford University, Stanford, CA
- Department of Electrical Engineering, Stanford University, Stanford, CA
- Chandramouli Chandrasekaran
- Department of Anatomy & Neurobiology, Boston University, Boston, MA
- Department of Psychological and Brain Sciences, Boston University, Boston, MA
- Center for Systems Neuroscience, Boston University, Boston, MA
- Tatiana A Engel
- Cold Spring Harbor Laboratory, Cold Spring Harbor, NY
- Princeton Neuroscience Institute, Princeton University, Princeton, NJ
5
Langdon C, Genkin M, Engel TA. A unifying perspective on neural manifolds and circuits for cognition. Nat Rev Neurosci 2023; 24:363-377. [PMID: 37055616 PMCID: PMC11058347 DOI: 10.1038/s41583-023-00693-x]
Abstract
Two different perspectives have informed efforts to explain the link between the brain and behaviour. One approach seeks to identify neural circuit elements that carry out specific functions, emphasizing connectivity between neurons as a substrate for neural computations. Another approach centres on neural manifolds - low-dimensional representations of behavioural signals in neural population activity - and suggests that neural computations are realized by emergent dynamics. Although manifolds reveal an interpretable structure in heterogeneous neuronal activity, finding the corresponding structure in connectivity remains a challenge. We highlight examples in which establishing the correspondence between low-dimensional activity and connectivity has been possible, unifying the neural manifold and circuit perspectives. This relationship is conspicuous in systems in which the geometry of neural responses mirrors their spatial layout in the brain, such as the fly navigational system. Furthermore, we describe evidence that, in systems in which neural responses are heterogeneous, the circuit comprises interactions between activity patterns on the manifold via low-rank connectivity. We suggest that unifying the manifold and circuit approaches is important if we are to be able to causally test theories about the neural computations that underlie behaviour.
Affiliation(s)
- Christopher Langdon
- Princeton Neuroscience Institute, Princeton University, Princeton, NJ, USA
- Cold Spring Harbor Laboratory, Cold Spring Harbor, NY, USA
- Mikhail Genkin
- Cold Spring Harbor Laboratory, Cold Spring Harbor, NY, USA
- Tatiana A Engel
- Princeton Neuroscience Institute, Princeton University, Princeton, NJ, USA
- Cold Spring Harbor Laboratory, Cold Spring Harbor, NY, USA
6
Chen F, Li C. Inferring structural and dynamical properties of gene networks from data with deep learning. NAR Genom Bioinform 2022; 4:lqac068. [PMID: 36110897 PMCID: PMC9469930 DOI: 10.1093/nargab/lqac068]
Abstract
The reconstruction of gene regulatory networks (GRNs) from data is vital in systems biology. Although different approaches have been proposed to infer causality from data, some challenges remain, such as how to accurately infer the direction and type of interactions, how to deal with complex networks involving multiple feedbacks, and how to infer causality between variables from real-world data, especially single-cell data. Here, we tackle these problems with deep neural networks (DNNs). The underlying regulatory network for different systems (gene regulation, ecology, disease, development) can be successfully reconstructed from trained DNN models. We show that DNNs are superior to existing approaches, including Boolean networks, random forests, and partial cross mapping, for network inference. Further, by interrogating from a dynamical-systems perspective the ensemble DNN model trained on single-cell data, we are able to unravel complex cell fate dynamics during preimplantation development. We also propose a data-driven approach to quantify the energy landscape of gene regulatory systems by combining DNNs with the partial self-consistent mean field approximation (PSCA) approach. We anticipate that the proposed method can be applied to other fields to decipher the underlying dynamical mechanisms of systems from data.
Affiliation(s)
- Feng Chen
- Institute of Science and Technology for Brain-Inspired Intelligence, Fudan University, Shanghai 200433, China
- Shanghai Center for Mathematical Sciences, Fudan University, Shanghai 200433, China
- Chunhe Li
- Institute of Science and Technology for Brain-Inspired Intelligence, Fudan University, Shanghai 200433, China
- Shanghai Center for Mathematical Sciences, Fudan University, Shanghai 200433, China
- School of Mathematical Sciences, Fudan University, Shanghai 200433, China
7
Mazzucato L. Neural mechanisms underlying the temporal organization of naturalistic animal behavior. eLife 2022; 11:e76577. [PMID: 35792884 PMCID: PMC9259028 DOI: 10.7554/elife.76577]
Abstract
Naturalistic animal behavior exhibits a strikingly complex organization in the temporal domain, with variability arising from at least three sources: hierarchical, contextual, and stochastic. What neural mechanisms and computational principles underlie such intricate temporal features? In this review, we provide a critical assessment of the existing behavioral and neurophysiological evidence for these sources of temporal variability in naturalistic behavior. Recent research converges on an emergent mechanistic theory of temporal variability based on attractor neural networks and metastable dynamics, arising via coordinated interactions between mesoscopic neural circuits. We highlight the crucial role played by structural heterogeneities as well as noise from mesoscopic feedback loops in regulating flexible behavior. We assess the shortcomings and missing links in the current theoretical and experimental literature and propose new directions of investigation to fill these gaps.
Affiliation(s)
- Luca Mazzucato
- Institute of Neuroscience, Departments of Biology, Mathematics and Physics, University of Oregon
8
A flexible Bayesian framework for unbiased estimation of timescales. Nat Comput Sci 2022; 2:193-204. [PMID: 36644291 PMCID: PMC9835171 DOI: 10.1038/s43588-022-00214-3]
Abstract
Timescales characterize the pace of change for many dynamic processes in nature. Timescales are usually estimated by fitting the exponential decay of data autocorrelation in the time or frequency domain. Here we show that this standard procedure often fails to recover the correct timescales due to a statistical bias arising from the finite sample size. We develop an alternative approach which estimates timescales by fitting the sample autocorrelation or power spectrum with a generative model based on a mixture of Ornstein-Uhlenbeck (OU) processes using adaptive approximate Bayesian computations (aABC). Our method accounts for finite sample size and noise in data and returns a posterior distribution of timescales that quantifies the estimation uncertainty and can be used for model selection. We demonstrate the accuracy of our method on synthetic data and illustrate its application to recordings from primate cortex. We provide a customizable Python package implementing our framework with different generative models suitable for diverse applications.
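The finite-sample bias described in the abstract can be reproduced in a few lines: fitting an exponential to the sample autocorrelation of short trajectories systematically underestimates the true timescale. A minimal numpy sketch, where the trajectory length, lag range, and log-linear fit are illustrative choices rather than the paper's procedure:

```python
import numpy as np

rng = np.random.default_rng(2)
tau_true, n, trials = 20.0, 100, 200
alpha = np.exp(-1.0 / tau_true)  # AR(1), a discretized OU process

tau_hats = []
for _ in range(trials):
    # simulate one short unit-variance OU trajectory
    x = np.empty(n)
    x[0] = rng.standard_normal()
    for t in range(n - 1):
        x[t + 1] = alpha * x[t] + np.sqrt(1 - alpha**2) * rng.standard_normal()
    # sample autocorrelation at lags 1..5, then a log-linear exponential fit
    xc = x - x.mean()
    ac = np.array([np.dot(xc[:-k], xc[k:]) / np.dot(xc, xc) for k in range(1, 6)])
    if np.all(ac > 0):
        slope = np.polyfit(np.arange(1, 6), np.log(ac), 1)[0]
        if slope < 0:
            tau_hats.append(-1.0 / slope)

print(np.mean(tau_hats))  # systematically below the true tau = 20
```

The paper's remedy is to fit the sample autocorrelation (or power spectrum) with a generative OU-mixture model via adaptive approximate Bayesian computation, rather than with a plain exponential.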
9
Dynamics on the manifold: Identifying computational dynamical activity from neural population recordings. Curr Opin Neurobiol 2021; 70:163-170. [PMID: 34837752 DOI: 10.1016/j.conb.2021.10.014]
Abstract
The question of how the collective activity of neural populations gives rise to complex behaviour is fundamental to neuroscience. At the core of this question lie considerations about how neural circuits can perform computations that enable sensory perception, decision making, and motor control. It is thought that such computations are implemented through the dynamical evolution of distributed activity in recurrent circuits. Thus, identifying dynamical structure in neural population activity is a key challenge towards a better understanding of neural computation. At the same time, interpreting this structure in light of the computation of interest is essential for linking the time-varying activity patterns of the neural population to ongoing computational processes. Here, we review methods that aim to quantify structure in neural population recordings through a dynamical system defined in a low-dimensional latent variable space. We discuss advantages and limitations of different modelling approaches and address future challenges for the field.
10
Genkin M, Hughes O, Engel TA. Learning non-stationary Langevin dynamics from stochastic observations of latent trajectories. Nat Commun 2021; 12:5986. [PMID: 34645828 PMCID: PMC8514604 DOI: 10.1038/s41467-021-26202-1]
Abstract
Many complex systems operating far from equilibrium exhibit stochastic dynamics that can be described by a Langevin equation. Inferring Langevin equations from data can reveal how the transient dynamics of such systems give rise to their function. However, dynamics are often inaccessible directly and can only be gleaned through a stochastic observation process, which makes the inference challenging. Here we present a non-parametric framework for inferring the Langevin equation, which explicitly models the stochastic observation process and non-stationary latent dynamics. The framework accounts for the non-equilibrium initial and final states of the observed system and for the possibility that the system's dynamics define the duration of observations. Omitting any of these non-stationary components results in incorrect inference, in which erroneous features arise in the dynamics due to the non-stationary data distribution. We illustrate the framework using models of neural dynamics underlying decision making in the brain.
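As a baseline for what Langevin inference involves, the drift of a stationary, fully observed 1D process can be estimated as the conditional mean increment per position bin (the first Kramers-Moyal coefficient); the paper's contribution is the harder case of latent, non-stationary dynamics seen only through a stochastic observation process. A hypothetical numpy sketch on a simulated Ornstein-Uhlenbeck process (all parameters are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulate a stationary 1D Langevin equation dx/dt = -theta*x + sigma*xi(t)
theta, sigma, dt, n = 1.0, 0.5, 0.01, 500_000
noise = sigma * np.sqrt(dt) * rng.standard_normal(n - 1)
x = np.empty(n)
x[0] = 0.0
for t in range(n - 1):
    x[t + 1] = x[t] * (1.0 - theta * dt) + noise[t]

# Nonparametric drift estimate: F(x) ~ E[dx | x] / dt in each position bin
bins = np.linspace(-1.0, 1.0, 21)
centers = 0.5 * (bins[:-1] + bins[1:])
idx = np.digitize(x[:-1], bins) - 1
dx = np.diff(x)
drift = np.array([dx[idx == i].mean() / dt for i in range(20)])

# The binned drift should be approximately linear with slope -theta
slope = np.polyfit(centers, drift, 1)[0]
print(slope)  # close to -theta = -1.0
```

When the trajectory is instead a latent variable observed through noise, and the data are non-stationary, this simple binning fails, which is the regime the paper's framework addresses.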
Affiliation(s)
- Mikhail Genkin
- Cold Spring Harbor Laboratory, Cold Spring Harbor, NY, USA