1. Zhang F, Li C, Li Z, Dong L, Zhao J. Recent progress in three-terminal artificial synapses based on 2D materials: from mechanisms to applications. Microsyst Nanoeng 2023; 9:16. [PMID: 36817330] [PMCID: PMC9935897] [DOI: 10.1038/s41378-023-00487-2]
Abstract
Synapses are essential for the transmission of neural signals. Synaptic plasticity allows for changes in synaptic strength, enabling the brain to learn from experience. With the rapid development of neuromorphic electronics, tremendous efforts have been devoted to designing and fabricating electronic devices that can mimic synapse operating modes. This growing interest in the field will provide unprecedented opportunities for new hardware architectures for artificial intelligence. In this review, we focus on three-terminal artificial synapses based on two-dimensional (2D) materials regulated by electrical, optical, and mechanical stimulation. In addition, we systematically summarize artificial synapse applications in various sensory systems, including bioplastic bionics, logical transformation, associative learning, image recognition, and multimodal pattern recognition. Finally, the current challenges and future perspectives involving integration, power consumption, and functionality are outlined.
Affiliation(s)
- Fanqing Zhang
  - School of Mechatronical Engineering, Beijing Institute of Technology, 100081 Beijing, China
  - Beijing Advanced Innovation Center for Intelligent Robots and Systems, Beijing Institute of Technology, 100081 Beijing, China
- Chunyang Li
  - School of Mechatronical Engineering, Beijing Institute of Technology, 100081 Beijing, China
  - Beijing Advanced Innovation Center for Intelligent Robots and Systems, Beijing Institute of Technology, 100081 Beijing, China
- Zhongyi Li
  - School of Mechatronical Engineering, Beijing Institute of Technology, 100081 Beijing, China
  - Beijing Advanced Innovation Center for Intelligent Robots and Systems, Beijing Institute of Technology, 100081 Beijing, China
- Lixin Dong
  - Department of Biomedical Engineering, City University of Hong Kong, Kowloon Tong, 999077 Hong Kong, China
- Jing Zhao
  - School of Mechatronical Engineering, Beijing Institute of Technology, 100081 Beijing, China
  - Beijing Advanced Innovation Center for Intelligent Robots and Systems, Beijing Institute of Technology, 100081 Beijing, China
2. Elliott T. Mean First Passage Memory Lifetimes by Reducing Complex Synapses to Simple Synapses. Neural Comput 2017; 29:1468-1527. [PMID: 28333590] [DOI: 10.1162/neco_a_00956]
Abstract
Memory models that store new memories by forgetting old ones have memory lifetimes that are rather short and grow only logarithmically in the number of synapses. Attempts to overcome these deficits include "complex" models of synaptic plasticity in which synapses possess internal states governing the expression of synaptic plasticity. Integrate-and-express, filter-based models of synaptic plasticity propose that synapses act as low-pass filters, integrating plasticity induction signals before expressing synaptic plasticity. Such mechanisms enhance memory lifetimes, leading to an initial rise in the memory signal that is in radical contrast to other related, but nonintegrative, memory models. Because of the complexity of models with internal synaptic states, however, their dynamics can be more difficult to extract compared to "simple" models that lack internal states. Here, we show that by focusing only on processes that lead to changes in synaptic strength, we can integrate out internal synaptic states and effectively reduce complex synapses to simple synapses. For binary-strength synapses, these simplified dynamics then allow us to work directly with the transitions in perceptron activation induced by memory storage rather than with the underlying transitions in synaptic configurations. This permits us to write down master and Fokker-Planck equations that may be simplified under certain, well-defined approximations. These methods allow us to see that memory based on synaptic filters can be viewed as an initial transient that leads to memory signal rise, followed by the emergence of Ornstein-Uhlenbeck-like dynamics that return the system to equilibrium. We may use this approach to compute mean-first-passage-time-defined memory lifetimes for complex models of memory storage.
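The master-equation and mean-first-passage-time (MFPT) machinery this abstract invokes can be illustrated on the simplest possible case; this is a generic sketch under assumed dynamics, not the paper's reduced synapse model:

```python
# Illustrative sketch (not the paper's model): the MFPT formalism on the
# simplest case, an unbiased +/-1 random walk on states 0..N with
# absorbing boundaries at 0 and N. The first-step (master-equation)
# relation is
#   T(k) = 1 + 0.5*T(k-1) + 0.5*T(k+1),  with T(0) = T(N) = 0,
# whose exact solution is the classical T(k) = k*(N-k).
def mfpt_unbiased_walk(N, iters=5000):
    """Solve the MFPT recursion by fixed-point (Jacobi) iteration."""
    T = [0.0] * (N + 1)
    for _ in range(iters):
        T = [0.0] + [1.0 + 0.5 * (T[k - 1] + T[k + 1])
                     for k in range(1, N)] + [0.0]
    return T

T = mfpt_unbiased_walk(10)
print(round(T[5], 6))  # -> 25.0, matching k*(N-k) = 5*5
```

The same first-step logic, applied to transitions in perceptron activation rather than walk positions, is what yields the memory lifetimes discussed above.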
Affiliation(s)
- Terry Elliott
  - Department of Electronics and Computer Science, University of Southampton, Highfield, Southampton, SO17 1BJ, U.K.
3. Brea J, Gerstner W. Does computational neuroscience need new synaptic learning paradigms? Curr Opin Behav Sci 2016. [DOI: 10.1016/j.cobeha.2016.05.012]
4. Elliott T. The Enhanced Rise and Delayed Fall of Memory in a Model of Synaptic Integration: Extension to Discrete State Synapses. Neural Comput 2016; 28:1927-1984. [PMID: 27391686] [DOI: 10.1162/neco_a_00867]
Abstract
Integrate-and-express models of synaptic plasticity propose that synapses may act as low-pass filters, integrating synaptic plasticity induction signals in order to discern trends before expressing synaptic plasticity. We have previously shown that synaptic filtering strongly controls destabilizing fluctuations in developmental models. When applied to palimpsest memory systems that learn new memories by forgetting old ones, we have also shown that with binary-strength synapses, integrative synapses lead to an initial memory signal rise before its fall back to equilibrium. Such an initial rise is in dramatic contrast to nonintegrative synapses, in which the memory signal falls monotonically. We now extend our earlier analysis of palimpsest memories with synaptic filters to consider the more general case of discrete state, multilevel synapses. We derive exact results for the memory signal dynamics and then consider various simplifying approximations. We show that multilevel synapses enhance the initial rise in the memory signal and then delay its subsequent fall by inducing a plateau-like region in the memory signal. Such dynamics significantly increase memory lifetimes, defined by a signal-to-noise ratio (SNR). We derive expressions for optimal choices of synaptic parameters (filter size, number of strength states, number of synapses) that maximize SNR memory lifetimes. However, we find that with memory lifetimes defined via mean-first-passage times, such optimality conditions do not exist, suggesting that optimality may be an artifact of SNRs.
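The SNR-based lifetime definition used here can be made concrete with a toy signal model; the exponentially decaying signal over a constant noise floor is an assumed simplification for illustration, not the paper's actual signal dynamics:

```python
import math

# Toy version of the SNR memory-lifetime definition: the lifetime is the
# first time the signal-to-noise ratio S(t)/sigma drops to 1, assuming
# (for illustration only) a signal S(t) = S0*exp(-t/tau) decaying over a
# constant noise floor sigma.
def snr_lifetime(S0, sigma, tau):
    """Solve S0*exp(-t/tau)/sigma == 1 for t (requires S0 > sigma)."""
    return tau * math.log(S0 / sigma)

# Doubling the initial signal adds only tau*ln(2) to the lifetime, i.e.
# lifetimes grow logarithmically in the initial signal strength.
print(round(snr_lifetime(10.0, 1.0, 5.0), 2))  # -> 11.51
```

The plateau-like region described in the abstract effectively holds S(t) high for longer before the decay sets in, which is why multilevel synapses extend SNR lifetimes under this definition.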
Affiliation(s)
- Terry Elliott
  - Department of Electronics and Computer Science, University of Southampton, Highfield, Southampton, SO17 1BJ, U.K.
5. Elliott T, Lagogiannis K. The Rise and Fall of Memory in a Model of Synaptic Integration. Neural Comput 2012; 24:2604-2654. [DOI: 10.1162/neco_a_00335]
Abstract
Plasticity-inducing stimuli must typically be presented many times before synaptic plasticity is expressed, perhaps because induction signals gradually accumulate before overt strength changes occur. We consider memory dynamics in a mathematical model with synapses that integrate plasticity induction signals before expressing plasticity. We find that the memory trace initially rises, reaching a maximum before falling. The memory signal dissociates into separate oblivescence and reminiscence components, with reminiscence initially dominating recall. In radical contrast, related but nonintegrative models exhibit only a highly problematic oblivescence. Synaptic integration mechanisms possess natural timescales, depending on the statistics of the induction signals. Together with neuromodulation, these timescales may therefore also begin to provide a natural account of the well-known spacing effect in the transition to late-phase plasticity. Finally, we propose experiments that could distinguish between integrative and nonintegrative synapses. Such experiments should further elucidate the synaptic signal processing mechanisms postulated by our model.
Affiliation(s)
- Terry Elliott
  - Department of Electronics and Computer Science, University of Southampton, Highfield, Southampton, SO17 1BJ, U.K.
- Konstantinos Lagogiannis
  - Department of Electronics and Computer Science, University of Southampton, Highfield, Southampton, SO17 1BJ, U.K.
6. Elliott T. The Mean Time to Express Synaptic Plasticity in Integrate-and-Express, Stochastic Models of Synaptic Plasticity Induction. Neural Comput 2011; 23:124-159. [DOI: 10.1162/neco_a_00061]
Abstract
Stochastic models of synaptic plasticity propose that single synapses perform a directed random walk of fixed step sizes in synaptic strength, thereby embracing the view that the mechanisms of synaptic plasticity constitute a stochastic dynamical system. However, fluctuations in synaptic strength present a formidable challenge to such an approach. We have previously proposed that single synapses must interpose an integration and filtering mechanism between the induction of synaptic plasticity and the expression of synaptic plasticity in order to control fluctuations. We analyze a class of three such mechanisms in the presence of possibly non-Markovian plasticity induction processes, deriving expressions for the mean expression time in these models. One of these filtering mechanisms constitutes a discrete low-pass filter that could be implemented on a small collection of molecules at single synapses, such as CaMKII, and we analyze this discrete filter in some detail. After considering Markov induction processes, we examine our own stochastic model of spike-timing-dependent plasticity, for which the probability density functions of the induction of plasticity steps have previously been derived. We determine the dependence of the mean time to express a plasticity step on pre- and postsynaptic firing rates in this model, and we also consider, numerically, the long-term stability against fluctuations of patterns of neuronal connectivity that typically emerge during neuronal development.
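The discrete low-pass filter described in this abstract can be rendered as a minimal sketch; the threshold, step size, and reset rule below are illustrative choices, not the paper's calibrated model:

```python
# Minimal integrate-and-express synapse (illustrative parameters):
# induction signals (+1 potentiating, -1 depressing) accumulate in a
# discrete filter, and a fixed-size strength step is expressed only when
# the accumulator reaches the threshold +/-T, after which it resets.
class FilterSynapse:
    def __init__(self, threshold=4, step=0.1, strength=0.5):
        self.T = threshold       # filter size (assumed value)
        self.step = step         # fixed plasticity step size
        self.strength = strength
        self.acc = 0             # internal filter state

    def induce(self, signal):
        """Integrate one induction signal; express plasticity at threshold."""
        self.acc += signal
        if self.acc >= self.T:       # sustained potentiating trend
            self.strength += self.step
            self.acc = 0
        elif self.acc <= -self.T:    # sustained depressing trend
            self.strength -= self.step
            self.acc = 0
        return self.strength

syn = FilterSynapse()
# Eight potentiating signals against threshold 4 -> exactly two steps.
for _ in range(8):
    syn.induce(+1)
print(round(syn.strength, 2))  # -> 0.7
```

Isolated or conflicting induction signals never reach the threshold, so the filter suppresses them; only a sustained trend is expressed as a strength change, which is the fluctuation-control role the abstract describes.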
Affiliation(s)
- Terry Elliott
  - Department of Electronics and Computer Science, University of Southampton, Highfield, Southampton, SO17 1BJ, U.K.
7. Elliott T. Stability against fluctuations: scaling, bifurcations, and spontaneous symmetry breaking in stochastic models of synaptic plasticity. Neural Comput 2011; 23:674-734. [PMID: 21162665] [DOI: 10.1162/neco_a_00088]
Abstract
In stochastic models of synaptic plasticity based on a random walk, the control of fluctuations is imperative. We have argued that synapses could act as low-pass filters, filtering plasticity induction steps before expressing a step change in synaptic strength. Earlier work showed, in simulation, that such a synaptic filter tames fluctuations very well, leading to patterns of synaptic connectivity that are stable for long periods of time. Here, we approach this problem analytically. We explicitly calculate the lifetime of meta-stable states of synaptic connectivity using a Fokker-Planck formalism in order to understand the dependence of this lifetime on both the plasticity step size and the filtering mechanism. We find that our analytical results agree very well with simulation results, despite having to make two approximations. Our analysis reveals, however, a deeper significance to the filtering mechanism and the plasticity step size. We show that a filter scales the step size into a smaller, effective step size. This scaling suggests that the step size may itself play the role of a temperature parameter, so that a filter cools the dynamics, thereby reducing the influence of fluctuations. Using the master equation, we explicitly demonstrate a bifurcation at a critical step size, confirming this interpretation. At this critical point, spontaneous symmetry breaking occurs in the class of stochastic models of synaptic plasticity that we consider.
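The step-size scaling claimed here can be checked in a quick simulation; this is an illustrative experiment under assumed unbiased induction statistics, not an analysis from the paper. With a size-T threshold filter, an unbiased walk takes on average T^2 inductions to reach the threshold, so strength diffuses as if the step size were scaled down by roughly 1/T:

```python
import random

# Illustrative experiment: compare strength fluctuations with and
# without a size-T threshold filter under unbiased +/-1 induction
# signals. The filtered walk's variance is reduced by ~1/T**2, i.e. the
# filter scales the effective step size by ~1/T.
def final_strength(n_inductions, T, rng):
    """Strength after n unbiased inductions; T=1 means no filtering."""
    strength, acc = 0.0, 0
    for _ in range(n_inductions):
        acc += rng.choice((-1, 1))
        if abs(acc) >= T:                          # threshold reached
            strength += 1.0 if acc > 0 else -1.0   # fixed +/-1 step
            acc = 0
    return strength

rng = random.Random(1)
trials, n, T = 2000, 320, 4
var_plain = sum(final_strength(n, 1, rng) ** 2 for _ in range(trials)) / trials
var_filt = sum(final_strength(n, T, rng) ** 2 for _ in range(trials)) / trials
# var_plain is close to n = 320; var_filt is close to n / T**2 = 20.
```

Read through the temperature analogy in the abstract, the filter "cools" the dynamics: the same induction noise produces far smaller strength fluctuations.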
Affiliation(s)
- Terry Elliott
  - Department of Electronics and Computer Science, University of Southampton, Highfield, Southampton, SO17 1BJ, U.K.
8. Elliott T. A Non-Markovian Random Walk Underlies a Stochastic Model of Spike-Timing-Dependent Plasticity. Neural Comput 2010; 22:1180-1230. [DOI: 10.1162/neco.2009.06-09-1038]
Abstract
A stochastic model of spike-timing-dependent plasticity (STDP) proposes that spike timing influences the probability but not the amplitude of synaptic strength change at single synapses. The classic, biphasic STDP profile emerges as a spatial average over many synapses presented with a single spike pair or as a temporal average over a single synapse presented with many spike pairs. We have previously shown that the model accounts for a variety of experimental data, including spike triplet results, and has a number of desirable theoretical properties, including being entirely self-stabilizing in all regions of parameter space. Our earlier analyses of the model have employed cumbersome spike-to-spike averaging arguments to derive results. Here, we show that the model can be reformulated as a non-Markovian random walk in synaptic strength, the step sizes being fixed as postulated. This change of perspective greatly simplifies earlier calculations by integrating out the proposed switch mechanism by which changes in strength are driven and instead concentrating on the changes in strength themselves. Moreover, this change of viewpoint is generative, facilitating further calculations that would be intractable, if not impossible, with earlier approaches. We prepare the machinery here for these later calculations but also briefly indicate how this machinery may be used by considering two particular applications.
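The model's core postulate, that spike timing sets the probability but not the amplitude of a strength change, can be sketched as follows; the exponential probability profile and parameter values are assumptions for illustration, not the paper's switch mechanism:

```python
import math, random

# Toy rendering of probabilistic, fixed-amplitude STDP: the pre/post
# spike-time difference dt determines only the PROBABILITY of a change;
# the expressed step amplitude is fixed. The exp(-|dt|/TAU) profile and
# the constants below are illustrative assumptions.
STEP = 0.05   # fixed plasticity step size
TAU = 20.0    # assumed timing-sensitivity constant (ms)

def stdp_update(strength, dt, rng):
    """dt > 0: pre-before-post (candidate potentiation);
       dt < 0: post-before-pre (candidate depression)."""
    if rng.random() < math.exp(-abs(dt) / TAU):   # change probability
        strength += STEP if dt > 0 else -STEP     # all-or-none step
    return strength

# Averaging over many pairings at dt = +10 ms yields a graded mean
# change near STEP * exp(-10/TAU), even though each individual synapse
# only ever takes all-or-none fixed steps.
rng = random.Random(0)
mean_change = sum(stdp_update(0.0, 10.0, rng) for _ in range(10000)) / 10000
```

This is the sense in which the biphasic STDP profile emerges only as a spatial or temporal average, while each single synapse performs a random walk of fixed step sizes.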
Affiliation(s)
- Terry Elliott
  - Department of Electronics and Computer Science, University of Southampton, Highfield, Southampton, SO17 1BJ, U.K.