1
Benedetti M, Ventura E, Marinari E, Ruocco G, Zamponi F. Supervised perceptron learning versus unsupervised Hebbian unlearning: approaching optimal memory retrieval in Hopfield-like networks. J Chem Phys 2022; 156:104107. [DOI: 10.1063/5.0084219]
2
Rubin R, Abbott LF, Sompolinsky H. Balanced excitation and inhibition are required for high-capacity, noise-robust neuronal selectivity. Proc Natl Acad Sci U S A 2017; 114:E9366-E9375. [PMID: 29042519] [DOI: 10.1073/pnas.1705841114]
Abstract
Neurons and networks in the cerebral cortex must operate reliably despite multiple sources of noise. To evaluate the impact of both input and output noise, we determine the robustness of single-neuron stimulus selective responses, as well as the robustness of attractor states of networks of neurons performing memory tasks. We find that robustness to output noise requires synaptic connections to be in a balanced regime in which excitation and inhibition are strong and largely cancel each other. We evaluate the conditions required for this regime to exist and determine the properties of networks operating within it. A plausible synaptic plasticity rule for learning that balances weight configurations is presented. Our theory predicts an optimal ratio of the number of excitatory and inhibitory synapses for maximizing the encoding capacity of balanced networks for given statistics of afferent activations. Previous work has shown that balanced networks amplify spatiotemporal variability and account for observed asynchronous irregular states. Here we present a distinct type of balanced network that amplifies small changes in the impinging signals and emerges automatically from learning to perform neuronal and network functions robustly.
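The interplay of sign constraints and balance described above can be made concrete with a small simulation. The sketch below is a hypothetical illustration, not the paper's plasticity rule: a perceptron with sign-constrained synapses (Dale's law) is trained with a margin perceptron rule, and the weights are projected back onto their allowed signs after each update. All sizes and learning parameters are made up for the demo.

```python
# Hypothetical sketch: margin perceptron learning under sign constraints.
# Excitatory weights are kept >= 0 and inhibitory weights <= 0 by projecting
# after each update; in typical runs the summed excitatory and inhibitory
# drives both grow large while largely cancelling each other.
import numpy as np

rng = np.random.default_rng(0)
n_exc, n_inh, n_patterns = 80, 20, 40
n = n_exc + n_inh
sign = np.r_[np.ones(n_exc), -np.ones(n_inh)]  # fixed synapse signs

X = rng.random((n_patterns, n))                # afferent rates in [0, 1]
y = rng.choice([-1.0, 1.0], n_patterns)        # target selectivity labels

w = 0.01 * sign                                # tiny weights of the right sign
eta, margin = 0.05, 1.0
for _ in range(2000):
    errors = 0
    for mu in range(n_patterns):
        if y[mu] * (w @ X[mu]) < margin:       # perceptron rule with a margin
            w += eta * y[mu] * X[mu]
            # project onto the sign constraint (Dale's law)
            w = np.where(sign > 0, np.maximum(w, 0.0), np.minimum(w, 0.0))
            errors += 1
    if errors == 0:
        break

x_bar = X.mean(axis=0)
print("mean excitatory drive:", round(float(w[:n_exc] @ x_bar[:n_exc]), 2))
print("mean inhibitory drive:", round(float(w[n_exc:] @ x_bar[n_exc:]), 2))
```

In typical runs the two printed drives are of comparable magnitude and opposite sign, the cancellation that characterizes the balanced regime the abstract describes.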
3
Abstract
The brain map project aims to map out the neuron connections of the human brain. Even with all of the wiring mapped out, a global, physical understanding of brain function and behavior remains challenging. Hopfield quantified the learning and memory process of symmetrically connected neural networks globally through an equilibrium energy. The energy basins of attraction represent memories, and the memory-retrieval dynamics are determined by the energy gradient. However, realistic neural networks are asymmetrically connected, and oscillations cannot emerge from symmetric neural networks. Here, we developed a nonequilibrium landscape-flux theory for realistic, asymmetrically connected neural networks. We uncovered the underlying potential landscape and the associated Lyapunov function for quantifying global stability and function. We found that the dynamics and oscillations in human brains responsible for cognitive processes and physiological rhythm regulation are determined not only by the landscape gradient but also by the flux. We found that the flux is closely related to the degree of asymmetry of the connections in neural networks and is the origin of neural oscillations. The neural oscillation landscape shows a closed-ring attractor topology. The landscape gradient attracts the network down to the ring. The flux is responsible for coherent oscillations on the ring. We suggest the flux may provide the driving force for associations among memories. We applied our theory to the rapid-eye-movement sleep cycle. We identified the key regulation factors for function through global sensitivity analysis of landscape topography against wirings, which are in good agreement with experiments.
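The decomposition at the heart of this abstract is easy to demonstrate numerically. In the hypothetical sketch below (network size, gain, and the flux scaling are all illustrative), the connectivity is split into a symmetric part, which behaves like a gradient and admits a Lyapunov function, and an antisymmetric part, which plays the role of the flux: the symmetric network relaxes to a fixed point, while restoring the antisymmetric part can sustain persistent oscillation.

```python
# Hypothetical sketch: the symmetric part of the connectivity is gradient-like
# (admits a Lyapunov function), while the antisymmetric part acts as a
# rotational "flux" that can drive oscillations. Parameters are illustrative.
import numpy as np

rng = np.random.default_rng(1)
n = 50
J = rng.normal(0.0, 1.0 / np.sqrt(n), (n, n))  # random connectivity
S = 0.5 * (J + J.T)                            # symmetric part
A = 0.5 * (J - J.T)                            # antisymmetric (flux) part

def late_time_variability(W, steps=6000, dt=0.05):
    """Integrate rate dynamics dx/dt = -x + W tanh(2x); return late-time std."""
    x = rng.normal(0.0, 0.1, n)
    history = []
    for t in range(steps):
        x = x + dt * (-x + W @ np.tanh(2.0 * x))
        if t >= steps - 1000:
            history.append(x.copy())
    return float(np.std(history, axis=0).mean())

print("symmetric only   :", round(late_time_variability(S), 3))            # typically ~0 (fixed point)
print("symmetric + flux :", round(late_time_variability(S + 1.5 * A), 3))  # typically > 0 (oscillation)
```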
4
5
Rigotti M, Ben Dayan Rubin D, Morrison SE, Salzman CD, Fusi S. Attractor concretion as a mechanism for the formation of context representations. Neuroimage 2010; 52:833-47. [PMID: 20100580] [DOI: 10.1016/j.neuroimage.2010.01.047]
Abstract
Complex tasks often require memory of recent events, knowledge about the context in which they occur, and the goals we intend to reach. All this information is stored in our mental states. Given a set of mental states, reinforcement learning (RL) algorithms predict the optimal policy that maximizes future reward. RL algorithms assign a value to each already-known state so that discovering the optimal policy reduces to selecting the action leading to the state with the highest value. But how does the brain create representations of these mental states in the first place? We propose a mechanism for the creation of mental states that contain information about the temporal statistics of the events in a particular context. We suggest that the mental states are represented by stable patterns of reverberating activity, which are attractors of the neural dynamics. These representations are built from neurons that are selective to specific combinations of external events (e.g. sensory stimuli) and pre-existing mental states. Consistent with this notion, we find that neurons in the amygdala and in orbitofrontal cortex (OFC) often exhibit this form of mixed selectivity. We propose that activating different mixed selectivity neurons in a fixed temporal order modifies synaptic connections so that conjunctions of events and mental states merge into a single pattern of reverberating activity. This process corresponds to the birth of a new, different mental state that encodes a different temporal context. The concretion process depends on temporal contiguity, i.e. on the probability that a combination of an event and mental states follows or precedes the events and states that define a certain context. The information contained in the context thereby allows an animal to unambiguously assign a value to the events that initially appeared in different situations with different meanings.
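The merging step lends itself to a toy model. The sketch below is a deliberately crude caricature, not the authors' network: conjunction patterns are stored Hebbian-style in a Hopfield-type network, plus a temporal-contiguity term (strength lam, a made-up parameter) coupling patterns that occur in sequence, in the spirit of correlated-attractor models. For weak coupling each pattern remains its own attractor; for strong coupling the network cued with one pattern typically relaxes into a merged state with overlaps spread across the sequence, standing in for a single "context" attractor.

```python
# Crude caricature of attractor concretion (hypothetical parameters):
# Hebbian storage plus a symmetric temporal-contiguity term of strength lam.
import numpy as np

rng = np.random.default_rng(2)
n, p = 400, 5
xi = rng.choice([-1.0, 1.0], (p, n))            # conjunctions, in temporal order

for lam in (0.3, 1.5):                          # weak vs strong contiguity
    J = xi.T @ xi / n                           # standard Hebbian term
    for mu in range(p - 1):                     # link temporally adjacent patterns
        J += lam * (np.outer(xi[mu + 1], xi[mu])
                    + np.outer(xi[mu], xi[mu + 1])) / n
    np.fill_diagonal(J, 0.0)

    s = xi[0].copy()                            # cue with the first conjunction
    for _ in range(20):                         # asynchronous sweeps -> fixed point
        for i in rng.permutation(n):
            s[i] = 1.0 if J[i] @ s > 0 else -1.0

    print(f"lam={lam}: overlaps with the sequence:", np.round(xi @ s / n, 2))
```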
Affiliation(s)
- Mattia Rigotti
- Department of Neuroscience, Columbia University College of Physicians and Surgeons, New York, NY 10032-2695, USA
6
7
Chengxiang Z, Dasgupta C, Singh MP. Retrieval properties of a Hopfield model with random asymmetric interactions. Neural Comput 2000; 12:865-80. [PMID: 10770835] [DOI: 10.1162/089976600300015628]
Abstract
The process of pattern retrieval in a Hopfield model in which a random antisymmetric component is added to the otherwise symmetric synaptic matrix is studied by computer simulations. The introduction of the antisymmetric component is found to increase the fraction of random inputs that converge to the memory states. However, the size of the basin of attraction of a memory state does not show any significant change when asymmetry is introduced in the synaptic matrix. We show that this is because the spurious fixed points, which are destabilized by the introduction of asymmetry, have very small basins of attraction. The convergence time to spurious fixed-point attractors increases faster than that for the memory states as the asymmetry parameter is increased. The possibility of convergence to spurious fixed points is greatly reduced if a suitable upper limit is set for the convergence time. This prescription works better if the synaptic matrix has an antisymmetric component.
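This experiment is straightforward to re-create in rough form. In the hedged sketch below (network size, memory load, the asymmetry strength k, and the sweep cutoff are all illustrative), a random antisymmetric matrix is added to the Hebbian couplings, random inputs are relaxed asynchronously, and any run that fails to settle within the cutoff is discarded, which is the prescription the abstract recommends.

```python
# Rough re-creation (illustrative parameters): Hebbian couplings plus a
# random antisymmetric component; a convergence-time cutoff filters out the
# slow flows toward spurious attractors.
import numpy as np

rng = np.random.default_rng(3)
n, p, k = 100, 8, 0.3                     # units, stored memories, asymmetry

xi = rng.choice([-1.0, 1.0], (p, n))
J = xi.T @ xi / n                         # symmetric Hebbian matrix
R = rng.normal(0.0, 1.0 / np.sqrt(n), (n, n))
J += k * 0.5 * (R - R.T)                  # random antisymmetric component
np.fill_diagonal(J, 0.0)

def relax(s, max_sweeps=30):
    """Asynchronous updates; return (state, sweeps used, or the cutoff)."""
    for sweep in range(max_sweeps):
        changed = False
        for i in rng.permutation(n):
            new = 1.0 if J[i] @ s > 0 else -1.0
            if new != s[i]:
                s[i], changed = new, True
        if not changed:
            return s, sweep               # reached a fixed point
    return s, max_sweeps                  # did not settle: discard the run

trials, hits = 50, 0
for _ in range(trials):
    s, sweeps = relax(rng.choice([-1.0, 1.0], n))
    if sweeps < 30 and np.max(np.abs(xi @ s)) / n > 0.9:
        hits += 1
print(f"{hits}/{trials} random inputs retrieved a memory within the cutoff")
```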
Affiliation(s)
- Z Chengxiang
- Department of Physics, Indian Institute of Science, Bangalore 560012, India
8
9
10
Griniasty M, Gutfreund H. Learning and retrieval in attractor neural networks above saturation. J Phys A Math Gen 1991; 24. [DOI: 10.1088/0305-4470/24/3/030]
11
Bouten M, Engel A, Komoda A, Serneels R. Quenched versus annealed dilution in neural networks. J Phys A Math Gen 1990; 23. [DOI: 10.1088/0305-4470/23/20/025]
12
Gardner E, Gutfreund H, Yekutieli I. The phase space of interactions in neural networks with definite symmetry. J Phys A Math Gen 1989; 22. [DOI: 10.1088/0305-4470/22/12/005]
13
14
15
Wong KYM, Sherrington D. Optimally adapted attractor neural networks in the presence of noise. J Phys A Math Gen 1990; 23. [DOI: 10.1088/0305-4470/23/20/026]
16
17
Amit DJ, Evans MR, Horner H, Wong KYM. Retrieval phase diagrams for attractor neural networks with optimal interactions. J Phys A Math Gen 1990; 23. [DOI: 10.1088/0305-4470/23/14/032]
18
Opper M, Kleinz J, Köhler H, Kinzel W. Basins of attraction near the critical storage capacity for neural networks with constant stabilities. J Phys A Math Gen 1989; 22. [DOI: 10.1088/0305-4470/22/9/010]
19
20
Mézard M. The space of interactions in neural networks: Gardner's computation with the cavity method. J Phys A Math Gen 1989; 22. [DOI: 10.1088/0305-4470/22/12/018]
21
22
23
Nardulli G, Pasquariello G. Domains of attraction of neural networks at finite temperature. J Phys A Math Gen 1991; 24. [DOI: 10.1088/0305-4470/24/5/024]
24
Amit DJ, Campbell C, Wong KYM. The interaction space of neural networks with sign-constrained synapses. J Phys A Math Gen 1989; 22. [DOI: 10.1088/0305-4470/22/21/030]
25
Raffin B, Gordon MB. Learning and generalization with Minimerror, a temperature-dependent learning algorithm. Neural Comput 1995; 7:1206-24. [PMID: 7584899] [DOI: 10.1162/neco.1995.7.6.1206]
Abstract
We study the numerical performance of Minimerror, a recently introduced learning algorithm for the perceptron that has been shown analytically to be optimal for learning both linearly and nonlinearly separable functions. We present its implementation for learning linearly separable Boolean functions. Numerical results are in excellent agreement with the theoretical predictions.
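For orientation, here is a hedged sketch of what a temperature-dependent perceptron cost of this kind can look like (our paraphrase for illustration, not the published Minimerror algorithm): each pattern contributes 0.5*(1 - tanh(g/2T)) as a function of its stability g, and the temperature T is annealed toward zero, so early updates feel every pattern while late updates concentrate on patterns near the decision boundary, where the cost approaches a simple error count.

```python
# Hedged sketch of a Minimerror-style, temperature-dependent cost
# (a paraphrase for illustration; parameters and schedule are made up).
import numpy as np

rng = np.random.default_rng(4)
n, p = 20, 30
X = rng.choice([-1.0, 1.0], (p, n))
w_teacher = rng.normal(size=n)
y = np.sign(X @ w_teacher)                 # linearly separable labels

w = rng.normal(size=n)
w /= np.linalg.norm(w)
eta = 0.1
for T in np.geomspace(2.0, 0.02, 500):     # deterministic annealing of T
    g = y * (X @ w)                        # per-pattern stabilities (|w| = 1)
    # derivative of 0.5*(1 - tanh(g/2T)) with respect to g; it is largest
    # for patterns near the decision boundary (g ~ 0)
    coef = 1.0 / (4.0 * T * np.cosh(g / (2.0 * T)) ** 2)
    w += eta * (coef[:, None] * y[:, None] * X).sum(axis=0)
    w /= np.linalg.norm(w)                 # keep |w| = 1 so g stays a stability

print("training errors:", int((np.sign(X @ w) != y).sum()))
```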
Affiliation(s)
- B Raffin
- CEA/Département de Recherche Fondamentale sur la Matière Condensée, SPSMS/MDN, Centre d'Etudes Nucléaires de Grenoble, France
26
Wong KY, Sherrington D. Neural networks optimally trained with noisy data. Phys Rev E 1993; 47:4465-4482. [PMID: 9960524] [DOI: 10.1103/physreve.47.4465]
27
Bohr HG, Wolynes PG. Initial events of protein folding from an information-processing viewpoint. Phys Rev A 1992; 46:5242-5248. [PMID: 9908746] [DOI: 10.1103/physreva.46.5242]
28
Engel A, Bouten M, Komoda A, Serneels R. Enlarged basin of attraction in neural networks with persistent stimuli. Phys Rev A 1990; 42:4998-5005. [PMID: 9904612] [DOI: 10.1103/physreva.42.4998]
29
Köhler H, Diederich S, Kinzel W, Opper M. Learning algorithm for a neural network with binary synapses. Z Phys B Condens Matter 1990. [DOI: 10.1007/bf01307854]
30
Mézard M. Learning algorithms in neural networks: recent results. In: Neurocomputing. Berlin: Springer, 1990. [DOI: 10.1007/978-3-642-76153-9_8]
31
32
33
Coolen AC, Jonker HJ, Ruijgrok TW. Size of the domains of attraction in the Hopfield model. Phys Rev A 1989; 40:5295-5298. [PMID: 9902795] [DOI: 10.1103/physreva.40.5295]