Creatore C, Sabathiel S, Solstad T. Learning exact enumeration and approximate estimation in deep neural network models. Cognition 2021;215:104815. [PMID: 34182145] [DOI: 10.1016/j.cognition.2021.104815]
[Received: 11/01/2020] [Revised: 06/12/2021] [Accepted: 06/15/2021]
Abstract
A system for approximate number discrimination has been shown to arise in at least two types of hierarchical neural network models: a generative Deep Belief Network (DBN) and a Hierarchical Convolutional Neural Network (HCNN) trained to classify natural objects. Here, we investigate whether the same two network architectures can learn to recognise exact numerosity. A clear difference in performance could be traced to the specificity of the unit responses that emerged in the last hidden layer of each network. In the DBN, the emergence of a layer of monotonic 'summation units' was sufficient to produce classification behaviour consistent with the behavioural signature of the approximate number system. In the HCNN, a layer of units uniquely tuned to the transition between particular numerosities effectively encoded a thermometer-like 'numerosity code' that ensured near-perfect classification accuracy. The results support the notion that parallel pattern-recognition mechanisms may give rise to exact and approximate number concepts, both of which may contribute to the learning of symbolic numbers and arithmetic.
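To make the thermometer-like 'numerosity code' concrete, a minimal illustrative sketch follows. This is not the authors' implementation; it only shows the general idea of a thermometer encoding, where the first n units are active for numerosity n, so adjacent numerosities differ at exactly one unit (the 'transition' a tuned unit could detect).

```python
def thermometer_code(n, max_n=10):
    """Illustrative thermometer encoding of a numerosity n.

    Units 1..n are active (1), the remaining units silent (0),
    so consecutive numerosities differ in exactly one position.
    """
    return [1 if i < n else 0 for i in range(max_n)]

def transition_unit(code_a, code_b):
    """Index of the single unit that flips between two adjacent codes
    (hypothetical helper, for illustration only)."""
    diffs = [i for i, (a, b) in enumerate(zip(code_a, code_b)) if a != b]
    return diffs[0] if len(diffs) == 1 else None

# Example: numerosities 3 and 4 differ only at unit index 3.
c3 = thermometer_code(3)  # [1, 1, 1, 0, 0, 0, 0, 0, 0, 0]
c4 = thermometer_code(4)  # [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
```

Because each adjacent pair of codes differs at a unique unit, a downstream classifier can read out exact numerosity from which transition units are active, which is one way to interpret the near-perfect accuracy reported for the HCNN.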