Elizondo DA, Birkenhead R, Góngora M, Taillard E, Luyima P. Analysis and test of efficient methods for building recursive deterministic perceptron neural networks. Neural Netw 2007;20:1095-108. [PMID: 17904333] [DOI: 10.1016/j.neunet.2007.07.009]
Abstract
The Recursive Deterministic Perceptron (RDP) feed-forward multilayer neural network is a generalisation of the single layer perceptron topology. Unlike the single layer perceptron, which can only handle linearly separable sets, this model can solve any two-class classification problem. For all classification problems, the construction of an RDP is done automatically and convergence is always guaranteed. Three methods for constructing RDP neural networks exist: Batch, Incremental, and Modular. The Batch method has been extensively tested and shown to produce results comparable with those obtained with other neural network methods such as Backpropagation, Cascade Correlation, Rulex, and Ruleneg. However, the Incremental and Modular methods have not been tested before. Unlike the Batch method, the complexity of these two methods is not NP-Complete. A comparative study of the three methods is presented for the first time. It highlights the main advantages and disadvantages of each method by comparing the RDP neural networks they build in terms of convergence time, level of generalisation, and topology size. The networks were trained and tested using the following standard benchmark classification datasets: IRIS, SOYBEAN, and Wisconsin Breast Cancer. The results show that the Incremental and Modular methods perform as well as the NP-Complete Batch method but at a much lower complexity level. The results obtained with the RDP are comparable to those obtained with the Backpropagation and Cascade Correlation algorithms.
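The abstract does not spell out how an RDP is constructed, but the underlying idea reported in the RDP literature is to augment the input space with the outputs of intermediate single-layer perceptrons trained on linearly separable subsets, until the whole two-class problem becomes linearly separable. The minimal Python sketch below illustrates that idea on the XOR problem; it is not the authors' implementation, and the function names, the hand-picked subset, and the perceptron training routine are assumptions made purely for illustration.

import numpy as np

def train_perceptron(X, y, epochs=1000, lr=1.0):
    # Plain perceptron with a bias term; labels must be in {-1, +1}.
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])   # append constant bias input
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        errors = 0
        for xi, yi in zip(Xb, y):
            if yi * np.dot(w, xi) <= 0:              # misclassified -> update
                w += lr * yi * xi
                errors += 1
        if errors == 0:                              # converged: subset is linearly separable
            break
    return w

def predict(w, X):
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    return np.where(Xb @ w > 0, 1, -1)

# XOR: a two-class problem that is NOT linearly separable in the original 2-D space.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([-1, 1, 1, -1])

# Step 1 (hand-picked here; the RDP methods select such subsets automatically):
# the point (1, 1) is linearly separable from the remaining points,
# so train an intermediate neuron to isolate it.
y_mid = np.array([-1, -1, -1, 1])
w_mid = train_perceptron(X, y_mid)

# Step 2: add the intermediate neuron's output as a new input dimension.
# In this augmented 3-D space the original XOR labels become linearly separable,
# so a final single-layer perceptron can classify them.
X_aug = np.hstack([X, predict(w_mid, X).reshape(-1, 1)])
w_out = train_perceptron(X_aug, y)

print(predict(w_out, X_aug))   # expected output: [-1  1  1 -1]

In the full Batch, Incremental, and Modular algorithms the linearly separable subsets are found automatically and the augmentation is applied recursively until convergence; the sketch hard-codes a single intermediate neuron only to show why the augmented problem becomes linearly separable.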