1
A New Selective Neural Network Ensemble Method Based on Error Vectorization and Its Application in High-density Polyethylene (HDPE) Cascade Reaction Process. Chin J Chem Eng 2012. DOI: 10.1016/s1004-9541(12)60599-0
2
3
4
Akhand MAH, Shill PC, Murase K. Hybrid Ensemble Construction with Selected Neural Networks. Journal of Advanced Computational Intelligence and Intelligent Informatics 2011. DOI: 10.20965/jaciii.2011.p0652
Abstract
A Neural Network Ensemble (NNE) is a convenient way to improve performance on classification tasks. Among the many methods for constructing NNEs, Negative Correlation Learning (NCL), bagging, and boosting are the most popular. None of them, however, performs best on all problems. To exploit the complementary strengths of the individual methods, we propose two ways of constructing hybrid ensembles that combine NCL with bagging and boosting. One produces a pool with a predefined number of networks using standard NCL and bagging (or boosting), and then uses a genetic algorithm to select an optimal subset of networks for the NNE from that pool. Experiments on a suite of 25 benchmark problems confirmed that our proposals consistently outperform the conventional methods while producing more concise ensembles.
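The genetic-algorithm selection step mentioned in the abstract can be illustrated with a short sketch. The code below is a minimal, hypothetical illustration rather than the authors' implementation: it assumes each candidate network in the pool is represented only by its class predictions on a held-out validation set, and it evolves binary selection masks so that the majority-vote ensemble with the highest validation accuracy is kept. All names (`ensemble_accuracy`, `ga_select`, the toy data) are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of GA-based ensemble selection (illustrative, not the paper's code).
import numpy as np

rng = np.random.default_rng(0)

def ensemble_accuracy(mask, predictions, y_val):
    """Majority-vote accuracy of the networks selected by a binary mask."""
    if mask.sum() == 0:
        return 0.0
    votes = predictions[mask.astype(bool)]          # shape: (n_selected, n_samples)
    # Majority vote per sample (ties broken by the lowest class index).
    voted = np.apply_along_axis(lambda col: np.bincount(col).argmax(), 0, votes)
    return float((voted == y_val).mean())

def ga_select(predictions, y_val, pop_size=30, generations=50, p_mut=0.05):
    """Evolve binary masks over the candidate pool; return the best subset found."""
    n_nets = predictions.shape[0]
    pop = rng.integers(0, 2, size=(pop_size, n_nets))
    for _ in range(generations):
        fitness = np.array([ensemble_accuracy(ind, predictions, y_val) for ind in pop])
        # Binary tournament selection of parents.
        parents = []
        for _ in range(pop_size):
            i, j = rng.integers(0, pop_size, size=2)
            parents.append(pop[i] if fitness[i] >= fitness[j] else pop[j])
        parents = np.array(parents)
        # One-point crossover followed by bit-flip mutation.
        children = parents.copy()
        for k in range(0, pop_size - 1, 2):
            cut = rng.integers(1, n_nets)
            children[k, cut:], children[k + 1, cut:] = (
                parents[k + 1, cut:].copy(), parents[k, cut:].copy())
        flip = rng.random(children.shape) < p_mut
        children[flip] = 1 - children[flip]
        pop = children
    fitness = np.array([ensemble_accuracy(ind, predictions, y_val) for ind in pop])
    return pop[fitness.argmax()]

# Toy usage: 20 candidate "networks" whose validation predictions are simulated
# as being correct about 70% of the time on a 3-class problem.
y_val = rng.integers(0, 3, size=200)
predictions = np.where(rng.random((20, 200)) < 0.7, y_val,
                       rng.integers(0, 3, size=(20, 200)))
best_mask = ga_select(predictions, y_val)
print("selected networks:", np.flatnonzero(best_mask))
```

In this sketch the fitness function is plain validation accuracy of the selected subset; the paper's approach would substitute its own fitness criterion and the networks trained with NCL plus bagging or boosting.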