Ensemble pruning of ELM via migratory binary glowworm swarm optimization and margin distance minimization
Xuhui Zhu1,2 · Zhiwei Ni1,2 · Liping Ni1,2 · Feifei Jin1,2 · Meiying Cheng3 · Zhangjun Wu1,2
© Springer Science+Business Media, LLC, part of Springer Nature 2020
Abstract
Ensemble pruning aims at obtaining an ensemble composed of fewer learners while improving classification ability. Extreme Learning Machine (ELM) is employed as the base learner in this work; in light of its salient features, an initial pool is constructed using ELMs. An ensemble of ELMs with both high accuracy and high diversity performs best, but the average accuracy of the ELMs necessarily decreases as the diversity among them increases. Hence there exists a tradeoff between the diversity and the accuracy of the ELMs. Existing works seek this tradeoff via diversity measures or heuristic algorithms, which cannot locate it exactly. To address this issue, ensemble pruning of ELM via migratory binary glowworm swarm optimization and margin distance minimization (EPEMBM) is proposed, integrating the proposed migratory binary glowworm swarm optimization (MBGSO) with margin distance minimization (MDM). First, the ELMs created in the pool are pre-pruned by MDM, which markedly downsizes the pool and significantly reduces the computational overhead. Second, the remaining ELMs are further pruned using MBGSO, and the final ensemble is obtained with high efficiency. Experimental results on 21 UCI classification tasks indicate that EPEMBM outperforms existing techniques, demonstrating its effectiveness and efficiency. It is a very useful tool for solving the selection problem of ELMs.

Keywords Glowworm swarm optimization · Margin distance minimization · Ensemble pruning · Diversity · Extreme learning machine
Corresponding author: Zhiwei Ni ([email protected])

1 School of Management, Hefei University of Technology, Hefei 230009, China
2 Key Laboratory of Process Optimization and Intelligent Decision-Making, Ministry of Education, Hefei 230009, China
3 Business School, Huzhou University, Huzhou 313000, China
1 Introduction

As is well known, ensemble learning is a prominent learning paradigm in pattern recognition [1, 2], machine learning [3, 4], and data mining [5]. It builds a classification model by integrating multiple classifiers [6, 7]. Compared with a single classifier, an ensemble of multiple learners shows improved classification capacity and is widely used in many domains [8–10], such as image processing [11–14], age prediction [15, 16], detection [17–19], and the like. To improve the generalization performance of an ensemble, many learners are used to form it [7, 8]. However, as data sizes grow rapidly, maintaining many learners requires substantial storage space and computational overhead, which ensemble pruning can address. Ensemble pruning aims at selecting the best subset of learners to form the final ensemble, improving predictive ability with less resource consumption.
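To make the pruning idea concrete, the following is a minimal, illustrative sketch of greedy margin-distance-based pre-pruning in the spirit of MDM: each learner is encoded by a signature vector (+1 where it classifies a validation example correctly, -1 otherwise), and learners are added greedily so that the ensemble's average signature moves closest to a reference point slightly inside the positive orthant. The function name `mdm_prune`, the reference value `p = 0.075`, and the greedy tie-breaking are assumptions for illustration, not the paper's exact implementation.

```python
import numpy as np

def mdm_prune(signatures, k, p=0.075):
    """Greedy margin-distance pruning sketch (assumed, not the paper's code).

    signatures: (T, N) array; entry [t, n] is +1 if learner t classifies
                validation example n correctly, -1 otherwise.
    k:          number of learners to keep in the pre-pruned pool.
    p:          reference point coordinate, placed slightly inside the
                positive orthant so correct-but-diverse subsets are favored.
    """
    T, N = signatures.shape
    ref = np.full(N, p)          # reference point o = (p, ..., p)
    selected = []
    running = np.zeros(N)        # sum of signatures of selected learners
    for _ in range(k):
        best_t, best_d = None, np.inf
        for t in range(T):
            if t in selected:
                continue
            # average margin vector if learner t were added
            margin = (running + signatures[t]) / (len(selected) + 1)
            d = np.linalg.norm(margin - ref)
            if d < best_d:       # first learner wins ties
                best_t, best_d = t, d
        selected.append(best_t)
        running += signatures[best_t]
    return selected

# Tiny example: 4 learners, 3 validation examples.
sig = np.array([[ 1,  1, -1],
                [ 1, -1,  1],
                [-1,  1,  1],
                [ 1,  1,  1]])
kept = mdm_prune(sig, k=2)
```

In this toy example the all-correct learner (index 3) is chosen first, since its signature lies closest to the reference point; a second learner is then added by re-evaluating the averaged margin. In the paper's pipeline, the subset retained by this pre-pruning step would then be handed to MBGSO for the final search.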