A Weighted Ensemble Learning Algorithm Based on Diversity Using a Novel Particle Swarm Optimization Approach
Abstract
1. Introduction
2. Related Work
2.1. Notations and definitions
2.2. Combining Strategies of Classifiers
2.3. Classifier Diversity Metrics
2.3.1. Individual Learner Diversity Measurement Method
2.3.2. Weighted Ensemble Process Diversity Measurement Method
2.4. PSO Algorithm
2.4.1. Basic PSO Solution Process
2.4.2. PSO Algorithm for Binary Optimization
2.4.3. Mixed PSO Algorithm
3. Materials and Methods
3.1. Architecture of the Proposed Methodology
3.2. Description of the WELAD
3.2.1. Stage I: Generate the Ensemble Classifier with Appropriate Diversity
Algorithm 1. Using MPSO to optimize the resampling ratio of the subsets and carry out feature selection.
3.2.2. Stage II: Generate the Weighted Ensemble Classifier
Algorithm 2. Using PSO to optimize the weight parameters w of the ensemble classifier.
4. Results and Discussion
4.1. Experimental Dataset and Setup
4.1.1. Experimental Dataset
4.1.2. Experimental Setup
4.2. Experiment
4.2.1. Experimental Results
4.2.2. Statistical Test
5. Conclusions
Author Contributions
Funding
Conflicts of Interest
References
Notation | Description |
---|---|
| The lth classifier (L = size of the ensemble classifier) |
| The ensemble classifier |
| The number of dimensions (features) of the dataset |
| The number of samples of the dataset |
| The classes (categories) of the dataset |
| The predicted output of a classifier on a sample |
| The weight vector of an ensemble classifier |
| Element-wise multiplication of vectors of the same dimension |
| The XOR operation |
| The prediction output of the ensemble by majority voting |
| The prediction output of the ensemble by weighted voting |
| Diversity measure based on consideration of accuracy |
| Diversity measure based on avoiding overfitting |
| The number of subsets of a dataset in K-fold cross-validation |
| The number of clusters of a dataset in K-means clustering |
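To make the majority-voting and weighted-voting outputs defined above concrete, here is a minimal sketch; the array shapes, labels, and weight values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def majority_vote(preds):
    """preds: (L, m) array of class labels from L classifiers on m samples."""
    # For each sample (column), pick the most frequent label across classifiers.
    return np.array([np.bincount(col).argmax() for col in preds.T])

def weighted_vote(preds, weights, n_classes):
    """Each classifier's vote counts with its weight w_l; highest total wins."""
    m = preds.shape[1]
    scores = np.zeros((m, n_classes))
    for l, w in enumerate(weights):
        for i, c in enumerate(preds[l]):
            scores[i, c] += w
    return scores.argmax(axis=1)

# Toy data: 3 classifiers, 3 samples, binary labels.
preds = np.array([[0, 1, 1],
                  [0, 0, 1],
                  [1, 0, 1]])
w = np.array([0.1, 0.3, 0.6])          # hypothetical weight vector
print(majority_vote(preds))            # [0 0 1]
print(weighted_vote(preds, w, 2))      # [1 0 1]
```

Note that on the first sample the heavily weighted third classifier overturns the simple majority; this degree of freedom is exactly what Stage II of the proposed method optimizes.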
Dataset ID | Dataset Name | Samples | Features | Classes | Computational Complexity |
---|---|---|---|---|---|
1 | abalone | 4177 | 8 | 5 | 3.57 × 10^10 |
2 | australian | 690 | 14 | 2 | 1.09 × 10^11 |
3 | balance-scale | 625 | 4 | 3 | 2.50 × 10^7 |
4 | breast_cancer-wis | 699 | 9 | 2 | 2.25 × 10^9 |
5 | breast_cancer | 286 | 9 | 2 | 3.77 × 10^8 |
6 | cervical_cancer | 858 | 32 | 2 | 1.01 × 10^17 |
7 | cleveland | 303 | 13 | 5 | 9.78 × 10^9 |
8 | crowdsourced-mapping | 10,545 | 28 | 6 | 8.36 × 10^17 |
9 | dermatology | 366 | 34 | 6 | 7.82 × 10^16 |
10 | diabetes | 768 | 8 | 2 | 1.21 × 10^9 |
11 | drug_consumption | 1885 | 12 | 7 | 1.75 × 10^11 |
12 | firm-teacher | 10,796 | 16 | 4 | 1.22 × 10^14 |
13 | flags | 194 | 29 | 8 | 5.86 × 10^14 |
14 | frogs_mfccs | 7195 | 22 | 10 | 4.78 × 10^15 |
15 | haberman | 306 | 3 | 2 | 2.25 × 10^6 |
16 | heart-disease-h | 294 | 13 | 2 | 9.21 × 10^9 |
17 | heart-stalog | 270 | 13 | 2 | 7.76 × 10^9 |
18 | hepatitis | 155 | 19 | 2 | 2.39 × 10^11 |
19 | indian-liver | 583 | 10 | 2 | 3.48 × 10^9 |
20 | ionosphere | 351 | 34 | 2 | 7.20 × 10^16 |
21 | lymph | 148 | 18 | 4 | 1.03 × 10^11 |
22 | mammographic | 961 | 5 | 2 | 1.48 × 10^8 |
23 | primary-tumor | 339 | 17 | 2 | 2.56 × 10^11 |
24 | segment | 2310 | 19 | 7 | 5.32 × 10^13 |
25 | seismic-bumps | 2584 | 18 | 2 | 3.15 × 10^13 |
26 | sonar | 208 | 60 | 2 | 2.99 × 10^24 |
27 | transfusion | 748 | 4 | 2 | 3.58 × 10^7 |
28 | vehicle | 846 | 18 | 4 | 3.38 × 10^12 |
29 | waveform | 5000 | 40 | 3 | 1.10 × 10^21 |
30 | wholesale | 440 | 7 | 3 | 1.73 × 10^8 |
Dataset ID | AdaBoost | Bagging | DECORATE | ERT | GBDT | RF | WELAD |
---|---|---|---|---|---|---|---|
1 | 0.6481 | 0.6760 | 0.6831 | 0.6864 | 0.6934 | 0.6906 | 0.6836 |
2 | 0.8681 | 0.8654 | 0.8459 | 0.8603 | 0.8603 | 0.8704 | 0.8674 |
3 | 0.8722 | 0.8080 | 0.7747 | 0.8213 | 0.8642 | 0.8268 | 0.8830 |
4 | 0.9528 | 0.9564 | 0.9568 | 0.9667 | 0.9578 | 0.9655 | 0.9717 |
5 | 0.7165 | 0.6973 | 0.7287 | 0.7128 | 0.7235 | 0.7088 | 0.7412 |
6 | 0.9138 | 0.8925 | 0.9008 | 0.9082 | 0.9068 | 0.9054 | 0.9118 |
7 | 0.5356 | 0.5565 | 0.5347 | 0.5690 | 0.5615 | 0.5714 | 0.5739 |
8 | 0.8337 | 0.9368 | 0.9372 | 0.9503 | 0.9206 | 0.9432 | 0.9398 |
9 | 0.5642 | 0.9708 | 0.9777 | 0.9761 | 0.9702 | 0.9723 | 0.9766 |
10 | 0.7526 | 0.7517 | 0.7517 | 0.7482 | 0.7434 | 0.7499 | 0.7561 |
11 | 0.6757 | 0.6783 | 0.6390 | 0.6859 | 0.6968 | 0.6882 | 0.6938 |
12 | 0.7360 | 0.8044 | 0.7656 | 0.8024 | 0.7808 | 0.8074 | 0.7771 |
13 | 0.5067 | 0.6188 | 0.6248 | 0.6551 | 0.6635 | 0.6522 | 0.6507 |
14 | 0.7792 | 0.9695 | 0.9735 | 0.9832 | 0.9672 | 0.9808 | 0.9788 |
15 | 0.7249 | 0.6676 | 0.7325 | 0.6842 | 0.7055 | 0.6862 | 0.7389 |
16 | 0.8332 | 0.7941 | 0.7879 | 0.8271 | 0.8093 | 0.8177 | 0.8322 |
17 | 0.8000 | 0.8078 | 0.8033 | 0.8048 | 0.8111 | 0.8204 | 0.8163 |
18 | 0.6750 | 0.6489 | 0.6396 | 0.6495 | 0.6192 | 0.6614 | 0.6690 |
19 | 0.6911 | 0.6984 | 0.6836 | 0.7266 | 0.7004 | 0.7062 | 0.7158 |
20 | 0.9230 | 0.9198 | 0.9257 | 0.9410 | 0.9288 | 0.9313 | 0.9344 |
21 | 0.7200 | 0.8303 | 0.7904 | 0.8469 | 0.8531 | 0.8400 | 0.8697 |
22 | 0.8273 | 0.7911 | 0.8246 | 0.7863 | 0.8343 | 0.7951 | 0.8269 |
23 | 0.7609 | 0.7300 | 0.7334 | 0.7368 | 0.7786 | 0.7318 | 0.7624 |
24 | 0.7320 | 0.9762 | 0.9797 | 0.9795 | 0.9749 | 0.9800 | 0.9743 |
25 | 0.9342 | 0.9287 | 0.9312 | 0.9253 | 0.9326 | 0.9319 | 0.9341 |
26 | 0.8362 | 0.7967 | 0.8115 | 0.8482 | 0.8117 | 0.8164 | 0.8225 |
27 | 0.7795 | 0.7410 | 0.7831 | 0.7421 | 0.7820 | 0.7447 | 0.7784
28 | 0.6005 | 0.7264 | 0.7292 | 0.7302 | 0.7206 | 0.7282 | 0.7326 |
29 | 0.8128 | 0.8134 | 0.7796 | 0.8336 | 0.8424 | 0.8288 | 0.8394 |
30 | 0.7182 | 0.6864 | 0.6973 | 0.6873 | 0.6966 | 0.6998 | 0.7168 |
Mean | 0.7575 | 0.7913 | 0.7909 | 0.8025 | 0.8037 | 0.8018 | 0.8123 |
Standard deviation | 0.1134 | 0.1137 | 0.1148 | 0.1128 | 0.1099 | 0.1112 | 0.1084 |
Best/Worst | 4/11 | 0/8 | 2/7 | 5/2 | 6/2 | 4/0 | 9/0 |
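The summary rows of the accuracy table can be checked directly from the per-dataset values; for instance, for the WELAD column (accuracies transcribed from the table above):

```python
import numpy as np

# WELAD accuracies for datasets 1-30, transcribed from the table above.
welad = np.array([
    0.6836, 0.8674, 0.8830, 0.9717, 0.7412, 0.9118, 0.5739, 0.9398,
    0.9766, 0.7561, 0.6938, 0.7771, 0.6507, 0.9788, 0.7389, 0.8322,
    0.8163, 0.6690, 0.7158, 0.9344, 0.8697, 0.8269, 0.7624, 0.9743,
    0.9341, 0.8225, 0.7784, 0.7326, 0.8394, 0.7168,
])
print(f"mean = {welad.mean():.4f}")   # 0.8123, as reported
print(f"std  = {welad.std():.4f}")    # ~0.108 (exact value depends on the ddof convention)
```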
Dataset ID | AdaBoost | Bagging | DECORATE | ERT | GBDT | RF | WELAD |
---|---|---|---|---|---|---|---|
1 | 1 | 2 | 3 | 5 | 7 | 6 | 4 |
2 | 6 | 4 | 1 | 2 | 2 | 7 | 5 |
3 | 6 | 2 | 1 | 3 | 5 | 4 | 7 |
4 | 1 | 2 | 3 | 6 | 4 | 5 | 7 |
5 | 4 | 1 | 6 | 3 | 5 | 2 | 7 |
6 | 7 | 1 | 2 | 5 | 4 | 3 | 6 |
7 | 2 | 3 | 1 | 5 | 4 | 6 | 7 |
8 | 1 | 3 | 4 | 7 | 2 | 6 | 5 |
9 | 1 | 3 | 7 | 5 | 2 | 4 | 6 |
10 | 6 | 4 | 4 | 2 | 1 | 3 | 7 |
11 | 2 | 3 | 1 | 4 | 7 | 5 | 6 |
12 | 1 | 6 | 2 | 5 | 4 | 7 | 3 |
13 | 1 | 2 | 3 | 6 | 7 | 5 | 4 |
14 | 1 | 3 | 4 | 7 | 2 | 6 | 5 |
15 | 5 | 1 | 6 | 2 | 4 | 3 | 7 |
16 | 7 | 2 | 1 | 5 | 3 | 4 | 6 |
17 | 1 | 4 | 2 | 3 | 5 | 7 | 6 |
18 | 7 | 3 | 2 | 4 | 1 | 5 | 6 |
19 | 2 | 3 | 1 | 7 | 4 | 5 | 6 |
20 | 2 | 1 | 3 | 7 | 4 | 5 | 6 |
21 | 1 | 3 | 2 | 5 | 6 | 4 | 7 |
22 | 6 | 2 | 4 | 1 | 7 | 3 | 5 |
23 | 5 | 1 | 3 | 4 | 7 | 2 | 6 |
24 | 1 | 4 | 6 | 5 | 3 | 7 | 2 |
25 | 6 | 2 | 3 | 1 | 5 | 4 | 7 |
26 | 6 | 1 | 2 | 7 | 3 | 4 | 5 |
27 | 5 | 1 | 7 | 2 | 6 | 3 | 4 |
28 | 1 | 3 | 5 | 6 | 2 | 4 | 7 |
29 | 2 | 3 | 1 | 5 | 7 | 4 | 6 |
30 | 7 | 1 | 4 | 2 | 3 | 5 | 6 |
Mean Rank | 3.47 | 2.48 | 3.15 | 4.38 | 4.22 | 4.60 | 5.70 |
Number (N) | Chi-Square | Degrees of Freedom (df) | Asymp. Sig. (p-Value) |
---|---|---|---|
30 | 43.452 | 6 | 0.000 |
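The chi-square value above follows from the mean ranks in the preceding table via the standard Friedman statistic, χ² = 12N/(k(k+1)) · (Σ R̄ⱼ² − k(k+1)²/4). A quick numeric check (using the rounded mean ranks as reported, so the result differs slightly from 43.452):

```python
import numpy as np

# Mean ranks over N = 30 datasets for the k = 7 algorithms
# (AdaBoost, Bagging, DECORATE, ERT, GBDT, RF, WELAD).
R = np.array([3.47, 2.48, 3.15, 4.38, 4.22, 4.60, 5.70])
N, k = 30, len(R)

# Friedman statistic: chi2 = 12N/(k(k+1)) * (sum(R_j^2) - k(k+1)^2 / 4)
chi2 = 12 * N / (k * (k + 1)) * (np.sum(R**2) - k * (k + 1) ** 2 / 4)
print(f"chi-square = {chi2:.2f}, df = {k - 1}")   # ~43.44 vs. the reported 43.452
```

The small gap comes only from the two-decimal rounding of the mean ranks.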
| WELAD-AdaBoost | WELAD-Bagging | WELAD-DECORATE | WELAD-ERT | WELAD-GBDT | WELAD-RF |
---|---|---|---|---|---|---|
Z | −3.733 | −4.350 | −4.371 | −2.376 | −3.304 | −3.219 |
Significance (two-tailed) | 0.000 | 0.000 | 0.000 | 0.018 | 0.002 | 0.001 |
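The two-tailed significance values follow from the Z statistics under the asymptotic normal approximation, p = 2(1 − Φ(|Z|)). For example, for the WELAD-ERT comparison:

```python
import math

def two_tailed_p(z):
    """Two-tailed p-value of a standard-normal Z statistic, via the error function."""
    return 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))

# Z for the WELAD-ERT comparison from the table above.
p = two_tailed_p(-2.376)
print(f"p = {p:.4f}")   # ~0.0175, consistent with the reported 0.018
```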
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
You, G.-R.; Shiue, Y.-R.; Yeh, W.-C.; Chen, X.-L.; Chen, C.-M. A Weighted Ensemble Learning Algorithm Based on Diversity Using a Novel Particle Swarm Optimization Approach. Algorithms 2020, 13, 255. https://doi.org/10.3390/a13100255