Neural Network Algorithm with Dropout Using Elite Selection
Abstract
1. Introduction
- (1) NNA is analyzed from the perspective of evolutionary algorithms: its steps are shown to correspond to crossover, mutation, and selection, which establishes that NNA belongs to the family of evolutionary algorithms.
- (2) In the crossover stage of the DESNNA, a dropout strategy analogous to dropout in neural networks is applied to NNA: a fixed proportion of individuals is dropped and does not take part in crossover, which ensures the quality of the individuals that do participate.
- (3) In the selection stage of the DESNNA, individuals that performed well in the previous generation are retained directly when the population is updated, which strengthens the optimization ability of the algorithm without sacrificing population diversity.
2. Neural Network Algorithm
2.1. Artificial Neural Network
2.2. The Introduction of Neural Network Algorithm
- (1) Initialization stage
- (2) Cycle stage
Algorithm 1. The implementation of the neural network algorithm (NNA).
01 Create a random initial population X and weights W with constraints by Equations (4) and (5)
02 Calculate the cost of every pattern solution and set the target solution and target weight
03 For i = 1:max_iteration
04   Generate new pattern solutions X^(t+1) by Equations (6) and (7)
05   Update the weights by Equation (8)
06   If rand ≤ β
07     Perform the bias operator on the pattern solutions X^(t+1) and weights W^(t+1) by Equations (9) and (10)
08   Else
09     Perform the transfer function operator on X^(t+1) by Equation (11)
10   End if
11   Calculate the cost of every pattern solution and find the optimal solution and weight
12   Reduce the modification factor β by Equation (12)
13 End for
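As a companion to the listing, the loop can be sketched in Python. Since Equations (4)–(12) are not reproduced in this outline, the update rules below follow the standard NNA formulation of Sadollah et al. (weighted-combination solution generation, weight update toward the target weight, bias and transfer-function operators, multiplicative reduction of β); the population size, iteration budget, and the 0.99 reduction factor are illustrative assumptions, not values taken from this paper.

```python
import numpy as np

def nna(cost, lb, ub, dim, pop=20, max_iter=200, seed=0):
    rng = np.random.default_rng(seed)
    # 01: random initial population and weight matrix; each weight row sums to 1
    X = rng.uniform(lb, ub, (pop, dim))
    W = rng.random((pop, pop))
    W /= W.sum(axis=1, keepdims=True)
    # 02: evaluate and set the target solution and target weight
    f = np.array([cost(x) for x in X])
    b = f.argmin()
    x_t, w_t, f_t = X[b].copy(), W[b].copy(), f[b]
    beta = 1.0  # modification factor
    for _ in range(max_iter):
        # 04: new pattern solutions as weighted combinations of the population
        X = X + W @ X
        # 05: move every weight vector toward the target weight, renormalize
        W = np.abs(W + 2.0 * rng.random((pop, 1)) * (w_t - W))
        W /= W.sum(axis=1, keepdims=True)
        for i in range(pop):
            if rng.random() <= beta:
                # 07: bias operator - re-randomize part of the solution and weights
                nx = max(1, int(np.ceil(beta * dim)))
                X[i, rng.choice(dim, nx, replace=False)] = rng.uniform(lb, ub, nx)
                nw = max(1, int(np.ceil(beta * pop)))
                W[i, rng.choice(pop, nw, replace=False)] = rng.random(nw)
                W[i] /= W[i].sum()
            else:
                # 09: transfer function operator - pull the solution toward the target
                X[i] += 2.0 * rng.random(dim) * (x_t - X[i])
        X = np.clip(X, lb, ub)
        # 11: evaluate and keep the best solution/weight found so far
        f = np.array([cost(x) for x in X])
        b = f.argmin()
        if f[b] < f_t:
            x_t, w_t, f_t = X[b].copy(), W[b].copy(), f[b]
        # 12: reduce the modification factor (0.99 is an assumed factor)
        beta *= 0.99
    return x_t, f_t

x_best, f_best = nna(lambda x: float(np.sum(x * x)), -100.0, 100.0, dim=10)
```

The sketch keeps the listing's step numbers as comments so each line of pseudocode can be matched to code.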
3. The Neural Network Algorithm with Dropout Using Elite Selection
3.1. NNA from the Perspective of Evolutionary Algorithm
3.2. The Introduced Dropout Strategy in the DESNNA
3.3. The Elite Selection in the DESNNA
Algorithm 2. The implementation of the neural network algorithm with dropout using elite selection (DESNNA).
01 Create a random initial population X and weights W with constraints by Equations (4) and (5)
02 Calculate the cost of every pattern solution and set the target solution and target weight
03 For i = 1:max_iteration
04   Drop the 10% of individuals with the worst fitness by setting their pattern solutions X_worst to 0, then generate new pattern solutions X^(t+1) by Equations (14), (15) and (7)
05   Update the weights by Equation (8)
06   If rand ≤ β
07     Perform the bias operator on the pattern solutions X^(t+1) and weights W^(t+1) by Equations (9) and (10)
08   Else
09     Perform the transfer function operator on X^(t+1) by Equation (11)
10   End if
11   Calculate the cost of every pattern solution and find the optimal solution and weight
12   Sort the pattern solutions of the new population by cost
13   Carry the top 15% of individuals with the best fitness directly into the next generation
14   Reduce the modification factor β by Equation (12)
15 End for
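The two DESNNA-specific steps (04 and 12–13) can be isolated as small helpers. This is a hedged sketch: the helper names and the replace-worst-with-previous-elites policy are illustrative readings of the description above, with the 10% dropout and 15% elite rates taken from the listing.

```python
import numpy as np

def apply_dropout(X, costs, rate_of_dropout=0.10):
    """Step 04: zero the pattern solutions of the worst-fitness individuals
    so they do not contribute to the crossover (weighted combination)."""
    X = X.copy()
    n_drop = int(np.ceil(rate_of_dropout * len(X)))
    worst = np.argsort(costs)[-n_drop:]   # highest cost = worst fitness
    X[worst] = 0.0                        # X_worst is set to 0
    return X

def elite_selection(X_old, costs_old, X_new, costs_new, rate_of_select=0.15):
    """Steps 12-13: retain the best individuals of the previous generation by
    replacing the worst individuals of the new population with them."""
    n_elite = int(np.ceil(rate_of_select * len(X_old)))
    elite = np.argsort(costs_old)[:n_elite]    # best of the old generation
    worst = np.argsort(costs_new)[-n_elite:]   # worst of the new generation
    X_next, costs_next = X_new.copy(), costs_new.copy()
    X_next[worst] = X_old[elite]
    costs_next[worst] = costs_old[elite]
    return X_next, costs_next

# Toy usage on a sphere cost, population of 20 in 5 dimensions
rng = np.random.default_rng(1)
X = rng.uniform(-100, 100, (20, 5))
costs = (X ** 2).sum(axis=1)
X_drop = apply_dropout(X, costs)              # 2 of 20 rows are zeroed
X2 = rng.uniform(-100, 100, (20, 5))
costs2 = (X2 ** 2).sum(axis=1)
X_next, costs_next = elite_selection(X, costs, X2, costs2)
```

Because elites are copied together with their costs, the best solution found so far can never be lost between generations.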
4. DESNNA for Global Optimization
4.1. Benchmark Functions
4.2. Comparison between Improved DESNNA and NNA
4.3. Comparisons between the Improved DESNNA and Other Algorithms
5. Conclusions and Future Work
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Nomenclature
Symbol | Meaning |
---|---|
D | The dimension of the optimization problem |
N | Population size |
LB | The lower limit of the variables |
UB | The upper limit of the variables |
Max_iteration | The maximum number of iterations |
References
- Sergeyev, Y.D.; Kvasov, D.E. A deterministic global optimization using smooth diagonal auxiliary functions. Commun. Nonlinear Sci. Numer. Simul. 2015, 21, 99–111.
- Magoulas, G.D.; Vrahatis, M.N. Adaptive algorithms for neural network supervised learning: A deterministic optimization approach. Int. J. Bifurc. Chaos 2006, 16, 1929–1950.
- Kvasov, D.E.; Mukhametzhanov, M.S. Metaheuristic vs. deterministic global optimization algorithms: The univariate case. Appl. Math. Comput. 2018, 318, 245–259.
- Sergeyev, Y.D.; Kvasov, D.E.; Mukhametzhanov, M.S. Operational zones for comparing metaheuristic and deterministic one-dimensional global optimization algorithms. Math. Comput. Simul. 2017, 141, 96–109.
- Ma, Y.; Wang, Z.; Yang, H.; Yang, L. Artificial intelligence applications in the development of autonomous vehicles: A survey. IEEE/CAA J. Autom. Sin. 2020, 7, 315–329.
- Zhao, Z.; Liu, S.; Zhou, M.; Abusorrah, A. Dual-objective mixed integer linear program and memetic algorithm for an industrial group scheduling problem. IEEE/CAA J. Autom. Sin. 2020, 8, 1199–1209.
- Zhang, Z.; Cao, Y.; Cui, Z.; Zhang, W.; Chen, J. A many-objective optimization based intelligent intrusion detection algorithm for enhancing security of vehicular networks in 6G. IEEE Trans. Veh. Technol. 2021, 70, 5234–5243.
- Dokeroglu, T.; Sevinc, E.; Kucukyilmaz, T.; Cosar, A. A survey on new generation metaheuristic algorithms. Comput. Ind. Eng. 2019, 137, 106040.
- Wang, G.-G.; Cai, X.; Cui, Z.; Min, G.; Chen, J. High performance computing for cyber physical social systems by using evolutionary multi-objective optimization algorithm. IEEE Trans. Emerg. Top. Comput. 2020, 8, 20–30.
- Wang, G.-G.; Tan, Y. Improving metaheuristic algorithms with information feedback models. IEEE Trans. Cybern. 2019, 49, 542–555.
- Wang, G.-G.; Gao, D.; Pedrycz, W. Solving multi-objective fuzzy job-shop scheduling problem by a hybrid adaptive differential evolution algorithm. IEEE Trans. Ind. Inform. 2022, 1.
- Holland, J.H. Genetic algorithms. Sci. Am. 1992, 267, 66–73.
- Kirkpatrick, S.; Gelatt, C.D., Jr.; Vecchi, M.P. Optimization by simulated annealing. Science 1983, 220, 671–680.
- Kennedy, J.; Eberhart, R. Particle swarm optimization. In Proceedings of the ICNN’95 International Conference on Neural Networks, Perth, Australia, 27 November–1 December 1995; pp. 1942–1948.
- Cui, Z.; Zhang, J.; Wu, D.; Cai, X.; Wang, H.; Zhang, W.; Chen, J. Hybrid many-objective particle swarm optimization algorithm for green coal production problem. Inf. Sci. 2020, 518, 256–271.
- Zhang, W.; Hou, W.; Li, C.; Yang, W.; Gen, M. Multidirection update-based multiobjective particle swarm optimization for mixed no-idle flow-shop scheduling problem. Complex Syst. Model. Simul. 2021, 1, 176–197.
- Geem, Z.W.; Kim, J.H.; Loganathan, G.V. A new heuristic optimization algorithm: Harmony search. Simulation 2001, 76, 60–68.
- Storn, R.; Price, K. Differential evolution—A simple and efficient heuristic for global optimization over continuous spaces. J. Glob. Optim. 1997, 11, 341–359.
- Gao, D.; Wang, G.-G.; Pedrycz, W. Solving fuzzy job-shop scheduling problem using DE algorithm improved by a selection mechanism. IEEE Trans. Fuzzy Syst. 2020, 28, 3265–3275.
- Dorigo, M.; Maniezzo, V.; Colorni, A. Ant system: Optimization by a colony of cooperating agents. IEEE Trans. Syst. Man Cybern. Part B (Cybern.) 1996, 26, 29–41.
- Karaboga, D.; Basturk, B. A powerful and efficient algorithm for numerical function optimization: Artificial bee colony (ABC) algorithm. J. Glob. Optim. 2007, 39, 459–471.
- Sadollah, A.; Sayyaadi, H.; Yadav, A. A dynamic metaheuristic optimization model inspired by biological nervous systems: Neural network algorithm. Appl. Soft Comput. 2018, 71, 747–782.
- Dhiman, G.; Kumar, V. Spotted hyena optimizer: A novel bio-inspired based metaheuristic technique for engineering applications. Adv. Eng. Softw. 2017, 114, 48–70.
- Dhiman, G.; Kumar, V. Seagull optimization algorithm: Theory and its applications for large-scale industrial engineering problems. Knowl.-Based Syst. 2019, 165, 169–196.
- Kaur, S.; Awasthi, L.K.; Sangal, A.; Dhiman, G. Tunicate swarm algorithm: A new bio-inspired based metaheuristic paradigm for global optimization. Eng. Appl. Artif. Intell. 2020, 90, 103541.
- Wang, G.-G.; Deb, S.; Coelho, L.d.S. Elephant herding optimization. In Proceedings of the 2015 3rd International Symposium on Computational and Business Intelligence (ISCBI), Bali, Indonesia, 7–9 December 2015; pp. 1–5.
- Dhiman, G.; Kaur, A. STOA: A bio-inspired based optimization algorithm for industrial engineering problems. Eng. Appl. Artif. Intell. 2019, 82, 148–174.
- Zhang, Y. Chaotic neural network algorithm with competitive learning for global optimization. Knowl.-Based Syst. 2021, 231, 107405.
- Wang, G.-G.; Deb, S.; Cui, Z. Monarch butterfly optimization. Neural Comput. Appl. 2019, 31, 1995–2014.
- Lakshminarayanan, S.; Abdulgader, M.; Kaur, D. Scheduling energy storage unit with GWO for smart home integrated with renewable energy. Int. J. Artif. Intell. Soft Comput. 2020, 7, 146–163.
- Wang, G.-G.; Deb, S.; Coelho, L.d.S. Earthworm optimisation algorithm: A bio-inspired metaheuristic algorithm for global optimisation problems. Int. J. Bio-Inspired Comput. 2018, 12, 1–22.
- Wang, G.-G. Moth search algorithm: A bio-inspired metaheuristic algorithm for global optimization problems. Memetic Comput. 2018, 10, 151–164.
- Ghaemi, M.; Feizi-Derakhshi, M.-R. Forest optimization algorithm. Expert Syst. Appl. 2014, 41, 6676–6687.
- Grabski, J.K.; Walczak, T.; Buśkiewicz, J.; Michałowska, M. Comparison of some evolutionary algorithms for optimization of the path synthesis problem. In AIP Conference Proceedings, Lublin, Poland, 13–16 September 2017; p. 020006.
- Liang, Y.-C.; Cuevas Juarez, J.R. A novel metaheuristic for continuous optimization problems: Virus optimization algorithm. Eng. Optim. 2016, 48, 73–93.
- Grabski, J.K.; Mrozek, A. Identification of elastoplastic properties of rods from torsion test using meshless methods and a metaheuristic. Comput. Math. Appl. 2021, 92, 149–158.
- Qadeer, K.; Ahmad, A.; Naquash, A.; Qyyum, M.A.; Majeed, K.; Zhou, Z.; He, T.; Nizami, A.-S.; Lee, M. Neural network-inspired performance enhancement of synthetic natural gas liquefaction plant with different minimum approach temperatures. Fuel 2022, 308, 121858.
- Bhullar, A.K.; Kaur, R.; Sondhi, S. Design and comparative analysis of optimized FOPID controller using neural network algorithm. In Proceedings of the 2020 IEEE 15th International Conference on Industrial and Information Systems (ICIIS), Rupnagar, India, 26–28 November 2020; pp. 91–96.
- Zhang, Y.; Jin, Z.; Chen, Y. Hybrid teaching–learning-based optimization and neural network algorithm for engineering design optimization problems. Knowl.-Based Syst. 2020, 187, 104836.
- Zhang, Y.; Jin, Z.; Chen, Y. Hybridizing grey wolf optimization with neural network algorithm for global numerical optimization problems. Neural Comput. Appl. 2020, 32, 10451–10470.
- Zhang, H.; Sheng, J.J. Complex fracture network simulation and optimization in naturally fractured shale reservoir based on modified neural network algorithm. J. Nat. Gas Sci. Eng. 2021, 95, 104232.
- Nguyen, T.P.; Nguyen, T.A.; Phan, T.V.-H.; Vo, D.N. A comprehensive analysis for multi-objective distributed generations and capacitor banks placement in radial distribution networks using hybrid neural network algorithm. Knowl.-Based Syst. 2021, 231, 107387.
- Van Tran, T.; Truong, B.-H.; Nguyen, T.P.; Nguyen, T.A.; Duong, T.L.; Vo, D.N. Reconfiguration of distribution networks with distributed generations using an improved neural network algorithm. IEEE Access 2021, 9, 165618–165647.
- Marugán, A.P.; Márquez, F.P.G.; Perez, J.M.P.; Ruiz-Hernández, D. A survey of artificial neural network in wind energy systems. Appl. Energy 2018, 228, 1822–1836.
- Srivastava, N.; Hinton, G.; Krizhevsky, A.; Sutskever, I.; Salakhutdinov, R. Dropout: A simple way to prevent neural networks from overfitting. J. Mach. Learn. Res. 2014, 15, 1929–1958.
- Bhandari, D.; Paul, S.; Narayan, A. Deep neural networks for multimodal data fusion and affect recognition. Int. J. Artif. Intell. Soft Comput. 2020, 7, 130–145.
- Agrawal, A.; Barratt, S.; Boyd, S. Learning convex optimization models. IEEE/CAA J. Autom. Sin. 2021, 8, 1355–1364.
- Hirasawa, T.; Aoyama, K.; Tanimoto, T.; Ishihara, S.; Shichijo, S.; Ozawa, T.; Ohnishi, T.; Fujishiro, M.; Matsuo, K.; Fujisaki, J. Application of artificial intelligence using a convolutional neural network for detecting gastric cancer in endoscopic images. Gastric Cancer 2018, 21, 653–660.
- Paoletti, M.; Haut, J.; Plaza, J.; Plaza, A. A new deep convolutional neural network for fast hyperspectral image classification. ISPRS J. Photogramm. Remote Sens. 2018, 145, 120–147.
- Devin, C.; Gupta, A.; Darrell, T.; Abbeel, P.; Levine, S. Learning modular neural network policies for multi-task and multi-robot transfer. In Proceedings of the 2017 IEEE International Conference on Robotics and Automation (ICRA), Singapore, 29 May–3 June 2017; pp. 2169–2176.
- Parashar, S.; Senthilnath, J.; Yang, X.-S. A novel bat algorithm fuzzy classifier approach for classification problems. Int. J. Artif. Intell. Soft Comput. 2017, 6, 108–128.
- Laudani, A.; Lozito, G.M.; Riganti Fulginei, F.; Salvini, A. On training efficiency and computational costs of a feedforward neural network: A review. Comput. Intell. Neurosci. 2015, 2015, 818243.
- Cui, Z.; Xue, F.; Cai, X.; Cao, Y.; Wang, G.-G.; Chen, J. Detection of malicious code variants based on deep learning. IEEE Trans. Ind. Inform. 2018, 14, 3187–3196.
- Herrera, F.; Lozano, M.; Molina, D. Test Suite for the Special Issue of Soft Computing on Scalability of Evolutionary Algorithms and Other Metaheuristics for Large Scale Continuous Optimization Problems. Available online: http://150.214.190.154/sites/default/files/files/TematicWebSites/EAMHCO/functions1-19.pdf (accessed on 25 April 2022).
- Liang, J.J.; Qu, B.Y.; Suganthan, P.N. Problem Definitions and Evaluation Criteria for the CEC 2014 Special Session and Competition on Single Objective Real-Parameter Numerical Optimization; Technical Report 201311; Computational Intelligence Laboratory, Zhengzhou University: Zhengzhou, China; Nanyang Technological University: Singapore, 2013.
Function | Name |
---|---|
F1 | Shifted Sphere Function |
F2 | Shifted Schwefel Problem 2.21 |
F3 | Shifted Rosenbrock’s Function |
F4 | Shifted Rastrigin’s Function |
F5 | Shifted Griewank’s Function |
F6 | Shifted Ackley’s Function |
F7 | Schwefel’s Problem 2.22 |
F8 | Schwefel’s Problem 1.2 |
F9 | Extended f10 |
F10 | Bohachevsky |
F11 | Schaffer |
Function | First Function | Second Function | Weight Factor |
---|---|---|---|
F12 | F9 | F1 | 0.25 |
F13 | F9 | F4 | 0.25 |
F14 | F5 | F1 | 0.5 |
F15 | F3 | F4 | 0.5 |
F16 | F9 | F1 | 0.75 |
F17 | F9 | F3 | 0.75 |
F18 | F9 | F4 | 0.75 |
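One plausible reading of the hybrid benchmarks above, assuming (as in the Herrera–Lozano–Molina large-scale test suite this table appears to follow) that the decision variables are split between the two component functions, with the weight factor giving the fraction assigned to the first function. The split convention and the stand-in components below are assumptions, not the official definitions.

```python
import numpy as np

def sphere(z):
    return float(np.sum(z * z))

def rastrigin(z):
    return float(np.sum(z * z - 10.0 * np.cos(2.0 * np.pi * z) + 10.0))

def hybrid(x, f_first, f_second, weight):
    # Split the decision vector: `weight` fraction of the variables goes to the
    # first component function, the rest to the second; sum the partial values.
    x = np.asarray(x, dtype=float)
    k = int(round(weight * x.size))
    return f_first(x[:k]) + f_second(x[k:])

# e.g. a 0.5-weighted hybrid of two stand-in components, as in F14/F15
f = lambda x: hybrid(x, sphere, rastrigin, weight=0.5)
value_at_origin = f(np.zeros(10))
```

Both stand-in components vanish at the origin, so the composed function does too, which makes the split easy to sanity-check.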
Function | Range | Optimum | Unimodal/Multimodal | Separable | Shifted | f_bias |
---|---|---|---|---|---|---|
F1 | [−100, 100]^D | 0 | U | Y | Y | −450 |
F2 | [−100, 100]^D | 0 | U | N | Y | −450 |
F3 | [−100, 100]^D | 0 | M | Y | Y | 390 |
F4 | [−5, 5]^D | 0 | M | Y | Y | −330 |
F5 | [−600, 600]^D | 0 | M | N | Y | −180 |
F6 | [−32, 32]^D | 0 | M | Y | Y | −140 |
F7 | [−10, 10]^D | 0 | U | Y | N | − |
F8 | [−65.536, 65.536]^D | 0 | U | N | N | − |
F9 | [−100, 100]^D | 0 | U | N | N | − |
F10 | [−15, 15]^D | 0 | U | Y | N | − |
F11 | [−100, 100]^D | 0 | U | Y | N | − |
F12 | [−100, 100]^D | 0 | U | N | Y | −450 |
F13 | [−5, 5]^D | 0 | M | N | Y | −330 |
F14 | [−100, 100]^D | 0 | U | N | Y | −630 |
F15 | [−10, 10]^D | 0 | M | Y | Y | 60 |
F16 | [−100, 100]^D | 0 | U | N | Y | −450 |
F17 | [−100, 100]^D | 0 | M | N | Y | 390 |
F18 | [−5, 5]^D | 0 | M | N | Y | −330 |
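The Shifted/f_bias columns can be read as follows: a shifted benchmark moves the optimum from the origin to a shift vector o and adds the bias, so the optimal function value equals f_bias and the reported error is f(x) − f_bias. A minimal sketch for F1 (Shifted Sphere); the shift vector here is illustrative, not the official one from the test suite.

```python
import numpy as np

def shifted_sphere(x, o, f_bias=-450.0):
    z = np.asarray(x, dtype=float) - o   # shift: the optimum moves from 0 to o
    return float(np.sum(z * z)) + f_bias

rng = np.random.default_rng(42)
D = 10
o = rng.uniform(-100.0, 100.0, D)        # illustrative shift inside [-100, 100]^D
value_at_optimum = shifted_sphere(o, o)  # equals f_bias at the shifted optimum
error = shifted_sphere(o + 0.1, o) - (-450.0)   # the tables report f(x) - f_bias
```

This is why the "Optimum" column lists 0 for every function even when f_bias is nonzero: it refers to the optimal error, not the raw function value.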
Methods | Parameters | Optimal Values |
---|---|---|
GA | N | 50 |
 | Pc | 0.8 |
 | Pm | 0.2 |
PSO | N | 50 |
 | C1, C2 | 2 |
 | w | 0.9 |
HS | N | 50 |
 | HMCR | 0.95 |
 | PAR | 0.3 |
CCLNNA | N | 50 |
NNA | N | 50 |
DESNNA | N | 50 |
 | rateOfSelect | 0.15 |
 | rateOfDropout | 0.10 |
Function | Methods | Best Error | Average Error | Worst Error | Error Standard Deviation |
---|---|---|---|---|---|
F1 | NNA | 5.684 × 10−14 | 5.684 × 10−14 | 3.411 × 10−13 | 1.339 × 10−13 |
 | DESNNA | 0 | 0 | 1.137 × 10−13 | 5.971 × 10−14 |
F2 | NNA | 7.209 × 10−2 | 3.251 × 10−1 | 1.425 × 100 | 2.901 × 10−1 |
 | DESNNA | 2.626 × 10−2 | 3.152 × 10−1 | 1.010 × 100 | 2.389 × 10−1 |
F3 | NNA | 4.822 × 101 | 5.860 × 101 | 2.094 × 102 | 3.953 × 101 |
 | DESNNA | 4.822 × 101 | 4.822 × 101 | 4.822 × 101 | 3.493 × 10−13 |
F4 | NNA | 4.547 × 10−13 | 1.825 × 100 | 7.960 × 100 | 2.653 × 100 |
 | DESNNA | 0 | 5.978 × 10−1 | 4.975 × 100 | 1.470 × 100 |
F5 | NNA | 5.684 × 10−14 | 5.754 × 10−4 | 9.865 × 10−3 | 2.213 × 10−3 |
 | DESNNA | 0 | 0 | 5.684 × 10−14 | 3.077 × 10−14 |
F6 | NNA | 2.154 × 10−11 | 7.012 × 10−11 | 1.863 × 10−10 | 3.857 × 10−11 |
 | DESNNA | 5.684 × 10−14 | 2.842 × 10−14 | 1.705 × 10−13 | 5.853 × 10−14 |
F7 | NNA | 3.132 × 10−12 | 3.177 × 10−11 | 1.101 × 10−10 | 3.063 × 10−11 |
 | DESNNA | 6.335 × 10−15 | 6.297 × 10−14 | 5.688 × 10−13 | 1.072 × 10−13 |
F8 | NNA | 9.448 × 10−4 | 5.364 × 10−3 | 2.151 × 10−2 | 5.002 × 10−3 |
 | DESNNA | 7.039 × 10−5 | 1.794 × 10−3 | 1.380 × 10−2 | 2.701 × 10−3 |
F9 | NNA | 3.105 × 10−1 | 4.330 × 100 | 2.096 × 101 | 4.678 × 100 |
 | DESNNA | 1.418 × 10−2 | 3.460 × 100 | 1.369 × 101 | 4.206 × 100 |
F10 | NNA | 0 | 3.701 × 10−17 | 2.220 × 10−16 | 8.417 × 10−17 |
 | DESNNA | 0 | 0 | 0 | 0 |
F11 | NNA | 2.143 × 10−1 | 5.928 × 100 | 2.416 × 101 | 6.384 × 100 |
 | DESNNA | 1.787 × 10−2 | 2.126 × 100 | 1.356 × 101 | 3.941 × 100 |
F12 | NNA | 4.151 × 10−5 | 2.217 × 10−2 | 6.596 × 10−1 | 1.204 × 10−1 |
 | DESNNA | 5.449 × 10−10 | 1.710 × 10−8 | 8.094 × 10−8 | 1.962 × 10−8 |
F13 | NNA | 1.906 × 10−5 | 1.727 × 100 | 9.720 × 100 | 2.649 × 100 |
 | DESNNA | 2.314 × 10−8 | 1.017 × 100 | 7.469 × 100 | 2.367 × 100 |
F14 | NNA | 2.274 × 10−13 | 8.102 × 10−3 | 2.061 × 10−1 | 3.772 × 10−2 |
 | DESNNA | 0 | 3.196 × 10−3 | 5.157 × 10−2 | 1.120 × 10−2 |
F15 | NNA | 2.346 × 101 | 2.356 × 101 | 2.645 × 101 | 5.449 × 10−1 |
 | DESNNA | 2.346 × 101 | 2.346 × 101 | 2.346 × 101 | 5.357 × 10−10 |
F16 | NNA | 3.345 × 10−2 | 2.657 × 100 | 8.910 × 100 | 2.854 × 100 |
 | DESNNA | 1.699 × 10−5 | 2.869 × 10−1 | 4.255 × 100 | 8.419 × 10−1 |
F17 | NNA | 1.096 × 101 | 4.957 × 101 | 2.714 × 102 | 5.833 × 101 |
 | DESNNA | 1.061 × 101 | 2.680 × 101 | 1.434 × 102 | 3.733 × 101 |
F18 | NNA | 1.152 × 10−2 | 2.101 × 100 | 1.293 × 101 | 3.218 × 100 |
 | DESNNA | 7.856 × 10−6 | 1.070 × 100 | 1.092 × 101 | 2.881 × 100 |
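Each row of the tables aggregates the final errors of repeated independent runs. A sketch of how such a row can be produced; the run count of 30 and the stand-in random-search "optimizer" are assumptions (this excerpt does not state the number of runs used in the experiments).

```python
import numpy as np

def error_stats(run_once, n_runs=30):
    # One final error per independent run; the table columns are
    # min / mean / max / standard deviation over the runs.
    errors = np.array([run_once(seed) for seed in range(n_runs)])
    return errors.min(), errors.mean(), errors.max(), errors.std()

# Illustrative stand-in optimizer: best of random sampling on the sphere,
# whose error is simply f(x_best) - f* with f* = 0.
def random_search(seed, dim=10, evals=2000):
    rng = np.random.default_rng(seed)
    X = rng.uniform(-100.0, 100.0, (evals, dim))
    return float((X ** 2).sum(axis=1).min())

best, avg, worst, std = error_stats(random_search)
```

Any real optimizer with a per-run seed (e.g. the NNA or DESNNA loop) can be dropped in for `random_search` to reproduce a table row.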
Methods | Best Error | Average Error | Worst Error | Error Standard Deviation |
---|---|---|---|---|
F1 | ||||
DESNNA | 0 | 0 | 1.137 × 10−13 | 5.971 × 10−14 |
CCLNNA | 7.135 × 10−9 | 3.697 × 10−8 | 1.215 × 10−7 | 2.172 × 10−8 |
TSA | 5.684 × 10−14 | 5.684 × 10−14 | 1.137 × 10−13 | 5.382 × 10−14 |
PSO | 1.137 × 10−13 | 1.137 × 10−13 | 5.116 × 10−13 | 2.635 × 10−13 |
GA | 1.503 × 10−8 | 3.612 × 10−8 | 1.002 × 10−7 | 1.858 × 10−8 |
HS | 1.694 × 103 | 2.893 × 103 | 3.857 × 103 | 5.336 × 102 |
F2 | ||||
DESNNA | 2.626 × 10−2 | 3.152 × 10−1 | 1.010 × 100 | 2.389 × 10−1 |
CCLNNA | 8.691 × 10−3 | 2.016 × 10−2 | 3.789 × 10−2 | 6.979 × 10−3 |
TSA | 1.070 × 10−8 | 1.568 × 10−6 | 7.838 × 10−6 | 2.323 × 10−6 |
PSO | 2.019 × 101 | 2.019 × 101 | 2.696 × 101 | 3.218 × 100 |
GA | 1.367 × 100 | 2.145 × 100 | 3.068 × 100 | 3.789 × 10−1 |
HS | 4.165 × 101 | 4.724 × 101 | 5.265 × 101 | 2.566 × 100 |
F3 | ||||
DESNNA | 4.822 × 101 | 4.822 × 101 | 4.822 × 101 | 3.493 × 10−13 |
CCLNNA | 4.822 × 101 | 5.123 × 101 | 1.308 × 102 | 1.507 × 101 |
TSA | 4.841 × 101 | 4.863 × 101 | 4.882 × 101 | 1.484 × 10−1 |
PSO | 2.430 × 1010 | 2.430 × 1010 | 3.757 × 1010 | 7.454 × 109 |
GA | 4.822 × 101 | 4.825 × 101 | 4.861 × 101 | 8.374 × 10−2 |
HS | 8.008 × 107 | 1.502 × 108 | 2.711 × 108 | 4.366 × 107 |
F4 | ||||
DESNNA | 0 | 5.978 × 10−1 | 4.975 × 100 | 1.470 × 100 |
CCLNNA | 1.020 × 10−8 | 3.317 × 10−2 | 9.950 × 10−1 | 1.817 × 10−1 |
TSA | 2.270 × 102 | 3.014 × 102 | 3.827 × 102 | 4.299 × 101 |
PSO | 5.530 × 102 | 5.530 × 102 | 6.947 × 102 | 5.642 × 101 |
GA | 8.955 × 100 | 2.030 × 101 | 5.373 × 101 | 9.087 × 100 |
HS | 2.478 × 102 | 2.783 × 102 | 3.106 × 102 | 1.740 × 101 |
F5 | ||||
DESNNA | 0 | 0 | 5.684 × 10−14 | 3.077 × 10−14 |
CCLNNA | 3.530 × 10−8 | 7.538 × 10−3 | 5.867 × 10−2 | 1.363 × 10−2 |
TSA | 2.842 × 10−14 | 3.489 × 10−3 | 2.495 × 10−2 | 6.073 × 10−3 |
PSO | 2.170 × 10−1 | 2.170 × 10−1 | 9.131 × 10−1 | 3.363 × 10−1 |
GA | 3.800 × 10−10 | 1.802 × 10−3 | 3.680 × 10−2 | 6.969 × 10−3 |
HS | 1.897 × 101 | 3.000 × 101 | 3.834 × 101 | 4.877 × 100 |
F6 | ||||
DESNNA | 5.684 × 10−14 | 2.842 × 10−14 | 1.705 × 10−13 | 5.853 × 10−14 |
CCLNNA | 1.605 × 10−5 | 3.366 × 10−5 | 5.397 × 10−5 | 7.631 × 10−6 |
TSA | 8.527 × 10−14 | 1.060 × 100 | 3.385 × 100 | 1.436 × 100 |
PSO | 1.919 × 10−7 | 1.919 × 10−7 | 1.809 × 10−6 | 3.368 × 10−7 |
GA | 9.199 × 10−5 | 1.027 × 100 | 1.945 × 100 | 6.774 × 10−1 |
HS | 7.334 × 100 | 9.207 × 100 | 1.034 × 101 | 6.722 × 10−1 |
F7 | ||||
DESNNA | 6.335 × 10−15 | 6.297 × 10−14 | 5.688 × 10−13 | 1.072 × 10−13 |
CCLNNA | 6.837 × 10−5 | 1.053 × 10−4 | 1.493 × 10−4 | 2.521 × 10−5 |
TSA | 1.162 × 10−132 | 2.264 × 10−127 | 6.129 × 10−126 | 1.117 × 10−126 |
PSO | 1.097 × 10−10 | 1.097 × 10−10 | 1.058 × 10−9 | 1.998 × 10−10 |
GA | 4.951 × 10−1 | 2.597 × 100 | 4.642 × 100 | 1.114 × 100 |
HS | 1.917 × 101 | 2.156 × 101 | 2.353 × 101 | 1.276 × 100 |
F8 | ||||
DESNNA | 7.039 × 10−5 | 1.794 × 10−3 | 1.380 × 10−2 | 2.701 × 10−3 |
CCLNNA | 3.661 × 10−2 | 8.557 × 10−2 | 1.540 × 10−1 | 2.742 × 10−2 |
TSA | 6.565 × 10−58 | 1.529 × 10−34 | 3.362 × 10−33 | 6.436 × 10−34 |
PSO | 3.630 × 104 | 3.630 × 104 | 4.899 × 104 | 5.672 × 103 |
GA | 9.505 × 10−1 | 9.095 × 100 | 4.847 × 101 | 1.405 × 101 |
HS | 3.420 × 104 | 5.157 × 104 | 6.727 × 104 | 9.005 × 103 |
F9 | ||||
DESNNA | 1.418 × 10−2 | 3.460 × 100 | 1.369 × 101 | 4.206 × 100 |
CCLNNA | 7.099 × 100 | 1.795 × 101 | 3.166 × 101 | 4.477 × 100 |
TSA | 2.168 × 101 | 5.648 × 101 | 1.543 × 102 | 2.840 × 101 |
PSO | 4.944 × 102 | 4.944 × 102 | 5.464 × 102 | 3.333 × 101 |
GA | 2.670 × 101 | 3.818 × 101 | 4.814 × 101 | 4.947 × 100 |
HS | 1.469 × 102 | 1.863 × 102 | 2.114 × 102 | 1.564 × 101 |
F10 | ||||
DESNNA | 0 | 0 | 0 | 0 |
CCLNNA | 1.485 × 10−8 | 4.157 × 10−8 | 1.051 × 10−7 | 2.222 × 10−8 |
TSA | 0 | 4.246 × 100 | 3.191 × 101 | 9.976 × 100 |
PSO | 3.892 × 103 | 3.892 × 103 | 5.628 × 103 | 7.097 × 102 |
GA | 5.249 × 100 | 8.968 × 100 | 1.727 × 101 | 2.900 × 100 |
HS | 1.519 × 102 | 1.987 × 102 | 2.723 × 102 | 3.068 × 101 |
F11 | ||||
DESNNA | 1.787 × 10−2 | 2.126 × 100 | 1.356 × 101 | 3.941 × 100 |
CCLNNA | 8.918 × 100 | 1.739 × 101 | 3.008 × 101 | 4.385 × 100 |
TSA | 1.440 × 101 | 5.089 × 101 | 1.683 × 102 | 3.334 × 101 |
PSO | 4.653 × 102 | 4.653 × 102 | 5.113 × 102 | 2.339 × 101 |
GA | 2.475 × 101 | 3.564 × 101 | 4.380 × 101 | 4.903 × 100 |
HS | 1.603 × 102 | 1.807 × 102 | 2.132 × 102 | 1.307 × 101 |
F12 | ||||
DESNNA | 5.449 × 10−10 | 1.710 × 10−8 | 8.094 × 10−8 | 1.962 × 10−8 |
CCLNNA | 8.100 × 10−2 | 2.453 × 10−1 | 1.333 × 100 | 2.410 × 10−1 |
TSA | 1.137 × 10−13 | 2.020 × 100 | 1.651 × 101 | 4.570 × 100 |
PSO | 3.922 × 104 | 3.922 × 104 | 7.287 × 104 | 1.119 × 104 |
GA | 7.491 × 10−2 | 4.669 × 100 | 1.343 × 101 | 3.597 × 100 |
HS | 7.270 × 102 | 1.231 × 103 | 2.018 × 103 | 3.087 × 102 |
F13 | ||||
DESNNA | 2.314 × 10−8 | 1.017 × 100 | 7.469 × 100 | 2.367 × 100 |
CCLNNA | 1.724 × 10−2 | 3.492 × 10−1 | 2.526 × 100 | 6.498 × 10−1 |
TSA | 1.577 × 102 | 2.325 × 102 | 3.012 × 102 | 3.723 × 101 |
PSO | 4.077 × 102 | 4.077 × 102 | 5.031 × 102 | 4.723 × 101 |
GA | 1.084 × 101 | 2.201 × 101 | 3.664 × 101 | 5.255 × 100 |
HS | 1.602 × 102 | 1.989 × 102 | 2.269 × 102 | 1.666 × 101 |
F14 | ||||
DESNNA | 0 | 3.196 × 10−3 | 5.157 × 10−2 | 1.120 × 10−2 |
CCLNNA | 6.372 × 10−8 | 2.275 × 10−2 | 9.816 × 10−2 | 2.461 × 10−2 |
TSA | 2.274 × 10−13 | 7.897 × 10−3 | 4.276 × 10−2 | 1.055 × 10−2 |
PSO | 2.045 × 104 | 2.045 × 104 | 3.515 × 104 | 7.255 × 103 |
GA | 2.863 × 10−7 | 1.402 × 10−2 | 5.987 × 10−2 | 1.993 × 10−2 |
HS | 7.064 × 101 | 1.534 × 102 | 2.943 × 102 | 6.223 × 101 |
F15 | ||||
DESNNA | 2.346 × 101 | 2.346 × 101 | 2.346 × 101 | 5.357 × 10−10 |
CCLNNA | 2.346 × 101 | 2.347 × 101 | 2.356 × 101 | 1.750 × 10−2 |
TSA | 8.121 × 101 | 1.363 × 102 | 2.428 × 102 | 3.715 × 101 |
PSO | 6.650 × 105 | 6.650 × 105 | 1.519 × 106 | 3.700 × 105 |
GA | 2.546 × 101 | 4.732 × 101 | 1.409 × 102 | 2.594 × 101 |
HS | 1.258 × 103 | 1.565 × 103 | 1.953 × 103 | 2.054 × 102 |
F16 | ||||
DESNNA | 1.699 × 10−5 | 2.869 × 10−1 | 4.255 × 100 | 8.419 × 10−1 |
CCLNNA | 5.691 × 100 | 1.168 × 101 | 2.231 × 101 | 4.565 × 100 |
TSA | 8.421 × 100 | 4.459 × 101 | 1.386 × 102 | 2.386 × 101 |
PSO | 6.533 × 103 | 6.533 × 103 | 1.763 × 104 | 4.735 × 103 |
GA | 1.568 × 101 | 2.582 × 101 | 3.621 × 101 | 5.827 × 100 |
HS | 1.653 × 102 | 2.057 × 102 | 2.295 × 102 | 1.711 × 101 |
F17 | ||||
DESNNA | 1.061 × 101 | 2.680 × 101 | 1.434 × 102 | 3.733 × 101 |
CCLNNA | 1.610 × 101 | 3.373 × 101 | 8.777 × 101 | 1.650 × 101 |
TSA | 2.165 × 101 | 6.243 × 101 | 4.272 × 102 | 7.386 × 101 |
PSO | 2.160 × 109 | 2.160 × 109 | 9.648 × 109 | 2.406 × 109 |
GA | 3.540 × 101 | 5.233 × 101 | 9.555 × 101 | 1.336 × 101 |
HS | 5.139 × 102 | 2.355 × 103 | 9.487 × 103 | 2.902 × 103 |
F18 | ||||
DESNNA | 7.856 × 10−6 | 1.070 × 100 | 1.092 × 101 | 2.881 × 100 |
CCLNNA | 5.046 × 10−1 | 2.065 × 100 | 4.341 × 100 | 1.014 × 100 |
TSA | 8.064 × 101 | 1.093 × 102 | 1.443 × 102 | 1.659 × 101 |
PSO | 1.737 × 102 | 1.737 × 102 | 2.260 × 102 | 1.931 × 101 |
GA | 2.264 × 101 | 3.461 × 101 | 4.626 × 101 | 6.337 × 100 |
HS | 8.716 × 101 | 9.495 × 101 | 1.038 × 102 | 4.256 × 100 |
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Wang, Y.; Wang, K.; Wang, G. Neural Network Algorithm with Dropout Using Elite Selection. Mathematics 2022, 10, 1827. https://doi.org/10.3390/math10111827