Dendritic Growth Optimization: A Novel Nature-Inspired Algorithm for Real-World Optimization Problems
Abstract
1. Introduction
- DGO is a novel addition to the optimization toolkit. It diverges from traditional methods and existing nature-inspired algorithms by capitalizing on dendritic growth principles.
- The architecture of DGO is inherently distinctive. Inspired by the branching and connection mechanisms observed in dendritic structures, it introduces an innovative way of navigating complex solution spaces.
- The study details how the proposed strategy mitigates the local minima problem and steers the search toward a global optimum.
- In this paper, DGO is not limited to theoretical concepts but is applied practically to various datasets across different domains. This pragmatic approach demonstrates adaptability and relevance in diverse problem-solving scenarios and validates its generalizable nature.
- DGO’s performance is rigorously assessed using a variety of evaluation metrics. The study does not limit itself to a single criterion but provides a comprehensive analysis, ensuring a thorough understanding of its strengths and limitations.
- The study’s applications, potential challenges, and limitations are presented. By openly discussing these aspects, the study provides a balanced view of DGO’s capabilities and areas for potential improvement.
2. Materials and Methods
2.1. Related Works
2.2. Methodology
2.2.1. Mathematical Representation of DGO in Action
2.2.2. DGO and the Local Minima Problem
- DGO introduces genetic diversity by incorporating mutation operations. Mutation involves making small, random changes to candidate solutions. This helps prevent premature convergence to local minima, as individuals in the population explore nearby regions of the search space.
- DGO balances exploration and exploitation. While it encourages the exploration of new regions through mutation and crossover, it also exploits promising solutions by selecting them to be part of the next generation. This allows DGO to benefit from both exploration (escaping local minima) and exploitation (refining promising solutions).
- DGO uses crossover operations to exchange information between candidate solutions. Combining traits from multiple individuals creates offspring that inherit characteristics from their parents. This sharing of genetic information allows DGO to escape local minima by combining the strengths of different solutions.
- Many variants of DGO incorporate adaptive strategies. These mechanisms allow DGO to adjust its behavior dynamically during optimization. If the algorithm detects that it is converging too quickly or becoming trapped in a local minimum, it can shift its focus toward exploration by increasing mutation rates or altering other parameters, as illustrated in the sketch after this list.
- DGO preserves good solutions throughout the optimization process. Even if a local minimum is encountered, DGO does not immediately discard it. Instead, it retains the solution as part of the population, ensuring that valuable information is not lost. This helps prevent the complete abandonment of promising solutions.
- Some DGO variants incorporate memory mechanisms to store historical information about solutions and their performance. This memory can guide the optimization process and prevent revisiting regions of the search space that have been explored extensively.
- DGO may dynamically adjust its parameters, such as mutation rates and selection pressure, to adapt to the optimization problem’s characteristics. This adaptability helps the algorithm effectively navigate challenging landscapes with local minima.
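The following is a minimal sketch of the adaptive mutation mechanism described above, assuming a real-valued solution encoding and a minimization objective. The function names (`adapt_mutation_rate`, `mutate`) and the specific thresholds are illustrative assumptions, not the paper’s reference implementation.

```python
import numpy as np

def adapt_mutation_rate(rate, best_history, stall_window=5,
                        increase=1.5, decrease=0.9, low=0.01, high=0.5):
    """Raise the mutation rate when the best fitness stalls (a sign of
    premature convergence); lower it again once progress resumes.
    `best_history` holds the best fitness per generation (lower is better)."""
    if len(best_history) < stall_window + 1:
        return rate
    improvement = best_history[-stall_window - 1] - best_history[-1]
    if improvement <= 1e-12:                  # stalled: push exploration
        return min(rate * increase, high)
    return max(rate * decrease, low)          # improving: favor exploitation

def mutate(solution, rate, sigma=0.1, rng=None):
    """Gaussian mutation: each gene is perturbed with probability `rate`."""
    rng = rng or np.random.default_rng()
    mask = rng.random(solution.shape) < rate
    return solution + mask * rng.normal(0.0, sigma, solution.shape)
```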
- Initialization—DGO begins by initializing a population of candidate solutions.
- Evaluate Fitness—The fitness of each candidate solution is evaluated based on the objective function.
- Select Individuals for Reproduction—DGO selects individuals from the population, applies crossover and mutation operations, and generates offspring. Perturbation refers to diverse strategies for introducing disturbances to the entire population or to specific solution subsets, whereas mutation is a specialized genetic operator applied during reproduction to individual solutions. Perturbation sustains population diversity and forestalls premature convergence, while mutation fosters genetic diversity in offspring to explore uncharted regions of the search space.
- Fitness Evaluation for Offspring—The fitness of the offspring is evaluated.
- Population Replacement—A portion of the current population is replaced with the offspring.
- Adaptive Strategies/Global Exploration Check—DGO may employ adaptive mechanisms to dynamically adjust its behavior.
- Termination Criteria—The optimization process continues until termination criteria are met, which could include a maximum number of iterations, convergence threshold, or other conditions.
- End—The algorithm concludes when it either finds a solution that meets the termination criteria or reaches a predefined stopping point. A skeletal implementation of this loop is sketched below.
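To make the steps above concrete, here is a compact sketch of the loop for a real-valued minimization problem. It is an illustration under stated assumptions, not the authors’ code: binary tournament selection, uniform crossover, Gaussian mutation, and elitist replacement stand in for whatever concrete operators a full DGO implementation would use.

```python
import numpy as np

def dgo_minimize(objective, dim, pop_size=30, max_iters=200, tol=1e-8, seed=0):
    """Skeleton of the loop outlined above (illustrative): initialize,
    evaluate, reproduce via crossover/mutation, replace, adapt, terminate."""
    rng = np.random.default_rng(seed)
    pop = rng.uniform(-1.0, 1.0, (pop_size, dim))        # initialization
    fitness = np.apply_along_axis(objective, 1, pop)     # evaluate fitness
    mut_rate, best_history = 0.1, []

    for _ in range(max_iters):
        # Selection: binary tournaments pick one parent per offspring slot.
        idx = rng.integers(0, pop_size, (pop_size, 2))
        winners = np.where((fitness[idx[:, 0]] < fitness[idx[:, 1]])[:, None],
                           pop[idx[:, 0]], pop[idx[:, 1]])
        # Uniform crossover with a randomly permuted mate, then mutation.
        mates = pop[rng.permutation(pop_size)]
        cross = rng.random((pop_size, dim)) < 0.5
        offspring = np.where(cross, winners, mates)
        mask = rng.random(offspring.shape) < mut_rate
        offspring += mask * rng.normal(0.0, 0.1, offspring.shape)

        # Offspring evaluation and elitist replacement: the best of the merged
        # pool survive, so promising solutions are never discarded outright.
        off_fit = np.apply_along_axis(objective, 1, offspring)
        merged = np.vstack([pop, offspring])
        merged_fit = np.concatenate([fitness, off_fit])
        keep = np.argsort(merged_fit)[:pop_size]
        pop, fitness = merged[keep], merged_fit[keep]

        # Adaptive/global-exploration check: widen search if progress stalls.
        best_history.append(fitness[0])
        if len(best_history) > 5 and best_history[-6] - best_history[-1] < tol:
            mut_rate = min(mut_rate * 1.5, 0.5)

    return pop[0], fitness[0]

# Example: minimize the sphere function in 10 dimensions.
best_x, best_f = dgo_minimize(lambda x: float(np.sum(x**2)), dim=10)
```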
- Regarding the Branching Mechanism, while it is true that branching involves generating multiple variants (branches) from a seed, the key distinction lies in how DGO achieves branching. In DGO, the branching mechanism is inspired by natural dendritic growth, emphasizing a more organic and nature-inspired approach.
- Regarding the concept of dendritic growth itself, DGO introduces a unique idea by mimicking dendritic growth patterns found in nature. Dendritic growth involves the iterative development of branching structures, and DGO adapts this concept to explore the solution space. This organic, hierarchical growth is distinct from traditional crossover/mutation operations.
- DGO involves a competition and pruning mechanism, where solutions compete for survival based on their fitness. The pruning process refines the population, focusing the search on promising branches. This combination of branching, competition, and pruning differentiates DGO from other evolutionary algorithms; a sketch of one growth step follows this list.
- Moreover, DGO draws inspiration from the branching and growth observed in biological systems, emphasizing the adaptation of natural processes for optimization. This biological inspiration differentiates it from algorithms that primarily focus on mathematical operations like crossover and mutation.
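As a hedged illustration of the branch–compete–prune cycle described above (again assuming a real-valued encoding and minimization; `branch_and_prune` and its parameters are hypothetical names, not from the paper):

```python
import numpy as np

def branch_and_prune(seeds, objective, branches_per_seed=4,
                     keep=10, step=0.2, rng=None):
    """One dendritic growth step: each surviving seed sprouts several
    nearby branches, all candidates compete on fitness, and pruning
    retains only the `keep` fittest for the next growth step."""
    rng = rng or np.random.default_rng()
    # Branching: perturb each seed into several child branches.
    children = np.repeat(seeds, branches_per_seed, axis=0)
    children = children + rng.normal(0.0, step, children.shape)
    # Competition: parents and children compete in a single pool.
    pool = np.vstack([seeds, children])
    fitness = np.apply_along_axis(objective, 1, pool)
    # Pruning: keep the fittest branches (minimization assumed).
    return pool[np.argsort(fitness)[:keep]]
```

Iterating this step grows a hierarchy of branches that deepens around promising regions of the search space while unpromising branches are pruned away.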
3. Results and Observations
3.1. Datasets
3.2. Performance Evaluation Metrics
- Accuracy: Accuracy is one of the most straightforward metrics, measuring the proportion of correctly predicted instances out of the total predictions. It is suitable for balanced datasets but can be misleading when dealing with imbalanced datasets. Accuracy = (True Positives + True Negatives)/Total Predictions.
- Precision: Precision assesses the model’s ability to correctly identify positive instances. It measures the ratio of true positive predictions to the total predicted positives. High precision indicates fewer false positives. Precision = True Positives/(True Positives + False Positives).
- Recall (Sensitivity or True Positive Rate): Recall evaluates the model’s capability to identify all relevant instances in the dataset. It calculates the ratio of true positives to the total actual positives. High recall implies fewer false negatives. Recall = True Positives/(True Positives + False Negatives).
- F1 Score: The F1 Score combines precision and recall into a single metric, balancing the trade-off between false positives and false negatives. It is particularly useful when dealing with imbalanced datasets. F1 Score = 2 × (Precision × Recall)/(Precision + Recall).
- ROC Curves (Receiver Operating Characteristic Curves): ROC curves are valuable for binary classification models. They illustrate the trade-off between the true positive rate (sensitivity) and the false positive rate (1 − specificity) at various thresholds. The area under the ROC curve (AUC-ROC) quantifies the model’s overall performance, ranging from 0 to 1, where 0.5 represents random guessing and 1 signifies perfect classification; a higher AUC indicates better performance. A short snippet computing these metrics follows this list.
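All five metrics can be computed directly with scikit-learn; the toy labels below are made-up values for demonstration only, not data from the study.

```python
from sklearn.metrics import (accuracy_score, precision_score,
                             recall_score, f1_score, roc_auc_score)

y_true  = [1, 0, 1, 1, 0, 1, 0, 0]                    # ground-truth labels
y_pred  = [1, 0, 1, 0, 0, 1, 1, 0]                    # hard predictions
y_score = [0.9, 0.2, 0.8, 0.4, 0.1, 0.7, 0.6, 0.3]    # positive-class scores

print("Accuracy :", accuracy_score(y_true, y_pred))
print("Precision:", precision_score(y_true, y_pred))
print("Recall   :", recall_score(y_true, y_pred))
print("F1 Score :", f1_score(y_true, y_pred))
print("AUC-ROC  :", roc_auc_score(y_true, y_score))   # needs scores, not labels
```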
3.3. Results
Table: Performance of models enhanced with DGO on the Diabetes dataset (dataset 1).

Model | Accuracy | Precision | Recall | F1-Score | Time to Run (s) |
---|---|---|---|---|---|
KNN + DGO | 0.71 | 0.70 | 0.70 | 0.69 | 1.32 |
LR + DGO | 0.78 | 0.75 | 0.75 | 0.75 | 1.65 |
ANN + DGO | 0.81 | 0.79 | 0.78 | 0.79 | 5.73 |
SVM + DGO | 0.78 | 0.76 | 0.77 | 0.78 | 8.54 |
RF + DGO | 0.75 | 0.76 | 0.70 | 0.70 | 3.18 |
CNN + DGO | 0.81 | 0.82 | 0.81 | 0.80 | 9.23 |
CNN + LSTM + DGO | 0.78 | 0.79 | 0.77 | 0.80 | 20.09 |
PSO + DGO | 0.83 | 0.81 | 0.78 | 0.81 | 26.34 |
NSGA-II + DGO | 0.79 | 0.81 | 0.81 | 0.79 | 41.04 |
ACO + DGO | 0.83 | 0.82 | 0.80 | 0.82 | 35.55 |
GA + DGO | 0.81 | 0.82 | 0.80 | 0.79 | 29.26 |
Table: Performance of models enhanced with DGO on the Breast Cancer dataset (dataset 2).

Model | Accuracy | Precision | Recall | F1-Score | Time to Run (s) |
---|---|---|---|---|---|
KNN + DGO | 0.94 | 0.93 | 0.98 | 0.95 | 1.87 |
LR + DGO | 0.98 | 0.92 | 0.99 | 0.97 | 2.03 |
ANN + DGO | 0.97 | 0.97 | 0.98 | 0.98 | 5.37 |
SVM + DGO | 0.95 | 0.97 | 0.96 | 0.96 | 9.12 |
RF + DGO | 0.96 | 0.95 | 0.98 | 0.96 | 2.89 |
CNN + DGO | 0.96 | 0.97 | 0.96 | 0.97 | 13.52 |
CNN + LSTM + DGO | 0.97 | 0.97 | 0.98 | 0.98 | 29.11 |
PSO + DGO | 0.96 | 0.94 | 0.95 | 0.94 | 38.06 |
NSGA-II + DGO | 0.96 | 0.98 | 0.92 | 0.92 | 48.66 |
ACO + DGO | 0.97 | 0.94 | 0.95 | 0.93 | 40.98 |
GA + DGO | 0.98 | 0.97 | 0.98 | 0.95 | 37.83 |
3.4. Comparative Analysis
3.5. Observations
- DGO is a newly proposed optimization algorithm inspired by natural branching patterns, making it unique in its approach to solving complex optimization problems. Its innovative architecture and principles set it apart from traditional optimization techniques.
- DGO’s architecture is designed to be adaptable and flexible, allowing it to be applied to a wide variety of optimization challenges and tasks. This flexibility makes DGO a promising candidate for diverse applications across different domains.
- DGO addresses the local minima problem by employing a combination of strategies that promote diversity, balance exploration and exploitation, encourage information exchange between solutions, and adapt to the optimization problem’s dynamics. These mechanisms work together to help DGO escape local minima and converge toward the global minimum in complex optimization scenarios.
- When DGO is applied to various machine learning and deep learning algorithms, it consistently leads to performance improvements. Metrics such as precision, accuracy, recall, F1 score, and AUC-ROC all show improvement.
- DGO’s effectiveness is validated through extensive testing on two distinct datasets, i.e., the Diabetes dataset and the Breast Cancer dataset. Both datasets serve as benchmarks to assess DGO’s performance on different data types and domains.
- DGO’s robustness is evident in its ability to tackle various optimization challenges effectively. It can adapt to complex problem scenarios and deliver meaningful results.
- DGO’s scalability is a key attribute that suggests its potential for application in large-scale and intricate problem domains. It can handle complex optimization tasks that involve a significant volume of data and parameters.
- DGO excels in efficiently exploring complex solution spaces. It leverages its inspiration from natural dendritic growth patterns to adapt and navigate diverse optimization landscapes effectively. This adaptability sets it apart from some other nature-inspired methods, which might struggle in highly nonlinear or multi-modal spaces.
- DGO often demonstrates faster convergence rates compared to certain traditional optimization algorithms and other nature-inspired techniques. Its ability to rapidly identify promising regions in the solution space contributes to quicker optimization, saving computational time and resources.
- DGO showcases versatility by consistently delivering strong performance across various problem domains and datasets. This adaptability extends its utility beyond specific niches and allows it to address a wide array of optimization challenges, whereas some other techniques may require fine-tuning for different domains.
- DGO’s innovative nature-inspired approach, drawing inspiration from dendritic growth patterns, introduces a fresh perspective to optimization. This uniqueness can be advantageous for solving novel or unconventional optimization problems where traditional methods might not be directly applicable.
- DGO strikes a balance between global and local optimization. It is capable of efficiently searching for global optima while also navigating local optima effectively. This balance is crucial for solving complex optimization tasks with multiple extrema, which can be challenging for some other algorithms.
- DGO exhibits robustness in handling noisy data and complex problem scenarios. Moreover, its scalability makes it suitable for both small and large-scale optimization problems, providing consistent performance across a wide spectrum of applications.
- DGO’s adaptability and efficiency extend its utility to various fields, including machine learning, logistics, engineering, and beyond. This interdisciplinary applicability broadens its potential impact compared to optimization techniques tailored to specific domains.
4. Limitations and Constraints
5. Conclusions and Future Work
Funding
Data Availability Statement
Conflicts of Interest
References
- Sun, S.; Cao, Z.; Zhu, H.; Zhao, J. A survey of optimization methods from a machine learning perspective. IEEE Trans. Cybern. 2019, 50, 3668–3681.
- Wei, Y.; Othman, Z.; Daud, K.M.; Luo, Q.; Zhou, Y. Advances in Slime Mould Algorithm: A Comprehensive Survey. Biomimetics 2024, 9, 31.
- Venkatasubramanian, S. Optimal Cluster head selection-based Hybrid Moth Search Algorithm with Tree Seed algorithm for multipath routing in WSN. In Proceedings of the 2023 International Conference on Networking and Communications (ICNWC), Chennai, India, 5–6 April 2023; pp. 1–7.
- Ekinci, S.; Izci, D.; Eker, E.; Abualigah, L.; Thanh, C.L.; Khatir, S. Hunger games pattern search with elite opposite-based solution for solving complex engineering design problems. Evol. Syst. 2023, 1–26.
- Ahmadianfar, I.; Halder, B.; Heddam, S.; Goliatt, L.; Tan, M.L.; Sa’adi, Z.; Yaseen, Z.M. An Enhanced Multioperator Runge–Kutta Algorithm for Optimizing Complex Water Engineering Problems. Sustainability 2023, 15, 1825.
- He, X.; Shan, W.; Zhang, R.; Heidari, A.A.; Chen, H.; Zhang, Y. Improved Colony Predation Algorithm Optimized Convolutional Neural Networks for Electrocardiogram Signal Classification. Biomimetics 2023, 8, 268.
- Ikram, R.M.A.; Mostafa, R.R.; Chen, Z.; Parmar, K.S.; Kisi, O.; Zounemat-Kermani, M. Water temperature prediction using improved deep learning methods through reptile search algorithm and weighted mean of vectors optimizer. J. Mar. Sci. Eng. 2023, 11, 259.
- Peng, L.; Cai, Z.; Heidari, A.A.; Zhang, L.; Chen, H. Hierarchical Harris hawks optimizer for feature selection. J. Adv. Res. 2023, 53, 261–278.
- Su, H.; Zhao, D.; Heidari, A.A.; Liu, L.; Zhang, X.; Mafarja, M.; Chen, H. RIME: A physics-based optimization. Neurocomputing 2023, 532, 183–214.
- Houssein, E.H.; Oliva, D.; Samee, N.A.; Mahmoud, N.F.; Emam, M.M. Liver Cancer Algorithm: A novel bio-inspired optimizer. Comput. Biol. Med. 2023, 165, 107389.
- Sun, Y.; Zong, C.; Pancheri, F.; Chen, T.; Lueth, T.C. Design of topology optimized compliant legs for bio-inspired quadruped robots. Sci. Rep. 2023, 13, 4875.
- Li, B.; Xuan, C.; Tang, W.; Zhu, Y.; Yan, K. Topology optimization of plate/shell structures with respect to eigenfrequencies using a biologically inspired algorithm. Eng. Optim. 2018, 51, 1829–1844.
- Priyadarshini, I.; Sharma, R.; Bhatt, D.; Al-Numay, M. Human activity recognition in cyber-physical systems using optimized machine learning techniques. Clust. Comput. 2023, 26, 2199–2215.
- Kumar, A.; Nadeem, M.; Banka, H. Nature inspired optimization algorithms: A comprehensive overview. Evol. Syst. 2023, 14, 141–156.
- Yuan, Y.L.; Hu, C.M.; Li, L.; Mei, Y.; Wang, X.Y. Regional-modal optimization problems and corresponding normal search particle swarm optimization algorithm. Swarm Evol. Comput. 2023, 78, 101257.
- Karimi, F.; Dowlatshahi, M.B.; Hashemi, A. SemiACO: A semi-supervised feature selection based on ant colony optimization. Expert Syst. Appl. 2023, 214, 119130.
- Corominas, G.R.; Blesa, M.J.; Blum, C. AntNetAlign: Ant colony optimization for network alignment. Appl. Soft Comput. 2023, 132, 109832.
- Yang, X.S. Nature-inspired optimization algorithms: Challenges and open problems. J. Comput. Sci. 2020, 46, 101104.
- Gao, Z.M.; Zhao, J.; Hu, Y.R.; Chen, H.F. The challenge for the nature-inspired global optimization algorithms: Non-symmetric benchmark functions. IEEE Access 2021, 9, 106317–106339.
- Mahadeva, R.; Kumar, M.; Gupta, V.; Manik, G.; Patole, S.P. Modified Whale Optimization Algorithm based ANN: A novel predictive model for RO desalination plant. Sci. Rep. 2023, 13, 2901.
- Zhao, S.; Zhang, T.; Ma, S.; Wang, M. Sea-horse optimizer: A novel nature-inspired meta-heuristic for global optimization problems. Appl. Intell. 2023, 53, 11833–11860.
- Sahoo, S.K.; Saha, A.K.; Nama, S.; Masdari, M. An improved moth flame optimization algorithm based on modified dynamic opposite learning strategy. Artif. Intell. Rev. 2023, 56, 2811–2869.
- Dhal, K.G.; Das, A.; Ray, S.; Rai, R.; Ghosh, T.K. Archimedes optimizer-based fast and robust fuzzy clustering for noisy image segmentation. J. Supercomput. 2023, 79, 3691–3730.
- Hu, G.; Yang, R.; Qin, X.; Wei, G. MCSA: Multi-strategy boosted chameleon-inspired optimization algorithm for engineering applications. Comput. Methods Appl. Mech. Eng. 2023, 403, 115676.
- Seyyedabbasi, A.; Kiani, F. Sand Cat swarm optimization: A nature-inspired algorithm to solve global optimization problems. Eng. Comput. 2023, 39, 2627–2651.
- Mohammed, H.; Rashid, T. FOX: A FOX-inspired optimization algorithm. Appl. Intell. 2023, 53, 1030–1050.
- Yuan, Y.; Shen, Q.; Wang, S.; Ren, J.; Yang, D.; Yang, Q.; Fan, J.; Mu, X. Coronavirus Mask Protection Algorithm: A New Bio-inspired Optimization Algorithm and Its Applications. J. Bionic Eng. 2023, 20, 1747–1765.
- Agushaka, J.O.; Ezugwu, A.E.; Abualigah, L. Gazelle optimization algorithm: A novel nature-inspired metaheuristic optimizer. Neural Comput. Appl. 2023, 35, 4099–4131.
- Abdel-Basset, M.; Mohamed, R.; Jameel, M.; Abouhawwash, M. Nutcracker optimizer: A novel nature-inspired metaheuristic algorithm for global optimization and engineering design problems. Knowl. Based Syst. 2023, 262, 110248.
- Hashim, F.A.; Hussien, A.G. Snake Optimizer: A novel meta-heuristic optimization algorithm. Knowl. Based Syst. 2022, 242, 108320.
- Zhong, C.; Li, G.; Meng, Z. Beluga whale optimization: A novel nature-inspired metaheuristic algorithm. Knowl. Based Syst. 2022, 251, 109215.
- Rajeswari, S.V.K.R.; Ponnusamy, V. Prediction of diabetes mellitus using machine learning algorithm. Ann. Rom. Soc. Cell Biol. 2021, 21, 5655–5662.
- Kadhim, R.R.; Kamil, M.Y. Comparison of breast cancer classification models on Wisconsin dataset. Int. J. Reconfigurable Embed. Syst. 2022.
- Shandilya, S.K.; Choi, B.J.; Kumar, A.; Upadhyay, S. Modified Firefly Optimization Algorithm-Based IDS for Nature-Inspired Cybersecurity. Processes 2023, 11, 715.
- Singh, L.K.; Khanna, M.; Thawkar, S.; Singh, R. Nature-inspired computing and machine learning based classification approach for glaucoma in retinal fundus images. Multimed. Tools Appl. 2023, 82, 42851–42899.
- Yuan, Y.; Ren, J.; Wang, S.; Wang, Z.; Mu, X.; Zhao, W. Alpine skiing optimization: A new bio-inspired optimization algorithm. Adv. Eng. Softw. 2022, 170, 103158.
- Husnain, G.; Anwar, S. An intelligent probabilistic whale optimization algorithm (i-WOA) for clustering in vehicular ad hoc networks. Int. J. Wirel. Inf. Netw. 2022, 29, 143–156.
- Patil, R.N.; Rawandale, S.; Rawandale, N.; Rawandale, U.; Patil, S. An efficient stacking based NSGA-II approach for predicting type 2 diabetes. Int. J. Electr. Comput. Eng. (IJECE) 2023, 13, 1015–1023.
Table: Baseline performance of the same models without DGO on the Diabetes dataset (referenced in Section 3.3).

Model | Accuracy | Precision | Recall | F1-Score | Time to Run (s) |
---|---|---|---|---|---|
KNN | 0.69 | 0.68 | 0.69 | 0.68 | 0.23 |
LR | 0.73 | 0.69 | 0.72 | 0.71 | 0.31 |
ANN | 0.75 | 0.71 | 0.75 | 0.73 | 2.34 |
SVM | 0.74 | 0.69 | 0.78 | 0.73 | 4.12 |
RF | 0.72 | 0.69 | 0.68 | 0.68 | 0.79 |
CNN-1D | 0.79 | 0.78 | 0.78 | 0.77 | 6.44 |
CNN-LSTM | 0.76 | 0.77 | 0.76 | 0.78 | 14.27 |
PSO | 0.81 | 0.80 | 0.77 | 0.77 | 18.14 |
NSGA-II | 0.76 | 0.78 | 0.79 | 0.78 | 26.14 |
ACO | 0.81 | 0.81 | 0.81 | 0.79 | 21.45 |
GA | 0.79 | 0.80 | 0.79 | 0.80 | 20.87 |
Table: Baseline performance of the same models without DGO on the Breast Cancer dataset (referenced in Section 3.3).

Model | Accuracy | Precision | Recall | F1-Score | Time to Run (s) |
---|---|---|---|---|---|
KNN | 0.92 | 0.93 | 0.95 | 0.94 | 0.13 |
LR | 0.97 | 0.97 | 0.98 | 0.97 | 0.19 |
ANN | 0.97 | 0.97 | 0.98 | 0.97 | 2.02 |
SVM | 0.94 | 0.92 | 0.95 | 0.96 | 5.03 |
RF | 0.93 | 0.93 | 0.97 | 0.95 | 0.45 |
CNN-1D | 0.94 | 0.96 | 0.96 | 0.95 | 8.18 |
CNN-LSTM | 0.96 | 0.97 | 0.97 | 0.97 | 19.37 |
PSO | 0.95 | 0.93 | 0.92 | 0.94 | 25.21 |
NSGA-II | 0.94 | 0.96 | 0.93 | 0.93 | 34.08 |
ACO | 0.96 | 0.95 | 0.92 | 0.95 | 29.81 |
GA | 0.96 | 0.92 | 0.94 | 0.94 | 26.03 |
Table: Comparative analysis of DGO against related nature-inspired optimization studies (referenced in Section 3.4).

Author and Year | Proposed Work | Methodology/Parameters | Results |
---|---|---|---|
Seyyedabbasi and Kiani 2023 [25] | Nature-inspired optimization algorithm, Sand Cat Swarm Optimization (SCSO) | More than 20 test functions of the CEC2019 benchmark suite | SCSO performed best in 63.3% of the test functions |
Yuan et al., 2023 [27] | Bionic optimization algorithm, Coronavirus Mask Protection Algorithm (CMPA) | CEC2020 suite problems, state-of-the-art metaheuristic algorithms | Mass and deflection improved by 16.44% and 7.49% |
Shandilya et al., 2023 [34] | Modified Firefly Optimization Algorithm-Based IDS | Early detection of suspicious nodes, event management schemes | Suspicious nodes reduced by 60–80% |
Singh et al., 2023 [35] | Nature-inspired computing for detecting glaucoma in retinal fundus images | Particle Swarm Optimization (PSO), Artificial Bee Colony (ABC), and Binary Cuckoo Search (BCS) | BCS shows up to 98.46% accuracy |
Yuan et al., 2022 [36] | Alpine skiing optimization | Mathematical modelling, performance evaluation | Braking efficiency factor is improved by 28.446% |
Husnain and Anwar 2022 [37] | Intelligent Probabilistic Whale Optimization Algorithm | Clustering in vehicular ad hoc networks | 75% improvement in cluster optimization |
Patil et al., 2023 [38] | Predicting type-2 diabetes using optimization | Stacking-based non-dominated sorting genetic algorithm (NSGA-II) | Accuracy of 81% on the Pima dataset and 89% on collected data |
Proposed Work | Dendritic Growth Optimization (DGO) Algorithm | KNN, LR, ANN, SVM, RF, CNN, CNN-LSTM, PSO, NSGA-II, ACO, GA | Significant improvement in performance (up to 83% accuracy on dataset 1 and 98% accuracy on dataset 2) |