Peak and Bad-Case Performance of Swarm and Evolutionary Optimization Algorithms

A special issue of Algorithms (ISSN 1999-4893). This special issue belongs to the section "Evolutionary Algorithms and Machine Learning".

Deadline for manuscript submissions: 20 October 2024

Special Issue Editors


Dr. Nikola Ivković
Guest Editor
Faculty of Organization and Informatics, University of Zagreb, Pavlinska ul. 2, 42000 Varaždin, Croatia
Interests: swarm and evolutionary computation; computational intelligence; optimization; computer networks

Dr. Matej Črepinšek
Guest Editor
Faculty of Electrical Engineering and Computer Science, University of Maribor, Koroška cesta 46, 2000 Maribor, Slovenia
Interests: evolutionary algorithms; artificial intelligence

Special Issue Information

Dear Colleagues, 

This Special Issue focuses on swarm intelligence and evolutionary computation algorithms in general. Being stochastic, these algorithms produce better or worse solutions by chance. As a rule, scientific studies report and analyze average performance based on the arithmetic mean. In practice, however, these algorithms can and should be executed multiple times (possibly in parallel), and the probability of obtaining a peak-performance solution can then be raised to arbitrarily high certainty. Given the trend toward increasingly parallel computing hardware in recent decades, this has become particularly practical. On the other hand, some application scenarios may require a very high probability of obtaining a solution of at least some minimally acceptable quality, and this is where bad-case performance matters.
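A minimal sketch of this reasoning, assuming a hypothetical per-run success probability p: if each independent run reaches a target solution quality with probability p, then at least one of n runs succeeds with probability 1 − (1 − p)^n, which approaches certainty rapidly as n grows.

```python
# Minimal sketch: chance that at least one of n independent runs of a
# stochastic optimizer reaches a target solution quality.
# The per-run success probability p is an assumed, illustrative value;
# in practice it would be estimated from experiments.

def success_probability(p: float, n: int) -> float:
    """P(at least one success in n independent runs) = 1 - (1 - p)^n."""
    return 1.0 - (1.0 - p) ** n

if __name__ == "__main__":
    p = 0.30  # assumed probability that a single run reaches peak quality
    for n in (1, 2, 4, 8, 16, 32):
        print(f"{n:2d} parallel runs -> success probability {success_probability(p, n):.4f}")
```

With p = 0.3, for example, 16 parallel runs already succeed with probability above 0.99, which is why peak performance becomes the practically relevant measure when multiple runs are affordable.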

Experimental studies of the peak or bad-case performance of algorithms that have previously shown state-of-the-art average performance are welcome, as are large-scale comparisons of the peak or bad-case performance of swarm intelligence and evolutionary computation algorithms. Theoretical findings concerning peak or bad-case performance are also welcome, as are parameter tuning procedures targeting peak or bad-case performance.

Dr. Nikola Ivković
Dr. Matej Črepinšek
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Algorithms is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • evolutionary computation
  • swarm intelligence
  • natural computation
  • bioinspired optimization
  • metaheuristics
  • NP-hard
  • NPO
  • optimization
  • stochastic algorithms
  • soft computing
  • computational intelligence
  • peak performance
  • bad-case performance
  • probabilities in performance of stochastic algorithms
  • confidence intervals
  • multiple runs of swarm and evolutionary algorithms

Published Papers (2 papers)


Research

20 pages, 1674 KiB  
Article
Application of Genetic Algorithms for Periodicity Recognition and Finite Sequences Sorting
by Mukhtar Zhassuzak, Marat Akhmet, Yedilkhan Amirgaliyev and Zholdas Buribayev
Algorithms 2024, 17(3), 101; https://doi.org/10.3390/a17030101 - 26 Feb 2024
Abstract
Unpredictable strings are sequences of data with complex and erratic behavior, which makes them an object of interest in various scientific fields. Unpredictable strings, which are related to chaos theory, were investigated using a genetic algorithm. This paper presents a new genetic algorithm for converting large binary sequences into their periodic form. The MakePeriod method is also presented; it optimizes the search for such periodic sequences and significantly reduces the number of generations needed to solve the problem under consideration. The deviation of a nonperiodic sequence from its periodic transformation was analyzed, and methods of crossover and mutation were investigated. The proposed algorithm and the associated conclusions can be applied to processing large sequences with different period values, and they emphasize the importance of choosing the right crossover and mutation methods when applying genetic algorithms to this task. Full article
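As a rough illustration of this kind of task, the sketch below uses a generic genetic algorithm to evolve a periodic template for a binary sequence, minimizing the Hamming deviation between the sequence and the template's periodic extension. The fitness, operators, and parameters are illustrative assumptions only and do not reproduce the paper's algorithm or its MakePeriod method.

```python
# Illustrative sketch only: a generic GA that evolves a periodic template
# (of a fixed, assumed period length) approximating a binary sequence.
# This is not the paper's algorithm or its MakePeriod method.
import random

def deviation(template, sequence):
    """Hamming distance between the sequence and the periodic extension of the template."""
    p = len(template)
    return sum(1 for i, bit in enumerate(sequence) if bit != template[i % p])

def evolve(sequence, period, pop_size=40, generations=200, p_mut=0.02):
    # Individuals are candidate templates of length `period`.
    pop = [[random.randint(0, 1) for _ in range(period)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda ind: deviation(ind, sequence))
        elite = pop[: pop_size // 2]                       # truncation selection
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = random.sample(elite, 2)
            cut = random.randint(1, period - 1)            # one-point crossover
            child = a[:cut] + b[cut:]
            child = [bit ^ (random.random() < p_mut) for bit in child]  # bit-flip mutation
            children.append(child)
        pop = elite + children
    best = min(pop, key=lambda ind: deviation(ind, sequence))
    return best, deviation(best, sequence)

if __name__ == "__main__":
    random.seed(0)
    seq = [random.randint(0, 1) for _ in range(512)]       # stand-in for a large binary sequence
    template, dev = evolve(seq, period=16)
    print("best template:", "".join(map(str, template)), "| deviation:", dev)
```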

15 pages, 758 KiB  
Article
Measuring the Performance of Ant Colony Optimization Algorithms for the Dynamic Traveling Salesman Problem
by Michalis Mavrovouniotis, Maria N. Anastasiadou and Diofantos Hadjimitsis
Algorithms 2023, 16(12), 545; https://doi.org/10.3390/a16120545 - 28 Nov 2023
Cited by 1
Abstract
Ant colony optimization (ACO) has proven its adaptation capabilities on optimization problems with dynamic environments. In this work, the dynamic traveling salesman problem (DTSP) is used as the base problem to generate dynamic test cases. Two types of dynamic changes for the DTSP are considered: (1) node changes and (2) weight changes. In the experiments, ACO algorithms are systematically compared on different DTSP test cases. Statistical tests are performed on the arithmetic mean and standard deviation of the ACO algorithms' results, which is the standard method of comparing ACO algorithms. To complement the comparisons, quantiles of the result distribution are also used to measure the peak-, average-, and bad-case performance of the ACO algorithms. The experimental results demonstrate advantages of using quantiles for evaluating the performance of ACO algorithms in some DTSP test cases. Full article
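A minimal sketch of the quantile-based view described in the abstract, using synthetic cost values rather than the paper's data: for a minimization problem, a low quantile of the final tour costs over repeated runs reflects peak performance, the median reflects typical behavior, and a high quantile reflects bad-case performance.

```python
# Sketch: summarizing repeated runs of a stochastic optimizer via quantiles
# rather than only the mean and standard deviation. The cost values below
# are synthetic placeholders, not results from the paper.
import numpy as np

rng = np.random.default_rng(42)
costs = rng.normal(loc=10500, scale=150, size=50)  # pretend: final tour costs of 50 runs

peak, median, bad = np.quantile(costs, [0.05, 0.50, 0.95])
print(f"peak-case  (5th percentile):   {peak:.1f}")
print(f"median     (50th percentile):  {median:.1f}")
print(f"bad-case   (95th percentile):  {bad:.1f}")
print(f"mean ± std:                    {costs.mean():.1f} ± {costs.std(ddof=1):.1f}")
```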
