Article

A Simplified Fish School Search Algorithm for Continuous Single-Objective Optimization

by Elliackin Figueiredo 1, Clodomir Santana 2, Hugo Valadares Siqueira 3, Mariana Macedo 4, Attilio Converti 5, Anu Gokhale 6 and Carmelo Bastos-Filho 1,*
1 Department of Computer Engineering, University of Pernambuco, Recife 50670-901, Brazil
2 Department of Internal Medicine, University of California, Davis, CA 95616, USA
3 Department of Electric Engineering, Federal University of Technology–Paraná, Curitiba 80230-901, Brazil
4 Department of Computer Science, Northeastern University London, London E1W 1LP, UK
5 Department of Civil, Chemical and Environmental Engineering, Pole of Chemical Engineering, University of Genoa, Via Opera Pia 15, 16145 Genoa, Italy
6 Department of Computer Information Systems, Saint Augustine’s University, Raleigh, NC 27610, USA
* Author to whom correspondence should be addressed.
Computation 2025, 13(5), 102; https://doi.org/10.3390/computation13050102
Submission received: 27 February 2025 / Revised: 31 March 2025 / Accepted: 10 April 2025 / Published: 25 April 2025

Abstract: The Fish School Search (FSS) algorithm is a metaheuristic known for its distinctive exploration and exploitation operators and its cumulative success representation. Despite its success across various problem domains, the FSS suffers from its high number of parameters, which makes its performance susceptible to improper parameterization. Additionally, the interplay between its operators requires sequential execution in a specific order, with two fitness evaluations per individual per iteration. This operator intricacy and evaluation count make the algorithm expensive when the fitness function is costly and inhibit parallelization. To address these challenges, this paper proposes a Simplified Fish School Search (SFSS) algorithm that preserves the core features of the original FSS while redesigning the fish movement operators and introducing a new turbulence mechanism to enhance population diversity and robustness against stagnation. The SFSS also reduces the number of fitness evaluations per iteration and minimizes the algorithm’s parameter set. Computational experiments were conducted using a benchmark suite from the CEC 2017 competition to compare the SFSS with the traditional FSS and five other well-known metaheuristics. The SFSS outperformed the FSS in 84% of the problems and achieved the best results among all algorithms in 10 of the 26 problems.

1. Introduction

In computational intelligence, swarm and evolutionary metaheuristics have garnered significant attention for their ability to solve complex optimization problems. Inspired by nature’s capacity to evolve efficient and effective solutions to many challenges, techniques in this domain apply these principles to the design of metaheuristics. Among these metaheuristics are Genetic Algorithms (GAs) [1], Particle Swarm Optimization (PSO) [2], the Artificial Bee Colony (ABC) [3], and the Fish School Search (FSS) [4]. Besides drawing inspiration from unique biological and evolutionary principles, each of these methods possesses distinct characteristics related to its behaviour, operators, and capabilities.
For example, Genetic Algorithms are a class of optimization algorithms inspired by natural selection and belong to the broader category of evolutionary algorithms (EAs) [1]. GAs leverage selection, crossover, and mutation operators to evolve a population of potential solutions over successive generations [5]. GAs typically encode solutions as strings or chromosomes, often using binary representation. This encoding scheme allows for the easy manipulation and combination of solutions, making it an excellent choice for combinatorial optimization problems [6].
Unlike GAs, Particle Swarm Optimization belongs to the swarm intelligence (SI) field. PSO is based on the social behaviour of birds flocking and fish schooling [2]. It comprises a population of candidate solutions (particles) that move through the search space to find the optimal solution. Unlike other metaheuristics, PSO uses velocity and position vectors to guide the search process. The velocity vector determines the direction and speed of a particle’s movement, while the position vector represents the particle’s current solution [7]. The updates are governed by equations that incorporate both the particle’s best-known position and the best position discovered by the swarm.
Another example of an SI metaheuristic is the Artificial Bee Colony, which simulates the foraging behaviour of honey bees [3]. In the ABC, there are three types of bees (employed, onlooker, and scout bees), each representing a phase of the algorithm. These phases allow for a balanced exploration and exploitation of the search space. The ABC differs from GAs and most swarm-based algorithms because the candidate solutions are not encoded as part of the agents (bees). Instead, the ABC uses the analogy of food sources to represent the candidate solutions, and the bees exploit these food sources to find better solutions to the optimization problem [3].
A third algorithm from the SI family is the Fish School Search (FSS), inspired by fish schools’ collective and individual behaviour [8]. In the FSS, individual fish represent potential solutions, and local and global behaviours influence their movements. Individual, collective instinctive, and collective volitive components govern the fish movement [4]. The individual movement allows each fish to explore the search space based on its own experience, whereas the collective movements direct the fish school towards promising regions of the search space, influenced by the school’s overall behaviour [4]. While the ABC and PSO use current or previous best positional information to estimate success, the FSS employs a cumulative success representation for its candidate solutions [4]. The accumulated success is represented as the fish’s weight [9]. Over the iterations, fish gain weight when they improve their solutions, and the population tends to move towards the regions occupied by the heaviest fish. The FSS is known for its unique strategy of balancing exploration and exploitation through the fish movements and the feeding operator [8].
Despite their success and widespread application, these metaheuristics exhibit drawbacks. For example, GAs can present limitations linked to the definition of a proper solution encoding strategy [10] and premature convergence [11]; PSO has issues maintaining swarm diversity and avoiding premature convergence [12,13,14]; and the ABC is weak in exploration [15,16,17].
It is also worth noting the significant advancements made in the field of metaheuristics, which have produced not only new methods and applications, such as the Salp Swarm Algorithm [18,19], but also novel ways to model and study fitness landscapes as hierarchical structures of solutions based on dominance relationships, where a solution is considered dominant if it outperforms others in multiple objectives or constraints [20].
Regarding the FSS, it is more complex to implement and has greater algorithmic complexity than GAs and PSO, and performance issues due to improper parametrization can occur. Moreover, the interplay between the movements requires them to be executed sequentially, with two fitness evaluations per individual per iteration: one after the individual movement, used to identify the individuals that guide the collective movements, and another after the collective movements, used to update the population’s fitness. Reducing the number of fitness evaluations benefits problems with computationally expensive fitness calculations and allows for the parallel execution of the movements.
This paper proposes a novel simplified version of the Fish School Search algorithm. Our approach aims to retain the core advantages of FSS—such as the balance between exploration and exploitation, adaptability, and robustness—while reducing the number of parameters and fitness evaluations per iteration. The main challenges involve identifying and preserving the essential characteristics contributing to the algorithm’s success while eliminating redundancies and minimizing computational overhead. The simplification process aims to achieve the following:
  • Analyse the original FSS to identify critical elements that drive its performance and refine or eliminate non-essential components.
  • Reduce the number of fitness evaluations per iteration by redesigning the interplay between the fish movement operators.
  • Decrease the number of parameters to make it less susceptible to performance issues due to improper parametrization.
The remainder of this article is organized as follows: Section 2 describes the original FSS and other proposed versions of it, Section 3 presents the new version of the FSS algorithm, and Section 4 shows the computational results achieved using databases from the CEC 2017 competition and a discussion about them. Finally, Section 5 presents the Conclusions.

2. Fish School Search (FSS)

The Fish School Search has four operators: individual movement, the feeding operator, collective instinctive movement, and collective volitive movement [21]. The collective movements involve the entire school.
The algorithm uses the following variables: N is the total number of fish (the school size), and $z_i^t$ is the current position of fish i at iteration t. The operators are described below [4]:
  • Individual movement ($n_i^{t+1}$): This is a random search in which each fish randomly chooses a new position in its neighbourhood. It generates diversity and triggers the other operators. It is executed according to Equation (1):

    $$n_i^{t+1} = z_i^t + \mathrm{step}_{ind} \cdot \mathrm{rand}[-1, 1] \quad (1)$$

    where $n_i^{t+1}$ is a new (temporary) position, $\mathrm{step}_{ind}$ is an individual step set by the user that decays over the iterations (the decay is often linear, although other FSS variants propose alternatives such as exponential decay [22]), and $\mathrm{rand}[-1, 1]$ is a random value generated by a uniform probability density function in the interval [−1, 1]. Observe that $\mathrm{rand}[-1, 1]$ must be drawn for each dimension $d = 1, \ldots, D$ separately, while $\mathrm{step}_{ind}$ is constant within the current iteration. In this movement, the fish moves to the new position only if there is more food there than at its current position.
  • Feeding operator ($w_i^t$): This updates the fish weight and occurs after the individual movement. First, the new position $n_i^{t+1}$ is evaluated according to its fitness $f[n_i^{t+1}]$ and compared to the fitness of the current position $z_i^t$, according to Equation (2):

    $$\Delta f_i^{t+1} = \left| f[n_i^{t+1}] - f[z_i^t] \right| \quad (2)$$

    The value $\Delta f_i^{t+1}$ is used to update the fish weight, as shown in Equation (3):

    $$w_i^{t+1} = w_i^t + \frac{\Delta f_i^{t+1}}{\max[\Delta f^{t+1}]} \quad (3)$$

    Equation (3) shows that the weight of a fish increases according to the success achieved by the individual movement. The fish moves to the new position $n_i^{t+1}$ only if the movement improves its fitness, i.e., if the new position is better than the current one (greedy search).
  • Instinctive collective movement ($m^t$): This movement is influenced by the fish that successfully improved their fitness through the individual movement. All fish perform this movement, calculated via Equation (4):

    $$m^{t+1} = \frac{\sum_{i=1}^{N} \Delta z_i^{t+1} \, \Delta f_i^{t+1}}{\sum_{i=1}^{N} \Delta f_i^{t+1}} \quad (4)$$

    where $\Delta z_i^{t+1}$ is the displacement of fish i caused by the individual movement and $\Delta f_i^{t+1}$ is calculated by Equation (2).

    The entire school then has its position updated by Equation (5):

    $$z_i^{t+1} = z_i^t + m^{t+1} \quad (5)$$
  • Volitive collective movement ($B^t$): This second collective movement is performed according to the overall success rate of the fish school, measured by the sum of the fish weights. If the total school weight has increased ($w^{t+1} > w^t$), the current search was successful, so the school contracts to improve its exploitation behaviour. However, if the school weight has decreased ($w^{t+1} < w^t$), it expands to increase the exploration of the search space. This movement is executed with respect to the school barycenter, calculated via Equation (6):

    $$B^{t+1} = \frac{\sum_{i=1}^{N} z_i^{t+1} \, w_i^{t+1}}{\sum_{i=1}^{N} w_i^{t+1}} \quad (6)$$

    If the weight of the school grows ($w^{t+1} > w^t$), the fish positions are updated according to Equation (7):

    $$z_i^{t+1} = z_i^{t+1} - \mathrm{step}_{vol} \cdot \mathrm{rand}[0, 1] \cdot \left( z_i^{t+1} - B^{t+1} \right) \quad (7)$$

    If not ($w^{t+1} < w^t$), new positions are produced by Equation (8):

    $$z_i^{t+1} = z_i^{t+1} + \mathrm{step}_{vol} \cdot \mathrm{rand}[0, 1] \cdot \left( z_i^{t+1} - B^{t+1} \right) \quad (8)$$

    where $\mathrm{step}_{vol} = 2 \cdot \mathrm{step}_{ind}$ (previously defined in the individual movement) and $\mathrm{rand}[0, 1]$ is a random value generated by a uniform probability density function in the interval [0, 1]. Observe that $\mathrm{rand}[0, 1]$ must be drawn separately for each dimension $d = 1, \ldots, D$, while $\mathrm{step}_{vol}$ is constant within the current iteration.
Algorithm 1 presents the pseudocode of the original FSS.
Over the past decade, many improvements have been made to the FSS algorithm. The Density-based Fish School Search (dFSS) was developed to solve multimodal hyper-dimensional problems, adding new operators such as memory and partition [4]. The Weight-based Fish School Search (wFSS) modifies the barycenter by adding the Link Formation Rule, which causes the formation of niches [4]. Other versions have been proposed to tackle premature convergence and stagnation [4]. More recently, the FSS family was expanded to cover multi-objective problems in continuous and binary spaces [4]. Lastly, a simplified version of the FSS was also proposed for problems in the binary domain [4], which demonstrated that it was possible to reduce the complexity of the FSS while improving its performance.
Algorithm 1: FSS pseudocode.
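Because Algorithm 1 is reproduced as a figure in the published version, the following is a minimal Python sketch of one FSS iteration for a minimization problem, following Equations (1)–(8). The function and variable names (e.g., fitness, bounds) and the clipping to the search bounds are illustrative assumptions, not details prescribed by the original paper.

```python
import numpy as np

def fss_iteration(pos, weights, fit, fitness, w_total_prev, step_ind, step_vol, bounds):
    """One FSS iteration (minimization). pos: (N, D) positions, weights: (N,),
    fit: (N,) current fitness values, fitness: objective function,
    w_total_prev: total school weight from the previous iteration."""
    n, d = pos.shape
    lo, hi = bounds

    # Individual movement (Eq. 1): greedy random step, one rand per dimension.
    cand = np.clip(pos + step_ind * np.random.uniform(-1, 1, (n, d)), lo, hi)
    f_cand = np.array([fitness(x) for x in cand])     # first evaluation round
    moved = f_cand < fit
    delta_f = np.where(moved, fit - f_cand, 0.0)      # success of each move
    delta_x = np.where(moved[:, None], cand - pos, 0.0)
    pos = np.where(moved[:, None], cand, pos)

    # Feeding operator (Eqs. 2-3): weight grows with the normalized success.
    if delta_f.max() > 0:
        weights = weights + delta_f / delta_f.max()

    # Instinctive collective movement (Eqs. 4-5): drift set by the improvers.
    if delta_f.sum() > 0:
        pos = pos + (delta_x * delta_f[:, None]).sum(axis=0) / delta_f.sum()

    # Volitive collective movement (Eqs. 6-8): contract towards the barycenter
    # if the school gained weight, expand away from it otherwise.
    bary = (pos * weights[:, None]).sum(axis=0) / weights.sum()
    sign = -1.0 if weights.sum() > w_total_prev else 1.0
    pos = np.clip(pos + sign * step_vol * np.random.uniform(0, 1, (n, d)) * (pos - bary), lo, hi)

    fit = np.array([fitness(x) for x in pos])         # second evaluation round
    return pos, weights, fit, weights.sum()
```

Note the two evaluation rounds per iteration, which are precisely what the SFSS proposed in the next section removes.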

3. The Proposed Fish School Search

The Simplified Fish School Search (SFSS) algorithm follows the structure and inspiration of the FSS (movements and operators). The main goal is to reduce the number of fitness evaluations while maintaining the generation of diversity and the automatic balance between exploitation and exploration. Another objective is to minimize the number of parameters the user needs to initialize and define, such as initial and final step sizes, initial weights, and weight limits. The only parameter preserved is the number of individuals in the swarm.
In the original FSS, the swarm is evaluated twice per iteration: once after the individual movement and again after the volitive movement. To reduce this to a single evaluation per iteration, instead of updating an individual's position after each movement, the movements generate displacements based on the fish's current position. After all displacements have been calculated, the fish position is updated by combining the three displacement values. Besides reducing the number of fitness evaluations, this strategy also allows the three displacements to be calculated in parallel, which reduces the execution time.
  • Individual Displacement ($\mathrm{Ind}_i^t$): For each fish in the school, a random value is drawn from a uniform distribution in the interval [0, 1]. If the selection probability of fish i, given by Equation (10), is greater than the value drawn, the displacement is calculated using Equation (9); otherwise, the fish does not perform an individual displacement.

    $$\mathrm{Ind}_{i,d}^{t+1} = \mathrm{rand}[-1, 1] \cdot \left( x_{i,d}^{t-1} - x_{j,d}^{t-1} \right) \quad (9)$$

    where $\mathrm{Ind}_{i,d}^{t+1}$ is the displacement of fish i in dimension d, j is a random fish selected from the swarm, $\mathrm{rand}[-1, 1]$ is a random value generated by a uniform probability density function in the interval [−1, 1], and d is a dimension selected at random from the problem dimensions.

    Equation (10) calculates the fish selection probability:

    $$P_i^{t+1} = \frac{w_i^{t+1}}{\max[W^{t+1}]} \quad (10)$$

    where $w_i$ is the weight of fish i and $\max[W^{t+1}]$ returns the weight of the heaviest fish in the school (i.e., the current best solution).
  • Instinctive Displacement ($\mathrm{Ins}_i^t$): For each fish in the school that has improved, the displacement is calculated using Equation (11):

    $$\mathrm{Ins}_{i,d}^{t+1} = \mathrm{select}([-1, 1]) \cdot \frac{x_{i,d}^{t-1} - x_{i,d}^{t}}{\sum_{i=1}^{N} w_i^t} \quad (11)$$

    where $\mathrm{Ins}_{i,d}^{t+1}$ represents the instinctive displacement of fish i in dimension d, $\mathrm{select}([-1, 1])$ is a function which selects and returns 1 or −1, and $w_i^t$ is the weight of fish i at time t.
  • Volitive Collective Displacement ($\mathrm{Vol}_i^t$): For each fish in the school, the displacement is generated by Equation (12):

    $$\mathrm{Vol}_i^{t+1} = \mathrm{sign} \cdot \left( x_i^t - x_j^t \right) \quad (12)$$

    where $\mathrm{Vol}_i^{t+1}$ is the volitive displacement of fish i; j is a fish selected from the swarm using a binary tournament; and $\mathrm{sign}$ is a function which returns a random value generated by a uniform probability density function in the interval [−1, 0] if the weight of fish j is greater than the weight of fish i, or in the interval [0, 1] otherwise. This means that fish i will move towards fish j if fish j is heavier.
The new position of the fish is generated by combining the three displacements with its current position, as shown in Equation (13):

$$x_i^{t+1} = x_i^t + \mathrm{Ind}_i^{t+1} + \mathrm{Ins}_i^{t+1} + \mathrm{Vol}_i^{t+1} \quad (13)$$
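To make the interplay of Equations (9)–(13) concrete, the following is a minimal Python sketch of the three SFSS displacements over one iteration. It is a hedged illustration rather than the authors' reference implementation: details such as the tournament tie-breaking, the names prev_pos and improved, and drawing one random dimension per operator are assumptions based on the description above. Because each displacement depends only on the state at the start of the iteration, the per-fish work could also be executed in parallel, as noted earlier.

```python
import numpy as np

def sfss_displacements(pos, prev_pos, weights, improved):
    """Compute the SFSS displacements (Eqs. 9-12) and combine them (Eq. 13).
    pos/prev_pos: (N, D) positions at t and t-1; weights: (N,);
    improved: boolean mask of the fish that improved at iteration t."""
    n, dim = pos.shape
    ind = np.zeros((n, dim))
    ins = np.zeros((n, dim))
    vol = np.zeros((n, dim))
    w_sum = weights.sum()

    for i in range(n):
        # Individual displacement (Eqs. 9-10): triggered with probability
        # w_i / max(W), applied to a single random dimension using a random fish j.
        if np.random.rand() < weights[i] / weights.max():
            j = np.random.randint(n)
            d = np.random.randint(dim)
            ind[i, d] = np.random.uniform(-1, 1) * (prev_pos[i, d] - prev_pos[j, d])

        # Instinctive displacement (Eq. 11): only fish that improved.
        if improved[i]:
            d = np.random.randint(dim)
            ins[i, d] = np.random.choice([-1, 1]) * (prev_pos[i, d] - pos[i, d]) / w_sum

        # Volitive displacement (Eq. 12): a binary tournament selects j;
        # fish i drifts towards j when j is heavier, away from it otherwise.
        a, b = np.random.randint(n), np.random.randint(n)
        j = a if weights[a] >= weights[b] else b
        s = np.random.uniform(-1, 0) if weights[j] > weights[i] else np.random.uniform(0, 1)
        vol[i] = s * (pos[i] - pos[j])

    # Combine the displacements with the current position (Eq. 13).
    return pos + ind + ins + vol
```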
Another modification concerns the feeding operator ($w_i^t$). The fish's weight reflects how good the solution found is, and it determines the degree of influence the fish has on the swarm. In the original FSS, a fish that moves to a better region gains weight, while a fish that cannot improve keeps the same weight. Even though such a fish is penalized by not performing the instinctive movement, and its influence decreases as other fish become heavier, this process can be slow. In the SFSS, weight loss is introduced to further penalize fish that do not improve. First, $\Delta f_i^{t+1}$ is calculated using Equation (2); if $\Delta f_i^{t+1} > 0$, the fish weight is updated with the FSS weight gain (Equation (3)). Otherwise, Equation (14) is used:

$$w_i^{t+1} = w_i^t \cdot e^{\Delta f_i^{t+1} / \max[\Delta f^{t+1}]} \quad (14)$$

where $w_i$ is the weight of fish i, e is the exponential function, and $\max[\Delta f^{t+1}]$ returns the maximum variation in fitness in the school.
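A short sketch of this asymmetric weight update is given below, under the convention that delta_f is positive for fish that improved and non-positive otherwise; this sign convention is an assumption, since Equation (2) is written with absolute values.

```python
import numpy as np

def sfss_feed(weights, delta_f):
    """SFSS feeding operator: FSS gain (Eq. 3) where delta_f > 0,
    exponential weight loss (Eq. 14) everywhere else."""
    max_df = np.abs(delta_f).max()
    if max_df == 0:
        return weights                              # no fitness change at all
    gain = weights + delta_f / max_df               # Eq. (3): normalized success
    loss = weights * np.exp(delta_f / max_df)       # Eq. (14): delta_f <= 0 shrinks w
    return np.where(delta_f > 0, gain, loss)
```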
We observed the algorithm's performance on different problems in preliminary experiments by analysing the population weight over the iterations. We noticed that, in some cases, the population weight could fall below one after several iterations without improvement, with the school losing diversity and stagnating. To address this issue, the SFSS features a turbulence mechanism that promotes population diversity and increases the probability of improvements in the population. This mechanism is triggered only in stagnation situations (e.g., swarm weight below one) and applied to a limited number of fish. In preliminary experiments, we found that applying the perturbation to the 10% worst individuals in the school was enough to produce satisfactory results. The perturbation is not applied in consecutive iterations, to prevent the adverse effects of introducing too much diversity. We chose a Gaussian perturbation as the turbulence operator because it is simple to implement, has a low computational cost, and produced the expected results.
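A hedged sketch of this turbulence mechanism follows; the perturbation scale sigma, the use of the total school weight as the stagnation signal, and the flag preventing consecutive applications are illustrative assumptions based on the description above.

```python
import numpy as np

def apply_turbulence(pos, fit, school_weight, applied_last_iter, sigma=0.1):
    """Gaussian turbulence on the 10% worst fish, triggered only when the
    school weight signals stagnation and never in consecutive iterations."""
    if school_weight >= 1.0 or applied_last_iter:
        return pos, False
    k = max(1, int(0.1 * pos.shape[0]))
    worst = np.argsort(fit)[-k:]        # highest fitness = worst (minimization)
    pos = pos.copy()
    pos[worst] += np.random.normal(0.0, sigma, size=pos[worst].shape)
    return pos, True
```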
The SFSS is described in Algorithm 2. It is important to mention that in line 5, the turbulence is only applied to the worst ten percent of fish in the school.
Algorithm 2: SFSS pseudocode.
Figure 1 compares the execution flow of the FSS and the SFSS. The components where the algorithms perform fitness evaluations of their populations are highlighted in orange.

3.1. SFSS: Trials

During the development of the proposal, the following ideas were also considered as candidates to replace the movements and operators of the FSS. They were not used in the final version because they did not improve the algorithm's performance, and simpler alternatives achieved similar or better results.

3.1.1. Movement Trials

  • A roulette wheel is used to select a fish that will try to move, and another roulette wheel is used to choose a fish that will attract the others. The fish moves only if the new location is better than the previous one.
  • Similar to the previous one, but instead of using a roulette wheel to select a fish that will try to move, all fish try to move.
  • All fish try to move, and a fish selected at random from the school attracts the others.
  • A new position is generated in all dimensions rather than modifying only one random dimension.

3.1.2. Feeding Operator Trials

The variations described in this section aimed to find a more appropriate way to penalize or reward the fish when necessary. Weight loss indicates that a fish could not improve in the current iteration, and a decrease in the school weight may indicate that the swarm has converged or is trapped in a local minimum.
  • Exponential weight gain and loss:

    $$w_i^t = w_i^{t-1} \cdot e^{\Delta f_i^{t+1} / \max[\Delta f^{t+1}]}$$
  • Nonlinear weight gain attempt 1:

    $$w_i^t = w_i^{t-1} + \frac{\left| \Delta f_i^{t+1} - \Delta f_i^{t} \right|}{\max[\Delta f_i^{t+1} - \Delta f_i^{t}]} \cdot \left[ \left( 1 - \frac{\Delta f_i^{t+1} - \Delta f_i^{t}}{\max[\Delta f_i^{t+1} - \Delta f_i^{t}]} \right)^{-1} - 1 \right]$$
  • Nonlinear weight gain attempt 2:
    It was similar to the previous one but with an addition operation instead of multiplication between the normalized variation of the delta cost and the last term.
  • Nonlinear weight gain attempt 3:

    $$w_i^t = w_i^{t-1} + \frac{\left| \Delta f_i^{t+1} - \Delta f_i^{t} \right|}{\max[\Delta f_i^{t+1} - \Delta f_i^{t}]} \cdot \left( \frac{\left| \Delta f_i^{t+1} - \Delta f_i^{t} \right|}{\max[\Delta f_i^{t+1} - \Delta f_i^{t}]} \right)^{5}$$
  • Nonlinear weight gain attempt 4:

    $$w_i^t = w_i^{t-1} \cdot \left( \frac{\left| \Delta f_i^{t+1} - \Delta f_i^{t} \right|}{\max[\Delta f_i^{t+1} - \Delta f_i^{t}]} \right)^{5}$$
  • Nonlinear weight gain attempt 5:

    $$w_i^t = w_i^{t-1} + \frac{\left| \Delta f_i^{t+1} - \Delta f_i^{t} \right|}{\max[\Delta f_i^{t+1} - \Delta f_i^{t}]} \cdot 100^{\left( \frac{\left| \Delta f_i^{t+1} - \Delta f_i^{t} \right|}{\max[\Delta f_i^{t+1} - \Delta f_i^{t}]} - 1 \right)}$$

    All these weight update attempts produced similar results. For this reason, the criteria for selecting one of the approaches were simplicity and computational cost.

4. Case Study

We tested the algorithms using 26 optimization problems from the IEEE Congress on Evolutionary Computation (CEC) 2017 test suite [23]. Although the test suite has 28 problems, we excluded F17 and F20 because they showed unstable behaviour, possibly caused by the source code. The code was implemented and tested using Python version 3.12.7, and the code for the CEC'17 test suite can be downloaded from the GitHub page (more information available at https://github.com/tilleyd/cec2017-py/tree/master, accessed on 11 April 2025). All functions were tested in 30 dimensions, and the set includes unimodal, multimodal, shifted, rotated, and composed functions. The experiments were conducted on an Apple M4 Pro with 36 GB of RAM, a 1 TB drive, and the macOS Sequoia 15.3.2 operating system.
Since the ABC and FSS algorithms perform more than one fitness evaluation per individual per iteration, we used the number of fitness evaluations as the stopping criterion to provide a fair comparison. Furthermore, considering that the CEC functions can be challenging, after preliminary experiments we set the budget to five hundred thousand fitness evaluations; the best result recorded for a given execution is the one achieved by the 500,000th fitness evaluation, at which point the algorithm ceases execution. All algorithms were executed 30 times for each function with a population of 30 individuals. All algorithms also employed the same population initialization strategy: at the beginning of each execution, the population was randomly distributed across the search space. Although some algorithms can benefit from different initialization strategies, we used the same one to provide a fairer comparison.
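One simple way to enforce such an evaluation budget uniformly, regardless of how many evaluations an algorithm performs per iteration, is to wrap the objective in a counter; the sketch below is illustrative and not the experiment code.

```python
class BudgetedObjective:
    """Counts calls to the objective and stops a run at the evaluation budget."""
    def __init__(self, func, budget=500_000):
        self.func = func
        self.budget = budget
        self.count = 0
        self.best = float("inf")

    def __call__(self, x):
        if self.count >= self.budget:
            raise RuntimeError("evaluation budget exhausted")
        self.count += 1
        value = self.func(x)
        self.best = min(self.best, value)   # best result at the 500,000th call
        return value
```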
The PSO was implemented with a global best topology and used w = 0.72984, c1 = c2 = 2.05w, and a maximum velocity of 100. The ABC algorithm employed a trial limit of 100. The GA was configured with a mutation rate of 0.05 and a crossover constant of 0.9. The FSS had initial and final individual steps of 0.1 and 0.0001, respectively, an initial volitive step of 0.01, and a final volitive step of 0.001. Moreover, the initial weight and weight scale of the FSS were one and (number of fitness evaluations)/4.0, respectively. The SFSS and FA had no additional parameters to set apart from the population size.
Figure 2 shows examples of the convergence curves of all seven algorithms on six CEC problems, while Table 1 and Figure 3 compare their performance on all 26 problems. As seen in Table 1 and Figure 3, the SFSS overcame the FSS algorithm in 23 of the 26 CEC problems. Compared to the other algorithms, the SFSS performed best in 10 of the 26 problems. These results suggest that the SFSS reduced the number of fitness evaluations per iteration while also presenting performance gains.
Figure 3 illustrates the results of the Wilcoxon test comparing the SFSS to the other algorithms. In this figure, a blue square means that the SFSS was superior, a red square denotes that the SFSS was inferior, and a grey square means that there was no statistically significant difference between them. The statistical results in Figure 3 reinforce the superiority of the SFSS over the FSS. Furthermore, most of the cases in which the SFSS obtained better results than another algorithm were statistically significant.
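For reference, a comparison of this kind can be reproduced with a Wilcoxon signed-rank test over the 30 runs per problem; the significance level (here 0.05) and the synthetic samples are assumptions for illustration, as the paper does not state them.

```python
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(0)
sfss_runs = rng.normal(1.0e3, 50.0, size=30)     # illustrative 30-run samples
other_runs = rng.normal(1.1e3, 60.0, size=30)

stat, p = wilcoxon(sfss_runs, other_runs)        # paired, two-sided by default
if p >= 0.05:
    verdict = "grey: no statistical difference"
elif sfss_runs.mean() < other_runs.mean():       # lower fitness is better here
    verdict = "blue: SFSS superior"
else:
    verdict = "red: SFSS inferior"
print(verdict, f"(p = {p:.4f})")
```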
Although the focus of this proposal was to reduce the complexity of the FSS by decreasing the number of user-defined parameters and fitness evaluations, the modifications introduced also suggest improvements in the algorithm's performance. These improvements may stem from minimizing the risk of improper parametrization and from introducing the turbulence mechanism and the redesigned displacement operations. However, more experiments are necessary to evaluate the impact of the proposed modifications on the FSS's behaviour.
Regarding algorithm complexity, we employed a calculation similar to the one adopted by CEC'17 [23], which can be described as follows:
  • Calculate the function complexity $T_i$ by measuring the time of 10,000 evaluations of problem i.
  • Compute the algorithm complexity $TA_i$ by measuring the time the algorithm takes to perform 10,000 evaluations on problem i. To accommodate variations in performance due to the algorithms' stochastic nature, $TA_i$ is the average of 15 runs.
  • The final complexity is given by $AC = (TA_i - T_i) / T_i$.
The main difference between this definition and that of CEC'17 is that here we present the complexity per function, while CEC'17 calculates the overall complexity across all problems in the test suite.
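The per-function measure can be computed as in the sketch below, assuming run_algorithm(func) stops after exactly 10,000 evaluations; the names are illustrative.

```python
import time

def evaluation_time(func, points):
    """T_i: wall time of 10,000 plain evaluations of problem i."""
    t0 = time.perf_counter()
    for x in points:
        func(x)
    return time.perf_counter() - t0

def algorithm_complexity(run_algorithm, func, points, runs=15):
    """AC = (TA_i - T_i) / T_i, with TA_i averaged over 15 runs."""
    T = evaluation_time(func, points)
    TA = 0.0
    for _ in range(runs):
        t0 = time.perf_counter()
        run_algorithm(func)       # assumed to perform 10,000 evaluations
        TA += time.perf_counter() - t0
    return (TA / runs - T) / T
```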
The results of the algorithms' complexity across the benchmark problems are presented in Table 2. As shown in Table 2, the PSO, CSO, and GA were the algorithms with the lowest complexity, while the FA, FSS, and SFSS presented the highest values. As this definition assesses the time required to execute a given number of fitness evaluations, we expected the SFSS to exhibit higher values than the FSS: the SFSS condenses its operators' complexity into a single fitness evaluation per iteration, whereas the FSS spreads it over two evaluations per iteration. Consequently, the gains from reducing the number of fitness evaluations will be more prominent in scenarios where the objective function has a very high computational cost (i.e., higher than that of the FSS operators). Future experiments are needed to compare the FSS and SFSS on more complex problems, such as the hyperparameter tuning of machine learning models.

5. Conclusions

This paper presented the Simplified Fish School Search, a novel metaheuristic inspired by the Fish School Search (FSS) algorithm, designed to simplify the FSS’s operators and parameters and enhance its performance. The proposed approach modifies the structure of the FSS by consolidating the complexity of its operators into a single evaluation per iteration. We conducted experiments using the CEC 2017 benchmark suite to evaluate its effectiveness. The results show that it could overcome the FSS in most problems analysed (22 of 26) and compete with well-known algorithms such as the PSO, ABC, GA, CSO, and FA.
The proposed algorithm’s performance in unimodal, multimodal, and composition problems was satisfactory, showing the SFSS’s versatility. Furthermore, the computational cost from the number of fitness evaluations per individual per iteration was reduced. Reducing the number of calls to the fitness function is essential when dealing with functions with elevated costs.
Finally, we reduced the number of parameters, which led to a less user-dependent and problem-dependent algorithm with no parameter specification required besides the population size.
The main limitation of our work is the need for more in-depth experiments analysing the performance and behavioural changes introduced in the SFSS. Also, although we compared its performance against well-known algorithms, including more recent methods in the study would be desirable. We intend to address these issues in future works. To better evaluate the impact of the proposed simplification, we also plan to analyse the performance of the SFSS on more challenging and computationally expensive optimization tasks.

Author Contributions

Conceptualization, E.F., C.S., H.V.S., C.B.-F. and M.M.; methodology, E.F. and C.S.; software, C.S. and M.M.; validation, E.F., C.S. and M.M.; investigation, E.F. and C.S.; resources, C.B.-F., A.G. and A.C.; writing—original draft preparation, E.F., C.S. and M.M.; writing—review and editing, all authors; visualization, C.S.; supervision, C.B.-F. and A.G.; project administration, C.B.-F. and A.G.; funding acquisition, C.B.-F. and A.G. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

No new data were created, and code can be sent upon request.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
ABC	Artificial Bee Colony
EAs	Evolutionary Algorithms
FSS	Fish School Search
GAs	Genetic Algorithms
PSO	Particle Swarm Optimization
SFSS	Simplified Fish School Search
SI	Swarm Intelligence
wFSS	Weight-based Fish School Search

References

  1. Holland, J.H. Genetic algorithms. Sci. Am. 1992, 267, 66–73. [Google Scholar] [CrossRef]
  2. Eberhart, R.; Kennedy, J. Particle swarm optimization. In Proceedings of the IEEE International Conference on Neural Networks, Perth, Australia, 27 November–1 December 1995; Volume 4, pp. 1942–1948. [Google Scholar]
  3. Karaboga, D.; Basturk, B. A powerful and efficient algorithm for numerical function optimization: Artificial bee colony (abc) algorithm. J. Glob. Optim. 2007, 39, 459–471. [Google Scholar] [CrossRef]
  4. Bastos-Filho, C.J.A.; de Lima-Neto, F.B.; Lins, A.J.D.C.C.; de Lacerda, M.G.P.; da Motta Macedo, M.G.; de Santana Junior, C.J.; Siqueira, H.V.; da Silva, R.C.L.; Neto, H.A.; de Melo Menezes, B.A.; et al. Fish School Search: Account for the First Decade. In Handbook of AI-Based Metaheuristics; CRC Press: Boca Raton, FL, USA, 2021; pp. 21–42. [Google Scholar]
  5. Schmitt, L.M. Theory of genetic algorithms. Theor. Comput. Sci. 2001, 259, 1–61. [Google Scholar] [CrossRef]
  6. Kumar, A. Encoding schemes in genetic algorithm. Int. J. Adv. Res. Eng. 2013, 2, 1–7. [Google Scholar]
  7. Sousa-Ferreira, I.; Sousa, D. A review of velocity-type pso variants. J. Algorithms Comput. Technol. 2017, 11, 23–30. [Google Scholar] [CrossRef]
  8. Filho, C.J.B.; Neto, F.B.d.; Lins, A.J.; Nascimento, A.I.; Lima, M.P. A novel search algorithm based on fish school behavior. In Proceedings of the 2008 IEEE International Conference on Systems, Man and Cybernetics, Singapore, 12–15 October 2008; pp. 2646–2651. [Google Scholar]
  9. de Albuquerque, I.M.; Filho, J.M.; Neto, F.B.d.L.; Silva, A.M.d.O. Solving assembly line balancing problems with fish school search algorithm. In Proceedings of the 2016 IEEE Symposium Series on Computational Intelligence (SSCI), Athens, Greece, 6–9 December 2016; pp. 1–8. [Google Scholar]
  10. Ronald, S. Robust encodings in genetic algorithms: A survey of encoding issues. In Proceedings of the 1997 IEEE International Conference on Evolutionary Computation (ICEC’97), Indianapolis, IN, USA, 13–16 April 1997; pp. 43–48. [Google Scholar]
  11. Pandey, H.; Chaudhary, A.; Mehrotra, D. A comparative review of approaches to prevent premature convergence in GA. Appl. Soft Comput. 2014, 24, 1047–1077. [Google Scholar] [CrossRef]
  12. Jordehi, A.R. Enhanced leader pso (elpso): A new pso variant for solving global optimisation problems. Appl. Soft Comput. 2015, 26, 401–417. [Google Scholar] [CrossRef]
  13. Abdel-Kader, R.F. An improved pso algorithm with genetic and neighborhood-based diversity operators for the job shop scheduling problem. Appl. Artif. Intell. 2018, 32, 433–462. [Google Scholar] [CrossRef]
  14. Tang, Q.; Zeng, J.; Li, H.; Li, C.; Liu, Y. A particle swarm optimization algorithm based on genetic selection strategy. In Proceedings of the Advances in Neural Networks–ISNN 2009: 6th International Symposium on Neural Networks, ISNN 2009, Wuhan, China, 26–29 May 2009; Proceedings, Part III 6. Springer: Berlin/Heidelberg, Germany, 2009; pp. 126–135. [Google Scholar]
  15. Ye, T.; Wang, H.; Wang, W.; Zeng, T.; Zhang, L.; Huang, Z. Artificial bee colony algorithm with an adaptive search manner and dimension perturbation. Neural Comput. Appl. 2022, 34, 16239–16253. [Google Scholar] [CrossRef]
  16. Makas, H.; Yumuşak, N. Balancing exploration and exploitation by using sequential execution cooperation between artificial bee colony and migrating birds optimization algorithms. Turk. J. Electr. Eng. Comput. Sci. 2016, 24, 4935–4956. [Google Scholar] [CrossRef]
  17. Yu, W.-J.; Zhan, Z.-H.; Zhang, J. Artificial bee colony algorithm with an adaptive greedy position update strategy. Soft Comput. 2018, 22, 437–451. [Google Scholar] [CrossRef]
  18. Knypiński, Ł.; Kurzawa, M.; Wojciechowski, R.; Gwóźdź, M. Application of the Salp Swarm Algorithm to Optimal Design of Tuned Inductive Choke. Energies 2024, 17, 5129. [Google Scholar] [CrossRef]
  19. Khajehzadeh, M.; Iraji, A.; Majdi, A.; Keawsawasvong, S.; Nehdi, M.L. Adaptive Salp Swarm Algorithm for Optimization of Geotechnical Structures. Appl. Sci. 2022, 12, 6749. [Google Scholar] [CrossRef]
  20. Hao, G.-S.; Lim, M.-H.; Ong, Y.-S.; Huang, H.; Wang, G.-G. Domination landscape in evolutionary algorithms and its applications. Soft Comput. 2019, 23, 3563–3570. [Google Scholar] [CrossRef]
  21. Santana, C.J.; Bastos-Filho, C.J.; Macedo, M.; Siqueira, H. Sbfss: Simplified binary fish school search. In Proceedings of the 2019 IEEE Congress on Evolutionary Computation (CEC), Wellington, New Zealand, 10–13 June 2019; pp. 2595–2602. [Google Scholar]
  22. Demidova, L.; Gorchakov, A. Application of chaotic Fish School Search optimization algorithm with exponential step decay in neural network loss function optimization. Procedia Comput. Sci. 2021, 186, 352–359. [Google Scholar] [CrossRef]
  23. Wu, G.; Mallipeddi, R.; Suganthan, P. Problem definitions and evaluation criteria for the cec 2017 competition and special session on constrained single objective real-parameter optimization. Nanyang Technol. Univ. Singap. Tech. Rep. 2016, 1–18. [Google Scholar]
Figure 1. A comparison between the execution flow of the FSS (A) and the SFSS (B). The highlighted steps represent the points at which the algorithms need to evaluate the population's fitness.
Figure 2. Examples of convergence curves of the algorithms on (A) F1, (B) F9, (C) F14, (D) F21, (E) F26, and (F) F28, with 30 dimensions. The algorithms were interrupted after 500 thousand fitness evaluations.
Figure 3. Wilcoxon test results comparing the proposed metaheuristic to the others on 26 CEC problems (30D) after 500 K fitness evaluations. Blue indicates SFSS superiority, red denotes inferiority, and grey shows no statistical difference.
Table 1. Performance evaluation in terms of average fitness value and standard deviation for all algorithms, on 26 CEC problems with 30 dimensions. The algorithms were interrupted after 500 thousand fitness evaluations. The best result for each function (lowest mean) is marked with an asterisk (*).
Function | SFSS | ABC | CSO | FA | FSS | GA | PSO
F1 | 1.09×10^4 (6.17×10^3) | 6.30×10^3 (2.84×10^3) * | 2.87×10^10 (4.46×10^9) | 7.87×10^5 (7.86×10^4) | 6.46×10^5 (8.34×10^4) | 5.95×10^10 (1.64×10^10) | 9.71×10^3 (5.43×10^3)
F2 | 1.99×10^32 (4.65×10^32) | 1.03×10^8 (1.52×10^8) * | 5.46×10^30 (1.39×10^31) | 2.74×10^15 (3.02×10^15) | 1.47×10^14 (1.13×10^14) | 2.06×10^29 (2.90×10^29) | 3.95×10^11 (2.12×10^12)
F3 | 9.24×10^4 (7.02×10^4) | 1.89×10^5 (1.48×10^4) | 9.48×10^4 (1.65×10^4) | 9.99×10^3 (5.81×10^2) * | 1.09×10^4 (2.04×10^3) | 1.38×10^5 (1.33×10^4) | 5.02×10^4 (2.90×10^4)
F4 | 4.98×10^2 (2.82×10^1) | 4.68×10^2 (1.80×10^1) | 7.23×10^2 (4.08×10^1) | 5.11×10^2 (7.17×10^0) | 5.07×10^2 (1.92×10^1) | 6.01×10^2 (3.66×10^1) | 4.21×10^2 (2.76×10^1) *
F5 | 6.25×10^2 (4.23×10^1) * | 7.11×10^2 (1.89×10^1) | 7.39×10^2 (1.58×10^1) | 7.60×10^2 (1.45×10^1) | 1.59×10^3 (1.33×10^2) | 1.48×10^3 (5.23×10^1) | 7.79×10^2 (7.30×10^1)
F6 | 6.22×10^2 (7.43×10^0) * | 6.93×10^2 (2.50×10^0) | 6.40×10^2 (6.09×10^0) | 6.72×10^2 (3.86×10^0) | 7.20×10^2 (9.24×10^0) | 7.22×10^2 (5.51×10^0) | 7.10×10^2 (1.28×10^1)
F7 | 1.04×10^3 (1.46×10^2) | 8.60×10^2 (1.73×10^1) * | 1.02×10^3 (1.73×10^1) | 1.30×10^3 (1.49×10^1) | 6.63×10^3 (9.01×10^2) | 6.32×10^3 (2.17×10^2) | 1.25×10^3 (1.66×10^2)
F8 | 9.47×10^2 (4.98×10^1) * | 1.18×10^3 (3.15×10^1) | 1.04×10^3 (1.96×10^1) | 9.73×10^2 (1.18×10^1) | 1.63×10^3 (9.38×10^1) | 1.58×10^3 (3.52×10^1) | 1.09×10^3 (1.34×10^2)
F9 | 1.21×10^3 (4.42×10^2) * | 1.32×10^4 (5.76×10^2) | 3.04×10^3 (8.07×10^2) | 5.13×10^3 (1.47×10^2) | 1.94×10^4 (2.28×10^3) | 1.95×10^4 (1.11×10^3) | 1.80×10^4 (2.08×10^3)
F10 | 5.75×10^3 (8.61×10^2) | 4.13×10^3 (2.17×10^2) * | 8.67×10^3 (2.47×10^2) | 4.68×10^3 (2.29×10^2) | 6.73×10^3 (4.12×10^2) | 5.96×10^3 (5.63×10^2) | 5.69×10^3 (5.68×10^2)
F11 | 1.29×10^3 (2.89×10^2) | 2.30×10^3 (6.97×10^2) | 2.37×10^3 (3.42×10^2) | 1.17×10^3 (4.17×10^0) * | 1.47×10^3 (4.34×10^1) | 1.84×10^3 (1.51×10^2) | 1.36×10^3 (6.25×10^1)
F12 | 2.80×10^5 (2.33×10^5) | 2.30×10^6 (5.79×10^5) | 2.99×10^9 (6.01×10^8) | 1.17×10^7 (2.52×10^6) | 5.28×10^6 (5.62×10^5) | 7.88×10^6 (2.32×10^6) | 7.27×10^4 (9.55×10^4) *
F13 | 4.77×10^3 (3.43×10^3) * | 1.16×10^4 (4.17×10^3) | 1.41×10^9 (4.54×10^8) | 5.92×10^4 (1.16×10^4) | 3.14×10^5 (4.78×10^4) | 1.49×10^4 (2.18×10^3) | 7.17×10^3 (5.14×10^3)
F14 | 2.07×10^3 (2.36×10^2) * | 2.23×10^5 (8.56×10^4) | 2.89×10^5 (1.07×10^5) | 1.01×10^4 (1.44×10^3) | 9.67×10^3 (3.84×10^3) | 5.22×10^4 (1.09×10^4) | 1.19×10^4 (1.20×10^4)
F15 | 1.08×10^5 (1.30×10^5) | 6.12×10^3 (3.93×10^3) | 8.55×10^7 (4.76×10^7) | 1.60×10^4 (1.53×10^3) | 1.20×10^5 (1.78×10^4) | 2.31×10^3 (4.47×10^2) * | 9.72×10^3 (9.40×10^3)
F16 | 2.32×10^3 (3.64×10^2) | 2.24×10^3 (1.06×10^2) * | 3.40×10^3 (1.79×10^2) | 3.36×10^3 (2.68×10^2) | 3.24×10^3 (3.59×10^2) | 3.37×10^3 (3.88×10^2) | 2.73×10^3 (3.66×10^2)
F18 | 8.81×10^4 (3.76×10^4) * | 3.17×10^5 (9.84×10^4) | 2.52×10^6 (1.25×10^6) | 9.46×10^4 (8.78×10^3) | 1.56×10^5 (1.75×10^4) | 1.06×10^6 (2.04×10^5) | 1.43×10^5 (8.95×10^4)
F19 | 7.10×10^5 (2.23×10^6) | 2.56×10^4 (1.43×10^4) | 1.64×10^8 (5.89×10^7) | 9.96×10^5 (2.62×10^5) | 1.30×10^6 (2.41×10^5) | 4.66×10^3 (9.24×10^2) * | 9.02×10^3 (6.24×10^3)
F21 | 2.40×10^3 (3.40×10^1) * | 2.55×10^3 (1.67×10^1) | 2.53×10^3 (1.42×10^1) | 2.56×10^3 (7.90×10^1) | 3.00×10^3 (6.62×10^1) | 2.93×10^3 (3.19×10^1) | 2.56×10^3 (6.84×10^1)
F22 | 7.04×10^3 (1.39×10^3) | 5.76×10^3 (2.45×10^2) * | 7.95×10^3 (2.86×10^3) | 7.41×10^3 (1.66×10^2) | 8.19×10^3 (4.37×10^2) | 7.69×10^3 (7.58×10^2) | 7.00×10^3 (5.86×10^2)
F23 | 2.74×10^3 (4.10×10^1) * | 2.82×10^3 (1.81×10^1) | 2.93×10^3 (3.20×10^1) | 3.88×10^3 (1.06×10^2) | 5.00×10^3 (2.44×10^2) | 4.69×10^3 (1.57×10^2) | 3.24×10^3 (4.52×10^2)
F24 | 2.89×10^3 (3.13×10^1) * | 3.14×10^3 (3.17×10^1) | 3.08×10^3 (2.41×10^1) | 3.46×10^3 (1.61×10^2) | 4.05×10^3 (1.29×10^2) | 4.03×10^3 (4.69×10^1) | 3.19×10^3 (1.53×10^2)
F25 | 2.90×10^3 (1.30×10^1) | 2.88×10^3 (5.62×10^2) * | 3.03×10^3 (1.80×10^1) | 2.90×10^3 (2.02×10^0) | 2.88×10^3 (1.72×10^0) * | 2.91×10^3 (1.47×10^1) | 2.89×10^3 (1.25×10^1)
F26 | 5.76×10^3 (1.03×10^3) | 5.40×10^3 (1.32×10^3) | 3.79×10^3 (9.98×10^1) * | 5.49×10^3 (1.06×10^3) | 1.24×10^4 (1.66×10^3) | 1.29×10^4 (3.37×10^3) | 6.06×10^3 (2.20×10^3)
F27 | 3.24×10^3 (1.90×10^1) | 3.22×10^3 (3.66×10^0) * | 3.37×10^3 (3.37×10^1) | 4.94×10^3 (2.63×10^2) | 4.06×10^3 (1.56×10^2) | 3.97×10^3 (2.16×10^2) | 3.40×10^3 (1.06×10^2)
F28 | 3.21×10^3 (2.44×10^1) | 3.16×10^3 (4.08×10^1) | 3.45×10^3 (3.76×10^1) | 3.26×10^3 (2.91×10^0) | 3.32×10^3 (4.03×10^1) | 4.93×10^3 (3.77×10^2) | 3.15×10^3 (6.30×10^1) *
Table 2. Complexity analysis on the 26 benchmarks shows that the SFSS has higher complexity than the FSS, as it condenses all operator computations into one evaluation per iteration, while the FSS splits them into two.
Function | SFSS | ABC | CSO | FA | FSS | GA | PSO
F1 | 8.166 | 1.937 | 0.348 | 5.067 | 4.615 | 1.396 | 0.957
F2 | 5.416 | 1.828 | 0.315 | 4.874 | 4.367 | 1.336 | 0.937
F3 | 3.517 | 0.933 | 0.112 | 2.783 | 2.448 | 0.577 | 0.331
F4 | 5.283 | 1.091 | 0.037 | 3.239 | 2.913 | 0.704 | 0.463
F5 | 6.422 | 1.820 | 0.308 | 5.059 | 4.398 | 1.311 | 0.915
F6 | 4.109 | 1.090 | 0.155 | 3.253 | 2.782 | 0.741 | 0.456
F7 | 3.279 | 0.902 | 0.154 | 2.392 | 1.992 | 0.600 | 0.411
F8 | 4.869 | 1.544 | 0.441 | 3.937 | 3.637 | 1.070 | 0.774
F9 | 3.128 | 0.864 | 0.144 | 2.745 | 2.389 | 0.588 | 0.385
F10 | 2.649 | 0.780 | 0.154 | 2.399 | 1.962 | 0.508 | 0.358
F11 | 1.533 | 0.567 | 0.181 | 1.531 | 1.266 | 0.391 | 0.292
F12 | 1.880 | 0.561 | 0.197 | 1.315 | 1.204 | 0.370 | 0.303
F13 | 1.676 | 0.544 | 0.191 | 1.321 | 1.110 | 0.358 | 0.278
F14 | 1.305 | 0.489 | 0.184 | 1.252 | 1.032 | 0.333 | 0.244
F15 | 1.584 | 0.524 | 0.172 | 1.327 | 1.117 | 0.350 | 0.267
F16 | 1.228 | 0.413 | 0.153 | 1.046 | 0.897 | 0.262 | 0.198
F18 | 1.253 | 0.452 | 0.186 | 1.093 | 0.933 | 0.298 | 0.226
F19 | 0.968 | 0.343 | 0.162 | 0.845 | 0.718 | 0.227 | 0.174
F21 | 0.942 | 0.306 | 0.131 | 0.799 | 0.734 | 0.163 | 0.114
F22 | 0.850 | 0.341 | 0.196 | 0.741 | 0.748 | 0.204 | 0.161
F23 | 0.686 | 0.217 | 0.135 | 0.557 | 0.484 | 0.111 | 0.124
F24 | 0.782 | 0.287 | 0.162 | 0.650 | 0.576 | 0.175 | 0.139
F25 | 0.755 | 0.204 | 0.171 | 0.606 | 0.443 | 0.111 | 0.077
F26 | 0.672 | 0.210 | 0.159 | 0.595 | 0.448 | 0.119 | 0.090
F27 | 0.543 | 0.123 | 0.120 | 0.404 | 0.331 | 0.053 | 0.048
F28 | 0.661 | 0.189 | 0.179 | 0.484 | 0.396 | 0.120 | 0.098