#### **4. Numerical Results Analysis**

The proposed Improved PSO (IPSO) is compared with six other well-known optimization algorithms on ten mathematical test functions of dimension 100. The details are as follows:


For this study, we evaluate the novel method and the competing algorithms on mathematical test functions, since such problems are widely used as benchmarks in the engineering optimization literature. Ten functions, both unimodal and multimodal, are employed to examine the effectiveness of particle swarm optimization with parameter adjustment and to validate the performance of the proposed IPSO algorithm. Its results are compared with those of several PSO variants, namely GPSO, AMPSO, MPSO, MPSOED, GCMPSO, and MPSOEG, both in tables and in the plots for functions *f*1 to *f*10. Table 1 lists these test functions together with the search spaces in which they are commonly optimized.
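Table 1 gives the exact definitions used in this work; purely as an informal illustration, a few of the named benchmarks can be written in Python in their standard textbook forms (which may differ in constants or ranges from Table 1):

```python
import numpy as np

def sphere(x):
    # Unimodal; global minimum f(0) = 0.
    return np.sum(x ** 2)

def rastrigin(x):
    # Multimodal with many regularly spaced local minima; global minimum f(0) = 0.
    return np.sum(x ** 2 - 10.0 * np.cos(2.0 * np.pi * x) + 10.0)

def alpine1(x):
    # Multimodal with many local minima and one global optimum f(0) = 0.
    return np.sum(np.abs(x * np.sin(x) + 0.1 * x))

def griewank(x):
    # Multimodal; the product term couples the dimensions; global minimum f(0) = 0.
    i = np.arange(1, x.size + 1)
    return 1.0 + np.sum(x ** 2) / 4000.0 - np.prod(np.cos(x / np.sqrt(i)))
```

Each function is evaluated on a vector of length *D*, so the same code serves for the 100-dimensional experiments reported below.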



Here, *D* denotes the dimension of the search space.

To ensure a fair comparison among the methods, we used the same parameter values for all algorithms in the computational tests. The maximum number of generations was set to 2000 and the dimension to 100. Over 60 independent trial runs, Table 2 records the best objective function values, while the worst, mean, and variance values are available in Appendix A.
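The testing protocol above can be sketched in a few lines; the `optimizer` callable here is a hypothetical stand-in for any of the compared algorithms, which is assumed to return the best objective value found in one run:

```python
import numpy as np

def run_trials(optimizer, objective, dim=100, max_gen=2000, trials=60, seed=0):
    """Run independent trials and summarise them as in Table 2 / Appendix A.

    `optimizer(objective, dim, max_gen, rng)` must return the best
    objective value found in a single run.
    """
    rng = np.random.default_rng(seed)
    finals = np.array([optimizer(objective, dim, max_gen, rng)
                       for _ in range(trials)])
    return {
        "best": finals.min(),        # reported in Table 2
        "worst": finals.max(),       # reported in Appendix A
        "mean": finals.mean(),       # reported in Appendix A
        "variance": finals.var(),    # reported in Appendix A
    }
```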

**Table 2.** Statistical Analysis of the Best Objective Function Values for the 100-Dimensional Benchmark Problems.


#### **5. Discussion**

On the basis of these comparative metrics, the proposed approach (IPSO) performs better than the other well-known algorithms and strategies. The benchmark problems chosen for validation are among the most challenging ones commonly used to assess optimizer performance. The best objective function values obtained by the various techniques and by the proposed algorithm are given in Table 2, while the worst, mean, and variance results are tabulated in Appendix A.

Consider first the Rastrigin function, a complex multimodal function with a single global optimum and many local minima. The tabulated results show that the new approach surpasses GPSO, AMPSO, MPSO, MPSOED, GCMPSO, and MPSOEG on this function, placing the proposed method in the first rank.

To further verify the stability and power of the proposed PSO, we consider the Alpine 1 test function. Alpine 1 is likewise a complex multimodal function with many local minima and a single global optimum, defined on the range [−10, 10]. The tabulated values show that our algorithm attains a smaller minimum than the others; we conclude that the novel approach clearly outperforms the competing algorithms on this function.

Similarly, on the Sphere function, which is unimodal with a global optimum of zero and a search range of [−10, 10], the tabulated results show that the modified PSO (IPSO) successfully optimizes the function.

In addition, the modified method produced the best results on the HappyCat benchmark function, which is frequently used to validate algorithms because of its many local minima and complicated structure. The results on the Quartic function likewise show that the modified approach outperforms the others. Schwefel's Problem 1.2, De Jong's function, Bent Cigar, Step, Quartic, Alpine 1, and Griewank are all challenging optimization problems commonly used to validate algorithms. In short, the novel IPSO yields good results on most of these problems compared with other well-known modified algorithms.

The convergence curves for test functions *f*1, *f*5, and *f*7 are shown in Figures 1–3, respectively, while those for *f*2, *f*3, *f*4, *f*6, *f*8, *f*9, and *f*10 are available in the appendix; together they illustrate the convergence characteristics of the various algorithms. A close study of test function *f*1 shows that our approach reaches the desired solution region after 500 generations, whereas AMPSO, GPSO, MPSOED, and GCMPSO perform poorly, indicating their lower performance and robustness.

The plots for the second test function likewise reveal the weak performance of the comparable methods and the efficacy of the proposed approach: throughout the search process, the other methods fail to converge to the global region, whereas our modified approach reaches it within 2000 generations. Similarly, on the third function *f*3, the proposed method converges before 600 generations, while the other algorithms never locate the optimal solution.

The plot of the sixth test function shows that MPSO performs slightly better than AMPSO, while the proposed IPSO outperforms all the other algorithms, demonstrating its stability and maturity. The plots thus make it evident that the novel algorithm delivers the best performance.

**Figure 1.** Algorithms convergence plots on *f*1.

**Figure 2.** Algorithms convergence plots on *f*5.

**Figure 3.** Algorithms convergence plots on *f*7.

In this article, the logarithms of the objective function values are used for comparison. The graphical results show that the proposed IPSO converges to the global optimal region faster than GPSO, AMPSO, MPSO, MPSOED, GCMPSO, and MPSOEG. The reasons are that (1) the proposed adaptive mutation operator prevents diversity loss during the optimization process, and (2) the proposed dynamic factor balances exploration and exploitation in the search domain. The convergence plots for the various test functions therefore demonstrate the superiority of the suggested approach, and the convergence trajectories show that the novel technique is more efficient, stable, and robust. The numerical results likewise confirm that the final solutions of the proposed IPSO are of significantly higher quality than those of GPSO, AMPSO, MPSO, MPSOED, GCMPSO, and MPSOEG.
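The paper's exact adaptive mutation operator and dynamic factor are defined in the methodology section. Purely as an illustration of the two ingredients named above, a generic PSO step with a decaying inertia weight (standing in for the dynamic factor) and a uniform re-initialization mutation (standing in for the adaptive mutation) might look like this; the decay schedule and mutation rule here are illustrative choices, not the proposed ones:

```python
import numpy as np

def ipso_like_step(pos, vel, pbest, gbest, gen, max_gen, lb, ub, rng,
                   c1=2.0, c2=2.0, w_max=0.9, w_min=0.4, p_mut=0.05):
    """One illustrative velocity/position update for a swarm of shape (n, d)."""
    n, d = pos.shape
    # Dynamic factor: large early (exploration), small late (exploitation).
    w = w_max - (w_max - w_min) * gen / max_gen
    r1, r2 = rng.random((n, d)), rng.random((n, d))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lb, ub)
    # Mutation: re-seed a few coordinates to restore lost diversity.
    mask = rng.random((n, d)) < p_mut
    pos[mask] = rng.uniform(lb, ub, size=mask.sum())
    return pos, vel
```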

#### **6. Application**

For a further performance analysis of the proposed approach, we choose an engineering electromagnetic device, the TEAM workshop problem 22 (SMES), as another case study. The optimal design of a superconducting magnetic energy storage (SMES) device is a popular problem in computational electromagnetics and is the 22nd benchmark problem for testing electromagnetic analysis methods (TEAM 22) [53]. The SMES device stores energy in the magnetic field generated by its superconducting coils, and TEAM workshop problem 22 casts the optimization of such a device as a magnetostatics benchmark. As illustrated in Figure 4, the design goal is to keep the stored energy as close as possible to 180 MJ while minimizing the magnetic stray field observed along lines *a* and *b*. The first coil is charged to store energy, and the second is designed to reduce the first coil's large stray field. In addition, to maintain the superconductivity of the inner and outer coils, the quenching condition must not be violated, since manufacturing tolerances in the geometric variables (e.g., *R*2, *d*2, and *h*2 in Figure 4), as well as perturbations of the current controller, can lead to a faulty device.

According to the design procedure, the problem incorporates three parameters related to the construction of the SMES [54,55].

$$\begin{cases} \min f = B_{\text{stray}}^2 / B_{\text{norm}}^2 + \left| \text{Energy} - E_{\text{ref}} \right| / E_{\text{ref}} \\ \text{s.t. } J_i < \left( -6.4 \left| (B_{\max})_i \right| + 54 \right) \ (\text{A/mm}^2) \quad (i = 1, 2) \end{cases} \tag{18}$$

Although the SMES design is posed as a single-objective problem, the objective actually combines two goals: matching the magnetically stored energy of the coil pair, *W*m, to the reference *W*ref = 180 MJ, and minimizing the stray field, with *N* = 22 measurement points and *B*norm = 3 mT.

**Figure 4.** Schematic diagram of SMES device.

The objective function and the stray magnetic field are expressed as follows:

$$OF = \frac{B_{\text{stray}}^2}{B_{\text{norm}}^2} + \frac{\left| W_m - W_{m,\text{ref}} \right|}{W_{m,\text{ref}}} \tag{19}$$

$$B_{\text{stray}}^2 = \frac{\sum_{i=1}^{N} B_{\text{stray},i}^2}{N} \tag{20}$$

In the current research work, the finite element method is applied to calculate the performance parameters in the above two equations. When the magnetic field is created, the physical condition of the coils must be maintained in order to guarantee superconductivity within the solenoids.
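Given the definitions in Eqs. (19) and (20), and assuming the finite-element solver supplies the stray-field samples along lines *a* and *b* together with the stored energy, the objective evaluation reduces to a few lines (variable names here are illustrative):

```python
import numpy as np

def smes_objective(b_stray_samples, w_m, w_m_ref=180e6, b_norm=3e-3):
    """Eqs. (19)-(20): mean-square stray field over the N probe points,
    normalised by B_norm, plus the relative stored-energy error.
    Field values in tesla, energies in joules."""
    b_stray_sq = np.mean(np.asarray(b_stray_samples) ** 2)   # Eq. (20)
    return b_stray_sq / b_norm ** 2 + abs(w_m - w_m_ref) / w_m_ref  # Eq. (19)
```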

Because the current density is 22.5 A/mm², *B*max must be less than 4.92 T.

$$J_i < \left( -6.4 \left| (B_{\max})_i \right| + 54 \right) \left( \frac{\text{A}}{\text{mm}^2} \right) \tag{21}$$

where *J*i denotes the current density of the *i*th coil and (*B*max)*i* its maximum magnetic flux density.
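The quench condition of Eq. (21) is then a simple pointwise inequality; a minimal sketch, with units as stated in the text:

```python
def quench_ok(j, b_max):
    """Eq. (21): the operating point must stay below the linearised
    critical curve of the superconductor.
    j: current density in A/mm^2; b_max: peak flux density in T."""
    return j < -6.4 * abs(b_max) + 54.0
```

At the quoted current density of 22.5 A/mm², the condition holds for peak flux densities just below 4.92 T and fails above it, consistent with the bound stated in the text.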

In this electromagnetic optimization of the SMES device, the inner solenoid is fixed at *r*1 = 2 m, *d*1 = 0.27 m, and *h*1/2 = 0.8 m, whereas the geometrical dimensions of the outer solenoid are optimized within 0.6 ≤ *r*2 ≤ 3.4 m and 0.1 ≤ *d*2 ≤ 0.4 m.

The superconducting magnetic energy storage device carries currents in opposite directions in the two coils; the associated radius, height, thickness, and stray-field search space are given in Table 3. For the sake of a fair comparison, all parameters were set to the same values for IPSO, GPSO, AMPSO, MPSO, MPSOED, GCMPSO, and MPSOEG, and the average objective function values are reported in Table 3. The results demonstrate that the output of the novel IPSO is superior to those of the others.

To synthesize a magnetic field with a desired distribution, appropriately designed current-carrying coils can be used. There are several applications in biomedical engineering: a uniform magnetic field is the background of nuclear magnetic resonance spectroscopy, and a linear profile of the field is required for magnetic resonance imaging. Furthermore, in magneto-fluid hyperthermia (MFH), field uniformity aids in the uniform dispersion of heat generated in the nano-particle fluid that was previously injected into the target region, such as a tumor mass being treated. As a result, major practical applications influenced the concept behind this benchmark problem.


**Table 3.** Results Comparison of IPSO with other variants on TEAM Workshop Problem 22.
