Article

Statistical Performances Evaluation of APSO and Improved APSO for Short Term Hydrothermal Scheduling Problem

by Muhammad Salman Fakhar 1,†, Syed Abdul Rahman Kashif 1,*,†, Noor Ul Ain 1,†, Hafiz Zaheer Hussain 2,†, Akhtar Rasool 3,† and Intisar Ali Sajjad 4,†

1 Department of Electrical Engineering, University of Engineering and Technology, Lahore 54890, Pakistan
2 Power & Mechanical Division, National Engineering Services Pakistan (NESPAK) Pvt. Limited, Lahore 54000, Pakistan
3 Sharif College of Engineering and Technology, Lahore, Pakistan
4 Department of Electrical Engineering, University of Engineering and Technology, Taxila, Pakistan
* Author to whom correspondence should be addressed.
† These authors contributed equally to this work.
Appl. Sci. 2019, 9(12), 2440; https://doi.org/10.3390/app9122440
Submission received: 22 April 2019 / Revised: 10 June 2019 / Accepted: 11 June 2019 / Published: 14 June 2019

Featured Application

Short-term hydrothermal scheduling is a practical problem that deals with the combined operation of hydro and thermal generators. The two types of generation are dispatched so that the constraints on the thermal units and on the water reservoirs feeding the hydro units are not violated, while the load demand is served and the transmission power losses are covered. It is a type of economic dispatch problem. Research is still in progress to find algorithms that achieve such an economic dispatch so that the fuel cost of the thermal generators is minimized and the water reservoirs, which are primarily required for irrigation purposes, are not completely depleted. This research article presents an effort to find an elegant metaheuristic optimization algorithm that robustly solves two out of the several types of short-term hydrothermal scheduling problems, considering the standard test problems already discussed in the literature. The accelerated particle swarm optimization algorithm and its improved version, which serve this purpose, are thus presented in this research article.

Abstract

The Accelerated Particle Swarm Optimization (APSO) algorithm is an efficient and easily implemented variant of the well-known Particle Swarm Optimization (PSO) algorithm. PSO and its variant APSO have been applied to the well-known Short-Term Hydrothermal Scheduling (STHTS) problem in recent research and have shown promising results. The APSO algorithm can be further modified to enhance its optimizing capability by deploying dynamic search space squeezing. This paper presents the implementation of an improved APSO algorithm, based on dynamic search space squeezing, on the short-term hydrothermal scheduling problem. A statistical comparison of means is also presented to draw quantitative conclusions.

1. Introduction

The PSO algorithm has gained much popularity among metaheuristic optimization algorithms in the recent past due to its ease of implementation and its promise in finding good approximates of the global optimum solutions of complex optimization problems [1]. APSO is a simpler yet brilliant variant of PSO, and it has been proven to find good approximates of global optimum solutions in less time and fewer iterations than the PSO algorithm [2]. Dynamic search space squeezing has been applied to PSO to make yet another variant, named improved PSO, which gives better performance on some optimization problems than PSO itself [3]. Short-term hydrothermal scheduling is a non-linear and multi-modal optimization problem that has many forms, ranging from non-cascaded to cascaded. In the non-cascaded form, there is only one water reservoir, whereas in the cascaded form, there is a series of downstream reservoirs. It can be single objective or multi-objective. In the single-objective Short-Term Hydrothermal Scheduling (STHTS) problem, the objective is only to minimize the operating cost of the thermal generating units, whereas the multi-objective STHTS problem also aims to reduce the emission of COx, NOx, and SOx gases. The works in [3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26,27] discussed all these types of the STHTS problem and the implementations of conventional and metaheuristic optimization algorithms on them, and presented the superiority of one type of algorithm (usually metaheuristic) over the others.
Improved PSO, PSO, cuckoo search optimization, quantum-behaved PSO, teaching-learning-based optimization, the multi-objective differential algorithm, the gravitational search algorithm, the artificial bee colony algorithm, the genetic algorithm, and civilized swarm optimization algorithms are better choices for those types of STHTS problems, as presented in [3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19]. The works in [20,21,22,23,24,25,26,27] specifically discussed the non-cascaded, single-objective STHTS problem in which one hydel unit and one equivalent thermal unit, representing a number of thermal units, are dispatched to supply the power demand. PSO and APSO have shown the best results for such problems. In particular, according to [24,25], the APSO algorithm has outperformed the other previously implemented deterministic and metaheuristic optimization algorithms on non-cascaded, single-objective STHTS problems.
Due to their stochastic nature, a true statistical comparison between the implementations of various algorithms on one type of optimization problem is required, which was not presented in previous works [3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26,27]. The work in [28] also presented this thesis while applying the PSO algorithm using digital pheromones. For the STHTS problem, it is therefore also required to present a quantitative analysis by applying statistical hypothesis tests.
This paper extends the authors' already published work in [24,25] and presents the implementation of the improved APSO algorithm on the non-cascaded STHTS problem (two cases), along with a statistical comparison with the authors' previous APSO implementations on the same problems in [24,25]. Independent samples t-tests will be performed on the datasets of the two implementations to check the superiority of one of the two APSO variants on the same STHTS problems, as discussed in [24,25].

2. Accelerated Particle Swarm Optimization and Its Improved Version

Particle swarm optimization is now a very famous metaheuristic optimization algorithm that has shown promising results when implemented on many types of complex, non-linear, and multi-modal optimization problems. The canonical version of PSO is given in Equation (1) as:
$$\nu_i^{n+1} = \theta\, \nu_i^{n} + \alpha\, \epsilon_1 \left[ g^{*} - x_i^{n} \right] + \beta\, \epsilon_2 \left[ x_i^{*} - x_i^{n} \right], \qquad x_i^{n+1} = x_i^{n} + \nu_i^{n+1} \tag{1}$$
In Equation (1), $\nu_i^{n+1}$ is the velocity update of particle $i$ at iteration $n+1$. This velocity is added to the position $x_i^{n}$ of particle $i$ at iteration $n$ to get the updated position $x_i^{n+1}$. $x_i^{*}$ is the local best position of particle $i$ over its iteration history, and $g^{*}$ is the global best among all the particles at iteration $n$. $\epsilon_1$ and $\epsilon_2$ are random numbers in the range [0, 1].
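As an illustration, the two-line update of Equation (1) can be sketched in NumPy. The parameter values θ = 0.7 and α = β = 1.5 below are common illustrative choices, not values prescribed by this paper:

```python
import numpy as np

def pso_update(x, v, p_best, g_best, theta=0.7, alpha=1.5, beta=1.5, rng=None):
    """One canonical PSO step (Equation (1)): velocity update followed by
    position update. x, v, p_best are (particles, dims) arrays; g_best is
    a (dims,) array holding the global best position."""
    rng = np.random.default_rng() if rng is None else rng
    eps1 = rng.random(x.shape)  # epsilon_1, uniform in [0, 1)
    eps2 = rng.random(x.shape)  # epsilon_2, uniform in [0, 1)
    v_new = theta * v + alpha * eps1 * (g_best - x) + beta * eps2 * (p_best - x)
    x_new = x + v_new
    return x_new, v_new
```

Note that both random factors are drawn per particle and per dimension, which is the usual convention for PSO variants.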
There are more than two dozen variants of this canonical version of the PSO algorithm, as discussed in [29,30]. Accelerated particle swarm optimization, presented in [1,2], is a variant that is very elegant, i.e., easy to understand and the easiest to program, and it has proven itself able to give good approximates of global optimum solutions. Its single-step particle update equation is given as Equation (2):
$$x_i^{n+1} = (1 - \beta)\, x_i^{n} + \beta\, g^{*} + \alpha\, (\epsilon - 0.5) \tag{2}$$
As can be seen, this single-step update equation dispenses with the velocity update and does not use the local best position when updating particle $i$. The typical values of $\alpha$ and $\beta$ are 0.2 and 0.5, respectively. $\epsilon$ is a random variable whose value lies between zero and one, as given in [1,2]. This variant has been proven to find the global optimum of highly complex multi-modal functions such as the Michalewicz 2D function, as presented in [1,2]. There can be many possible variants of Equation (2), as discussed in [1,2]. The variant discussed in this paper is based on the dynamic search space squeezing method, in which, at every iteration, the search space of the particles, in all dimensions, is readjusted under the influence of the global best particle $g^{*}$. In this way, the tendency of each particle to oscillate across the search space is decreased, and there is less chance of getting stuck in local optima. By reducing the search space, the range of the particles in each dimension is reduced, i.e., the constraints are readjusted, and the new constraints are bounded by the original constraints of the optimization problem, using the dynamic search space squeezing Equations (3)–(7), as presented in [25]. The work in [25] applied the dynamic search space squeezing technique to the canonical form of PSO, i.e., to the velocity update equation of Equation (1). These equations squeeze the search space dynamically from the given constraints to newer constraints from one iteration to the next. This paper applies this concept of dynamic search space squeezing to the APSO algorithm, i.e., Equation (2), to further enhance its performance. Therefore, rather than squeezing the search space of the velocity terms, the search space of the particles themselves is squeezed directly. The dynamic search space squeezing phenomenon is shown in Figure 1.
$$\Delta_{low,i}^{n} = \frac{g_i^{*n} - x_{i,min}^{n}}{x_{i,max}^{n} - x_{i,min}^{n}} \tag{3}$$
$$\Delta_{high,i}^{n} = \frac{x_{i,max}^{n} - g_i^{*n}}{x_{i,max}^{n} - x_{i,min}^{n}} \tag{4}$$
$$\Delta_{low,i}^{n} + \Delta_{high,i}^{n} = 1 \tag{5}$$
$$x_{i,min}^{n+1} = x_{i,min}^{n} + \left( g_i^{*n} - x_{i,min}^{n} \right) \Delta_{low,i}^{n} \tag{6}$$
$$x_{i,max}^{n+1} = x_{i,max}^{n} - \left( x_{i,max}^{n} - g_i^{*n} \right) \Delta_{high,i}^{n} \tag{7}$$
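A minimal NumPy sketch of the two improved-APSO building blocks, assuming the bound updates of Equations (6) and (7) move both limits toward the global best (i.e., the lower bound increases and the upper bound decreases):

```python
import numpy as np

def apso_step(x, g_best, alpha=0.2, beta=0.5, rng=None):
    """Single-step APSO update, Equation (2): every particle drifts toward
    the global best g_best with a small random perturbation."""
    rng = np.random.default_rng() if rng is None else rng
    eps = rng.random(x.shape)
    return (1 - beta) * x + beta * g_best + alpha * (eps - 0.5)

def squeeze_bounds(x_min, x_max, g_best):
    """Dynamic search space squeezing, Equations (3)-(7): contract the
    per-dimension bounds toward the current global best."""
    span = x_max - x_min
    d_low = (g_best - x_min) / span   # Equation (3)
    d_high = (x_max - g_best) / span  # Equation (4); d_low + d_high == 1
    new_min = x_min + (g_best - x_min) * d_low   # Equation (6)
    new_max = x_max - (x_max - g_best) * d_high  # Equation (7)
    return new_min, new_max
```

A particle lying inside [x_min, x_max] keeps the global best inside the squeezed interval, so repeated squeezing narrows the search around g*.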

3. Short-Term Hydrothermal Scheduling Problem

Short-term hydrothermal scheduling is a type of economic dispatch problem in which hydel power generating units are operated in parallel with thermal power generating units on a generation bus. This problem can take several forms, varying from single objective to multi-objective, cascaded to non-cascaded, and pumped storage to non-pumped storage. The generation model of the STHTS problem is presented in Figure 2, as taken from [24]. The complexity of these problems can be further increased by considering the valve point loading effects of the thermal generating units. A detailed analysis of all these problem types was presented in [3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25]. This paper considers two of these problems as case studies, as already considered in [19,20,21,22,23,24,25], to implement the improved APSO algorithm and compare it with the previous implementations of the APSO algorithm in [24,25] through formal quantitative hypothesis testing. The two cases belong to the following two types of STHTS problem.
  • Non-cascaded pumped storage short-term hydrothermal scheduling [24].
  • Non-cascaded non-pumped storage short-term hydrothermal scheduling while considering transmission losses [25].
It seems more logical to check the performance of one type of algorithm by implementing it on as many types of problems as possible. However, according to the “no free lunch theorem”, as given in [29,30,31,32,33,34], if one algorithm performs best on a Type A problem, it does not necessarily mean that it performs the best for a Type B problem, i.e., an algorithm cannot be proven to be the best optimization algorithm for all types of problems. Therefore, it is required to find a good performing algorithm for every type of optimization problem. Problems 1 and 2 belong to the same class of single objective and non-cascaded STHTS problems and are different in terms of structure from the remaining types of STHTS problems. Any algorithm, providing the best optimum to these problems, will perform well on all similar types of STHTS problems.
APSO algorithms are among the simplest and most elegant metaheuristic algorithms; without many tuning parameters and with just a single-step particle update equation, they are able to solve non-linear and multi-modal objective functions such as the Michalewicz 2D function, as given in [1,2]. Therefore, the elegance and robustness of these APSO algorithms make them a first choice among metaheuristic algorithms for implementation on optimization problems. The other types of STHTS problems, for example the cascaded single-objective STHTS problem, require metaheuristic algorithms with more tuning parameters than the APSO algorithm, such as PSO and improved PSO. Another type is the multi-objective STHTS problem, which requires variants of metaheuristic algorithms specially designed for multi-objective problems, such as multi-objective PSO, ant colony and genetic algorithms, cuckoo search, the gravitational search algorithm, teaching-learning-based algorithms, and water cycle algorithms [3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19].
The problems considered in this paper were already solved by the authors using the APSO algorithm in [23,24], where it outperformed the other algorithms implemented on the same problems in [22,26,27]. In this extended research, the improved APSO algorithm, which uses dynamic search space squeezing, has been implemented on the same problems, and this modified APSO has shown even better results than the APSO algorithm, as presented in the Results Section. Figure 2 presents the basic generation model of the STHTS problem, which was also considered in [23,24].

4. Hypothesis Testing

Due to their stochastic nature, metaheuristic algorithms are bound to give different results in different trials on the same optimization problem. It is therefore necessary to test the performance of the implementations by performing a statistical analysis on a dataset of the obtained results. To check whether Algorithm A is superior to Algorithm B in terms of performance, there are hypothesis tests that establish superiority quantitatively. The works in [3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26,27] did not take this point into consideration, and though the algorithms presented in those references performed best, the quantitative proof was not established. This proof can be established by performing hypothesis testing to compare the means of the performances. For this purpose, a null hypothesis is established, stating that there is no difference in the performance of the two algorithms. By performing independent samples t-tests, the null hypothesis can either be rejected or not rejected [28,35]. If the null hypothesis is rejected, there is a significant difference between the performances of the two algorithms; if it is not rejected, then, statistically, there is no difference between them [28,35].
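Such a comparison can also be run outside SPSS. The sketch below uses SciPy on synthetic cost samples; the means, spreads, and sample sizes are illustrative stand-ins, not the paper's data:

```python
import numpy as np
from scipy import stats

# Hypothetical cost samples from repeated trials of two algorithms.
# These numbers are synthetic, for demonstration only.
rng = np.random.default_rng(42)
costs_apso = rng.normal(loc=269_700.0, scale=30.0, size=100)
costs_improved = rng.normal(loc=269_650.0, scale=10.0, size=100)

# Levene's test for equality of variances decides which t-test form to use.
_, p_levene = stats.levene(costs_apso, costs_improved)
equal_var = p_levene >= 0.05

# Independent samples t-test on the trial means (two-tailed).
t_stat, p_value = stats.ttest_ind(costs_apso, costs_improved, equal_var=equal_var)

# Reject the null hypothesis at the 95% confidence level if p < 0.05.
reject_null = p_value < 0.05
```

With a clear difference in means relative to the spread, as above, the two-tailed p-value falls below 0.05 and the null hypothesis is rejected.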

5. Methodology

According to [19,20,21,22,23,24,25,26,27], the non-cascaded hydrothermal scheduling problem can be described by Equations (8)–(14).
$$\min(f) = \sum_{j=1}^{N} n_j F_j \quad \text{(cost or objective function)} \tag{8}$$
subject to:
$$P_{hyd} + P_T = P_{Demand} + P_{Losses} \quad \text{(power balance equality constraint)} \tag{9}$$
$$\sum_{j=1}^{N} n_j \, Dis_j = Dis_{total} \quad \text{(discharge rate equality constraint)} \tag{10}$$
$$V_{min} < V_j < V_{max} \quad \text{(water reservoir volume limits)} \tag{11}$$
$$Dis_{min} < Dis_j < Dis_{max} \quad \text{(discharge rate limits)} \tag{12}$$
$$P_{T,min} < P_{T,j} < P_{T,max} \quad \text{(thermal power limits)} \tag{13}$$
$$P_{hyd,min} < P_{hyd,j} < P_{hyd,max} \quad \text{(hydro power limits)} \tag{14}$$
The reservoir’s volume and the discharges are balanced by the continuity Equation (15) given as:
$$V_j = V_{j-1} + n_j \left( R_j - Dis_j - S_j \right) \tag{15}$$
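Equation (15) is straightforward to encode. The helper below is a hypothetical illustration, with all rates expressed in acre-ft/h and the interval length in hours:

```python
def next_volume(v_prev, n_j, inflow, discharge, spill=0.0):
    """Continuity equation, Equation (15): reservoir volume after
    scheduling interval j of n_j hours, given the inflow rate R_j,
    discharge rate Dis_j, and spillage rate S_j (acre-ft/h)."""
    return v_prev + n_j * (inflow - discharge - spill)
```

For example, a reservoir at 8000 acre-ft with a 2000 acre-ft/h inflow and a 1500 acre-ft/h discharge over a 4-hour interval gains a net 2000 acre-ft.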
To implement the APSO or improved APSO algorithms on the non-cascaded STHTS problem, the following steps can be taken:
1. Randomly initialize the volume vectors (particles) within the given volume constraints. In this paper, uniform random number generators have been used.
2. Calculate the water discharge rate, hydro power, and thermal power using the values from Step 1.
3. Check the limits in the constraints. If the limits are not met, restart from Step 1. If the limits are met, proceed to Step 4.
4. Find the total cost for each particle using the thermal power values found in Step 2.
5. Take the minimum cost value and its corresponding volume vector. That volume vector becomes the global best particle.
6. Update all the particles using the APSO/improved APSO update equation, i.e., Equation (2).
7. For the improved APSO algorithm, dynamically squeeze the volume constraints by applying the search space squeezing Equations (3)–(7).
8. Iterate Steps 2–7 until the stopping criterion is reached.
9. Get the results.
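The steps above can be sketched as a single loop. The STHTS evaluation of Steps 2–4 (discharge, hydro/thermal power, constraint checks) is abstracted into a placeholder `cost_fn`, so this is a structural sketch rather than the authors' implementation:

```python
import numpy as np

def improved_apso(cost_fn, x_min, x_max, n_particles=200, n_iter=100,
                  alpha=0.2, beta=0.5, seed=0):
    """Structural sketch of the listed steps. cost_fn maps a (dims,)
    volume vector to a scalar cost; x_min/x_max are the initial
    per-dimension volume bounds."""
    rng = np.random.default_rng(seed)
    lo, hi = x_min.astype(float).copy(), x_max.astype(float).copy()
    # Step 1: random initialization within the volume constraints.
    x = rng.uniform(lo, hi, size=(n_particles, lo.size))
    g_best, g_cost = None, np.inf
    for _ in range(n_iter):
        # Steps 2-4: evaluate the cost of every particle.
        costs = np.array([cost_fn(p) for p in x])
        # Step 5: track the global best particle.
        i = costs.argmin()
        if costs[i] < g_cost:
            g_cost, g_best = costs[i], x[i].copy()
        # Step 6: APSO single-step update toward the global best.
        eps = rng.random(x.shape)
        x = (1 - beta) * x + beta * g_best + alpha * (eps - 0.5)
        # Step 7: dynamic search space squeezing of the bounds.
        span = hi - lo
        lo = lo + (g_best - lo) * ((g_best - lo) / span)
        hi = hi - (hi - g_best) * ((hi - g_best) / span)
        # Step 3 revisited: keep particles inside the squeezed bounds.
        x = np.clip(x, lo, hi)
    return g_best, g_cost
```

On a simple convex objective such as the sphere function, this loop converges close to the known minimum within a few dozen iterations.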

6. Results and Discussions

This paper presents the essence and importance of true statistical quantitative tests for checking the superiority of one type of algorithm over another. Metaheuristic optimization algorithms have both stochastic and deterministic components. Due to the stochastic component, there is built-in randomness, and therefore no metaheuristic optimization algorithm gives identical results across repeated runs on the same problem. To check whether an algorithm performs consistently on the same problem, its performance is judged by the standard deviation across trials. To compare the performance of two different algorithms on the same problem, the data from an equal number of trials are tested for a significant difference in means; this type of testing is known as the independent samples t-test. The t-tests were performed with the SPSS software, i.e., the Statistical Package for the Social Sciences. To test the performance of APSO and improved APSO, two test cases have been taken, as already discussed in [24,25].

6.1. Case 1: Non-Cascaded Pumped Storage Hydrothermal Scheduling [24,27]

This is a two-unit hydrothermal scheduling problem, as given in [24]. The total number of scheduling periods was six, of which three periods were pumping, i.e., discharged water is pumped back to the reservoir, and three periods were generating. The thermal unit characteristics are given in Equations (16) and (17):
$$F_T = 3877.5 + 3.9795\, P_T + 0.00204\, P_T^2 \quad \$/\text{h} \tag{16}$$
$$200\ \text{MW} \le P_T \le 2500\ \text{MW} \tag{17}$$
The generating and pumping models are given by Equations (18) and (19), respectively.
$$Dis(P_{hyd}) = 200 + 2\, P_{hyd}\ (\text{acre-ft/h}) \quad \text{for } 0 \le P_{hyd} \le 300\ \text{MW} \quad \text{(generating characteristics)} \tag{18}$$
$$Dis(P_{H,pump}) = -600\ (\text{acre-ft/h}) \quad \text{with } P_{H,pump} = 300\ \text{MW} \quad \text{(pumping characteristics)} \tag{19}$$
The starting and ending volume of the reservoir is 8000 (acre-ft). The load demand at every interval is given in Table 1.
The problem was implemented using both the APSO algorithm and the improved APSO algorithm. The work in [24] already discussed the implementation of APSO on this problem; this paper compares it with the improved APSO algorithm using hypothesis testing. The improved APSO algorithm has shown even more promising results than the APSO algorithm. Table 2 presents the comparison between the two implementations.
Both algorithms were tested on the Case 1 problem over sets of one hundred and two hundred trials each, so as to obtain a normally distributed dataset. Their means were compared using an independent samples t-test. The results obtained from the SPSS run are given in Table 3.
The null hypothesis was that there is no difference in the performance of the two algorithms on the pumped storage STHTS problem. This hypothesis was tested at a 95% confidence level, giving a two-tailed significance value equal to 0.000 for both the one-hundred and two-hundred trial sets, i.e., less than 0.05 or 5% [35]. This value indicates that the null hypothesis is rejected, and the statistical test quantitatively shows that the improved APSO algorithm performed significantly differently from the simple APSO algorithm. Furthermore, Levene's test for the equality of variances gives a significance value equal to 0.000, which is less than 0.05, so the variances are also significantly different. Table 2 further elaborates the difference in performance, establishing the superiority of improved APSO over the simple APSO algorithm. Table 2 also shows that, although the minimum values achieved by the two algorithms were the same, the improved APSO algorithm converged to an acceptable result (this paper considers a cost value of less than 269,642.4001 acceptable) 97% of the time, compared to the simple APSO algorithm, which converged to acceptable results 4% of the time. The number of iterations and the computation time were very low for both algorithms; however, due to the dynamic search space squeezing, improved APSO took a slightly longer average time to converge to the solution than the simple APSO algorithm, though the difference in computation time was not significant. Although the APSO algorithm has given promising results, the improved APSO algorithm has statistically proven itself better for the pumped storage non-cascaded STHTS problem. Table 4 presents the best results of the improved APSO implementation on the non-cascaded pumped storage STHTS problem.
The convergence characteristics of the simple APSO and improved APSO are given in Figure 3 and Figure 4, respectively. Figure 5 presents the convergence characteristics of the improved APSO algorithm on the non-cascaded pumped STHTS problem for five different instances. The algorithm has shown good performance in reaching the good approximates of the global optimum solution for all the instances. These figures are taken as the successful trials for both algorithms. The number of particles was taken to be equal to 200 for both implementations. The convergence characteristics of both algorithms were quite similar and showed the general nature of PSO variants. The algorithms, as their names suggest, are accelerated in nature, i.e., they are fast at finding good approximates of global optima.

6.2. Case 2: Non-Cascaded Non-Pumped STHTS Considering Transmission Losses [25]

This is a two-unit, non-pumped hydrothermal scheduling problem, as given in [25]. The total number of scheduling periods is six, and the cost characteristics of the thermal unit are given in Equations (20) and (21), where the fuel price is 1.15 $/MBTU.
$$F_T = 500 + 8\, P_T + 0.0016\, P_T^2 \ (\text{MBTU/h}) \tag{20}$$
$$150\ \text{MW} \le P_T \le 1500\ \text{MW} \tag{21}$$
The water discharging constraints are given in Equations (22)–(25):
$$Dis(P_{hyd}) = 330 + 4.97\, P_{hyd}\ (\text{acre-ft/h}) \quad \text{for } 0\ \text{MW} \le P_{hyd} \le 1000\ \text{MW} \tag{22, 23}$$
$$Dis(P_{hyd}) = 5300 + 12\,(P_{hyd} - 1000) + 0.05\,(P_{hyd} - 1000)^2\ (\text{acre-ft/h}) \quad \text{for } 1000\ \text{MW} < P_{hyd} \le 1100\ \text{MW} \tag{24, 25}$$
The transmission losses are given by Equation (26):
$$P_{loss} = 0.0008\, P_{hyd}^2 \ (\text{MW}) \tag{26}$$
The discharge rate characteristics are given in Equation (27):
$$5300\ (\text{acre-ft/h}) \le Dis \le 7000\ (\text{acre-ft/h}) \tag{27}$$
The load demand in every interval is given in Table 5.
The reservoir’s water volume flow constraints as given in [25,27] are:
  • 100,000 acre-feet constitute the volume of water in the reservoir prior to the first scheduling period.
  • 60,000 acre-feet constitute the volume of water in the reservoir in the last period.
  • Volume limits in the first iteration: 60,000 (acre-ft) ≤ V ≤ 120,000 (acre-ft).
  • Continuous incoming flow into the reservoir is 2000 acre-feet/hour throughout the scheduling period.
  • The continuity equation as given in Equation (15) is to be met.
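The Case 2 models of Equations (20)–(26) can likewise be transcribed; the function names are illustrative. A useful consistency check is that the piecewise discharge of Equations (22)–(25) meets the limits of Equation (27) exactly at its segment endpoints:

```python
def fuel_cost(p_t, fuel_price=1.15):
    """Case 2 thermal cost in $/h: the heat rate of Equation (20)
    (MBTU/h) times the 1.15 $/MBTU fuel price; valid for 150-1500 MW."""
    assert 150 <= p_t <= 1500, "P_T outside Equation (21) limits"
    heat = 500 + 8 * p_t + 0.0016 * p_t ** 2
    return fuel_price * heat

def discharge(p_hyd):
    """Piecewise hydro discharge, Equations (22)-(25), in acre-ft/h."""
    if 0 <= p_hyd <= 1000:
        return 330 + 4.97 * p_hyd
    if p_hyd <= 1100:
        d = p_hyd - 1000
        return 5300 + 12 * d + 0.05 * d ** 2
    raise ValueError("hydro power outside the 0-1100 MW range")

def line_loss(p_hyd):
    """Transmission losses, Equation (26), in MW."""
    return 0.0008 * p_hyd ** 2
```

Evaluating `discharge(1000)` gives 5300 acre-ft/h and `discharge(1100)` gives 7000 acre-ft/h, matching the discharge limits of Equation (27).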
The problem was implemented using both the APSO algorithm and the improved APSO algorithm. The work in [25] already discussed the implementation of APSO for this problem; this paper compares it with the improved APSO algorithm using hypothesis testing. The improved APSO algorithm has shown even more promising results than the APSO algorithm. Table 6 presents the comparison between the two implementations.
Both algorithms were tested on the Case 2 problem (Section 6.2) over sets of one hundred and two hundred trials each, so as to obtain a normally distributed dataset. Their means were compared using an independent samples t-test. The results obtained from the SPSS run are given in Table 7.
For this problem (Section 6.2, Case 2), the two-tailed significance value of the t-test showed that the two algorithms were not significantly different in terms of performance, because the value was greater than 5%. This means that, for more than 25% of the time, the results were somewhat repeated. Therefore, at the 95% confidence level, the null hypothesis could not be rejected, and the means were not statistically different. However, the improved APSO was able to find a better approximation of the global optimum, as given in Table 6. The number of iterations and the computation time were very low for both algorithms; however, due to the dynamic search space squeezing, improved APSO took a slightly longer average time to converge to the solution than the simple APSO algorithm, though the difference in computation time was not significant.
Therefore, with some further tuning of their parameters, both algorithms have the capability to find robust approximations of the global optimum solution with low standard deviations for the non-cascaded non-pumped STHTS problem with transmission line losses, with improved APSO better able to approach the global minimum. Table 8 gives the best cost result of improved APSO's performance.
The convergence characteristics of the simple APSO and improved APSO are given in Figure 6 and Figure 7, respectively, on the non-cascaded non-pumped STHTS problem considering transmission losses. These figures are taken for the successful trials for both the algorithms. Figure 8 presents the convergence characteristics of the improved APSO algorithm on the non-cascaded non-pumped STHTS problem for five different instances. The algorithm has shown good performance in reaching good approximates of the global optimum solution for four instances, whereas the algorithm could not reach a good solution for one instance. The number of particles was taken to be equal to 200 for both implementations. The convergence characteristics of both the algorithms were quite similar and showed the general nature of PSO variants. The algorithms, as their names suggest, are accelerated in nature, i.e., they are fast at finding good approximates of global optima.

7. Additional STHTS Problems (as Test Cases)

In this research paper, the two problems of interest were presented as Section 6.1 Case 1 and Section 6.2 Case 2. These two problems were already solved by other algorithms, as can be seen in [24,25]. For the interested reader, two new STHTS problems were added, which are quite similar to the Section 6.1 Case 1 and Section 6.2 Case 2 problems discussed in the previous section. These problems have not been solved in the literature and are very close to practical STHTS problems, as are Section 6.1 Case 1 and Section 6.2 Case 2. The simple APSO and improved APSO algorithms have been implemented on these two problems, taking α and β equal to 0.5 , and the statistical results are presented.

7.1. Case 1: Non-Cascaded Pumped-Storage STHTS Problem Considering Valve Point Loading

In steam turbine-based thermal generators, valves control the steam entering the turbine through separate nozzle groups. Each nozzle group achieves best efficiency when operated at full output. Therefore, when increasing the output, valves are opened in sequence in order to achieve the highest possible efficiency for a given output. This causes a rippled cost curve, which is usually a sinusoidal wave riding on the quadratic cost function [19]. A valve point loading-based STHTS problem was given to test the effectiveness of the proposed improved APSO algorithm in comparison with the already tested simple APSO algorithm in the literature. The system taken was that of Case 1 with a valve point loading-based cost function. The cost function is now as given in (28):
$$F_T = 3877.5 + 3.9795\, P_T + 0.00204\, P_T^2 + 500 \sin\left( 0.085\,(200 - P_T) \right) \quad \$/\text{h} \tag{28}$$
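A direct transcription of this valve-point cost curve, assuming the sine argument reads 0.085(200 − P_T):

```python
import math

def valve_point_cost(p_t):
    """Equation (28): the Case 1 quadratic cost with a sinusoidal
    valve-point ripple riding on top; $/h, valid for 200-2500 MW."""
    quadratic = 3877.5 + 3.9795 * p_t + 0.00204 * p_t ** 2
    ripple = 500 * math.sin(0.085 * (200 - p_t))
    return quadratic + ripple
```

At the lower limit P_T = 200 MW the ripple vanishes, so the cost reduces to the quadratic value of 4755.0 $/h; away from that point the ripple makes the curve non-smooth and multi-modal, which is precisely what stresses gradient-free optimizers.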
The rest of the problem data are the same as for Case 1 of Section 6.1 (Equations (17)–(19)). The statistical results of both the simple APSO and improved APSO algorithms are shown in Table 9. Table 10 shows the SPSS results of the independent samples t-test performed to compare the simple APSO and improved APSO algorithms on the non-cascaded pumped-storage STHTS problem considering valve point loading. Table 11 shows the results of the power flow and cost optimization obtained with the improved APSO implementation on this problem. Convergence results for the simple APSO and improved APSO are shown in Figure 9 and Figure 10, respectively.
The results show that there is no significant difference between the performances of these two algorithms, statistically; however, the improved APSO algorithm has reached acceptable solutions more quickly than the simple APSO algorithm. If the tuning parameters of both algorithms are adjusted properly, both algorithms can help in finding a good approximate of the global optimum solution.

7.2. Case 2: Non-Cascaded and Non-Pumped Storage STHTS Problem While Not Considering Transmission Losses

This problem is similar to the Section 6.2 Case 2 problem of the last section. However, in this new system, the transmission line losses are considered as zero. The thermal and hydro units have the following characteristics.
Thermal system: Equations (29) and (30):
$$F_T = 700 + 4.8\, P_T + 0.0005\, P_T^2 \ (\$/\text{h}) \tag{29}$$
$$200\ \text{MW} \le P_T \le 1200\ \text{MW} \tag{30}$$
Hydro system: Equations (31)–(33):
$$Dis(P_{hyd}) = 260 + 10\, P_{hyd}\ (\text{acre-ft/h}) \quad \text{for } 0\ \text{MW} \le P_{hyd} \le 350\ \text{MW} \tag{31, 32}$$
$$Dis(P_{H,pump}) = 0\ (\text{acre-ft/h}) \quad \text{with } P_{hyd} = 0\ \text{MW} \tag{33}$$
The starting and ending volume of the reservoir were 16,000 (acre-ft) and 12,000 (acre-ft) with a maximum allowed volume of 18,000 (acre-ft) and a minimum allowed volume of 12,000 (acre-ft). A constant water inflow of 2000 (acre-ft/h) was to be maintained in the reservoir for each scheduling interval. It was considered that there were no electrical transmission power losses. The load demand during each interval of four hours each is given in Table 12. Table 13 shows the SPSS results of the independent sample t-test showing the comparison between simple APSO and improved APSO on the non-pumped storage non-cascaded STHTS problem without transmission losses. Table 14 presents the power flow and cost optimization with the improved APSO algorithm implementation on the selected problem. Table 15 shows the comparison of some performance parameters between APSO and improved APSO on the non-cascaded and non-pumped storage STHTS problem while not considering transmission losses. Convergence results for the simple APSO and improved APSO for this case are shown in Figure 11 and Figure 12, respectively.
The results of the t-test show that, for this problem type, both the simple APSO and improved APSO algorithms performed equally well in statistical terms. However, the improved APSO algorithm found a better approximation to the global minimum solution than the simple APSO algorithm, and it reached the acceptable solution more often. If the tuning parameters of both algorithms are adjusted properly, both can find a good approximation of the global optimum solution.
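Such a comparison can be reproduced outside SPSS as well. The following is a minimal sketch (with illustrative sample data, not the paper's trial results) of the independent-samples t-statistic without assuming equal variances, i.e., the "equal variances not assumed" rows of Table 13:

```python
from statistics import mean, variance

def welch_t(sample_a, sample_b):
    """Independent-samples t-statistic without assuming equal variances,
    with the Welch-Satterthwaite approximation of the degrees of freedom."""
    na, nb = len(sample_a), len(sample_b)
    va, vb = variance(sample_a), variance(sample_b)   # sample variances
    se2 = va / na + vb / nb                           # squared std. error of the mean difference
    t = (mean(sample_a) - mean(sample_b)) / se2 ** 0.5
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df
```

Feeding the 100 final costs per algorithm into `welch_t` yields the t and df values of the "equal variances not assumed" row; a two-tailed significance above 0.05 retains the null hypothesis of equal mean cost.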

8. Optimization of Some General Deterministic Non-Linear Test Functions Using the Proposed Improved APSO Algorithm

To check the performance of the proposed improved APSO algorithm, some hard, general deterministic optimization problems were solved using the APSO and improved APSO algorithms. The problem functions are given in Table 16 and are plotted in Figure 13. Table 17 presents the results of implementing APSO and the proposed improved APSO algorithm on these complex optimization functions, showing the capability of the two algorithms to solve such non-linear and multi-modal objective functions.
It can be seen that both the APSO and improved APSO algorithms found good approximations to the known global minima of these highly non-linear and multi-modal test functions. The performances of both algorithms were good; however, the improved APSO produced results closer to the known minima of these functions.
The average value of the set of minima obtained for the egg-crate function was high, as was the standard deviation. The reason is that the egg-crate function has many local minima quite close to the global minimum, as can be seen in Figure 13. Because the particle updating process depends only on the g* value, and because there are few tuning parameters, both the simple APSO and improved APSO algorithms sometimes became stuck in local optima. If chaotic maps are introduced, the stochastic nature of APSO and improved APSO can be increased to allow the particles to escape from local peaks. Even so, the improved APSO algorithm achieved a higher number of successful results for the egg-crate function. Similarly, for the Michalewicz 2D function, both improved APSO and APSO sometimes became stuck in local minima; most of the time, however, good and robust solutions were achieved. If neighborhood topologies of particles and a dominant stochastic term are added to the particle update equation, the rate of reaching the global optimum solution can be increased to more than 90% of the trials.
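As a hedged illustration of the chaotic-map idea above (an assumption for illustration, not something implemented in this work), a logistic map in its fully chaotic regime could generate the stochastic sequence that replaces uniform random numbers in the particle update:

```python
# Illustrative sketch: a logistic map with r = 4 (fully chaotic regime)
# producing a deterministic-but-chaotic sequence in (0, 1). The seed
# value x0 is an arbitrary assumption; x0 in {0, 0.25, 0.5, 0.75, 1}
# must be avoided because the map collapses to a fixed point there.
def logistic_sequence(x0=0.7, n=5, r=4.0):
    """Return n chaotic values in (0, 1) from the map x <- r * x * (1 - x)."""
    x, out = x0, []
    for _ in range(n):
        x = r * x * (1.0 - x)
        out.append(x)
    return out
```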
The results of implementing the improved APSO, together with the results of the simple APSO algorithm taken from [1], on these deterministic, non-linear, and multi-modal functions show that both algorithms are an excellent choice for obtaining robust solutions to optimization problems. Moreover, the elegance of these APSO variants, with few tuning parameters and a single-step particle update equation, makes them a preferred choice as optimization algorithms.
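To make the comparison concrete, the following is a minimal, illustrative sketch (not the authors' MATLAB implementation; all constants are assumptions) of the single-step APSO update from [1], with an optional bound contraction toward g* that sketches the dynamic search-space squeezing of the improved variant, applied here to the De Jong function of Table 16:

```python
import random

def de_jong(p):                        # convex test function of Table 16, min f(0, 0) = 0
    return p[0] ** 2 + p[1] ** 2

def apso(f, lo, hi, n_particles=20, iters=100,
         alpha=0.3, beta=0.5, gamma=0.95, squeeze=0.0, seed=1):
    """Standard APSO: x <- (1 - beta) * x + beta * g* + annealed noise [1].
    With squeeze > 0, the search bounds contract toward g* every iteration,
    sketching the dynamic squeezing of the improved variant."""
    rng = random.Random(seed)
    lo, hi = list(lo), list(hi)
    dim = len(lo)
    pts = [[rng.uniform(lo[d], hi[d]) for d in range(dim)]
           for _ in range(n_particles)]
    gbest = list(min(pts, key=f))
    for t in range(iters):
        scale = alpha * gamma ** t                       # annealed randomness
        for p in pts:
            for d in range(dim):
                p[d] = ((1 - beta) * p[d] + beta * gbest[d]
                        + scale * (hi[d] - lo[d]) * rng.gauss(0, 1))
                p[d] = min(max(p[d], lo[d]), hi[d])      # clamp to current bounds
        best = min(pts, key=f)
        if f(best) < f(gbest):
            gbest = list(best)
        if squeeze > 0:                                  # squeeze bounds toward g*
            for d in range(dim):
                lo[d] += squeeze * (gbest[d] - lo[d])
                hi[d] -= squeeze * (hi[d] - gbest[d])
    return gbest, f(gbest)
```

For example, `apso(de_jong, [-5, -5], [5, 5])` approximates the minimum at the origin, and passing `squeeze=0.1` mimics the improved variant; the actual squeezing increments and tuning used in the paper may differ.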

9. Conclusions

Establishing the superiority of one algorithm over another on a given type of optimization problem requires proper statistical hypothesis testing, such as the t-test. This paper has presented a modification of particle swarm optimization, the improved accelerated particle swarm optimization algorithm, which dynamically squeezes the search space of the particles in every iteration. Two of the many types of STHTS problems were tested using simple APSO and improved APSO. The improved APSO algorithm showed very promising results compared with the simple APSO algorithm on the non-cascaded pumped-storage STHTS case, whereas both algorithms gave statistically equivalent results on the non-cascaded non-pumped STHTS problem; however, the improved APSO found a better approximation to the global minimum for this problem. Further research can address the tuning of these PSO variants to obtain even more promising results on these and other types of STHTS problems.

Author Contributions

Conceptualization, M.S.F. and H.Z.H.; methodology, M.S.F.; software, M.S.F. and H.Z.H.; validation, M.S.F., S.A.R.K. and N.U.A.; formal analysis, M.S.F., S.A.R.K. and N.U.A.; investigation, A.R.; resources, I.A.S.; data curation, M.S.F.; writing–original draft preparation, M.S.F.; writing–review and editing, S.A.R.K.; visualization, A.R. and I.A.S.; supervision, S.A.R.K.; project administration, S.A.R.K., A.R. and I.A.S.

Funding

This research received no external funding.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following symbols and abbreviations are used in this manuscript:
APSO: Accelerated Particle Swarm Optimization
x_i^{n+1}: position of the ith particle at iteration n + 1
g*: global best particle
Δ_{low,i}^{n}: lower increment added to the ith particle's lower limit for search-space squeezing at iteration n
Δ_{high,i}^{n}: higher increment added to the ith particle's upper limit for search-space squeezing at iteration n
x_{i,min}^{n+1}: lower limit of the ith particle at iteration n + 1
x_{i,max}^{n+1}: upper limit of the ith particle at iteration n + 1
f: objective function/cost
F_j: cost of the jth generating unit
V_j: volume of the reservoir at the jth interval
Dis_j: discharge rate of the reservoir at the jth interval
P_{T,j}: thermal power at the jth interval
P_{hyd,j}: hydro power at the jth interval
R_j: constant water inflow into the reservoir at the jth interval
S_j: spillage of water from the reservoir at the jth interval
P_{loss,j}: power loss at the jth scheduling interval
α, β, ε: tuning parameters of the APSO equation; their values range between 0 and 1

References

  1. Yang, X.S. Engineering Optimization: An Introduction with Metaheuristic Applications; John Wiley & Sons: Hoboken, NJ, USA, 2010. [Google Scholar] [CrossRef]
  2. Yang, X.S.; Deb, S.; Fong, S. Accelerated particle swarm optimization and support vector machine for business optimization and applications. Commun. Comput. Inf. Sci. 2011, 136, 53–66. [Google Scholar] [CrossRef]
  3. Hota, P.; Barisal, A.; Chakrabarti, R. An improved PSO technique for short-term optimal hydrothermal scheduling. Electr. Power Syst. Res. 2009, 79, 1047–1053. [Google Scholar] [CrossRef]
  4. Mandal, K.K.; Basu, M.; Chakraborty, N. Particle swarm optimization technique based short-term hydrothermal scheduling. Appl. Soft Comput. 2008, 8, 1392–1399. [Google Scholar] [CrossRef]
  5. Yu, B.; Yuan, X.; Wang, J. Short-term hydro-thermal scheduling using particle swarm optimization method. Energy Convers. Manag. 2007, 48, 1902–1908. [Google Scholar] [CrossRef]
  6. Thang, T.N.; Dieu, N.V.; Anh, V.T. Cuckoo search algorithm for short-term hydrothermal scheduling. Appl. Energy 2014, 132, 276–287. [Google Scholar] [CrossRef]
  7. Ahmadi, A.; Masouleh, M.S.; Janghorbani, M.; Manjili, N.Y.G.; Sharaf, A.M.; Nezhad, A.E. Short term multi-objective hydrothermal scheduling. Electr. Power Syst. Res. 2015, 121, 357–367. [Google Scholar] [CrossRef]
  8. Chen, J.J.; Zheng, J.H. Discussion on short-term environmental/economic hydrothermal scheduling. Electr. Power Syst. Res. 2015, 127, 348–350. [Google Scholar] [CrossRef]
  9. Sun, C.; Lu, S. Short-term combined economic emission hydrothermal scheduling using improved quantum-behaved particle swarm optimization. Expert Syst. Appl. 2010, 37, 4232–4241. [Google Scholar] [CrossRef]
  10. Roy, P.K. Teaching learning based optimization for short-term hydrothermal scheduling problem considering valve point effect and prohibited discharge constraint. Int. J. Electr. Power Energy Syst. 2013, 53, 10–19. [Google Scholar] [CrossRef]
  11. Zhang, H.; Zhou, J.; Zhang, Y.; Fang, N.; Zhang, R. Short-term hydrothermal scheduling using multi-objective differential evolution with three chaotic sequences. Int. J. Electr. Power Energy Syst. 2013, 47, 85–99. [Google Scholar] [CrossRef]
  12. Farhat, I.A.; El-Hawary, M.E. Optimization methods applied for solving the short-term hydrothermal coordination problem. Electr. Power Syst. Res. 2009, 79, 1308–1320. [Google Scholar] [CrossRef]
  13. Tian, H.; Yuan, X.; Ji, B.; Chen, Z. Multi-objective optimization of short-term hydrothermal scheduling using non-dominated sorting gravitational search algorithm with chaotic mutation. Energy Convers. Manag. 2014, 81, 504–519. [Google Scholar] [CrossRef]
  14. Zhou, J.; Liao, X.; Ouyang, S.; Zhang, R.; Zhang, Y. Multi-objective artificial bee colony algorithm for short-term scheduling of hydrothermal system. Int. J. Electr. Power Energy Syst. 2014, 55, 542–553. [Google Scholar] [CrossRef]
  15. Mandal, K.K.; Chakraborty, N. Differential evolution technique-based short-term economic generation scheduling of hydrothermal systems. Electr. Power Syst. Res. 2008, 78, 1972–1979. [Google Scholar] [CrossRef]
  16. Mandal, K.K.; Chakraborty, N. Daily combined economic emission scheduling of hydrothermal systems with cascaded reservoirs using self organizing hierarchical particle swarm optimization technique. Expert Syst. Appl. 2012, 39, 3438–3445. [Google Scholar] [CrossRef]
  17. Immanuel, S.A. Civilized swarm optimization for multi-objective short-term hydrothermal scheduling. Int. J. Electr. Power Energy Syst. 2013, 51, 178–189. [Google Scholar] [CrossRef]
  18. Narang, N. Short-term hydrothermal generation scheduling using improved predator influenced civilized swarm optimization technique. Appl. Soft Comput. 2017, 58, 207–224. [Google Scholar] [CrossRef]
  19. Wong, K.P.; Wong, Y.W. Short-term hydrothermal scheduling part. I. Simulated annealing approach. IEE Proc. Gener. Transm. Distrib. 1994, 141, 497–501. [Google Scholar] [CrossRef]
  20. Padmini, S.; Rajan, C.C.A. Improved PSO for short-term hydrothermal scheduling. SEISCON 2011, 332–334. [Google Scholar]
  21. Samudi, C.; Das, G.P.; Ojha, P.C.; Sreeni, T.S.; Cherian, S. Hydro thermal scheduling using particle swarm optimization. In Proceedings of the Conference on Transmission and Distribution Exposition, Chicago, IL, USA, 21–24 April 2008. [Google Scholar]
  22. Fakhar, M.S.; Kashif, S.A.R.; Saqib, M.A.; Hassan, T.U. Non cascaded short-term hydro-thermal scheduling using fully-informed particle swarm optimization. Int. J. Electr. Power Energy Syst. 2015, 73, 983–990. [Google Scholar] [CrossRef]
  23. Khandualo, S.K.; Barisal, A.K.; Hota, P.K. Scheduling of pumped storage hydro-thermal system with evolutionary programming. J. Clean Energy Technol. 2013, 1. [Google Scholar] [CrossRef]
  24. Fakhar, M.S.; Kashif, S.A.R.; Saqib, M.A.; Mehmood, F.; Hussain, H.Z. Non-cascaded short-term pumped-storage hydro-thermal scheduling using accelerated particle swarm optimization. In Proceedings of the International Conference on Electrical Engineering, San Francisco, CA, USA, 23–25 October 2018. [Google Scholar]
  25. Hussain, H.Z.; Haider, A.; Fakhar, M.S.; Ahmad, J.; Butt, M.A.; Khokhar, K.S. Short-term scheduling of non-cascaded hydro-thermal system with transmission losses using accelerated particle swarm optimization algorithm. Pak. J. Eng. Appl. Sci. 2018, 22, 20–29. [Google Scholar]
  26. Wood, A.J. Power Generation Operation and Control; Wiley: New York, NY, USA, 1996. [Google Scholar]
  27. Wood, A.J.; Wollenberg, B.; Sheblé, G.B. Power Generation Operation and Control, 3rd ed.; Wiley: Hoboken, NJ, USA, 2013. [Google Scholar]
  28. Kalivarapu, V.; Winer, E. A statistical analysis of particle swarm optimization with and without digital pheromones. In Proceedings of the Conference on Structures, Structural Dynamics, and Materials, Honolulu, HI, USA, 23–26 April 2007. [Google Scholar]
  29. Chopard, B.; Tomassini, M. Particle Swarm Optimization; Springer in Natural Computing Series; Springer: Cham, Switzerland, 2018. [Google Scholar]
  30. Kennedy, J. Particle Swarm Optimization. In Encyclopedia of Machine Learning; Springer: New York, NY, USA, 2010. [Google Scholar]
  31. Wolpert, D.H.; Macready, W.G. No free lunch theorems for optimization. IEEE Trans. Evol. Comput. 1997, 1, 67–82. [Google Scholar] [CrossRef] [Green Version]
  32. Ho, Y.C.; Pepyne, D.L. Simple explanation of the no-free-lunch theorem and its implications. J. Opt. Theory Appl. 2002, 115, 549–570. [Google Scholar] [CrossRef]
  33. Yu-Chi, H.; Pepyne, D.L. Simple Explanation of the No Free Lunch Theorem of Optimization. In Proceedings of the 40th IEEE Conference on Decision and Control, Orlando, FL, USA, 4–7 December 2001. [Google Scholar]
  34. Kifer, D.; Machanavajjhala, A. No Free Lunch in Data Privacy. In Proceedings of the ACM SIGMOD International Conference on Management of Data, Athens, Greece, 12–16 June 2011. [Google Scholar]
  35. Field, A. Doing Statistics Using SPSS, 3rd ed.; SAGE Publications Ltd.: London, UK, 2009. [Google Scholar]
Figure 1. Dynamic search space squeezing under the influence of g* at iteration n + 1. The arrows indicate the squeezing of the search space from iteration n to n + 1.
Figure 2. Generation model of the Short-Term Hydrothermal Scheduling (STHTS) problem [25].
Figure 3. Convergence characteristics of the simple APSO algorithm on the non-cascaded pumped STHTS problem.
Figure 4. Convergence characteristics of the improved APSO algorithm on the non-cascaded pumped STHTS problem.
Figure 5. Convergence characteristics of the improved APSO algorithm on the non-cascaded pumped STHTS problem for five different instances.
Figure 6. Convergence characteristics of the simple APSO algorithm on the non-cascaded non-pumped STHTS problem taken from [25].
Figure 7. Convergence characteristics of the improved APSO algorithm on the non-cascaded non-pumped STHTS problem.
Figure 8. Convergence characteristics of the improved APSO algorithm on the non-cascaded non-pumped STHTS problem for five different instances.
Figure 9. Convergence characteristics of the simple APSO algorithm on the non-cascaded pumped-storage STHTS problem considering valve point loading.
Figure 10. Convergence characteristics of the improved APSO algorithm on the non-cascaded pumped-storage STHTS problem considering valve point loading.
Figure 11. Convergence characteristics of the simple APSO algorithm on the non-cascaded non-pumped STHTS problem without transmission loss.
Figure 12. Convergence characteristics of the improved APSO algorithm on the non-cascaded non-pumped STHTS problem without transmission loss.
Figure 13. Plots of selected functions.
Table 1. Load demand for each scheduling interval of four hours each.
| Interval | 1 | 2 | 3 | 4 | 5 | 6 |
|---|---|---|---|---|---|---|
| Demand (MW) | 1600 | 1800 | 1600 | 500 | 500 | 500 |
Table 2. Comparison of some performance parameters between APSO and improved APSO on the pumped storage STHTS problem.
| Performance Parameter | APSO | Improved APSO |
|---|---|---|
| Minimum cost ($) | 269,642.4001 | 269,642.4000 |
| Average cost ($) | 269,642.5318 | 269,642.4752 |
| Maximum cost ($) | 269,643.2336 | 269,642.402 |
| No. of acceptable convergences | 4 out of 100 trials | 97 out of 100 trials |
| Standard deviation | 6.2557539 | 0.0088976 |
| Average no. of iterations | 7 | 8 |
| Average computation time * | 0.462 s | 0.506 s |
* Average computation time is according to MATLAB 2015 on a Core i5 second-generation processor.
Table 3. SPSS results of independent sample t-test showing the comparison between simple APSO and improved APSO on the pumped storage non-cascaded STHTS problem.
Independent-samples t-test. The F and Sig. columns are from Levene's test for the equality of variances; the remaining columns are from the t-test for the equality of means.

| Comparison_APSO | F | Sig. | t | df | Sig. (2-tailed) | Mean Difference | Std. Error Difference | 95% CI Lower | 95% CI Upper |
|---|---|---|---|---|---|---|---|---|---|
| Equal variances assumed (100 instances) | 62.424 | 0.000 | 5.333 | 198 | 0.000 | 3.3361210 | 0.6255760 | 2.1024741 | 4.5697679 |
| Equal variances assumed (200 instances) | 89.787 | 0.000 | −5.768 | 398 | 0.000 | −1.086 | 0.1883 | −1.4564 | −0.7159 |
| Equal variances not assumed (100 instances) | | | 5.333 | 99.00 | 0.000 | 3.3361210 | 0.6255760 | 2.0948425 | 4.5773995 |
| Equal variances not assumed (200 instances) | | | −5.768 | 200.638 | 0.000 | −1.08622 | 0.1883 | −1.4575 | −0.718 |
Table 4. Power flow and cost optimization with the improved APSO algorithm implementation on the non-cascaded pumped storage STHTS problem.
| Interval | Demand (MW) | PT (MW) | Phyd (MW) | Dis (ac-ft/h) | V (ac-ft) | Total Cost ($) |
|---|---|---|---|---|---|---|
| 1 | 1600 | 1449.99 | 150 | 500 | 5999.95 | 269,642.4 |
| 2 | 1800 | 1500 | 300 | 800 | 2799.819 | |
| 3 | 1600 | 1450 | 149.97 | 500 | 800 | |
| 4 | 500 | 800 | −300 | −600 | 3200 | |
| 5 | 500 | 800 | −300 | −600 | 5600 | |
| 6 | 500 | 800 | −300 | −600 | 8000 | |
Table 5. Load demand for each scheduling interval (each interval is 12 hours, making a total scheduling period of 3 days).
| Interval | 1 | 2 | 3 | 4 | 5 | 6 |
|---|---|---|---|---|---|---|
| Demand (MW) | 1200 | 1500 | 1100 | 1800 | 950 | 1300 |
Table 6. Comparison of some performance parameters between APSO and the improved APSO on non-cascaded non-pumped STHTS problem.
| Performance Parameter | APSO | Improved APSO |
|---|---|---|
| Minimum cost ($) | 727,870.00 | 727,855.8 |
| Average cost ($) | 728,422.7 | 728,539.4 |
| Maximum cost ($) | 730,103.3 | 732,386.8 |
| No. of acceptable convergences | 15 out of 100 trials | 15 out of 100 trials |
| Standard deviation | 413.35 | 539.34 |
| Average no. of iterations | 6 | 7 |
| Average computation time * | 1.035 s | 1.227 s |
* Average computation time is according to MATLAB 2015 on a Core i5 second-generation processor.
Table 7. SPSS results of the independent sample t-test showing the comparison between simple APSO and improved APSO on the non-cascaded non-pumped STHTS problem.
Independent-samples t-test. The F and Sig. columns are from Levene's test for the equality of variances; the remaining columns are from the t-test for the equality of means.

| Comparison_APSO | F | Sig. | t | df | Sig. (2-tailed) | Mean Difference | Std. Error Difference | 95% CI Lower | 95% CI Upper |
|---|---|---|---|---|---|---|---|---|---|
| Equal variances assumed (100 instances) | 4.674 | 0.032 | −1.133 | 198 | 0.259 | −76.9991850 | 67.9525842 | −211.0028687 | 57.0044987 |
| Equal variances assumed (200 instances) | 14.501 | 0.000 | 1.605 | 398 | 0.109 | 77.2827 | 48.1474 | −17.3722 | 171.93 |
| Equal variances not assumed (100 instances) | | | −1.133 | 185.468 | 0.259 | −76.9991850 | 67.9525842 | −211.0585712 | 57.0602012 |
| Equal variances not assumed (200 instances) | | | 1.605 | 367.693 | 0.109 | 77.2827 | 48.1474 | −17.3960 | 171.9615 |
Table 8. Power flow and cost optimization with the improved APSO algorithm implementation.
| Interval | PT (MW) | Phyd (MW) | Ploss (MW) | Dis (acre-ft/h) | V (acre-ft) | Total Cost ($) |
|---|---|---|---|---|---|---|
| 1 | 833.91 | 377.4 | 11.399 | 2206 | 97,527.06 | 727,855.8 |
| 2 | 950.66 | 575.86 | 26.52 | 3192 | 83,222.43 | |
| 3 | 813.76 | 293.10 | 6.87 | 1786.7 | 85,781.40 | |
| 4 | 1091.94 | 753.47 | 45.41 | 4074.7 | 60,884.22 | |
| 5 | 733.34 | 220.54 | 3.89 | 1426.1 | 67,770.92 | |
| 6 | 851.08 | 466.313 | 17.395 | 2647.57 | 60,000 | |
Table 9. Comparison of some performance parameters between APSO and improved APSO on the pumped storage STHTS problem considering valve point loading.
| Performance Parameter | APSO | Improved APSO |
|---|---|---|
| Minimum cost ($) | 265,200.2082 | 265,200.2082 |
| Average cost ($) | 265,433.4813 | 265,459.0192 |
| Maximum cost ($) | 265,781.3508 | 269,539.6085 |
| No. of acceptable convergences | 53 out of 100 trials | 59 out of 100 trials |
| Standard deviation | 174.6004 | 454.7182 |
| Average no. of iterations | 7 | 7 |
| Average computation time * | 0.5 s | 0.6 s |
* Average computation time is according to MATLAB 2015 on a Core i5 second-generation processor.
Table 10. SPSS results of the independent sample t-test showing the comparison between simple APSO and improved APSO on the non-cascaded pumped-storage STHTS problem considering valve point loading.
Independent-samples t-test. The F and Sig. columns are from Levene's test for the equality of variances; the remaining columns are from the t-test for the equality of means.

| Comparison_APSO | F | Sig. | t | df | Sig. (2-tailed) | Mean Difference | Std. Error Difference | 95% CI Lower | 95% CI Upper |
|---|---|---|---|---|---|---|---|---|---|
| Equal variances assumed (100 instances) | 1.672 | 0.198 | 0.524 | 198 | 0.601 | 25.53785 | 48.70872 | −70.51659 | 121.59230 |
| Equal variances not assumed (100 instances) | | | 0.524 | 127.571 | 0.601 | 25.53785 | 48.70872 | −70.84376 | 121.91947 |
Table 11. Power flow and cost optimization with the improved APSO algorithm implementation on the non-cascaded pumped-storage STHTS problem considering valve point loading.
| Interval | Demand (MW) | PT (MW) | Phyd (MW) | Dis (acre-ft/h) | V (acre-ft) | Total Cost ($) |
|---|---|---|---|---|---|---|
| 1 | 1600 | 1392.80 | 207.13 | 614.27 | 5542.813 | 265,200.2 |
| 2 | 1800 | 1540.4 | 259.51 | 719.03 | 2666.47 | |
| 3 | 1600 | 1466.6 | 133.34 | 466.68 | 800 | |
| 4 | 500 | 800 | −300 | −600 | 3200 | |
| 5 | 500 | 800 | −300 | −600 | 5600 | |
| 6 | 500 | 800 | −300 | −600 | 8000 | |
Table 12. Load demand for each scheduling interval of 4 hours each.
| Interval | 1 | 2 | 3 | 4 | 5 | 6 |
|---|---|---|---|---|---|---|
| Demand (MW) | 600 | 1000 | 900 | 500 | 400 | 500 |
Table 13. SPSS results of the independent sample t-test showing the comparison between simple APSO and improved APSO on the non-pumped storage non-cascaded STHTS problem without transmission losses.
Independent-samples t-test. The F and Sig. columns are from Levene's test for the equality of variances; the remaining columns are from the t-test for the equality of means.

| Comparison_APSO | F | Sig. | t | df | Sig. (2-tailed) | Mean Difference | Std. Error Difference | 95% CI Lower | 95% CI Upper |
|---|---|---|---|---|---|---|---|---|---|
| Equal variances assumed (100 instances) | 0.158 | 0.691 | −0.853 | 198 | 0.395 | −1.27093 | 1.48974 | −4.20872 | 1.66686 |
| Equal variances not assumed (100 instances) | | | −0.853 | 199.946 | 0.395 | −1.27093 | 1.48974 | −4.20872 | 1.66686 |
Table 14. Power flow and cost optimization with the improved APSO algorithm implementation on the non-cascaded non-pumped-storage STHTS problem while not considering transmission losses.
| Interval | Demand (MW) | PT (MW) | Phyd (MW) | Dis (acre-ft/h) | V (acre-ft) | Total Cost ($) |
|---|---|---|---|---|---|---|
| 1 | 600 | 475.8 | 124.11 | 1501.10 | 17,995.58 | 72,658.09 |
| 2 | 1000 | 699.4 | 299.50 | 3255.49 | 12,973.58 | |
| 3 | 900 | 702.7 | 197.96 | 2239.63 | 12,015.06 | |
| 4 | 500 | 348.6 | 151.94 | 1779.47 | 12,897.18 | |
| 5 | 400 | 282.7 | 117.46 | 1434.66 | 15,158.53 | |
| 6 | 500 | 246.4 | 252.96 | 2789.63 | 12,000.00 | |
Table 15. Comparison of some performance parameters between APSO and improved APSO on the non-cascaded and non-pumped storage STHTS problem while not considering transmission losses.
| Performance Parameter | APSO | Improved APSO |
|---|---|---|
| Minimum cost ($) | 72,662.81 | 72,658.09 |
| Average cost ($) | 72,684.09 | 72,682.82 |
| Maximum cost ($) | 72,708.60 | 72,717.87 |
| No. of acceptable convergences | 72 out of 100 trials | 80 out of 100 trials |
| Standard deviation | 10.62 | 10.44 |
| Average no. of iterations | 50 | 50 |
| Average computation time * | 0.512 s | 0.506 s |
* Average computation time is according to MATLAB 2015 on a Core i5 second-generation processor.
Table 16. Selected functions for the validation of the proposed improved APSO.
| Optimization Function | Mathematical Form and Domain | Nature and Global Optimum |
|---|---|---|
| Michalewicz 2D function | f(x, y) = −sin(x)·sin^20(x²/π) − sin(y)·sin^20(2y²/π), (x, y) ∈ [0, 4] × [0, 4] | Highly non-linear and multi-modal; global optimum approximately at (x, y, z) = (2.20319, 1.57049, −1.801) |
| Rosenbrock 2D function (banana function) | f(x, y) = (1 − x)² + 100(y − x²)², (x, y) ∈ [−5, 5] × [−5, 5] | Non-linear and unimodal; global optimum at (x, y, z) = (1, 1, 0) |
| De Jong 2D function | f(x, y) = x² + y², (x, y) ∈ [−5, 5] × [−5, 5] | Non-linear and convex; global optimum at (x, y, z) = (0, 0, 0) |
| Egg crate 2D function | f(x, y) = x² + y² + 25[sin²(x) + sin²(y)], (x, y) ∈ [−5, 5] × [−5, 5] | Highly non-linear and multi-modal; global optimum at (x, y, z) = (0, 0, 0) |
Table 17. Performance of the proposed improved APSO compared with available references.
| Optimization Function | Michalewicz 2D Function |
|---|---|
| Minimum using improved APSO | (x, y, z) = (2.2010, 1.5710, −1.8012); average = −1.6932; standard deviation = 0.31429; successful trials = 15/20 (−1.7950 and lower taken as successful values) |
| Iterations using improved APSO | 15 iterations to reach the nearest result to three decimal places; best result achieved after 70 trials |
| Minimum using original algorithm | APSO in [1] with (x, y, z) = (2.20319, 1.57049, −1.801) |
| Iterations using previous algorithms | 15 iterations to achieve the nearest result; best result achieved after 200 trials |

| Optimization Function | Rosenbrock/Banana 2D |
|---|---|
| Minimum using improved APSO | (x, y, z) = (1.0047, 1.0099, 0.000); average = 0.09987; standard deviation = 0.3389; successful trials = 15/20 (0.001 and less taken as successful values) |
| Iterations using improved APSO | 100 iterations to reach the nearest result to three decimal places |
| Minimum using original algorithm | Harmony search in [1] with (x, y, z) = (1.005, 1.0605, 0.000) |
| Iterations using previous algorithms | 2500 iterations to achieve the nearest result |

| Optimization Function | De Jong 2D |
|---|---|
| Minimum using improved APSO | (x, y, z) = (0.0015, 0.0042, 0.000); average = 0.000315; standard deviation = 0.00055; successful trials = 20/20 (0.001 and less taken as successful values) |
| Iterations using improved APSO | 10 iterations to reach the nearest result to three decimal places |
| Minimum using original algorithm | Simple APSO with (x, y, z) = (0.0203, 0.0197, 0.0008) |
| Iterations using previous algorithms | 10 iterations to achieve the nearest result |

| Optimization Function | Egg-Crate 2D |
|---|---|
| Minimum using improved APSO | (x, y, z) = (0.0030, 0.0027, 0.0004); average = 3.989945; standard deviation = 5.0615; successful trials = 12/20 (0.01 and less taken as successful values) |
| Iterations using improved APSO | 10 iterations to reach the nearest result to three decimal places |
| Minimum using original algorithm | Simulated annealing in [1] with (x, y, z) = (0, 0, 0) |
| Iterations using previous algorithms | 2500 trials of the program with an unknown number of iterations, accurate to three decimal places |
Average computation time of improved and simple APSO is less than 1 s for each objective function.
