Article

Forecasting Based on High-Order Fuzzy-Fluctuation Trends and Particle Swarm Optimization Machine Learning

1 School of Management, Jiangsu University, Zhenjiang 212013, China
2 Rensselaer Polytechnic Institute, Troy, NY 12180, USA
* Author to whom correspondence should be addressed.
Symmetry 2017, 9(7), 124; https://doi.org/10.3390/sym9070124
Submission received: 6 June 2017 / Revised: 13 July 2017 / Accepted: 17 July 2017 / Published: 21 July 2017
(This article belongs to the Special Issue Fuzzy Sets Theory and Its Applications)

Abstract

Most existing fuzzy forecasting models partition historical training time series into fuzzy time series and build fuzzy-trend logical relationship groups to generate forecasting rules. The determination process of intervals is complex and uncertain. In this paper, we present a novel fuzzy forecasting model based on high-order fuzzy-fluctuation trends and the fuzzy-fluctuation logical relationships of the training time series. Firstly, we compare each piece of data with the data of the previous day in a historical training time series to generate a new fluctuation trend time series (FTTS). Then, we fuzzify the FTTS into a fuzzy-fluctuation time series (FFTS) according to the up, equal, or down range and orientation of the fluctuations. Since the relationship between historical FFTS and the fluctuation trend of the future is nonlinear, a particle swarm optimization (PSO) algorithm is employed to estimate the proportions for the lagged variables of the fuzzy AR(n) model. Finally, we use the acquired parameters to forecast future fluctuations. In order to compare the performance of the proposed model with that of other models, we apply the proposed method to forecast the Taiwan Stock Exchange Capitalization Weighted Stock Index (TAIEX) time series datasets. The experimental and comparison results show that the proposed method can be successfully applied in stock market forecasting or similar kinds of time series. We also apply the proposed method to forecast the Shanghai Stock Exchange Composite Index (SHSECI) and the DAX30 index to verify its effectiveness and universality.

1. Introduction

It is well known that historical time series embody the behavior rules of a given phenomenon and can be used to forecast the future of the same event [1]. Many researchers have developed time series models to predict the future of a complex system, e.g., regression analysis [2], the autoregressive integrated moving average (ARIMA) model [3], the autoregressive conditional heteroscedasticity (ARCH) model [4], the generalized ARCH (GARCH) model [5], and so on. However, these methods require some premise hypotheses, such as a normality postulate [6]. Meanwhile, models that must satisfy such constraints precisely can miss the true optimum design within the confines of practical and realistic approximations. Therefore, Song and Chissom proposed the fuzzy time series (FTS) forecasting model [7,8,9]. Since then, the FTS model has been applied to many nonlinear and complicated forecasting problems, e.g., the stock market [10,11,12,13], electricity load demand [14,15], project cost [16], and enrollments at the University of Alabama [17,18].
A vast majority of FTS models are first-order and high-order fuzzy AR (autoregressive) models. These models can be considered as an equivalent version of AR(n) based on fuzzy lagged variables of the time series. Most of these fuzzy time series models follow the basic steps proposed by Chen [19], which are illustrated by the short sketch after the list:
Step 1:
Define the universe U and the number and length of the intervals;
Step 2:
Fuzzify the historical training time series into fuzzy time series;
Step 3:
Establish fuzzy logical relationships (FLR) according to the historical fuzzy time series and generate forecasting rules based on fuzzy logical groups (FLG);
Step 4:
Calculate the forecast values according to the FLG rules and the right-hand side (RHS) of the forecasted point.
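The four steps above describe the generic Chen-type workflow rather than the model proposed in this paper. The following minimal Python sketch is only an illustration of how the steps fit together; the equal-length interval partition, first-order relationships, and midpoint defuzzification are choices made here for brevity, not the method of any particular reference.

```python
# A generic, first-order Chen-style sketch (illustration only, not this paper's model).
import numpy as np

def chen_style_forecast(series, n_intervals=7):
    series = np.asarray(series, dtype=float)
    lo, hi = series.min(), series.max()                  # Step 1: universe of discourse U
    edges = np.linspace(lo, hi, n_intervals + 1)         # equal-length intervals
    mids = (edges[:-1] + edges[1:]) / 2
    # Step 2: fuzzify each observation to the index of the interval it falls in
    labels = np.clip(np.searchsorted(edges, series, side="right") - 1, 0, n_intervals - 1)
    # Step 3: first-order FLRs A_i -> A_j, grouped by their antecedent (FLGs)
    groups = {}
    for i, j in zip(labels[:-1], labels[1:]):
        groups.setdefault(i, []).append(j)
    # Step 4: forecast the next value as the mean midpoint of the matched group
    last = labels[-1]
    rhs = groups.get(last, [last])
    return float(np.mean(mids[rhs]))

print(chen_style_forecast([6152, 6199, 6404, 6421, 6406, 6363, 6319]))
```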
In order to improve the accuracy of such kinds of FTS models, researchers have proposed other improved models based on Chen’s model. For example, concerning the determination of suitable intervals, Huarng [20] proposed averages and distribution methods to determine the optimal interval length. Huarng and Yu [21] proposed an unequal interval length method based on ratios of data. Since then, many studies [20,22,23,24,25,26,27] have been carried out for the determination of the optimal interval length using statistical theory. Some authors even employed PSO techniques to determine the length of the intervals [12]. In fact, in addition to the determination of intervals, the definition of the universe of discourse also has an effect on the accuracy of the forecasting results. In these models, minimum data value, maximum data value, and two suitable positive numbers must be determined to make a proper bound of the universe of discourse.
Concerning the establishment of fuzzy logical relationships, many researchers utilize artificial neural networks to determine fuzzy relations [28,29,30]. The study of Aladag et al. [28] is considered a basic high-order method for forecasting based on artificial neural networks. Meanwhile, fuzzy AR models are also widely used in many fuzzy time series forecasting studies [11,12,31,32,33,34,35]. In order to reflect the recurrence and the weights of different FLRs in fuzzy AR models, Yu [36] used a chronologically-determined weight in the defuzzification process. Cheng et al. [37] used the frequencies of the different right-hand sides (RHS) of FLG rules to determine the weight of each LHS. Furthermore, many studies employed the adaptive network-based fuzzy inference system (ANFIS) method [38] for time series forecasting. For example, Primoz and Bojan [39] defined soft constraints based on ANFIS for discrete optimization to obtain optimal solutions. Egrioglu et al. [40] proposed a model named the modified adaptive network-based fuzzy inference system (MANFIS). Sarica et al. [41] developed a model based on an autoregressive adaptive network-based fuzzy inference system (AR-ANFIS), etc. Since 2013, considering the impact of specification errors, fuzzy autoregressive moving average (ARMA) time series forecasting models have been proposed [42,43]. The initial first-order ARMA fuzzy time series forecasting model was proposed by Egrioglu et al. [42] based on the particle swarm optimization method. Kocak [43] developed a high-order ARMA fuzzy time series model based on artificial neural networks. Kocak [44] used both fuzzy AR variables and fuzzy MA variables to increase the performance of the forecasting models.
A forecasting model is used to predict the future fluctuation of a time series based on its current values. Therefore, we present a novel method to forecast the fluctuation of a stock market based on a high-order AR(n) fuzzy time series model and the particle swarm optimization (PSO) algorithm. Unlike existing models, the proposed model is based on the fluctuation values instead of the exact values of the time series. Firstly, we calculate the fluctuation of each datum by comparing it with the datum of the previous day in a historical training time series to generate a new fluctuation trend time series (FTTS). Then, we fuzzify the FTTS into a fuzzy-fluctuation time series (FFTS) according to the up, equal, or down range of each fluctuation value. Since the relationship between the historical FFTS and future fluctuation trends is nonlinear, a PSO algorithm is employed to estimate the proportions of the lagged variables in the model. Finally, we use these acquired parameters to forecast future fluctuations. The main advantages of the proposed method are that it works on fluctuation values rather than exact values, it avoids the complex determination of interval lengths and the universe of discourse, and it relies completely on a machine learning mechanism to determine the forecasting parameters.
The remaining content of this paper is organized as follows: Section 2 introduces some preliminaries of fuzzy-fluctuation time series based on Song and Chissom’s fuzzy time series [7,8,9]. Section 3 introduces the process of the PSO machine learning method. Section 4 describes a novel approach for forecasting based on high-order fuzzy-fluctuation trends and the PSO heuristic learning process. In Section 5, the proposed model is used to forecast the stock market using TAIEX datasets from 1997 to 2005, the SHSECI from 2007 to 2015, and the year 2015 of the DAX30 index. Conclusions and potential issues for future research are summarized in Section 6.

2. Preliminaries

Song and Chissom [7,8,9] combined fuzzy set theory with time series and presented the following definitions of fuzzy time series. In this section, we will extend fuzzy time series to fuzzy-fluctuation time series (FFTS) and propose the related concepts.
Definition 1.
Let $L = \{l_1, l_2, \ldots, l_g\}$ be a fuzzy set in the universe of discourse $U$; it can be defined by its membership function $\mu_L : U \to [0,1]$, where $\mu_L(u_i)$ denotes the grade of membership of $u_i$, $U = \{u_1, u_2, \ldots, u_i, \ldots, u_l\}$.
The fluctuation trends of a stock market can be expressed by a linguistic set $L = \{l_1, l_2, \ldots, l_g\}$; e.g., let $g = 3$, $L = \{l_1, l_2, l_3\} = \{\text{down}, \text{equal}, \text{up}\}$. The element $l_i$ and its subscript $i$ are strictly monotonically increasing [45], so the function can be defined as follows: $f : l_i = f(i)$. To preserve all of the given information, the discrete set $L = \{l_1, l_2, \ldots, l_g\}$ can also be extended to a continuous label set $\bar{L} = \{l_a \mid a \in \mathbb{R}\}$, which satisfies the above characteristics.
Definition 2.
Let $X(t)\ (t = 1, 2, \ldots, T)$ be a time series of real numbers, where $T$ is the length of the time series. $Y(t)$ is defined as a fluctuation time series, where $Y(t) = X(t) - X(t-1)\ (t = 2, 3, \ldots, T)$. Each element of $Y(t)$ can be represented by a fuzzy set $S(t)\ (t = 2, 3, \ldots, T)$ as defined in Definition 1. Then the time series $Y(t)$ is said to be fuzzified into a fuzzy-fluctuation time series (FFTS) $S(t)$.
Definition 3.
Let $S(t)\ (t = 2, 3, \ldots, T)$ be a FFTS. If $S(t)$ is determined by $S(t-1), S(t-2), \ldots, S(t-n)$, then the fuzzy-fluctuation logical relationship is represented by:

$S(t) \leftarrow S(t-1), S(t-2), \ldots, S(t-n)$  (1)

and it is called the nth-order fuzzy-fluctuation logical relationship (FFLR) of the fuzzy-fluctuation time series, where $S(t)$ is called the left-hand side (LHS) and $S(t-n), \ldots, S(t-2), S(t-1)$ is called the right-hand side (RHS) of the FFLR. This model can be considered as an equivalent of the autoregressive model AR(n), defined in Equation (2):

$\bar{S}(t) = \phi_1 S(t-1) + \phi_2 S(t-2) + \cdots + \phi_n S(t-n) + \varepsilon_t$  (2)

where $\phi_k\ (k = 1, 2, \ldots, n)$ represents the proportion of $S(t-k)$ used in calculating the forecast, $\varepsilon_t$ is the calculation error, and $\bar{S}(t)$ is the continuous label introduced to preserve more information, as described in Definition 1.
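As a concrete reading of Definitions 2 and 3 and Equation (2), the short sketch below assumes the fuzzy labels are encoded as the integers 1 to g (here g = 3); the thresholds and the φ and ε values are hypothetical placeholders, and only the structure of the computation follows the text.

```python
# A small sketch of Definitions 2-3 and Equation (2) with integer-encoded labels.
def fluctuations(x):
    """Definition 2: Y(t) = X(t) - X(t-1)."""
    return [x[t] - x[t - 1] for t in range(1, len(x))]

def ar_n_estimate(s, t, phi, eps):
    """Equation (2): S_bar(t) = phi_1*S(t-1) + ... + phi_n*S(t-n) + eps."""
    n = len(phi)
    return sum(phi[k] * s[t - 1 - k] for k in range(n)) + eps

x = [6152.43, 6199.91, 6404.31, 6421.75, 6406.99, 6363.89, 6319.34, 6241.32]
y = fluctuations(x)
s = [3 if v > 42.5 else (1 if v < -42.5 else 2) for v in y]  # illustrative thresholds only
phi = [0.2, 0.1, 0.1, 0.05, 0.05, 0.1]                       # hypothetical parameters
print(ar_n_estimate(s, t=6, phi=phi, eps=1.4))
```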

3. PSO-Based Machine Learning Method

In this paper, the particle swarm optimization (PSO) method is employed to estimate the parameters in Equation (2). The PSO method was introduced as an optimization method for continuous nonlinear functions [46]. It is a stochastic optimization technique inspired by social models such as bird flocking or fish schooling. During the optimization process, particles are distributed randomly in the design space and their positions and velocities are modified according to their personal best and global best solutions. Let $m+1$ represent the current time step, and let $x_{i,m+1}$, $v_{i,m+1}$, $x_{i,m}$, $v_{i,m}$ indicate the current position, current velocity, previous position, and previous velocity of particle $i$, respectively. The position and velocity of particle $i$ are manipulated according to the following equations:
$x_{i,m+1} = x_{i,m} + v_{i,m+1}$  (3)

$v_{i,m+1} = w \times v_{i,m} + c_1 \times Rand() \times (p_{i,m} - x_{i,m}) + c_2 \times Rand() \times (p_{g,m} - x_{i,m})$  (4)
where $w$ is an inertia weight which determines how much of the previous velocity is preserved [47], $c_1$ and $c_2$ are the self-confidence coefficient and social confidence coefficient, respectively, $Rand() \in [0,1]$ is a random number, and $p_{i,m}$ and $p_{g,m}$ are the personal best position found by particle $i$ and the global best position found by all particles in the swarm up to time step $m$, respectively.
Let the design space be defined by $[x_{min}, x_{max}]$. If the position of particle $i$ exceeds the boundary, then $x_{i,m+1}$ is modified as follows:
$x_{i,m+1} = \begin{cases} x_{max} - 0.5 \times Rand() \times (x_{max} - x_{min}), & \text{if } x_{i,m+1} > x_{max} \\ x_{min} + 0.5 \times Rand() \times (x_{max} - x_{min}), & \text{if } x_{i,m+1} < x_{min} \end{cases}$  (5)
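A compact sketch of the update rules in Equations (3)-(5) is given below. The swarm size, the bounds, and the sphere objective are placeholders used only so the loop runs end to end; only the update logic mirrors the equations in the text.

```python
# Minimal PSO loop implementing the update rules of Equations (3)-(5).
import numpy as np

rng = np.random.default_rng(0)
w, c1, c2 = 0.7298, 1.4962, 1.4962           # values used later in Section 5
x_min, x_max = -1.0, 1.0
n_particles, dim = 20, 6

def objective(x):                             # placeholder fitness (sphere function)
    return float(np.sum(x ** 2))

x = rng.uniform(x_min, x_max, (n_particles, dim))
v = rng.uniform(x_min, x_max, (n_particles, dim))
p_best = x.copy()
p_best_val = np.array([objective(xi) for xi in x])
g_best = p_best[np.argmin(p_best_val)].copy()

for m in range(100):
    for i in range(n_particles):
        # Equation (4): velocity update with inertia, cognitive, and social terms
        v[i] = (w * v[i]
                + c1 * rng.random(dim) * (p_best[i] - x[i])
                + c2 * rng.random(dim) * (g_best - x[i]))
        # Equation (3): position update
        x[i] = x[i] + v[i]
        # Equation (5): re-inject particles that leave [x_min, x_max]
        high, low = x[i] > x_max, x[i] < x_min
        x[i][high] = x_max - 0.5 * rng.random(high.sum()) * (x_max - x_min)
        x[i][low] = x_min + 0.5 * rng.random(low.sum()) * (x_max - x_min)
        f = objective(x[i])
        if f < p_best_val[i]:
            p_best[i], p_best_val[i] = x[i].copy(), f
    g_best = p_best[np.argmin(p_best_val)].copy()

print(g_best)
```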

4. A Novel Forecasting Model Based on High-Order Fuzzy-Fluctuation Trends

In this paper, we propose a novel forecasting model based on high-order fuzzy-fluctuation trends and a PSO machine learning algorithm. In order to compare the forecasting results with other researchers’ work [10,11,27,36,48,49,50,51], the authentic TAIEX (Taiwan Stock Exchange Capitalization Weighted Stock Index) is employed to illustrate the forecasting process. The data from January 1999 to October 1999 are used as the training time series and the data from November 1999 to December 1999 are used as the testing dataset. The basic steps of the proposed model are shown in Figure 1.
Step 1: Construct FFTS for historical training data
For each element $X(t)\ (t = 1, 2, \ldots, T)$ in the historical training time series, its fluctuation trend is determined by $Y(t) = X(t) - X(t-1)\ (t = 2, 3, \ldots, T)$. According to the range and orientation of the fluctuations, $Y(t)\ (t = 2, 3, \ldots, T)$ can be fuzzified into a linguistic set {down, equal, up}. Let $len$ be the whole mean of all elements in the fluctuation time series $Y(t)\ (t = 2, 3, \ldots, T)$ and define $u_1 = (-\infty, -len/2)$, $u_2 = [-len/2, len/2)$, $u_3 = [len/2, +\infty)$; then $Y(t)\ (t = 2, 3, \ldots, T)$ can be fuzzified into a fuzzy-fluctuation time series $S(t)\ (t = 2, 3, \ldots, T)$. It can also be extended to a continuously labeled time series $\bar{S}(t)\ (t = 2, 3, \ldots, T)$, which preserves the exact original information of $Y(t)\ (t = 2, 3, \ldots, T)$.
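The sketch below implements Step 1 under two reading assumptions: $len$ is taken as the mean of the absolute fluctuations (the worked example in Section 5.1, where $len = 85$, suggests absolute values are intended), and the continuous label is computed as $\bar{S}(t) = Y(t)/len + 2$, which reproduces the example values given there.

```python
# Step 1 sketch: build the fluctuation series, its discrete labels, and the
# continuous labels (assumptions: len = mean absolute fluctuation, S_bar = Y/len + 2).
import numpy as np

def build_ffts(x):
    x = np.asarray(x, dtype=float)
    y = np.diff(x)                             # Y(t) = X(t) - X(t-1)
    length = np.mean(np.abs(y))                # 'len' in the text (assumed absolute mean)
    s = np.where(y < -length / 2, 1, np.where(y < length / 2, 2, 3))   # u1, u2, u3
    s_bar = y / length + 2                     # continuous labels preserving Y(t) exactly
    return y, s, s_bar, length

x = [6152.43, 6199.91, 6404.31, 6421.75, 6406.99, 6363.89]
y, s, s_bar, length = build_ffts(x)
print(round(float(length), 2), s, np.round(s_bar, 4))
```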
Step 2: Establish nth-order FFLRs for the forecasting model
According to Equation (2), each $\bar{S}(t)\ (t \geq n + 2)$ can be represented by its previous $n$ days’ fuzzy-fluctuation numbers. Therefore, the total number of FFLRs for the historical training data is $pn = T - n - 1$.
Step 3: Determine the parameters for the forecasting model based on the PSO machine learning algorithm
In this paper, the PSO method is employed to determine the parameters $\phi_k\ (k = 1, 2, \ldots, n)$ and a general error $\varepsilon$ in Equation (2). The personal best positions and the global best position are determined by minimizing the root mean squared error (RMSE) in the training process:
$RMSE = \sqrt{\dfrac{\sum_{t=1}^{n} (forecast(t) - actual(t))^2}{n}}$  (6)
where $n$ denotes the number of values forecasted, and $forecast(t)$ and $actual(t)$ denote the forecasting value and actual value at time $t$ in the training process, respectively. For determined $\phi_k\ (k = 1, 2, \ldots, n)$ and $\varepsilon$, the forecast value at time $t$ is as follows:
$forecast(t) = actual(t-1) + len \times (\phi_1 S(t-1) + \phi_2 S(t-2) + \cdots + \phi_n S(t-n) + \varepsilon - 2)$  (7)
The pseudo-code for the PSO-based machine learning algorithm is shown in Appendix A.
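To make Step 3 concrete, the sketch below evaluates the Equation (7) forecast and the Equation (6) fitness for one candidate parameter vector. The series, the labels, and the φ/ε values are illustrative placeholders (the real values are found by the PSO search), and the indexing convention used for s is an assumption.

```python
# Fitness evaluation sketch for Step 3 (Equations (6) and (7)).
import numpy as np

def forecast_at(actual, s, t, phi, eps, length):
    """Equation (7): forecast(t) = actual(t-1) + len * (sum_k phi_k*S(t-k) + eps - 2)."""
    trend = sum(phi[k] * s[t - 1 - k] for k in range(len(phi))) + eps
    return actual[t - 1] + length * (trend - 2)

def rmse_fitness(actual, s, phi, eps, length):
    """Equation (6): RMSE over every training point that has n lagged labels."""
    n = len(phi)
    errs = [forecast_at(actual, s, t, phi, eps, length) - actual[t]
            for t in range(n + 1, len(actual))]
    return float(np.sqrt(np.mean(np.square(errs))))

# s[t] labels the move from actual[t-1] to actual[t]; s[0] is a dummy value.
actual = [6152.43, 6199.91, 6404.31, 6421.75, 6406.99, 6363.89, 6319.34, 6241.32, 6454.6]
length = 85.0
s = [2] + [3 if d > length / 2 else (1 if d < -length / 2 else 2) for d in np.diff(actual)]
phi, eps = [-0.16, 0.08, 0.14, -0.03, 0.04, 0.25], 1.44   # illustrative values only
print(rmse_fitness(actual, s, phi, eps, length))
```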
Step 4: Forecast test time series
For each data point in the test time series, its future value can be forecast according to Equation (7), based on the observed data point $X(t-1)$, its nth-order fuzzy-fluctuation trends, and the parameters generated from the training dataset.

5. Empirical Analysis

5.1. Forecasting TAIEX

Many studies use TAIEX1999 as an example to illustrate their proposed forecasting methods [10,11,27,36,48,49,50,51]. In order to compare the accuracy with their models, we also use TAIEX1999 to illustrate the proposed method.
Step 1: Calculate the fluctuation trend for each element in the historical training dataset of TAIEX1999. Then, we use the whole mean of the fluctuation numbers of the training dataset to fuzzify the fluctuation trends into an FFTS. For example, the whole mean of the fluctuations of the historical dataset of TAIEX1999 from January to October is 85; that is to say, $len = 85$. For $X(1) = 6152.43$ and $X(2) = 6199.91$, $Y(2) = 47.48$, $S(2) = 3$, and $\bar{S}(2) \approx 2.5586$. Conversely, based on the previous datum $X(1)$ and the exact fuzzy number $\bar{S}(2)$, $X(2)$ can be obtained by $X(1) + len \times (\bar{S}(2) - 2)$, that is, $6152.43 + (2.5586 - 2) \times 85 \approx 6199.91$. In this way, the historical training dataset can be represented by a fuzzified fluctuation dataset as shown in Appendix B.
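The tiny check below reproduces this worked example, using the same $\bar{S} = Y/len + 2$ labeling as in the sketch of Step 1 (an assumption that is consistent with the numbers in the text).

```python
# Reproducing the Step 1 example: len = 85, fluctuation 47.48 -> label ~2.5586,
# and X(2) recovered from X(1) and the continuous label.
length = 85.0
x1, x2 = 6152.43, 6199.91
y2 = x2 - x1                       # 47.48
s2 = 3 if y2 > length / 2 else (1 if y2 < -length / 2 else 2)   # discrete label: 3 (up)
s2_bar = y2 / length + 2           # continuous label: about 2.5586
x2_rebuilt = x1 + length * (s2_bar - 2)
print(round(y2, 2), s2, round(s2_bar, 4), round(x2_rebuilt, 2))
```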
Step 2: Based on the FFTS from 5 January 1999 to 30 October 1999 shown in Appendix B (Table A2), establish the nth-order FFLRs for the forecasting model. For example, suppose $n = 6$; the following FFLRs of the FFTS can be generated:
$\bar{S}(7) = 1.082 = \phi_1 + \phi_2 + 2\phi_3 + 2\phi_4 + 3\phi_5 + 3\phi_6 + \varepsilon_7$
$\bar{S}(8) = 4.5091 = \phi_1 + \phi_2 + \phi_3 + 2\phi_4 + 2\phi_5 + 3\phi_6 + \varepsilon_8$
$\cdots$
$\bar{S}(221) = 3.7433 = 2\phi_1 + 2\phi_2 + 2\phi_3 + 2\phi_4 + 3\phi_5 + \phi_6 + \varepsilon_{221}$  (8)
Step 3: Replace each error $\varepsilon_t$ in Equation (8) with one and the same $\varepsilon$. Let the number of iterations $itern = 100$, the inertia weight $w = 0.7298$, and the self-confidence and social confidence coefficients $c_1 = c_2 = 1.4962$, and use the PSO algorithm described in Appendix A to determine the parameters $\phi_k\ (k = 1, 2, \ldots, n)$ and $\varepsilon$. In the PSO process, each candidate solution of the generalized Equation (8) is a particle, and the personal best and global best positions are determined by the RMSE between the actual values and the forecast values. The obtained global best parameters are shown in Table 1.
Step 4: Use the obtained global best parameters in Table 1 to forecast the test dataset from 1 November 1999 to 30 December 1999. For example, the forecast value of the TAIEX on 8 November 1999 is calculated as follows:
Firstly, according to the fuzzy-fluctuation trends (2,1,1,1,2,1) and the parameters in Table 1, the forecasted continuous labeled fuzzy-fluctuation number is:
$2 \times (-0.1638) + 0.0803 + 0.1372 - 0.0321 + 2 \times 0.0433 + 0.2546 + 1.4408 = 1.6398$
Then, the forecasted fluctuation from the current value to the next value can be obtained by defuzzifying the fluctuation fuzzy number:
$(1.6398 - 2) \times 85 = -30.62$
Finally, the forecasted value can be obtained from the current value and the fluctuation value:
$7376.56 - 30.62 = 7345.94$
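The following snippet re-runs this example with the Table 1 parameters; the lagged labels are paired with $\phi_1, \ldots, \phi_6$ in the order in which the text multiplies them.

```python
# Reproducing the 8 November 1999 forecast from the worked example.
phi = [-0.1638, 0.0803, 0.1372, -0.0321, 0.0433, 0.2546]   # Table 1 parameters
eps, length, previous = 1.4408, 85.0, 7376.56
lags = [2, 1, 1, 1, 2, 1]                                   # fuzzy-fluctuation trends
trend = sum(p * s for p, s in zip(phi, lags)) + eps         # about 1.6398
forecast = previous + length * (trend - 2)                  # about 7345.94
print(round(trend, 4), round(forecast, 2))
```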
The other forecasting results are shown in Table 2 and Figure 2.
The forecasting performance can be assessed by comparing the difference between the forecasted values and the actual values. The indicators widely used in comparisons of time series models are the mean squared error (MSE), root mean squared error (RMSE), mean absolute error (MAE), mean percentage error (MPE), etc. To compare the performance of different forecasting results, the Diebold-Mariano test statistic (S) is also widely used. These indicators are defined by Equations (9)–(13):
$MSE = \dfrac{\sum_{t=1}^{n} (forecast(t) - actual(t))^2}{n}$  (9)

$RMSE = \sqrt{\dfrac{\sum_{t=1}^{n} (forecast(t) - actual(t))^2}{n}}$  (10)

$MAE = \dfrac{\sum_{t=1}^{n} |forecast(t) - actual(t)|}{n}$  (11)

$MPE = \dfrac{\sum_{t=1}^{n} |forecast(t) - actual(t)| / actual(t)}{n}$  (12)

$S = \dfrac{\bar{d}}{(Variance(\bar{d}))^{1/2}}, \quad \bar{d} = \dfrac{\sum_{t=1}^{n} (error\ of\ forecast1)_t^2 - \sum_{t=1}^{n} (error\ of\ forecast2)_t^2}{n}$  (13)
where $n$ denotes the number of values forecasted, and $forecast(t)$ and $actual(t)$ denote the predicted value and actual value at time $t$, respectively. $S$ is the test statistic of the Diebold-Mariano method, which is used to compare the predictive accuracy of two forecasts obtained by different methods. Forecast1 represents the dataset obtained by method 1, and Forecast2 represents another dataset from method 2. If $S > 0$ and $|S| > Z = 1.64$ at the 0.05 significance level, Forecast2 has better predictive accuracy than Forecast1. With respect to the proposed method for the sixth order, the MSE, RMSE, MAE, and MPE are 9862.33, 99.31, 75.22, and 0.01, respectively.
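A sketch of these measures is given below; the example numbers are taken from the first four rows of Table 2, and the Diebold-Mariano statistic uses the plain sample variance of the loss differential, which is a simplification of the usual long-run variance estimator.

```python
# Accuracy measures from Equations (9)-(13), in simplified form.
import numpy as np

def metrics(forecast, actual):
    forecast, actual = np.asarray(forecast, float), np.asarray(actual, float)
    err = forecast - actual
    mse = np.mean(err ** 2)                                  # Equation (9)
    rmse = np.sqrt(mse)                                      # Equation (10)
    mae = np.mean(np.abs(err))                               # Equation (11)
    mpe = np.mean(np.abs(err) / actual)                      # Equation (12)
    return mse, rmse, mae, mpe

def dm_statistic(err1, err2):
    """Equation (13): compare the squared errors of two competing forecasts."""
    d = np.asarray(err1, float) ** 2 - np.asarray(err2, float) ** 2
    return float(np.mean(d) / np.sqrt(np.var(d, ddof=1) / len(d)))

actual   = [7814.89, 7721.59, 7580.09, 7469.23]   # first rows of Table 2
forecast = [7869.35, 7825.35, 7704.00, 7573.21]
print(metrics(forecast, actual))
```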
In order to compare the forecasting results under different parameters, such as the order $n$ and the number $g$ of elements in the linguistic set used in the fluctuation fuzzifying process, experiments with different parameter settings were carried out. Each type of experiment was repeated 30 times. The average forecasting errors of the experiments are shown in Table 3 and Table 4.
In Table 4, g = 3 represents that the linguistic set is {down, equal, up}, g = 5 means {greatly down, slightly down, equal, slightly up, greatly up}, g = 7 means {very greatly down, greatly down, slightly down, equal, slightly up, greatly up, very greatly up}, and “none” means that the fluctuation values will not be fuzzified at all.
From Table 3 and Table 4, we can see that the RMSEs are lower when $n$ is equal to six or more. With respect to the parameter $g$, the fuzzified fluctuation trends obviously perform better than non-fuzzified ones, and it is proper to let $g = 3$.
Letting n=6 and g=3, we employ the proposed method to forecast the TAIEX from 1997 to 2005. The forecasting results and errors are shown in Figure 3 and Table 5.
Table 6 shows a comparison of the RMSEs of different methods for forecasting the TAIEX1999. From this table, we can see that the performance of the proposed method is acceptable. The greatest advantage of the proposed method is that it relies completely on the machine learning mechanism. Although the RMSEs of some of the other methods outperform the proposed method, those methods often need to determine complex discretization partitioning rules or use adaptive expectation models to justify the final forecasting results. The method proposed in this paper is simpler and easily realized by a computer program.

5.2. Forecasting DAX30

The DAX30 is an important stock index in Germany. The RMSEs of different models for forecasting the year 2015 of the DAX30 are shown in Table 7.
From Table 7, we can see that the proposed method can successfully predict the DAX30 index.

5.3. Forecasting SHSECI

The SHSECI (Shanghai Stock Exchange Composite Index) is the most famous stock market index in China. In the following, we apply the proposed method to forecast the SHSECI from 2007 to 2015. For each year, the authentic datasets of historical daily SHSECI closing prices from January to October are used as the training data, and the datasets from November to December are used as the testing data. The RMSEs of forecast errors are shown in Table 8.
From Table 8, we can see that the proposed method can successfully predict the SHSECI stock market.

6. Conclusions

In this paper, a novel forecasting model is proposed based on high-order fuzzy-fluctuation logical trends and the PSO machine learning method. The proposed method is based on the fluctuations of the time series. The PSO method is employed to find the parameters that minimize the RMSE over the historical training dataset. Experiments show that the parameters generated from the training dataset can be successfully used for future datasets as well. In order to compare the performance with that of other methods, we take the TAIEX1999 as an example. We also forecasted the TAIEX for 1997–2005, the DAX30 for 2015, and the SHSECI for 2007–2015 to verify the method’s effectiveness and universality. In the future, we will consider other factors which might affect the fluctuation of the stock market, such as the trade volume, the opening value, the closing value, etc. We will also consider the influence of other stock markets, such as the Dow Jones, the NASDAQ, the M1b, and so on.

Acknowledgments

The authors are indebted to the anonymous reviewers for their insightful comments and constructive suggestions, which helped improve the quality of this paper. This work was supported by the National Natural Science Foundation of China under grant 71471076, the Fund of the Ministry of Education of Humanities and Social Sciences (14YJAZH025), the Fund of the China National Tourism Administration (15TACK003), the Natural Science Foundation of Shandong Province (ZR2013GM003), and the Foundation Program of Jiangsu University (16JDG005).

Author Contributions

Aiwu Zhao conceived and designed the experiments; Jingyuan Jia performed the experiments and analyzed the data; and Shuang Guan wrote the paper.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

The pseudo-code of the PSO-based machine learning algorithm for seeking the global best position of each constant parameter in the forecasting model is shown in Table A1.
Table A1. Pseudo-code of the PSO-based machine learning algorithm.
PSO-Based Machine Learning Algorithm for the Training Process
INPUT: X: training time series, containing T cases, denoted as X[1], X[2], ..., X[i], ..., X[T].
S: a fuzzy-fluctuation time series of the training data, containing T−1 cases, denoted as S[2], S[3], ..., S[i], ..., S[T].
n: the order of the model.
itern: the number of iterations.
xmin, xmax: lower and upper bounds of the design space.
w, c1, c2: parameters described in Equations (3) and (4).
OUTPUT: Φ[k] and ε: parameters for the forecasting model, k = 1, 2, ..., n.
1. Initialize the position and velocity of each particle i:
   pn = T − 1 − n;  /* the number of particles */
   For i = 1 to pn
      For j = 1 to n
         x[i,j] = rand(xmin, xmax);
         v[i,j] = rand(xmin, xmax);
2. Calculate the fitness value of each particle i according to Equation (6):
   Set x[pbest] to the current x[i] for each particle.
   Locate the global best fitness value x[gbest] and set Φ[k] and ε to the corresponding x[gbest].
3. For m = 1 to itern loop
   For each particle i
      Calculate the particle velocity according to Equation (4).
      Update the particle position according to Equations (3) and (5).
      If the fitness value is better than the best fitness value x[pbest] of particle i in history:
         Set the current value as the new x[pbest] of particle i.
   Locate the current global best fitness value; if it is better than the x[gbest] in history:
      Set the current global best fitness value as the new x[gbest], and set Φ[k] and ε to x[gbest].
4. Output Φ[k] and ε.
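For readers who prefer runnable code, the following Python sketch mirrors Table A1 under the assumptions already noted (len taken as the mean absolute fluctuation, labels encoded 1–3, forecasting via Equation (7), and assumed design-space bounds). It is an illustrative re-implementation, not the authors' original program.

```python
# Illustrative PSO training loop following the structure of Table A1.
import numpy as np

def train_pso(x, n=6, itern=100, w=0.7298, c1=1.4962, c2=1.4962,
              x_min=-1.0, x_max=2.0, seed=0):
    rng = np.random.default_rng(seed)
    x = np.asarray(x, float)
    y = np.diff(x)
    length = np.mean(np.abs(y))                  # assumed reading of 'len'
    # s[t] labels the move from x[t-1] to x[t]; s[0] is a dummy entry.
    s = np.concatenate(([2.0], np.where(y < -length / 2, 1.0,
                                        np.where(y < length / 2, 2.0, 3.0))))

    def rmse(params):                            # fitness, Equation (6)
        phi, eps = params[:n], params[n]
        errs = []
        for t in range(n + 1, len(x)):
            trend = phi @ s[t - n:t][::-1] + eps             # Equation (2), common eps
            errs.append(x[t - 1] + length * (trend - 2) - x[t])   # Equation (7)
        return float(np.sqrt(np.mean(np.square(errs))))

    pn = len(x) - 1 - n                          # number of particles, as in Table A1
    dim = n + 1                                  # phi_1..phi_n plus the shared eps
    pos = rng.uniform(x_min, x_max, (pn, dim))
    vel = rng.uniform(x_min, x_max, (pn, dim))
    pbest = pos.copy()
    pbest_val = np.array([rmse(p) for p in pos])
    gbest = pbest[np.argmin(pbest_val)].copy()

    for _ in range(itern):
        for i in range(pn):
            # Equation (4): velocity update; Equation (3): position update
            vel[i] = (w * vel[i]
                      + c1 * rng.random(dim) * (pbest[i] - pos[i])
                      + c2 * rng.random(dim) * (gbest - pos[i]))
            pos[i] += vel[i]
            # Equation (5): pull escaped particles back inside the bounds
            high, low = pos[i] > x_max, pos[i] < x_min
            pos[i][high] = x_max - 0.5 * rng.random(high.sum()) * (x_max - x_min)
            pos[i][low] = x_min + 0.5 * rng.random(low.sum()) * (x_max - x_min)
            val = rmse(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i].copy(), val
        gbest = pbest[np.argmin(pbest_val)].copy()

    return gbest[:n], gbest[n], length

# Usage on a small synthetic random-walk series (a real run would use the
# January-October training prices from Table A2):
prices = 7000 + np.cumsum(np.random.default_rng(1).normal(0, 80, 60))
phi, eps, length = train_pso(prices, n=6, itern=30)
print(np.round(phi, 4), round(float(eps), 4), round(float(length), 2))
```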

Appendix B

The historical training dataset can be represented by a fuzzified fluctuation dataset as shown in Table A2.
Table A2. Historical training data and fuzzified fluctuation data of TAIEX1999.
Date (MM/DD/YYYY) | TAIEX | Fluctuation | Fuzzified | Date (MM/DD/YYYY) | TAIEX | Fluctuation | Fuzzified | Date (MM/DD/YYYY) | TAIEX | Fluctuation | Fuzzified
1/5/19996152.43--4/17/19997581.5114.6837/26/19997595.71−128.811
1/6/19996199.9147.4834/19/19997623.1841.6827/27/19997367.97−227.741
1/7/19996404.31204.434/20/19997627.744.5627/28/19997484.5116.533
1/8/19996421.7517.4424/21/19997474.16−153.5817/29/19997359.37−125.131
1/11/19996406.99−14.7624/22/19997494.620.4427/30/19997413.1153.743
1/12/19996363.89−43.114/23/19997612.8118.237/31/19997326.75−86.361
1/13/19996319.34−44.5514/26/19997629.0916.2928/2/19997195.94−130.811
1/14/19996241.32−78.0214/27/19997550.13−78.9618/3/19997175.19−20.752
1/15/19996454.6213.2834/28/19997496.61−53.5218/4/19997110.8−64.391
1/16/19996483.328.724/29/19997289.62−206.9918/5/19996959.73−151.071
1/18/19996377.25−106.0514/30/19997371.1781.5538/6/19996823.52−136.211
1/19/19996343.36−33.8925/3/19997383.2612.0928/7/19997049.74226.223
1/20/19996310.71−32.6525/4/19997588.04204.7838/9/19997028.01−21.732
1/21/19996332.221.4925/5/19997572.16−15.8828/10/19997269.6241.593
1/22/19996228.95−103.2515/6/19997560.05−12.1128/11/19997228.68−40.922
1/25/19996033.21−195.7415/7/19997469.33−90.7218/12/19997330.24101.563
1/26/19996115.6482.4335/10/19997484.3715.0428/13/19997626.05295.813
1/27/19996138.8723.2325/11/19997474.45−9.9228/16/19998018.47392.423
1/28/19996063.41−75.4615/12/19997448.41−26.0428/17/19998083.4364.963
1/29/19995984−79.4115/13/19997416.2−32.2128/18/19997993.71−89.721
1/30/19995998.3214.3225/14/19997592.53176.3338/19/19997964.67−29.042
2/1/19995862.79−135.5315/15/19997576.64−15.8928/20/19998117.42152.753
2/2/19995749.64−113.1515/17/19997599.7623.1228/21/19998153.5736.152
2/3/19995743.86−5.7825/18/19997585.51−14.2528/23/19998119.98−33.592
2/4/19995514.89−228.9715/19/19997614.629.0928/24/19997984.39−135.591
2/5/19995474.79−40.125/20/19997608.88−5.7228/25/19998127.09142.73
2/6/19995710.18235.3935/21/19997606.69−2.1928/26/19998097.57−29.522
2/8/19995822.98112.835/24/19997588.23−18.4628/27/19998053.97−43.61
2/9/19995723.73−99.2515/25/19997417.03−171.218/30/19998071.3617.392
2/10/1999579874.2735/26/19997426.639.628/31/19998157.7386.373
2/20/19996072.33274.3335/27/19997469.0142.3829/1/19998273.33115.63
2/22/19996313.63241.335/28/19997387.37−81.6419/2/19998226.15−47.181
2/23/19996180.94−132.6915/29/19997419.732.3329/3/19998073.97−152.181
2/24/19996238.8757.9335/31/19997316.57−103.1319/4/19998065.11−8.862
2/25/19996275.5336.6626/1/19997397.6281.0539/6/19998130.2865.173
2/26/19996318.5242.9936/2/19997488.0390.4139/7/19997945.76−184.521
3/1/19996312.25−6.2726/3/19997572.9184.8839/8/19997973.327.542
3/2/19996263.54−48.7116/4/19997590.4417.5329/9/19998025.0251.723
3/3/19996403.14139.636/5/19997639.348.8639/10/19998161.46136.443
3/4/19996393.74−9.426/7/19997802.69163.3939/13/19998178.6917.232
3/5/19996383.09−10.6526/8/19997892.1389.4439/14/19998092.02−86.671
3/6/19996421.7338.6426/9/19997957.7165.5839/15/19997971.04−120.981
3/8/19996431.9610.2326/10/19997996.7639.0529/16/19997968.9−2.142
3/9/19996493.4361.4736/11/19997979.4−17.3629/17/19997916.92−51.981
3/10/19996486.61−6.8226/14/19997973.58−5.8229/18/19998016.93100.013
3/11/19996436.8−49.8116/15/19997960−13.5829/20/19997972.14−44.791
3/12/19996462.7325.9326/16/19998059.0299.0239/27/19997759.93−212.211
3/15/19996598.32135.5936/17/19998274.36215.3439/28/19997577.85−182.081
3/16/19996672.2373.9136/21/19998413.48139.1239/29/19997615.4537.62
3/17/19996757.0784.8436/22/19998608.91195.4339/30/19997598.79−16.662
3/18/19996895.01137.9436/23/19998492.32−116.59110/1/19997694.9996.23
3/19/19996997.29102.2836/24/19998589.3196.99310/2/19997659.55−35.442
3/20/19996993.38−3.9126/25/19998265.96−323.35110/4/19997685.4825.932
3/22/19997043.2349.8536/28/19998281.4515.49210/5/19997557.01−128.471
3/23/19996945.48−97.7516/29/19998514.27232.82310/6/19997501.63−55.381
3/24/19996889.42−56.0616/30/19998467.37−46.9110/7/19997612110.373
3/25/19996941.3851.9637/2/19998572.09104.72310/8/19997552.98−59.021
3/26/19997033.2591.8737/3/19998563.55−8.54210/11/19997607.1154.133
3/29/19996901.68−131.5717/5/19998593.3529.8210/12/19997835.37228.263
3/30/19996898.66−3.0227/6/19998454.49−138.86110/13/19997836.941.572
3/31/19996881.72−16.9427/7/19998470.0715.58210/14/19997879.9142.973
4/1/19997018.68136.9637/8/19998592.43122.36310/15/19997819.09−60.821
4/2/19997232.51213.8337/9/19998550.27−42.16210/16/19997829.3910.32
4/3/19997182.2−50.3117/12/19998463.9−86.37110/18/19997745.26−84.131
4/6/19997163.99−18.2127/13/19998204.5−259.4110/19/19997692.96−52.31
4/7/19997135.89−28.127/14/19997888.66−315.84110/20/19997666.64−26.322
4/8/19997273.41137.5237/15/19997918.0429.38210/21/19997654.9−11.742
4/9/19997265.7−7.7127/16/19997411.58−506.46110/22/19997559.63−95.271
4/12/19997242.4−23.327/17/19997366.23−45.35110/25/19997680.87121.243
4/13/19997337.8595.4537/19/19997386.8920.66210/26/19997700.2919.422
4/14/19997398.6560.837/20/19997806.85419.96310/27/19997701.220.932
4/15/19997498.1799.5237/21/19997786.65−20.2210/28/19997681.85−19.372
4/16/19997466.82−31.3527/22/19997678.67−107.98110/29/19997706.6724.822
4/17/19997581.5114.6837/23/19997724.5245.85310/30/19997854.85148.183

References

  1. Kendall, S.M.; Ord, K. Time Series, 3rd ed.; Oxford University Press: New York, NY, USA, 1990. [Google Scholar]
  2. Stepnicka, M.; Cortez, P.; Donate, J.P.; Stepnickova, L. Forecasting seasonal time series with computational intelligence: On recent methods and the potential of their combinations. Expert Syst. Appl. 2013, 40, 1981–1992. [Google Scholar] [CrossRef] [Green Version]
  3. Conejo, A.J.; Plazas, M.A.; Espinola, R.; Molina, A.B. Day-ahead electricity price forecasting using the wavelet transform and ARIMA models. IEEE Trans. Power Syst. 2005, 20, 1035–1042. [Google Scholar] [CrossRef]
  4. Engle, R.F. Autoregressive conditional heteroscedasticity with estimates of the variance of United Kingdom inflation. Econometrica 1982, 50, 987–1007. [Google Scholar] [CrossRef]
  5. Bollerslev, T. Generalized autoregressive conditional heteroscedasticity. J. Econom. 1986, 31, 307–327. [Google Scholar] [CrossRef]
  6. Jilani, T.A.; Burney, S.M.A. M-factor high order fuzzy time series forecasting for road accident data: Analysis and design of intelligent systems using soft computing techniques. In Analysis and Design of Intelligent Systems Using Soft Computing Techniques; Springer: Berlin/Heidelberg, Germany, 2007; Volume 41, pp. 246–254. [Google Scholar]
  7. Song, Q.; Chissom, B.S. Forecasting enrollments with fuzzy time series—Part I. Fuzzy Sets Syst. 1993, 54, 1–9. [Google Scholar] [CrossRef]
  8. Song, Q.; Chissom, B.S. Fuzzy time series and its models. Fuzzy Sets Syst. 1993, 54, 269–277. [Google Scholar] [CrossRef]
  9. Song, Q.; Chissom, B.S. Forecasting enrollments with fuzzy time series—Part II. Fuzzy Sets Syst. 1994, 62, 1–8. [Google Scholar] [CrossRef]
  10. Chen, M.Y.; Chen, B.T. A hybrid fuzzy time series model based on granular computing for stock price forecasting. Inf. Sci. 2015, 294, 227–241. [Google Scholar] [CrossRef]
  11. Chen, S.M.; Chen, S.W. Fuzzy forecasting based on two-factors second-order fuzzy-trend logical relationship groups and the probabilities of trends of fuzzy logical relationships. IEEE Trans. Cybern. 2015, 45, 405–417. [Google Scholar] [PubMed]
  12. Chen, S.M.; Jian, W.S. Fuzzy forecasting based on two-factors second-order fuzzy-trend logical relationship groups, similarity measures and PSO techniques. Inf. Sci. 2017, 391–392, 65–79. [Google Scholar] [CrossRef]
  13. Rubio, A.; Bermudez, J.D.; Vercher, E. Improving stock index forecasts by using a new weighted fuzzy-trend time series method. Expert Syst. Appl. 2017, 76, 12–20. [Google Scholar] [CrossRef]
  14. Efendi, R.; Ismail, Z.; Deris, M.M. A new linguistic out-sample approach of fuzzy time series for daily forecasting of Malaysian electricity load demand. Appl. Soft Comput. 2015, 28, 422–430. [Google Scholar] [CrossRef]
  15. Sadaei, H.J.; Guimaraes, F.G.; Silva, C.J.; Lee, M.H.; Eslami, T. Short-term load forecasting method based on fuzzy time series, seasonality and long memory process. Int. J. Approx. Reason. 2017, 83, 196–217. [Google Scholar] [CrossRef]
  16. Cheng, H.; Chang, R.J.; Yeh, C.A. Entropy-based and trapezoid fuzzification based fuzzy time series approach for forecasting it project cost. Technol. Forecast. Soc. Chang. 2006, 73, 524–542. [Google Scholar] [CrossRef]
  17. Gangwar, S.S.; Kumar, S. Partitions based computational method for high-order fuzzy time series forecasting. Expert Syst. Appl. 2012, 39, 12158–12164. [Google Scholar] [CrossRef]
  18. Singh, S.R. A computational method of forecasting based on high-order fuzzy time series. Expert Syst. Appl. 2009, 36, 10551–10559. [Google Scholar] [CrossRef]
  19. Chen, S.M. Forecasting enrollments based on fuzzy time series. Fuzzy Sets Syst. 1996, 81, 311–319. [Google Scholar] [CrossRef]
  20. Huarng, K.H. Effective lengths of intervals to improve forecasting in fuzzy time series. Fuzzy Sets Syst. 2001, 123, 387–394. [Google Scholar] [CrossRef]
  21. Huarng, K.; Yu, T.H.K. Ratio-based lengths of intervals to improve fuzzy time series forecasting. IEEE Trans. Syst. Man Cybern. Part B Cybern. 2006, 36, 328–340. [Google Scholar] [CrossRef]
  22. Askari, S.; Montazerin, N. A high-order multi-variable fuzzy times series forecasting algorithm based on fuzzy clustering. Expert Syst. Appl. 2015, 42, 2121–2135. [Google Scholar] [CrossRef]
  23. Egrioglu, E.; Aladag, C.H.; Basaran, M.A.; Uslu, V.R.; Yolcu, U. A new approach based on the optimization of the length of intervals in fuzzy time series. J. Intell. Fuzzy Syst. 2011, 22, 15–19. [Google Scholar]
  24. Egrioglu, E.; Aladag, C.H.; Yolcu, U.; Uslu, V.R.; Basaran, M.A. Finding an optimal interval length in high order fuzzy time series. Expert Syst. Appl. 2010, 37, 5052–5055. [Google Scholar] [CrossRef]
  25. Wang, L.; Liu, X.; Pedrycz, W. Effective intervals determined by information granules to improve forecasting in fuzzy time series. Expert Syst. Appl. 2013, 40, 5673–5679. [Google Scholar] [CrossRef]
  26. Yolcu, U.; Egrioglu, E.; Uslu, V.R.; Basaran, M.A.; Aladag, C.H. A new approach for determining the length of intervals for fuzzy time series. Appl. Soft Comput. 2009, 9, 647–651. [Google Scholar] [CrossRef]
  27. Zhao, A.W.; Guan, S.; Guan, H.J. A computational fuzzy time series forecasting model based on GEM-based discretization and hierarchical fuzzy logical rules. J. Intell. Fuzzy Syst. 2016, 31, 2795–2806. [Google Scholar] [CrossRef]
  28. Aladag, C.H.; Basaran, M.A.; Egrioglu, E.; Yolcu, U.; Uslu, V.R. Forecasting in high order fuzzy time series by using neural networks to define fuzzy relations. Expert Syst. Appl. 2009, 36, 4228–4231. [Google Scholar] [CrossRef]
  29. Egrioglu, E.; Aladag, C.H.; Yolcu, U.; Uslu, V.R.; Basaran, M.A. A new approach based on artificial neural networks for high order multivariate fuzzy time series. Expert Syst. Appl. 2009, 36, 10589–10594. [Google Scholar] [CrossRef]
  30. Huarng, K.; Yu, T.H.K. The application of neural networks to forecast fuzzy time series. Phys. A Stat. Mech. Appl. 2006, 363, 481–491. [Google Scholar] [CrossRef]
  31. Cai, Q.; Zhang, D.; Zheng, W.; Leung, S.C.H. A new fuzzy time series forecasting model combined with ant colony optimization and auto-regression. Knowl. Based Syst. 2015, 74, 61–68. [Google Scholar] [CrossRef]
  32. Chen, S.; Chang, Y. Multi-variable fuzzy forecasting based on fuzzy clustering and fuzzy rule interpolation techniques. Inf. Sci. 2010, 180, 4772–4783. [Google Scholar] [CrossRef]
  33. Chen, S.; Chen, C. TAIEX forecasting based on fuzzy time series and fuzzy variation groups. IEEE Trans. Fuzzy Syst. 2011, 19, 1–12. [Google Scholar] [CrossRef]
  34. Chen, S.; Chu, H.; Sheu, T. TAIEX forecasting using fuzzy time series and automatically generated weights of multiple factors. IEEE Trans. Syst. Man Cybern. Part A Syst. Hum. 2012, 42, 1485–1495. [Google Scholar] [CrossRef]
  35. Ye, F.; Zhang, L.; Zhang, D.; Fujita, H.; Gong, Z. A novel forecasting method based on multi-order fuzzy time series and technical analysis. Inf. Sci. 2016, 367–368, 41–57. [Google Scholar] [CrossRef]
  36. Yu, H.K. Weighted fuzzy time series models for TAIEX forecasting. Phys. A Stat. Mech. Appl. 2005, 349, 609–624. [Google Scholar] [CrossRef]
  37. Cheng, C.H.; Chen, T.L.; Teoh, H.J.; Chiang, C.H. Fuzzy time-series based on adaptive expectation model for TAIEX forecasting. Expert Syst. Appl. 2008, 34, 1126–1132. [Google Scholar] [CrossRef]
  38. Jang, J.S. ANFIS: Adaptive network based fuzzy inference systems. IEEE Trans. Syst. Man Cybern. 1993, 23, 665–685. [Google Scholar] [CrossRef]
  39. Primoz, J.; Bojan, Z. Discrete Optimization with Fuzzy Constraints. Symmetry 2017, 9, 87. [Google Scholar] [CrossRef]
  40. Egrioglu, E.; Aladag, C.H.; Yolcu, U.; Bas, E. A new adaptive network based fuzzy inference system for time series forecasting. Aloy J. Soft Comput. Appl. 2014, 2, 25–32. [Google Scholar]
  41. Sarica, B.; Egrioglu, E.; Asikgil, B. A new hybrid method for time series forecasting: AR-ANFIS. Neural Comput. Appl. 2016, 1–12. [Google Scholar] [CrossRef]
  42. Egrioglu, E.; Yolcu, U.; Aladag, C.H.; Kocak, C. An ARMA type fuzzy time series forecasting method based on particle swarm optimization. Math. Probl. Eng. 2013, 2013, 935815. [Google Scholar] [CrossRef]
  43. Kocak, C. A new high order fuzzy ARMA time series forecasting method by using neural networks to define fuzzy relations. Math. Probl. Eng. 2015, 2015, 128097. [Google Scholar] [CrossRef]
  44. Kocak, C. ARMA (p,q) type high order fuzzy time series forecast method based on fuzzy logic relations. Appl. Soft Comput. 2017, 59, 92–103. [Google Scholar] [CrossRef]
  45. Herrera, F.; Herrera-Viedma, E.; Verdegay, J.L. A model of consensus in group decision making under linguistic assessments. Fuzzy Sets Syst. 1996, 79, 73–87. [Google Scholar] [CrossRef]
  46. Kennedy, J.; Eberhart, R. Particle Swarm Optimization; Springer: New York, NY, USA, 2011. [Google Scholar]
  47. Schutte, J.F.; Reinbolt, J.A.; Fregly, B.J.; Haftka, R.T.; George, A.D. Parallel global optimization with the particle swarm algorithm. Commun. Numer. Methods Eng. 2004, 61, 2296–2315. [Google Scholar] [CrossRef] [PubMed]
  48. Chang, J.R.; Wei, L.Y.; Cheng, C.H. A hybrid ANFIS model based on AR and volatility for TAIEX Forecasting. Appl. Soft Comput. 2011, 11, 1388–1395. [Google Scholar] [CrossRef]
  49. Chen, S.M.; Manalu, G.M.T.; Pan, J.S.; Liu, H.C. Fuzzy forecasting based on two-factors second-order fuzzy-trend logical relationship groups and particle swarm optimization techniques. IEEE Trans. Cybern. 2013, 43, 1102–1117. [Google Scholar] [CrossRef] [PubMed]
  50. Cheng, C.H.; Wei, L.Y.; Liu, J.W.; Chen, T.L. OWA-based ANFIS model for TAIEX forecasting. Econ. Model. 2013, 30, 442–448. [Google Scholar] [CrossRef]
  51. Hsieh, T.J.; Hsiao, H.F.; Yeh, W.C. Forecasting stock markets using wavelet transforms and recurrent neural networks: An integrated system based on artificial bee colony algorithm. Appl. Soft Comput. 2011, 11, 2510–2525. [Google Scholar] [CrossRef]
Figure 1. Flowchart of our proposed forecasting model.
Figure 2. Forecasting results from 1 November 1999 to 30 December 1999.
Figure 3. The stock market fluctuation for the TAIEX test dataset (1997–2005).
Table 1. Global best parameters obtained using PSO for the training dataset.
ϕ1 | ϕ2 | ϕ3 | ϕ4 | ϕ5 | ϕ6 | ε | RMSE
−0.1638 | 0.0803 | 0.1372 | −0.0321 | 0.0433 | 0.2546 | 1.4408 | 115.73
Table 2. Forecasting results from 1 November 1999 to 30 December 1999.
Date (MM/DD/YYYY) | Actual | Forecast | (Forecast − Actual)² | Date (MM/DD/YYYY) | Actual | Forecast | (Forecast − Actual)²
11/1/19997814.897869.352965.8912/1/19997766.207705.593673.57
11/2/19997721.597825.3510,766.1412/2/19997806.267790.48249.01
11/3/19997580.097704.0015,353.6912/3/19997933.177824.2911,854.85
11/4/19997469.237573.2110,811.8412/4/19997964.497967.9612.04
11/5/19997488.267460.24785.1212/6/19997894.467965.875099.39
11/6/19997376.567468.508452.9612/7/19997827.057897.624980.12
11/8/19997401.497345.943085.8012/8/19997811.027806.2522.75
11/9/19997362.697400.031394.2812/9/19997738.847823.687197.83
11/10/19997401.817379.30506.7012/10/19997733.777701.121066.02
11/11/19997532.227410.8614,728.2512/13/19997883.617718.3827,300.95
11/15/19997545.037553.8277.2612/14/19997850.147921.865143.76
11/16/19997606.207569.421352.7712/15/19997859.897862.878.88
11/17/19997645.787631.90192.6512/16/19997739.767857.1213,773.37
11/18/19997718.067667.912515.0212/17/19997723.227750.49743.65
11/19/19997770.817750.58409.2512/18/19997797.877733.154188.68
11/20/19997900.347800.669936.1012/20/19997782.947815.101034.27
11/22/19998052.317936.5513,400.3812/21/19997934.267781.7423,262.35
11/23/19998046.198079.431104.9012/22/19998002.767953.132463.14
11/24/19997921.858072.4222,671.3212/23/19998083.498060.46530.38
11/25/19997904.537908.8318.4912/24/19998219.458119.709950.06
11/26/19997595.447912.20100,336.9012/27/19998415.078246.5728,392.25
11/29/19997823.907576.2161,350.3412/28/19998448.848462.94198.81
11/30/19997720.877823.0610,442.80Root Mean Square Error(RMSE)99.31
Table 3. Comparison of forecasting errors for different nth-orders (g = 3).
n | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10
RMSE | 109.04 | 105.47 | 103.04 | 102.96 | 101.92 | 99.12 | 99.59 | 99.6 | 98.75 | 99
Table 4. Comparison of forecasting errors for different linguistic sets (n = 6).
g | 3 | 5 | 7 | None
RMSE | 99.12 | 101.67 | 105.82 | 128.97
Table 5. RMSEs of forecast errors for TAIEX 1997 to 2005.
Year | 1997 | 1998 | 1999 | 2000 | 2001 | 2002 | 2003 | 2004 | 2005
RMSE | 143.60 | 115.34 | 99.12 | 125.70 | 115.91 | 70.43 | 54.26 | 57.24 | 54.68
Table 6. A comparison of RMSEs for different methods for forecasting the TAIEX1999.
Methods | RMSE | S
Yu’s Method (2005) [36] | 145 | 1.62 **
Hsieh et al.’s Method (2011) [51] | 94 | −0.32
Chang et al.’s Method (2011) [48] | 100 | 0.11
Cheng et al.’s Method (2013) [50] | 103 | 0.34
Chen et al.’s Method (2013) [49] | 102.11 | 0.21
Chen and Chen’s Method (2015) [11] | 103.9 | 0.36
Chen and Chen’s Method (2015) [10] | 92 | −0.42
Zhao et al.’s Method (2016) [27] | 110.85 | 1.08
The Proposed Method | 99.12 | -
** The proposed method has better predictive accuracy than the method at the 5% significance level.
Table 7. RMSEs of the forecast errors for year 2015 of the DAX30.
Year | Yu (2005) [36] | Cheng et al. (2008) [37] | Wang et al. (2013) [25] | Rubio et al. (2017) [13] | Proposed Model
RMSE | 172.69 | 170.56 | 376.80 | 153.15 | 159.22
S | 1.31 | 1.23 | 3.68 ** | −0.26 | -
** The proposed method has better predictive accuracy than the method at the 5% significance level.
Table 8. RMSEs of forecast errors for SHSECI from 2007 to 2015.
Year | 2007 | 2008 | 2009 | 2010 | 2011 | 2012 | 2013 | 2014 | 2015
RMSE | 113.11 | 55.28 | 49.59 | 45.73 | 28.45 | 25.05 | 19.86 | 41.44 | 59.5
