Article

Stock Market Forecasting Based on Spatiotemporal Deep Learning

1 Department of Statistics and Information Science, Fu Jen Catholic University, New Taipei City 242062, Taiwan
2 Department of Mathematics, Fu Jen Catholic University, New Taipei City 242062, Taiwan
* Author to whom correspondence should be addressed.
† These authors contributed equally to this work.
Entropy 2023, 25(9), 1326; https://doi.org/10.3390/e25091326
Submission received: 14 August 2023 / Revised: 5 September 2023 / Accepted: 11 September 2023 / Published: 12 September 2023

Abstract

This study introduces the Spacetimeformer model, a novel approach for predicting stock prices, leveraging the Transformer architecture with a time–space mechanism to capture both spatial and temporal interactions among stocks. Traditional Long Short-Term Memory (LSTM) and recent Transformer models lack the ability to directly incorporate spatial information, making the Spacetimeformer model a valuable addition to stock price prediction. This article uses the ten-minute stock prices of the constituent stocks of the Taiwan 50 Index and the intraday data of individual stocks on the Taiwan Stock Exchange. By training the Spacetimeformer model with multi-time-step stock price data, we can predict the stock prices at every ten-minute interval within the next hour. Finally, we compare the prediction results with those of LSTM and Transformer models that consider only temporal relationships. The research demonstrates that the Spacetimeformer model consistently captures essential trend changes and provides stable predictions in stock price forecasting. This article proposes a Spacetimeformer model combined with daily moving windows. This method delivers superior performance in stock price prediction and also demonstrates the significance and value of the space–time mechanism for prediction. We recommend that those who want to predict stock prices or other financial instruments try our proposed method to obtain a better return on investment.

1. Introduction

With the progress of society and changes in the economic environment, investment and finance have become mainstream trends in modern society. Among numerous investment options such as bonds, stocks, funds, and futures, stock trading is a popular choice for many people. In stock trading, quantitative trading is an important area that utilizes extensive historical data and market information to predict stock price trends [1,2,3]. Through these predictions, investors can obtain signals about stock price movements earlier and execute buying or selling actions with appropriate timing [4,5]. This behavior is sometimes likened to the concept of insider trading, as investors can act earlier than others based on their predictions to gain better trading opportunities. Therefore, in quantitative trading, how to forecast stock prices is a crucial issue. In recent years, with the development of artificial intelligence, there is an increasing amount of research confirming that neural network models outperform traditional time series methods in stock price prediction [6,7,8,9].
In addition to neural network models, deep learning models are currently the most important models in the field of artificial intelligence, especially in time series forecasting [10,11,12,13]. Among them, the LSTM model [14] is one of the most commonly used deep learning models and has shown outstanding performance in time series forecasting [15,16,17,18,19,20,21,22,23,24]. However, after the Transformer model [25] achieved breakthroughs in the field of natural language processing (NLP), researchers began to explore whether the Transformer model could also be applied to time series forecasting. The LSTM method was originally developed for NLP and was later applied to time series forecasting, where it also achieved good results. When Transformer technology succeeded in NLP tasks, many data scientists began to apply it to time series forecasting, demonstrating the Transformer model's potential to replace the LSTM model with better performance in time series forecasting [26,27,28,29,30,31,32,33]. Recently, it has become one of the most noteworthy new developments.
Traditional LSTM and recent Transformer models focus only on the temporal relationship between data points, predicting future information by capturing the dependencies between earlier and later observations. They overlook the spatial relationships that exist among variables, which can be complex and dynamic in the context of stocks [34,35]. For example, some companies may hold important positions in Apple's supply chain at certain times, while other companies may replace them at other times. Furthermore, the relationships among these companies are not fixed and may change over time. Therefore, considering the interaction among multiple stocks is crucial for stock price forecasting. However, most advanced models primarily concentrate on the time series characteristics of individual stocks and pay relatively little attention to the interactions among multiple stocks. Nevertheless, some scholars have proposed methods that integrate spatial and temporal concepts, such as the spatiotemporal attention-based convolutional network (STACN) model introduced by Lin et al. [36]. STACN uses a convolutional neural network to extract spatial feature maps from news headlines to capture the market structure of stocks. Simultaneously, an LSTM extracts temporal features from historical stock prices and relevant fundamental information to capture price variations and trends. Finally, STACN employs attention mechanisms to learn and select the most important features, utilizing spatiotemporal information for stock price prediction. Hou et al. [37] also proposed an approach that incorporates the graph structure relationships between different companies into the time series forecasting task. First, the method uses a Variational Autoencoder (VAE) to learn low-dimensional latent features from the fundamental data of companies. It then calculates the Euclidean distances between companies to establish a graph network and explore inter-company correlations.
Then, a hybrid deep neural network consisting of a graph convolutional network and a long-short term memory network (GCN-LSTM) is used to model the graph structure interaction among stocks and their price fluctuations over time. The method that combines spatial and temporal information for stock price prediction is called VAE-GCN-LSTM by Hou et al. [37]. We can observe that both Lin et al. [36] and Hou et al. [37] conducted research based on the LSTM model and did not utilize the Transformer model.
Based on the research background and motivation mentioned above, this study aims to explore the impact of spatiotemporal relationships on stock price prediction. To capture the relationships between different stocks at different time points, we employ the Spacetimeformer model [38], which incorporates spatiotemporal mechanisms, as the predictive model for stock prices. Additionally, to avoid the influence of cross-day prediction, we adopt a daily moving window approach in our experimental design. The main focus of this research is multi-time-step forecasting of individual stock prices. In the multi-time-step forecasting task, the Spacetimeformer model predicts the stock prices for every ten minutes within the next hour (six steps in total). Finally, we compare the predictions of the Spacetimeformer model, which integrates spatiotemporal concepts, with the latest Transformer model that considers only time and the more mature LSTM model for time series forecasting. The goal is to investigate the performance differences between models with spatiotemporal considerations and models that consider only time, as well as to assess the impact of spatiotemporal mechanisms on prediction accuracy.

2. Materials and Methods

2.1. Datasets

The data for this research come from the stock information of the Taiwan Stock Exchange provided by the stock trading system of Yuanta Securities Co., Ltd. (https://www.yuanta.com.tw/eYuanta/Securities/Stock, accessed on 1 May 2022), which is updated every minute. In order to effectively demonstrate the performance of the Spacetimeformer model, we selected six important constituent stocks of the Taiwan 50 Index (as shown in Table 1), announced in June 2022, as our research objects. They are Taiwan Semiconductor Manufacturing Co., Ltd. (TSMC, 2330.TW), United Microelectronics Corporation (UMC, 2303.TW), Delta Electronics, Inc. (DELTA, 2308.TW), Evergreen Marine Corporation (EMC, 2603.TW), Formosa Chemicals & Fibre Corporation (FCFC, 1326.TW), and Yuanta Financial Holding Co., Ltd. (YFH, 2885.TW).
Among these companies, TSMC and UMC are the two leaders in Taiwan's foundry industry, and both have long remained within the top five in the global semiconductor foundry field. In the 2022 global wafer foundry revenue ranking, TSMC ranked first in the world and UMC ranked third. The operating conditions of these two companies affect Taiwan's economic development as well as the trend of Taiwan's stock market. DELTA is a leading manufacturer of power management and thermal management solutions and holds a world-class position in many product fields. It has been included in the Dow Jones Sustainability Indexes for twelve consecutive years (2011–2022). EMC has set many records in the history of container shipping and holds a leading position worldwide in terms of fleet size and container carrying capacity. FCFC is one of the main members of the Formosa Plastics Group; whether in textiles, fiber products, or petrochemical products, the company holds a leading position in Taiwan and Asia. YFH is a financial holding company built on the dual axes of securities investment and commercial banking. It is among the market leaders in each of its businesses and has long been recognized by investors.
We use the ten-minute interval trading prices of these six stocks as input data for the model; the data range from 1 June 2022 to 18 November 2022. In order to avoid the sharp fluctuations in stock prices caused by the opening and closing auctions, this study narrows the trading time interval from the full session to [9:01, 13:21]. That is to say, we only consider twenty-seven time cut-off points each day: 9:01, 9:11, …, and 13:21. The descriptive statistics of the stock prices of these six companies are shown in Table 2, where S.D. is the abbreviation for the standard deviation. We can observe that the average stock prices of TSMC, DELTA, and EMC are relatively high, so their stock price variation is also large. The stock price trend chart is shown in Figure 1. Except for EMC, whose price series shows a discontinuity because it was re-listed after a capital reduction on 19 September 2022, the trends of the other stocks closely resemble a double-bottom pattern. Therefore, we reasonably suspect that there is some complex relationship among these stocks [39], which we refer to as a spatial correlation.
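The restriction to the twenty-seven ten-minute cut-off points per day can be sketched with pandas. The column name "close" and the synthetic price values below are purely illustrative, not the actual Yuanta data:

```python
import pandas as pd
import numpy as np

# Hypothetical 1-minute close prices for one trading day; the column name
# "close" and the synthetic values are illustrative, not the actual data.
idx = pd.date_range("2022-06-01 09:01", "2022-06-01 13:21", freq="1min")
prices = pd.DataFrame({"close": np.linspace(500.0, 505.0, len(idx))}, index=idx)

# Keep only the 27 ten-minute cut-off points 09:01, 09:11, ..., 13:21.
cutoffs = pd.date_range("2022-06-01 09:01", "2022-06-01 13:21", freq="10min")
ten_min = prices.loc[prices.index.isin(cutoffs)]
```

Repeating this per trading day yields the twenty-seven-point daily series used throughout the experiments.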

2.2. A Brief Review of the LSTM Model

Predicting stock prices using LSTM models has gained significant attention in financial markets due to their ability to capture complex temporal dependencies in historical price data. LSTM models are a type of recurrent neural network designed to overcome the vanishing gradient problem, making them particularly well suited for time series forecasting. The architecture of an LSTM model is composed of distinct layers, each serving a crucial role in processing sequential data. This design is particularly effective for capturing intricate temporal relationships within the data. A fundamental characteristic of LSTM models is their ability to stack multiple LSTM cells (shown in Figure 2) on top of each other in a layered fashion. This stacking facilitates the hierarchical learning of patterns and relationships within sequential data, making it a robust choice for various time series tasks.
Within each LSTM cell, a series of gates regulates the flow of information throughout the sequence. Their primary function is to control how information is introduced into the network, stored within the cell state, and ultimately released for prediction. The forget gate, in particular, is responsible for filtering out information that the model deems irrelevant or outdated. Only information deemed pertinent and aligned with the model’s learning objectives is retained, while less relevant data are actively discarded. This selective retention and discarding of information through the forget gate enables LSTM models to focus on the most meaningful aspects of the input sequence, enhancing their ability to make accurate predictions and capture underlying patterns effectively.
Let $x_t$ be the input at time $t$, and let $h_{t-1}$ and $c_{t-1}$ denote the previous hidden state and cell state, respectively, at time $t-1$. The initial stage in the LSTM architecture is the forget gate. This gate plays a crucial role in determining the relevance of elements within the cell state, essentially acting as the neural network’s filter for long-term memory. It makes this determination based on information from both the prior hidden state and the new input data. The network operating within the forget gate is trained to assign values close to 0 to information it deems irrelevant and values close to 1 to information it considers relevant. This process allows the LSTM to selectively retain or discard information from the cell state, enabling it to focus on what is most important for its learning objectives. The calculation method of the forgetting probability is given by
$f(t) = \operatorname{sigmoid}\left( W_f \cdot [x_t, h_{t-1}] + b_f \right),$
where $W_f$ and $b_f$ are the weight matrix and bias vector parameters, respectively, of the forget gate, which are learned during training.
In the subsequent phase of the LSTM process, we encounter the input gate and the new memory network. Their primary purpose at this stage is to determine what fresh information should be integrated into the network’s long-term memory, known as the cell state. This decision hinges on a careful evaluation of both the previous hidden state and the current input data. Similar to the forget gate, the output value from the input gate holds significant meaning. A low output value signals that the corresponding element of the cell state should remain unaltered, indicating a decision to not update that specific aspect of the memory. Crucially, the new memory update vector serves as a blueprint for adjusting each component of the long-term memory, the cell state. It essentially guides the LSTM in determining how much each memory element should be modified based on the most recent data, ensuring the model’s adaptability and responsiveness to evolving information. Let $W_i$, $W_c$, $b_i$, and $b_c$ be the corresponding weight matrices and bias vectors. The functions of the input gate and the new memory network are
$i(t) = \operatorname{sigmoid}\left( W_i \cdot [x_t, h_{t-1}] + b_i \right),$
and
$c(t) = \tanh\left( W_c \cdot [x_t, h_{t-1}] + b_c \right),$
respectively. The internal state can be updated by
$c_t = i(t) \odot c(t) + f(t) \odot c_{t-1},$
where ⊙ denotes the Hadamard product.
In the concluding phase of an LSTM’s operation, the pivotal task is to derive the new hidden state by leveraging the recently updated cell state, the preceding hidden state, and the incoming input data. The output gate essentially acts as a decision-maker, regulating which parts of the updated cell state and the previous hidden state should contribute to the final hidden state. This filtration mechanism ensures that only the most pertinent and contextually relevant information is incorporated, enabling the LSTM model to maintain a precise and informative hidden state while avoiding unnecessary complexity. The output gate is calculated as
$o(t) = \operatorname{sigmoid}\left( W_o \cdot [x_t, h_{t-1}] + b_o \right),$
where $W_o$ and $b_o$ are the weight matrix and bias vector parameters, respectively, of the output gate. The updated cell state is constrained to $[-1, 1]$ through the $\tanh$ activation function, and the final new hidden state is given by
$h_t = o(t) \odot \tanh(c_t).$
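The gate equations above can be collected into a single cell update. The following is a minimal NumPy sketch of one LSTM step; the random toy parameters stand in for trained weights and are purely illustrative:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, c_prev, W_f, b_f, W_i, b_i, W_c, b_c, W_o, b_o):
    """One LSTM cell step following the gate equations above.
    Each W_* has shape (hidden, input + hidden); [x_t, h_prev] is their concatenation."""
    z = np.concatenate([x_t, h_prev])            # [x_t, h_{t-1}]
    f = sigmoid(W_f @ z + b_f)                   # forget gate f(t)
    i = sigmoid(W_i @ z + b_i)                   # input gate i(t)
    c_new = np.tanh(W_c @ z + b_c)               # new memory c(t)
    c_t = i * c_new + f * c_prev                 # Hadamard products
    h_t = sigmoid(W_o @ z + b_o) * np.tanh(c_t)  # output gate o(t) times tanh(c_t)
    return h_t, c_t

# Tiny example with random parameters (shapes only; values are illustrative).
rng = np.random.default_rng(0)
d_in, d_h = 1, 4
params = [rng.normal(size=s) for s in [(d_h, d_in + d_h), (d_h,)] * 4]
h, c = lstm_step(rng.normal(size=d_in), np.zeros(d_h), np.zeros(d_h), *params)
```

Since $o(t) \in (0,1)$ and $\tanh(c_t) \in (-1,1)$, every component of the new hidden state is bounded in magnitude by one.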

2.3. Spacetimeformer Model

Time series forecasting plays an important role in many fields, including weather forecasting, traffic conditions, and financial forecasting. In the past, the LSTM had excellent performance in NLP tasks. However, the input of an LSTM is a vector, and the input must be processed step by step. Moreover, due to its recursive structure, the LSTM cannot capture long-term correlations in the sequence. Compared with the LSTM, the input of the Transformer is a matrix, which removes the ordering constraint on the input so that each Token in the sequence can be processed in parallel. Therefore, the Transformer has recently been more widely used in time series forecasting tasks. A Transformer basically consists of a series of encoder and decoder layers whose input is a matrix. The encoder uses the attention mechanism to understand the correlations between Tokens, while the decoder uses the information obtained from the encoder to produce task-specific predictions.
Time series forecasting is typically based on the sequence-to-sequence approach, where past variable values within a time range of $k$ steps are used to predict future target values for $h$ steps. We assume that $x_t$ represents the timestamp value at time $t$ (e.g., year, month, date), and that $y_t = (y_t^{(1)}, y_t^{(2)}, \ldots, y_t^{(N)}) \in \mathbb{R}^N$, where $N$ is the number of variables, represents the response vector at time $t$. Given the timestamps $(x_{T-k+1}, \ldots, x_T)$ and response sequences $(y_{T-k+1}, \ldots, y_T)$, the model outputs the response sequences $(y_{T+1}, \ldots, y_{T+h})$ for the future steps $(x_{T+1}, \ldots, x_{T+h})$. The Informer, proposed by Zhou et al. [33], is an encoder–decoder Transformer architecture for time series forecasting. This model embeds the time series into a high-dimensional space and uses zeros as placeholders for the unknown target sequence $(y_{T-k+1}, \ldots, y_T, 0_{T+1}, \ldots, 0_{T+h})$ to embed them into the same dimension. The model adds the timestamps ($x$) and response vectors ($y$) of the sequence to create an input sequence $Z \in \mathbb{R}^{(k+h) \times N}$ consisting of $k+h$ Tokens. This architecture has been demonstrated by Zhou et al. [33] to be effective for long-term forecasting. However, this setting causes the model to learn only temporal features, while ignoring the spatial correlation between response variables.
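The Informer-style input construction, with zeros standing in for the unknown future targets, can be sketched as follows (toy sizes for $k$, $h$, and $N$; the values are illustrative):

```python
import numpy as np

k, h, N = 4, 2, 3  # context length, horizon, number of variables (toy sizes)
y_past = np.arange(k * N, dtype=float).reshape(k, N)  # (y_{T-k+1}, ..., y_T)

# Zeros stand in for the unknown targets (y_{T+1}, ..., y_{T+h}),
# giving the (k + h) x N token matrix Z fed to the encoder-decoder.
Z = np.vstack([y_past, np.zeros((h, N))])
```

Each of the $k+h$ rows of `Z` becomes one Token, so the attention layers only relate whole time steps to each other, which is exactly the limitation the Spacetimeformer removes.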
In the past, many advanced models for multivariate time series forecasting tasks relied on attention mechanisms between time steps. Such models may be able to capture temporal correlations, but unfortunately have not been able to capture the different spatial relationships between variables. In order to solve this issue, Grigsby et al. [38] proposed the Spacetimeformer model based on the Informer encoder–decoder architecture. The model flattens each multivariate vector $y_t$ along the time dimension into $N$ timestamped scalar Tokens, transforming the input data into a spatiotemporal sequence (shown in Figure 3). Therefore, the new embedded sequence $Z \in \mathbb{R}^{N(k+h) \times 1}$ is $\big( (x_{T-k+1}, y_{T-k+1}^{(1)}), \ldots, (x_{T-k+1}, y_{T-k+1}^{(N)}), \ldots, (x_T, y_T^{(1)}), \ldots, (x_T, y_T^{(N)}), (x_{T+1}, 0_{T+1}^{(1)}), \ldots, (x_{T+1}, 0_{T+1}^{(N)}), \ldots, (x_{T+h}, 0_{T+h}^{(1)}), \ldots, (x_{T+h}, 0_{T+h}^{(N)}) \big)$. Consequently, when $Z$ passes through the attention layer, there is a direct path between every pair of Tokens, allowing the model to capture both temporal and spatial information. This enables the Spacetimeformer model to capture correlations between different variables at different time steps.
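The flattening of the $(k+h) \times N$ sequence into $N(k+h)$ spatiotemporal Tokens can be sketched as follows. Each Token here is a (timestamp, variable index, value) triple, a simplified stand-in for the model's learned embeddings:

```python
import numpy as np

k, h, N = 4, 2, 3
timestamps = np.arange(k + h)                   # x_{T-k+1}, ..., x_{T+h}
Z = np.vstack([np.arange(k * N, dtype=float).reshape(k, N),  # past responses
               np.zeros((h, N))])               # placeholders for the future

# Flatten (k+h, N) into N(k+h) tokens of (timestamp, variable index, value),
# ordered time-major as in the flattened sequence above.
t_idx, v_idx = np.meshgrid(np.arange(k + h), np.arange(N), indexing="ij")
tokens = np.stack([timestamps[t_idx.ravel()].astype(float),
                   v_idx.ravel().astype(float),
                   Z.ravel()], axis=1)
```

Because every scalar observation is now its own Token, attention can directly relate, say, stock 1 at time $T$ to stock 3 at time $T-2$, which is impossible with one Token per time step.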
The order of the inputs cannot be interpreted because the attention mechanism puts the input sequence into the model at the same time. Therefore, we need to add relative position information through position embedding. Time marks are an important feature in time series forecasting; thus, time vector embedding, called Time2Vec embedding [40], is added to capture the periodic and aperiodic relationships of time to produce accurate forecasts. We combine the straightened Time2Vec output with the stock price and project it onto the model through the forward propagation layer, which is called value and time embedding. This is the standard input sequence of the time series forecasting model, which enables each Token to contain time and stock price information. Furthermore, the model also needs to distinguish various stocks at different times; thus, variable embedding is added. The variable embedding straightens the variable indices along the dimension of time and projects each straightened variable index to the same dimension. Finally, the variable values, time embedding, and variable embedding are combined to create the input sequence (shown in Figure 3b) for the encoder, so that each Token carries information about times, stocks, and prices. The attention mechanism is then used to accurately interpret the temporal and spatial information embedded in the sequence.
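A minimal sketch of the Time2Vec embedding described above, following Kazemi et al. [40]: one linear component captures aperiodic trends and the remaining sinusoidal components capture periodic structure. The frequencies and phases below are random stand-ins for learned parameters:

```python
import numpy as np

def time2vec(tau, omega, phi):
    """Time2Vec embedding of a scalar time value tau:
    the first component is linear (omega_0 * tau + phi_0, aperiodic),
    the rest are sin(omega_i * tau + phi_i) (periodic)."""
    raw = omega * tau + phi
    return np.concatenate([raw[:1], np.sin(raw[1:])])

rng = np.random.default_rng(0)
d = 8                                   # embedding dimension (illustrative)
omega, phi = rng.normal(size=d), rng.normal(size=d)
emb = time2vec(3.0, omega, phi)
```

In the model, this vector is combined with the value and variable embeddings so that each Token carries time, stock, and price information.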

2.4. Experimental Design

In order to avoid degrading the performance of the model through cross-day forecasting, we use daily moving windows, as shown in Figure 4, to define the training set and testing set. On each trading day from 3 June 2022 to 4 November 2022, we take the time points 9:01, 9:11, …, and 12:21 as starting points. The fifty-five time points preceding each starting point are regarded as the input sequence, from which we predict the stock prices (target sequence) every ten minutes in the next hour (a total of six steps). Take 9:01 as an example: the model predicts the stock prices for the next six steps, 9:11, 9:21, …, and 10:01. At the last starting point, 12:21, the model predicts the stock prices at 12:31, 12:41, …, and 13:21. This process, as shown in Figure 5, is repeated until all the training data have been entered into the model, a total of 2205 sequences. We use the ten trading days from 7 November 2022 to 18 November 2022 as the testing data.
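The daily moving window construction can be sketched as follows. The context length of five is illustrative only (the paper uses a fifty-five-point input sequence); the key property is that no target window crosses a day boundary:

```python
def daily_windows(day_series, context=5, horizon=6):
    """Build (input, target) pairs within a single trading day so that
    no prediction crosses a day boundary. `context` is illustrative;
    the paper's input sequence is longer."""
    pairs = []
    for start in range(context, len(day_series) - horizon + 1):
        pairs.append((day_series[start - context:start],      # input sequence
                      day_series[start:start + horizon]))     # next 6 steps
    return pairs

day = list(range(27))            # the 27 ten-minute cut-offs of one day
pairs = daily_windows(day)
```

Running this for every trading day and concatenating the results yields the full set of training sequences.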
Finally, we compare the predictions of the Spacetimeformer model, which considers the interaction of time and space, with those of the Transformer model, which considers only temporal correlations. We also explore whether the performance of these two models differs from that of the earlier LSTM model. We use the mean absolute percentage error (MAPE) and root mean square error (RMSE) as performance evaluation indicators for the three models. The formulae for MAPE and RMSE are
$\mathrm{MAPE} = \dfrac{1}{n} \sum_{i=1}^{n} \left| \dfrac{y_i - \hat{y}_i}{y_i} \right|,$
and
$\mathrm{RMSE} = \sqrt{\dfrac{1}{n} \sum_{i=1}^{n} (y_i - \hat{y}_i)^2},$
respectively.
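The two evaluation metrics can be implemented directly; the sample values below are illustrative:

```python
import numpy as np

def mape(y_true, y_pred):
    """Mean absolute percentage error: mean of |(y - y_hat) / y|."""
    return np.mean(np.abs((y_true - y_pred) / y_true))

def rmse(y_true, y_pred):
    """Root mean square error: sqrt of the mean squared residual."""
    return np.sqrt(np.mean((y_true - y_pred) ** 2))

y = np.array([100.0, 102.0, 101.0])      # illustrative true prices
y_hat = np.array([99.0, 103.0, 101.0])   # illustrative predictions
```

Note that MAPE is scale-free (useful for comparing stocks with very different price levels), whereas RMSE is in the same units as the price.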

3. Results

First, we draw the trend charts of the stock price predictions, as shown in Figure 6, Figure 7, Figure 8, Figure 9, Figure 10 and Figure 11, to show the performance of the different models. In all figures, the red line represents the real values, the blue line the Spacetimeformer model, the orange line the Transformer model, and the green line the LSTM model. "Step i", i = 1, 2, …, 6, represents the ith-step predictions of each model. At the same time, we use MAPE (shown in Table 3, Table 4, Table 5, Table 6, Table 7 and Table 8) and RMSE (shown in Table 9, Table 10, Table 11, Table 12, Table 13 and Table 14, respectively) to evaluate the error between the stock price predicted by each model and the real stock price. The trend charts, MAPE, and RMSE help us evaluate the stock price prediction performance of the models more comprehensively.
Firstly, let us discuss the performance of the stock price forecast for TSMC. From Figure 6, it is evident that the Spacetimeformer model provides the closest predictions to the true stock prices. While the Transformer model exhibits a decent performance in stock price prediction before 16 November, it becomes noticeably distorted thereafter. On the other hand, the LSTM model shows the opposite trend, performing poorly before 16 November and improving afterwards. Furthermore, from the evaluation indexes of MAPE and RMSE in Table 3 and Table 9, respectively, we can observe that the Spacetimeformer model is indeed significantly better than the other two.
For the performance of UMC’s stock price prediction, we can observe from Figure 7 that the Spacetimeformer model maintains robustness. Its forecasted stock price is very close to the actual value. However, the performance of the Transformer model is inferior to the traditional LSTM model. Moreover, from the evaluation indexes of MAPE and RMSE in Table 4 and Table 10, respectively, it is clear that the Spacetimeformer model is better than the other two methods in UMC stock price prediction. However, some predictions of the Transformer model are indeed not as good as the LSTM model.
Next, we discuss the respective predictions of the three models for DELTA’s stock prices. From Figure 8, we can observe that the Spacetimeformer model continues to exhibit the best predictive performance. The Transformer model performed next, but with a significantly larger deviation than previous predictions for TSMC and UMC. On the other hand, the LSTM model performed poorly, with significant bias in stock price predictions. As shown in Table 5 and Table 11, the stock price forecast generated by the Spacetimeformer model is indeed better than the other two models. The predictions of the LSTM model were even more disappointing.
Regarding the stock price forecast of EMC, from Figure 9, we can observe that the stock price forecast of the Spacetimeformer model is still stable. The prediction performance of the LSTM model is the best prediction result so far. The Transformer model performed poorly, with the worst prediction results so far. From Table 6 and Table 12, we can clearly observe that the performance of the Spacetimeformer model and the LSTM model are almost comparable. It is worth mentioning that under the RMSE evaluation standard, LSTM is slightly better than Spacetimeformer in the fourth step of the prediction. However, the stock price prediction of the Transformer model is significantly inferior to the other two.
From Figure 10, we can observe that the Spacetimeformer model’s predictions of FCFC’s stock prices are still better than those of the other two. The performance of the Transformer model and the LSTM model is still not as good as expected, because there is a gap between their predictions and the true stock prices. From Table 7 and Table 13, we can observe that the stock price prediction of the Spacetimeformer model is significantly better than the other two, and the performance of both the Transformer model and the LSTM model is poor.
Finally, we discuss the stock price forecast of YFH. From Figure 11, we can observe that the stock price forecast of the Spacetimeformer model is still excellent. However, the Transformer model performs slightly better than the LSTM model, but both often underestimate the stock price. From Table 8 and Table 14, we can clearly observe that the stock price prediction of the Spacetimeformer model is also better than the other two, and the performance of the Transformer model or the LSTM model really needs to be strengthened.
According to the above results, the Spacetimeformer model can predict the stock price in the next ten minutes, twenty minutes, or even sixty minutes with the smallest error. It significantly outperforms the other two models in stock price prediction at all steps. The Transformer model is next, and the LSTM model has the highest error. Most of the predictions from the Spacetimeformer model fall close to the true values; the remaining deviations can be explained as natural fluctuations. Although the Transformer model outperforms the LSTM model, its predictions at some time points are significantly biased. This indicates poor smoothness and confirms again that the Spacetimeformer model, with its concept of time and space, performs better in stock price prediction. The Spacetimeformer model can continuously capture important trend changes in stock price forecasts and provide relatively stable forecast results. In the predictions of multiple stock prices, both the Transformer and LSTM models show very unstable performances, which further suggests that the Spacetimeformer model combined with the daily moving windows method may perform better in long-term forecasting.

4. Discussion

Investment and financial management have become important topics in modern society, and stock trading is an investment activity in which people have taken much interest. In stock trading, quantitative trading is a field worthy of research: a large amount of historical data and market information can be used to predict stock price trends in order to obtain better trading opportunities. In recent years, with the development of artificial intelligence and deep learning, the LSTM and Transformer models have shown good performance in stock price prediction. The traditional LSTM and the latest Transformer models mainly focus on temporal correlations but ignore the spatial relationships between stocks. There are often complex and dynamic relationships between stocks, such as among AI concept stocks or electric vehicle concept stocks. It is important to consider the mutual influence between stocks, but LSTM and Transformer models mainly focus on the time series characteristics of individual stocks; the interaction between multiple stocks is relatively less explored. Therefore, this article uses the Spacetimeformer model with a space–time mechanism as the model for predicting stock prices. The model is trained through the interaction mechanism of space and time to capture the relationships between different stocks at different times.
This study uses TSMC, UMC, DELTA, EMC, FCFC, and YFH from the constituent stocks of the Taiwan 50 Index as our research targets. The stock prices at ten-minute intervals from 1 June 2022 to 18 November 2022 serve as our research data. To avoid the influence of cross-day forecasts, we use daily moving windows to define the training and testing sets. In addition, through multi-time-step forecasting, we predict the stock prices every ten minutes in the next hour. We compare the Spacetimeformer model with the Transformer and LSTM models, which consider only the temporal relationship, and finally use the MAPE and RMSE indexes to evaluate the performance of the three models.
The research results show that the Spacetimeformer model with the space–time concept performs better in stock price prediction than the Transformer and LSTM models. The Spacetimeformer model can continuously capture important trend changes in stock price forecasts and provide relatively stable forecast results. Furthermore, the Spacetimeformer model’s predictions for the next ten minutes, twenty minutes, and even one hour are very close to the true values. This highlights the excellent performance of the Spacetimeformer model in both short- and long-term predictions. In contrast, the predictions of the Transformer and LSTM models at different time points might vary, indicating a less stable performance. This also further demonstrates the superior performance of models based on the Spacetimeformer architecture in both short- and long-term predictions. The experimental design of this study adopts a daily moving window approach to avoid the problems of cross-day forecasting and using long-term models for prediction. This allows us to more accurately assess model performance and leverage the adaptive nature of deep learning. In summary, the Spacetimeformer model combined with daily moving windows demonstrates a superior performance in stock price prediction compared to the Transformer and LSTM models, indicating the significance and value of spatiotemporal concepts for predictive modeling.
Therefore, we suggest that those who want to predict stock prices or other financial instruments use the Spacetimeformer model with its time–space interaction mechanism to obtain better results. In future work, we hope to verify whether the forecasting performance also holds for the stock markets of other countries, and to apply the model to other financial instruments, such as foreign exchange, futures contracts, and cryptocurrencies. This will help us provide more diverse investment forecasts and insights.

Author Contributions

Conceptualization, H.-Y.H. and Y.-H.K.; methodology, H.-Y.H. and Y.-H.K.; software, Y.-C.L. and H.-Y.H.; validation, H.-Y.H., N.-P.Y. and Y.-H.K.; formal analysis, Y.-C.L. and H.-Y.H.; investigation, Y.-H.K.; resources, H.-Y.H.; data curation, H.-Y.H. and N.-P.Y.; writing—original draft preparation, Y.-C.L. and Y.-H.K.; writing—review and editing, Y.-C.L. and Y.-H.K.; visualization, Y.-C.L. and Y.-H.K.; supervision, H.-Y.H., N.-P.Y. and Y.-H.K.; project administration, H.-Y.H., N.-P.Y. and Y.-H.K. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

All data for this article can be found at https://www.yuanta.com.tw/eYuanta/Securities/Stock, accessed on 1 May 2022.

Acknowledgments

The authors thank their families for their continued encouragement and support.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
LSTM      Long Short-Term Memory
NLP       Natural Language Processing
STACN     Spatial-Temporal Attention-Based Convolutional Network
VAE       Variational Autoencoder
GCN-LSTM  Graph Convolutional Network and Long Short-Term Memory Network
TSMC      Taiwan Semiconductor Manufacturing Co., Ltd.
UMC       United Microelectronics Corporation
DELTA     Delta Electronics, Inc.
EMC       Evergreen Marine Corporation
FCFC      Formosa Chemicals & Fibre Corporation
YFH       Yuanta Financial Holding Co., Ltd.
S.D.      Standard Deviation

References

  1. Liu, P.; Zheng, Y. Precision measurement of the return distribution property of the Chinese stock market index. Entropy 2023, 25, 36. [Google Scholar] [CrossRef] [PubMed]
  2. Kumar, G.; Jain, S.; Singh, U.P. Stock market forecasting using computational intelligence: A survey. Arch. Comput. Methods Eng. 2021, 28, 1069–1101. [Google Scholar] [CrossRef]
  3. Chen, Y.; Zhu, Z. An IPSO-FW-WSVM Method for Stock Trading Signal Forecasting. Entropy 2023, 25, 279. [Google Scholar] [CrossRef] [PubMed]
  4. Li, Z.C.; Xie, C.; Zeng, Z.J.; Wang, G.J.; Zhang, T. Forecasting global stock market volatilities in an uncertain world. Int. Rev. Financ. Anal. 2023, 85, 102463. [Google Scholar] [CrossRef]
  5. Yen, P.T.W.; Xia, K.; Cheong, S.A. Laplacian spectra of persistent structures in Taiwan, Singapore, and US stock markets. Entropy 2023, 25, 846. [Google Scholar] [CrossRef]
  6. Wijesinghe, G.W.R.I.; Rathnayaka, R.M.K.T. ARIMA and ANN Approach for forecasting daily stock price fluctuations of industries in Colombo Stock Exchange, Sri Lanka. In Proceedings of the 2020 5th International Conference on Information Technology Research (ICITR), Moratuwa, Sri Lanka, 2–4 December 2020; pp. 1–7. [Google Scholar]
  7. Zhang, G.; Patuwo, B.E.; Hu, M.Y. Forecasting with artificial neural networks: The state of the art. Int. J. Forecast. 1998, 14, 35–62. [Google Scholar] [CrossRef]
  8. Khattak, A.; Khan, A.; Ullah, H.; Asghar, M.U.; Arif, A.; Kundi, F.M.; Asghar, M.Z. An efficient supervised machine learning technique for forecasting stock market trends. Inf. Knowl. Internet Things 2022, 80, 143–162. [Google Scholar]
  9. Dezhkam, A.; Manzuri, M.T. Forecasting stock market for an efficient portfolio by combining XGBoost and Hilbert-Huang transform. Eng. Appl. Artif. Intell. 2023, 118, 105626. [Google Scholar] [CrossRef]
  10. Park, H.J.; Kim, Y.; Kim, H.Y. Stock market forecasting using a multi-task approach integrating long short-term memory and the random forest framework. Appl. Soft Comput. 2022, 114, 108106. [Google Scholar] [CrossRef]
  11. Kumbure, M.M.; Lohrmann, C.; Luukka, P.; Porras, J. Machine learning techniques and data for stock market forecasting: A literature review. Expert Syst. Appl. 2022, 197, 116659. [Google Scholar] [CrossRef]
  12. Kumar, D.; Sarangi, P.K.; Verma, R. A systematic review of stock market prediction using machine learning and statistical techniques. Mater. Today Proc. 2022, 49, 3187–3191. [Google Scholar] [CrossRef]
  13. Liapis, C.M.; Karanikola, A.; Kotsiantis, S. Investigating deep stock market forecasting with sentiment analysis. Entropy 2023, 25, 219. [Google Scholar] [CrossRef]
  14. Hochreiter, S.; Schmidhuber, J. Long short-term memory. Neural Comput. 1997, 9, 1735–1780. [Google Scholar] [CrossRef]
  15. Sharma, D.K.; Hota, H.S.; Brown, K.; Handa, R. Integration of genetic algorithm with artificial neural network for stock market forecasting. Int. J. Syst. Assur. Eng. Manag. 2022, 13, 828–841. [Google Scholar] [CrossRef]
  16. Bukhari, A.H.; Raja, M.A.Z.; Sulaiman, M.; Islam, S.; Shoaib, M.; Kumam, P. Fractional neuro-Sequential ARFIMA-LSTM for financial market forecasting. IEEE Access 2020, 8, 71326–71338. [Google Scholar] [CrossRef]
  17. Yunpeng, L.; Di, H.; Junpeng, B.; Yong, Q. Multi-step ahead time series forecasting for different data patterns based on LSTM recurrent neural network. In Proceedings of the 2017 14th Web Information Systems and Applications Conference (WISA), Liuzhou, China, 11–12 November 2017; pp. 305–310. [Google Scholar]
  18. Luo, J.; Zhu, G.; Xiang, H. Artificial intelligent based day-ahead stock market profit forecasting. Comput. Electr. Eng. 2022, 99, 107837. [Google Scholar] [CrossRef]
  19. Elsworth, S.; Güttel, S. Time series forecasting using LSTM networks: A symbolic approach. arXiv 2020, arXiv:2003.05672. [Google Scholar]
  20. Li, Y.; Zhu, Z.; Kong, D.; Han, H.; Zhao, Y. EA-LSTM: Evolutionary attention-based LSTM for time series prediction. Knowl.-Based Syst. 2019, 181, 104785. [Google Scholar] [CrossRef]
  21. Liu, F.; Cai, M.; Wang, L.; Lu, Y. An ensemble model based on adaptive noise reducer and over-fitting prevention LSTM for multivariate time series forecasting. IEEE Access 2019, 7, 26102–26115. [Google Scholar] [CrossRef]
  22. Livieris, I.E.; Pintelas, E.; Pintelas, P. A CNN–LSTM model for gold price time-series forecasting. Neural Comput. Appl. 2020, 32, 17351–17360. [Google Scholar] [CrossRef]
  23. Peng, L.; Zhu, Q.; Lv, S.X.; Wang, L. Effective long short-term memory with fruit fly optimization algorithm for time series forecasting. Soft Comput. 2020, 24, 15059–15079. [Google Scholar] [CrossRef]
  24. Siami-Namini, S.; Tavakoli, N.; Namin, A.S. The performance of LSTM and BiLSTM in forecasting time series. In Proceedings of the 2019 IEEE International Conference on Big Data (Big Data), Los Angeles, CA, USA, 9–12 December 2019; pp. 3285–3292. [Google Scholar]
  25. Vaswani, A.; Shazeer, N.; Parmar, N.; Uszkoreit, J.; Jones, L.; Gomez, A.N.; Kaiser, Ł.; Polosukhin, I. Attention is all you need. Adv. Neural Inf. Process. Syst. 2017, 30, 6000–6010. [Google Scholar]
  26. Araabi, A.; Monz, C. Optimizing transformer for low-resource neural machine translation. In Proceedings of the 28th International Conference on Computational Linguistics, Barcelona, Spain (Online), 8–13 December 2020; pp. 3429–3435. [Google Scholar]
  27. Chen, K.; Chen, G.; Xu, D.; Zhang, L.; Huang, Y.; Knoll, A. NAST: Non-autoregressive spatial-temporal transformer for time series forecasting. arXiv 2021, arXiv:2102.05624. [Google Scholar]
  28. Li, S.; Jin, X.; Xuan, Y.; Zhou, X.; Chen, W.; Wang, Y.X.; Yan, X. Enhancing the locality and breaking the memory bottleneck of transformer on time series forecasting. Adv. Neural Inf. Process. Syst. 2019, 32, 5243–5253. [Google Scholar]
  29. Li, Y.; Tian, C.; Lan, Y.; Yu, C.; Xie, K. Transformer with sparse attention mechanism for industrial time series forecasting. J. Physics Conf. Ser. 2021, 2026, 012036. [Google Scholar] [CrossRef]
  30. Wu, H.; Xu, J.; Wang, J.; Long, M. Autoformer: Decomposition transformers with auto-correlation for long-term series forecasting. Adv. Neural Inf. Process. Syst. 2021, 34, 22419–22430. [Google Scholar]
  31. Wu, Z.; Wu, L.; Meng, Q.; Xia, Y.; Xie, S.; Qin, T.; Dai, X.; Liu, T.Y. Unidrop: A simple yet effective technique to improve transformer without extra cost. arXiv 2021, arXiv:2104.04946. [Google Scholar]
  32. Zerveas, G.; Jayaraman, S.; Patel, D.; Bhamidipaty, A.; Eickhoff, C. Multi-step ahead time series forecasting for different data patterns based on LSTM recurrent neural network. In Proceedings of the 27th ACM SIGKDD Conference on Knowledge Discovery & Data Mining, Virtual Event, Singapore, 14–18 August 2021; pp. 2114–2124. [Google Scholar]
  33. Zhou, H.; Zhang, S.; Peng, J.; Zhang, S.; Li, J.; Xiong, H.; Zhang, W. Multi-step ahead time series forecasting for different data patterns based on LSTM recurrent neural network. In Proceedings of the AAAI Conference on Artificial Intelligence, Vancouver, BC, Canada, 2–9 February 2021; pp. 11106–11115. [Google Scholar]
  34. Scagliarini, T.; Faes, L.; Marinazzo, D.; Stramaglia, S.; Mantegna, R.N. Synergistic information transfer in the global system of financial markets. Entropy 2020, 22, 1000. [Google Scholar] [CrossRef]
  35. Scagliarini, T.; Pappalardo, G.; Biondo, A.E.; Pluchino, A.; Rapisarda, A.; Stramaglia, S. Pairwise and high-order dependencies in the cryptocurrency trading network. Sci. Rep. 2022, 12, 18483. [Google Scholar] [CrossRef]
  36. Lin, C.T.; Wang, Y.K.; Huang, P.L.; Shi, Y.; Chang, Y.C. Spatial-temporal attention-based convolutional network with text and numerical information for stock price prediction. Neural Comput. Appl. 2022, 34, 14387–14395. [Google Scholar] [CrossRef]
  37. Hou, X.; Wang, K.; Zhong, C.; Wei, Z. St-trader: A spatial-temporal deep neural network for modeling stock market movement. IEEE/CAA J. Autom. Sin. 2021, 8, 1015–1024. [Google Scholar] [CrossRef]
  38. Grigsby, J.; Wang, Z.; Nguyen, N.; Qi, Y. Long-range transformers for dynamic spatiotemporal forecasting. arXiv 2023, arXiv:2109.12218. [Google Scholar]
  39. Huang, P.Y.; Ni, Y.S.; Yu, C.M. The microstructure of the price-volume relationship of the constituent stocks of the Taiwan 50 Index. Emerg. Mark. Financ. Trade 2012, 48, 153–168. [Google Scholar] [CrossRef]
  40. Kazemi, S.M.; Goel, R.; Eghbali, S.; Ramanan, J.; Sahota, J.; Thakur, S.; Wu, S.; Smyth, C.; Poupart, P.; Brubaker, M. Time2vec: Learning a vector representation of time. arXiv 2019, arXiv:1907.05321. [Google Scholar]
Figure 1. Stock price trend of (a) TSMC, (b) UMC, (c) DELTA, (d) EMC, (e) FCFC, and (f) YFH from 1 June 2022 to 18 November 2022.
Figure 2. The structure of an LSTM cell.
Figure 3. Multivariate time series data. (a) Standard temporal input sequence. (b) Flattened spatiotemporal input sequence.
Figure 4. Illustration of the daily moving windows method.
Figure 5. Architecture of the Spacetimeformer model.
Figure 6. Comparison of the trend chart for TSMC’s stock price using the six-step model forecasting.
Figure 7. Comparison of the trend chart for UMC’s stock price using the six-step model forecasting.
Figure 8. Comparison of the trend chart for DELTA’s stock price using the six-step model forecasting.
Figure 9. Comparison of the trend chart for EMC’s stock price using the six-step model forecasting.
Figure 10. Comparison of the trend chart for FCFC’s stock price using the six-step model forecasting.
Figure 11. Comparison of the trend chart for YFH’s stock price using the six-step model forecasting.
Table 1. The constituent stocks of the Taiwan 50 Index announced in June 2022.
Stock Symbol
1101.TW  1216.TW  1301.TW  1303.TW  1326.TW  1590.TW  2002.TW  2207.TW  2303.TW  2308.TW
2317.TW  2327.TW  2330.TW  2357.TW  2379.TW  2382.TW  2395.TW  2408.TW  2409.TW  2412.TW
2454.TW  2603.TW  2609.TW  2615.TW  2801.TW  2880.TW  2881.TW  2882.TW  2883.TW  2884.TW
2885.TW  2886.TW  2887.TW  2891.TW  2892.TW  2912.TW  3008.TW  3034.TW  3037.TW  3045.TW
3711.TW  4904.TW  5871.TW  5876.TW  5880.TW  6415.TW  6505.TW  6770.TW  8046.TW  9910.TW
Table 2. Descriptive statistics of stock prices.
Name    Symbol    Mean      S.D.     Minimum   Median    Maximum
TSMC    2330.TW   470.18    45.72    371.00    486.00    555.00
UMC     2303.TW   41.24     3.93     35.05     40.05     52.60
DELTA   2308.TW   255.81    17.97    210.50    260.50    295.50
EMC     2603.TW   117.43    28.17    79.30     106.00    183.50
FCFC    1326.TW   70.92     4.89     64.30     68.90     82.60
YFH     2885.TW   20.53     1.29     18.75     20.15     24.10
Table 3. Comparison of the MAPE for TSMC’s stock price from 7 November 2022 to 18 November 2022 using the six-step model forecasting.
        Step 1               Step 2               Step 3
Date    STF    TF     LSTM   STF    TF     LSTM   STF    TF     LSTM
11/07   0.011  0.008  0.008  0.010  0.006  0.006  0.010  0.006  0.016
11/08   0.002  0.003  0.011  0.002  0.004  0.009  0.002  0.005  0.011
11/09   0.003  0.008  0.023  0.002  0.008  0.024  0.003  0.007  0.015
11/10   0.007  0.008  0.011  0.007  0.009  0.010  0.007  0.015  0.007
11/11   0.004  0.014  0.033  0.003  0.013  0.018  0.003  0.013  0.036
11/14   0.006  0.010  0.004  0.006  0.013  0.006  0.006  0.014  0.008
11/15   0.007  0.022  0.021  0.009  0.020  0.026  0.010  0.021  0.033
11/16   0.002  0.009  0.011  0.003  0.014  0.008  0.004  0.010  0.007
11/17   0.003  0.011  0.011  0.004  0.012  0.009  0.005  0.015  0.006
11/18   0.002  0.005  0.004  0.002  0.004  0.006  0.002  0.005  0.005
Mean    0.005  0.010  0.014  0.005  0.010  0.012  0.005  0.011  0.014
S.D.    0.003  0.005  0.009  0.003  0.005  0.007  0.003  0.005  0.011
        Step 4               Step 5               Step 6
Date    STF    TF     LSTM   STF    TF     LSTM   STF    TF     LSTM
11/07   0.009  0.005  0.012  0.009  0.005  0.004  0.009  0.006  0.014
11/08   0.002  0.003  0.013  0.002  0.003  0.012  0.003  0.003  0.009
11/09   0.003  0.007  0.024  0.003  0.007  0.027  0.004  0.009  0.037
11/10   0.008  0.010  0.006  0.008  0.010  0.010  0.008  0.012  0.011
11/11   0.003  0.011  0.029  0.003  0.012  0.021  0.003  0.012  0.017
11/14   0.006  0.007  0.006  0.006  0.009  0.008  0.006  0.011  0.007
11/15   0.011  0.019  0.019  0.013  0.023  0.019  0.013  0.022  0.017
11/16   0.004  0.013  0.006  0.004  0.015  0.009  0.004  0.012  0.010
11/17   0.005  0.018  0.005  0.006  0.018  0.008  0.006  0.020  0.009
11/18   0.002  0.005  0.004  0.002  0.005  0.004  0.002  0.005  0.006
Mean    0.005  0.010  0.012  0.006  0.011  0.012  0.006  0.011  0.014
S.D.    0.003  0.005  0.009  0.003  0.006  0.007  0.003  0.006  0.009
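For reference, the MAPE index reported in Tables 3, 4, 5, 6, 7 and 8 can be computed as follows. This is a minimal sketch of the standard definition, not code from the paper.

```python
def mape(actual, predicted):
    """Mean absolute percentage error: mean of |(y - y_hat) / y|."""
    assert len(actual) == len(predicted) > 0
    return sum(abs((a - p) / a) for a, p in zip(actual, predicted)) / len(actual)

# Hypothetical prices with a 1% absolute error at each point:
print(mape([100.0, 200.0], [101.0, 198.0]))  # → 0.01
```

Because MAPE is scale-free, it lets the six stocks, whose price levels differ by an order of magnitude (Table 2), be compared on the same footing; the RMSE values in Tables 9, 10, 11, 12, 13 and 14 remain in price units.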
Table 4. Comparison of the MAPE for UMC’s stock price from 7 November 2022 to 18 November 2022 using the six-step model forecasting.
        Step 1               Step 2               Step 3
Date    STF    TF     LSTM   STF    TF     LSTM   STF    TF     LSTM
11/07   0.002  0.007  0.017  0.002  0.008  0.015  0.002  0.007  0.015
11/08   0.002  0.011  0.007  0.003  0.014  0.009  0.003  0.017  0.007
11/09   0.002  0.010  0.008  0.003  0.008  0.019  0.003  0.010  0.013
11/10   0.002  0.006  0.006  0.003  0.006  0.005  0.003  0.006  0.003
11/11   0.004  0.028  0.018  0.005  0.031  0.023  0.005  0.031  0.024
11/14   0.003  0.010  0.006  0.003  0.009  0.008  0.004  0.011  0.004
11/15   0.006  0.025  0.027  0.008  0.029  0.029  0.008  0.029  0.029
11/16   0.005  0.009  0.006  0.005  0.011  0.003  0.005  0.007  0.008
11/17   0.003  0.016  0.008  0.004  0.013  0.009  0.004  0.010  0.010
11/18   0.004  0.008  0.032  0.004  0.010  0.025  0.004  0.010  0.020
Mean    0.003  0.013  0.014  0.004  0.014  0.015  0.004  0.014  0.013
S.D.    0.002  0.007  0.009  0.002  0.008  0.009  0.002  0.009  0.008
        Step 4               Step 5               Step 6
Date    STF    TF     LSTM   STF    TF     LSTM   STF    TF     LSTM
11/07   0.002  0.007  0.018  0.002  0.008  0.015  0.002  0.008  0.021
11/08   0.003  0.016  0.006  0.003  0.019  0.004  0.003  0.020  0.006
11/09   0.003  0.008  0.017  0.003  0.009  0.017  0.003  0.009  0.011
11/10   0.003  0.005  0.004  0.004  0.005  0.005  0.004  0.006  0.003
11/11   0.005  0.041  0.027  0.004  0.039  0.019  0.004  0.034  0.027
11/14   0.004  0.009  0.004  0.004  0.015  0.004  0.004  0.011  0.010
11/15   0.009  0.030  0.029  0.009  0.031  0.016  0.009  0.030  0.021
11/16   0.005  0.008  0.004  0.005  0.008  0.004  0.004  0.008  0.009
11/17   0.005  0.010  0.014  0.006  0.009  0.011  0.007  0.011  0.009
11/18   0.003  0.011  0.029  0.003  0.014  0.025  0.003  0.016  0.028
Mean    0.004  0.015  0.015  0.004  0.016  0.012  0.004  0.015  0.014
S.D.    0.002  0.011  0.010  0.002  0.010  0.007  0.002  0.009  0.009
Table 5. Comparison of the MAPE for DELTA’s stock price from 7 November 2022 to 18 November 2022 using the six-step model forecasting.
        Step 1               Step 2               Step 3
Date    STF    TF     LSTM   STF    TF     LSTM   STF    TF     LSTM
11/07   0.007  0.007  0.006  0.007  0.007  0.009  0.008  0.007  0.006
11/08   0.003  0.010  0.010  0.003  0.010  0.020  0.003  0.011  0.018
11/09   0.003  0.005  0.013  0.002  0.006  0.013  0.002  0.003  0.012
11/10   0.002  0.007  0.004  0.002  0.010  0.003  0.002  0.006  0.003
11/11   0.008  0.029  0.033  0.009  0.029  0.054  0.009  0.028  0.046
11/14   0.006  0.004  0.039  0.007  0.004  0.040  0.007  0.004  0.041
11/15   0.007  0.016  0.046  0.008  0.020  0.050  0.008  0.017  0.046
11/16   0.007  0.021  0.058  0.006  0.020  0.033  0.006  0.026  0.050
11/17   0.002  0.006  0.051  0.003  0.008  0.046  0.003  0.010  0.040
11/18   0.004  0.010  0.078  0.004  0.007  0.072  0.004  0.007  0.039
Mean    0.005  0.012  0.034  0.005  0.012  0.034  0.005  0.012  0.030
S.D.    0.002  0.008  0.024  0.002  0.008  0.021  0.003  0.009  0.017
        Step 4               Step 5               Step 6
Date    STF    TF     LSTM   STF    TF     LSTM   STF    TF     LSTM
11/07   0.009  0.008  0.011  0.009  0.009  0.011  0.009  0.008  0.010
11/08   0.003  0.010  0.014  0.003  0.009  0.021  0.003  0.009  0.018
11/09   0.002  0.004  0.016  0.002  0.004  0.018  0.002  0.004  0.011
11/10   0.003  0.006  0.018  0.003  0.005  0.002  0.003  0.003  0.002
11/11   0.009  0.025  0.038  0.009  0.023  0.049  0.010  0.022  0.043
11/14   0.007  0.005  0.038  0.007  0.005  0.053  0.007  0.005  0.044
11/15   0.010  0.022  0.047  0.010  0.024  0.044  0.012  0.024  0.048
11/16   0.006  0.027  0.037  0.005  0.030  0.054  0.005  0.028  0.042
11/17   0.003  0.007  0.050  0.004  0.008  0.056  0.004  0.007  0.041
11/18   0.003  0.007  0.057  0.003  0.010  0.040  0.002  0.010  0.050
Mean    0.005  0.012  0.031  0.006  0.013  0.035  0.006  0.012  0.031
S.D.    0.003  0.008  0.018  0.003  0.009  0.019  0.003  0.009  0.017
Table 6. Comparison of the MAPE for EMC’s stock price from 7 November 2022 to 18 November 2022 using the six-step model forecasting.
        Step 1               Step 2               Step 3
Date    STF    TF     LSTM   STF    TF     LSTM   STF    TF     LSTM
11/07   0.010  0.021  0.017  0.010  0.018  0.018  0.011  0.020  0.019
11/08   0.009  0.047  0.011  0.011  0.050  0.010  0.011  0.053  0.013
11/09   0.004  0.026  0.005  0.005  0.027  0.008  0.005  0.022  0.008
11/10   0.006  0.047  0.007  0.006  0.050  0.008  0.005  0.047  0.006
11/11   0.006  0.017  0.009  0.005  0.017  0.010  0.005  0.017  0.013
11/14   0.012  0.051  0.024  0.012  0.057  0.015  0.013  0.055  0.020
11/15   0.007  0.057  0.006  0.007  0.066  0.011  0.007  0.057  0.010
11/16   0.007  0.024  0.007  0.008  0.013  0.005  0.008  0.014  0.003
11/17   0.005  0.024  0.011  0.006  0.013  0.004  0.006  0.011  0.020
11/18   0.010  0.021  0.006  0.009  0.020  0.004  0.008  0.017  0.007
Mean    0.008  0.033  0.010  0.008  0.033  0.009  0.008  0.031  0.012
S.D.    0.002  0.014  0.006  0.003  0.019  0.004  0.003  0.018  0.006
        Step 4               Step 5               Step 6
Date    STF    TF     LSTM   STF    TF     LSTM   STF    TF     LSTM
11/07   0.012  0.023  0.011  0.012  0.030  0.020  0.013  0.026  0.023
11/08   0.012  0.055  0.015  0.013  0.046  0.015  0.014  0.051  0.014
11/09   0.005  0.025  0.005  0.005  0.024  0.004  0.004  0.027  0.005
11/10   0.006  0.050  0.007  0.006  0.051  0.006  0.006  0.051  0.007
11/11   0.004  0.015  0.010  0.004  0.016  0.007  0.004  0.014  0.007
11/14   0.013  0.046  0.010  0.013  0.049  0.021  0.013  0.046  0.011
11/15   0.008  0.056  0.005  0.009  0.057  0.004  0.009  0.056  0.005
11/16   0.007  0.017  0.006  0.007  0.018  0.005  0.007  0.018  0.009
11/17   0.006  0.014  0.006  0.006  0.017  0.012  0.006  0.013  0.003
11/18   0.008  0.017  0.006  0.008  0.018  0.004  0.007  0.014  0.007
Mean    0.008  0.032  0.008  0.008  0.032  0.010  0.008  0.031  0.009
S.D.    0.003  0.017  0.003  0.003  0.016  0.006  0.004  0.017  0.006
Table 7. Comparison of the MAPE for FCFC’s stock price from 7 November 2022 to 18 November 2022 using the six-step model forecasting.
        Step 1               Step 2               Step 3
Date    STF    TF     LSTM   STF    TF     LSTM   STF    TF     LSTM
11/07   0.001  0.004  0.009  0.001  0.003  0.007  0.001  0.003  0.006
11/08   0.001  0.001  0.007  0.002  0.002  0.011  0.002  0.002  0.014
11/09   0.002  0.004  0.009  0.002  0.003  0.008  0.002  0.006  0.009
11/10   0.001  0.004  0.006  0.001  0.004  0.005  0.002  0.006  0.006
11/11   0.002  0.012  0.005  0.003  0.011  0.007  0.003  0.013  0.007
11/14   0.002  0.010  0.002  0.002  0.011  0.004  0.002  0.014  0.004
11/15   0.004  0.005  0.007  0.005  0.007  0.005  0.005  0.004  0.005
11/16   0.003  0.008  0.010  0.004  0.006  0.019  0.005  0.006  0.004
11/17   0.002  0.004  0.005  0.002  0.004  0.007  0.002  0.004  0.003
11/18   0.002  0.005  0.010  0.002  0.005  0.006  0.002  0.005  0.005
Mean    0.002  0.006  0.007  0.002  0.006  0.008  0.003  0.006  0.006
S.D.    0.001  0.003  0.002  0.001  0.003  0.004  0.001  0.004  0.003
        Step 4               Step 5               Step 6
Date    STF    TF     LSTM   STF    TF     LSTM   STF    TF     LSTM
11/07   0.001  0.002  0.006  0.001  0.003  0.008  0.001  0.003  0.011
11/08   0.002  0.002  0.011  0.002  0.002  0.012  0.002  0.002  0.014
11/09   0.002  0.003  0.009  0.002  0.004  0.006  0.002  0.004  0.011
11/10   0.002  0.006  0.004  0.002  0.004  0.005  0.002  0.005  0.003
11/11   0.003  0.013  0.006  0.003  0.014  0.011  0.003  0.014  0.011
11/14   0.002  0.016  0.003  0.002  0.016  0.005  0.002  0.019  0.005
11/15   0.005  0.005  0.007  0.005  0.005  0.007  0.005  0.006  0.010
11/16   0.005  0.006  0.020  0.006  0.006  0.015  0.006  0.006  0.016
11/17   0.003  0.004  0.002  0.003  0.004  0.005  0.003  0.004  0.003
11/18   0.002  0.005  0.007  0.003  0.005  0.005  0.003  0.005  0.013
Mean    0.003  0.006  0.008  0.003  0.006  0.008  0.003  0.007  0.010
S.D.    0.001  0.005  0.005  0.002  0.005  0.003  0.002  0.005  0.004
Table 8. Comparison of the MAPE for YFH’s stock price from 7 November 2022 to 18 November 2022 using the six-step model forecasting.
        Step 1               Step 2               Step 3
Date    STF    TF     LSTM   STF    TF     LSTM   STF    TF     LSTM
11/07   0.002  0.006  0.009  0.002  0.007  0.010  0.001  0.007  0.008
11/08   0.002  0.002  0.010  0.002  0.002  0.010  0.002  0.002  0.013
11/09   0.002  0.004  0.011  0.002  0.005  0.011  0.002  0.005  0.007
11/10   0.002  0.006  0.002  0.002  0.005  0.002  0.002  0.005  0.002
11/11   0.002  0.008  0.018  0.003  0.007  0.011  0.003  0.008  0.014
11/14   0.004  0.012  0.032  0.003  0.010  0.025  0.004  0.010  0.020
11/15   0.002  0.012  0.005  0.002  0.011  0.005  0.003  0.013  0.003
11/16   0.003  0.010  0.004  0.003  0.011  0.011  0.004  0.012  0.006
11/17   0.003  0.004  0.005  0.003  0.004  0.006  0.003  0.004  0.005
11/18   0.002  0.006  0.006  0.003  0.007  0.006  0.003  0.007  0.007
Mean    0.002  0.007  0.010  0.003  0.007  0.010  0.003  0.007  0.009
S.D.    0.001  0.003  0.008  0.001  0.003  0.006  0.001  0.003  0.005
        Step 4               Step 5               Step 6
Date    STF    TF     LSTM   STF    TF     LSTM   STF    TF     LSTM
11/07   0.002  0.008  0.006  0.002  0.007  0.005  0.002  0.008  0.011
11/08   0.002  0.002  0.009  0.002  0.002  0.008  0.002  0.002  0.010
11/09   0.002  0.005  0.008  0.002  0.005  0.010  0.002  0.006  0.012
11/10   0.002  0.005  0.002  0.003  0.005  0.003  0.003  0.005  0.003
11/11   0.004  0.008  0.017  0.004  0.008  0.017  0.005  0.008  0.019
11/14   0.004  0.011  0.029  0.004  0.012  0.025  0.004  0.010  0.028
11/15   0.003  0.009  0.004  0.003  0.014  0.003  0.003  0.014  0.012
11/16   0.005  0.012  0.006  0.005  0.010  0.015  0.006  0.015  0.008
11/17   0.003  0.004  0.010  0.003  0.004  0.005  0.003  0.004  0.004
11/18   0.003  0.007  0.005  0.003  0.009  0.013  0.004  0.008  0.009
Mean    0.003  0.007  0.010  0.003  0.008  0.010  0.003  0.008  0.012
S.D.    0.001  0.003  0.008  0.001  0.004  0.007  0.001  0.004  0.007
Table 9. Comparison of the RMSE for TSMC’s stock price from 7 November 2022 to 18 November 2022 using the six-step model forecasting.
        Step 1               Step 2               Step 3
Date    STF    TF     LSTM   STF    TF     LSTM   STF    TF     LSTM
11/07   4.32   3.69   3.42   4.11   3.09   2.61   3.92   3.42   6.28
11/08   0.94   1.62   4.53   0.96   1.81   3.91   1.07   2.09   4.58
11/09   1.49   3.76   9.40   1.26   3.86   9.76   1.18   3.37   6.50
11/10   2.84   4.07   5.37   2.96   5.11   4.82   3.14   6.75   3.74
11/11   3.07   7.32   18.34  2.62   6.89   8.02   2.24   6.55   16.35
11/14   3.20   5.66   2.24   3.24   6.62   3.39   3.18   7.08   4.07
11/15   4.14   12.72  12.11  5.36   11.79  14.55  6.50   12.00  18.11
11/16   1.68   5.57   6.84   2.10   7.88   4.37   2.05   6.17   3.61
11/17   2.01   6.36   6.02   2.35   7.15   5.35   2.70   8.98   3.22
11/18   1.35   3.08   2.82   1.32   2.80   4.37   1.31   2.80   2.89
Mean    2.50   5.38   7.11   2.63   5.70   6.11   2.73   5.92   6.93
S.D.    1.12   2.92   4.72   1.30   2.83   3.48   1.55   2.93   5.28
        Step 4               Step 5               Step 6
Date    STF    TF     LSTM   STF    TF     LSTM   STF    TF     LSTM
11/07   3.77   2.78   4.96   3.74   2.93   1.63   3.82   3.00   5.76
11/08   1.21   1.50   5.03   1.33   1.26   4.92   1.50   1.32   4.10
11/09   1.36   3.50   10.05  1.60   3.45   11.20  1.97   4.40   15.42
11/10   3.27   5.10   2.66   3.42   5.22   4.66   3.50   6.00   5.42
11/11   2.09   6.03   13.20  2.25   6.29   9.59   2.38   6.40   7.54
11/14   3.21   4.21   3.09   3.26   5.07   4.25   3.02   5.90   3.15
11/15   7.43   11.01  11.46  8.38   14.38  11.59  9.12   12.53  10.26
11/16   2.18   7.05   3.81   2.23   8.15   5.06   2.22   7.14   5.76
11/17   3.01   10.47  2.70   3.38   10.54  4.69   3.67   11.41  5.05
11/18   1.30   3.36   2.57   1.33   2.83   2.80   1.32   3.54   3.60
Mean    2.88   5.50   5.95   3.09   6.01   6.04   3.25   6.16   6.61
S.D.    1.75   3.03   3.84   1.96   3.81   3.30   2.13   3.35   3.53
Table 10. Comparison of the RMSE for UMC’s stock price from 7 November 2022 to 18 November 2022 using the six-step model forecasting.
        Step 1               Step 2               Step 3
Date    STF    TF     LSTM   STF    TF     LSTM   STF    TF     LSTM
11/07   0.08   0.35   0.75   0.10   0.39   0.70   0.09   0.33   0.70
11/08   0.11   0.51   0.33   0.13   0.60   0.52   0.14   0.70   0.38
11/09   0.14   0.50   0.42   0.17   0.45   0.83   0.18   0.52   0.57
11/10   0.11   0.32   0.26   0.13   0.29   0.31   0.16   0.32   0.16
11/11   0.23   1.33   1.06   0.25   1.47   1.31   0.27   1.52   1.43
11/14   0.16   0.54   0.30   0.18   0.50   0.43   0.20   0.59   0.26
11/15   0.39   1.27   1.29   0.47   1.41   1.37   0.51   1.42   1.36
11/16   0.26   0.59   0.33   0.32   0.63   0.17   0.30   0.45   0.36
11/17   0.15   0.79   0.43   0.19   0.68   0.52   0.24   0.52   0.53
11/18   0.24   0.40   1.47   0.22   0.49   1.21   0.20   0.48   0.96
Mean    0.19   0.66   0.66   0.21   0.69   0.73   0.23   0.69   0.67
S.D.    0.09   0.34   0.43   0.10   0.39   0.41   0.11   0.41   0.42
        Step 4               Step 5               Step 6
Date    STF    TF     LSTM   STF    TF     LSTM   STF    TF     LSTM
11/07   0.11   0.35   0.79   0.10   0.37   0.68   0.10   0.37   0.89
11/08   0.15   0.71   0.31   0.15   0.81   0.20   0.14   0.86   0.24
11/09   0.18   0.45   0.77   0.18   0.50   0.78   0.16   0.51   0.48
11/10   0.18   0.27   0.21   0.19   0.26   0.24   0.21   0.30   0.16
11/11   0.26   1.85   1.61   0.23   1.76   1.17   0.20   1.62   1.42
11/14   0.22   0.49   0.23   0.23   0.79   0.22   0.24   0.53   0.54
11/15   0.56   1.46   1.41   0.60   1.48   0.88   0.65   1.45   1.10
11/16   0.27   0.56   0.18   0.26   0.54   0.23   0.23   0.53   0.43
11/17   0.27   0.55   0.73   0.31   0.50   0.56   0.36   0.60   0.46
11/18   0.17   0.54   1.35   0.16   0.66   1.20   0.14   0.75   1.31
Mean    0.24   0.72   0.76   0.24   0.77   0.62   0.24   0.75   0.70
S.D.    0.12   0.49   0.51   0.13   0.46   0.37   0.15   0.42   0.42
Table 11. Comparison of the RMSE for DELTA’s stock price from 7 November 2022 to 18 November 2022 using the six-step model forecasting.
        Step 1               Step 2               Step 3
Date    STF    TF     LSTM   STF    TF     LSTM   STF    TF     LSTM
11/07   1.91   2.10   1.83   2.09   2.15   2.52   2.25   2.18   1.78
11/08   1.04   2.80   2.71   1.18   2.90   5.55   1.21   2.98   4.99
11/09   0.80   1.52   3.72   0.71   1.69   3.59   0.66   1.03   3.34
11/10   0.52   2.20   1.10   0.62   2.98   1.12   0.76   1.73   1.00
11/11   2.74   8.59   9.57   3.13   8.54   15.33  2.96   8.25   13.00
11/14   1.95   1.56   11.24  2.02   1.40   11.52  2.10   1.29   11.67
11/15   2.11   5.38   13.52  2.42   6.38   14.54  2.67   5.66   13.42
11/16   2.09   6.15   16.83  1.96   6.07   9.65   1.93   7.78   14.41
11/17   0.87   2.04   14.80  1.04   2.78   13.36  1.11   3.37   11.75
11/18   1.40   3.18   22.68  1.42   2.58   20.94  1.39   2.47   11.59
Mean    1.54   3.55   9.80   1.66   3.75   9.81   1.70   3.67   8.70
S.D.    0.68   2.25   6.95   0.76   2.26   6.15   0.75   2.50   5.00
        Step 4               Step 5               Step 6
Date    STF    TF     LSTM   STF    TF     LSTM   STF    TF     LSTM
11/07   2.38   2.51   3.03   2.47   2.75   3.13   2.58   2.67   2.89
11/08   1.15   2.84   3.91   1.18   2.60   5.78   1.21   2.53   5.00
11/09   0.70   1.42   4.48   0.70   1.17   4.93   0.64   1.19   2.97
11/10   0.88   1.75   0.63   0.93   1.40   0.71   0.96   1.07   0.74
11/11   3.07   7.42   10.77  3.22   6.85   13.90  3.33   6.59   12.31
11/14   2.20   1.51   10.87  2.18   1.79   15.25  2.26   1.83   12.50
11/15   2.97   6.88   13.66  3.23   7.51   13.04  3.54   7.35   14.20
11/16   1.87   7.93   10.85  1.80   8.72   15.52  1.75   8.18   12.11
11/17   1.15   2.39   14.51  1.17   2.68   16.33  1.20   2.28   11.94
11/18   1.07   2.39   16.80  0.94   3.22   11.82  0.77   3.18   14.56
Mean    1.74   3.71   8.95   1.78   3.87   10.04  1.82   3.69   8.92
S.D.    0.83   2.48   5.24   0.90   2.61   5.50   1.00   2.51   5.07
Table 12. Comparison of the RMSE for EMC’s stock price from 7 November 2022 to 18 November 2022 using the six-step model forecasting.
        Step 1               Step 2               Step 3
Date    STF    TF     LSTM   STF    TF     LSTM   STF    TF     LSTM
11/07   1.56   3.29   2.99   1.62   2.91   3.22   1.81   3.21   3.39
11/08   1.76   7.33   2.13   2.05   7.73   2.02   2.19   7.73   2.28
11/09   0.75   4.05   0.76   0.81   4.10   1.41   0.82   3.52   1.28
11/10   1.51   7.17   1.22   1.50   7.61   1.48   1.36   7.24   1.24
11/11   1.05   3.47   1.61   1.00   3.43   2.08   0.98   3.62   2.43
11/14   2.25   8.03   3.79   2.33   8.91   2.58   2.43   8.70   3.10
11/15   1.28   8.90   1.20   1.25   10.20  1.90   1.23   8.91   1.74
11/16   1.40   3.88   1.46   1.44   2.16   1.00   1.50   2.36   0.67
11/17   0.96   3.77   1.91   1.04   2.06   0.72   1.07   1.91   3.14
11/18   1.73   3.42   1.00   1.62   3.21   0.72   1.52   2.88   1.26
Mean    1.42   5.33   1.81   1.47   5.23   1.71   1.49   5.01   2.05
S.D.    0.42   2.12   0.90   0.45   2.89   0.77   0.50   2.64   0.90
        Step 4               Step 5               Step 6
Date    STF    TF     LSTM   STF    TF     LSTM   STF    TF     LSTM
11/07   1.98   3.57   1.89   2.09   4.55   3.65   2.23   3.93   4.39
11/08   2.39   8.44   2.71   2.53   7.22   2.83   2.74   7.88   2.61
11/09   0.83   3.84   0.82   0.80   3.70   0.77   0.75   4.13   0.93
11/10   1.47   7.69   1.23   1.61   7.81   1.24   1.58   7.72   1.17
11/11   0.94   3.04   1.93   0.83   3.33   1.32   0.69   2.91   1.28
11/14   2.37   7.22   1.74   2.40   7.62   3.37   2.33   7.22   1.76
11/15   1.37   8.71   1.03   1.42   8.79   0.89   1.44   8.68   0.86
11/16   1.42   2.78   1.11   1.51   2.93   1.06   1.54   3.00   1.41
11/17   1.08   2.23   1.03   1.04   2.70   1.83   1.00   2.09   0.69
11/18   1.45   2.90   1.03   1.41   2.98   0.76   1.34   2.43   1.27
Mean    1.53   5.04   1.45   1.56   5.16   1.77   1.56   5.00   1.64
S.D.    0.52   2.49   0.56   0.58   2.28   1.05   0.65   2.44   1.05
Table 13. Comparison of the RMSE for FCFC’s stock price from 7 November 2022 to 18 November 2022 using the six-step model forecasting.
        Step 1               Step 2               Step 3
Date    STF    TF     LSTM   STF    TF     LSTM   STF    TF     LSTM
11/07   0.11   0.32   0.62   0.12   0.25   0.53   0.11   0.27   0.46
11/08   0.11   0.11   0.54   0.13   0.12   0.81   0.14   0.12   0.98
11/09   0.17   0.31   0.67   0.17   0.23   0.56   0.16   0.41   0.67
11/10   0.09   0.31   0.42   0.12   0.37   0.34   0.13   0.51   0.53
11/11   0.24   0.97   0.60   0.30   0.87   0.66   0.28   1.01   0.65
11/14   0.20   0.87   0.20   0.19   0.97   0.31   0.20   1.12   0.31
11/15   0.34   0.46   0.65   0.39   0.65   0.40   0.42   0.40   0.42
11/16   0.26   0.70   0.75   0.32   0.58   1.41   0.37   0.56   0.33
11/17   0.16   0.33   0.36   0.18   0.32   0.50   0.19   0.31   0.26
11/18   0.16   0.46   0.76   0.18   0.41   0.54   0.19   0.43   0.40
Mean    0.18   0.48   0.56   0.21   0.48   0.61   0.22   0.48   0.50
S.D.    0.07   0.26   0.17   0.09   0.27   0.30   0.10   0.27   0.21
        Step 4               Step 5               Step 6
Date    STF    TF     LSTM   STF    TF     LSTM   STF    TF     LSTM
11/07   0.11   0.21   0.44   0.10   0.23   0.58   0.11   0.24   0.76
11/08   0.14   0.13   0.78   0.15   0.16   0.85   0.15   0.13   0.98
11/09   0.16   0.22   0.64   0.16   0.31   0.45   0.17   0.28   0.75
11/10   0.15   0.48   0.27   0.17   0.35   0.47   0.17   0.42   0.25
11/11   0.26   1.01   0.48   0.24   1.09   0.91   0.23   1.10   0.93
11/14   0.21   1.30   0.26   0.19   1.25   0.45   0.22   1.46   0.42
11/15   0.45   0.45   0.59   0.47   0.45   0.63   0.49   0.48   0.83
11/16   0.41   0.51   1.52   0.45   0.54   1.14   0.49   0.52   1.17
11/17   0.21   0.39   0.18   0.22   0.38   0.41   0.22   0.35   0.22
11/18   0.21   0.39   0.68   0.23   0.43   0.49   0.24   0.39   1.08
Mean    0.23   0.52   0.59   0.24   0.51   0.64   0.25   0.52   0.74
S.D.    0.11   0.30   0.37   0.12   0.35   0.23   0.13   0.34   0.32
Table 14. Comparison of the RMSE for YFH’s stock price from 7 November 2022 to 18 November 2022 using the six-step model forecasting.
        Step 1               Step 2               Step 3
Date    STF    TF     LSTM   STF    TF     LSTM   STF    TF     LSTM
11/07   0.04   0.14   0.19   0.04   0.15   0.20   0.04   0.15   0.16
11/08   0.05   0.05   0.20   0.05   0.66   0.20   0.05   0.06   0.27
11/09   0.04   0.10   0.23   0.05   0.11   0.24   0.05   0.12   0.17
11/10   0.04   0.15   0.05   0.05   0.12   0.06   0.05   0.12   0.05
11/11   0.06   0.20   0.40   0.08   0.19   0.27   0.08   0.21   0.32
11/14   0.09   0.26   1.47   0.09   0.23   1.21   0.10   0.23   0.96
11/15   0.05   0.28   0.14   0.06   0.27   0.12   0.07   0.30   0.08
11/16   0.07   0.24   0.09   0.09   0.27   0.25   0.10   0.28   0.15
11/17   0.06   0.09   0.14   0.07   0.09   0.16   0.08   0.11   0.13
11/18   0.06   0.14   0.13   0.07   0.16   0.13   0.07   0.16   0.16
Mean    0.06   0.16   0.30   0.06   0.22   0.28   0.07   0.17   0.25
S.D.    0.01   0.07   0.40   0.02   0.16   0.32   0.02   0.08   0.25
        Step 4               Step 5               Step 6
Date    STF    TF     LSTM   STF    TF     LSTM   STF    TF     LSTM
11/07   0.04   0.16   0.14   0.04   0.14   0.11   0.04   0.16   0.22
11/08   0.05   0.05   0.18   0.05   0.05   0.17   0.05   0.05   0.21
11/09   0.05   0.11   0.17   0.05   0.12   0.21   0.06   0.13   0.26
11/10   0.06   0.13   0.05   0.07   0.13   0.06   0.07   0.12   0.08
11/11   0.09   0.22   0.38   0.10   0.20   0.38   0.11   0.20   0.42
11/14   0.10   0.25   1.35   0.11   0.26   1.20   0.12   0.23   1.31
11/15   0.07   0.24   0.10   0.08   0.32   0.06   0.08   0.33   0.27
11/16   0.12   0.28   0.15   0.12   0.25   0.34   0.13   0.35   0.19
11/17   0.08   0.10   0.24   0.08   0.11   0.13   0.08   0.10   0.11
11/18   0.07   0.18   0.12   0.08   0.20   0.28   0.08   0.18   0.20
Mean    0.07   0.17   0.29   0.08   0.18   0.29   0.08   0.18   0.33
S.D.    0.02   0.07   0.36   0.03   0.08   0.32   0.03   0.09   0.34
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Li, Y.-C.; Huang, H.-Y.; Yang, N.-P.; Kung, Y.-H. Stock Market Forecasting Based on Spatiotemporal Deep Learning. Entropy 2023, 25, 1326. https://doi.org/10.3390/e25091326
