Machine Learning and Deep Learning Methods for Time Series Analysis and Forecasting

A special issue of Electronics (ISSN 2079-9292). This special issue belongs to the section "Computer Science & Engineering".

Deadline for manuscript submissions: closed (15 October 2022) | Viewed by 16948

Special Issue Editor


Dr. Dionysios Kehagias
Guest Editor
Center For Research And Technology-Hellas / Information Technologies Institute, 54636 Thessaloniki, Greece
Interests: machine learning and artificial intelligence; data analytics; time series analysis and forecasting; crowdsourcing; intelligent transport systems; cloud/edge/fog computing; software security; energy-efficient software systems

Special Issue Information

Dear Colleagues, 

Time series data represent the change and evolution of phenomena encountered in several domains, ranging from finance, transportation, energy, and manufacturing to healthcare, medicine, and retail. Nowadays, the abundance of big time series data has generated great interest in the development of effective and efficient methods for a variety of time series analysis tasks, including anomaly detection, time series decomposition, time series segmentation, trend analysis, time series classification and clustering, time series storage, time series visualization, and time series forecasting. Of particular interest is the use of current state-of-the-art machine learning and deep learning approaches for effectively addressing these tasks. The aim of this Special Issue is to bring together academics and practitioners to exchange and discuss the latest innovations and applications of machine learning and deep learning methods for time series analysis and forecasting tasks. Papers addressing, but not limited to, the following topics will be considered for publication:

  • Anomaly/outlier detection on time series
  • Correlation and causation among time series
  • Time series decomposition methods
  • Trend analysis in time series
  • Time series segmentation
  • Time series classification
  • Time series clustering
  • Time series storage
  • Time series visualization
  • Automatic feature extraction from time series data
  • Univariate time series forecasting
  • Multivariate time series forecasting
  • Deployment of time series forecasting models in production
  • Efficiency issues on time series forecasting models

Technical Program Committee Member
Mr. Athanasios Salamanis, Center For Research And Technology-Hellas / Information Technologies Institute

Dr. Dionysios Kehagias
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once registered, you can proceed to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Electronics is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2400 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • machine learning
  • deep learning
  • time series analysis
  • time series classification
  • time series clustering
  • time series decomposition
  • time series segmentation
  • time series storage
  • time series forecasting

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (7 papers)


Research

15 pages, 4141 KiB  
Article
n-Dimensional Chaotic Time Series Prediction Method
by Fang Liu, Baohui Yin, Mowen Cheng and Yongxin Feng
Electronics 2023, 12(1), 160; https://doi.org/10.3390/electronics12010160 - 29 Dec 2022
Cited by 3 | Viewed by 1491
Abstract
Chaotic time series arise in many areas of production and everyday life, so their prediction has considerable practical value. However, owing to the characteristics of chaotic time series, such as internal randomness, nonlinearity, and long-term unpredictability, most prediction methods cannot achieve high-precision intermediate- or long-term predictions. To address this problem, an intermediate and long-term prediction (ILTP) method for n-dimensional chaotic time series is proposed. First, the order of the model is determined by optimizing the preprocessing and constructing a joint calculation strategy, so that the observation sequence can be decomposed and reorganized accurately. An RBF neural network with a feedback recursion mechanism is then introduced to construct a multi-step prediction model of future sequences. Compared with existing prediction methods, the error of the ILTP method can be reduced by 1–6 orders of magnitude, and the prediction horizon can be extended by 10–20 steps. The ILTP method can serve as a reference technique for applications involving time series with chaotic characteristics.

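The paper's exact ILTP pipeline is not reproduced here. As a minimal sketch of its core ingredients, assuming numpy and hypothetical helper names, the following combines a sliding-window framing, an RBF network fitted by least squares, and the feedback recursion that turns a one-step model into a multi-step forecaster:

```python
import numpy as np

def make_windows(series, order):
    """Slide a window of length `order` over the series to build (X, y) pairs."""
    X = np.array([series[i:i + order] for i in range(len(series) - order)])
    y = series[order:]
    return X, y

def fit_rbf(X, y, centers=20, gamma=1.0, seed=0):
    """Fit an RBF network: centers sampled from the data, linear readout via least squares."""
    rng = np.random.default_rng(seed)
    C = X[rng.choice(len(X), size=min(centers, len(X)), replace=False)]
    Phi = np.exp(-gamma * ((X[:, None, :] - C[None, :, :]) ** 2).sum(-1))
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return C, w, gamma

def predict_recursive(model, history, steps):
    """Feed each one-step prediction back as input to forecast `steps` ahead."""
    C, w, gamma = model
    window = list(history[-C.shape[1]:])
    out = []
    for _ in range(steps):
        x = np.array(window)
        phi = np.exp(-gamma * ((x - C) ** 2).sum(-1))
        yhat = float(phi @ w)
        out.append(yhat)
        window = window[1:] + [yhat]
    return np.array(out)

# demo on a smooth signal; a chaotic series (e.g. a logistic map) would be framed the same way
t = np.linspace(0, 20 * np.pi, 2000)
series = np.sin(t)
X, y = make_windows(series, order=8)
model = fit_rbf(X, y, centers=50, gamma=5.0)
forecast = predict_recursive(model, series, steps=10)
```

The feedback recursion is where long-horizon error compounds, which is why the paper's reported gains in prediction step count are notable.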
15 pages, 5667 KiB  
Article
A Prediction Method with Data Leakage Suppression for Time Series
by Fang Liu, Lizhi Chen, Yuanfang Zheng and Yongxin Feng
Electronics 2022, 11(22), 3701; https://doi.org/10.3390/electronics11223701 - 11 Nov 2022
Cited by 3 | Viewed by 1638
Abstract
Collected time series are typically noisy, non-stationary, and nonlinear. Most current methods smooth or denoise the whole time series in one pass and only then divide it into training and testing sets, which means information from the testing set is used during training, resulting in data leakage and related problems. To reduce the impact of noise on time series prediction and to prevent data leakage, a prediction method with data leakage suppression for time series (DLS) is proposed. The method applies variational mode decomposition to the time series multiple times over overlapping slices and improves the noise-reduction threshold function used to denoise the decomposed components. Furthermore, the idea of deep learning is introduced to establish a neural network multi-step prediction model, so as to improve forecasting performance. Experiments on several datasets show that, compared with common multi-step prediction methods, the proposed method achieves better predictions and lower error, which verifies its superiority.

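The leakage problem the DLS paper targets is easy to state in code. The paper uses variational mode decomposition with an improved threshold function; as a simplified sketch of the ordering principle only (split first, fit any preprocessing statistics on the training slice alone), with standardization standing in for the denoising step:

```python
import numpy as np

def leaky_pipeline(series, split):
    """Anti-pattern: statistics computed on the FULL series leak test information."""
    mu, sigma = series.mean(), series.std()
    scaled = (series - mu) / sigma
    return scaled[:split], scaled[split:]

def safe_pipeline(series, split):
    """Leakage-safe: split first, fit the preprocessing on the training slice only."""
    train, test = series[:split], series[split:]
    mu, sigma = train.mean(), train.std()
    return (train - mu) / sigma, (test - mu) / sigma

rng = np.random.default_rng(1)
series = np.cumsum(rng.normal(size=300))  # non-stationary random walk
tr_leaky, te_leaky = leaky_pipeline(series, 200)
tr_safe, te_safe = safe_pipeline(series, 200)
# typically differs on a non-stationary series: the leaky version saw test-set statistics
same = np.allclose(tr_leaky, tr_safe)
```

The same ordering applies to any fitted transform (denoising, decomposition, scaling): the transform must never see samples that will later be used for evaluation.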
20 pages, 486 KiB  
Article
LSTM-Based Deep Learning Models for Long-Term Tourism Demand Forecasting
by Athanasios Salamanis, Georgia Xanthopoulou, Dionysios Kehagias and Dimitrios Tzovaras
Electronics 2022, 11(22), 3681; https://doi.org/10.3390/electronics11223681 - 10 Nov 2022
Cited by 9 | Viewed by 2349
Abstract
Tourism demand forecasting is an important task within the overall tourism demand management process, since it enables informed decision making that may increase revenue for hotels. In recent years, the extensive availability of big data in tourism has allowed for the development of novel approaches based on deep learning techniques. However, most proposed approaches focus on short-term tourism demand forecasting, which is only one part of the problem; moreover, most proposed models do not integrate exogenous data that could improve forecasting accuracy. Driven by these problems, this paper introduces a deep learning-based approach for long-term tourism demand forecasting. The proposed forecasting models are based on the long short-term memory (LSTM) network, which is capable of incorporating exogenous variables. Two models were implemented: one using only historical hotel booking data, and another combining booking data with weather data. The aim of the proposed models is to facilitate the management of a hotel unit by leveraging their ability to both integrate exogenous data and generate long-term predictions. The models were evaluated on real data from three hotels in Greece. The evaluation results demonstrate the superior forecasting performance of the proposed models compared with well-known state-of-the-art approaches for all three hotels. By additionally benchmarking forecasting models with and without weather-related parameters, we conclude that exogenous variables have a noticeable influence on the forecasting accuracy of deep learning models.

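Incorporating exogenous variables into an LSTM usually comes down to how the data are framed: past target values and past exogenous values are stacked per timestep into a (samples, timesteps, features) tensor. A minimal sketch of that framing, with hypothetical booking and weather signals standing in for the paper's hotel data:

```python
import numpy as np

def to_supervised(target, exog, lookback, horizon):
    """Frame (target, exogenous) series as LSTM-style samples:
    X has shape (samples, lookback, 1 + n_exog), y has shape (samples, horizon)."""
    feats = np.column_stack([target, exog])          # (T, 1 + n_exog)
    X, y = [], []
    for i in range(len(target) - lookback - horizon + 1):
        X.append(feats[i:i + lookback])
        y.append(target[i + lookback:i + lookback + horizon])
    return np.array(X), np.array(y)

T = 120
bookings = np.sin(np.linspace(0, 8 * np.pi, T)) + 1.5            # hypothetical occupancy signal
weather = np.column_stack([np.cos(np.linspace(0, 8 * np.pi, T)),  # e.g. temperature
                           np.random.default_rng(0).random(T)])   # e.g. rainfall
X, y = to_supervised(bookings, weather, lookback=14, horizon=7)
print(X.shape, y.shape)  # (100, 14, 3) (100, 7)
```

A tensor of this shape can be fed directly to a recurrent layer; the model with exogenous inputs differs from the booking-only one simply by the last axis having 3 features instead of 1.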
19 pages, 6994 KiB  
Article
Comparative Performance Analysis of Vibration Prediction Using RNN Techniques
by Ju-Hyung Lee and Jun-Ki Hong
Electronics 2022, 11(21), 3619; https://doi.org/10.3390/electronics11213619 - 6 Nov 2022
Cited by 7 | Viewed by 1977
Abstract
Drones are increasingly used in several industries, including rescue, firefighting, and agriculture. If the motor connected to a drone's propeller is damaged, there is a risk of a crash. To prevent such incidents, an accurate and fast tool for predicting motor vibrations in drones is required. In this study, normal and abnormal vibration data were collected from the motor connected to a drone's propeller. The period and amplitude of normal vibrations are consistent, whereas abnormal vibrations are irregular. The collected vibration data were used to train six recurrent neural network (RNN) techniques: long short-term memory (LSTM), attention-LSTM (Attn.-LSTM), bidirectional LSTM (Bi-LSTM), gated recurrent unit (GRU), attention-GRU (Attn.-GRU), and bidirectional GRU (Bi-GRU). The simulation runtime of each technique and the accuracy of the predicted vibrations were then analyzed to compare the performance of the RNN models. Based on the simulation results, the Attn.-LSTM and Attn.-GRU techniques, which incorporate the attention mechanism, were the most efficient compared to the conventional LSTM and GRU techniques, respectively. The attention mechanism computes the similarity between the input values and the value to be predicted in advance and reflects that similarity in the prediction.

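The similarity weighting described in the abstract is, in its most common form, scaled dot-product attention over the RNN's hidden states. A minimal numpy sketch (the paper's exact architecture and dimensions are not specified here; the shapes below are illustrative):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_context(hidden_states, query):
    """Scaled dot-product attention: score each past hidden state against the
    query (here, the last state), then return the weighted context vector."""
    d = query.shape[-1]
    scores = hidden_states @ query / np.sqrt(d)   # (T,) similarity scores
    weights = softmax(scores)                     # normalized to sum to 1
    context = weights @ hidden_states             # (d,) weighted summary
    return context, weights

rng = np.random.default_rng(0)
H = rng.normal(size=(10, 16))   # 10 timesteps of 16-dim RNN hidden states
q = H[-1]                       # use the last hidden state as the query
context, weights = attention_context(H, q)
```

The context vector lets the predictor draw on whichever past timesteps resemble the current state, rather than relying only on the final hidden state, which matches the reported advantage of Attn.-LSTM and Attn.-GRU over their plain counterparts.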
25 pages, 1264 KiB  
Article
Time Series Forecasting of Software Vulnerabilities Using Statistical and Deep Learning Models
by Ilias Kalouptsoglou, Dimitrios Tsoukalas, Miltiadis Siavvas, Dionysios Kehagias, Alexander Chatzigeorgiou and Apostolos Ampatzoglou
Electronics 2022, 11(18), 2820; https://doi.org/10.3390/electronics11182820 - 7 Sep 2022
Cited by 3 | Viewed by 2501
Abstract
Software security is a critical aspect of modern software products. The vulnerabilities that reside in their source code can become a major weakness for enterprises that build or use these products, as their exploitation could lead to devastating financial consequences. The development of mechanisms capable of identifying and discovering software vulnerabilities has therefore attracted the interest of the research community. Besides studies that examine software attributes in order to predict the existence of vulnerabilities in software components, other studies attempt to predict the future number of vulnerabilities based on a project's already reported vulnerabilities. In this paper, the evolution of vulnerabilities over a horizon of up to 24 months ahead is predicted using a univariate time series forecasting approach. Both statistical and deep learning models are developed and compared on security data from five popular software projects. In contrast to the related literature, the results indicate that the capacity of deep learning and statistical models to forecast the evolution of software vulnerabilities, as well as the selection of the best-performing model, depends on the respective software project. In some cases statistical models provided better accuracy, whereas in others deep learning models demonstrated better predictive power; however, the difference in their performance was not found to be statistically significant. In general, the two model categories produced similar forecasts for the number of vulnerabilities expected in the future, without significant differences.

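Comparing statistical and deep learning forecasters over a 24-step horizon hinges on the evaluation protocol: at each forecast origin the model may use only past data. As a simplified sketch of such a walk-forward comparison, with two classical baselines (naive and drift) standing in for the paper's ARIMA-style and deep learning models, and synthetic counts standing in for real vulnerability data:

```python
import numpy as np

def naive_forecast(history, horizon):
    """Repeat the last observed value for every future step."""
    return np.full(horizon, history[-1], dtype=float)

def drift_forecast(history, horizon):
    """Extend the average historical slope (the classical 'drift' method)."""
    slope = (history[-1] - history[0]) / (len(history) - 1)
    return history[-1] + slope * np.arange(1, horizon + 1)

def walk_forward_mae(series, model, start, horizon):
    """At each origin t, forecast `horizon` steps using only series[:t], then score."""
    errors = []
    for t in range(start, len(series) - horizon):
        pred = model(series[:t], horizon)
        errors.append(np.abs(pred - series[t:t + horizon]).mean())
    return float(np.mean(errors))

# hypothetical cumulative vulnerability counts: a noisy upward trend
rng = np.random.default_rng(2)
series = np.cumsum(rng.poisson(3, size=120)).astype(float)
mae_naive = walk_forward_mae(series, naive_forecast, start=60, horizon=24)
mae_drift = walk_forward_mae(series, drift_forecast, start=60, horizon=24)
```

On a trending cumulative series the drift baseline easily beats the naive one, which illustrates why model ranking can flip from project to project: it depends on how strongly each project's vulnerability curve trends.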
13 pages, 613 KiB  
Article
Traffic Forecasting Based on Integration of Adaptive Subgraph Reformulation and Spatio-Temporal Deep Learning Model
by Shi-Yuan Han, Qi-Wei Sun, Qiang Zhao, Rui-Zhi Han and Yue-Hui Chen
Electronics 2022, 11(6), 861; https://doi.org/10.3390/electronics11060861 - 9 Mar 2022
Cited by 3 | Viewed by 2247
Abstract
Traffic forecasting provides foundational guidance for many typical applications in smart city management, such as urban traffic control, congestion avoidance, and navigation guidance. Many researchers have focused on spatio-temporal correlations under a fixed topology of the traffic network to improve forecasting accuracy. Despite their advantages, existing approaches do not fully account for the fact that the association relationships among traffic network nodes are not invariant under different traffic conditions. In this paper, a novel traffic forecasting framework is proposed that integrates the dynamic association of traffic nodes with a spatio-temporal deep learning model. Specifically, an adaptive subgraph reformulation algorithm is first designed, based on the specific forecasting interval, to reduce the interference of irrelevant spatio-temporal information. Then, by enhancing the attention mechanism with a generative decoder, a spatio-temporal deep learning model requiring only one forward operation is proposed to avoid accuracy degradation in long-term prediction; the spatio-temporal information and external factors (such as weather and holidays) are fused together into the input vector. Based on the reformulated subgraph, constructed of traffic nodes with closer spatio-temporal correlation, experiments show that the proposed framework consistently outperforms other graph neural network (GNN)-based state-of-the-art baselines for various forecasting intervals on a real-world dataset.

24 pages, 6450 KiB  
Article
Time Series Network Data Enabling Distributed Intelligence—A Holistic IoT Security Platform Solution
by Aikaterini Protogerou, Evangelos V. Kopsacheilis, Asterios Mpatziakas, Kostas Papachristou, Traianos Ioannis Theodorou, Stavros Papadopoulos, Anastasios Drosou and Dimitrios Tzovaras
Electronics 2022, 11(4), 529; https://doi.org/10.3390/electronics11040529 - 10 Feb 2022
Cited by 5 | Viewed by 2647
Abstract
The Internet of Things (IoT) encompasses multiple fast-emerging technologies controlling and connecting millions of new devices every day in several application domains. The increasing number of interconnected IoT devices, their limited computational power, and the evolving sophistication of cyber security threats result in increased security challenges for the IoT ecosystem. The diversity of IoT devices and the variety of QoS requirements across IoT application domains impose considerable challenges in designing and implementing a robust IoT security solution. The aim of this paper is to present an efficient, robust, and easy-to-use system for IoT cyber security operators. Following a security-by-design approach, the proposed system is a platform comprising four distinct yet cooperating components: a distributed AI-enhanced mechanism for detecting potential threats and anomalies; AI-based generation of effective mitigation strategies according to the severity of detected threats; a system for verifying SDN routing decisions along with network- and resource-related policies; and comprehensive, intuitive security status visualization and analysis. The distributed anomaly detection scheme, implemented by multiple AI-powered agents deployed across the IoT network nodes, efficiently monitors the entire network infrastructure. Network traffic data are fed to the AI agents, which process consecutive traffic samples in a time series manner: consecutive time windows framing the traffic of the surrounding nodes are processed by a graph neural network algorithm. Any detected anomalies are handled by a mitigation engine employing a distributed neural network algorithm, which exploits the recorded anomalous events and deploys appropriate responses for optimal threat mitigation. The implemented platform also includes a hypothesis testing module and a multi-objective optimization tool for quick verification of routing decisions, and it incorporates visualization and analytics functionality with a customizable user interface.
