Article

Revolutionizing Time Series Data Preprocessing with a Novel Cycling Layer in Self-Attention Mechanisms †

Jiyan Chen and Zijiang Yang
School of Information Technology, York University, North York, ON M3J 1P3, Canada
* Author to whom correspondence should be addressed.
† This work is an extended version of a paper published in the proceedings of the 2024 16th International Conference on Intelligent Human Machine Systems and Cybernetics (IHMSC 2024), held in Hangzhou, China, 24–25 August 2024.
Appl. Sci. 2024, 14(19), 8922; https://doi.org/10.3390/app14198922
Submission received: 31 August 2024 / Revised: 28 September 2024 / Accepted: 30 September 2024 / Published: 3 October 2024

Abstract

This paper introduces an innovative method for enhancing time series data preprocessing by integrating a cycling layer into a self-attention mechanism. Traditional approaches often fail to capture the cyclical patterns inherent in time series data, which limits the accuracy of predictive models. The proposed method improves a model's ability to identify and exploit these cyclical patterns, as demonstrated on the Jena Climate dataset from the Max Planck Institute for Biogeochemistry. Empirical results show that the proposed method improves forecast accuracy and speeds up model fitting relative to conventional techniques. The paper contributes to the field of time series analysis by providing a more effective preprocessing approach.
Keywords: information technology; machine learning; autoencoder; data science; data preprocessing; attention model; unsupervised learning; weather prediction; regression model
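As a rough illustration of the idea described in the abstract, the sketch below shows one plausible way a "cycling layer" could expose assumed periodic structure (for example, daily and yearly cycles in weather data) as sine/cosine features feeding a self-attention block. This is a minimal TensorFlow/Keras sketch under those assumptions; the CyclingLayer class, the chosen periods, and all hyperparameters are illustrative and are not taken from the authors' implementation.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers


class CyclingLayer(layers.Layer):
    """Hypothetical cycling layer: maps a time index to sine/cosine pairs
    for each assumed period (illustrative, not the authors' design)."""

    def __init__(self, periods, **kwargs):
        super().__init__(**kwargs)
        self.periods = periods  # e.g., [144, 144 * 365] for 10-minute samples

    def call(self, t):
        # t: (batch, seq_len, 1) time index of each observation
        feats = []
        for p in self.periods:
            angle = 2.0 * np.pi * tf.cast(t, tf.float32) / p
            feats.extend([tf.sin(angle), tf.cos(angle)])
        return tf.concat(feats, axis=-1)  # (batch, seq_len, 2 * len(periods))


def build_model(seq_len, n_features, periods=(144, 144 * 365)):
    values = layers.Input(shape=(seq_len, n_features), name="values")
    t_idx = layers.Input(shape=(seq_len, 1), name="time_index")

    # Concatenate raw measurements with the cyclical encodings, then apply
    # a single self-attention block over the time dimension.
    cyc = CyclingLayer(list(periods))(t_idx)
    x = layers.Concatenate(axis=-1)([values, cyc])
    x = layers.Dense(64)(x)
    attn = layers.MultiHeadAttention(num_heads=4, key_dim=16)(x, x)
    x = layers.LayerNormalization()(x + attn)
    x = layers.GlobalAveragePooling1D()(x)
    out = layers.Dense(1, name="forecast")(x)  # e.g., next-step temperature
    return tf.keras.Model(inputs=[values, t_idx], outputs=out)


model = build_model(seq_len=120, n_features=14)  # Jena Climate has 14 channels
model.compile(optimizer="adam", loss="mse")
model.summary()
```

Encoding each assumed period as a sine/cosine pair keeps the representation continuous across cycle boundaries (e.g., 23:50 and 00:00 map to nearby points), which is the kind of cyclical structure the abstract argues conventional preprocessing fails to capture.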

Share and Cite

MDPI and ACS Style

Chen, J.; Yang, Z. Revolutionizing Time Series Data Preprocessing with a Novel Cycling Layer in Self-Attention Mechanisms. Appl. Sci. 2024, 14, 8922. https://doi.org/10.3390/app14198922

AMA Style

Chen J, Yang Z. Revolutionizing Time Series Data Preprocessing with a Novel Cycling Layer in Self-Attention Mechanisms. Applied Sciences. 2024; 14(19):8922. https://doi.org/10.3390/app14198922

Chicago/Turabian Style

Chen, Jiyan, and Zijiang Yang. 2024. "Revolutionizing Time Series Data Preprocessing with a Novel Cycling Layer in Self-Attention Mechanisms" Applied Sciences 14, no. 19: 8922. https://doi.org/10.3390/app14198922
