Article

TransTLA: A Transfer Learning Approach with TCN-LSTM-Attention for Household Appliance Sales Forecasting in Small Towns

Department of Information and Communication Engineering, Tongji University, Shanghai 201804, China
* Author to whom correspondence should be addressed.
Appl. Sci. 2024, 14(15), 6611; https://doi.org/10.3390/app14156611
Submission received: 26 June 2024 / Revised: 19 July 2024 / Accepted: 27 July 2024 / Published: 28 July 2024
(This article belongs to the Special Issue Big Data: Analysis, Mining and Applications)

Abstract

Deep learning (DL) has been widely applied to forecast the sales volume of household appliances with high accuracy. In small towns, however, the limited amount of historical sales data makes accurate forecasting of household appliance sales difficult. To overcome this challenge, we propose a novel household appliance sales forecasting algorithm based on transfer learning, a temporal convolutional network (TCN), long short-term memory (LSTM), and an attention mechanism (called "TransTLA"). Firstly, we combine TCN and LSTM to exploit the spatiotemporal correlations in the sales data. Secondly, we apply an attention mechanism to emphasize the most informative features of the sales data. Finally, to mitigate the impact of data scarcity and regional differences, transfer learning is used to improve predictive performance in small towns by leveraging knowledge learned from megacity sales data. The experimental results show that the proposed TransTLA model significantly outperforms traditional forecasting methods in predicting small-town household appliance sales volumes. Specifically, TransTLA achieves an average mean absolute error (MAE) improvement of 27.60% over LSTM, 9.23% over convolutional neural networks (CNN), and 11.00% over the CNN-LSTM-Attention model across one- to four-step-ahead predictions. This study addresses the data scarcity problem in small-town sales forecasting, helping businesses improve inventory management, enhance customer satisfaction, and contribute to a more efficient supply chain, benefiting the overall economy.
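To make the architecture described above concrete, the following is a minimal PyTorch sketch of a TCN-LSTM-Attention forecaster with a transfer-learning fine-tuning step. It is an illustrative reconstruction, not the authors' implementation: the class name TransTLASketch, the layer sizes, the use of padding="same" convolutions in place of a full causal TCN residual block, and the freeze-and-fine-tune strategy are all assumptions.

import torch
import torch.nn as nn

class TransTLASketch(nn.Module):
    # Illustrative TCN-LSTM-Attention forecaster; names, sizes, and structure are assumptions.
    def __init__(self, n_features=1, tcn_channels=32, lstm_hidden=64, horizon=1):
        super().__init__()
        # Dilated convolutions stand in for a full causal TCN residual block.
        self.tcn = nn.Sequential(
            nn.Conv1d(n_features, tcn_channels, kernel_size=3, dilation=1, padding="same"),
            nn.ReLU(),
            nn.Conv1d(tcn_channels, tcn_channels, kernel_size=3, dilation=2, padding="same"),
            nn.ReLU(),
        )
        self.lstm = nn.LSTM(tcn_channels, lstm_hidden, batch_first=True)
        self.attn = nn.Linear(lstm_hidden, 1)      # additive attention scores over time steps
        self.head = nn.Linear(lstm_hidden, horizon)

    def forward(self, x):                           # x: (batch, time, features)
        h = self.tcn(x.transpose(1, 2)).transpose(1, 2)   # (batch, time, tcn_channels)
        out, _ = self.lstm(h)                              # (batch, time, lstm_hidden)
        weights = torch.softmax(self.attn(out), dim=1)     # attention weights over time
        context = (weights * out).sum(dim=1)               # weighted summary of the sequence
        return self.head(context)                          # multi-step-ahead forecast

# Transfer-learning sketch: pretrain on megacity sales series, then fine-tune on the
# small-town data, optionally freezing the TCN feature extractor (an assumed strategy).
model = TransTLASketch(horizon=4)
# ... pretrain on megacity data here ...
for p in model.tcn.parameters():
    p.requires_grad = False                         # keep pretrained temporal features fixed
# ... continue training on the small-town data here ...

In this sketch the attention layer produces a single weighted summary of all time steps before the forecasting head; the paper's exact attention formulation and fine-tuning schedule may differ.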
Keywords: household appliance sales forecasting; transfer learning; temporal convolutional network; long short-term memory; attention mechanism

Share and Cite

MDPI and ACS Style

Huang, Z.; Liu, J. TransTLA: A Transfer Learning Approach with TCN-LSTM-Attention for Household Appliance Sales Forecasting in Small Towns. Appl. Sci. 2024, 14, 6611. https://doi.org/10.3390/app14156611


Note that from the first issue of 2016, this journal uses article numbers instead of page numbers.
