Dynamic Adaptive Artificial Hummingbird Algorithm-Enhanced Deep Learning Framework for Accurate Transmission Line Temperature Prediction
Abstract
1. Introduction
- (1) We propose an innovative enhancement to the traditional artificial hummingbird algorithm (AHA), introducing the Dynamic Adaptive Artificial Hummingbird Algorithm (DA-AHA). By incorporating dynamic step size and inertia weight adjustments, a greedy local search mechanism, an elite retention strategy, and a grouped parallel search mechanism, we significantly enhance both the global exploration and local exploitation capabilities of the algorithm. These improvements address the tendency of traditional optimization algorithms to become trapped in local optima on high-dimensional, complex problems, thereby accelerating convergence and improving optimization accuracy.
- (2) For the first time, the DA-AHA is integrated with a deep learning model to construct the DA-AHA-CNN-LSTM-TPA (DA-AHA-CLT) model, which combines convolutional neural networks (CNNs), long short-term memory (LSTM) networks, and the temporal pattern attention (TPA) mechanism. With global hyperparameter optimization performed by the DA-AHA, the model delivers strong performance in full-time-step transmission line temperature prediction, significantly improving both prediction accuracy and stability.
- (3) The proposed DA-AHA-CLT model effectively captures the short-term fluctuations, long-term trends, and key time-period characteristics of transmission line temperatures. It excels in full-time-step prediction, achieving markedly better fitting ability and prediction accuracy than models that pair traditional methods with other optimization algorithms.
2. Modules and Algorithms
2.1. Implementation of the LSTM Method in the Proposed Method
2.2. Temporal Pattern Attention Mechanisms
2.3. CNN
2.4. Dynamic Adaptive Artificial Hummingbird Algorithm (DA-AHA)
- Dynamic step size and inertia weight adjustment: the step size and inertia weight are updated dynamically, giving hummingbirds a wide search range during early global exploration and gradually narrowing toward local search in later iterations, which balances exploration and exploitation throughout the search.
- Greedy local search mechanism: after each foraging step, the current solution is compared with finely perturbed neighboring positions and the better one is greedily retained, effectively improving the convergence accuracy and local exploitation ability of the algorithm.
- Elite retention mechanism: in each iteration, the current global best solution is saved and carried into the next generation, preventing solution degradation, stabilizing the best-so-far result, and further improving the global search performance.
- Grouped parallel search mechanism: the hummingbird population is divided into multiple groups that search independently in local regions, with individuals in each group converging toward that group's best solution. This mechanism widens coverage of the search space and improves both solution diversity and overall efficiency.
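Taken together, the four mechanisms above can be sketched as a compact optimization loop. The following is a minimal illustrative sketch on a toy sphere function, not the authors' implementation; the guided-move formula, the linear decay schedules, and the group sizes are assumptions made purely for demonstration.

```python
import numpy as np

def da_aha_sketch(fitness, dim=4, pop=12, groups=3, max_iter=100,
                  lb=-5.0, ub=5.0, seed=0):
    """Toy sketch of the DA-AHA mechanisms: dynamic step size / inertia
    weight, greedy local search, elite retention, grouped parallel search."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(lb, ub, (pop, dim))                 # hummingbird positions
    F = np.array([fitness(x) for x in X])
    elite_x, elite_f = X[F.argmin()].copy(), F.min()    # elite retention
    idx = np.array_split(np.arange(pop), groups)        # grouped search

    for t in range(max_iter):
        # dynamic inertia weight and step size: large early, small late
        w = 0.9 - 0.5 * t / max_iter
        step = (ub - lb) * (1.0 - t / max_iter)
        for g in idx:
            g_best = X[g[F[g].argmin()]]                # group-local attractor
            for i in g:
                # guided move toward the group's best plus a random step
                cand = w * X[i] + (1 - w) * g_best + \
                       0.1 * step * rng.standard_normal(dim)
                cand = np.clip(cand, lb, ub)
                # greedy local search: keep a perturbed neighbor if better
                neigh = np.clip(cand + 0.01 * step * rng.standard_normal(dim),
                                lb, ub)
                cand = neigh if fitness(neigh) < fitness(cand) else cand
                if fitness(cand) < F[i]:                # greedy acceptance
                    X[i], F[i] = cand, fitness(cand)
        if F.min() < elite_f:                           # elite retention
            elite_x, elite_f = X[F.argmin()].copy(), F.min()
        else:                                           # re-inject the elite
            worst = F.argmax()
            X[worst], F[worst] = elite_x.copy(), elite_f
    return elite_x, elite_f

sphere = lambda x: float(np.sum(x**2))
best_x, best_f = da_aha_sketch(sphere)
```

The inertia weight decays linearly so early iterations explore widely while later ones refine, and the elite is re-injected whenever the population fails to improve on it.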
2.4.1. Initialization Phase
2.4.2. Dynamic Step Size and Inertia Weight Adjustment
2.4.3. Foraging Strategy Selection
2.4.4. Greedy Local Search Mechanism
2.4.5. Elite Retention Mechanisms
2.4.6. Grouped Parallel Search Mechanisms
2.4.7. Termination Conditions
2.5. Whale Optimization Algorithm
2.6. Northern Goshawk Optimization
2.7. Particle Swarm Optimization
2.8. Transmission Line Temperature Prediction Model
3. Experimental Section
3.1. Data Processing
3.2. Target Functions and Evaluation Indicators
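RMSE and MAE, the indicators reported in the results tables, together with the coefficient of determination R², follow their standard definitions. A minimal NumPy sketch, using a few line-temperature values from the data table as a hypothetical example (the predicted values are invented for illustration):

```python
import numpy as np

def rmse(y_true, y_pred):
    # root mean squared error
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

def mae(y_true, y_pred):
    # mean absolute error
    return float(np.mean(np.abs(y_true - y_pred)))

def r2(y_true, y_pred):
    # coefficient of determination
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)
    return float(1.0 - ss_res / ss_tot)

y_true = np.array([81.55, 84.13, 84.29, 85.77, 87.26])  # line temps (°C)
y_pred = np.array([81.90, 83.80, 84.50, 85.50, 87.00])  # hypothetical predictions
```

By construction RMSE is always at least as large as MAE on the same errors, which is a quick sanity check when reading the result tables.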
3.3. Ablation Experiment
4. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
Method Type | Examples | Strengths | Limitations | Proposed Model’s Innovations |
---|---|---|---|---|
Traditional Models | ARIMA, SARIMA | Good for linear trends and stationary data. | Poor at capturing nonlinear and multivariate dependencies; struggles with long-term dependencies. | Combines deep learning and optimization to handle complexity. |
Deep Learning Models | CNN, LSTM, CNN-LSTM | Effective in extracting features from time series and modeling temporal relationships. | Sensitive to hyperparameter tuning; a single model struggles to capture both local and global time-series characteristics. | Integrates CNN, LSTM, and TPA with optimized hyperparameters via DA-AHA. |
Proposed Model | DA-AHA-CNN-LSTM-TPA | Captures local and global features using CNN, LSTM, and TPA; dynamic hyperparameter optimization improves accuracy and robustness. | Relies on high-quality data and computational resources. | Combines multiple advantages and optimizations for superior prediction performance. |
Voltage (kV) | Ambient Temperature (°C) | Wire Type | Tower Height (m) | Wind Speed (m/s) | Wind Direction (°) | Line Temperature (°C) |
---|---|---|---|---|---|---|
500 | −15.1 | steel-cored aluminum stranded wire | 30 | 4.84 | 338.18 | 81.55 |
500 | −12.9 | steel-cored aluminum stranded wire | 30 | 6.45 | 340.38 | 84.13 |
500 | −12.7 | steel-cored aluminum stranded wire | 30 | 6.33 | 338.13 | 84.29 |
500 | −11.5 | steel-cored aluminum stranded wire | 30 | 4.89 | 320.68 | 85.77 |
500 | −10.5 | steel-cored aluminum stranded wire | 30 | 6.05 | 312.18 | 87.26 |
Feature | Value |
---|---|
Training data (80%) | 00:00 1 January 2024 to 23:50 24 September 2024 |
Testing data (20%) | 00:38 25 September 2024 to 00:00 1 December 2024 |
Vector length | 10 |
Sampling rate | 1 h |
Numerical environment | Python 3.9.3 |
Libraries | NumPy, TensorFlow, Pandas, Matplotlib, Keras, CUDA |
Machine configuration | AMD Ryzen™ 9 5900HX @ 3.30 GHz, 16 threads; NVIDIA GeForce RTX 4070 Ti, 12 GB GDDR6X; OS: 64-bit Windows |
Layer | Parameter | Value |
---|---|---|
Conv1D | Filters | 32 |
Conv1D | Kernel size | 2 |
Conv1D | Activation | ReLU |
Conv1D | Kernel regularizer | L2 (strength 0.1) |
MaxPooling1D | Pool size | 2 |
Dropout | Dropout rate | 0.3 |
LSTM | Units 1 | 10 |
LSTM | Units 2 | 10 |
Attention | Units | 20 |
Dense1 | Units | 10 |
Dense1 | Activation | ReLU |
Dense2 | Units | 1 |
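The layer table above corresponds to a standard Keras stack. The sketch below reconstructs it for illustration only; the input shape (a window of 10 steps, per the "Vector length" entry, with six input features assumed from the data table) and the internal wiring of the attention block (a simple Dense-based scoring stand-in for TPA) are assumptions, not the authors' exact code.

```python
import numpy as np
from tensorflow.keras import layers, models, regularizers

timesteps, n_features = 10, 6   # window length 10; 6 input features assumed

inp = layers.Input(shape=(timesteps, n_features))
x = layers.Conv1D(filters=32, kernel_size=2, activation="relu",
                  kernel_regularizer=regularizers.l2(0.1))(inp)  # Conv1D rows
x = layers.MaxPooling1D(pool_size=2)(x)                          # pool size 2
x = layers.Dropout(0.3)(x)                                       # dropout 0.3
x = layers.LSTM(10, return_sequences=True)(x)                    # LSTM units 1
x = layers.LSTM(10, return_sequences=True)(x)                    # LSTM units 2
# Dense-based attention stand-in for the TPA block (20 attention units)
score = layers.Dense(20, activation="tanh")(x)
score = layers.Dense(1)(score)
attn = layers.Softmax(axis=1)(score)              # weights over timesteps
context = layers.Dot(axes=1)([attn, x])           # weighted sum over time
context = layers.Flatten()(context)
x = layers.Dense(10, activation="relu")(context)  # Dense1
out = layers.Dense(1)(x)                          # Dense2: scalar temperature
model = models.Model(inp, out)
```

With these settings the model maps a batch of `(10, 6)` windows to one predicted temperature per window.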
Model | R² | RMSE | MAE | MAPE |
---|---|---|---|---|
CLT | 0.878 | 0.92 | 0.87 | 0.71 |
LSTM-TPA | 0.798 | 1.07 | 0.94 | 0.81 |
CNN-LSTM | 0.773 | 1.13 | 0.97 | 0.78 |
LSTM | 0.704 | 1.59 | 1.32 | 0.91 |
ARIMA | 0.853 | 0.10 | 0.90 | 0.84 |
SARIMA | 0.859 | 0.98 | 0.91 | 0.83 |
Parameters | Details |
---|---|
Pop | 3 |
MaxIter | 40 |
Dim | 4 |
LSTM units1 (search range) | [32, 128] |
LSTM regularizer (search range) | [0.001, 0.01] |
LSTM units2 (search range) | [32, 64] |
Learning rate (search range) | [0.001, 0.01] |
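The four ranges in the table above define the search space (Dim = 4) explored by the optimizer. As a hedged illustration, the sketch below decodes a unit-cube candidate into these hyperparameters and runs a plain random search with the table's Pop and MaxIter settings; the objective is a mock stand-in (training the full CLT model per candidate is beyond a short example), and random search here merely stands in for the DA-AHA.

```python
import numpy as np

# Search ranges taken from the table; each candidate is a 4-dim vector.
RANGES = {
    "lstm_units1":      (32, 128),
    "lstm_regularizer": (0.001, 0.01),
    "lstm_units2":      (32, 64),
    "learning_rate":    (0.001, 0.01),
}

def decode(u):
    """Map a vector u in [0, 1]^4 onto the hyperparameter ranges."""
    params = {}
    for (name, (lo, hi)), v in zip(RANGES.items(), u):
        val = lo + v * (hi - lo)
        if name.startswith("lstm_units"):
            val = int(round(val))           # unit counts are integers
        params[name] = val
    return params

def mock_objective(params):
    # Stand-in for the validation RMSE of a trained CLT model.
    return abs(params["lstm_units1"] - 64) / 64 + params["learning_rate"]

rng = np.random.default_rng(0)
pop, max_iter = 3, 40                       # Pop and MaxIter from the table
best_u, best_f = None, np.inf
for _ in range(max_iter):
    for u in rng.random((pop, 4)):          # random search stands in for DA-AHA
        f = mock_objective(decode(u))
        if f < best_f:
            best_u, best_f = u, f
best_params = decode(best_u)
```

In the real pipeline, `mock_objective` would be replaced by training the CNN-LSTM-TPA model with the decoded hyperparameters and returning its validation error.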
Model | R² | RMSE | MAE | MAPE |
---|---|---|---|---|
DA-AHA-CLT | 0.987 | 0.023 | 0.018 | 0.011 |
WOA-CLT | 0.964 | 0.047 | 0.041 | 0.023 |
NGO-CLT | 0.971 | 0.053 | 0.048 | 0.035 |
PSO-CLT | 0.962 | 0.078 | 0.064 | 0.041 |
CLT | 0.878 | 0.921 | 0.871 | 0.713 |
LSTM-TPA | 0.798 | 1.075 | 0.943 | 0.812 |
CNN-LSTM | 0.773 | 1.134 | 0.972 | 0.784 |
LSTM | 0.704 | 1.597 | 1.324 | 0.915 |
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Ji, X.; Lu, C.; Xie, B.; Han, H.; Li, M. Dynamic Adaptive Artificial Hummingbird Algorithm-Enhanced Deep Learning Framework for Accurate Transmission Line Temperature Prediction. Electronics 2025, 14, 403. https://doi.org/10.3390/electronics14030403