An Evolutionary Deep Learning Framework for Accurate Remaining Capacity Prediction in Lithium-Ion Batteries
Abstract
1. Introduction
- Temporal Convolutional Networks (TCNs): TCNs effectively capture short-term dependencies and local temporal features using causal convolutions, while dilated convolutions expand the receptive field to model long-term dependencies and multi-scale temporal patterns efficiently. This combination enables TCNs to handle both near-term and long-term degradation patterns in time-series data.
- Bidirectional Gated Recurrent Units (BiGRUs): Capture long-term dependencies by processing the sequence in both directions, providing a comprehensive view of the battery’s degradation trajectory.
- Attention Mechanism: Dynamically assigns weights to different time steps, emphasizing critical degradation stages that significantly impact prediction accuracy.
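To make the receptive-field argument for TCNs concrete, the sketch below computes how far back in the sequence a stack of dilated causal convolutions can see. The helper name and the kernel size of 3 are illustrative assumptions (the paper tunes the kernel size automatically); the dilation factors [1, 2, 4] match those used in the experiments.

```python
def tcn_receptive_field(kernel_size, dilations):
    """Receptive field of stacked dilated causal conv layers:
    each layer with kernel k and dilation d adds (k - 1) * d past steps."""
    return 1 + sum((kernel_size - 1) * d for d in dilations)

# With dilation factors [1, 2, 4] and an illustrative kernel size of 3,
# three layers already cover 15 past time steps.
print(tcn_receptive_field(3, [1, 2, 4]))  # -> 15
```

This is why a few TCN layers suffice to model both near-term and long-term degradation patterns: the receptive field grows geometrically with the dilation schedule rather than linearly with depth.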
- (1) Advanced Time-Series Feature Extraction Model
- (2) Intelligent Optimization Framework
- (3) Comprehensive Experimental Validation
2. Related Works
- (1) Model-Based Methods
  - (i) Electrochemical Models
  - (ii) Equivalent Circuit Models (ECMs)
  - (iii) Empirical Models
- (2) Data-Driven Methods
  - (i) Statistical Techniques
  - (ii) Stochastic Models
  - (iii) Machine Learning (ML) and Deep Learning (DL)
- (3) Fusion Methods
3. Methodology
3.1. Overview
- ① Data Collection
- ② Data Preprocessing
- ③ Design of the Time-Series Feature Extraction Model
- ④ Automatic Hyperparameter Optimization
- ⑤ Model Training
- ⑥ Model Testing
3.2. TCN-BiGRU-Attention Model
- (1) Temporal Convolutional Networks (TCNs)
- (2) Bidirectional Gated Recurrent Units (BiGRUs)
- (3) Self-Attention Mechanism
- (4) Overall TCN-BiGRU-Attention Model
  - (i) TCN captures local and multi-scale temporal features.
  - (ii) BiGRU extracts bidirectional sequential dependencies.
  - (iii) The self-attention mechanism enhances feature selection and noise resistance.
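The self-attention step can be sketched as scaled dot-product attention: weights = softmax(QKᵀ/√d_k), applied row by row. This is a minimal pure-Python illustration of the standard mechanism (Vaswani et al.), not the authors' implementation; the function names are assumptions for this sketch.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def self_attention(Q, K, V):
    """Scaled dot-product self-attention over lists of row vectors:
    each output row is a weight-averaged combination of the rows of V,
    with weights softmax(q . k / sqrt(d_k))."""
    d_k = len(K[0])
    out = []
    for q in Q:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k) for k in K]
        w = softmax(scores)
        out.append([sum(wi * v[j] for wi, v in zip(w, V)) for j in range(len(V[0]))])
    return out
```

Because the weights are a softmax, each output time step is a convex combination of all time steps, which is what lets the model emphasize critical degradation stages over noisy ones.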
3.3. Optimizing the Network Structure
3.3.1. Hyperparameter Selection and Justification
- (1) Learning Rate
- (2) Number of BiGRU Neurons
- (3) Attention Key-Value Dimension
- (4) Convolution Kernel Size
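The four hyperparameters and their initial ranges (as listed in Table 1) can be written down as a search space; the dictionary keys and the uniform sampler below are illustrative assumptions, not the authors' code.

```python
import random

# Search space for the four optimized hyperparameters (ranges from Table 1).
search_space = {
    "learning_rate": (0.001, 0.01),   # continuous
    "bigru_neurons": (10, 50),        # integer
    "attention_key_dim": (2, 50),     # integer
    "kernel_size": (2, 10),           # integer
}

def sample(space, rng=random):
    """Draw one candidate uniformly from the search space."""
    return {
        "learning_rate": rng.uniform(*space["learning_rate"]),
        "bigru_neurons": rng.randint(*space["bigru_neurons"]),
        "attention_key_dim": rng.randint(*space["attention_key_dim"]),
        "kernel_size": rng.randint(*space["kernel_size"]),
    }
```

Uniform sampling over these ranges is how an unoptimized baseline or the initial SSA population might be seeded before the optimizers take over.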
3.3.2. Optimization Algorithms
- (1) Sparrow Search Algorithm (SSA)
- (2) Bayesian Optimization (BO)
  - (i) Surrogate Model Construction
  - (ii) Acquisition Function Selection
  - (iii) Acquisition Function Optimization
  - (iv) Surrogate Model Update
- (3) Hybrid Optimization Algorithm
  - (i) Enhanced Global and Local Search Capabilities
  - (ii) Improved Robustness
  - (iii) Maintaining Population Diversity
Algorithm 1: Hybrid Optimization Algorithm for TCN-BiGRU-Attention Model

Input: (i) objective function f (e.g., validation loss); (ii) hyperparameter search space; (iii) population size N; (iv) maximum iterations T; (v) model architecture (TCN-BiGRU-Attention).
Output: (i) optimal hyperparameters θ* (learning rate, number of BiGRU neurons, attention key-value dimension, convolution kernel size); (ii) trained model.
1: Define the search space: set the ranges for the learning rate, BiGRU neurons, attention key-value dimension, and convolution kernel size.
2: Initialize the population P with N candidate hyperparameter sets sampled from the search space.
3: Evaluate the fitness of each individual in P by computing f(θ).
4: Apply SSA to P to generate a set of high-quality initial values P′.
5: Initialize the GP surrogate model with P′: use the refined population to build a Gaussian Process (GP) model as the surrogate for the objective function.
6: for t = 1 to T do:
  (6.1) Maximize the acquisition function over the surrogate to select the next candidate θₜ; evaluate f(θₜ).
  (6.2) Update the GP surrogate with the observation (θₜ, f(θₜ)).
  end for.
7: Select the optimal hyperparameters θ* from the evaluated points: choose the hyperparameters with the lowest validation loss.
8: Train the final neural network model with the optimal hyperparameters θ* on the entire training dataset.
9: Return θ* and the trained model.
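The control flow of Algorithm 1 can be sketched in a few dozen lines. This is a deliberately simplified stand-in, not the authors' implementation: the SSA stage below uses only a basic best-follower update (omitting the full producer/scrounger/alarm rules), and the BO stage replaces the GP surrogate and EI acquisition with Gaussian perturbation around the incumbent. All function and variable names are assumptions for this sketch.

```python
import random

def hybrid_optimize(objective, bounds, pop_size=6, ssa_iters=4, bo_iters=4, seed=0):
    """Simplified sketch of the SSA + BO pipeline in Algorithm 1:
    population-based global exploration, then local refinement."""
    rng = random.Random(seed)
    lo, hi = zip(*bounds)

    def clip(x):
        return [min(max(v, l), h) for v, l, h in zip(x, lo, hi)]

    # Steps 2-3: initialize and evaluate the population.
    pop = [[rng.uniform(l, h) for l, h in bounds] for _ in range(pop_size)]
    best = min(pop, key=objective)

    # Step 4: SSA-style exploration - individuals move toward the best member.
    for _ in range(ssa_iters):
        for i, x in enumerate(pop):
            cand = clip([xi + rng.uniform(0, 1) * (bi - xi) for xi, bi in zip(x, best)])
            if objective(cand) < objective(x):
                pop[i] = cand
        best = min(pop + [best], key=objective)

    # Steps 5-6: BO-style local refinement around the incumbent
    # (GP surrogate and EI acquisition replaced by Gaussian perturbation).
    for _ in range(bo_iters):
        cand = clip([bi + rng.gauss(0, 0.1 * (h - l)) for bi, l, h in zip(best, lo, hi)])
        if objective(cand) < objective(best):
            best = cand
    return best
```

In the real framework the objective would be the validation loss of a trained TCN-BiGRU-Attention model, which is why the algorithm keeps the number of evaluations small (population 6, 4 iterations per stage, per Table 2).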
3.3.3. Limitations of the Hybrid Approach
- Computational Cost: The combined use of SSA and BO introduces additional computational overhead, particularly in constructing and optimizing the BO surrogate model.
- Parameter Initialization: The performance of the hybrid algorithm depends on the initial parameter ranges and population size, which may require domain-specific knowledge or empirical tuning.
- Scalability: For real-time applications, the optimization process may need simplification or hardware acceleration to meet latency requirements.
3.4. Summary
4. Experiment
4.1. Experimental Overview
4.2. Dataset Description
- (1) NASA Dataset
- (2) Silicon-Based Anodes Half-Cell Dataset
4.3. Main Experiments
- (1) No Optimization: Parameters were manually set within their initial ranges, providing a baseline for comparison.
- (2) Particle Swarm Optimization (PSO): Parameters were optimized using PSO, leveraging its global search capabilities.
- (3) Hybrid Optimization (SSA + BO): The hybrid algorithm combined SSA for broad exploration and BO for local refinement, dynamically tuning parameters to achieve optimal performance.
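The comparisons that follow are reported in terms of MSE, RMSE, MAE, MAPE, and R². A minimal pure-Python sketch of these metrics (the function name is an assumption; the formulas are the standard definitions):

```python
import math

def regression_metrics(y_true, y_pred):
    """Standard regression metrics: MSE, RMSE, MAE, MAPE, and R^2.
    MAPE assumes no zero values in y_true (capacities are positive)."""
    n = len(y_true)
    err = [t - p for t, p in zip(y_true, y_pred)]
    mse = sum(e * e for e in err) / n
    mae = sum(abs(e) for e in err) / n
    mape = sum(abs(e / t) for e, t in zip(err, y_true)) / n
    mean_t = sum(y_true) / n
    ss_tot = sum((t - mean_t) ** 2 for t in y_true)
    r2 = 1.0 - sum(e * e for e in err) / ss_tot
    return {"MSE": mse, "RMSE": math.sqrt(mse), "MAE": mae, "MAPE": mape, "R2": r2}
```

Lower MSE/RMSE/MAE/MAPE and higher R² indicate better capacity predictions, which is how the optimization methods are ranked in the result tables below.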
4.4. Model Ablation Study
4.4.1. Experiment Design
- (1) Remove TCN: To assess the contribution of the Temporal Convolutional Network (TCN) component, this experiment removes the TCN layer from the TCN-BiGRU-Attention network. The resulting model, consisting of only CNN, BiGRU, and the attention mechanism, is evaluated on the test set, and its performance is compared with the full model to analyze the impact of the TCN on prediction accuracy. The Root Mean Squared Error (RMSE) results for all twelve experimental groups are presented in Table 6.
- (2) Remove Attention Mechanism: To evaluate the impact of the attention mechanism, this experiment removes the attention component from the TCN-BiGRU-Attention network, leaving a model comprising only TCN and BiGRU. The modified model is evaluated on the test set, and its performance is compared against the full model to determine the significance of the attention mechanism in enhancing prediction accuracy. The RMSE results for all twelve experimental groups are presented in Table 7.
- (3) Replace BiGRU with GRU: To examine the contribution of the bidirectional structure in BiGRU, this experiment replaces BiGRU with a standard unidirectional Gated Recurrent Unit (GRU) in the TCN-BiGRU-Attention network. The modified model is evaluated on the test set, and its performance is compared with the full model to assess the impact of the bidirectional mechanism. The RMSE results for all twelve experimental groups are presented in Table 8.
4.4.2. Result Analysis
- (1) Removing the TCN layer increased the average RMSE by 43.3%, 16.6%, and 71.3% on the NASA, Half-Cell (Si/CNTs), and Half-Cell (Si/Graphene) datasets, respectively, compared to the full model. This underscores the critical role of TCN in capturing long-term dependencies in time-series data, which is vital for accurately predicting complex patterns.
- (2) Eliminating the attention mechanism led to a substantial decline in performance, with average RMSE increases of 31.1%, 49.7%, and 75.2% on the same datasets. These findings highlight the importance of the attention mechanism in focusing on relevant input features, thereby enhancing prediction accuracy and robustness, particularly in highly variable datasets.
- (3) Replacing the bidirectional BiGRU with a unidirectional GRU caused notable performance degradation, marked by average RMSE increases of 64.3%, 74.6%, and 92.6% across the datasets. This demonstrates that the bidirectional architecture of BiGRU is crucial for capturing both forward and backward dependencies in sequential data, significantly impacting prediction accuracy.
4.5. Ablation Study of Network Optimization Algorithms
4.5.1. Experiment Design
4.5.2. Result Analysis
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Fan, X.; Yang, Y.; Fei, Z.; Huang, Z.; Tsui, K.-L. Life Prediction of Lithium-Ion Batteries Based on Stacked Denoising Autoencoders. Reliab. Eng. Syst. Saf. 2021, 208, 107396. [Google Scholar] [CrossRef]
- Xu, X.; Tang, S.; Yu, C.; Xie, J.; Han, X.; Ouyang, M. Remaining Useful Life Prediction of Lithium-Ion Batteries Based on Wiener Process under Time-Varying Temperature Condition. Reliab. Eng. Syst. Saf. 2021, 214, 107675. [Google Scholar] [CrossRef]
- Liu, Y.; Zhao, T.; Ju, W.; Shi, S. Materials Discovery and Design Using Machine Learning. J. Mater. 2017, 3, 159–177. [Google Scholar] [CrossRef]
- Rezvanizaniani, S.M.; Liu, Z.; Chen, Y.; Lee, J. Review and Recent Advances in Battery Health Monitoring and Prognostics Technologies for Electric Vehicle (EV) Safety and Mobility. J. Power Sources 2014, 256, 110–124. [Google Scholar] [CrossRef]
- Zhou, B.; Cheng, C.; Ma, G.; Zhang, Y. Remaining Useful Life Prediction of Lithium-Ion Battery Based on Attention Mechanism with Positional Encoding. IOP Conf. Ser. Mater. Sci. Eng. 2020, 895, 012006. [Google Scholar] [CrossRef]
- Li, D.; Yang, L.; Li, C. Control-Oriented Thermal-Electrochemical Modeling and Validation of Large Size Prismatic Lithium Battery for Commercial Applications. Energy 2021, 214, 119057. [Google Scholar] [CrossRef]
- Chen, N.; Zhang, P.; Dai, J.; Gui, W. Estimating the State-of-Charge of Lithium-Ion Battery Using an H-Infinity Observer Based on Electrochemical Impedance Model. IEEE Access 2020, 8, 26872–26884. [Google Scholar] [CrossRef]
- Zhang, C.; Allafi, W.; Dinh, Q.; Ascencio, P.; Marco, J. Online Estimation of Battery Equivalent Circuit Model Parameters and State of Charge Using Decoupled Least Squares Technique. Energy 2018, 142, 678–688. [Google Scholar] [CrossRef]
- Naseri, F.; Schaltz, E.; Stroe, D.-I.; Gismero, A.; Farjah, E. An Enhanced Equivalent Circuit Model with Real-Time Parameter Identification for Battery State-of-Charge Estimation. IEEE Trans. Ind. Electron. 2022, 69, 3743–3751. [Google Scholar] [CrossRef]
- Cai, C.; Gong, Y.; Fotouhi, A.; Auger, D.J. A Novel Hybrid Electrochemical Equivalent Circuit Model for Online Battery Management Systems. J. Energy Storage 2024, 99 Pt A, 113142. [Google Scholar] [CrossRef]
- Wang, S.; Wang, Y.; Su, X.; Zhang, L.; Liu, M.; Chen, J.; Huang, Q.; Lee, T.; Zhao, F.; Gao, H. Impact of Energy Efficiency and Operating Temperature on the Remaining Life of Lithium-Ion Batteries. Intell. Comput. Appl. 2018, 8, 162–171. [Google Scholar]
- Zhou, Y.; Huang, M. Lithium-Ion Batteries Remaining Useful Life Prediction Based on a Mixture of Empirical Mode Decomposition and ARIMA Model. Microelectron. Reliab. 2016, 65, 265–273. [Google Scholar] [CrossRef]
- Kim, S.; Lee, P.-Y.; Lee, M.; Kim, J.; Na, W. Improved State-of-Health Prediction Based on Auto-Regressive Integrated Moving Average with Exogenous Variables Model in Overcoming Battery Degradation-Dependent Internal Parameter Variation. J. Energy Storage 2022, 46, 103888. [Google Scholar] [CrossRef]
- Liu, K.; Shang, Y.; Ouyang, Q.; Widanage, W.D. A Data-Driven Approach with Uncertainty Quantification for Predicting Future Capacities and Remaining Useful Life of Lithium-Ion Battery. IEEE Trans. Ind. Electron. 2020, 68, 3170–3180. [Google Scholar] [CrossRef]
- Xie, Y.X.; Wang, S.L.; Shi, W.H.; Xiong, X.; Chen, X. A New Method of Unscented Particle Filter for High-Fidelity Lithium-Ion Battery SOC Estimation. Energy Storage Sci. Technol. 2021, 10, 722–731. [Google Scholar]
- Jiao, Z.Q.; Fan, X.M.; Zhang, X.; Luo, Y.; Liu, Y. State Tracking and Remaining Useful Life Predictive Method of Li-Ion Battery Based on Improved Particle Filter Algorithm. Trans. China Electrotech. Soc. 2020, 35, 3979–3993. [Google Scholar]
- Li, X.; Yuan, C.; Wang, Z. Multitime-Scale Framework for Prognostic Health Condition of Lithium Battery Using Modified Gaussian Process Regression and Nonlinear Regression. J. Power Sources 2020, 467, 228358. [Google Scholar] [CrossRef]
- Deng, Z.W.; Hu, X.S.; Lin, X.K.; Che, Y.; Xu, L.; Guo, W. Data-Driven State of Charge Estimation for Lithium-Ion Battery Packs Based on Gaussian Process Regression. Energy 2020, 205, 118000. [Google Scholar] [CrossRef]
- Zhang, R.; Ji, C.H.; Zhou, X.; Liu, T.; Jin, G.; Pan, Z.; Liu, Y. Capacity Estimation of Lithium-Ion Batteries with Uncertainty Quantification Based on Temporal Convolutional Network and Gaussian Process Regression. Energies 2024, 297, 131154. [Google Scholar] [CrossRef]
- Li, X.; Yuan, C.; Wang, Z. State of Health Estimation for Li-Ion Battery via Partial Incremental Capacity Analysis Based on Support Vector Regression. Energy 2020, 203, 117852. [Google Scholar] [CrossRef]
- Liu, Z.Y.; He, H.J.; Xie, J.; Wang, K.; Huang, W. Self-Discharge Prediction Method for Lithium-Ion Batteries Based on Improved Support Vector Machine. J. Energy Storage 2022, 55, 105571. [Google Scholar] [CrossRef]
- Qin, W.; Lv, H.; Liu, C.; Dey, N.; Jahanshahi, P. Remaining Useful Life Prediction for Lithium-Ion Batteries Using Particle Filter and Artificial Neural Network. Ind. Manag. Data Syst. 2019; ahead-of-print. [Google Scholar] [CrossRef]
- Ren, L.; Dong, J.; Wang, X.; Meng, Z.; Zhao, L.; Deen, M.J. A Data-Driven Auto-CNN-LSTM Prediction Model for Lithium-Ion Battery Remaining Useful Life. IEEE Trans. Ind. Inform. 2020, 17, 3478–3487. [Google Scholar] [CrossRef]
- Hong, J.; Lee, D.; Jeong, E.-R.; Yi, Y. Towards the Swift Prediction of the Remaining Useful Life of Lithium-Ion Batteries with End-to-End Deep Learning. Appl. Energy 2020, 278, 115646. [Google Scholar] [CrossRef]
- Zhang, Y.; Xiong, R.; He, H.; Pecht, M.G. Long Short-Term Memory Recurrent Neural Network for Remaining Useful Life Prediction of Lithium-Ion Batteries. IEEE Trans. Veh. Technol. 2018, 67, 5695–5705. [Google Scholar] [CrossRef]
- Park, K.; Choi, Y.; Choi, W.J.; Ryu, H.-Y.; Kim, H. LSTM-Based Battery Remaining Useful Life Prediction with Multi-Channel Charging Profiles. IEEE Access 2020, 8, 20786–20798. [Google Scholar] [CrossRef]
- Shi, Z.; Chehade, A. A Dual-LSTM Framework Combining Change Point Detection and Remaining Useful Life Prediction. Reliab. Eng. Syst. Saf. 2021, 205, 107257. [Google Scholar] [CrossRef]
- Chen, D.; Hong, W.; Zhou, X. Transformer Network for Remaining Useful Life Prediction of Lithium-Ion Batteries. IEEE Access 2022, 10, 19621–19628. [Google Scholar] [CrossRef]
- Mao, J.; Yin, X.; Chen, R.; Ding, K.; Jiang, L. An Improved Approach Based on Transformer Network for Remaining Useful Life Prediction of Lithium-Ion Batteries. Energies 2022, 15, 9317. [Google Scholar] [CrossRef]
- Shen, L.; Li, J.; Zuo, L.; Zhu, L.; Shen, H.T. Source-Free Cross-Domain State of Charge Estimation of Lithium-Ion Batteries at Different Ambient Temperatures. IEEE Trans. Power Electron. 2023, 38, 6851–6862. [Google Scholar] [CrossRef]
- Borst, N.; Verhagen, W.J.C. Introducing CNN-LSTM Network Adaptations to Improve Remaining Useful Life Prediction of Complex Systems. Aeronaut. J. 2023, 127, 2143–2153. [Google Scholar] [CrossRef]
- Hafizhahullah, H.; Yuliani, A.R.; Pardede, H.; Ramdan, A.; Zilvan, V.; Krisnandi, D.; Kadar, J. A Hybrid CNN-LSTM for Battery Remaining Useful Life Prediction with Charging Profiles Data. In Proceedings of the 2022 International Conference on Computer, Control, Informatics and Its Applications (IC3INA’22), Virtual Event, 22–23 November 2022; pp. 106–110. [Google Scholar] [CrossRef]
- Hofmann, T.; Dubarry, M.; Hamar, J.; Erhard, S.; Schmidt, J.P. Transfer Learning from Synthetic Data for SOH Estimation. In ECS Meeting Abstracts, Vol. MA2024-02, A03: Accelerating Next-Generation Battery R&D Through Data-Driven Approaches; ECS—The Electrochemical Society: Pennington, NJ, USA, 2024; p. 364. [Google Scholar] [CrossRef]
- Wei, Y.; Wu, D. Prediction of State of Health and Remaining Useful Life of Lithium-Ion Battery Using Graph Convolutional Network with Dual Attention Mechanisms. Reliab. Eng. Syst. Saf. 2023, 230, 108289. [Google Scholar] [CrossRef]
- Srinivas, S.S.; Sarkar, R.K.; Runkana, V. Battery GraphNets: Relational Learning for Lithium-Ion Batteries (LIBs) Life Estimation. arXiv 2024, arXiv:2408.07624. [Google Scholar] [CrossRef]
- Colominas, M.A.; Schlotthauer, G.; Torres, M.E. Improved Complete Ensemble EMD: A Suitable Tool for Biomedical Signal Processing. Biomed. Signal Process. Control 2014, 14, 19–29. [Google Scholar] [CrossRef]
- Chung, J.; Gulcehre, C.; Cho, K.; Bengio, Y. Empirical Evaluation of Gated Recurrent Neural Networks on Sequence Modeling. arXiv 2014, arXiv:1412.3555. Available online: https://arxiv.org/abs/1412.3555 (accessed on 15 March 2024).
- Kennedy, J.; Eberhart, R. Particle Swarm Optimization. In Proceedings of the IEEE International Conference on Neural Networks, Perth, Australia, 27 November–1 December 1995; Volume 4, pp. 1942–1948. [Google Scholar] [CrossRef]
- Vaswani, A.; Shazeer, N.; Parmar, N.; Uszkoreit, J.; Jones, L.; Gomez, A.N.; Kaiser, Ł.; Polosukhin, I. Attention Is All You Need. In Advances in Neural Information Processing Systems 30 (NIPS 2017); Curran Associates, Inc.: Red Hook, NY, USA, 2017; pp. 5998–6008. Available online: https://arxiv.org/pdf/1706.03762 (accessed on 15 March 2024).
- Tang, J.; Liu, G.; Pan, Q. A Review of Representative Swarm Intelligence Algorithms for Solving Optimization Problems: Applications and Trends. IEEE/CAA J. Autom. Sin. 2021, 8, 1627–1643. [Google Scholar] [CrossRef]
- Xue, J.; Shen, B. A Novel Swarm Intelligence Optimization Approach: Sparrow Search Algorithm. Syst. Sci. Control Eng. 2020, 8, 22–34. [Google Scholar] [CrossRef]
- Snoek, J.; Larochelle, H.; Adams, R.P. Practical Bayesian Optimization of Machine Learning Algorithms. In Proceedings of the Advances in Neural Information Processing Systems, Lake Tahoe, NV, USA, 3–8 December 2012; Curran Associates, Inc.: Red Hook, NY, USA, 2012; pp. 2951–2959. [Google Scholar]
- Saha, B.; Goebel, K. Battery Data Set. NASA Ames Prognostics Data Repository 2007. Available online: https://ti.arc.nasa.gov/tech/dash/groups/pcoe/prognostic-data-repository/ (accessed on 15 March 2024).
Component | Parameter | Initial Range | Manual Initialization | Automatically Optimized
---|---|---|---|---
Training | Learning Rate | [0.001, 0.01] | RG | Adjusted dynamically by PSO/SSA + BO
 | Optimizer | - | Adam | Adam
 | Batch Size | - | 32 | 32
TCN | Number of Layers | - | 3 | 3
 | Convolution Kernel Size | [2, 10] | RG | Determined by PSO/SSA + BO
 | Dilation Factors | - | [1, 2, 4] | [1, 2, 4]
 | Dropout Rate | - | 0.2 | 0.2
BiGRU | Number of Neurons | [10, 50] | RG | Determined by PSO/SSA + BO
Attention | Key-Value Dimension | [2, 50] | RG | Determined by PSO/SSA + BO
Algorithm | Parameter | Value/Description
---|---|---
PSO | Swarm Size (N) | 6
 | Maximum Iterations (Tmax) | 4
 | Cognitive Coefficient (c1) | 1.5
 | Social Coefficient (c2) | 1.5
 | Inertia Weight (w) | 0.8
SSA | Population Size (N) | 6
 | Maximum Iterations (Tmax) | 4
 | Alarm Threshold (R2) | 0.8
 | Safety Threshold (k) | 0.5–1
BO | Surrogate Model | Gaussian Process
 | Acquisition Function | Expected Improvement (EI)
Optimization Method | GROUP | MSE | RMSE | MAE | MAPE | R2
---|---|---|---|---|---|---
Unoptimized | B5 | 0.0003849 | 0.01962 | 0.01464 | 0.0097 | 0.9882
 | B6 | 0.0019812 | 0.04451 | 0.02199 | 0.0139 | 0.9120
 | B7 | 0.0039188 | 0.06260 | 0.07914 | 0.0374 | 0.8750
 | B18 | 0.0003752 | 0.01937 | 0.01287 | 0.0080 | 0.9885
 | Mean | 0.0016650 | 0.03653 | 0.03216 | 0.0172 | 0.9434
PSO | B5 | 0.0002373 | 0.01541 | 0.01045 | 0.0066 | 0.9927
 | B6 | 0.0018106 | 0.04255 | 0.01909 | 0.0120 | 0.9195
 | B7 | 0.0014178 | 0.03765 | 0.02902 | 0.0198 | 0.9717
 | B18 | 0.0002857 | 0.01690 | 0.01084 | 0.0068 | 0.9912
 | Mean | 0.0009379 | 0.02813 | 0.01735 | 0.0113 | 0.9688
Hybrid (SSA + BO) | B5 | 0.0002122 | 0.01457 | 0.00884 | 0.0057 | 0.9936
 | B6 | 0.0016997 | 0.04123 | 0.01645 | 0.0104 | 0.9239
 | B7 | 0.0009132 | 0.03022 | 0.02225 | 0.0143 | 0.9823
 | B18 | 0.0002127 | 0.01458 | 0.00898 | 0.0057 | 0.9936
 | Mean | 0.0007595 | 0.02515 | 0.01413 | 0.0090 | 0.9733
Optimization Method | GROUP | MSE | RMSE | MAE | MAPE | R2
---|---|---|---|---|---|---
Unoptimized | I | 0.0032252 | 0.05679 | 0.05074 | 0.0275 | 0.9709
 | II | 0.0028417 | 0.05331 | 0.04344 | 0.0245 | 0.9678
 | III | 0.0031221 | 0.05588 | 0.04776 | 0.0239 | 0.9665
 | IV | 0.0045294 | 0.06730 | 0.05579 | 0.0321 | 0.9630
 | Mean | 0.0034296 | 0.05832 | 0.04943 | 0.0270 | 0.9670
PSO | I | 0.0027646 | 0.05258 | 0.02924 | 0.0142 | 0.9751
 | II | 0.0025061 | 0.05062 | 0.02747 | 0.0136 | 0.9709
 | III | 0.0020725 | 0.04552 | 0.02866 | 0.0135 | 0.9778
 | IV | 0.0018385 | 0.04288 | 0.03091 | 0.0152 | 0.9850
 | Mean | 0.0022954 | 0.04790 | 0.02907 | 0.0141 | 0.9772
Hybrid (SSA + BO) | I | 0.001337 | 0.03657 | 0.02597 | 0.0133 | 0.9879
 | II | 0.001262 | 0.03552 | 0.02400 | 0.0117 | 0.9857
 | III | 0.001339 | 0.03659 | 0.02464 | 0.0123 | 0.9857
 | IV | 0.001673 | 0.04092 | 0.02917 | 0.0151 | 0.9863
 | Mean | 0.001403 | 0.03740 | 0.02594 | 0.0131 | 0.9864
Optimization Method | GROUP | MSE | RMSE | MAE | MAPE | R2
---|---|---|---|---|---|---
Unoptimized | I | 0.011862 | 0.10891 | 0.07959 | 0.05499 | 0.98491
 | II | 0.080990 | 0.28459 | 0.08683 | 0.04744 | 0.90090
 | III | 0.005486 | 0.07987 | 0.06019 | 0.03307 | 0.98592
 | IV | 0.009826 | 0.09913 | 0.06546 | 0.05845 | 0.98704
 | Mean | 0.027041 | 0.14313 | 0.07302 | 0.04849 | 0.96469
PSO | I | 0.009574 | 0.09785 | 0.07081 | 0.03377 | 0.98782
 | II | 0.065282 | 0.25550 | 0.08229 | 0.03954 | 0.92012
 | III | 0.004885 | 0.06989 | 0.05319 | 0.02412 | 0.99199
 | IV | 0.006832 | 0.08266 | 0.06348 | 0.03094 | 0.99099
 | Mean | 0.021643 | 0.12648 | 0.06744 | 0.03209 | 0.97273
Hybrid (SSA + BO) | I | 0.009378 | 0.09684 | 0.06615 | 0.03055 | 0.98807
 | II | 0.065217 | 0.25538 | 0.08220 | 0.03948 | 0.92000
 | III | 0.002743 | 0.05237 | 0.04300 | 0.02366 | 0.99550
 | IV | 0.006689 | 0.08179 | 0.06176 | 0.03097 | 0.99118
 | Mean | 0.021006 | 0.12160 | 0.06328 | 0.03117 | 0.97369
Dataset | GROUP | CNN-BiGRU-Attention | TCN-BiGRU-Attention
---|---|---|---
NASA | B5 | 0.01483 | 0.01457
 | B6 | 0.02720 | 0.02709
 | B7 | 0.05682 | 0.01284
 | B18 | 0.03744 | 0.02271
Half-Cell (Si/CNTs) | I | 0.04776 | 0.03657
 | II | 0.15991 | 0.12844
 | III | 0.03580 | 0.03553
 | IV | 0.04081 | 0.03661
Half-Cell (Si/Graphene) | I | 0.01036 | 0.00938
 | II | 0.07810 | 0.01197
 | III | 0.00468 | 0.00356
 | IV | 0.00334 | 0.00269
Dataset | GROUP | CNN-BiGRU | TCN-BiGRU-Attention
---|---|---|---
NASA | B5 | 0.01392 | 0.01457
 | B6 | 0.04171 | 0.02709
 | B7 | 0.01392 | 0.01284
 | B18 | 0.04334 | 0.02271
Half-Cell (Si/CNTs) | I | 0.13416 | 0.03657
 | II | 0.19977 | 0.12844
 | III | 0.06738 | 0.03553
 | IV | 0.06970 | 0.03661
Half-Cell (Si/Graphene) | I | 0.00546 | 0.00938
 | II | 0.08857 | 0.01197
 | III | 0.00385 | 0.00356
 | IV | 0.01347 | 0.00269
Dataset | GROUP | TCN-GRU-Attention | TCN-BiGRU-Attention
---|---|---|---
NASA | B5 | 0.04527 | 0.01457
 | B6 | 0.06269 | 0.02709
 | B7 | 0.06787 | 0.01284
 | B18 | 0.04020 | 0.02271
Half-Cell (Si/CNTs) | I | 0.24148 | 0.03657
 | II | 0.28569 | 0.12844
 | III | 0.19593 | 0.03553
 | IV | 0.20875 | 0.03661
Half-Cell (Si/Graphene) | I | 0.12131 | 0.00938
 | II | 0.08503 | 0.01197
 | III | 0.07737 | 0.00356
 | IV | 0.08740 | 0.00269
Dataset | GROUP | SSA | SSA + BO
---|---|---|---
NASA | B5 | 0.06871 | 0.06445
 | B6 | 0.02427 | 0.02033
 | B7 | 0.10656 | 0.08013
 | B18 | 0.10801 | 0.08836
Half-Cell (Si/CNTs) | I | 0.08911 | 0.12844
 | II | 0.16727 | 0.12844
 | III | 0.06497 | 0.05501
 | IV | 0.07898 | 0.05185
Half-Cell (Si/Graphene) | I | 0.01004 | 0.00938
 | II | 0.09201 | 0.07263
 | III | 0.00372 | 0.00356
 | IV | 0.03161 | 0.01829
Liu, Y.; Han, L.; Wang, Y.; Zhu, J.; Zhang, B.; Guo, J. An Evolutionary Deep Learning Framework for Accurate Remaining Capacity Prediction in Lithium-Ion Batteries. Electronics 2025, 14, 400. https://doi.org/10.3390/electronics14020400