Post Constraint and Correction: A Plug-and-Play Module for Boosting the Performance of Deep Learning Based Weather Multivariate Time Series Forecasting
Abstract
1. Introduction
1. We develop a plug-and-play deep learning module named PCC that incorporates variable relationships and maintains state reasonability for weather multivariate time series forecasting, enabling seamless integration with various forecasting models without architectural modifications or additional preprocessing.
2. We design a computationally efficient architecture that significantly improves backbone performance on weather time series forecasting tasks with minimal additional computational overhead, achieving superior accuracy with a much smaller parameter count.
3. We conduct comprehensive experiments, ablation studies, and visualizations to demonstrate and analyze the superiority of our method.
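The plug-and-play idea in contribution 1 can be illustrated with a minimal sketch: a backbone produces an initial forecast, and a post-hoc module constrains and corrects it without touching the backbone. All names here (`backbone_forecast`, `pcc_correct`) and the toy correction itself are hypothetical stand-ins, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def backbone_forecast(history):
    """Stand-in for any backbone model's initial prediction (hypothetical).
    Here: a naive persistence forecast that repeats the last observed step."""
    horizon = 4
    return np.tile(history[-1], (horizon, 1))

def pcc_correct(pred, W_mcc, W_sc):
    """Toy post-correction: a variable-mixing step (MCC-like) followed by a
    per-variable state adjustment (SC-like). pred has shape (T, V)."""
    constrained = pred @ W_mcc        # mix information across variables
    corrected = constrained * W_sc    # per-variable state correction
    return corrected

history = rng.normal(size=(8, 3))     # 8 observed steps, 3 weather variables
W_mcc = np.eye(3)                     # identity: no cross-variable change yet
W_sc = np.ones(3)                     # ones-initialised correction weights

raw = backbone_forecast(history)
out = pcc_correct(raw, W_mcc, W_sc)
print(out.shape)                      # horizon x variables
```

With identity and all-ones weights the wrapper is a no-op, which is the point of the plug-and-play design: the corrected model can never do worse than the backbone at initialization.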
2. Related Work
2.1. Weather Multivariate Time Series Forecasting
2.2. Deep Learning-Based Time Series Forecasting
3. Method
3.1. Initial Prediction
3.2. Multi-Variate Correlation Constraint
3.3. State Correction
4. Experiments
4.1. Experiment Materials and Setup
4.1.1. Dataset
4.1.2. Backbone Models
4.1.3. Training and Evaluation Setup
1. The hidden dimension of the two submodule networks was set to 128, balancing performance against computational complexity.
2. Tanh was chosen as the activation function for its symmetry around zero and smooth gradients. These properties stabilize training, which matters because weather variables often contain extreme values and rapid fluctuations.
3. The dropout rate was set to 0.3, an effective tradeoff between regularization against overfitting and sufficient network capacity.
4. The learnable matrices were both initialized to 1 so that the differential terms start with equal contribution weights, preventing an initial bias that could destabilize parameter updates early in training.
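Settings 1 through 4 above can be sketched as a toy submodule forward pass. This is a minimal sketch assuming a one-hidden-layer MLP; the actual PCC submodules may differ, and the differential term shown is a hypothetical placeholder.

```python
import numpy as np

rng = np.random.default_rng(0)

HIDDEN = 128       # hidden dimension of each submodule (setting 1)
DROPOUT = 0.3      # dropout rate (setting 3)

def submodule_forward(x, W1, b1, W2, b2, training=True):
    """One submodule MLP: linear -> tanh -> dropout -> linear.
    Tanh is zero-centred with bounded, smooth gradients (setting 2)."""
    h = np.tanh(x @ W1 + b1)
    if training:
        mask = rng.random(h.shape) >= DROPOUT      # inverted dropout
        h = h * mask / (1.0 - DROPOUT)
    return h @ W2 + b2

n_vars = 21                                        # e.g. the 21 dataset variables
W1 = rng.normal(scale=0.02, size=(n_vars, HIDDEN))
b1 = np.zeros(HIDDEN)
W2 = rng.normal(scale=0.02, size=(HIDDEN, n_vars))
b2 = np.zeros(n_vars)

# The learnable weighting starts at 1 (setting 4), so the differential
# term contributes with unit weight before any training.
lam = np.ones(n_vars)

x = rng.normal(size=(4, n_vars))                   # a small batch of states
diff = submodule_forward(x, W1, b1, W2, b2) - x    # hypothetical differential term
y = x + lam * diff
print(y.shape)
```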
4.2. Main Results and Discussion
4.2.1. Main Forecasting Results
4.2.2. Different Length of Observed Time Series
4.2.3. Key Variables Analysis
4.2.4. Impact of Strongly Correlated Variables
4.3. Method Analysis Results and Discussion
4.3.1. Ablation Study
4.3.2. Scalability Analysis
4.3.3. Training Cost and Stability
4.3.4. Comparison with Complex Mechanisms
- SAMP + SC: Replaces the MCC module with SAMP [46], a self-attention based method for capturing variable correlation in time series.
- LIFT + SC: Replaces the MCC module with LIFT [47], a channel dependence correction method for time series forecasting using linear structures with complex calculations such as Fourier transforms.
- saPCC: Replaces the MLP structure in both PCC submodules with a multihead self-attention mechanism.
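For intuition on the saPCC variant, self-attention over the variable axis can be sketched as follows. The paper's variant is multihead; this simplified single-head version is only illustrative, and all weight names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def self_attention(X, Wq, Wk, Wv):
    """Single-head self-attention where X is (V, d), one row per weather
    variable, so the attention matrix models pairwise variable correlation."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    scores -= scores.max(axis=-1, keepdims=True)   # numerical stability
    A = np.exp(scores)
    A /= A.sum(axis=-1, keepdims=True)             # softmax: rows sum to 1
    return A @ V, A

d = 16
X = rng.normal(size=(5, d))                        # 5 variables, d features each
Wq, Wk, Wv = (rng.normal(scale=0.1, size=(d, d)) for _ in range(3))
out, attn = self_attention(X, Wq, Wk, Wv)
print(out.shape, attn.shape)
```

The quadratic cost of the attention matrix over variables is one reason such variants carry more parameters and runtime than the MLP-based PCC, as the comparison table below reports.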
4.3.5. Robustness Analysis
4.3.6. Visualization of the Two Submodules in PCC
4.3.7. Visualization of Differential Operation
5. Conclusions and Future Work
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Appendix A. Dataset Variables
| Symbol | Unit | Description |
|---|---|---|
| P | mbar | air pressure |
| T | °C | air temperature |
| Tpot | K | potential temperature |
| Tdew | °C | dew point temperature |
| rh | % | relative humidity |
| VPmax | mbar | saturation water vapor pressure |
| VPact | mbar | actual water vapor pressure |
| VPdef | mbar | water vapor pressure deficit |
| sh | g/kg | specific humidity |
| H2OC | mmol/mol | water vapor concentration |
| rho | g/m³ | air density |
| wv | m/s | wind velocity |
| max. wv | m/s | maximum wind velocity |
| wd | ° | wind direction |
| rain | mm | precipitation |
| raining | s | precipitation duration |
| SWDR | W/m² | shortwave downward radiation |
| PAR | W/m² | photosynthetically active radiation |
| max. PAR | W/m² | maximum photosynthetically active radiation |
| Tlog | °C | temperature in log |
| CO2 | ppm | carbon dioxide concentration of ambient air |
Appendix B. Error Bars
| stack LSTM | +PCC MSE | +PCC MAE | Original MSE | Original MAE | Confidence Interval |
|---|---|---|---|---|---|
| 96 | | | | | 99% |
| 192 | | | | | 99% |
| 336 | | | | | 99% |
| 720 | | | | | 99% |

| DLinear | +PCC MSE | +PCC MAE | Original MSE | Original MAE | Confidence Interval |
|---|---|---|---|---|---|
| 96 | | | | | 99% |
| 192 | | | | | 99% |
| 336 | | | | | 99% |
| 720 | | | | | 99% |

| iTransformer | +PCC MSE | +PCC MAE | Original MSE | Original MAE | Confidence Interval |
|---|---|---|---|---|---|
| 96 | | | | | 99% |
| 192 | | | | | 99% |
| 336 | | | | | 99% |
| 720 | | | | | 99% |

| PatchTST | +PCC MSE | +PCC MAE | Original MSE | Original MAE | Confidence Interval |
|---|---|---|---|---|---|
| 96 | | | | | 99% |
| 192 | | | | | 99% |
| 336 | | | | | 99% |
| 720 | | | | | 99% |

| SegRNN | +PCC MSE | +PCC MAE | Original MSE | Original MAE | Confidence Interval |
|---|---|---|---|---|---|
| 96 | | | | | 99% |
| 192 | | | | | 99% |
| 336 | | | | | 99% |
| 720 | | | | | 99% |
Appendix C. Hyperparameter Sensitivity
| Setting | MSE (96) | MSE (192) | MSE (336) | MSE (720) |
|---|---|---|---|---|
| One | | | | |
| Zero | | | | |
| Zero-One | | | | |
| One-Zero | | | | |
Appendix D. Detailed Settings of Backbone Models
SegRNN

| Parameter | Value | Parameter | Value |
|---|---|---|---|
| Segment length | 48 | Learning rate | 0.0001 |
| RNN type | GRU | Batch size | 64 |
| RNN layers | 1 | Training epochs | 30 |
| Hidden size | 512 | Training patience | 10 |
| Dropout | 0.5 | Training loss | MAE |

iTransformer

| Parameter | Value | Parameter | Value |
|---|---|---|---|
| Encoder layers | 3 | Learning rate | 0.0001 |
| Heads | 8 | Batch size | 32 |
| Hidden dimensions | 512 | Training epochs | 10 |
| Dropout rate | 0.1 | Training patience | 3 |
| Training loss | MSE | | |

PatchTST

| Parameter | Value | Parameter | Value |
|---|---|---|---|
| Patch size | 16 | Learning rate | 0.0001 |
| Stride | 8 | Batch size (96, 192) | 32 |
| Heads (96, 336, 720) | 4 | Batch size (336, 720) | 128 |
| Heads (192) | 16 | Training epochs | 3 |
| Hidden dimensions | 512 | Training patience | 3 |
| Encoder layers | 2 | Training loss | MSE |

DLinear

| Parameter | Value | Parameter | Value |
|---|---|---|---|
| Kernel size | 25 | Learning rate | 0.0001 |
| Training epochs | 30 | Batch size | 64 |
| Training patience | 10 | Training loss | MAE |

stackLSTM

| Parameter | Value | Parameter | Value |
|---|---|---|---|
| Hidden size | 512 | Learning rate | 0.0001 |
| LSTM layers | 4 | Batch size | 64 |
| Training epochs | 30 | Training loss | MAE |
| Training patience | 10 | | |
Appendix E. Implementation Details of Complex Mechanisms
References
1. Graham, A.; Mishra, E.P. Time series analysis model to forecast rainfall for Allahabad region. J. Pharmacogn. Phytochem. 2017, 6, 1418–1421.
2. Shivhare, N.; Rahul, A.K.; Dwivedi, S.B.; Dikshit, P.K.S. ARIMA based daily weather forecasting tool: A case study for Varanasi. Mausam 2019, 70, 133–140.
3. Poterjoy, J. Implications of multivariate non-Gaussian data assimilation for multiscale weather prediction. Mon. Weather Rev. 2022, 150, 1475–1493.
4. Moreno, S.R.; dos Santos Coelho, L. Wind speed forecasting approach based on singular spectrum analysis and adaptive neuro fuzzy inference system. Renew. Energy 2018, 126, 736–754.
5. Yano, J.I.; Ziemiański, M.Z.; Cullen, M.; Termonia, P.; Onvlee, J.; Bengtsson, L.; Carrassi, A.; Davy, R.; Deluca, A.; Gray, S.L.; et al. Scientific challenges of convective-scale numerical weather prediction. Bull. Am. Meteorol. Soc. 2018, 99, 699–710.
6. Schultz, M.G.; Betancourt, C.; Gong, B.; Kleinert, F.; Langguth, M.; Leufen, L.H.; Mozaffari, A.; Stadtler, S. Can deep learning beat numerical weather prediction? Philos. Trans. R. Soc. A 2021, 379, 20200097.
7. Wang, Y.; Wu, H.; Dong, J.; Liu, Y.; Long, M.; Wang, J. Deep time series models: A comprehensive survey and benchmark. arXiv 2024, arXiv:2407.13278.
8. Lim, B.; Zohren, S. Time-series forecasting with deep learning: A survey. Philos. Trans. R. Soc. A 2021, 379, 20200209.
9. Wilby, R.L.; Troni, J.; Biot, Y.; Tedd, L.; Hewitson, B.C.; Smith, D.M.; Sutton, R.T. A review of climate risk information for adaptation and development planning. Int. J. Climatol. 2009, 29, 1193–1215.
10. Bauer, P.; Thorpe, A.; Brunet, G. The quiet revolution of numerical weather prediction. Nature 2015, 525, 47–55.
11. Shen, C. A transdisciplinary review of deep learning research and its relevance for water resources scientists. Water Resour. Res. 2018, 54, 8558–8593.
12. Kurth, T.; Subramanian, S.; Harrington, P.; Pathak, J.; Mardani, M.; Hall, D.; Miele, A.; Kashinath, K.; Anandkumar, A. FourCastNet: Accelerating global high-resolution weather forecasting using adaptive Fourier neural operators. In Proceedings of the Platform for Advanced Scientific Computing Conference, Davos, Switzerland, 26–28 June 2023; pp. 1–11.
13. Zhu, X.; Xiong, Y.; Wu, M.; Nie, G.; Zhang, B.; Yang, Z. Weather2K: A multivariate spatio-temporal benchmark dataset for meteorological forecasting based on real-time observation data from ground weather stations. arXiv 2023, arXiv:2302.10493.
14. Dubey, A.K.; Kumar, A.; García-Díaz, V.; Sharma, A.K.; Kanhaiya, K. Study and analysis of SARIMA and LSTM in forecasting time series data. Sustain. Energy Technol. Assess. 2021, 47, 101474.
15. Ray, S.; Das, S.S.; Mishra, P.; Al Khatib, A.M.G. Time series SARIMA modelling and forecasting of monthly rainfall and temperature in the South Asian countries. Earth Syst. Environ. 2021, 5, 531–546.
16. Hewage, P.; Behera, A.; Trovati, M.; Pereira, E.; Ghahremani, M.; Palmieri, F.; Liu, Y. Temporal convolutional neural (TCN) network for an effective weather forecasting using time-series data from the local weather station. Soft Comput. 2020, 24, 16453–16482.
17. Verdonck, T.; Baesens, B.; Óskarsdóttir, M.; vanden Broucke, S. Special issue on feature engineering editorial. Mach. Learn. 2024, 113, 3917–3928.
18. Bi, K.; Xie, L.; Zhang, H.; Chen, X.; Gu, X.; Tian, Q. Accurate medium-range global weather forecasting with 3D neural networks. Nature 2023, 619, 533–538.
19. Chen, K.; Han, T.; Gong, J.; Bai, L.; Ling, F.; Luo, J.J.; Chen, X.; Ma, L.; Zhang, T.; Su, R.; et al. FengWu: Pushing the skillful global medium-range weather forecast beyond 10 days lead. arXiv 2023, arXiv:2304.02948.
20. Karevan, Z.; Suykens, J.A. Transductive LSTM for time-series prediction: An application to weather forecasting. Neural Netw. 2020, 125, 1–9.
21. Al Sadeque, Z.; Bui, F.M. A deep learning approach to predict weather data using cascaded LSTM network. In Proceedings of the 2020 IEEE Canadian Conference on Electrical and Computer Engineering (CCECE), London, ON, Canada, 30 August–2 September 2020; IEEE: New York, NY, USA, 2020; pp. 1–5.
22. Dikshit, A.; Pradhan, B.; Alamri, A.M. Long lead time drought forecasting using lagged climate variables and a stacked long short-term memory model. Sci. Total Environ. 2021, 755, 142638.
23. Yan, Z.; Lu, X.; Wu, L. Exploring the effect of meteorological factors on predicting hourly water levels based on CEEMDAN and LSTM. Water 2023, 15, 3190.
24. Wang, H. Weather temperature prediction based on LSTM and transformer. In Proceedings of the International Conference on Electronics, Electrical and Information Engineering (ICEEIE 2024), Bangkok, Thailand, 16–18 August 2024; SPIE: Bellingham, WA, USA, 2024; Volume 13445, pp. 206–214.
25. Sezer, O.B.; Gudelek, M.U.; Ozbayoglu, A.M. Financial time series forecasting with deep learning: A systematic literature review: 2005–2019. Appl. Soft Comput. 2020, 90, 106181.
26. Zheng, J.; Huang, M. Traffic flow forecast through time series analysis based on deep learning. IEEE Access 2020, 8, 82562–82570.
27. Jaseena, K.; Kovoor, B.C. Deterministic weather forecasting models based on intelligent predictors: A survey. J. King Saud Univ.-Comput. Inf. Sci. 2022, 34, 3393–3412.
28. Xiao, J.; Zhou, Z. Research progress of RNN language model. In Proceedings of the 2020 IEEE International Conference on Artificial Intelligence and Computer Applications (ICAICA), Dalian, China, 27–29 June 2020; IEEE: New York, NY, USA, 2020; pp. 1285–1288.
29. Hewamalage, H.; Bergmeir, C.; Bandara, K. Recurrent neural networks for time series forecasting: Current status and future directions. Int. J. Forecast. 2021, 37, 388–427.
30. Saini, U.; Kumar, R.; Jain, V.; Krishnajith, M. Univariant time series forecasting of agriculture load by using LSTM and GRU RNNs. In Proceedings of the 2020 IEEE Students Conference on Engineering & Systems (SCES), Prayagraj, India, 10–12 July 2020; IEEE: New York, NY, USA, 2020; pp. 1–6.
31. Casado-Vara, R.; Martin del Rey, A.; Pérez-Palau, D.; de-la Fuente-Valentín, L.; Corchado, J.M. Web traffic time series forecasting using LSTM neural networks with distributed asynchronous training. Mathematics 2021, 9, 421.
32. Amalou, I.; Mouhni, N.; Abdali, A. Multivariate time series prediction by RNN architectures for energy consumption forecasting. Energy Rep. 2022, 8, 1084–1091.
33. Lin, S.; Lin, W.; Wu, W.; Zhao, F.; Mo, R.; Zhang, H. SegRNN: Segment recurrent neural network for long-term time series forecasting. arXiv 2023, arXiv:2308.11200.
34. Wen, Q.; Zhou, T.; Zhang, C.; Chen, W.; Ma, Z.; Yan, J.; Sun, L. Transformers in time series: A survey. arXiv 2022, arXiv:2202.07125.
35. Ren, H.; Dai, H.; Dai, Z.; Yang, M.; Leskovec, J.; Schuurmans, D.; Dai, B. Combiner: Full attention transformer with sparse computation cost. Adv. Neural Inf. Process. Syst. 2021, 34, 22470–22482.
36. Nie, Y.; Nguyen, N.H.; Sinthong, P.; Kalagnanam, J. A time series is worth 64 words: Long-term forecasting with transformers. arXiv 2022, arXiv:2211.14730.
37. Liu, Y.; Hu, T.; Zhang, H.; Wu, H.; Wang, S.; Ma, L.; Long, M. iTransformer: Inverted transformers are effective for time series forecasting. arXiv 2023, arXiv:2310.06625.
38. Wu, H.; Xu, J.; Wang, J.; Long, M. Autoformer: Decomposition transformers with auto-correlation for long-term series forecasting. Adv. Neural Inf. Process. Syst. 2021, 34, 22419–22430.
39. Liu, Y.; Wu, H.; Wang, J.; Long, M. Non-stationary transformers: Exploring the stationarity in time series forecasting. Adv. Neural Inf. Process. Syst. 2022, 35, 9881–9893.
40. Zeng, A.; Chen, M.; Zhang, L.; Xu, Q. Are transformers effective for time series forecasting? In Proceedings of the AAAI Conference on Artificial Intelligence, Washington, DC, USA, 7–14 February 2023; Volume 37, pp. 11121–11128.
41. Yun, K.S.; Lee, J.Y.; Timmermann, A.; Stein, K.; Stuecker, M.F.; Fyfe, J.C.; Chung, E.S. Increasing ENSO–rainfall variability due to changes in future tropical temperature–rainfall relationship. Commun. Earth Environ. 2021, 2, 43.
42. Wilby, R.L.; Wigley, T. Precipitation predictors for downscaling: Observed and general circulation model relationships. Int. J. Climatol. 2000, 20, 641–661.
43. Park, S.; Kwak, N. Analysis on the dropout effect in convolutional neural networks. In Proceedings of the Computer Vision–ACCV 2016: 13th Asian Conference on Computer Vision, Taipei, Taiwan, 20–24 November 2016; Revised Selected Papers, Part II 13. Springer: Cham, Switzerland, 2017; pp. 189–204.
44. Ye, T.; Dong, L.; Xia, Y.; Sun, Y.; Zhu, Y.; Huang, G.; Wei, F. Differential transformer. arXiv 2024, arXiv:2410.05258.
45. Laplante, P.A.; Cravey, R.; Dunleavy, L.P.; Antonakos, J.L.; LeRoy, R.; East, J.; Buris, N.E.; Conant, C.J.; Fryda, L.; Boyd, R.W.; et al. Comprehensive Dictionary of Electrical Engineering; CRC Press: Boca Raton, FL, USA, 2018.
46. Wang, H.; Wang, Z.; Niu, Y.; Liu, Z.; Li, H.; Liao, Y.; Huang, Y.; Liu, X. An accurate and interpretable framework for trustworthy process monitoring. IEEE Trans. Artif. Intell. 2023, 5, 2241–2252.
47. Zhao, L.; Shen, Y. Rethinking channel dependence for multivariate time series forecasting: Learning from leading indicators. arXiv 2024, arXiv:2401.17548.
48. Whang, S.E.; Lee, J.G. Data collection and quality challenges for deep learning. Proc. VLDB Endow. 2020, 13, 3429–3432.
49. Zhong, R.; Jun, S.; Xu, P. Analysis and de-noise of time series data from automatic weather station using chaos-based adaptive B-spline method. In Proceedings of the 2011 International Conference on Remote Sensing, Environment and Transportation Engineering, Nanjing, China, 24–26 June 2011; IEEE: New York, NY, USA, 2011; pp. 4765–4769.
50. Yang, R.; Hu, J.; Li, Z.; Mu, J.; Yu, T.; Xia, J.; Li, X.; Dasgupta, A.; Xiong, H. Interpretable machine learning for weather and climate prediction: A review. Atmos. Environ. 2024, 338, 120797.
| Model | Design | 96 MSE | 96 MAE | 192 MSE | 192 MAE | 336 MSE | 336 MAE | 720 MSE | 720 MAE | Avg. MSE | Avg. MAE |
|---|---|---|---|---|---|---|---|---|---|---|---|
| stackLSTM | Original | 0.298 | 0.318 | 0.387 | 0.373 | 0.501 | 0.459 | 0.548 | 0.495 | 0.434 | 0.411 |
| stackLSTM | +PCC | 0.221 | 0.266 | 0.242 | 0.289 | 0.285 | 0.314 | 0.369 | 0.376 | 0.279 | 0.311 |
| DLinear | Original | 0.207 | 0.233 | 0.244 | 0.269 | 0.286 | 0.306 | 0.345 | 0.354 | 0.271 | 0.290 |
| DLinear | +PCC | 0.152 | 0.197 | 0.196 | 0.239 | 0.243 | 0.276 | 0.304 | 0.318 | 0.224 | 0.258 |
| iTransformer | Original | 0.175 | 0.215 | 0.225 | 0.258 | 0.281 | 0.299 | 0.358 | 0.350 | 0.260 | 0.281 |
| iTransformer | +PCC | 0.159 | 0.204 | 0.208 | 0.248 | 0.267 | 0.291 | 0.347 | 0.344 | 0.245 | 0.272 |
| PatchTST | Original | 0.175 | 0.216 | 0.221 | 0.257 | 0.280 | 0.298 | 0.352 | 0.347 | 0.257 | 0.280 |
| PatchTST | +PCC | 0.157 | 0.205 | 0.207 | 0.249 | 0.270 | 0.293 | 0.345 | 0.344 | 0.245 | 0.273 |
| SegRNN | Original | 0.162 | 0.200 | 0.208 | 0.243 | 0.264 | 0.285 | 0.347 | 0.340 | 0.245 | 0.267 |
| SegRNN | +PCC | 0.144 | 0.189 | 0.189 | 0.233 | 0.238 | 0.273 | 0.299 | 0.317 | 0.218 | 0.253 |
| Backbone | Design | 48 | 96 | 192 | 336 | 720 |
|---|---|---|---|---|---|---|
| stackLSTM | Original | 0.326 | 0.298 | 0.326 | 0.322 | 0.349 |
| stackLSTM | +PCC | 0.236 | 0.221 | 0.213 | 0.195 | 0.228 |
| DLinear | Original | 0.230 | 0.207 | 0.190 | 0.177 | 0.168 |
| DLinear | +PCC | 0.188 | 0.152 | 0.145 | 0.143 | 0.142 |
| iTransformer | Original | 0.202 | 0.175 | 0.170 | 0.162 | 0.175 |
| iTransformer | +PCC | 0.181 | 0.159 | 0.154 | 0.151 | 0.159 |
| PatchTST | Original | 0.211 | 0.175 | 0.159 | 0.150 | 0.147 |
| PatchTST | +PCC | 0.188 | 0.157 | 0.150 | 0.146 | 0.145 |
| SegRNN | Original | 0.203 | 0.162 | 0.150 | 0.146 | 0.142 |
| SegRNN | +PCC | 0.169 | 0.144 | 0.139 | 0.138 | 0.138 |
| Design | Air Temperature MSE | Air Temperature MAE | Specific Humidity MSE | Specific Humidity MAE | Wind Velocity MSE | Wind Velocity MAE | Precipitation MSE | Precipitation MAE |
|---|---|---|---|---|---|---|---|---|
| Original | 0.099 | 0.223 | 0.097 | 0.212 | 0.001 | 0.021 | 0.067 | 0.055 |
| +PCC | 0.082 | 0.218 | 0.062 | 0.173 | 0.001 | 0.018 | 0.052 | 0.035 |
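For reference, the MSE and MAE figures reported throughout these tables are the standard metrics:

```python
import numpy as np

def mse(y_true, y_pred):
    """Mean squared error: mean of squared residuals."""
    return float(np.mean((y_true - y_pred) ** 2))

def mae(y_true, y_pred):
    """Mean absolute error: mean of absolute residuals."""
    return float(np.mean(np.abs(y_true - y_pred)))

y_true = np.array([0.0, 1.0, 2.0])
y_pred = np.array([0.0, 1.5, 1.5])
print(mse(y_true, y_pred))   # (0 + 0.25 + 0.25) / 3 ≈ 0.1667
print(mae(y_true, y_pred))   # (0 + 0.5 + 0.5) / 3 ≈ 0.3333
```

Because squaring magnifies large residuals, MSE penalizes the extreme-value errors common in weather data more heavily than MAE, which is why the tables report both.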
| Backbone | Design | Reduced Set MSE | Reduced Set MAE | Full Set MSE | Full Set MAE |
|---|---|---|---|---|---|
| SegRNN | Original | 0.219 | 0.208 | 0.162 | 0.200 |
| SegRNN | +PCC | 0.196 | 0.195 | 0.144 | 0.189 |
| DLinear | Original | 0.284 | 0.253 | 0.207 | 0.233 |
| DLinear | +PCC | 0.207 | 0.207 | 0.152 | 0.197 |
| Design | Operation | 96 | 192 | 336 | 720 |
|---|---|---|---|---|---|
| MCC | w/o | 0.149 | 0.193 | 0.241 | 0.302 |
| MCC | FC-only | 0.145 | 0.190 | 0.239 | 0.301 |
| SC | w/o | 0.146 | 0.196 | 0.255 | 0.332 |
| SC | FC-only | 0.148 | 0.192 | 0.245 | 0.312 |
| Diff-Rep | w/o | 0.146 | 0.192 | 0.243 | 0.307 |
| PCC | w/o | 0.162 | 0.208 | 0.264 | 0.340 |
| PCC | Invert | 0.147 | 0.192 | 0.244 | 0.309 |
| PCC | Full (✓) | 0.144 | 0.189 | 0.238 | 0.299 |
| Design | MSE (96) | MSE (720) | Time (s, 96) | Time (s, 720) | Memory (MB, 96) | Memory (MB, 720) |
|---|---|---|---|---|---|---|
| Original | 0.162 | 0.347 | 176 | 227 | 204.55 | 865.39 |
| +PCC | 0.144 | 0.299 | 214 | 219 | 204.47 | 868.00 |
| Design | MSE (96) | MSE (720) | Time (ms, 96) | Time (ms, 720) | Parameters (k, 96) | Parameters (k, 720) |
|---|---|---|---|---|---|---|
| SAMP + SC | 0.152 | 0.304 | 0.325 | 0.401 | 86.5 | 246.3 |
| LIFT + SC | 0.149 | 0.303 | 0.348 | 0.455 | 1694.7 | 2738.2 |
| saPCC | 0.147 | 0.301 | 0.311 | 0.411 | 112.5 | 3173.9 |
| PCC | 0.144 | 0.299 | 0.297 | 0.375 | 35.6 | 195.4 |
| Noisy Input | MSE (0.1) | MAE (0.1) | MSE (0.3) | MAE (0.3) |
|---|---|---|---|---|
| SegRNN | 0.167 | 0.211 | 0.176 | 0.228 |
| SegRNN + PCC | 0.149 | 0.198 | 0.159 | 0.210 |

| Noisy Sequence | MSE (0.1) | MAE (0.1) | MSE (0.3) | MAE (0.3) |
|---|---|---|---|---|
| SegRNN | 0.178 | 0.240 | 0.267 | 0.357 |
| SegRNN + PCC | 0.161 | 0.228 | 0.250 | 0.346 |
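One plausible reading of the 0.1 / 0.3 noise settings is zero-mean Gaussian perturbation of the series; the paper's exact protocol may differ, so this sketch is only illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def add_noise(x, level):
    """Perturb a series with zero-mean Gaussian noise whose standard
    deviation equals the noise level (an assumption, not the paper's spec)."""
    return x + rng.normal(scale=level, size=x.shape)

series = rng.normal(size=(96, 21))   # observation window, 21 variables
for level in (0.1, 0.3):
    noisy = add_noise(series, level)
    # Empirical std of the injected noise should be close to the level.
    print(level, round(float(np.std(noisy - series)), 2))
```

Under this reading, "Noisy Input" would perturb only the observation window fed to the model, while "Noisy Sequence" would perturb the entire series including the evaluation target.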
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Wang, Z.; Luo, Z.; Yang, Z.; Liu, Y. Post Constraint and Correction: A Plug-and-Play Module for Boosting the Performance of Deep Learning Based Weather Multivariate Time Series Forecasting. Appl. Sci. 2025, 15, 3935. https://doi.org/10.3390/app15073935