A Novel Hybrid Intelligent SOPDEL Model with Comprehensive Data Preprocessing for Long-Time-Series Climate Prediction
Abstract
1. Introduction
- (1) Empirical statistical method;
- (2) Mathematical statistical method;
- (3) Physical statistical method;
- (4) Dynamic mode;
- (5) Machine learning.
2. Materials and Methods
2.1. General Idea of This Paper
2.2. Data Acquisition by Remote Sensing
- (1) Temperature obtained by remote sensing: The meteorological satellite carries remote sensors that capture sensing images, while a sensing instrument performs inversion by measuring the range of thermal radiation. Various sensors are employed to observe far-infrared bands and obtain pixel values, as different components of the earth’s surface exhibit different radiation characteristics across bands. The pixel values are then converted into thermal infrared radiation values, and an appropriate mapping is established between radiation values and the earth’s surface temperature by using suitable models.
- (2) Rainfall obtained by remote sensing: The method can be divided into infrared remote sensing, passive-microwave remote sensing and active-microwave detection. Infrared remote sensing retrieves surface precipitation intensity by utilizing the empirical relationship between the cloud-top temperature and surface precipitation; generally, strong-precipitation clouds tend to have a lower cloud-top temperature. The widely used satellite-infrared-inversed precipitation data were developed on this principle by the Climate Prediction Center of the U.S. National Oceanic and Atmospheric Administration (NOAA). The GPI algorithm is more suitable for deep convective precipitation and performs poorly for stratiform precipitation. Passive microwave remote sensing employs two schemes for retrieving precipitation: the microwave-emission scheme and the scattering scheme. The microwave-emission scheme inverts the surface precipitation by observing low-frequency (e.g., 19 GHz) microwave radiation emitted by precipitation particles. The principle behind the scheme is that, against the lower radiation background, stronger precipitation and more liquid water particles in the cloud will increase the brightness temperature of upward radiation. This scheme has demonstrated good results over the ocean surface but not over land. In contrast, the microwave scattering scheme retrieves precipitation by utilizing the high-frequency (e.g., 85 GHz) signal of ice particles in the upper part of the cloud: the more ice particles there are, the lower the upward scattering brightness temperature and the stronger the surface precipitation. Although the microwave scattering scheme is more indirect than the emission scheme, it can be used to invert land-surface precipitation by establishing an empirical or semi-empirical relationship between the precipitation rate and the scattering signal according to observation.
- (3) Snowfall obtained by remote sensing: The method is the same as that used for rainfall; whether precipitation falls as rain or snow depends on the local temperature.
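The infrared retrieval principle described above can be sketched in two steps: converting a thermal-IR radiance to a brightness temperature through the inverse Planck function, and applying a GPI-style cold-cloud-fraction estimate. The 235 K threshold and 3 mm/h rate are the published GPI convention; the band wavenumber and all function names are illustrative, not the authors' implementation.

```python
import math

# Radiation constants for the inverse Planck function (wavenumber form).
C1 = 1.191042e-5   # mW / (m^2 sr cm^-4)
C2 = 1.4387752     # K cm

def brightness_temperature(radiance, wavenumber):
    """Convert a thermal-IR radiance (mW m^-2 sr^-1 (cm^-1)^-1) at a given
    wavenumber (cm^-1) into a brightness temperature (K)."""
    return C2 * wavenumber / math.log(1.0 + C1 * wavenumber ** 3 / radiance)

def gpi_rain_estimate(cloud_top_temps_k, hours, threshold_k=235.0, rate_mm_per_h=3.0):
    """GPI-style estimate: mean rainfall (mm) over a grid box is proportional
    to the fraction of pixels whose cloud-top temperature is below threshold."""
    cold = sum(1 for t in cloud_top_temps_k if t < threshold_k)
    fraction = cold / len(cloud_top_temps_k)
    return rate_mm_per_h * fraction * hours
```

For example, a grid box where half the pixels are colder than 235 K for one hour yields 1.5 mm, reflecting the empirical cloud-top-temperature relationship the text describes.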
2.3. PD-RS-PE Technology for Data Decomposition
Algorithm 1. PD-RS-PE.
Input: original climate data
Output: decomposed climate data
1. construct three figures with x-axis of month and y-axis of each climate variable, respectively
2. for each period of input in the figures do
3.   split the variables with 12 sub-coordinates into sub-regions
4. end for
5. for each sub-region in the figures do
6.   get the minimum value among the peaks
7.   get the maximum value among the valleys
8. end for
9. get the overall minimum peak value
10. get the overall maximum valley value
11. draw a line through the minimum peak value, perpendicular to the y-axis
12. draw a line through the maximum valley value, perpendicular to the y-axis
13. extract the data lying between the two lines and the data lying outside them
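Algorithm 1's peak/valley thresholding can be illustrated with a minimal sketch. The paper's per-period sub-region handling is not fully reproduced here; this assumes a simple neighbour-comparison definition of peaks and valleys and a two-way split of the series into a stationary band and an oscillating remainder.

```python
import numpy as np

def pd_rs_pe(series):
    """Sketch of a PD-RS-PE-style decomposition (assumed reading of Algorithm 1):
    the minimum peak and the maximum valley define two horizontal thresholds,
    and the series is split into a stationary part (inside the band) and an
    oscillating part (outside it)."""
    x = np.asarray(series, dtype=float)
    # Local peaks/valleys by comparing each point with its two neighbours.
    peaks = [x[i] for i in range(1, len(x) - 1) if x[i - 1] < x[i] > x[i + 1]]
    valleys = [x[i] for i in range(1, len(x) - 1) if x[i - 1] > x[i] < x[i + 1]]
    upper = min(peaks)    # minimum value among the peaks (line of step 11)
    lower = max(valleys)  # maximum value among the valleys (line of step 12)
    oscillating = x[(x > upper) | (x < lower)]
    stationary = x[(x <= upper) & (x >= lower)]
    return stationary, oscillating, lower, upper
```

On a toy series with alternating extrema, the values exceeding the lowest peak fall into the oscillating part and the rest into the stationary part, matching the ELM/LSTM split discussed later.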
2.4. Unsupervised Learning for Feature Extraction
Algorithm 2. DBN feature extraction.
Input: decomposed climate data
Output: decomposed-feature climate data
1. do unsupervised pre-training:
2.   construct a three-layer DBN → three stacked RBMs
3.   train each RBM using CD-k
4–6.   update the RBM weights and the visible and hidden biases by the CD-k rule
7. end do
8. do supervised regression-level training:
9.   set the input vector and the bias number
10.  add the activation function and the regularization coefficient
11.  set the weights of the test set with the bias layer
12.  compute the final output of the DBN network
13. end do
- (1) Unsupervised pre-training;
- (2) Supervised regression-level training.
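The CD-k step of Algorithm 2 can be illustrated for k = 1. This is a generic CD-1 RBM sketch, not the authors' implementation: the learning rate, epoch count, and the use of probabilities rather than samples in the negative phase are all assumptions. A DBN stacks several such RBMs, feeding each layer's hidden activations into the next.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_rbm_cd1(data, n_hidden, epochs=50, lr=0.05):
    """Train one RBM with one-step contrastive divergence (CD-1)."""
    n_visible = data.shape[1]
    W = 0.01 * rng.standard_normal((n_visible, n_hidden))
    b_v = np.zeros(n_visible)
    b_h = np.zeros(n_hidden)
    for _ in range(epochs):
        v0 = data
        p_h0 = sigmoid(v0 @ W + b_h)                       # positive phase
        h0 = (rng.random(p_h0.shape) < p_h0).astype(float)  # sample hidden units
        p_v1 = sigmoid(h0 @ W.T + b_v)                      # reconstruction
        p_h1 = sigmoid(p_v1 @ W + b_h)                      # negative phase
        # CD-1 gradient step on weights and both bias vectors.
        W += lr * (v0.T @ p_h0 - p_v1.T @ p_h1) / len(data)
        b_v += lr * (v0 - p_v1).mean(axis=0)
        b_h += lr * (p_h0 - p_h1).mean(axis=0)
    return W, b_v, b_h

def features(data, W, b_h):
    """Hidden-layer activations, used as the extracted features."""
    return sigmoid(data @ W + b_h)
```

The extracted features (one column per hidden unit) are what the downstream ELM or LSTM regressor consumes in place of the raw decomposed data.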
2.5. Three-Dimensional Input Conversion Technology for Data Dimensionality Upgrading in a Novel Spatiotemporal-Factor Matrix
2.6. Evolutionary Algorithm for Model Optimization
2.7. Proposed Climate Prediction System Named SS-OS-PSO-DBN-ELM-LSTME (SOPDEL)
2.7.1. PSO-DBN-ELME Algorithm
2.7.2. PSO-DBN-LSTME Algorithm
- (1) Forget gate: determines which information in the cell state needs to be discarded. It outputs a vector by viewing the information of h_{t−1} and x_t; each element of the vector, valued in 0~1, indicates how much of the cell state C_{t−1} is retained or discarded, where 0 means no reservation and 1 means full reservation.
- (2) Input gate: determines which information in the cell needs to be updated. Firstly, h_{t−1} and x_t are used to determine the information to be updated, and new candidate cell information C̃_t is then obtained through a tanh layer, which may be written into the cell state. To update the old cell information C_{t−1} to the new C_t, part of the old cell information is forgotten through the forget-gate selection, and C_t is obtained by adding the part of the candidate cell information admitted by the input-gate selection.
- (3) Output gate: determines which information in the cell needs to be output. The cell state is activated by the tanh function, while h_{t−1} and x_t enter a Sigmoid layer to produce the judging condition for the cell’s output state characteristics. The final output of the LSTM cell is obtained by multiplying this judging condition by the tanh-activated cell state.
2.7.3. SOPDEL Algorithm
2.8. Performance Evaluation Indices
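Assuming the indices reported in the result tables are R², RMSE and MAE (the R² differences quoted in the Conclusions match the tables' first numeric column exactly; the ordering RMSE ≥ MAE matches the second and third), they can be computed as:

```python
import math

def r2(actual, forecast):
    """Coefficient of determination."""
    mean_a = sum(actual) / len(actual)
    ss_res = sum((a - f) ** 2 for a, f in zip(actual, forecast))
    ss_tot = sum((a - mean_a) ** 2 for a in actual)
    return 1.0 - ss_res / ss_tot

def rmse(actual, forecast):
    """Root-mean-square error."""
    return math.sqrt(sum((a - f) ** 2 for a, f in zip(actual, forecast)) / len(actual))

def mae(actual, forecast):
    """Mean absolute error."""
    return sum(abs(a - f) for a, f in zip(actual, forecast)) / len(actual)
```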
3. Nine Climate Forecasting Models for Comparison and Verification
4. Results and Discussion
4.1. Comparative Analysis for Fitting Performance in Training Datasets between M1–M9 Models
4.2. Comparative Analysis for Predicting Performance in Testing Datasets between M1–M6 Models
4.3. Comparative Analysis for Fitting Performance in Training Datasets between M6–M9 Models
4.4. Comparative Analysis for Predicting Performance in Testing Datasets between M6–M9 Models
4.5. Predicting Performance for the Proposed Model
Evaluation system of temperature:
- (1) When the error is within ±0.2 °C, the prediction quality of the month is judged to be very prominent, expressed as “Excellent”;
- (2) When the error is within ±0.2~0.5 °C, the prediction quality of the month is judged to be good, expressed as “Good”;
- (3) When the error is within ±0.5~1 °C, the prediction quality of the month is judged to be medium, expressed as “Moderate”;
- (4) When the error is beyond ±1 °C, the prediction quality of the month is judged to be poor, expressed as “Bad”.
Evaluation system of rainfall:
- (1) When the error is within ±5 mm, the prediction quality of the month is judged to be very prominent, expressed as “Excellent”;
- (2) When the error is within ±5~10 mm, the prediction quality of the month is judged to be good, expressed as “Good”;
- (3) When the error is within ±10~20 mm, the prediction quality of the month is judged to be medium, expressed as “Moderate”;
- (4) When the error is beyond ±20 mm, the prediction quality of the month is judged to be poor, expressed as “Bad”.
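The grading bands above can be expressed directly. The treatment of exact boundary values (e.g., an error of exactly 0.5 °C) is an assumption, since the text does not state whether the intervals are closed.

```python
def temperature_quality(error_c):
    """Grade a monthly temperature prediction error (°C) per the paper's bands."""
    e = abs(error_c)
    if e <= 0.2:
        return "Excellent"
    if e <= 0.5:
        return "Good"
    if e <= 1.0:
        return "Moderate"
    return "Bad"

def rainfall_quality(error_mm):
    """Grade a monthly rainfall prediction error (mm) per the paper's bands."""
    e = abs(error_mm)
    if e <= 5:
        return "Excellent"
    if e <= 10:
        return "Good"
    if e <= 20:
        return "Moderate"
    return "Bad"
```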
5. Conclusions
- (1) Different machine learning methods suited to temporal data prediction perform better on specific types of datasets. Specifically, LSTM is better suited to predicting stationary data sequences, while an ELM is more appropriate for predicting oscillating data sequences. In the training datasets, the improvement of R² in M6 over M5 is 0.0034, 0.0109 and 0.0067 for the highest temperature, the lowest temperature and rainfall, respectively; in the testing datasets, it is 0.0018, 0.0040 and 0.0026. The case study illustrates the feasibility of the PD-RS-PE technology.
- (2) The construction of a 3D spatiotemporal-factor matrix enables the realization of data dimensionality upgrading. Its function is to reduce the disturbance caused by temporary climate mutations in the predicted data, thus enhancing the overall system’s robustness. As this method reduces the step size of data entry, it offers unique advantages for time-series data that require adequate training. In the training datasets, the improvement of R² in M2 over M1 is 0.0079, 0.0047 and 0.0352 for the three climate variables, respectively; in the testing datasets, it is 0.0095, 0.0167 and 0.0681. The case study demonstrates the feasibility of the three-dimensional input conversion technology.
- (3) A DBN has compatible interfaces with both an ELM and LSTM. As the amount of input information increases significantly after upgrading the data dimensionality, eliminating irrelevant information becomes increasingly critical. The feature extraction technique of the DBN can effectively assist the ELM and LSTM neural networks in learning more valuable information. In the training datasets, the improvement of R² in M3 over M2 is 0.0025, 0.0140 and 0.1628, respectively; in the testing datasets, it is 0.0030, 0.0319 and 0.1545. The case study demonstrates the feasibility of the DBN feature extraction.
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
Climate Data | Model | R² | RMSE | MAE | |
---|---|---|---|---|---|
The highest temperature | M6 | 0.9944 | 0.4463 | 0.3243 | 0.0007 |
M5 | 0.9910 | 0.5599 | 0.3438 | 0.0006 | |
M4 | 0.9742 | 0.9374 | 0.7041 | 0.0005 | |
M3 | 0.9718 | 0.9799 | 0.7647 | 0.0005 | |
M2 | 0.9693 | 1.0472 | 0.6684 | 0.0008 | |
M1 | 0.9614 | 1.1529 | 0.7971 | 0.0009 | |
The lowest temperature | M6 | 0.9900 | 0.4760 | 0.3599 | 0.0002 |
M5 | 0.9791 | 0.6888 | 0.4303 | 0.0006 | |
M4 | 0.9673 | 0.8625 | 0.6158 | 0.0019 | |
M3 | 0.9584 | 0.9710 | 0.7353 | 0.0017 | |
M2 | 0.9444 | 1.1374 | 0.7276 | 0.0026 | |
M1 | 0.9397 | 1.1818 | 0.7875 | 0.0027 | |
Rainfall | M6 | 0.9669 | 11.3331 | 8.2658 | 0.0001 |
M5 | 0.9602 | 12.4948 | 7.7168 | 0.0005 | |
M4 | 0.7880 | 28.4910 | 19.9793 | 0.0081 | |
M3 | 0.7239 | 32.5228 | 24.3261 | 0.0100 | |
M2 | 0.5611 | 41.2722 | 28.5648 | 0.0208 | |
M1 | 0.5259 | 42.8850 | 30.6940 | 0.0221 |
Climate Data | Model | R² | RMSE | MAE | |
---|---|---|---|---|---|
The highest temperature | M6 | 0.9990 | 0.1981 | 0.1632 | 0.0052 |
M5 | 0.9972 | 0.3312 | 0.2206 | 0.0032 | |
M4 | 0.9964 | 0.3836 | 0.3085 | 0.0126 | |
M3 | 0.9835 | 0.7985 | 0.5936 | 0.0092 | |
M2 | 0.9805 | 1.1110 | 0.8832 | 0.0932 | |
M1 | 0.9710 | 1.2862 | 1.0848 | 0.1051 | |
The lowest temperature | M6 | 0.9964 | 0.2986 | 0.2141 | 0.0004 |
M5 | 0.9924 | 0.4476 | 0.2515 | 0.0012 | |
M4 | 0.9890 | 0.5283 | 0.3814 | 0.0082 | |
M3 | 0.9633 | 0.9583 | 0.7018 | 0.0062 | |
M2 | 0.9314 | 1.8562 | 1.6509 | 0.3455 | |
M1 | 0.9147 | 1.9314 | 1.7136 | 0.3332 | |
Rainfall | M6 | 0.9934 | 5.9639 | 4.8788 | 0.0206 |
M5 | 0.9908 | 7.0205 | 3.6939 | 0.0327 | |
M4 | 0.9793 | 10.7810 | 8.5691 | 0.0484 | |
M3 | 0.8828 | 25.3663 | 19.0880 | 0.1493 | |
M2 | 0.7283 | 45.3143 | 35.5998 | 0.4868 | |
M1 | 0.6602 | 43.9278 | 33.6869 | 0.3867 |
Climate Data | Model | R² | RMSE | MAE | |
---|---|---|---|---|---|
The highest temperature | M6 | 0.9944 | 0.4463 | 0.3243 | 0.0007 |
M7 | 0.9817 | 0.9079 | 0.7281 | 0.0032 | |
M8 | 0.9895 | 1.5036 | 1.1899 | 0.0005 | |
M9 | 0.9889 | 1.4249 | 1.1040 | 0.0003 | |
The lowest temperature | M6 | 0.9900 | 0.4760 | 0.3599 | 0.0002 |
M7 | 0.9733 | 0.9271 | 0.7168 | 0.0063 | |
M8 | 0.9090 | 1.4749 | 1.1457 | 0.0021 | |
M9 | 0.9251 | 1.3080 | 0.9647 | 0.0038 | |
Rainfall | M6 | 0.9669 | 11.3331 | 8.2658 | 0.0001 |
M7 | 0.9998 | 1.0390 | 0.7868 | 0.0004 | |
M8 | 0.3783 | 51.2115 | 38.7237 | 0.0253 | |
M9 | 0.4600 | 46.8355 | 34.4975 | 0.0402 |
Climate Data | Model | R² | RMSE | MAE | |
---|---|---|---|---|---|
The highest temperature | M6 | 0.9990 | 0.1981 | 0.1632 | 0.0052 |
M7 | 0.8455 | 5.9297 | 5.4070 | 0.6155 | |
M8 | 0.9893 | 1.5866 | 1.2181 | 0.0972 | |
M9 | 0.9900 | 2.1328 | 1.8084 | 0.0601 | |
The lowest temperature | M6 | 0.9964 | 0.2986 | 0.2141 | 0.0004 |
M7 | 0.8383 | 2.2091 | 1.7702 | 0.2982 | |
M8 | 0.8762 | 1.7638 | 1.3624 | 0.1071 | |
M9 | 0.8549 | 2.2919 | 1.8396 | 0.0267 | |
Rainfall | M6 | 0.9934 | 5.9639 | 4.8788 | 0.0206 |
M7 | 0.0541 | 93.2919 | 79.0570 | 0.4900 | |
M8 | 0.4590 | 54.3568 | 43.6699 | 0.4365 | |
M9 | 0.6106 | 46.1647 | 35.2501 | 0.3776 |
Date | Highest Temp. Actual (°C) | Highest Temp. Forecast (°C) | Quality | Lowest Temp. Actual (°C) | Lowest Temp. Forecast (°C) | Quality | Rainfall Actual (mm) | Rainfall Forecast (mm) | Quality |
---|---|---|---|---|---|---|---|---|---|
December 2015 | 7.5 | 7.3398 | Excellent | 2.6 | 2.5070 | Excellent | 218.6 | 208.2459 | Good |
January 2016 | 7.7 | 7.5826 | Excellent | 1.6 | 1.8390 | Good | 167.2 | 159.2228 | Good |
February 2016 | 10.1 | 10.1042 | Excellent | 4.1 | 4.1217 | Excellent | 130.4 | 137.5251 | Good |
March 2016 | 11.6 | 11.5474 | Excellent | 5.1 | 5.0720 | Excellent | 161.6 | 160.6963 | Excellent |
April 2016 | 15.7 | 15.9185 | Good | 7.8 | 7.9100 | Excellent | 24.2 | 16.4077 | Good |
May 2016 | 18.5 | 18.6536 | Excellent | 10 | 10.1467 | Excellent | 51.6 | 51.5732 | Excellent |
June 2016 | 20.2 | 20.3909 | Excellent | 12.1 | 12.3117 | Good | 58.2 | 47.6714 | Moderate |
July 2016 | 22 | 22.2694 | Good | 14.6 | 14.5930 | Excellent | 32.8 | 36.4994 | Excellent |
August 2016 | 22 | 22.6675 | Moderate | 14.5 | 14.3521 | Excellent | 13.8 | 13.3474 | Excellent |
September 2016 | 18 | 18.2240 | Good | 10.4 | 10.6100 | Good | 78.4 | 66.4883 | Good |
October 2016 | 13.7 | 13.8064 | Excellent | 8.3 | 8.4452 | Excellent | 203.4 | 198.4994 | Excellent |
November 2016 | 11.8 | 11.5107 | Good | 7.6 | 7.0967 | Moderate | 240.2 | 241.4955 | Excellent |
December 2016 | 3.7 | 3.7118 | Excellent | −1.9 | −1.7218 | Excellent | 117.8 | 112.8448 | Excellent |
January 2017 | 5.1 | 5.1253 | Excellent | −1.3 | −1.2987 | Excellent | 98.8 | 99.7717 | Excellent |
February 2017 | 6.2 | 6.5295 | Good | −0.1 | 0.1312 | Excellent | 78.8 | 75.8139 | Excellent |
March 2017 | 9.5 | 9.4452 | Excellent | 4.2 | 4.1198 | Excellent | 209.8 | 207.2923 | Excellent |
April 2017 | 12.6 | 12.4872 | Excellent | 6.5 | 6.2520 | Good | 124.2 | 122.0196 | Excellent |
May 2017 | 17.1 | 16.9661 | Excellent | 8.9 | 8.8326 | Excellent | 102 | 96.3386 | Good |
June 2017 | 19.7 | 19.7449 | Excellent | 11.6 | 11.6155 | Excellent | 45.2 | 41.6002 | Excellent |
July 2017 | 22.9 | 23.0222 | Excellent | 13.8 | 13.9241 | Excellent | 1.8 | 5.1026 | Excellent |
August 2017 | 23.3 | 23.4255 | Excellent | 14.2 | 14.2678 | Excellent | 5 | 7.3658 | Excellent |
September 2017 | 19.9 | 20.0255 | Excellent | 11.6 | 11.7353 | Excellent | 29.4 | 29.8482 | Excellent |
October 2017 | 13.4 | 13.2980 | Excellent | 5.9 | 6.1782 | Good | 114.3 | 128.4054 | Moderate |
November 2017 | 9.2 | 9.2724 | Excellent | 7.6 | 7.1276 | Good | 212 | 214.0281 | Excellent |
December 2017 | 5.1 | 4.9846 | Excellent | −0.3 | −0.2994 | Excellent | 151.6 | 149.0540 | Excellent |
January 2018 | 7.5 | 7.7038 | Good | 3.1 | 3.4227 | Good | 249.4 | 247.1281 | Excellent |
February 2018 | 6.2 | 6.1890 | Excellent | 0.6 | 0.4597 | Excellent | 89.8 | 98.0948 | Good |
March 2018 | 9.5 | 9.7942 | Good | 2.6 | 2.8414 | Good | 110.2 | 106.2041 | Excellent |
April 2018 | 12.8 | 12.7656 | Excellent | 5.9 | 5.8745 | Excellent | 134.4 | 139.7440 | Good |
May 2018 | 19.3 | 19.4378 | Excellent | 10.6 | 10.7660 | Excellent | 2 | 9.8711 | Good |
June 2018 | 19.9 | 19.7199 | Excellent | 12.2 | 11.3656 | Good | 39.4 | 52.8009 | Moderate |
July 2018 | 24.1 | 24.3585 | Good | 14.2 | 14.5827 | Good | 4.4 | 6.9913 | Excellent |
August 2018 | 22.8 | 22.8814 | Excellent | 13.8 | 13.7947 | Excellent | 16.2 | 12.2030 | Excellent |
September 2018 | 18.2 | 18.3292 | Excellent | 10.8 | 10.9668 | Excellent | 117.4 | 110.7052 | Good |
October 2018 | 13.7 | 13.8661 | Excellent | 5.7 | 6.4413 | Moderate | 131.2 | 119.9685 | Moderate |
November 2018 | 10.3 | 10.6193 | Good | 7.5 | 6.9648 | Moderate | 179.2 | 180.5537 | Excellent |
December 2018 | 7.7 | 7.6093 | Excellent | 2 | 2.3930 | Good | 251.8 | 247.2106 | Excellent |
January 2019 | 7.9 | 8.1561 | Good | 2 | 2.2026 | Good | 140.8 | 135.3906 | Excellent |
February 2019 | 3.9 | 3.7840 | Excellent | −3.1 | −3.3859 | Good | 43.4 | 41.1272 | Excellent |
March 2019 | 10.9 | 10.3358 | Moderate | 1.5 | 1.5063 | Excellent | 31.2 | 35.4759 | Excellent |
April 2019 | 13.2 | 13.1458 | Excellent | 6 | 5.9137 | Excellent | 110.8 | 103.8416 | Good |
May 2019 | 18.5 | 18.3757 | Excellent | 10 | 10.0341 | Excellent | 30.4 | 38.5369 | Good |
June 2019 | 21.2 | 21.3732 | Excellent | 11.9 | 12.3973 | Good | 26.2 | 18.7868 | Good |
July 2019 | 22.8 | 22.4856 | Good | 14.2 | 14.0893 | Excellent | 30.8 | 34.0751 | Excellent |
August 2019 | 23 | 22.7455 | Good | 14.2 | 14.0337 | Excellent | 25.8 | 30.1011 | Excellent |
September 2019 | 18.6 | 18.4169 | Excellent | 11.8 | 11.6109 | Excellent | 122.2 | 124.5298 | Excellent |
October 2019 | 12 | 11.8299 | Excellent | 4.9 | 5.2675 | Good | 122.6 | 118.3732 | Excellent |
November 2019 | 9.4 | 9.7790 | Good | 7.6 | 6.7031 | Moderate | 92.1 | 96.2326 | Excellent |
December 2019 | 7.5 | 7.6247 | Excellent | 3.6 | 3.6619 | Excellent | 157.8 | 156.9835 | Excellent |
January 2020 | 7 | 7.0094 | Excellent | 2.2 | 2.2843 | Excellent | 226 | 230.1365 | Excellent |
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Zhou, Z.; Tang, W.; Li, M.; Cao, W.; Yuan, Z. A Novel Hybrid Intelligent SOPDEL Model with Comprehensive Data Preprocessing for Long-Time-Series Climate Prediction. Remote Sens. 2023, 15, 1951. https://doi.org/10.3390/rs15071951