A Robust Hybrid Deep Learning Model for Spatiotemporal Image Fusion
Abstract
1. Introduction
2. Materials and Methods
2.1. Study Area
2.2. Satellite Data
2.3. Simulation Data
2.4. Additional Test Sites
2.5. Hybrid Deep Learning Model
2.5.1. Hybrid Deep Learning Model: SRCNN
2.5.2. Hybrid Deep Learning Model: LSTM
2.5.3. Implementation of the Hybrid Deep Learning Model
2.6. Design of Phenological Change Scenarios
2.7. Benchmark Fusion Models
2.8. Accuracy Assessment
3. Results
3.1. Scenarios of Phenological Changes
3.2. Fusion Results of Hybrid Deep Learning Model
3.2.1. Simulation Data
3.2.2. Satellite Data
3.3. Comparison with Benchmark Models
3.3.1. Comparison Results of Simulation Data
3.3.2. Comparison Results of Satellite Data
3.4. Fusion Results in Additional Test Sites
4. Discussion
4.1. Strengths and Limitations of Hybrid Deep Learning Model
4.2. An Innovative Approach to Evaluating Model Performance under Temporal Changes
4.3. Future Satellite Missions and Data Fusion
5. Conclusions
Supplementary Materials
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
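The outline above names SRCNN (Section 2.5.1) and LSTM (Section 2.5.2) as the two components of the hybrid fusion model. For orientation only, the sketch below shows the canonical three-layer SRCNN of Dong et al. (2016), with its standard 9-1-5 kernel sizes and 64/32 filter counts, in PyTorch. The number of spectral bands and the way the paper couples this spatial module to the temporal LSTM (Section 2.5.3) are not recoverable from this outline, so both are assumptions here.

```python
import torch
import torch.nn as nn

class SRCNN(nn.Module):
    """Canonical three-layer SRCNN (Dong et al., 2016).

    Maps a coarse image, already resampled to the fine grid, to a fine-scale
    prediction. The 6-band input is an assumption (e.g., Landsat reflectance);
    the paper's exact adaptation and its coupling to the LSTM are not shown here.
    """
    def __init__(self, bands: int = 6):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(bands, 64, kernel_size=9, padding=4),  # patch extraction
            nn.ReLU(),
            nn.Conv2d(64, 32, kernel_size=1),                # non-linear mapping
            nn.ReLU(),
            nn.Conv2d(32, bands, kernel_size=5, padding=2),  # reconstruction
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, bands, H, W), coarse image upsampled to the fine grid
        return self.net(x)

# Shape check: SRCNN()(torch.rand(1, 6, 64, 64)) -> (1, 6, 64, 64)
```

A pixel-wise `nn.LSTM` over the image time series would supply the temporal component of the hybrid model, but its configuration cannot be inferred from this outline.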
SAM and ERGAS of the fusion results on the simulation data, by phenological change scenario.

| Scenario | STARFM (SAM) | FSDAF (SAM) | STFDCNN (SAM) | Proposed (SAM) | STARFM (ERGAS) | FSDAF (ERGAS) | STFDCNN (ERGAS) | Proposed (ERGAS) |
|---|---|---|---|---|---|---|---|---|
| Rapid | 0.0446 | 0.0584 | 0.0591 | 0.0243 | 0.9518 | 0.9273 | 1.6172 | 0.3027 |
| Moderate | 0.0302 | 0.0256 | 0.0216 | 0.0221 | 0.4623 | 0.2940 | 0.3494 | 0.2721 |
| Minimal | 0.0105 | 0.0114 | 0.0203 | 0.0230 | 0.1404 | 0.1405 | 0.1200 | 0.2694 |
| Overall | 0.0323 | 0.0380 | 0.0389 | 0.0234 | 0.6169 | 0.5559 | 0.8920 | 0.2859 |
SAM and ERGAS of the fusion results on the real satellite data, by phenological change scenario.

| Scenario | STARFM (SAM) | FSDAF (SAM) | STFDCNN (SAM) | Proposed (SAM) | STARFM (ERGAS) | FSDAF (ERGAS) | STFDCNN (ERGAS) | Proposed (ERGAS) |
|---|---|---|---|---|---|---|---|---|
| Rapid | 0.1321 | 0.1309 | 0.1451 | 0.1044 | 2.2124 | 2.2268 | 2.4626 | 2.1306 |
| Moderate | 0.1179 | 0.1084 | 0.1111 | 0.1043 | 2.1063 | 1.8819 | 1.8824 | 1.4834 |
| Overall | 0.1292 | 0.1264 | 0.1383 | 0.1044 | 2.1912 | 2.1578 | 2.3466 | 2.0012 |
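The tables above report SAM (spectral angle mapper) and ERGAS (erreur relative globale adimensionnelle de synthèse), two standard fusion-accuracy metrics: SAM is the mean per-pixel spectral angle between the fused and reference images, and ERGAS is Wald's band-normalized relative global error. Lower is better for both. A minimal NumPy sketch of the standard definitions follows; the fine-to-coarse pixel-size ratio (30 m Landsat over 500 m MODIS) is an assumption, and the paper's exact implementation is not shown in this excerpt.

```python
import numpy as np

def sam(pred: np.ndarray, ref: np.ndarray, eps: float = 1e-12) -> float:
    """Mean spectral angle (radians) between images shaped (bands, H, W)."""
    p = pred.reshape(pred.shape[0], -1)  # (bands, pixels)
    r = ref.reshape(ref.shape[0], -1)
    cos = (p * r).sum(axis=0) / (np.linalg.norm(p, axis=0) * np.linalg.norm(r, axis=0) + eps)
    return float(np.arccos(np.clip(cos, -1.0, 1.0)).mean())

def ergas(pred: np.ndarray, ref: np.ndarray, ratio: float = 30.0 / 500.0) -> float:
    """ERGAS (Wald): 100 * (h/l) * sqrt(mean_b (RMSE_b / mean_b)^2).

    `ratio` is the fine/coarse pixel-size ratio; 30 m / 500 m is an assumption.
    """
    bands = pred.shape[0]
    rmse = np.sqrt(((pred - ref) ** 2).reshape(bands, -1).mean(axis=1))
    mu = ref.reshape(bands, -1).mean(axis=1)
    return float(100.0 * ratio * np.sqrt(((rmse / mu) ** 2).mean()))
```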
Statistical significance of the accuracy differences between the hybrid model and each benchmark model.

| Hybrid Model vs. | STARFM | FSDAF | STFDCNN |
|---|---|---|---|
| SAM | p < 0.001 | p < 0.001 | p < 0.001 |
| ERGAS | p = 0.022 | p = 0.011 | p < 0.001 |
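The p-values in the table above indicate whether the hybrid model's accuracy gains over each benchmark are statistically significant. Such p-values are typically produced by a paired test over matched accuracy scores. The sketch below assumes a Wilcoxon signed-rank test and uses hypothetical score vectors purely for illustration; the paper's actual test and data pairing are not identifiable from this excerpt.

```python
import numpy as np
from scipy.stats import wilcoxon

# Hypothetical paired SAM scores (one per test image) for the hybrid model
# and one benchmark; real values would come from the fusion experiments.
sam_hybrid = np.array([0.104, 0.101, 0.110, 0.098, 0.107])
sam_starfm = np.array([0.132, 0.128, 0.140, 0.125, 0.135])

# Paired, nonparametric comparison of the two models' per-image scores.
stat, p_value = wilcoxon(sam_hybrid, sam_starfm)
print(f"Wilcoxon signed-rank: statistic={stat}, p={p_value:.4f}")
```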
SAM and ERGAS of the fusion results at the three additional test sites, by phenological change scenario.

| Site | Scenario | STARFM (SAM) | FSDAF (SAM) | STFDCNN (SAM) | Proposed (SAM) | STARFM (ERGAS) | FSDAF (ERGAS) | STFDCNN (ERGAS) | Proposed (ERGAS) |
|---|---|---|---|---|---|---|---|---|---|
| Oklahoma | Rapid | 0.1256 | 0.1123 | 0.1143 | 0.0990 | 1.9885 | 1.8825 | 1.9133 | 1.8506 |
| Oklahoma | Moderate | 0.0956 | 0.0786 | 0.0784 | 0.0693 | 1.4872 | 1.3209 | 1.3065 | 1.2558 |
| Oklahoma | Overall | 0.1136 | 0.0988 | 0.0999 | 0.0871 | 1.7880 | 1.6578 | 1.6706 | 1.6127 |
| Chicago | Rapid | 0.1400 | 0.1276 | 0.1360 | 0.1281 | 2.8813 | 2.8281 | 2.8452 | 2.7232 |
| Chicago | Moderate | 0.1077 | 0.1140 | 0.1131 | 0.1109 | 2.3977 | 2.3196 | 3.1150 | 2.2776 |
| Chicago | Overall | 0.1206 | 0.1194 | 0.1222 | 0.1178 | 2.5911 | 2.5230 | 3.0071 | 2.4559 |
| Harvard Forest | Rapid/Overall | 0.1228 | 0.1224 | 0.1408 | 0.1305 | 2.2142 | 2.2768 | 2.4100 | 2.2300 |