Spatiotemporal Image Fusion in Remote Sensing
Abstract
1. Introduction
2. Fusion Methods to Increase Spatiotemporal Resolution of Satellite Images
2.1. Reconstruction-Based Spatiotemporal Image Fusion Methods
2.2. Learning-Based Spatiotemporal Image Fusion Methods
2.3. Unmixing-Based Spatiotemporal Image Fusion Methods
3. Synthesis: Challenges and Opportunities
3.1. Other Advanced Methods for Spatiotemporal Image Fusion
3.2. Increasing the Resolution of Various Satellite-Derived Data Products
3.3. Methods to Increase Spatiotemporal-Spectral Resolution of Images
3.4. Quality Assessment of Spatiotemporal Blended Images
3.5. Spatiotemporal Image Fusion Methods for Sentinel Images
3.6. Important Data Pre-processing Issues to be Considered when Fusing Spatiotemporal Images
- Spectral responses of input images have to be unified: Reconstruction-based and unmixing-based spatiotemporal image fusion methods assume that the input images carry similar spectral information. Their applicability is therefore limited when the sensors have different spectral responses. When blending information from different remote sensing data sources, the input images have to be spectrally normalized to common wavebands [70]; a band-adjustment sketch follows this list. According to Pinty et al. [127], the lack of matching wavebands has little impact on the fusion results when physically-based reflectance methods are used to blend the surface reflectance of the input images. Machine-learning-based spatiotemporal image fusion methods, on the other hand, are less sensitive to differences between the spectral responses of the input images.
- Co-registration of multi-source input images: Accurate alignment of multi-source images is essential before they are fused. For example, misalignments of several pixels have been reported between Landsat and Sentinel-2 images and need to be addressed carefully when these two inputs are fused [125]. Further work on automatic solutions for image alignment is still needed [70]; a registration sketch follows this list.
- Atmospheric corrections: The radiometric consistency of the multi-source images to be fused may vary because of clouds and haze, or because of differences in illumination and acquisition angles [88]. Input images therefore have to be radiometrically corrected before fusion [70], using one of the existing correction techniques such as the MODerate resolution atmospheric TRANsmission code (MODTRAN) [128]. These techniques fall into two categories, namely absolute and relative. Absolute techniques require information on the sensor spectral profile for sensor calibration and for correcting images for atmospheric effects [129]. Relative radiometric techniques rely either on landscape elements whose reflectance remains constant over time [130,131] or on normalization using regression [132,133]; a normalization sketch follows this list.
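The following minimal sketch illustrates the band-adjustment step mentioned in the first item above: a per-band linear (gain/offset) transform estimated from coincident, co-located reflectance samples of the two sensors. The function names, the array layout, and the use of ordinary least squares are illustrative assumptions rather than a procedure prescribed by the cited studies.

```python
import numpy as np

def fit_bandpass_adjustment(src_band, ref_band):
    """Fit gain and offset so that reflectance from the source sensor band
    (e.g., MODIS red) matches the reference sensor band (e.g., Landsat red
    aggregated to the coarse grid). Inputs are 1-D arrays of co-located,
    cloud-free reflectance samples acquired close in time."""
    valid = np.isfinite(src_band) & np.isfinite(ref_band)
    gain, offset = np.polyfit(src_band[valid], ref_band[valid], deg=1)
    return gain, offset

def apply_bandpass_adjustment(band, gain, offset):
    """Apply the fitted linear transform to a full source band."""
    return gain * band + offset
```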
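For the co-registration item, the sketch below estimates and removes a single global sub-pixel shift with phase correlation [146], using scikit-image and SciPy. It assumes that both bands already share the same grid spacing and that one translation describes the misalignment, which will not hold for more complex geometric distortions.

```python
from scipy.ndimage import shift as nd_shift
from skimage.registration import phase_cross_correlation

def coregister(reference, moving, upsample_factor=100):
    """Estimate the (row, col) shift that aligns `moving` with `reference`
    to sub-pixel precision, then resample `moving` onto the reference grid
    with bilinear interpolation."""
    offset, error, _ = phase_cross_correlation(
        reference, moving, upsample_factor=upsample_factor
    )
    aligned = nd_shift(moving, shift=offset, order=1, mode="nearest")
    return aligned, offset
```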
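For the last item, the sketch below applies relative radiometric normalization by regression [132,133], using the pixels whose reflectance changes least between the two dates as pseudo-invariant features [130,131]. The 2% selection threshold and the linear model are illustrative assumptions.

```python
import numpy as np

def relative_normalization(subject, reference, invariant_fraction=0.02):
    """Normalize `subject` to the radiometry of `reference` (same band and
    grid, different acquisition dates) using the pixels with the smallest
    absolute temporal difference as pseudo-invariant features."""
    diff = np.abs(subject - reference)
    valid = np.isfinite(diff)
    threshold = np.nanquantile(diff, invariant_fraction)
    pif = valid & (diff <= threshold)  # pseudo-invariant features
    gain, offset = np.polyfit(subject[pif], reference[pif], deg=1)
    return gain * subject + offset
```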
3.7. Future Directions
4. Conclusions
Author Contributions
Conflicts of Interest
References
- Pohl, C.; Van Genderen, J. Remote sensing image fusion: An update in the context of digital earth. Int. J. Digit. Earth 2014, 7, 158–172. [Google Scholar] [CrossRef]
- Ehlers, M.; Klonus, S.; Johan Åstrand, P.; Rosso, P. Multi-sensor image fusion for pansharpening in remote sensing. Int. J. Image Data Fusion 2010, 1, 25–45. [Google Scholar] [CrossRef]
- Zhang, J. Multi-source remote sensing data fusion: Status and trends. Int. J. Image Data Fusion 2010, 1, 5–24. [Google Scholar] [CrossRef]
- Wu, W.; Yang, J.; Kang, T. Study of remote sensing image fusion and its application in image classification. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2008, 37, 1141–1146. [Google Scholar]
- Pohl, C.; Van Genderen, J.L. Multisensor image fusion in remote sensing: Concepts, methods and applications. Int. J. Remote Sens. 1998, 19, 823–854. [Google Scholar] [CrossRef]
- Van Genderen, J.; Pohl, C. Image fusion: Issues, techniques and applications. In Proceedings of the EARSeL Workshop on Intelligent Image Fusion, Strasbourg, France, 11 September 1994; pp. 18–26. [Google Scholar]
- Huang, B.; Song, H. Spatiotemporal reflectance fusion via sparse representation. IEEE Trans. Geosci. Remote Sens. 2012, 50, 3707–3716. [Google Scholar] [CrossRef]
- Roy, D.P.; Wulder, M.; Loveland, T.R.; Woodcock, C.; Allen, R.; Anderson, M.; Helder, D.; Irons, J.; Johnson, D.; Kennedy, R. Landsat-8: Science and product vision for terrestrial global change research. Remote Sens. Environ. 2014, 145, 154–172. [Google Scholar] [CrossRef]
- Mizuochi, H.; Hiyama, T.; Ohta, T.; Nasahara, K. Evaluation of the surface water distribution in north-central namibia based on modis and amsr series. Remote Sens. 2014, 6, 7660–7682. [Google Scholar] [CrossRef]
- Gao, F.; Hilker, T.; Zhu, X.; Anderson, M.; Masek, J.; Wang, P.; Yang, Y. Fusing landsat and modis data for vegetation monitoring. IEEE Geosci. Remote Sens. Mag. 2015, 3, 47–60. [Google Scholar] [CrossRef]
- Ghassemian, H. A review of remote sensing image fusion methods. Inf. Fusion 2016, 32, 75–89. [Google Scholar] [CrossRef]
- Zhu, X.; Helmer, E.H.; Gao, F.; Liu, D.; Chen, J.; Lefsky, M.A. A flexible spatiotemporal method for fusing satellite images with different resolutions. Remote Sens. Environ. 2016, 172, 165–177. [Google Scholar] [CrossRef]
- Atkinson, P.M. Downscaling in remote sensing. Int. J. Appl. Earth Obs. Geoinf. 2013, 22, 106–114. [Google Scholar] [CrossRef]
- Thomas, C.; Ranchin, T.; Wald, L.; Chanussot, J. Synthesis of multispectral images to high spatial resolution: A critical review of fusion methods based on remote sensing physics. IEEE Trans. Geosci. Remote Sens. 2008, 46, 1301–1312. [Google Scholar] [CrossRef]
- Zhan, W.; Chen, Y.; Zhou, J.; Wang, J.; Liu, W.; Voogt, J.; Zhu, X.; Quan, J.; Li, J. Disaggregation of remotely sensed land surface temperature: Literature survey, taxonomy, issues, and caveats. Remote Sens. Environ. 2013, 131, 119–139. [Google Scholar] [CrossRef]
- Zhu, X.; Cai, F.; Tian, J.; Williams, T. Spatiotemporal fusion of multisource remote sensing data: Literature survey, taxonomy, principles, applications, and future directions. Remote Sens. 2018, 10, 527. [Google Scholar]
- Reiche, J.; Verbesselt, J.; Hoekman, D.; Herold, M. Fusing landsat and sar time series to detect deforestation in the tropics. Remote Sens. Environ. 2015, 156, 276–293. [Google Scholar] [CrossRef]
- Racault, M.-F.; Sathyendranath, S.; Platt, T. Impact of missing data on the estimation of ecological indicators from satellite ocean-colour time-series. Remote Sens. Environ. 2014, 152, 15–28. [Google Scholar] [CrossRef]
- Honaker, J.; King, G. What to do about missing values in time-series cross-section data. Am. J. Political Sci. 2010, 54, 561–581. [Google Scholar] [CrossRef]
- Dunsmuir, W.; Robinson, P. Estimation of time series models in the presence of missing data. J. Am. Stat. Assoc. 1981, 76, 560–568. [Google Scholar] [CrossRef]
- Schmitt, M.; Zhu, X.X. Data fusion and remote sensing: An ever-growing relationship. IEEE Geosci. Remote Sens. Mag. 2016, 4, 6–23. [Google Scholar] [CrossRef]
- Amarsaikhan, D.; Blotevogel, H.H.; van Genderen, J.L.; Ganzorig, M.; Gantuya, R.; Nergui, B. Fusing high-resolution sar and optical imagery for improved urban land cover study and classification. Int. J. Image Data Fusion 2010, 1, 83–97. [Google Scholar] [CrossRef]
- Erasmi, S.; Twele, A. Regional land cover mapping in the humid tropics using combined optical and sar satellite data—A case study from central Sulawesi, Indonesia. Int. J. Remote Sens. 2009, 30, 2465–2478. [Google Scholar] [CrossRef]
- Reiche, J.; Souza, C.M.; Hoekman, D.H.; Verbesselt, J.; Persaud, H.; Herold, M. Feature level fusion of multi-temporal alos palsar and landsat data for mapping and monitoring of tropical deforestation and forest degradation. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2013, 6, 2159–2173. [Google Scholar] [CrossRef]
- Lehmann, E.A.; Caccetta, P.A.; Zhou, Z.-S.; McNeill, S.J.; Wu, X.; Mitchell, A.L. Joint processing of landsat and alos-palsar data for forest mapping and monitoring. IEEE Trans. Geosci. Remote Sens. 2012, 50, 55–67. [Google Scholar] [CrossRef]
- Kim, J.; Hogue, T.S. Improving spatial soil moisture representation through integration of amsr-e and modis products. IEEE Trans. Geosci. Remote Sens. 2012, 50, 446–460. [Google Scholar] [CrossRef]
- Mizuochi, H.; Hiyama, T.; Ohta, T.; Fujioka, Y.; Kambatuku, J.R.; Iijima, M.; Nasahara, K.N. Development and evaluation of a lookup-table-based approach to data fusion for seasonal wetlands monitoring: An integrated use of amsr series, modis, and landsat. Remote Sens. Environ. 2017, 199, 370–388. [Google Scholar] [CrossRef]
- Kou, X.; Jiang, L.; Bo, Y.; Yan, S.; Chai, L. Estimation of land surface temperature through blending modis and amsr-e data with the bayesian maximum entropy method. Remote Sens. 2016, 8, 105. [Google Scholar] [CrossRef]
- Schmitt, M.; Tupin, F.; Zhu, X.X. Fusion of sar and optical remote sensing data—Challenges and recent trends. In Proceedings of the 2017 IEEE International Geoscience and Remote Sensing Symposium (IGARSS), Fort Worth, TX, USA, 23–28 July 2017; pp. 5458–5461. [Google Scholar]
- Audebert, N.; Le Saux, B.; Lefèvre, S. Beyond rgb: Very high resolution urban remote sensing with multimodal deep networks. ISPRS J. Photogramm. Remote Sens. 2018, 140, 20–32. [Google Scholar] [CrossRef]
- Eismann, M.T.; Hardie, R.C. Application of the stochastic mixing model to hyperspectral resolution enhancement. IEEE Trans. Geosci. Remote Sens. 2004, 42, 1924–1933. [Google Scholar] [CrossRef]
- Gao, F.; Masek, J.; Schwaller, M.; Hall, F. On the blending of the landsat and modis surface reflectance: Predicting daily landsat surface reflectance. IEEE Trans. Geosci. Remote Sens. 2006, 44, 2207–2218. [Google Scholar]
- Mizuochi, H.; Nishiyama, C.; Ridwansyah, I.; Nishida Nasahara, K. Monitoring of an indonesian tropical wetland by machine learning-based data fusion of passive and active microwave sensors. Remote Sens. 2018, 10, 1235. [Google Scholar] [CrossRef]
- Butler, D. Many eyes on earth. Nature 2014, 505, 143–144. [Google Scholar] [CrossRef]
- Boyd, D.S.; Jackson, B.; Wardlaw, J.; Foody, G.M.; Marsh, S.; Bales, K. Slavery from space: Demonstrating the role for satellite remote sensing to inform evidence-based action related to un sdg number 8. ISPRS J. Photogramm. Remote Sens. 2018, 142, 380–388. [Google Scholar] [CrossRef]
- Gevaert, C.M.; García-Haro, F.J. A comparison of starfm and an unmixing-based algorithm for landsat and modis data fusion. Remote Sens. Environ. 2015, 156, 34–44. [Google Scholar] [CrossRef]
- Gao, F.; Anderson, M.C.; Zhang, X.; Yang, Z.; Alfieri, J.G.; Kustas, W.P.; Mueller, R.; Johnson, D.M.; Prueger, J.H. Toward mapping crop progress at field scales through fusion of landsat and modis imagery. Remote Sens. Environ. 2017, 188, 9–25. [Google Scholar] [CrossRef]
- Liang, L.; Schwartz, M.D.; Wang, Z.; Gao, F.; Schaaf, C.B.; Tan, B.; Morisette, J.T.; Zhang, X. A cross comparison of spatiotemporally enhanced springtime phenological measurements from satellites and ground in a northern us mixed forest. IEEE Trans. Geosci. Remote Sens. 2014, 52, 7513–7526. [Google Scholar] [CrossRef]
- Woodcock, C.E.; Allen, R.; Anderson, M.; Belward, A.; Bindschadler, R.; Cohen, W.; Gao, F.; Goward, S.N.; Helder, D.; Helmer, E.; et al. Free access to landsat imagery. Science 2008, 320, 1011. [Google Scholar] [CrossRef] [PubMed]
- Jarihani, A.; McVicar, T.; Van Niel, T.; Emelyanova, I.; Callow, J.; Johansen, K. Blending landsat and modis data to generate multispectral indices: A comparison of “index-then-blend” and “blend-then-index” approaches. Remote Sens. 2014, 6, 9213–9238. [Google Scholar] [CrossRef]
- Chen, B.; Huang, B.; Xu, B. A hierarchical spatiotemporal adaptive fusion model using one image pair. Int. J. Digit. Earth 2017, 10, 639–655. [Google Scholar] [CrossRef]
- Chen, B.; Huang, B.; Xu, B. Comparison of spatiotemporal fusion models: A review. Remote Sens. 2015, 7, 1798–1835. [Google Scholar] [CrossRef]
- Hilker, T.; Wulder, M.A.; Coops, N.C.; Linke, J.; McDermid, G.; Masek, J.G.; Gao, F.; White, J.C. A new data fusion model for high spatial-and temporal-resolution mapping of forest disturbance based on landsat and modis. Remote Sens. Environ. 2009, 113, 1613–1627. [Google Scholar] [CrossRef]
- Zhu, X.; Chen, J.; Gao, F.; Chen, X.; Masek, J.G. An enhanced spatial and temporal adaptive reflectance fusion model for complex heterogeneous regions. Remote Sens. Environ. 2010, 114, 2610–2623. [Google Scholar] [CrossRef]
- Hazaymeh, K.; Hassan, Q.K. Spatiotemporal image-fusion model for enhancing the temporal resolution of landsat-8 surface reflectance images using modis images. J. Appl. Remote Sens. 2015, 9, 096095. [Google Scholar] [CrossRef]
- Luo, Y.; Guan, K.; Peng, J. Stair: A generic and fully-automated method to fuse multiple sources of optical satellite data to generate a high-resolution, daily and cloud-/gap-free surface reflectance product. Remote Sens. Environ. 2018, 214, 87–99. [Google Scholar] [CrossRef]
- Zhao, Y.; Huang, B.; Song, H. A robust adaptive spatial and temporal image fusion model for complex land surface changes. Remote Sens. Environ. 2018, 208, 42–62. [Google Scholar] [CrossRef]
- Wang, Q.; Atkinson, P.M. Spatio-temporal fusion for daily sentinel-2 images. Remote Sens. Environ. 2018, 204, 31–42. [Google Scholar] [CrossRef]
- Song, H.; Huang, B. Spatiotemporal satellite image fusion through one-pair image learning. IEEE Trans. Geosci. Remote Sens. 2013, 51, 1883–1896. [Google Scholar] [CrossRef]
- Wu, M.; Niu, Z.; Wang, C.; Wu, C.; Wang, L. Use of modis and landsat time series data to generate high-resolution temporal synthetic landsat data using a spatial and temporal reflectance fusion model. J. Appl. Remote Sens. 2012, 6, 063507. [Google Scholar]
- Huang, B.; Zhang, H. Spatio-temporal reflectance fusion via unmixing: Accounting for both phenological and land-cover changes. Int. J. Remote Sens. 2014, 35, 6213–6233. [Google Scholar] [CrossRef]
- Wu, M.; Huang, W.; Niu, Z.; Wang, C. Generating daily synthetic landsat imagery by combining landsat and modis data. Sensors 2015, 15, 24002–24025. [Google Scholar] [CrossRef]
- Zurita-Milla, R.; Kaiser, G.; Clevers, J.; Schneider, W.; Schaepman, M. Downscaling time series of meris full resolution data to monitor vegetation seasonal dynamics. Remote Sens. Environ. 2009, 113, 1874–1885. [Google Scholar] [CrossRef]
- Zhang, Y.; Foody, G.M.; Ling, F.; Li, X.; Ge, Y.; Du, Y.; Atkinson, P.M. Spatial-temporal fraction map fusion with multi-scale remotely sensed images. Remote Sens. Environ. 2018, 213, 162–181. [Google Scholar] [CrossRef]
- Shen, H.; Meng, X.; Zhang, L. An integrated framework for the spatio–temporal–spectral fusion of remote sensing images. IEEE Trans. Geosci. Remote Sens. 2016, 54, 7135–7148. [Google Scholar] [CrossRef]
- Wang, J.; Huang, B. A spatiotemporal satellite image fusion model with autoregressive error correction (arec). Int. J. Remote Sens. 2018, 39, 6731–6756. [Google Scholar] [CrossRef]
- Zhang, X.; Wang, J.; Gao, F.; Liu, Y.; Schaaf, C.; Friedl, M.; Yu, Y.; Jayavelu, S.; Gray, J.; Liu, L. Exploration of scaling effects on coarse resolution land surface phenology. Remote Sens. Environ. 2017, 190, 318–330. [Google Scholar] [CrossRef]
- Emelyanova, I.V.; McVicar, T.R.; Van Niel, T.G.; Li, L.T.; van Dijk, A.I.J.M. Assessing the accuracy of blending landsat–modis surface reflectances in two landscapes with contrasting spatial and temporal dynamics: A framework for algorithm selection. Remote Sens. Environ. 2013, 133, 193–209. [Google Scholar] [CrossRef]
- Yang, J.; Wright, J.; Huang, T.S.; Ma, Y. Image super-resolution via sparse representation. IEEE Trans. Image Process. 2010, 19, 2861–2873. [Google Scholar] [CrossRef]
- Kwan, C.; Budavari, B.; Gao, F.; Zhu, X. A hybrid color mapping approach to fusing modis and landsat images for forward prediction. Remote Sens. 2018, 10, 520. [Google Scholar] [CrossRef]
- Xu, Y.; Huang, B.; Xu, Y.; Cao, K.; Guo, C.; Meng, D. Spatial and temporal image fusion via regularized spatial unmixing. IEEE Geosci. Remote Sens. Lett. 2015, 12, 1362–1366. [Google Scholar]
- Gómez, C.; White, J.C.; Wulder, M.A. Optical remotely sensed time series data for land cover classification: A review. ISPRS J. Photogramm. Remote Sens. 2016, 116, 55–72. [Google Scholar] [CrossRef]
- Wu, M.; Wu, C.; Huang, W.; Niu, Z.; Wang, C. High-resolution leaf area index estimation from synthetic landsat data generated by a spatial and temporal data fusion model. Comput. Electron. Agric. 2015, 115, 1–11. [Google Scholar] [CrossRef]
- Wu, M.; Li, H.; Huang, W.; Niu, Z.; Wang, C. Generating daily high spatial land surface temperatures by combining aster and modis land surface temperature products for environmental process monitoring. Environ. Sci. Process. Impacts 2015, 17, 1396–1404. [Google Scholar] [CrossRef] [PubMed]
- Zurita-Milla, R.; Gómez-Chova, L.; Guanter, L.; Clevers, J.G.; Camps-Valls, G. Multitemporal unmixing of medium-spatial-resolution satellite images: A case study using meris images for land-cover mapping. IEEE Trans. Geosci. Remote Sens. 2011, 49, 4308–4317. [Google Scholar] [CrossRef]
- Leckie, D.G. Advances in remote sensing technologies for forest surveys and management. Can. J. For. Res. 1990, 20, 464–483. [Google Scholar] [CrossRef]
- Wu, M.; Zhang, X.; Huang, W.; Niu, Z.; Wang, C.; Li, W.; Hao, P. Reconstruction of daily 30 m data from hj ccd, gf-1 wfv, landsat, and modis data for crop monitoring. Remote Sens. 2015, 7, 16293–16314. [Google Scholar] [CrossRef]
- Quan, J.; Zhan, W.; Ma, T.; Du, Y.; Guo, Z.; Qin, B. An integrated model for generating hourly landsat-like land surface temperatures over heterogeneous landscapes. Remote Sens. Environ. 2018, 206, 403–423. [Google Scholar] [CrossRef]
- Wu, P.; Shen, H.; Zhang, L.; Göttsche, F.-M. Integrated fusion of multi-scale polar-orbiting and geostationary satellite observations for the mapping of high spatial and temporal resolution land surface temperature. Remote Sens. Environ. 2015, 156, 169–181. [Google Scholar] [CrossRef]
- Kwan, C.; Zhu, X.; Gao, F.; Chou, B.; Perez, D.; Li, J.; Shen, Y.; Koperski, K.; Marchisio, G. Assessment of spatiotemporal fusion algorithms for planet and worldview images. Sensors 2018, 18, 1051. [Google Scholar] [CrossRef]
- Wang, Q.; Blackburn, G.A.; Onojeghuo, A.O.; Dash, J.; Zhou, L.; Zhang, Y.; Atkinson, P.M. Fusion of landsat 8 oli and sentinel-2 msi data. IEEE Trans. Geosci. Remote Sens. 2017, 55, 3885–3899. [Google Scholar] [CrossRef]
- Zhang, L.; Zhang, L.; Du, B. Deep learning for remote sensing data: A technical tutorial on the state of the art. IEEE Geosci. Remote Sens. Mag. 2016, 4, 22–40. [Google Scholar] [CrossRef]
- Zhong, J.; Yang, B.; Huang, G.; Zhong, F.; Chen, Z. Remote sensing image fusion with convolutional neural network. Sens. Imaging 2016, 17, 10. [Google Scholar] [CrossRef]
- Masi, G.; Cozzolino, D.; Verdoliva, L.; Scarpa, G. Pansharpening by convolutional neural networks. Remote Sens. 2016, 8, 594. [Google Scholar] [CrossRef]
- Yuan, Y.; Zheng, X.; Lu, X. Hyperspectral image superresolution by transfer learning. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2017, 10, 1963–1974. [Google Scholar] [CrossRef]
- Huang, W.; Xiao, L.; Wei, Z.; Liu, H.; Tang, S. A new pan-sharpening method with deep neural networks. IEEE Geosci. Remote Sens. Lett. 2015, 12, 1037–1041. [Google Scholar] [CrossRef]
- Mou, L.; Schmitt, M.; Wang, Y.; Zhu, X.X. A cnn for the identification of corresponding patches in sar and optical imagery of urban scenes. In Proceedings of the 2017 Joint Urban Remote Sensing Event (JURSE), Dubai, UAE, 6–8 March 2017; pp. 1–4. [Google Scholar]
- Hu, J.; Mou, L.; Schmitt, A.; Zhu, X.X. Fusionet: A two-stream convolutional neural network for urban scene classification using polsar and hyperspectral data. In Proceedings of the 2017 Joint Urban Remote Sensing Event (JURSE), Dubai, UAE, 6–8 March 2017; pp. 1–4. [Google Scholar]
- Ren, S.; He, K.; Girshick, R.; Sun, J. Faster r-cnn: Towards real-time object detection with region proposal networks. In Advances in Neural Information Processing Systems; 2015; pp. 91–99. [Google Scholar]
- Lefèvre, S.; Tuia, D.; Wegner, J.D.; Produit, T.; Nassaar, A.S. Toward seamless multiview scene analysis from satellite to street level. Proc. IEEE 2017, 105, 1884–1899. [Google Scholar] [CrossRef]
- Wei, Y.; Yuan, Q.; Shen, H.; Zhang, L. Boosting the accuracy of multispectral image pansharpening by learning a deep residual network. IEEE Geosci. Remote Sens. Lett. 2017, 14, 1795–1799. [Google Scholar] [CrossRef]
- Liu, Y.; Chen, X.; Wang, Z.; Wang, Z.J.; Ward, R.K.; Wang, X. Deep learning for pixel-level image fusion: Recent advances and future prospects. Inf. Fusion 2018, 42, 158–173. [Google Scholar] [CrossRef]
- Song, H.; Liu, Q.; Wang, G.; Hang, R.; Huang, B. Spatiotemporal satellite image fusion using deep convolutional neural networks. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2018, 11, 821–829. [Google Scholar] [CrossRef]
- Tran, D.; Bourdev, L.; Fergus, R.; Torresani, L.; Paluri, M. Learning spatiotemporal features with 3d convolutional networks. In Proceedings of the IEEE International Conference on Computer Vision, Santiago, Chile, 7–13 December 2015; pp. 4489–4497. [Google Scholar]
- Zhang, H.; Huang, B. A new look at image fusion methods from a bayesian perspective. Remote Sens. 2015, 7, 6828–6861. [Google Scholar] [CrossRef]
- Xue, J.; Leung, Y.; Fung, T. A bayesian data fusion approach to spatio-temporal fusion of remotely sensed images. Remote Sens. 2017, 9, 1310. [Google Scholar] [CrossRef]
- Fasbender, D.; Obsomer, V.; Bogaert, P.; Defourny, P. Updating Scarce High Resolution Images with Time Series of Coarser Images: A Bayesian Data Fusion Solution. In Sensor and Data Fusion; IntechOpen: London, UK, 2009. [Google Scholar]
- Roy, D.P.; Ju, J.; Lewis, P.; Schaaf, C.; Gao, F.; Hansen, M.; Lindquist, E. Multi-temporal modis–landsat data fusion for relative radiometric normalization, gap filling, and prediction of landsat data. Remote Sens. Environ. 2008, 112, 3112–3130. [Google Scholar] [CrossRef]
- Hutengs, C.; Vohland, M. Downscaling land surface temperatures at regional scales with random forest regression. Remote Sens. Environ. 2016, 178, 127–141. [Google Scholar] [CrossRef]
- Pan, X.; Zhu, X.; Yang, Y.; Cao, C.; Zhang, X.; Shan, L. Applicability of downscaling land surface temperature by using normalized difference sand index. Sci. Rep. 2018, 8, 9530. [Google Scholar] [CrossRef]
- Kustas, W.P.; Norman, J.M.; Anderson, M.C.; French, A.N. Estimating subpixel surface temperatures and energy fluxes from the vegetation index–radiometric temperature relationship. Remote Sens. Environ. 2003, 85, 429–440. [Google Scholar] [CrossRef]
- Li, Z.-L.; Tang, B.-H.; Wu, H.; Ren, H.; Yan, G.; Wan, Z.; Trigo, I.F.; Sobrino, J.A. Satellite-derived land surface temperature: Current status and perspectives. Remote Sens. Environ. 2013, 131, 14–37. [Google Scholar] [CrossRef]
- Pardo-Igúzquiza, E.; Chica-Olmo, M.; Atkinson, P.M. Downscaling cokriging for image sharpening. Remote Sens. Environ. 2006, 102, 86–98. [Google Scholar] [CrossRef]
- Yang, Y.; Cao, C.; Pan, X.; Li, X.; Zhu, X. Downscaling land surface temperature in an arid area by using multiple remote sensing indices with random forest regression. Remote Sens. 2017, 9, 789. [Google Scholar] [CrossRef]
- Yang, G.; Pu, R.; Zhao, C.; Huang, W.; Wang, J. Estimation of subpixel land surface temperature using an endmember index based technique: A case examination on aster and modis temperature products over a heterogeneous area. Remote Sens. Environ. 2011, 115, 1202–1219. [Google Scholar] [CrossRef]
- Merlin, O.; Duchemin, B.; Hagolle, O.; Jacob, F.; Coudert, B.; Chehbouni, G.; Dedieu, G.; Garatuza, J.; Kerr, Y. Disaggregation of modis surface temperature over an agricultural area using a time series of formosat-2 images. Remote Sens. Environ. 2010, 114, 2500–2512. [Google Scholar] [CrossRef]
- Alidoost, F.; Sharifi, M.A.; Stein, A. Region- and pixel-based image fusion for disaggregation of actual evapotranspiration. Int. J. Image Data Fusion 2015, 6, 216–231. [Google Scholar] [CrossRef]
- Liu, H.; Yang, B.; Kang, E. Cokriging method for spatio-temporal assimilation of multi-scale satellite data. In Proceedings of the 2015 IEEE International Geoscience and Remote Sensing Symposium (IGARSS), Milan, Italy, 26–31 July 2015; pp. 3314–3316. [Google Scholar]
- Hwang, T.; Song, C.; Bolstad, P.V.; Band, L.E. Downscaling real-time vegetation dynamics by fusing multi-temporal modis and landsat ndvi in topographically complex terrain. Remote Sens. Environ. 2011, 115, 2499–2512. [Google Scholar] [CrossRef]
- Merlin, O.; Al Bitar, A.; Walker, J.P.; Kerr, Y. An improved algorithm for disaggregating microwave-derived soil moisture based on red, near-infrared and thermal-infrared data. Remote Sens. Environ. 2010, 114, 2305–2316. [Google Scholar] [CrossRef]
- Choi, M.; Hur, Y. A microwave-optical/infrared disaggregation for improving spatial representation of soil moisture using amsr-e and modis products. Remote Sens. Environ. 2012, 124, 259–269. [Google Scholar] [CrossRef]
- Jia, S.; Zhu, W.; Lü, A.; Yan, T. A statistical spatial downscaling algorithm of trmm precipitation based on ndvi and dem in the qaidam basin of china. Remote Sens. Environ. 2011, 115, 3069–3079. [Google Scholar] [CrossRef]
- Duan, Z.; Bastiaanssen, W.G.M. First results from version 7 trmm 3b43 precipitation product in combination with a new downscaling–calibration procedure. Remote Sens. Environ. 2013, 131, 1–13. [Google Scholar] [CrossRef]
- Huang, B.; Zhang, H.; Song, H.; Wang, J.; Song, C. Unified fusion of remote-sensing imagery: Generating simultaneously high-resolution synthetic spatial–temporal–spectral earth observations. Remote Sens. Lett. 2013, 4, 561–569. [Google Scholar] [CrossRef]
- Meng, X.; Shen, H.; Zhang, L.; Yuan, Q.; Li, H. A unified framework for spatio-temporal-spectral fusion of remote sensing images. In Proceedings of the 2015 IEEE International Geoscience and Remote Sensing Symposium (IGARSS), Milan, Italy, 26–31 July 2015; pp. 2584–2587. [Google Scholar]
- Stein, A. Use of single- and multi-source image fusion for statistical decision-making. Int. J. Appl. Earth Obs. Geoinf. 2005, 6, 229–239. [Google Scholar] [CrossRef]
- Li, S.; Li, Z.; Gong, J. Multivariate statistical analysis of measures for assessing the quality of image fusion. Int. J. Image Data Fusion 2010, 1, 47–66. [Google Scholar] [CrossRef]
- Yuhas, R.H.; Goetz, A.F.; Boardman, J.W. Discrimination among semi-arid landscape endmembers using the Spectral Angle Mapper (SAM) Algorithm. In Summaries of the 3rd annual JPL Airborne Geoscience Workshop; JPL Publication: Pasadena, CA, USA, 1992; Volume 1, pp. 147–149. [Google Scholar]
- Kwan, C.; Dao, M.; Chou, B.; Kwan, L.; Ayhan, B. Mastcam image enhancement using estimated point spread functions. In Proceedings of the IEEE 8th Annual Ubiquitous Computing, Electronics and Mobile Communication Conference (UEMCON), New York, NY, USA, 19–21 October 2017; pp. 186–191. [Google Scholar]
- Wang, Z.; Bovik, A.C. A universal image quality index. IEEE Signal Process. Lett. 2002, 9, 81–84. [Google Scholar] [CrossRef]
- Alparone, L.; Baronti, S.; Garzelli, A.; Nencini, F. A global quality measurement of pan-sharpened multispectral imagery. IEEE Geosci. Remote Sens. Lett. 2004, 1, 313–317. [Google Scholar] [CrossRef]
- Zhou, J.; Kwan, C.; Budavari, B. Hyperspectral image super-resolution: A hybrid color mapping approach. J. Appl. Remote Sens. 2016, 10, 035024. [Google Scholar] [CrossRef]
- Wald, L. Data Fusion: Definitions and Architectures: Fusion of Images of Different Spatial Resolutions; Presses des MINES: Paris, France, 2002. [Google Scholar]
- Alparone, L.; Aiazzi, B.; Baronti, S.; Garzelli, A.; Nencini, F.; Selva, M. Multispectral and panchromatic data fusion assessment without reference. Photogramm. Eng. Remote Sens. 2008, 74, 193–200. [Google Scholar] [CrossRef]
- Agaian, S.S.; Panetta, K.; Grigoryan, A.M. Transform-based image enhancement algorithms with performance measure. IEEE Trans. Image Process. 2001, 10, 367–382. [Google Scholar] [CrossRef] [PubMed]
- Sheikh, H.R.; Sabir, M.F.; Bovik, A.C. A statistical evaluation of recent full reference image quality assessment algorithms. IEEE Trans. Image Process. 2006, 15, 3440–3451. [Google Scholar] [CrossRef] [PubMed]
- Wang, Z.; Bovik, A.C.; Sheikh, H.R.; Simoncelli, E.P. Image quality assessment: From error visibility to structural similarity. IEEE Trans. Image Process. 2004, 13, 600–612. [Google Scholar] [CrossRef] [PubMed]
- Tsai, D.-Y.; Lee, Y.; Matsuyama, E. Information entropy measure for evaluation of image quality. J. Digit. Imaging 2008, 21, 338–347. [Google Scholar] [CrossRef]
- Wang, J.; Huang, B. A rigorously-weighted spatiotemporal fusion model with uncertainty analysis. Remote Sens. 2017, 9, 990. [Google Scholar] [CrossRef]
- Zhong, D.; Zhou, F. A prediction smooth method for blending landsat and moderate resolution imaging spectroradiometer images. Remote Sens. 2018, 10, 1371. [Google Scholar] [CrossRef]
- Belgiu, M.; Csillik, O. Sentinel-2 cropland mapping using pixel-based and object-based time-weighted dynamic time warping analysis. Remote Sens. Environ. 2018, 204, 509–523. [Google Scholar] [CrossRef]
- Andreo, V.; Belgiu, M.; Hoyos, D.B.; Osei, F.; Provensal, C.; Stein, A. Rodents and satellites: Predicting mice abundance and distribution with sentinel-2 data. Ecol. Inform. 2019, 51, 157–167. [Google Scholar] [CrossRef]
- Flood, N. Comparing sentinel-2a and landsat 7 and 8 using surface reflectance over Australia. Remote Sens. 2017, 9, 659. [Google Scholar] [CrossRef]
- Foroosh, H.; Zerubia, J.B.; Berthod, M. Extension of phase correlation to subpixel registration. IEEE Trans. Image Process. 2002, 11, 188–200. [Google Scholar] [CrossRef] [PubMed]
- Yan, L.; Roy, D.P.; Zhang, H.; Li, J.; Huang, H. An automated approach for sub-pixel registration of landsat-8 operational land imager (oli) and sentinel-2 multi spectral instrument (msi) imagery. Remote Sens. 2016, 8, 520. [Google Scholar] [CrossRef]
- Zhang, H.K.; Roy, D.P.; Yan, L.; Li, Z.; Huang, H.; Vermote, E.; Skakun, S.; Roger, J.-C. Characterization of sentinel-2a and landsat-8 top of atmosphere, surface, and nadir brdf adjusted reflectance and ndvi differences. Remote Sens. Environ. 2018, 215, 482–494. [Google Scholar] [CrossRef]
- Pinty, B.; Widlowski, J.L.; Taberner, M.; Gobron, N.; Verstraete, M.; Disney, M.; Gascon, F.; Gastellu, J.P.; Jiang, L.; Kuusk, A.; et al. Radiation transfer model intercomparison (rami) exercise: Results from the second phase. J. Geophys. Res. Atmos. 2004, 109. [Google Scholar] [CrossRef]
- Berk, A.; Cooley, T.W.; Anderson, G.P.; Acharya, P.K.; Bernstein, L.S.; Muratov, L.; Lee, J.; Fox, M.J.; Adler-Golden, S.M.; Chetwynd, J.H.; et al. Modtran5: A reformulated atmospheric band model with auxiliary species and practical multiple scattering options. In Remote Sensing of Clouds and the Atmosphere IX; International Society for Optics and Photonics: Bellingham, WA, USA, 2004; pp. 78–86. [Google Scholar]
- Justice, C.O.; Vermote, E.; Townshend, J.R.; Defries, R.; Roy, D.P.; Hall, D.K.; Salomonson, V.V.; Privette, J.L.; Riggs, G.; Strahler, A.; et al. The moderate resolution imaging spectroradiometer (modis): Land remote sensing for global change research. IEEE Trans. Geosci. Remote Sens. 1998, 36, 1228–1249. [Google Scholar] [CrossRef]
- Hall, F.G.; Strebel, D.E.; Nickeson, J.E.; Goetz, S.J. Radiometric rectification: Toward a common radiometric response among multidate, multisensor images. Remote Sens. Environ. 1991, 35, 11–27. [Google Scholar] [CrossRef]
- Coppin, P.R.; Bauer, M.E. Processing of multitemporal landsat tm imagery to optimize extraction of forest cover change features. IEEE Trans. Geosci. Remote Sens. 1994, 32, 918–927. [Google Scholar] [CrossRef]
- Heo, J.; FitzHugh, T.W. A standardized radiometric normalization method for change detection using remotely sensed imagery. Photogramm. Eng. Remote Sens. 2000, 66, 173–181. [Google Scholar]
- Du, Y.; Teillet, P.M.; Cihlar, J. Radiometric normalization of multitemporal high-resolution satellite images with quality control for land cover change detection. Remote Sens. Environ. 2002, 82, 123–134. [Google Scholar] [CrossRef]
- Gorelick, N.; Hancher, M.; Dixon, M.; Ilyushchenko, S.; Thau, D.; Moore, R. Google earth engine: Planetary-scale geospatial analysis for everyone. Remote Sens. Environ. 2017, 202, 18–27. [Google Scholar] [CrossRef]
Authors | Spatiotemporal Fusion Model | Categories
---|---|---
Gao et al. [32] | Spatial and Temporal Adaptive Reflectance Fusion Model (STARFM) | Reconstruction-based |
Hilker et al. [43] | Spatial-Temporal Adaptive Algorithm for mapping Reflectance Change (STAARCH) | Reconstruction-based |
Zhu et al. [44] | Enhanced spatial and temporal adaptive reflectance fusion model (ESTARFM) | Reconstruction-based |
Hazaymeh and Hassan [45] | Spatiotemporal image-fusion model (STI-FM) | Reconstruction-based |
Luo et al. [46] | Satellite Data Integration (STAIR) | Reconstruction-based |
Zhao et al. [47] | Robust Adaptive Spatial and Temporal Fusion Model (RASTFM) | Reconstruction-based |
Wang and Atkinson [48] | FIT-FC | Reconstruction-based |
Chen et al. [41] | Hierarchical Spatiotemporal Adaptive Fusion model (HSTAFM) | Learning-based |
Huang and Song [7] | Sparse representation-based spatiotemporal reflectance fusion model (SPSTFM) | Learning-based
Song and Huang [49] | One-pair learning image fusion model | Learning-based
Wu et al. [50] | Spatial and Temporal Data Fusion Approach (STDFA) | Unmixing-based |
Huang and Zhang [51] | Spatio-Temporal Reflectance Fusion Model (U-STFM) | Unmixing-based |
Gevaert and García-Haro [36] | Spatial and Temporal Reflectance Unmixing Model (STRUM) | Unmixing-based
Wu et al. [52] | Modified Spatial and Temporal Data Fusion Approach (MSTDFA) | Unmixing-based |
Zurita-Milla et al. [53] | Constrained unmixing image fusion model | Unmixing-based |
Zhang et al. [54] | Spatial-Temporal Fraction Map Fusion (STFMF) | Unmixing-based |
Zhu et al. [12] | Flexible Spatiotemporal Data Fusion (FSDAF) | Hybrid |
Data Fusion Performance Metrics | Authors |
---|---
Spectral angle mapper (SAM) | Yuhas et al. [108] |
Peak signal-to-noise ratio (PSNR) | Sheikh et al. [116]
Structural Similarity Index (SSIM) | Wang et al. [117] |
Image quality index | Wang and Bovik [110]
Extended image quality index | Alparone et al. [111] |
Quality with no reference (QNR) index | Alparone et al. [114]
Enhancement measure evaluation (EME) | Agaian et al. [115] |
Entropy | Tsai et al. [118] |
Erreur Relative Globale Adimensionnelle de Synthèse (ERGAS) | Wald [113]
© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).