Article

IHS-GTF: A Fusion Method for Optical and Synthetic Aperture Radar Data

1 State Key Laboratory of Information Engineering in Surveying, Mapping and Remote Sensing, Wuhan University, Wuhan 430079, China
2 School of Remote Sensing and Information Engineering, Wuhan University, Wuhan 430079, China
3 School of Geography and Information Engineering, China University of Geosciences (Wuhan), Wuhan 430074, China
* Author to whom correspondence should be addressed.
Remote Sens. 2020, 12(17), 2796; https://doi.org/10.3390/rs12172796
Submission received: 28 July 2020 / Revised: 25 August 2020 / Accepted: 26 August 2020 / Published: 28 August 2020

Abstract

This paper addresses the fusion of optical and Synthetic Aperture Radar (SAR) images. The Intensity–Hue–Saturation (IHS) transform is an easily implemented fusion method that separates a Red–Green–Blue (RGB) image into three independent components; however, applying it directly to optical and SAR image fusion causes spectral distortion. The Gradient Transfer Fusion (GTF) algorithm was originally proposed for fusing infrared and grayscale visible images; it formulates image fusion as an optimization problem that preserves radiometric information and spatial details simultaneously. However, GTF assumes that spatial details come from only one of the source images, which is inconsistent with the actual situation in optical and SAR image fusion. In this paper, we propose a fusion algorithm for optical and SAR images, named IHS-GTF, which combines the advantages of IHS and GTF and draws spatial details from both images based on pixel saliency. The proposed method was assessed by visual analysis and ten quantitative indices, and was further tested by extracting impervious surface (IS) from the fused image with a random forest classifier. The results show that the proposed method preserves spatial details and spectral information well, and that the overall accuracy of IS extraction from the fused image is 2% higher than that obtained from the optical image alone. These results demonstrate that the proposed method can fuse optical and SAR data effectively to generate useful data.
Keywords: optical and SAR; image fusion; pixel saliency; impervious surface
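The abstract's core idea, substituting the IHS intensity component with a saliency-weighted blend of optical intensity and SAR backscatter, can be illustrated with a minimal sketch. This is not the paper's full IHS-GTF algorithm (which solves a GTF-style optimization); the gradient-magnitude saliency proxy and the function `ihs_fuse` are assumptions introduced here for illustration only.

```python
import numpy as np

def ihs_fuse(rgb, sar, w=None):
    """Simplified IHS-style fusion sketch: replace the intensity component
    of an RGB optical image with a per-pixel weighted blend of optical
    intensity and SAR backscatter.  `w` is a saliency weight in [0, 1];
    the paper derives saliency differently, so the default below is only
    an illustrative proxy."""
    rgb = rgb.astype(np.float64)
    sar = sar.astype(np.float64)
    # Linear IHS transform: intensity is the mean of the three bands.
    intensity = rgb.mean(axis=2)
    if w is None:
        # Saliency proxy (assumption): normalized gradient magnitude of
        # the SAR image, so SAR detail dominates only near strong edges.
        gy, gx = np.gradient(sar)
        grad = np.hypot(gx, gy)
        w = grad / (grad.max() + 1e-12)
    fused_intensity = (1.0 - w) * intensity + w * sar
    # Inverse linear IHS: add the intensity change back to every band,
    # leaving hue and saturation (band ratios) approximately unchanged.
    delta = (fused_intensity - intensity)[..., None]
    return np.clip(rgb + delta, 0.0, 255.0)

# Toy example: a flat 4x4 optical patch and a SAR patch with one vertical
# edge; SAR detail is injected only where the edge makes saliency high.
rgb = np.full((4, 4, 3), 100.0)
sar = np.full((4, 4), 50.0)
sar[:, 2:] = 150.0
fused = ihs_fuse(rgb, sar)
```

In flat SAR regions the weight is zero and the optical pixel passes through untouched, while pixels adjacent to the edge take on the SAR intensity; this mirrors the abstract's claim of preserving spectral information away from strong SAR structures.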

Share and Cite

MDPI and ACS Style

Shao, Z.; Wu, W.; Guo, S. IHS-GTF: A Fusion Method for Optical and Synthetic Aperture Radar Data. Remote Sens. 2020, 12, 2796. https://doi.org/10.3390/rs12172796


Note that from the first issue of 2016, this journal uses article numbers instead of page numbers.
