Article

SAR Image Despeckling Based on Denoising Diffusion Probabilistic Model and Swin Transformer

1 School of Remote Sensing and Information Engineering, Wuhan University, Wuhan 430079, China
2 Ant Group (Beijing), Beijing 100020, China
3 Ant Group, Hangzhou 310020, China
4 Hubei Luojia Laboratory, Wuhan 430075, China
* Author to whom correspondence should be addressed.
Remote Sens. 2024, 16(17), 3222; https://doi.org/10.3390/rs16173222
Submission received: 19 July 2024 / Revised: 24 August 2024 / Accepted: 29 August 2024 / Published: 30 August 2024

Abstract

The speckle noise inherent in synthetic aperture radar (SAR) imaging has long posed a challenge for SAR data processing, significantly affecting image interpretation and recognition. Recently, deep learning-based SAR despeckling algorithms have shown promising results. However, most existing algorithms rely on convolutional neural networks (CNNs), which may struggle to capture global image information, leading to texture loss. Moreover, because optical and SAR images have different characteristics, models trained on simulated SAR data can behave unstably when despeckling real-world SAR data. To address these limitations, we propose an approach that integrates Swin Transformer blocks into the noise-prediction network of the denoising diffusion probabilistic model (DDPM). By harnessing DDPM’s strong generative capability and the Swin Transformer’s proficiency in extracting global features, our approach suppresses speckle while preserving image details and enhancing authenticity. Additionally, we employ a post-processing strategy known as pixel-shuffle down-sampling (PD) refinement to mitigate the mismatch introduced by training data and a training process that assume spatially uncorrelated noise, thereby improving adaptability to real-world SAR despeckling scenarios. We conducted experiments on both simulated and real SAR image datasets and evaluated the algorithm from subjective and objective perspectives. The visual results show clear improvements in noise suppression and image detail restoration. The objective results show that our method achieves state-of-the-art performance, outperforming the second-best method by an average of 0.93 dB in peak signal-to-noise ratio (PSNR) and 0.03 in Structural Similarity Index (SSIM), affirming the effectiveness of our approach.
Keywords: synthetic aperture radar (SAR); speckle noise suppression; denoising diffusion probabilistic model; Swin Transformer
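
For illustration only, the following PyTorch sketch shows, under our own simplifying assumptions, generic versions of the two building blocks the abstract refers to: a plain DDPM reverse-sampling loop in which eps_model is a placeholder for the paper's Swin-Transformer-based noise-prediction network, and a pixel-shuffle down-sampling (PD) refinement wrapper for a single-channel SAR image whose height and width are divisible by the stride. All names are hypothetical; this is not the authors' released code.

    import torch
    import torch.nn.functional as F

    @torch.no_grad()
    def ddpm_sample(x_T, eps_model, betas):
        # Generic DDPM reverse process (Ho et al., 2020). eps_model(x_t, t)
        # is a placeholder for a noise-prediction network; in the paper it
        # would contain Swin Transformer blocks.
        alphas = 1.0 - betas
        alpha_bars = torch.cumprod(alphas, dim=0)
        x = x_T
        for t in reversed(range(len(betas))):
            t_batch = torch.full((x.shape[0],), t, device=x.device, dtype=torch.long)
            eps = eps_model(x, t_batch)  # predicted noise at step t
            # Mean of p(x_{t-1} | x_t) given the predicted noise.
            x = (x - (1.0 - alphas[t]) / torch.sqrt(1.0 - alpha_bars[t]) * eps) \
                / torch.sqrt(alphas[t])
            if t > 0:  # no noise is added at the final step
                x = x + torch.sqrt(betas[t]) * torch.randn_like(x)
        return x

    def pd_refine(image, denoiser, stride=2):
        # Pixel-shuffle down-sampling (PD) refinement for a (1, 1, H, W) image
        # with H and W divisible by `stride`. Keeping every `stride`-th pixel
        # weakens the spatial correlation of real-world speckle, so a denoiser
        # trained under a spatially uncorrelated noise assumption fits better.
        subs = F.pixel_unshuffle(image, stride)                  # (1, stride**2, H/s, W/s)
        subs = subs.reshape(stride * stride, 1, *subs.shape[-2:])  # sub-images as a batch
        clean = denoiser(subs)                                   # despeckle each sub-image
        clean = clean.reshape(1, stride * stride, *clean.shape[-2:])
        return F.pixel_shuffle(clean, stride)                    # reassemble to (1, 1, H, W)

A conditional despeckler would additionally feed the noisy sub-images to eps_model as guidance; that conditioning is omitted here to keep the sketch generic.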

Share and Cite

MDPI and ACS Style

Pan, Y.; Zhong, L.; Chen, J.; Li, H.; Zhang, X.; Pan, B. SAR Image Despeckling Based on Denoising Diffusion Probabilistic Model and Swin Transformer. Remote Sens. 2024, 16, 3222. https://doi.org/10.3390/rs16173222

