*Article* **Deep Multi-Scale Recurrent Network for Synthetic Aperture Radar Images Despeckling**

**Yuanyuan Zhou, Jun Shi \*, Xiaqing Yang, Chen Wang, Durga Kumar, Shunjun Wei and Xiaoling Zhang**

School of Information and Communication Engineering, University of Electronic Science and Technology of China, Chengdu 611731, China

**\*** Correspondence: shijun@uestc.edu.cn

Received: 23 September 2019; Accepted: 19 October 2019; Published: 23 October 2019

**Abstract:** Due to the presence of speckle, many standard optical image processing methods, such as classification, segmentation, and registration, are of limited use on synthetic aperture radar (SAR) images. In this work, an end-to-end deep multi-scale recurrent network (MSR-net) for SAR image despeckling is proposed. Multi-scale recurrent and weight-sharing strategies are introduced to increase the network capacity without multiplying the number of weight parameters. A convolutional long short-term memory (convLSTM) unit is embedded to capture useful information and help with despeckling across scales. Meanwhile, a sub-pixel unit is utilized to improve the network efficiency. In addition, two criteria, the edge feature keep ratio (EFKR) and the feature point keep ratio (FPKR), are proposed to evaluate despeckling performance on SAR images; they assess more effectively how well a despeckling algorithm retains edge and feature information. Experimental results show that the proposed network can remove speckle noise while preserving the edge and texture information of images at low computational cost, especially in low signal-to-noise ratio scenarios. In terms of peak signal-to-noise ratio (PSNR), MSR-net outperforms the traditional despeckling method SAR-BM3D (Block-Matching and 3D filtering) by more than 2 dB on simulated images. Furthermore, after despeckling, the applicability of optical image processing methods to real SAR images is enhanced.

**Keywords:** synthetic aperture radar; despeckling; multi-scale; LSTM; sub-pixel
