Article

A SAR Image Despeckling Method Based on an Extended Adaptive Wiener Filter and Extended Guided Filter

1 Department of Computer Engineering, Sari Branch, Islamic Azad University, Sari 48161–19318, Iran
2 Department of Mathematics, Iran University of Science and Technology, Tehran 1684613114, Iran
3 Department of Mathematical Sciences, University of South Africa, Pretoria 0002, South Africa
4 Department of Mathematics and General Sciences, Prince Sultan University, Riyadh 11586, Saudi Arabia
5 Department of Medical Research, China Medical University, Taichung 40402, Taiwan
6 Department of Computer Science and Information Engineering, Asia University, Taichung 41354, Taiwan
7 Department of Computer Engineering, Babol Branch, Islamic Azad University, Babol 47471–37381, Iran
* Author to whom correspondence should be addressed.
Remote Sens. 2020, 12(15), 2371; https://doi.org/10.3390/rs12152371
Submission received: 7 January 2020 / Revised: 19 February 2020 / Accepted: 21 February 2020 / Published: 23 July 2020
(This article belongs to the Section Remote Sensing Image Processing)

Abstract:
The elimination of multiplicative speckle noise is a central issue for synthetic aperture radar (SAR) images. In this study, a SAR image despeckling method based on a proposed extended adaptive Wiener filter (EAWF), an extended guided filter (EGF), and a weighted least squares (WLS) filter is presented. The proposed EAWF and EGF are developed from the adaptive Wiener filter (AWF) and the guided filter (GF), respectively. The EAWF can be applied to the SAR image without a logarithmic transformation, since its denoising performance is better than that of the AWF. The EGF removes additive noise and preserves edge information more effectively than the GF. First, the EAWF is applied to the input image. Then, a logarithmic transformation is applied to the resulting image in order to convert the multiplicative noise into additive noise. Next, the EGF is employed to remove the additive noise and preserve edge information; to remove unwanted spots that remain after filtering, the EGF is applied twice with different parameters. Finally, the WLS filter is applied to the homogeneous regions. Results show that the proposed algorithm performs better than existing filters.

Graphical Abstract

1. Introduction

Despeckling of synthetic aperture radar (SAR) images is one of the main topics of recent studies [1]. SAR images are widely used in many fields, such as disaster monitoring, environmental protection, and topographic mapping [1], and SAR imaging is not affected by cloud cover or variations in solar illumination. A SAR image is formed by the coherent interaction of the emitted microwave radiation with the targeted region, which causes random constructive and destructive interference and results in a multiplicative noise called speckle. Several methods have been proposed to remove these unwanted patterns; some are based on spatial filtering, for instance, Frost filtering [2] and Lee filtering [3]. However, spatial methods tend to darken the despeckled SAR images. In recent years, model-based filters have been used for SAR image despeckling [4], such as the block-sorting SAR image denoising algorithm [5]. These filters are useful for solving various inverse problems, but they are often time-consuming. Many image fusion methods have also been presented recently, the two best-known families being multi-scale image fusion filters and data-driven image fusion filters [6]. However, these filters do not fully account for spatial consistency and tend to produce color and brightness distortions.
Other popular filters include the partial differential equation (PDE), nonlinear, linear, multiresolution, hybrid, and denoising filters based on fuzzy logic.
Linear filtering is performed through a procedure known as convolution: the output is obtained as a weighted sum of the neighboring pixels within a mask window. The Gaussian and mean filters, for example, belong to this class. Because it cannot distinguish between image areas (edges, homogeneous regions, and details), the mean filter shows poor edge preservation [7]. The Gaussian filter performs well for small variances, but blurring occurs within edge areas.
In nonlinear methods, the output is not a linear function of the input (e.g., the median filter, the bilateral filter (BF) [8], non-local means (NLM) filters [9], the Weibull multiplicative model, etc.). Like the mean filter, the median filter performs well in homogeneous regions but poorly at edge preservation. The BF and NLM filters preserve edges well, but their denoising performance is poor for Rayleigh-distributed (speckle) noise. In addition, BF suffers from gradient distortion and high complexity [10]. Anantrasirichai et al. [11] suggested an adaptive bilateral filter (ABF) for optical coherence tomography imaging, in which the range parameter is determined by the variance of the most homogeneous block and is automatically adapted to speckle noise variations; however, because the variance is sensitive to noise, it cannot correctly identify homogeneous regions under heavy noise. The NLM filter, which builds on the BF, is appropriate for additive white Gaussian noise (AWGN) denoising rather than despeckling [12]. Another popular model is the Weibull distribution, but it is appropriate only for urban scenes and sea clutter [13].
PDE-based methods are also useful for noise elimination: a noisy image is formulated as a PDE whose solution yields a noise-free image [14]. Some filtering methods, such as adaptive window diffusion (AWAD) [15], are based on PDEs and can control the mask size and direction, showing good edge preservation. However, AWAD, which is based on a new diffusion function, does not suppress speckle in homogeneous regions and therefore has low despeckling performance.
In the field of fuzzy noise removal, Cheng et al. [16] developed a SAR image denoising method based on fuzzy logic, which estimates the fuzzy edge strength of each pixel in the filter window and uses it to weight the contributions of neighboring pixels during fuzzy filtering. Its main drawback is that it is appropriate only for images with large homogeneous regions. In addition, Babu et al. [17] proposed adaptive despeckling based on fuzzy logic for classifying image regions, after which the most appropriate filter is selected for each region; however, only averaging and median filters were used.
The transform domain filters are mostly based on multi-resolution transforms, such as shearlet-domain SAR image denoising [18] and wavelet-domain denoising [19]. Because multiresolution filters are able to eliminate noise at different frequencies, they are employed for despeckling. However, because of some inherent flaws, transform domain methods cause pixel distortion, so hybrid filters have become more popular. Zhang and Gunturk [10] stated that noise may exist in both the detail and approximation sub-bands of the wavelet transform, so for speckle noise removal different filters can be applied to the individual sub-bands. Loganayagi et al. [20] proposed a robust denoising method based on the BF, and Kumar et al. proposed speckle denoising using tetrolet wavelets [21]. Since the BF is a single-resolution filter, not all frequency components of the image are available to it; accordingly, the wavelet transform is used to form a scale-space for the noisy image, whose decomposition can distinguish signal from noise at various scales [22]. Mehta [22] proposed a method that exploits wavelet features together with the estimation ability of the Wiener filter; to achieve better performance, the adaptive Wiener filter (AWF) is used to denoise the components of the wavelet sub-bands. Zhang et al. [23] proposed ultrasound image denoising using a combination of bilateral filtering and the stationary wavelet transform, in which speckle in the low-pass approximation and the high-pass details is filtered by a fast bilateral filter and wavelet thresholding, respectively. Sari et al. [24] proposed a denoising method based on the subsequent application of bilateral filtering and wavelet thresholding. In these hybrid approaches, wavelet decomposition divides the image information into an approximation sub-band and detail sub-bands, and a bilateral or Wiener filter is applied at different positions among the sub-bands. These algorithms show limited speckle suppression because the optimal positions and parameters of the filters are difficult to choose. Moreover, it is impossible to completely remove speckle noise from the degraded image, because both the noise and the signal may have a continuous power spectrum; therefore, denoising is performed through an MMSE filter.
We use the AWF for despeckling and improve its structure to increase noise reduction efficiency, obtaining the extended adaptive Wiener filter (EAWF). In addition, a main tool in image despeckling is an edge-aware method [25] such as the guided filter (GF), which can be applied as an edge-preserving operator like the well-known BF but performs better near edges. In this study, we improve the performance of the GF with the proposed edge detection method; the resulting extended guided filter (EGF) removes speckle noise better than the GF. We focus on despeckling SAR images using a hybrid combination of the EAWF, EGF, and weighted least squares (WLS) filter to make the process more efficient. First, the EAWF is applied to the input image. Then, a logarithmic transformation is applied to the resulting image in order to convert the multiplicative noise into additive noise. Next, the EGF is used to remove the additive noise and preserve edge information. Finally, in order to eliminate speckle noise in homogeneous regions, the WLS filter is used (Figure 1). The organization of this study is as follows: the materials and methods are explained in Section 2, and the proposed algorithm is presented in Section 3.
Experimental results and experiments on real SAR images are described in Section 4 and Section 5, respectively. Finally, the computational complexity, the discussion, and the concluding remarks are given in Section 6, Section 7, and Section 8, respectively.

2. Materials and Methods

2.1. Measurement of Performance

After enhancement, the image quality was measured by comparing the result with the noise-free image using several metrics. In Section 3, we use six metrics: PSNR, PFOM, SNR, SSIM, IQI, and MAE. In Section 5, we use the equivalent number of looks (ENL) and the standard deviation (STD), and in the remaining sections we use PSNR and SSIM. PSNR is the ratio between the maximum possible power of a signal and the power of the corrupting noise that affects the fidelity of its representation. SNR compares the level of the desired signal to the level of the background noise, and SSIM predicts the perceived quality of digital images. IQI is a fourth metric for assessing the quality of denoised images, and MAE is the average of the absolute errors, measuring how close the estimates are to the true values. PFOM is Pratt's figure of merit, used here as an assessment criterion for the standard edge detectors.
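As an illustration, the minimal Python sketch below computes these quality measures with scikit-image and NumPy. The ENL definition (squared mean over variance of a homogeneous patch) follows the common convention; the function names, the unit data range, and the use of a user-supplied homogeneous patch are assumptions for this sketch, not details taken from the paper's code.

```python
# Illustrative metric computation (assumes images are scaled to [0, 1]).
import numpy as np
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

def enl(homogeneous_patch):
    # Equivalent number of looks: mean^2 / variance over a homogeneous region.
    return homogeneous_patch.mean() ** 2 / homogeneous_patch.var()

def evaluate(clean, despeckled, homogeneous_patch):
    return {
        "PSNR": peak_signal_noise_ratio(clean, despeckled, data_range=1.0),
        "SSIM": structural_similarity(clean, despeckled, data_range=1.0),
        "MAE": float(np.mean(np.abs(clean - despeckled))),
        "ENL": float(enl(homogeneous_patch)),
        "STD": float(homogeneous_patch.std()),
    }
```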

2.2. Adaptive Wiener Filter

One of the first methods developed for denoising digital images is based on Wiener filtering. Assuming (n_1, n_2) is a particular pixel location, the AWF is given by [26]:
\mathrm{AWF}[I(n_1, n_2)] = \mu + \frac{\sigma^2 - \sigma_n^2}{\sigma^2}\,\big(I(n_1, n_2) - \mu\big)
where I is the input image, and the variance σ² and mean μ are estimated locally over the N × M neighborhood ℵ of each pixel:
\mu = \frac{1}{MN}\sum_{(n_1, n_2) \in \aleph} I(n_1, n_2)
\sigma^2 = \frac{1}{MN}\sum_{(n_1, n_2) \in \aleph} I^2(n_1, n_2) - \mu^2
In addition, σn² is the variance of the noise.
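A minimal Python sketch of Equation (1), using box filters for the local statistics, is given below; the window size and the noise variance are assumed inputs and are not values prescribed by the paper.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def adaptive_wiener(img, noise_var, win=3):
    # Local mean and variance over a win x win neighborhood.
    mu = uniform_filter(img, win)
    var = np.maximum(uniform_filter(img * img, win) - mu ** 2, 1e-12)
    # Wiener gain (sigma^2 - sigma_n^2) / sigma^2, clipped to stay non-negative.
    gain = np.maximum(var - noise_var, 0.0) / var
    return mu + gain * (img - mu)
```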

2.3. Guided Filter

GF computes the output q_i as a local linear transform of the guidance image I over the window ω_k, where k is the center pixel of the window:
q_i = a_k I_i + b_k, \quad \forall i \in \omega_k
where ω_k is a square window of size (2r + 1) × (2r + 1) and the linear coefficients a_k and b_k are constants estimated from the window ω_k.
Generally,
q_i = p_i - n_i
where ni and pi define the noise and input image, respectively. The linear coefficients can be estimated by minimizing the squared difference between the input pi and the output qi, as follows:
E(a_k, b_k) = \sum_{i \in \omega_k} \left( \big(a_k I_i + b_k - p_i\big)^2 + \varepsilon\, a_k^2 \right)
where ε is a regularization parameter that prevents a_k from becoming excessively large. The coefficients a_k and b_k can be solved for by linear regression:
a_k = \frac{\frac{1}{|\omega|}\sum_{i \in \omega_k} I_i p_i - \mu_k \bar{p}_k}{\sigma_k^2 + \varepsilon}
b_k = \bar{p}_k - a_k \mu_k
where σ_k² and μ_k are the variance and the mean of the guidance image in the window ω_k, |ω| is the number of pixels in ω_k, and \bar{p}_k = \frac{1}{|\omega|}\sum_{i \in \omega_k} p_i. By adjusting ε and the window size ω_k, noise is removed while edge regions are preserved.
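The guided filter above can be implemented with box filters only. The sketch below is a minimal Python version of this formulation; the radius r and the regularization eps are assumed example values, not parameters taken from the paper.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def guided_filter(guide, p, r=2, eps=1e-2):
    win = 2 * r + 1                                   # (2r + 1) x (2r + 1) window
    mean_I = uniform_filter(guide, win)
    mean_p = uniform_filter(p, win)
    corr_Ip = uniform_filter(guide * p, win)
    var_I = uniform_filter(guide * guide, win) - mean_I ** 2
    a = (corr_Ip - mean_I * mean_p) / (var_I + eps)   # linear coefficient a_k
    b = mean_p - a * mean_I                           # linear coefficient b_k
    # Average the coefficients over all windows covering each pixel.
    return uniform_filter(a, win) * guide + uniform_filter(b, win)
```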

3. Proposed Algorithm

3.1. Improvement of Adaptive Wiener Filter

In Equation (1), we use the dispersion index instead of the variance. The dispersion index determines whether a set of observed occurrences is clustered or dispersed and is defined as the ratio of the variance to the mean:
DI = \frac{\sigma^2}{\mu}
\mathrm{EAWF}[I(n_1, n_2)] = \mu + \frac{DI - \sigma_n^2}{DI}\,\big(I(n_1, n_2) - \mu\big)
To enhance performance, a dispersion index is also used to obtain the noise estimate instead of the variance. We can simplify Equation (10) in the form of Equation (11).
\mathrm{EAWF}[I(n_1, n_2)] = \mu + \big(I(n_1, n_2) - \mu\big) - \frac{\sigma_n^2}{DI}\big(I(n_1, n_2) - \mu\big) = I(n_1, n_2) - \frac{\sigma_n^2}{DI}\big(I(n_1, n_2) - \mu\big) = I(n_1, n_2) - \mu\left(\frac{\sigma_n^2}{\sigma^2}\big(I(n_1, n_2) - \mu\big)\right)
In fact, (σn²/σ²)(I(n_1, n_2) − μ) is approximately the noise, but with the opposite sign. Meanwhile, speckle noise is multiplicative, so the noise has less effect at lower intensities. By multiplying (σn²/σ²)(I(n_1, n_2) − μ) by the mean, a better approximation of the noise is obtained, namely μ(σn²/σ²)(I(n_1, n_2) − μ). Figure 2 shows a comparison between the AWF, the homogeneous adaptive Wiener filter (HAWF), and the EAWF.
To evaluate this method, we compared the EAWF with existing methods using the PSNR, SSIM, IQI, and Pratt's FOM quantitative measurements (Figure 3). In this comparison, a 250 × 250 Lena image is used.
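A minimal Python sketch of the EAWF of Equation (10) is shown below; it differs from the AWF sketch in Section 2.2 only in that the local variance in the Wiener gain is replaced by the dispersion index. The window size and the noise term are assumed inputs for illustration.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def extended_adaptive_wiener(img, noise_var, win=3):
    mu = uniform_filter(img, win)
    var = np.maximum(uniform_filter(img * img, win) - mu ** 2, 1e-12)
    di = var / np.maximum(mu, 1e-12)             # dispersion index sigma^2 / mu
    gain = np.maximum(di - noise_var, 0.0) / di  # (DI - sigma_n^2) / DI
    return mu + gain * (img - mu)
```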

3.2. SAR Speckle Noise Model and Logarithmic Transformation

The following model is appropriate for images with multiplicative noise:
f(x, y) = g(x, y)\,\eta_m(x, y) + \eta_a(x, y)
where f(x, y), g(x, y), η_m(x, y), and η_a(x, y) are the observed noisy image, the unknown noise-free image, and the multiplicative and additive noise components, respectively. Since the additive noise is considered much weaker than the multiplicative noise, we adopt Equation (13) for speckle noise:
f(x, y) = g(x, y)\,\eta_m(x, y)
where f(x, y) is the SAR image degraded by speckle noise, g(x, y) is the radar scattering characteristic of the ground target (i.e., the noise-free image), and η_m(x, y) is the speckle due to fading. g(x, y) and η_m(x, y) are independent, and η_m(x, y) follows a Gamma distribution with mean one and variance 1/L:
\rho_N(\eta_m) = \frac{L^L\,\eta_m^{L-1}\,\exp(-\eta_m L)}{\Gamma(L)}
where η_m ≥ 0 and L ≥ 1; L is the equivalent number of looks (ENL), and a larger L corresponds to weaker speckle. Since speckle is multiplicative noise, we apply a logarithmic transform to the noisy image, which turns the multiplicative noise into additive noise, as shown in the following equation [27]:
\log f(x, y) = \log g(x, y) + \log \eta_m(x, y)
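For reference, the sketch below simulates the speckle model of Equations (13) and (14) with unit-mean Gamma noise of variance 1/L and then applies the logarithmic transform; the random seed and the small constant added before the logarithm are assumptions for numerical convenience, not part of the paper's procedure.

```python
import numpy as np

def add_speckle(clean, looks=25, seed=0):
    # Gamma(shape=L, scale=1/L) has mean 1 and variance 1/L, as in Equation (14).
    rng = np.random.default_rng(seed)
    eta_m = rng.gamma(shape=looks, scale=1.0 / looks, size=clean.shape)
    return clean * eta_m

def to_log_domain(noisy, eps=1e-6):
    # In the log domain the multiplicative speckle becomes additive.
    return np.log(noisy + eps)
```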

3.3. Extended Guided Filter

3.3.1. A New Edge-Aware Weighting

1. Coefficient of Variation
The coefficient of variation (CV) is defined as the ratio of the standard deviation (σ) to the mean (μ). Because the CV is unitless, it takes similar values for large variances in high-intensity regions and for small variances in low-intensity regions, so it behaves consistently across regions of different brightness (Figure 4). The CV is defined as follows:
CV = \frac{S_a}{m_a} \times 100\%
where S_a and m_a are the local standard deviation and mean, respectively. High, low, and intermediate CV values therefore correspond to edge, homogeneous, and detail regions, respectively.
In order to identify the edges more reliably, the global mean of the image is added to the local mean:
CV' = \frac{S_a}{m_a + M} \times 100\%
where M is the mean of the image.
2. Difference of Variances
The difference of variances (DoV) is the difference between the standard deviation of the local window and the estimated noise level, so it is close to zero in homogeneous regions and takes larger values in edge regions:
\mathrm{DoV}_w = \mathrm{std}(x) - \mathrm{estimate\_noise}(X)
where std(x) and estimate_noise(X) are the standard deviation of the local window x and the noise estimate of the image X, respectively. The window size is w = (5 × 5). Figure 5 shows the outcome of DoV with w = 5 under different speckle noise levels; as can be seen, DoV_5 remains stable as the noise intensity changes.
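A minimal Python sketch of the two statistics, CV′ and DoV, is given below. The global noise estimate is passed in as a parameter (estimate_noise is treated as an external helper), the window sizes follow the values used later in the paper (21 × 21 for CV′ in Figure 8 and 5 × 5 for DoV), and the × 100% factor of CV′ is deliberately omitted so that the map can also serve directly as a weight in [0, 1]; these are assumptions of this sketch.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def local_mean_std(img, win):
    mu = uniform_filter(img, win)
    var = np.maximum(uniform_filter(img * img, win) - mu ** 2, 0.0)
    return mu, np.sqrt(var)

def cv_prime(img, win=21):
    # Local standard deviation over (local mean + global mean): large at edges.
    mu, std = local_mean_std(img, win)
    return std / (mu + img.mean())

def dov(img, noise_std, win=5):
    # Local standard deviation minus the estimated noise level:
    # close to zero in homogeneous regions, larger at edges.
    _, std = local_mean_std(img, win)
    return std - noise_std
```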

3.3.2. The Proposed Extended Guided Filter

We define the proposed edge-aware weighting (PEAW) s as follows:
s = K \times \mathrm{DoV}_{w=5}
where K is a constant obtained experimentally, and s is incorporated into the cost function E(a_k, b_k). The distinction between a "homogeneous area" and an "edge area" is governed by the parameter ε: areas with variance σ² < ε are smoothed, whereas areas with variance σ² > ε are preserved. Therefore, replacing ε with ε/s in Equation (6) makes it possible to preserve the edges more effectively. In addition, the DoV window size is 5; if w becomes greater than 5, the edges of the DoV map become blurry and the guided filter cannot properly smooth the homogeneous areas around the edges. As mentioned, the EGF minimizes the difference between the output q_i and the input p_i. Equation (20) shows the cost function with the PEAW applied:
E(a_k, b_k) = \sum_{i \in \omega_k} \left( \big(a_k I_i + b_k - p_i\big)^2 + \frac{\varepsilon}{s}\, a_k^2 \right)
The optimal values of a_k and b_k are calculated as:
a_k = \frac{\frac{1}{|\omega|}\sum_{i \in \omega_k} I_i p_i - \mu_k \bar{p}_k}{\sigma_k^2 + \frac{\varepsilon}{s}}
b_k = \bar{p}_k - a_k \mu_k
Finally, \hat{q}_i is defined as follows:
\hat{q}_i = \hat{a}_k I_i + \hat{b}_k
where \hat{a}_k and \hat{b}_k are the mean values of a_k and b_k within the window, respectively. Figure 6 shows the effect of the guided filter with and without DoV on a Lena image.
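The sketch below combines the guided-filter solution with the proposed weighting by replacing the fixed eps with a spatially adaptive eps/s. This per-pixel division is one reading of the adaptive regularization in Equation (20), and K, r, and eps are assumed example values; it is a sketch of the idea rather than the authors' implementation.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def extended_guided_filter(guide, p, s, r=2, eps=1e-2):
    win = 2 * r + 1
    # Adaptive regularization: small where the edge weighting s is large, so
    # edges are preserved; large in homogeneous areas, so they are smoothed.
    eps_adaptive = eps / np.maximum(s, 1e-6)
    mean_I = uniform_filter(guide, win)
    mean_p = uniform_filter(p, win)
    corr_Ip = uniform_filter(guide * p, win)
    var_I = uniform_filter(guide * guide, win) - mean_I ** 2
    a = (corr_Ip - mean_I * mean_p) / (var_I + eps_adaptive)
    b = mean_p - a * mean_I
    return uniform_filter(a, win) * guide + uniform_filter(b, win)
```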

4. Experimental Results

All the experimental outcomes are assessed in MATLAB R2018b on an Intel(R) Core(TM) i5-8500 CPU M 430 @ 3.0 GHz with 16 GB RAM and a 64-bit operating system. Table 1 presents the simulation conditions for the proposed method.
During the experimental work, simulated SAR images, real SAR images, and standard optical images corrupted with speckle are used. In the simulated SAR results, speckle is already present in the reference SAR image, so the real effectiveness and strength of the despeckling scheme is checked by applying the proposed method to 14 standard optical images (e.g., 'Lena', 'Boat', …) with artificially added speckle, instead of using SAR images that already contain speckle noise.
Initially, we used six standard images; the originals are shown in Figure 7. Speckle noise was added with variance σ = 0.04, corresponding to L = 25 looks.
First, the EAWF was applied to the input image; Figure 8a–f shows three noisy images and the corresponding EAWF outcomes. Then, a logarithmic transformation was applied to the resulting EAWF image in order to convert the multiplicative noise into additive noise. Next, the EGF was used to remove the additive noise and preserve edge information. Figure 8j–l shows the first application of the EGF with s = K × DoV, K = 150, where the filtering is guided by the image G, the outcome of the EAWF. As can be seen in Figure 8j–l, some spots appear in homogeneous areas after applying the EGF; these can be reduced by re-applying the EGF with different windows, and the use of CV′ helps to smooth highly homogeneous areas (Figure 8m–o). To avoid over-smoothing, the parameters of this second pass are kept small. In addition, if the WLS filter were applied immediately after the first EGF pass, adjusting its parameters to reach the desired ENL would remove image details. Therefore, the EGF is applied to the image a second time with s = K × CV′, K = 4 (Figure 8p–r), again guided by the EAWF outcome G. Finally, in order to achieve the desired ENL, the WLS filter is applied to the homogeneous regions of the image (Figure 8s–u). Equations (24)–(26) show the steps of applying the WLS filter to the homogeneous regions.
Homogeneous region = (1 − CV′)
Edge region = CV′
Denoised image = WLS(I × (1 − CV′)) + I × CV′
where I is the result of the EGF despeckling. Figure 8s–u shows the final outcome of image despeckling.
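As an illustration of Equations (24)–(26), the sketch below applies a WLS smoother only where CV′ indicates homogeneity and keeps the EGF output in edge regions. Here wls_filter stands in for any edge-preserving WLS implementation (e.g., that of Farbman et al. [30]) and is not provided, and clipping CV′ to [0, 1] is an assumption of this sketch.

```python
import numpy as np

def blend_with_wls(egf_result, cv_prime_map, wls_filter):
    w = np.clip(cv_prime_map, 0.0, 1.0)                # edge weight, CV' assumed in [0, 1]
    homogeneous_part = egf_result * (1.0 - w)          # Equation (24)
    edge_part = egf_result * w                         # Equation (25)
    return wls_filter(homogeneous_part) + edge_part    # Equation (26)
```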
To demonstrate the advantages of the EAWF and EGF within the proposed method, we studied the performance of its sub-filters. The denoising results of EAWF+EGF+WLS, EGF+WLS, and EAWF+WLS are shown step by step in Table 2 to verify the correctness and necessity of the designed components. Table 2 reports the ENL, PSNR (dB), STD, and SSIM values of the despeckled standard images for these sub-filter combinations. The ENL measures the degree of speckle reduction in a homogeneous area; generally, a higher ENL corresponds to better speckle suppression, whereas a larger STD corresponds to more residual speckle.
According to Table 2, the denoising performance of the EGF+WLS filter is reduced for the Monarch image (PSNR = 25.87 dB (−12.67%), SSIM = 0.70 (−21.57%), ENL = 230.84 (−90.34%), and STD = 0.030 (−68.44%)); the conditions are similar for the other images. The mean decline rates of the EGF+WLS filter are −12.53% (PSNR), −24.15% (SSIM), −88.83% (ENL), and −66.25% (STD). Therefore, the use of the EAWF increases PSNR by 12.53%, SSIM by 24.15%, and ENL by 88.83%, and reduces STD by 66.25%. On the other hand, the use of the EGF increases PSNR by 3.32%, SSIM by 10.06%, and ENL by 82.73%, and reduces STD by 58.80%. As the mean decline rates in Table 2 show, the EAWF+WLS filter performs better than the EGF+WLS filter; thus, the EAWF is more effective than the EGF. Since EAWF+EGF+WLS has the best overall performance, the combination of these filters is synergistic and compensates for individual weaknesses. When considering EAWF+EGF+WLS, the PSNR value may be slightly reduced by adding filters (Monarch, Man, Boat, Peppers, and Cameraman); this decrease is less than 2%, but the ENL value increases with each added filter for all images. This increase is between 41% and 329% for the Monarch image, between 22% and 476% for the Man image, between 40% and 318% for the Boat image, between 29% and 647% for the Lena image, between 14% and 340% for the Peppers image, and between 13% and 506% for the Cameraman image. In addition, the STD value improves similarly to the ENL, while the SSIM changes are minor.
In the next comparison, we used 14 standard images; the originals are shown in Figure 9. Speckle noise was added with variance σ = 0.04 (L = 25 looks). The methods used for comparison are SAR-BM3D [28], NLM [29], WLS [30], bitonic [31], guided [32], Lee [33], Frost [34], the anisotropic diffusion filter with memory based on speckle statistics (ADMSS) [35], non-local low-rank (NLLR) [36], SRAD-guided [37], SRAD [38], and Choi et al. [1]. Table 3 lists the optimal parameters of the existing filters.
Table 4 and Table 5 show the PSNR and SSIM values of the despeckled standard images using the proposed and standard methods. The best PSNR and SSIM values are shown in bold and the second-best values are shown in red.
The SAR-BM3D filter shows the best despeckling for the Barbara image (PSNR = 28.32 dB), the Airplane image (PSNR = 28.10 dB), the Hill image (PSNR = 28.30 dB), and the House image (PSNR = 29.83 dB); the proposed method achieves the second-best values for the Barbara, Airplane, and House images. The method proposed by Choi et al. [1] shows the best despeckling for the Lena image (PSNR = 30.13 dB), the Man image (PSNR = 28.55 dB), the Monarch image (PSNR = 29.64 dB), and the Zelda image (PSNR = 32.77 dB). SAR-BM3D achieves a slightly smaller PSNR than Choi et al. for the Lena image (PSNR = 29.91 dB, second-best value), while the proposed method achieves the second-best values for the Man image (PSNR = 28.00 dB), the Monarch image (PSNR = 29.59 dB), and the Zelda image (PSNR = 32.58 dB). The PSNR values of the proposed method show the best despeckling performance for the remaining images (Boat = 28.24 dB; Cameraman = 28.43 dB; Fruits = 27.79 dB; Napoli = 26.83 dB; and Peppers = 28.53 dB).
The SSIM values of the proposed and standard methods are shown in Table 5. As can be seen, the SRAD filter provides the best edge preservation performance for Baboon = 0.65 and Hill = 0.73, and the method of Choi et al. provides the best edge preservation performance for Fruits = 0.78 and Hill = 0.73. SAR-BM3D provides the highest edge preservation performance for Zelda = 0.87, Monarch = 0.90, House = 0.84, Hill = 0.73, Fruits = 0.78, Barbara = 0.84, and Airplane = 0.84. The proposed method exhibits the best edge preservation performance for Airplane = 0.84, Boat = 0.79, Cameraman = 0.82, Fruits = 0.78, Lena = 0.85, Man = 0.78, Monarch = 0.90, Napoli = 0.80, and Peppers = 0.85; for Monarch = 0.90, Fruits = 0.78, and Airplane = 0.84, the proposed method and SAR-BM3D achieve the same value. Table 4 and Table 5 confirm that the proposed method performs excellently compared with the existing standard methods. Table 6 shows the number of best values, the number of second-best values, and their sum for each method.
As shown in Table 6, the performance of the proposed method is better than other methods. Figure 10 exhibits the comparison among the filtering result images of GF, Frost, Lee, bitonic, WLS, NLLR, ADMSS, SRAD, SRAD-Guided, SAR-BM3D filters, the proposed method of Choi et al., and our proposed method, respectively.
In Figure 10b–e and 10g–i, residual speckle is visible in homogeneous areas. The speckle reduction performance of the SRAD-guided and WLS methods is better than that of the GF, Frost, Lee, bitonic, NLLR, ADMSS, and SRAD methods, although these two methods introduce some blurring. The visual quality and edge preservation of the method proposed by Choi et al. and of the SAR-BM3D filter are also excellent, but these filters show artifacts in the homogeneous areas (Figure 10k,l). As can be seen, the proposed method exhibits robust denoising and edge preservation abilities.

5. Experiments on Real SAR Images

To assess the actual performance of the proposed filter, we examined it on real SAR images. In this section, three SAR images are considered (Figure 11, Figure 12, and Figure 13): a rural scene (512 × 512) [39], a capitol building scene (1232 × 803), and C-130s on a flight line (600 × 418) [40]. Table 7 shows that the proposed method has the best performance in terms of ENL, and the WLS filter ranks second in speckle suppression. For SAR images, the noise level is related to L; when the number of looks is unknown, averaging several ENL estimates is a common way to obtain L [41]. According to Table 7, the estimated number of looks is L = 15.
Figure 11 shows the outcomes of the despeckling filters, demonstrating that some methods, such as guided, Frost, Lee, bitonic, NLLR, and SRAD, do not have strong denoising ability (Figure 11b–e,g,i). Table 7 shows that the proposed method gives the best ENL value and WLS the second-best, but WLS shows a blurring phenomenon (Figure 11f). Compared with SAR-BM3D and the method of Choi et al., the SRAD-guided method exhibits lower edge preservation and denoising performance. The SAR-BM3D and Choi et al. methods have good edge preservation and despeckling performance, but artifacts in the homogeneous areas are noticeable (Figure 11k). A comparison of the proposed method with the Choi et al. and SAR-BM3D filters shows that the proposed method has better edge preservation (Figure 11m).
Next, another real SAR image is evaluated using ENL and STD (Figure 12, Table 8). Generally, a large STD corresponds to higher residual speckle. The methods used for comparison are GF, SAR-BM3D, the bilateral filter, the fast bilateral filter, WLS, DPAD, and the method proposed by Fang et al. [42]. Table 9 presents the parameters of the different filters.
Figure 12 shows that the bilateral filter does not have strong despeckling ability (Figure 12e), while the WLS and fast bilateral filters blur the despeckled image (Figure 12d,f). Based on Table 8, the proposed method yields the second-best ENL and STD values while exhibiting excellent edge preservation (Figure 12g). According to Table 8, the estimated number of looks is 23. Table 8 also shows that SAR-BM3D performs well, but, as can be seen in Figure 12 (regions 1 and 2), sharp changes in the homogeneous area lead to artifacts there. Table 8 shows that WLS gives the best ENL and STD values. To compare the proposed method with the WLS filter, we examined the pixel profiles of the two filtered results and the real SAR image: the WLS filter smooths the edge areas (Figure 12, regions 3 and 4), whereas the proposed method preserves the edges of the image.
In the simulated SAR image experiment, speckle is generated by modeling the multiplicative noise with Equation (14). The noise distribution of actual SAR images is unknown, so they cannot be used to test the algorithm at different noise variances; for this purpose, simulated SAR images are introduced [43]. A simulated SAR image is therefore evaluated with PSNR, SNR, SSIM, and MAE (Table 10). Figure 13 shows the original SAR image and the noisy images with L = 25, 16, 12, and 10 looks, corresponding to noise variances of 0.04, 0.06, 0.08, and 0.1. The purpose of the simulated SAR experiment is to test the validity, robustness, effectiveness, and adaptivity of the despeckling method at diverse noise variances. The despeckling is applied to a speckled SAR image (600 × 418) to validate the efficiency of the proposed method.
According to Table 10, the proposed method achieved the best results in the three evaluation indexes of SNR, PSNR, and MAE when noise variance is 0.06, 0.08, and 0.1. Table 10 shows that the proposed method represents the second-best SSIM values when noise variance is 0.04 and 0.08. This means that this algorithm showed satisfactory results. Generally, the proposed method can preserve the edges of the image, suppress the noise effectively, and retain the edge details to some extent.

6. Computational Complexity

The time consumption of the proposed and standard methods for 17 images is shown in Table 11, Table 12, Table 13, Table 14, Table 15, Table 16, Table 17 and Table 18. According to Table 11, Table 12, and Table 13, the proposed method is faster than Lee, NLLR, ADMSS, SRAD, SRAD-guided, SAR-BM3D, and Choi et al. In addition, according to Table 14 and Table 16, the proposed method is faster than the bilateral filter, DPAD, SRAD, SRAD-guided, SAR-BM3D, and Fang et al. The average time cost of the proposed method is about 1.92 s for the fourteen standard images and 2.76 s, 11.20 s, and 3.31 s for the three SAR images, respectively. In addition, for a more detailed examination, the runtimes of all sub-filters of the proposed method are reported (Table 12, Table 15, Table 16 and Table 18).

7. Discussion

This study used the EAWF, EGF, and WLS filters for despeckling SAR images. Most conventional methods are developed for additive white Gaussian noise, since additive noise is common in sensing and imaging systems. Because speckle is multiplicative noise, the EAWF is employed to reduce the noise level and increase the PSNR.
During the experimental work, simulated SAR images, real SAR images, and standard optical images corrupted with speckle are used. To assess the actual performance of the proposed filter, we examined it on real SAR images. The purpose of using the simulated SAR image is to test the validity, robustness, effectiveness, and adaptivity of the despeckling method at diverse noise variances. In the simulated SAR results, speckle is already present in the reference SAR image, so the real effectiveness and strength of the despeckling scheme are checked using standard optical images corrupted with speckle. According to Table 2, the use of the EAWF increases PSNR by 12.53%, SSIM by 24.15%, and ENL by 88.83%, and reduces STD by 66.25%. On the other hand, the use of the EGF increases PSNR by 3.32%, SSIM by 10.06%, and ENL by 82.73%, and reduces STD by 58.80%. Thus, the EAWF is more effective than the EGF. Since EAWF + EGF + WLS has the best performance, the combination of these filters is synergistic. When considering EAWF + EGF + WLS, the change in PSNR is slight (less than about 2%), but the ENL value increases between 41% and 329% for the Monarch image, between 22% and 476% for the Man image, between 40% and 318% for the Boat image, between 29% and 647% for the Lena image, between 14% and 340% for the Peppers image, and between 13% and 506% for the Cameraman image.
The EGF shows excellent noise removal and edge preservation; as can be seen in Figure 6, the new edge-aware EGF is better than the classic GF. According to Table 7, the WLS filter exhibits the best ENL value among all filters except the proposed method, together with good edge preservation, so we adopted the WLS method to increase the ENL. As shown in Table 4, Table 5, and Table 6, the proposed method has the best outcome in PSNR (5 best and 7 second-best values; it shows the best despeckling performance for Boat = 28.24 dB, Cameraman = 28.43 dB, Fruits = 27.79 dB, Napoli = 26.83 dB, and Peppers = 28.53 dB) and in SSIM (9 best and 4 second-best values; it shows the best edge preservation for Airplane = 0.84, Boat = 0.79, Cameraman = 0.82, Fruits = 0.78, Lena = 0.85, Man = 0.78, Monarch = 0.90, Napoli = 0.80, and Peppers = 0.85). For SAR image 1, the proposed method demonstrates the best ENL value in Table 7. For SAR image 2, it has the second-best ENL and STD values in Table 8. For SAR image 3, according to Table 10, the proposed method achieves the best results in the three evaluation indexes of SNR, PSNR, and MAE when the noise variance is 0.06, 0.08, and 0.1, and the second-best SSIM values when the noise variance is 0.04 and 0.08. According to Table 11, Table 12, Table 13, Table 14, Table 15 and Table 16, the proposed method is faster than Lee, NLLR, ADMSS, SRAD, SRAD-guided, SAR-BM3D, the bilateral filter, DPAD, the method of Fang et al., and the method of Choi et al. The average time cost of the proposed method is about 1.92 s for the fourteen standard images (image sizes: 512 × 512 and 256 × 256) and 2.76 s (512 × 512), 11.20 s (1323 × 803), and 3.31 s (600 × 418) for the three SAR images, respectively. The experimental outcomes show that the proposed method achieves outstanding despeckling and low time complexity while preserving edge information.

8. Conclusions

We propose a hybrid filter based on the EAWF, EGF, and WLS filter to remove speckle noise. The EAWF is used as a preprocessing filter and is applied directly to the SAR image. After that, the logarithmic transform is applied to convert the multiplicative noise into additive noise. We also extended the GF with a new edge-aware weighting; the proposed EGF removes additive noise and preserves edge information more effectively than the GF. After applying the EGF to the image, some spots appear in homogeneous areas, which can be reduced by re-applying the EGF with different windows. Finally, in order to achieve the desired ENL, the WLS filter is applied to the homogeneous regions of the image. For a thorough evaluation, we used simulated SAR images, real SAR images, and standard optical images with speckle noise. The experimental outcomes show that the proposed method provides the best despeckling and edge preservation and has an acceptable runtime as well.

Author Contributions

H.S. wrote this paper and implemented the simulation. J.V. and T.A. examined and analyzed the results, and A.K. and S.Y.B.R. edited this paper. All authors have read and agreed to the published version of the manuscript.

Funding

This paper was not funded.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Choi, H.; Jeong, J. Speckle Noise Reduction Technique for SAR Images Using Statistical Characteristics of Speckle Noise and Discrete Wavelet Transform. Remote Sens. 2019, 11, 1184. [Google Scholar] [CrossRef] [Green Version]
  2. Lopes, A.; Touzi, R.; Nezry, E. Adaptive speckle filters and scene heterogeneity. IEEE Trans. Geosci. Remote Sens. 1990, 28, 992–1000. [Google Scholar] [CrossRef]
  3. Lee, J.S. Refined filtering of image noise using local statistics. Comput. Graph. Image Process. 1981, 15, 380–389. [Google Scholar] [CrossRef]
  4. Liu, S.; Liu, T.; Gao, L.; Li, H.; Hu, Q.; Zhao, J.; Wang, C. Convolution neural network and guided filtering for SAR image denoising. Remote Sens. 2019, 11, 702. [Google Scholar] [CrossRef] [Green Version]
  5. Liu, S.; Hu, Q.; Li, P.; Zhao, J.; Zhu, Z. SAR image denoising based on patch ordering in nonsubsample shearlet domain. Turk. J. Electr. Eng. Comput. Sci. 2018, 26, 1860–1870. [Google Scholar] [CrossRef]
  6. Li, S.; Kang, X.D.; Fang, L.; Hu, J.; Yin, H. Pixel-level image fusion: A survey of the state of the art. Inform. Fusion. 2017, 33, 100–112. [Google Scholar] [CrossRef]
  7. Barash, D.A. Fundamental relationship between bilateral filtering, adaptive smoothing and the nonlinear diffusion equation. IEEE Trans. Pattern Anal. Machine Intell. 2002, 24, 844–867. [Google Scholar] [CrossRef] [Green Version]
  8. Tomasi, C.; Manduchi, R. Bilateral Filtering for Gray and Color Images. In Proceedings of the Sixth International Conference on Computer Vision, Bombay, India, 4–7 January 1998; pp. 839–846. [Google Scholar]
  9. Liu, G.; Zhong, H. Nonlocal Means Filter for Polarimetric SAR Data Despeckling Based on Discriminative Similarity Measure. IEEE Geosci. Remote Sens. Lett. 2014, 11, 514–518. [Google Scholar] [CrossRef]
  10. Zhang, M.; Gunturk, B.K. Multiresolution Bilateral Filtering for Image Denoising. IEEE Trans. Image Process. 2008, 17, 2324–2333. [Google Scholar] [CrossRef] [Green Version]
  11. Anantrasirichai, N.; Nicholson, L.; Morgan, J.E.; Erchova, I.; Mortlock, K.; North, R.V.; Albon, J.; Achim, A. Adaptive-weighted bilateral filtering and other pre-processing techniques for optical coherence tomography. Comput. Med. Imaging Graph. 2014, 38, 526–539. [Google Scholar] [CrossRef] [Green Version]
  12. Torres, L.; Sant’Anna, S.J.S.; Freitas, C.D.C.; Frery, C. Speckle reduction in polarimetric SAR imagery with stochastic distances and nonlocal means. Pattern Recognit. 2014, 47, 141–157. [Google Scholar] [CrossRef] [Green Version]
  13. Martín-de-Nicolás, J.; Jarabo-Amores, M.; Mata-Moya, D.; del-Rey-Maestre, N.; Bárcena-Humanes, J. Statistical analysis of SAR sea clutter for classification purposes. Remote Sens. 2014, 6, 9379–9411. [Google Scholar] [CrossRef] [Green Version]
  14. Xu, W.; Tang, C.; Gu, F.; Cheng, J. Combination of oriented partial differential equation and shearlet transform for denoising in electronic speckle pattern interferometry fringe patterns. Appl. Opt. 2017, 56, 2843–2850. [Google Scholar] [CrossRef] [PubMed]
  15. Li, J.C.; Ma, Z.H.; Peng, Y.X.; Huang, H. Speckle reduction by image entropy anisotropic diffusion. Acta Phys. Sin. 2013, 62, 099501. [Google Scholar]
  16. Cheng, H.; Tian, J. Speckle reduction of synthetic aperture radar images based on fuzzy logic. In Proceedings of the First International Workshop on Education Technology and Computer Science, IEEE Computer Society, Wuhan, China, 7–8 March 2009; pp. 933–937. [Google Scholar]
  17. Babu, J.J.J.; Sudha, G.F. Adaptive speckle reduction in ultrasound images using fuzzy logic on Coefficient of Variation. Biomed. Signal Processing and Control. 2016, 23, 93–103. [Google Scholar] [CrossRef]
  18. Rezaei, H.; Karami, A. SAR image denoising using homomorphic and shearlet transforms. In Proceedings of the International Conference on Pattern Recognition & Image Analysis, Shahrekord, Iran, 19–20 April 2017; pp. 1–5. [Google Scholar]
  19. Min, D.; Cheng, P.; Chan, A.K.; Loguinov, D. Bayesian wavelet shrinkage with edge detection for SAR image despeckling. IEEE Trans. Geosci. Remote Sens. 2004, 42, 1642–1648. [Google Scholar] [CrossRef] [Green Version]
  20. Loganayagi, T.; Kashwan, K.R.A. Robust edge preserving bilateral filter for ultrasound kidney image. Indian J Sci Technol. 2015, 8, 1–10. [Google Scholar] [CrossRef] [Green Version]
  21. Kumar, M.; Diwakar, M. CT Image Denoising Using Locally Adaptive Shrinkage Rule In Tetrolet Domain. J. King Saud Univ. – Comput. Inf. Sci. 2016, 30, 41–50. [Google Scholar] [CrossRef] [Green Version]
  22. Mehta, S. Speckle noise reduction using hybrid wavelet packets-Wiener filter. Int. J. Comput. Sci. Eng. 2017, 5, 95–99. [Google Scholar] [CrossRef]
  23. Zhang, J.; Lin, G.; Wu, L.; Wang, C.; Cheng, Y. Wavelet and fast bilateral filter based de-speckling method for medical ultrasound images. Biomed. signal processing and control. 2015, 18, 1–10. [Google Scholar] [CrossRef]
  24. Sari, S.; Al Fakkri, S.Z.H.; Roslan, H.; Tukiran, Z. Development of denoising method for digital image in low-light condition. In Proceedings of the IEEE International Conference on Control System, Computing and Engineering, Penang, Malaysia, 29 November–1 December 2013. [Google Scholar]
  25. Dai, L.; Yuan, M.; Tang, L.; Xie, Y.; Zhang, X.; Tang, J. Interpreting and extending the guided filter via cyclic coordinate descent. IEEE Trans. Image Process. 2019, 28, 767–778. [Google Scholar] [CrossRef] [PubMed]
  26. Loizou, C.P.; Pattichis, C.S. Despeckle filtering for ultrasound imaging and video, Volume I: Algorithms and Software; Second Ed. Morgan & Claypool Publishers: San Rafael, CA, USA, 2015. [Google Scholar]
  27. Xue, B.; Huang, Y.; Yang, J.; Shi, L.; Zhan, Y.; Cao, X. Fast nonlocal remote sensing image denoising using cosine integral images. IEEE Geosci. Remote Sens. Lett. 2013, 10, 1309–1313. [Google Scholar] [CrossRef]
  28. Parrilli, S.; Poderico, M.; Angelino, C.V.; Verdoliva, L. A nonlocal SAR image denoising algorithm based on LLMMSE wavelet shrinkage. IEEE Trans. Geosci. Remote Sens. 2012, 50, 606–616. [Google Scholar] [CrossRef]
  29. Buades, A.; Coll, B.; Morel, J.-M. A non-local algorithm for image denoising. In Proceedings of the 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, San Diego, CA, USA, 20–25 June 2005; pp. 60–65. [Google Scholar]
  30. Farbman, Z.; Fattal, R.; Lischinski, D.; Szeliski, R. Edge-preserving decomposition for multi-scale tone and detail manipulation. ACM Trans. Graph. 2008, 27, 1–67. [Google Scholar] [CrossRef]
  31. Treece, G. The bitonic filter: Linear filtering in an edge-preserving morphological framework. IEEE Trans. Image Process. 2016, 25, 5199–5211. [Google Scholar] [CrossRef] [PubMed]
  32. He, K.; Sun, J.; Tang, X. Guided image filtering. IEEE Trans. Pattern Anal. Mach. Intell. 2013, 35, 1397–1409. [Google Scholar] [CrossRef]
  33. Lee, S.T. Digital image enhancement and noise filtering by use of local statistics. IEEE Trans. Pattern Anal. Mach. Intell. 1980, 2, 165–168. [Google Scholar] [CrossRef] [Green Version]
  34. Frost, V.S.; Stiles, J.A.; Shanmugan, K.S.; Holtzman, J.C. A model for radar images and its application to adaptive digital filtering of multiplicative noise. IEEE Trans. Pattern Anal. Mach. Intell. 1982, 4, 157–166. [Google Scholar] [CrossRef]
  35. Ramos-Llordén, G.; Vegas-Sánchez-Ferrero, G.; Martin-Fernández, M.; Alberola-López, C.; Aja-Fernández, S. Anisotropic diffusion filter with memory based on speckle statistics for ultrasound images. IEEE Trans. Image Process. 2015, 24, 345–358. [Google Scholar] [CrossRef] [Green Version]
  36. Zhu, L.; Fu, C.-W.; Brown, M.S.; Heng, P.-A. A non-local low-rank framework for ultrasound speckle reduction. In Proceedings of the 2017 IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 21–26 July 2017; pp. 5650–5658. [Google Scholar]
  37. Hyunho, C.; Jechang, J. Speckle noise reduction in ultrasound images using SRAD and guided filter. In Proceedings of the International Workshop on Advanced Image Technology, Chiang Mai, Thailand, 7–9 January 2018; pp. 1–4. [Google Scholar]
  38. Yu, Y.; Acton, S.T. Speckle reducing anisotropic diffusion. IEEE Trans. Image Process. 2002, 11, 1260–1270. [Google Scholar]
  39. Dataset of Standard 512X512 Grayscale Test Images. Available online: http://decsai.ugr.es/cvg/CG/base.htm (accessed on 30 December 2018).
  40. Dataset of Standard Test Images. Available online: https://www.sandia.gov/RADAR/imagery/#xBand2 (accessed on 20 September 2019).
  41. Pan, T.; Peng, D.; Yang, W.; Li, H.-C. A Filter for SAR Image Despeckling Using Pre-Trained Convolutional Neural Network Model. Remote Sens. 2019, 11, 2379. [Google Scholar] [CrossRef] [Green Version]
  42. Fang, J.; Hu, S.; Ma, X. A Boosting SAR Image Despeckling Method Based on Non-Local Weighted Group Low-Rank Representation. Sensors 2018, 18, 3448. [Google Scholar] [CrossRef] [Green Version]
  43. Singh, P.; Shree, R. A new SAR image despeckling using directional smoothing filter and method noise thresholding. Eng. Sci. Technol. Int. Journal 2018, 21, 589–610. [Google Scholar] [CrossRef]
Figure 1. The flow diagram of the proposed algorithm.
Figure 2. (a) Noisy image (σ² = 0.3), (b) homogeneous adaptive Wiener filter (HAWF), (c) adaptive Wiener filter (AWF), (d) extended adaptive Wiener filter (EAWF).
Figure 3. Comparisons of the HAWF, the EAWF, the homogeneous fast bilateral filter, and the homogeneous wavelet filter using the PSNR, SSIM, IQI, and Pratt's FOM quantitative measurements: (a) PSNR comparison; (b) SSIM comparison; (c) IQI comparison; (d) Pratt's FOM comparison.
Figure 4. Comparison of the CV and the standard deviation of the image: (a) noisy image; (b) result of applying the standard deviation to the noisy image; (c) result of applying the CV to the noisy image.
Figure 5. The outcome of the difference of variances (DoV) under different levels of speckle noise.
Figure 6. Comparison of the EGF and GF performance: (a) result of the EGF on a noisy image; (b) result of the GF on a noisy image; (c) comparison of SSIM values for images denoised by the EGF and by the GF; (d) comparison of PSNR values for images denoised by the EGF and by the GF.
Figure 7. (a) Man (512 × 512); (b) Monarch (748 × 512); (c) Boat (512 × 512); (d) Lena (512 × 512); (e) Peppers (512 × 512); (f) Cameraman (512 × 512).
Figure 8. Proposed filter steps. (a–c) Noisy images; (d–f) applying the EAWF; (g–i) DoV with window = (5 × 5); (j–l) applying the EGF for the first time; (m–o) CV′ with window = (21 × 21); (p–r) applying the EGF for the second time; (s–u) final outcome of image despeckling.
Figure 9. Test images. (a) Baboon (512 × 512); (b) Lena (512 × 512); (c) Cameraman (256 × 256); (d) Airplane (512 × 512); (e) Hill (512 × 512); (f) Man (512 × 512); (g) Barbara (512 × 512); (h) Peppers (256 × 256); (i) Boat (512 × 512); (j) House (256 × 256); (k) Monarch (748 × 512); (l) Napoli (512 × 512); (m) Fruits (512 × 512); (n) Zelda (512 × 512).
Figure 10. Performance comparison of different techniques using the Monarch image. (a) Noisy; (b) guided; (c) Frost; (d) Lee; (e) bitonic; (f) WLS; (g) NLLR; (h) ADMSS; (i) SRAD; (j) SRAD-guided; (k) SAR-BM3D; (l) proposed method of Choi et al.; (m) proposed method; (n) original image; (o) signal on image; (p) comparison of the original signal and denoised signal; (q) comparison of the denoised signal and degraded signal.
Figure 11. Performance comparison of different techniques on SAR image 1. (a) Noisy; (b) guided; (c) Frost; (d) Lee; (e) bitonic; (f) WLS; (g) NLLR; (h) ADMSS; (i) SRAD; (j) SRAD-guided; (k) SAR-BM3D; (l) proposed method of Choi et al.; (m) proposed method; (n) signal on image; (o) comparison of the denoised signal and degraded signal.
Figure 12. (a) Real SAR image 2, (b) denoised SAR image by DPAD filter, (c) denoised SAR image by SAR-BM3D filter, (d) denoised SAR image by fast bilateral filter, (e) denoised SAR image by bilateral filter, (f) denoised SAR image by WLS filter, (g) denoised SAR image by the proposed method, (h) signal on proposed denoised image, (i) comparison of proposed denoised signal and degraded signal, (j) comparison of SAR-BM3D denoised signal and degraded signal, (k) comparison of WLS filter denoised signal and degraded signal.
Figure 13. (a) SAR image, (b) noisy image (noise variance = 0.04), (c) noisy image (noise variance = 0.06), (d) noisy image (noise variance = 0.08), (e) noisy image (noise variance = 0.1).
Table 1. Simulation conditions for the proposed method.
EAWF: window size = (3 × 3).
EGF (first pass): NeighborhoodSize = 8; DegreeOfSmoothing = 0.2 × diff(getrangefromclass(I))²; guided by the outcome of the EAWF.
EGF (second pass): NeighborhoodSize = 7; DegreeOfSmoothing = 0.01 × diff(getrangefromclass(I))²; guided by the outcome of the EAWF.
WLS filter: Lambda = 0.1; Alpha = 1.
Table 2. Performance of the sub-filters of the proposed method.
EAWF + EGF + WLSEGF + WLSEAWF + WLS
Noise VarianceFiltersPSNRSSIMENLSTDPSNRSSIMENLSTDPSNRSSIMENLSTD
0.04Noisy image21.1170.344224.4060.094321.11730.3442524.4050.094321.1170.344224.4060.0943
EAWF28.0130.7221191.530.0336----27.9860.7203183.340.0345
EAWF+EGF29.7360.8675821.30.016122.45280.3953837.3320.0758----
EAWF+2(EGF)29.8130.89531683.60.011323.84740.5064975.1130.0531----
EAWF+2(EGF)+WLS29.6180.89582388.90.009525.86510.70259230.840.030129.2650.8338429.120.0225
The rate of decline-----12.67%-21.57%-90.34%-68.44%-1.19%-6.92%-82.04%-57.78%
Man (512 × 512)0.04Noisy image21.670.574125.1940.122621.670.574125.1940.122621.670.574125.1940.1226
EAWF27.9530.7722161.710.0485----27.9530.7722161.710.0485
EAWF+EGF28.930.8198928.790.020223.88730.6259760.5490.0779----
EAWF+2(EGF)28.650.8091463.40.016124.63020.6497496.9230.0614----
EAWF+2(EGF)+WLS28.0440.7821798.20.014526.56940.73599221.20.040528.5060.795293.840.0357
The rate of decline-----5.26%-5.89%-87.70%-64.20%1.65%1.66%-83.66%-59.38%
Boat (512 × 512)0.04Noisy image18.8920.353624.8940.125618.89220.3536324.8940.125618.8920.353624.8940.1256
EAWF26.0140.6072162.640.0492----26.0140.6072162.640.0492
EAWF + EGF28.210.7562678.860.02420.82530.3855859.8270.07980----
EAWF + 2(EGF)28.4140.79021344.50.017121.47290.4002295.7680.06290----
EAWF + 2(EGF) + WLS28.2830.79271883.50.014423.16360.45335218.560.0414927.140.6697324.690.0348
The rate of decline-----18.10%-42.81%-88.40%-65.29%-4.04%-15.51%-82.76%-58.62%
Lena (512 × 512)0.04Noisy image19.80.397824.0380.133319.79950.3977924.0370.133319.80.397824.0380.1333
EAWF26.8730.6713171.850.0499 26.8730.6713171.850.0499
EAWF + EGF28.9510.84971279.70.018222.43460.4639349.8160.0923
EAWF + 2(EGF)29.7480.85762240.60.013823.70640.5209487.3600.0693
EAWF + 2(EGF) + WLS30.1370.85042907.60.012125.70090.65416203.840.045227.9140.7469312.740.0371
The rate of decline-----14.72%-23.07%-92.99%-73.23%-7.38%-12.16%-89.24%-67.39%
Peppers (512 × 512)0.04Noisy image19.9870.353924.3560.137919.9870.3538524.3550.137919.9870.353924.3560.1379
EAWF26.7840.6106127.210.0603----26.7840.6106127.210.0603
EAWF + EGF29.8620.7598559.20.028822.641960.43049447.3450.0984----
EAWF + 2(EGF)29.9650.7729814.290.023823.74080.4767569.7940.0807----
EAWF + 2(EGF) + WLS29.9060.7742930.470.022325.95450.60132125.540.0628.2780.6835169.520.0525
The rate of decline-----13.21%-22.33%-86.51%-62.83%-5.44%-11.71%-81.78%-57.52%
Cameraman (512 × 512)0.04Noisy image19.640.397224.430.131319.63950.3971624.4300.131319.640.397224.430.1313
EAWF26.7840.6273165.070.0505----26.7840.6273165.070.0505
EAWF + EGF29.4010.8302836.910.022422.43360.4535252.3430.0888----
EAWF + 2(EGF)29.3510.84391126.10.019323.65730.4908379.7340.0717----
EAWF + 2(EGF) + WLS29.1210.83781279.70.018125.8590.59268165.470.049628.0960.7062295.530.0378
The rate of decline-----11.20%-29.25%-87.07%-63.51%-3.52%-15.70%-76.91%-52.12%
Mean of decline rate-----12.53%-24.15%-88.83%-66.25%-3.32%-10.06%-82.73%-58.80%
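The figures of merit used in this and the later tables are standard ones. As a minimal sketch, assuming den is the despeckled image, ref the noise-free reference (both double-valued on the same scale), and roi a homogeneous patch of den, they can be computed in MATLAB as follows.

```matlab
% Standard quality measures used in the result tables (sketch only).
p   = psnr(den, ref);                  % peak signal-to-noise ratio, in dB
s   = ssim(den, ref);                  % structural similarity index

% ENL and STD are evaluated on a homogeneous region of interest; a higher
% ENL (mean^2 / variance) indicates stronger speckle suppression.
enl = mean(roi(:))^2 / var(roi(:));
sd  = std(roi(:));
```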
Table 3. The optimal parameters of the existing filters for the standard images.
Methods and optimal parameters:
NLM: mask size = 3 × 3.
Frost: mask size = 3 × 3.
Lee: mask size = 3 × 3.
Bitonic: mask size = 3 × 3.
WLS: mask size = 3 × 3, λ = 3.
NLLR: Β = 10, H = 10.
ADMSS: Δt = 0.5, σ = ρ = 0.1, n_iter = 15.
SAR-BM3D: number of rows/cols of block = 9; maximum size of the 3rd dimension of a stack = 16; diameter of search area = 39; dimension of step = 3; parameter of the 2D Kaiser window = 2; UDWT transform = daub4.
Table 4. Peak signal-to-noise ratio (PSNR) results (in dB) for each standard image (in the published table, the best value is shown in bold and the second-best in red).
Image: Noisy, NLM, Guided, Frost, Lee, Bitonic, WLS, NLLR, ADMSS, SRAD, SRAD-Guided, SAR-BM3D, Choi et al., Proposed
Airplane: 16.53, 19.12, 19.14, 22.06, 23.78, 26.18, 24.97, 17.39, 23.43, 26.97, 26.53, 28.10, 27.45, 27.91
Baboon: 18.49, 21.13, 21.09, 21.08, 21.91, 21.97, 21.97, 19.53, 18.28, 23.52, 22.07, 22.51, 22.92, 23.08
Barbara: 19.16, 22.40, 22.05, 22.34, 23.26, 23.68, 23.78, 20.39, 20.50, 24.99, 23.75, 28.32, 24.59, 25.94
Boat: 18.46, 21.81, 21.68, 23.36, 19.41, 26.39, 25.50, 19.65, 20.14, 27.37, 26.59, 27.20, 27.55, 28.24
Camera-man: 18.66, 21.65, 21.59, 22.41, 22.85, 24.43, 25.03, 19.75, 17.59, 26.73, 24.71, 26.35, 26.87, 28.43
Fruits: 17.08, 19.96, 19.98, 22.30, 24.08, 26.33, 26.31, 18.04, 22.07, 27.45, 26.93, 27.68, 27.45, 27.79
Hill: 19.79, 23.54, 23.38, 24.64, 25.48, 27.58, 26.75, 21.26, 24.92, 28.25, 27.82, 28.30, 28.27, 28.19
House: 17.93, 21.16, 21.02, 23.26, 25.06, 27.38, 25.93, 19.09, 22.46, 27.58, 27.81, 29.83, 28.58, 28.90
Lena: 18.84, 22.45, 22.31, 24.29, 25.88, 28.54, 27.39, 20.11, 21.88, 29.69, 28.99, 29.91, 30.13, 28.38
Man: 19.51, 23.07, 22.94, 24.41, 26.15, 27.46, 26.46, 20.83, 20.82, 28.31, 27.68, 27.71, 28.55, 28.00
Monarch: 20.19, 24.55, 24.10, 25.11, 26.76, 27.70, 25.87, 21.99, 24.00, 29.50, 28.03, 29.54, 29.64, 29.59
Napoli: 21.00, 24.62, 24.27, 24.06, 24.48, 24.34, 23.69, 22.71, 22.90, 26.41, 24.34, 25.14, 25.70, 26.83
Peppers: 18.74, 22.05, 21.79, 23.50, 22.92, 26.62, 25.77, 19.96, 18.13, 28.29, 27.22, 27.13, 28.44, 28.53
Zelda: 21.18, 26.23, 25.94, 26.71, 28.62, 31.40, 30.66, 23.19, 29.28, 32.67, 32.20, 32.38, 32.77, 32.58
Table 5. Structural similarity (SSIM) results for each standard image (in the published table, the best value is shown in bold and the second-best in red).
Image: Noisy, NLM, Guided, Frost, Lee, Bitonic, WLS, NLLR, ADMSS, SRAD, SRAD-Guided, SAR-BM3D, Choi et al., Proposed
Airplane: 0.21, 0.29, 0.28, 0.37, 0.50, 0.66, 0.70, 0.25, 0.73, 0.72, 0.76, 0.84, 0.82, 0.84
Baboon: 0.49, 0.56, 0.56, 0.47, 0.54, 0.52, 0.53, 0.53, 0.39, 0.65, 0.53, 0.56, 0.61, 0.57
Barbara: 0.44, 0.61, 0.57, 0.50, 0.60, 0.64, 0.67, 0.55, 0.52, 0.68, 0.65, 0.84, 0.69, 0.75
Boat: 0.33, 0.46, 0.44, 0.47, 0.60, 0.68, 0.67, 0.40, 0.39, 0.71, 0.70, 0.72, 0.73, 0.79
Camera-man: 0.42, 0.49, 0.48, 0.48, 0.57, 0.67, 0.73, 0.45, 0.36, 0.76, 0.74, 0.80, 0.80, 0.83
Fruits: 0.18, 0.28, 0.27, 0.33, 0.48, 0.64, 0.70, 0.23, 0.43, 0.76, 0.76, 0.78, 0.78, 0.78
Hill: 0.38, 0.56, 0.54, 0.53, 0.64, 0.69, 0.68, 0.49, 0.58, 0.73, 0.71, 0.73, 0.73, 0.72
House: 0.25, 0.41, 0.38, 0.41, 0.53, 0.67, 0.71, 0.33, 0.53, 0.78, 0.76, 0.84, 0.78, 0.81
Lena: 0.29, 0.45, 0.43, 0.45, 0.60, 0.73, 0.75, 0.38, 0.47, 0.81, 0.75, 0.84, 0.83, 0.85
Man: 0.37, 0.56, 0.54, 0.54, 0.66, 0.72, 0.71, 0.50, 0.50, 0.76, 0.74, 0.76, 0.77, 0.78
Monarch: 0.31, 0.60, 0.55, 0.53, 0.69, 0.81, 0.83, 0.47, 0.80, 0.86, 0.88, 0.90, 0.89, 0.90
Napoli: 0.49, 0.72, 0.69, 0.61, 0.69, 0.70, 0.68, 0.67, 0.66, 0.77, 0.70, 0.73, 0.75, 0.80
Peppers: 0.36, 0.54, 0.52, 0.54, 0.65, 0.77, 0.77, 0.46, 0.36, 0.82, 0.82, 0.83, 0.84, 0.85
Zelda: 0.35, 0.61, 0.58, 0.55, 0.70, 0.80, 0.82, 0.51, 0.77, 0.86, 0.85, 0.87, 0.86, 0.86
Table 6. The number of best and second-best results per method, and their total.
Filters: Best, Second Best, Total
Proposed Method: 14 (41%), 11 (34%), 25 (38%)
SAR-BM3D: 11 (32%), 4 (12%), 15 (23%)
Choi et al.: 6 (17%), 12 (37%), 18 (27%)
SRAD: 3 (8%), 4 (12%), 6 (9%)
SRAD-Guided: 0 (0%), 1 (3%), 1 (1%)
Total: 34, 32, 65
Table 7. Equivalent number of looks (ENL) results for SAR image 1.
Columns (ENL): Noisy, NLM, Guided, Frost, Lee, Bitonic, WLS, NLLR, ADMSS, SRAD, SRAD-Guided, SAR-BM3D, Choi et al., Proposed.
ROI 114.325629.5316.248.86299.3207.621.2201.6146.91174.02186.54205.89370.79
ROI 216.604128.6613.139.45180.6180.420.56124.8117.17141.3129.35160.67208.54
Table 8. ENL and STD results for SAR image 2.
Columns: for each of Noisy, Bilateral, Fast Bilateral, GF, WLS, DPAD, SRAD, SAR-BM3D, and the proposed method, the ENL and STD values are reported.
ROI 17.8512.7548.285. 0781.8603.82778.753.80137.82.9019.3611.549.78311.2726.67.11988.6243.799
ROI 239.519.62333.76.73617.364.957201.68.6620452.675165.29.586179.69.1715975.0561107.53.658
Table 9. The parameters of the different filters used in this comparison.
Methods and parameters:
Guided: nhoodSize = 8; smoothValue = 0.01 × diff(getrangefromclass(A))^2.
SAR-BM3D: number of looks = 1.
Bilateral: window size = 3, spatial parameter = 2, intensity parameter = 1.11.
Fast Bilateral: spatial parameter = 3, intensity parameter = 100.
WLS: Lambda = 0.5.
DPAD: noise estimation method = 5; the statistics for noise estimation are computed on a 5 × 5 square window; simplified SRAD = 'aja'.
Table 10. PSNR, SNR, SSIM, and MAE results for SAR image 3 (in the published table, the best value is shown in bold and the second-best in red).
Noise variance 0.04 (PSNR, SNR, SSIM, MAE):
Proposed Method: 30.5096, 17.8982, 0.7288, 0.0206
SRAD: 30.4319, 17.8905, 0.7117, 0.0208
DPAD: 31.0851, 18.4737, 0.8074, 0.0196
SAR-BM3D: 30.1945, 17.5832, 0.7234, 0.0216
WLS: 25.7167, 13.1054, 0.6390, 0.0306
Guided: 28.3137, 15.702, 0.6879, 0.0255
Bilateral: 28.1968, 15.5854, 0.6923, 0.0243
Fast Bilateral: 27.4941, 14.8827, 0.6603, 0.0264
Noise variance 0.06 (PSNR, SNR, SSIM, MAE):
Proposed Method: 30.2008, 17.5895, 0.7329, 0.0211
SRAD: 29.9429, 17.3315, 0.7768, 0.0221
DPAD: 29.9584, 17.3470, 0.7779, 0.0222
SAR-BM3D: 29.8910, 17.2796, 0.7202, 0.0224
WLS: 25.9700, 13.3586, 0.6480, 0.0297
Guided: 28.0553, 15.4440, 0.6842, 0.0261
Bilateral: 28.1085, 15.4972, 0.6886, 0.0246
Fast Bilateral: 27.4814, 14.8700, 0.6633, 0.0264
Noise variance 0.08 (PSNR, SNR, SSIM, MAE):
Proposed Method: 29.7943, 17.1829, 0.7353, 0.0216
SRAD: 29.1088, 16.4974, 0.7530, 0.0241
DPAD: 29.2707, 16.6594, 0.7560, 0.0238
SAR-BM3D: 29.5661, 16.9547, 0.7187, 0.0232
WLS: 26.1780, 13.5666, 0.6565, 0.0291
Guided: 27.7338, 15.1224, 0.6769, 0.0269
Bilateral: 28.0322, 15.4208, 0.6861, 0.0249
Fast Bilateral: 27.4509, 14.8395, 0.6658, 0.0263
Noise variance 0.1 (PSNR, SNR, SSIM, MAE):
Proposed Method: 29.3930, 16.7816, 0.7325, 0.0222
SRAD: 28.4999, 15.8885, 0.7309, 0.0257
DPAD: 28.6795, 16.0682, 0.7347, 0.0253
SAR-BM3D: 29.2430, 16.6316, 0.7168, 0.0241
WLS: 26.3455, 13.7341, 0.6621, 0.0286
Guided: 26.7039, 14.0926, 0.6646, 0.0311
Bilateral: 27.9592, 15.3478, 0.6822, 0.0252
Fast Bilateral: 27.4259, 14.8145, 0.6665, 0.0264
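SNR and MAE, which appear only in this comparison, can be computed along the same lines as the earlier metrics. The sketch below assumes the common signal-power-to-error-power definition of SNR and the same den/ref variables as before; the exact definitions behind the published numbers may differ slightly.

```matlab
% Reference-based SNR (dB) and mean absolute error (sketch only).
err    = den - ref;
snrVal = 10 * log10(sum(ref(:).^2) / sum(err(:).^2));   % signal power over error power
mae    = mean(abs(err(:)));                             % mean absolute error
```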
Table 11. Computational complexity results (in seconds) of the despeckling methods for each standard image (512 × 512 and 256 × 256) (in the published table, the second-best value is shown in red).
Image: NLM, Guided, Frost, Lee, Bitonic, WLS, NLLR, ADMSS, SRAD, SRAD-Guided, SAR-BM3D, Choi et al., Proposed
Airplane: 0.48, 0.16, 1.86, 6.41, 0.09, 3.51, 1052.12, 196.87, 5.51, 5.92, 61.50, 5.70, 2.31
Baboon: 0.48, 0.11, 2.00, 7.29, 0.10, 0.48, 1030.23, 173.14, 2.45, 2.61, 59.84, 2.76, 2.24
Barbara: 0.50, 0.12, 2.05, 7.28, 0.08, 1.00, 1003.88, 162.76, 3.44, 3.74, 59.36, 3.74, 2.31
Boat: 0.48, 0.11, 2.01, 7.28, 0.09, 0.98, 1007.25, 174.22, 5.06, 5.48, 61.07, 5.36, 2.32
Camera-man: 0.12, 0.08, 0.52, 1.88, 0.03, 0.46, 211.28, 21.64, 1.55, 1.21, 14.45, 1.91, 0.40
Fruits: 0.48, 0.11, 2.03, 7.31, 0.09, 0.97, 1012.13, 181.41, 7.40, 7.85, 62.17, 7.84, 2.31
Hill: 0.48, 0.11, 1.98, 7.25, 0.09, 0.99, 1061.75, 162.39, 4.98, 5.62, 61.38, 5.28, 2.19
House: 0.12, 0.09, 0.53, 1.92, 0.03, 0.49, 231.46, 28.01, 1.54, 1.05, 14.34, 1.83, 0.42
Lena: 0.48, 0.16, 1.86, 6.47, 0.10, 1.00, 1081.19, 170.44, 7.53, 8.03, 60.16, 7.71, 2.25
Man: 0.48, 0.11, 1.99, 7.32, 0.09, 1.09, 1057.03, 165.61, 5.23, 5.70, 60.15, 5.40, 2.24
Monarch: 0.73, 0.13, 2.85, 9.77, 0.12, 1.51, 1661.38, 277.26, 8.48, 5.96, 87.94, 8.93, 3.24
Napoli: 0.50, 0.12, 1.90, 6.64, 0.08, 1.07, 1060.22, 168.11, 4.02, 4.10, 59.55, 4.31, 2.28
Peppers: 0.12, 0.09, 0.50, 1.71, 0.03, 0.50, 218.14, 26.96, 1.04, 1.16, 14.49, 1.32, 0.40
Zelda: 0.48, 0.12, 1.88, 6.88, 0.09, 0.99, 1001.87, 164.50, 7.10, 7.45, 59.57, 7.32, 2.35
Avg.: 0.42, 0.12, 1.71, 6.10, 0.08, 1.07, 906.42, 148.09, 4.67, 4.71, 52.57, 5.06, 1.92
Table 12. The runtime (in seconds) of each filter in the proposed method (in the published table, the second-best value is shown in red).
Image: EAWF, EGF (first time), EGF (second time), WLS, Total
Airplane: 0.0295, 0.1024, 0.0866, 2.0926, 2.31
Baboon: 0.0534, 0.2491, 0.0798, 1.8591, 2.24
Barbara: 0.0314, 0.1072, 0.0969, 2.0737, 2.31
Boat: 0.0299, 0.0998, 0.0913, 2.0994, 2.32
Camera-man: 0.0172, 0.1070, 0.0169, 0.2597, 0.40
Fruits: 0.0396, 0.2182, 0.0790, 1.9733, 2.31
Hill: 0.0436, 0.2432, 0.0913, 1.8118, 2.19
House: 0.0063, 0.0237, 0.0219, 0.3681, 0.42
Lena: 0.0375, 0.2024, 0.0811, 1.9295, 2.25
Man: 0.0401, 0.2136, 0.0779, 1.9077, 2.24
Monarch: 0.0426, 0.2385, 0.1060, 2.8538, 3.24
Napoli: 0.0300, 0.1678, 0.0740, 2.0083, 2.28
Peppers: 0.0059, 0.0241, 0.0219, 0.3505, 0.40
Zelda: 0.0398, 0.2167, 0.0813, 2.0123, 2.35
Avg.: 0.0319, 0.1581, 0.0718, 1.6857, 1.92
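The per-stage timings above amount to simple wall-clock measurements around each step of the chain. The sketch below illustrates this with tic/toc, reusing the placeholder stage calls (and the dos1, dos2, and wlsSmooth assumptions) from the earlier configuration sketch; it is not the released implementation.

```matlab
% Wall-clock timing of each stage (sketch; stage calls are the placeholders above).
tic; w  = wiener2(I, [3 3]);                                   tEAWF = toc;
tic; g1 = imguidedfilter(log(w + eps), log(w + eps), ...
          'NeighborhoodSize', 8, 'DegreeOfSmoothing', dos1);   tEGF1 = toc;
tic; g2 = imguidedfilter(g1, log(w + eps), ...
          'NeighborhoodSize', 7, 'DegreeOfSmoothing', dos2);   tEGF2 = toc;
tic; out = exp(wlsSmooth(g2, 0.1, 1));                         tWLS  = toc;
fprintf('EAWF %.4f  EGF1 %.4f  EGF2 %.4f  WLS %.4f  Total %.4f s\n', ...
        tEAWF, tEGF1, tEGF2, tWLS, tEAWF + tEGF1 + tEGF2 + tWLS);
```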
Table 13. Computational complexity results of the despeckling methods for the real SAR image 1 (512 × 512).
Columns (run time, in seconds): NLM, Guided, Frost, Lee, Bitonic, WLS, NLLR, ADMSS, SRAD, SRAD-Guided, SAR-BM3D, Choi et al., Proposed.
SAR image 120.470.191.736.230.120.711071.8191.197.097.4862.957.452.76
Table 14. Computational complexity results of the despeckling methods for the real SAR image 2 (1323 × 803).
Columns (run time, in seconds): Bilateral, Fast Bilateral, GF, WLS, DPAD, SRAD, SAR-BM3D, Fang et al., Proposed.
SAR image 225.184091.1735490.4650488.78559234.50541734.0956851051.7271198.980111.207373
Table 15. Computational complexity results of all the filters in the proposed method for the real SAR image 1 (512 × 512).
Filter: EAWF, EGF (first time), EGF (second time), WLS, Total
SAR image 1: 0.0341666, 0.10135565, 0.09835454, 2.52735569, 2.76123248
Table 16. Computational complexity results of all the filters in the proposed method for the real SAR image 2 (1323 × 803).
Filter: EAWF, EGF (first time), EGF (second time), WLS, Total
SAR image 2: 0.092552, 0.682677, 0.933745, 9.498399, 11.207373
Table 17. Computational complexity results of the despeckling methods for the real SAR image 3 (600 × 418).
Filters: run time (in seconds) at noise variance 0.04, 0.06, 0.08, and 0.1
Proposed Method: 3.51532, 2.87514, 2.86262, 3.98923
SRAD: 8.55225, 8.67667, 9.02507, 8.67005
DPAD: 9.39500, 8.93444, 8.75984, 8.21100
SAR-BM3D: 286.548, 283.948, 284.064, 285.219
WLS: 1.95118, 1.69643, 1.71835, 1.66355
Guided: 0.12702, 0.14519, 0.13848, 0.15345
Bilateral: 7.22676, 6.94197, 7.38375, 8.77612
Fast Bilateral: 0.42709, 0.39429, 0.44584, 0.79916
Table 18. Computational complexity results of all the filters in the proposed method for the real SAR image 3 (600 × 418).
Noise variance: EAWF, EGF (first time), EGF (second time), WLS, Total
0.04: 0.03511, 0.710129, 1.078701, 1.691, 3.515
0.06: 0.03385, 0.443889, 0.565306, 1.832, 2.875
0.08: 0.03495, 0.441253, 0.604423, 1.782, 2.863
0.1: 0.03799, 1.500044, 0.571564, 1.871, 3.981
Average: 0.03548, 0.773828, 0.704998, 1.794, 3.31
