Article

Synthetic Aperture Radar Image Change Detection Based on Principal Component Analysis and Two-Level Clustering

1 School of Information and Electronics, Beijing Institute of Technology, Beijing 100081, China
2 Department of Electronic Engineering, Tsinghua University, Beijing 100084, China
3 School of Computer Science and Technology, Xinjiang University, Urumqi 830046, China
* Author to whom correspondence should be addressed.
Remote Sens. 2024, 16(11), 1861; https://doi.org/10.3390/rs16111861
Submission received: 11 March 2024 / Revised: 8 May 2024 / Accepted: 17 May 2024 / Published: 23 May 2024
(This article belongs to the Section Remote Sensing Image Processing)

Abstract

Synthetic aperture radar (SAR) change detection provides a powerful tool for continuous, reliable, and objective observation of the Earth, supporting a wide range of applications that require regular monitoring and assessment of changes in the natural and built environment. In this paper, we introduce a novel SAR image change detection method based on principal component analysis and two-level clustering. First, two difference images of the log-ratio and mean-ratio operators are computed, then the principal component analysis fusion model is used to fuse the two difference images, and a new difference image is generated. To incorporate contextual information during the feature extraction phase, Gabor wavelets are used to obtain the representation of the difference image across multiple scales and orientations. The maximum magnitude across all orientations at each scale is then concatenated to form the Gabor feature vector. Following this, a cascading clustering algorithm is developed within this discriminative feature space by merging the first-level fuzzy c-means clustering with the second-level neighbor rule. Ultimately, the two-level combination of the changed and unchanged results produces the final change map. Five SAR datasets are used for the experiment, and the results show that our algorithm has significant advantages in SAR change detection.

1. Introduction

Remote sensing change detection is a method that compares various images or datasets collected from the same geographic area at different times to identify and analyze changes [1,2]. This approach is widely applied across numerous sectors, including environmental monitoring, urban planning, agriculture, and disaster management [3,4]. It plays a crucial role in understanding and responding to changes in our environment and infrastructure, providing valuable insights for decision-making and policy development.
Remote sensing change detection is divided into two main types: heterogeneous and homogeneous [5]. Heterogeneous change detection focuses on comparing different types of images, such as optical remote sensing images and synthetic aperture radar (SAR) images. This area has seen significant contributions: Du et al. [6] introduced a concatenated deep-learning framework for multitask change detection, combining optical and SAR images. Liu et al. [7] presented a novel approach using domain-adaptive cross reconstruction for change detection in heterogeneous remote sensing images, leveraging a feedback guidance mechanism. Xiao et al. [8] introduced a method called change alignment-based graph structure learning (CAGSL) for unsupervised heterogeneous change detection by assessing structural differences both forward and backward. Sun et al. [9,10] made two contributions in this field, first developing graphs based on similarity and dissimilarity relationships for multimodal change detection, and later enhancing a nonlocal patch-based graph with a convolutional wavelet neural network (CWNN) or principal component analysis network (PCANet), specifically for detecting changes in optical remote sensing and SAR images. Since deep-learning algorithms require training, deep-learning-based change detection algorithms are relatively time-consuming. These advancements reflect the dynamic evolution of methodologies for detecting changes across heterogeneous remote sensing data.
The field of homogeneous change detection has seen significant advancements with the application of deep-learning techniques. These innovations have led to more sophisticated and efficient methods for analyzing changes in both optical remote sensing images and hyperspectral images, as well as SAR images [11,12,13,14,15,16,17,18,19,20,21,22,23]. For instance, Wu et al. [14] proposed a spatial–temporal association-enhanced mobile-friendly vision transformer specifically designed for the change detection of high-resolution images. This method stands out for its lightweight design and high efficiency. Similarly, Song et al. [15] introduced a network that leverages context spatial awareness for remote sensing image change detection, which relies on the graph and convolution interaction. Wang et al. [16] contributed by introducing a knowledge distillation-based lightweight change detection method for high-resolution remote sensing imagery, aimed at facilitating on-board processing. Qu et al. [17] developed a cycle-refined multidecision joint alignment network, tailored for unsupervised domain adaptive hyperspectral change detection, highlighting the continuous refinement in the field. Zhang et al. [18] presented a novel approach combining a convolution and attention mixer specifically for SAR image change detection, showcasing the ongoing evolution of methodologies. Su et al. [19] developed an unsupervised method utilizing a variational graph auto-encoder network for object-based small area change detection in SAR images, which offers benefits in reducing the adverse effects of class imbalance and suppressing speckle noise. Zhang et al. [20] introduced a robust unsupervised method for small area change detection from SAR imagery, employing a convolutional wavelet neural network, further emphasizing the field’s progression towards addressing specific challenges in SAR image analysis.
Despite the impressive strides made by deep learning in the realm of SAR change detection, traditional algorithms continue to play a vital role, showcasing their enduring strengths. These traditional methods primarily involve steps such as image registration, image preprocessing (for instance, speckle noise reduction), computation of difference images (DIs), and the application of thresholds or classification techniques to produce change detection maps [21]. Notably, operators like the log-ratio (LR), mean-ratio (MR), and neighbor-ratio (NR) are extensively utilized to generate difference images [22], while methods such as the Otsu algorithm, fuzzy C-means clustering (FCM), and fuzzy local information C-means clustering are employed to derive the change maps [23]. Zhuang et al. [24] introduced a novel approach called the ratio-based nonlocal information for change detection in multitemporal SAR images. This method leverages spatial–temporal nonlocal neighborhoods, where the similarity between two pixels in the nonlocal neighborhood is effectively characterized by a ratio-based Gaussian kernel function. This technique is particularly good at retaining edge information in the change region while minimizing the overall error in the change detection results. Furthermore, Zhuang et al. [25] proposed an adaptive generalized likelihood ratio test for SAR image change detection, which excels in mitigating noise interference, preserving edge details, and enhancing the accuracy of change detection outcomes. Additionally, Gao et al. [26] developed a superpixel-based multiobjective change detection method that utilizes a self-adaptive neighborhood-based binary differential evolution algorithm, marking further advancement in traditional change detection techniques.
In this study, we introduce a SAR change detection technique that leverages principal component analysis (PCA) and two-level clustering. Our motivation is to compute a difference image that adequately reflects the changing areas, and then use classification algorithms to obtain precise detection results. This method begins by utilizing the log-ratio and mean-ratio operations to create two distinct difference images. The background in the log-ratio image appears relatively uniform, while the change-area information in the mean-ratio image aligns closely with the actual trends observed in the remote sensing imagery. Consequently, both the log-ratio and mean-ratio images can be merged into a single, new difference image that encapsulates complementary information. Following this, a PCA fusion model merges two difference images into a single one. Subsequently, the Gabor wavelets and FCM are applied to generate the final change map.

2. The Proposed Method

In this section, we outline the proposed method for SAR image change detection, which is organized into three main steps: difference image analysis, Gabor feature extraction, and two-level clustering. This method’s structure is visualized in Figure 1, showcasing the systematic approach of our change detection methodology. Further details are provided below.

2.1. Difference Image Analysis

For two co-registered SAR images $I_1$ and $I_2$ captured at different times $t_1$ and $t_2$ over the same geographic area, the task of change detection aims to precisely identify the locations of changes between the pair of acquired images.
The difference images (DIs) of the LR and MR operators are defined as follows [27]:

$$\mathrm{DI}_{\mathrm{LR}} = \left| \log \frac{I_1}{I_2} \right|$$

$$\mathrm{DI}_{\mathrm{MR}} = \begin{cases} 1 - \min\left( \dfrac{\alpha_2}{\alpha_1}, \dfrac{\alpha_1}{\alpha_2} \right), & \text{if } \alpha_2 > \alpha_1 \\ \min\left( \dfrac{\alpha_2}{\alpha_1}, \dfrac{\alpha_1}{\alpha_2} \right) - 1, & \text{if } \alpha_2 \le \alpha_1 \end{cases}$$

where $\alpha_1$ and $\alpha_2$ represent the local mean values of images $I_1$ and $I_2$, respectively.
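To make the operators concrete, the following MATLAB sketch computes both difference images; it assumes I1 and I2 are co-registered intensity images stored as double matrices, and the 3 × 3 local-mean window is an illustrative choice rather than a value prescribed above.

% Log-ratio and mean-ratio difference images (illustrative sketch).
ep = 1e-10;                               % guard against log(0) and division by zero
DI_LR = abs(log((I1 + ep) ./ (I2 + ep))); % log-ratio operator
w  = ones(3)/9;                           % 3x3 local-mean window (assumed size)
a1 = conv2(I1, w, 'same');                % local mean of I1
a2 = conv2(I2, w, 'same');                % local mean of I2
r  = min(a2./(a1 + ep), a1./(a2 + ep));   % min(alpha2/alpha1, alpha1/alpha2)
DI_MR = (1 - r).*(a2 > a1) + (r - 1).*(a2 <= a1);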
PCA calculates eigenvectors and eigenvalues on a global scale, concentrating the pixel energy into a smaller subset of the PCA dataset. Suppose $\mathbf{x}_i$ and $\mathbf{x}_j$ are the two images to be fused, represented as column vectors [28]:

$$\mathbf{x}_i = [x_1, x_2, \ldots, x_N]^T, \qquad \mathbf{x}_j = [x_1, x_2, \ldots, x_N]^T$$

where $N$ indicates the number of pixels.
The covariance matrix between the two source images is given by

$$\mathrm{Cov}(\mathbf{x}_i, \mathbf{x}_j) = E\left[ (\mathbf{x}_i - \beta_i)(\mathbf{x}_j - \beta_j) \right]$$

where the mean of all pixels in each image is defined by

$$\beta_i = \frac{1}{N} \sum \mathbf{x}_i, \qquad \beta_j = \frac{1}{N} \sum \mathbf{x}_j$$
A diagonal matrix D of eigenvalues and a full matrix V, whose columns are the corresponding eigenvectors, are computed; the eigenvalues and eigenvectors are sorted in descending order, and the leading 2 × 2 blocks of V and D are selected for fusion. The normalized weights $m_1$ and $m_2$, each less than one, are derived from V as follows: the eigenvector (V(:,1) or V(:,2)) associated with the larger eigenvalue is selected, and each of its elements is divided by the sum of all elements in that vector. This normalization ensures the weights sum to one, so they can be used directly as fusion weights.

If $D(1,1) > D(2,2)$, then

$$m_1 = \frac{V(1,1)}{V(1,1) + V(2,1)}, \qquad m_2 = \frac{V(2,1)}{V(1,1) + V(2,1)}$$

otherwise

$$m_1 = \frac{V(1,2)}{V(1,2) + V(2,2)}, \qquad m_2 = \frac{V(2,2)}{V(1,2) + V(2,2)}$$
Here, $m_1$ and $m_2$ serve as the weights of the input images in the fusion rule, which is defined by

$$y = m_1 \mathbf{x}_i + m_2 \mathbf{x}_j$$

The fusion weights $m_1$ and $m_2$ determine how much information is drawn from each source image. In this section, we utilize the PCA fusion model to merge the difference images $\mathrm{DI}_{\mathrm{LR}}$ and $\mathrm{DI}_{\mathrm{MR}}$, producing a new difference image $\mathrm{DI}_{\mathrm{PCA}}$. The MATLAB code for the PCA-based fusion is provided in Appendix A.

2.2. Gabor Feature Extraction

The Gabor wavelets have been extensively applied to image analysis due to their biological relevance and computational properties. A 2-D Gabor wavelet kernel is formulated as the product of an elliptical Gaussian envelope and a complex plane wave, defined by the following equation [29]:
$$\psi_{\mu,\nu}(z) = \frac{\| k_{\mu,\nu} \|^2}{\sigma^2} \exp\left( -\frac{\| k_{\mu,\nu} \|^2 \| z \|^2}{2\sigma^2} \right) \times \left[ \exp(i k_{\mu,\nu} z) - \exp\left( -\frac{\sigma^2}{2} \right) \right]$$

where $\mu$ and $\nu$ denote the orientation and scale of the Gabor kernels, respectively, and $\| \cdot \|$ is the norm operator. The wave vector is $k_{\mu,\nu} = k_\nu e^{i \varphi_\mu}$, with $\varphi_\mu = \pi \mu / 8$ and $k_\nu = k_{\max} / f^\nu$, where $k_{\max}$ signifies the maximum frequency and $f$ indicates the spacing factor between kernels in the frequency domain.
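For concreteness, a minimal MATLAB sketch that builds a single Gabor kernel from this definition follows; the 39 × 39 support is our illustrative choice, and the parameter values match those listed in Section 3.2.

% Construct one 2-D Gabor kernel psi_{mu,nu} (illustrative sketch).
mu = 0; nu = 0;                           % orientation and scale indices
sigma = 2.8*pi; kmax = 2*pi; f = 2;       % parameters as set in Section 3.2
phi  = pi*mu/8;                           % orientation angle phi_mu
knu  = kmax/f^nu;                         % scale-dependent frequency k_nu
kv   = knu*[cos(phi), sin(phi)];          % wave vector k_{mu,nu}
half = 19;                                % 39x39 kernel support (assumed)
[X, Y] = meshgrid(-half:half, -half:half);
z2  = X.^2 + Y.^2;                        % squared norm of the pixel offset z
psi = (knu^2/sigma^2) .* exp(-knu^2.*z2/(2*sigma^2)) ...
      .* (exp(1i*(kv(1)*X + kv(2)*Y)) - exp(-sigma^2/2));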
The Gabor wavelet representation of the difference image $Y_D$ is achieved by convolving it with the family of Gabor kernels $\{ \psi_{\mu,\nu}(z) : \mu \in \{0, \ldots, U-1\}, \nu \in \{0, \ldots, V-1\} \}$, defined as

$$O_{\mu,\nu}(z) = Y_D(z) * \psi_{\mu,\nu}(z)$$

Here, $z = (i, j)$ represents the pixel location, and $*$ denotes the convolution operator. $O_{\mu,\nu}(z)$ is the convolution result corresponding to the Gabor kernel at orientation $\mu$ and scale $\nu$, and $U$ and $V$ refer to the total numbers of orientations and scales, respectively. The response $O_{\mu,\nu}(z)$ is a complex-valued quantity whose real and imaginary parts are $\mathrm{Re}(O_{\mu,\nu}(z))$ and $\mathrm{Im}(O_{\mu,\nu}(z))$, respectively. Consequently, $O_{\mu,\nu}(z)$ can be expressed as
$$O_{\mu,\nu}(z) = A_{\mu,\nu}(z) \exp(i \theta_{\mu,\nu}(z))$$

with

$$A_{\mu,\nu}(z) = \sqrt{ \mathrm{Re}(O_{\mu,\nu}(z))^2 + \mathrm{Im}(O_{\mu,\nu}(z))^2 }$$

$$\theta_{\mu,\nu}(z) = \arctan \frac{ \mathrm{Im}(O_{\mu,\nu}(z)) }{ \mathrm{Re}(O_{\mu,\nu}(z)) }$$
The real part of a Gabor wavelet kernel acts as a smoothing filter, while its imaginary part is suited to edge detection. The magnitude $A_{\mu,\nu}(z)$, integrating the complementary information from both $\mathrm{Re}(O_{\mu,\nu}(z))$ and $\mathrm{Im}(O_{\mu,\nu}(z))$, is often chosen as the stable and discriminative feature value. However, directly concatenating the magnitude responses from all scales and orientations to form a feature vector results in a high-dimensional feature space. To address this issue, we exploit the orientation sensitivity of Gabor wavelets and extract, at each scale, the response with the maximum magnitude across all possible orientations, i.e.,

$$q_\nu(z) = \max_{\mu \in [0, U-1]} A_{\mu,\nu}(z)$$

For each pixel, considering all scales, the compact Gabor feature vector is formulated as $Q = [q_0, q_1, \ldots, q_\nu, \ldots, q_{V-1}]$. Thus, Gabor features $\chi = [Q_1, \ldots, Q_{MN}]^T$ are efficiently extracted from the difference image.
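A sketch of the full extraction step is given below; it assumes YD holds the fused difference image, a helper gabor_kernel(mu, nu) like the construction above, and the U = 8, V = 5 setting of Section 3.2.

% Max-magnitude Gabor features per scale (illustrative sketch).
[M, N] = size(YD);  U = 8;  V = 5;
chi = zeros(M*N, V);                      % one q_nu per scale for every pixel
for nu = 0:V-1
    qnu = zeros(M, N);
    for mu = 0:U-1
        O = conv2(YD, gabor_kernel(mu, nu), 'same');  % O_{mu,nu}(z)
        qnu = max(qnu, abs(O));           % max magnitude over orientations
    end
    chi(:, nu+1) = qnu(:);
end
% Each row of chi is the compact feature vector Q = [q_0, ..., q_{V-1}].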

2.3. Two-Level Clustering

Once the Gabor features are extracted, any clustering technique can be employed to separate the difference image into two distinct groups. However, to effectively distinguish the changed from the unchanged classes, it is crucial to choose an efficient clustering algorithm that ensures greater within-class compactness and between-class separation. To address this challenge, a two-level clustering approach is introduced in this subsection.
Its tree topology is depicted in Figure 2. The procedure starts from the root of the tree, which represents a single class containing all the samples. In the first level, because of its simplicity and applicability, FCM is used to divide the root node into three child nodes representing the changed, unchanged, and intermediate classes (i.e., $\omega_c^1$, $\omega_u^1$, and $\omega_i$), respectively. Pixels classified into $\omega_c^1$ and $\omega_u^1$ exhibit a high likelihood of being changed and unchanged, respectively. In the second level, the internal node $\omega_i$ is further divided into two leaves, $\omega_c^2$ and $\omega_u^2$, using the nearest neighbor rule, by comparing the distances of the corresponding Gabor feature vectors to the centroids of $\omega_c^1$ and $\omega_u^1$. Ultimately, we integrate the two-level subclusters to construct the change map.
The proposed cascade clustering approach proceeds as follows:
Step 1. Input: the Gabor features $\chi$ corresponding to the difference image $\mathrm{DI}_{\mathrm{PCA}}$, which contains $M \times N$ pixels.
Step 2. Level 1: perform the FCM model on $\chi$ to partition $\mathrm{DI}_{\mathrm{PCA}}$ into $c$ clusters by minimizing the objective function

$$J_m(u, v) = \sum_{i=1}^{c} \sum_{j=1}^{MN} u_{ij}^m \| x_j - v_i \|^2$$

$$\text{s.t.} \quad u_{ij} \in [0, 1], \quad \sum_{i=1}^{c} u_{ij} = 1 \ \forall j, \quad 0 < \sum_{j=1}^{MN} u_{ij} < MN \ \forall i$$

where $m \in (1, +\infty)$ represents the degree of fuzziness, $v_i$ denotes the prototype of the $i$th cluster, $u = [u_{ij}]_{c \times MN}$ is a partition matrix with $u_{ij}$ being the membership grade of the $j$th pixel in cluster $i$, and $v = [v_1, v_2, v_3]$ is the vector of cluster centroids. $J_m$ can be iteratively optimized by alternately updating $u_{ij}$ and $v_i$ until convergence, i.e.,
(1) Set parameters $c = 3$, $m = 2$, $t = 0$, and initialize the partition matrix $u^{(0)}$.
(2) Calculate the centroid of the $i$th cluster by using

$$v_i^{(t+1)} = \frac{ \sum_{j=1}^{MN} (u_{ij}^{(t)})^m x_j }{ \sum_{j=1}^{MN} (u_{ij}^{(t)})^m }$$
(3) Update the membership grade $u_{ij}$ by using

$$u_{ij}^{(t+1)} = \frac{ \| x_j - v_i^{(t+1)} \|^{-2/(m-1)} }{ \sum_{r=1}^{c} \| x_j - v_r^{(t+1)} \|^{-2/(m-1)} }$$
(4) Set $t := t + 1$, return to step (2), and continue until convergence.
(5) Assign the pixels to a class of $\{ \omega_c^1, \omega_i, \omega_u^1 \}$ according to

$$\mathrm{label}(Y_D^l \in \Omega_p) = \begin{cases} \omega_c^1, & p = \arg\max_{i=1,2,3} M_{\Omega_i} \\ \omega_u^1, & p = \arg\min_{i=1,2,3} M_{\Omega_i} \\ \omega_i, & \text{otherwise} \end{cases}$$

where $\Omega_i$, $i = 1, 2, 3$, denote the three distinct clusters identified by assigning each pixel to the cluster with the highest membership grade in $u$, and $M_{\Omega_i}$ is the mean value of $Y_D$ in cluster $\Omega_i$:

$$M_{\Omega_i} = \frac{1}{|\Omega_i|} \sum_{l \in \Omega_i} Y_D^l$$
Step 3. Level 2: recalculate the centroids of $\omega_c^1$ and $\omega_u^1$ by

$$v_{\omega_c^1} = \frac{ \sum_{j \in \Omega_{\omega_c^1}} (u_{ij})^m x_j }{ \sum_{j \in \Omega_{\omega_c^1}} (u_{ij})^m }, \quad i = \omega_c^1$$

$$v_{\omega_u^1} = \frac{ \sum_{j \in \Omega_{\omega_u^1}} (u_{ij})^m x_j }{ \sum_{j \in \Omega_{\omega_u^1}} (u_{ij})^m }, \quad i = \omega_u^1$$
Following this, the nearest neighbor rule is employed to further split the intermediate class $\omega_i$ into two classes, based on

$$\mathrm{label}(Y_D^l \in \Omega_{\omega_i}) = \begin{cases} \omega_c^2, & \| x_l - v_{\omega_c^1} \|^2 \le \| x_l - v_{\omega_u^1} \|^2 \\ \omega_u^2, & \| x_l - v_{\omega_c^1} \|^2 > \| x_l - v_{\omega_u^1} \|^2 \end{cases}$$
Step 4. Output: merging the two-level subclusters produces the final change map, effectively illustrating the detected changes, via

$$\Omega_{\omega_c} = \Omega_{\omega_c^1} \cup \Omega_{\omega_c^2}, \qquad \Omega_{\omega_u} = \Omega_{\omega_u^1} \cup \Omega_{\omega_u^2}$$
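Under the settings above (c = 3, m = 2), the two levels can be condensed into the following MATLAB sketch; chi is the feature matrix from Section 2.2, YD the fused difference image, pdist2 comes from the Statistics and Machine Learning Toolbox, and the random initialization and variable names are ours.

% Two-level clustering on Gabor features (illustrative sketch).
c = 3; m = 2; T = 100; tol = 1e-5;
n = size(chi, 1);
u = rand(c, n); u = u ./ sum(u, 1);       % random initial partition matrix
for t = 1:T                               % level 1: FCM iterations
    um = u.^m;
    v  = (um*chi) ./ sum(um, 2);          % c x V cluster centroids
    d  = max(pdist2(v, chi), 1e-10);      % c x n feature-space distances
    un = d.^(-2/(m-1)) ./ sum(d.^(-2/(m-1)), 1);
    if max(abs(un(:) - u(:))) < tol, u = un; break; end
    u = un;
end
[~, lab] = max(u, [], 1);                 % hard assignment per pixel
mk = accumarray(lab(:), YD(:), [c 1], @mean);  % mean of YD per cluster
[~, ci] = max(mk);  [~, ui] = min(mk);    % changed / unchanged clusters
ii = setdiff(1:c, [ci ui]);               % intermediate cluster
wc = u(ci, lab == ci).^m;  vc = (wc*chi(lab == ci, :))/sum(wc); % recalculated centroids
wu = u(ui, lab == ui).^m;  vu = (wu*chi(lab == ui, :))/sum(wu);
cm  = (lab(:) == ci);                     % pixels changed after level 1
mid = find(lab(:) == ii);                 % level 2: split the intermediate class
dc  = sum((chi(mid,:) - vc).^2, 2);
du  = sum((chi(mid,:) - vu).^2, 2);
cm(mid(dc <= du)) = true;                 % nearest neighbor rule
changeMap = reshape(cm, size(YD));        % final binary change map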

3. Experimental Results and Discussions

In this section, we begin by outlining the datasets and evaluation criteria utilized in our experiments, followed by a presentation of our method’s performance.

3.1. Experimental Datasets and Evaluation Criteria

We evaluated the proposed change detection method using five synthetic aperture radar (SAR) datasets. Table 1 summarizes the details of these datasets, including the event, satellites they originate from, sensor type (ST), and image size. Figure 3, Figure 4, Figure 5, Figure 6, Figure 7, Figure 8 and Figure 9 display the images from each dataset, which consist of pairs of co-registered SAR images captured before and after changes occurred, along with a reference image annotated manually based on expert knowledge.
The effectiveness of change detection methods is assessed through both qualitative and quantitative measures. For quantitative analysis, we employ metrics such as false negatives (FNs) [30], false positives (FPs) [31,32,33,34], overall error (OE) [35], percentage of correct classifications (PCCs) [36], Kappa coefficient (KC) [37,38,39], and F1-score [40,41]. The formulas for these metrics are provided in Table 2.
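As a concrete illustration, these counts and scores can be computed in MATLAB from a binary change map and its reference (both assumed to be logical matrices; the variable names are ours):

% Evaluation metrics from a change map and its reference (sketch).
TP = nnz(map & ref);    TN = nnz(~map & ~ref);
FP = nnz(map & ~ref);   FN = nnz(~map & ref);
OE  = FN + FP;                                     % overall error
PCC = (TP + TN) / (TP + TN + FP + FN);             % correct classifications
PRE = ((TP+FP)*(TP+FN) + (FN+TN)*(FP+TN)) / (TP+TN+FP+FN)^2;
Kappa = (PCC - PRE) / (1 - PRE);                   % Kappa coefficient
prec = TP/(TP + FP);  rec = TP/(TP + FN);
F1 = 2*prec*rec / (prec + rec);                    % F1-score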

3.2. Experimental Results and Analysis

To assess the performance of the proposed method, it was compared against seven established methods: principal component analysis and K-means (PCAKM) [42], Bayesian [43], Gabor and two-level clustering (GaborTLC) [29], logarithmic mean-based thresholding (LMT) [44], neighborhood-based ratio and collaborative representation (NRCR) [45], the adaptive generalized likelihood ratio test and maximum entropy principle for change detection in SAR images (AGLRTM) [25], and the improved nonlocal patch-based graph and principal component analysis network (INPCANet) [10]. The parameter settings are as follows: $h = 3$ and $S = 9$ in the PCAKM method; in the Bayesian method, low-pass filtering is carried out by averaging the complex values in a $5 \times 5$ sliding window; a $3 \times 3$ median filter is used in the LMT method; in the INPCANet method, $p = 2$ and the neighborhood size is $5 \times 5$; in the NRCR and AGLRTM methods, the parameters are set to the default values given by the authors of the respective papers; and in the proposed and GaborTLC methods, $U = 8$, $V = 5$, $k_{\max} = 2\pi$, $f = 2$, and $\sigma = 2.8\pi$.
In this subsection, we examine the difference images produced by various methods and discuss the change detection outcomes achieved using the two-level clustering model. Figure 10 illustrates the difference images derived from the log-ratio operator (LR), mean-ratio operator (MR), and PCA fusion technique on the Village of Feltwell dataset, respectively.
To evaluate the efficacy of the difference images (DIs) calculated by the LR, MR, and PCA-fusion approaches, empirical receiver operating characteristic (ROC) curves are employed, as depicted in Figure 11. These curves plot the true positive rate (TPR) against the false positive rate (FPR). Additionally, two quantitative criteria based on the ROC curve, the area under the curve (AUC) and the diagonal distance (Ddist) [46], are provided in Table 3. For both criteria, larger values indicate superior detection performance. According to the data in Table 3, the PCA fusion model outperforms both the LR and MR operators, showcasing its effectiveness in change detection scenarios.
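The empirical ROC curve can be reproduced by sweeping a threshold over the normalized difference image, as in the sketch below; the 256-level sweep and the trapezoidal AUC are our choices, and Ddist is omitted since its exact definition follows [46].

% Empirical ROC curve and AUC for a difference image (sketch).
% DI: difference image scaled to [0,1]; ref: logical reference map.
thr = linspace(0, 1, 256);
TPR = zeros(size(thr));  FPR = zeros(size(thr));
for k = 1:numel(thr)
    bw = DI >= thr(k);                    % binary map at this threshold
    TPR(k) = nnz(bw & ref)  / nnz(ref);
    FPR(k) = nnz(bw & ~ref) / nnz(~ref);
end
AUC = -trapz(FPR, TPR);   % FPR decreases along the sweep, hence the sign flip
plot(FPR, TPR); xlabel('FPR'); ylabel('TPR');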
Figure 12 presents the change detection outcomes for the Village of Feltwell dataset using the difference images processed with the two-level clustering model, with the associated performance metrics detailed in Table 4. From these visuals and data, it is observed that the log-ratio with two-level clustering (LRTLC) method exhibits the fewest false positives, with an FP value of 77. Meanwhile, the mean-ratio with two-level clustering (MRTLC) approach shows the least number of missed detections, evidenced by the FN value of 124. The PCA with two-level clustering (PCATLC) method ranks second in terms of both FN and FP, but it achieves the best performance in overall error (OE), percentage of correct classifications (PCC), Kappa coefficient (KC), and F1-score, with values of 360, 99.77%, 95.54%, and 95.66%, respectively. These results demonstrate the superior change detection capability of the fused difference image approach.
(1)
Results for the Village of Feltwell dataset:
The experimental outcomes for the Village of Feltwell dataset are displayed in Figure 13, with the corresponding performance metrics provided in Table 5. This figure reveals that the PCAKM, Bayesian, and LMT methods report high false positives, with values of 340, 2731, and 1863, respectively. Conversely, the NRCR, AGLRTM, and INPCANet approaches are characterized by high false negatives, with values of 1941, 2715, and 1337, respectively. The GaborTLC method also achieves relatively good change detection results, with comparatively balanced FN and FP values. Notably, the proposed method outperforms the others, showcasing the best overall performance, with an OE of 360, a PCC of 99.77%, a KC of 95.54%, and an F1-score of 95.66%; its FN (269) and FP (91) both rank among the three lowest. These findings underscore the effectiveness of the proposed method in achieving accurate change detection.
(2)
Results for the Ottawa dataset:
Figure 14 displays the experimental outcomes for the Ottawa dataset, with the related performance metrics detailed in Table 6. From this figure, it is evident that the PCAKM and Bayesian methods result in a high number of false positives, with values of 583 and 3924, respectively. Meanwhile, the LMT, NRCR, and INPCANet approaches record high false negatives, with values of 5266, 4971, and 8709, respectively. The results of the GaborTLC and AGLRTM methods are similar. It is clear from the analysis that the overall performance of the proposed method surpasses that of the comparative methods, achieving the best results with an overall error (OE) of 2316, a percentage of correct classifications (PCC) of 97.72%, a Kappa coefficient (KC) of 90.92%, and an F1-score of 92.25%. Additionally, its FN (2258) ranks among the three lowest and its FP (58) among the four lowest. These results highlight the superior accuracy and reliability of the proposed method in detecting changes within the Ottawa dataset.
(3)
Results for the San Francisco dataset:
Figure 15 illustrates the experimental results for the San Francisco dataset, with the performance metrics detailed in Table 7. The figure highlights that the PCAKM, Bayesian, GaborTLC, and AGLRTM methods exhibit high rates of false detections, with false positive (FP) values of 1023, 923, 803, and 1204, respectively. On the other hand, the LMT, NRCR, and INPCANet methods demonstrate significant rates of missed detections, with false negative (FN) values of 1046, 598, and 1120, respectively. It is apparent from the analysis that the overall performance of the proposed method outshines the comparative methods, achieving the best overall error (OE) of 840, percentage of correct classifications (PCC) of 98.72%, Kappa coefficient (KC) of 90.16%, and F1-score of 90.85%. Moreover, it maintains moderate values for both FN (513) and FP (327), further evidencing its superior ability to accurately detect changes within the San Francisco dataset.
(4)
Results for the Yellow River dataset:
Figure 16 presents the experimental outcomes for the Yellow River dataset, with performance metrics detailed in Table 8. This visualization reveals that the PCAKM, GaborTLC, NRCR, and INPCANet methods exhibit high rates of false detections, with false positive (FP) values of 4224, 1656, 2501, and 955, respectively. Conversely, the Bayesian, LMT, and AGLRTM methods show significant missed detection rates, with false negative (FN) values of 4422, 13404, and 6574, respectively. The overall performance of the proposed method stands out as the best among those evaluated, achieving the lowest overall error (OE) of 3635, the highest percentage of correct classifications (PCC) of 95.11%, the highest Kappa coefficient (KC) of 82.20%, and the highest F1-score of 85.09%. Additionally, it maintains moderate levels for both FN (3061) and FP (574), highlighting its effective balance in accurately detecting changes within the Yellow River dataset.
(5)
Results for the Sulzberger Ice Shelf dataset:
Figure 17 displays the experimental results for the Sulzberger Ice Shelf dataset, with performance metrics outlined in Table 9. This visual representation shows that the PCAKM, Bayesian, and INPCANet methods have considerable false detection issues, with false positive (FP) values of 733, 7734, and 1621, respectively. On the other hand, the LMT, NRCR, AGLRTM, and INPCANet methods exhibit notable missed detection issues, with false negative (FN) values of 5322, 466, 1173, and 2330, respectively. The GaborTLC method excels, achieving the lowest overall error (OE) of 512, the highest percentage of correct classifications (PCC) of 99.22%, the highest Kappa coefficient (KC) of 97.48%, and the highest F1-score of 97.96%. Our proposed algorithm is ranked second in overall change detection performance, with metrics of OE (747), PCC (98.86%), KC (96.34%), and F1-score (97.05%), which underscores the robustness of our approach in detecting changes.

3.3. Discussion

Based on the aforementioned results, the performance of the proposed change detection method can be summarized as follows: the method was rigorously evaluated across five distinct SAR datasets (Village of Feltwell, Ottawa, San Francisco, Yellow River, and Sulzberger Ice Shelf), demonstrating its robustness and effectiveness in detecting changes. For the Yellow River dataset, although the changed regions are relatively regular, the presence of significant noise and multiple targets results in comparatively low detection accuracy. By employing a combination of Gabor feature extraction and the two-level clustering model, including the fuzzy C-means (FCM) algorithm and PCA fusion, the method outperformed several established approaches, such as PCAKM, Bayesian, GaborTLC, LMT, NRCR, AGLRTM, and INPCANet, in various metrics.
The proposed method consistently achieved high performance metrics across all datasets, with notable scores in the overall error (OE), percentage of correct classifications (PCC), Kappa coefficient (KC), and F1-score, indicating a superior ability to accurately detect both changes and non-changes in complex SAR imagery. False positives (FPs) and false negatives (FNs) maintained moderate levels, demonstrating the method’s balanced sensitivity and specificity in identifying changed and unchanged areas, respectively. The PCA fusion model, in particular, was highlighted for its effectiveness in enhancing change detection performance by leveraging the complementary strengths of the log-ratio and mean-ratio operators. In our subsequent work, we will attempt to use algorithms such as edge-preserving filtering [47], contourlet [48], shearlet [49,50,51], etc., for processing LR and MR fusion.
These results underscore the proposed method’s advanced capability in SAR image change detection, combining effective feature extraction with sophisticated clustering techniques to deliver accurate and reliable change maps. This method stands as a significant contribution to the field of remote sensing, especially for applications requiring precise monitoring and analysis of environmental changes, urban development, and disaster assessment.

4. Conclusions

In this study, we propose an innovative method for change detection in SAR imagery, leveraging PCA and a two-level clustering approach. Initially, we compute two distinct difference images utilizing the log-ratio and mean-ratio techniques. Following this, a PCA fusion model is employed to amalgamate these difference images, producing a new, composite difference image. To integrate contextual information within the feature extraction phase, we utilize Gabor wavelets. This approach enables the representation of the difference image across multiple scales and orientations. For each scale, we select the maximum magnitude across all orientations to compile a Gabor feature vector. Subsequently, we develop a cascade clustering algorithm within this discriminative feature space, seamlessly integrating a first-level FCM with a second-level neighbor rule for enhanced precision. This two-tiered approach effectively segregates the changed from the unchanged areas, culminating in the creation of the definitive change map. Our methodology is rigorously tested across five SAR datasets, with the outcomes affirming its efficacy and robustness in change detection applications.

Author Contributions

The experimental measurements and data collection were carried out by L.L., H.M., X.Z. (Xueyu Zhang), X.Z. (Xiaobin Zhao), M.L. and Z.J. The manuscript was written by L.L. with the assistance of H.M., X.Z. (Xueyu Zhang), X.Z. (Xiaobin Zhao), M.L. and Z.J. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the National Science Foundation of China under Grant No. 62261053; the Cross-Media Intelligent Technology Project of Beijing National Research Center for Information Science and Technology (BNRist) under Grant No. BNR2019TD01022.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflicts of interest.

Appendix A

function Y = fuse_pca(M1, M2)
%FUSE_PCA Image fusion with the PCA method.
%   M1 — input image #1
%   M2 — input image #2
%   Y  — fused image
% check that the inputs have the same size
[z1, s1] = size(M1);
[z2, s2] = size(M2);
if (z1 ~= z2) || (s1 ~= s2)
    error('Input images are not of the same size');
end
% compute the eigenvectors of the joint covariance matrix and
% normalize the one associated with the larger eigenvalue
[V, D] = eig(cov([M1(:) M2(:)]));
if D(1,1) > D(2,2)
    a = V(:,1) ./ sum(V(:,1));
else
    a = V(:,2) ./ sum(V(:,2));
end
% fuse with the normalized eigenvector elements as weights
Y = a(1)*M1 + a(2)*M2;
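A short usage sketch, assuming DI_LR and DI_MR are the same-size double matrices computed in Section 2.1:

% Example call: fuse the two difference images and inspect the result.
DI_PCA = fuse_pca(DI_LR, DI_MR);
imshow(DI_PCA, []);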

References

  1. Tang, Y.; Yang, X.; Han, T.; Zhang, F.; Zou, B.; Feng, H. Enhanced graph structure representation for unsupervised heterogeneous change detection. Remote Sens. 2024, 16, 721. [Google Scholar] [CrossRef]
  2. Zhuang, H.; Fan, H.; Deng, K.; Yao, G. A Spatial-temporal adaptive neighborhood-based ratio approach for change detection in SAR images. Remote Sens. 2018, 10, 1295. [Google Scholar] [CrossRef]
  3. Li, L.; Ma, H.; Jia, Z. Change detection from SAR images based on convolutional neural networks guided by saliency enhancement. Remote Sens. 2021, 13, 3697. [Google Scholar] [CrossRef]
  4. Zhuang, H.; Fan, H. Change detection in SAR images based on progressive nonlocal theory. IEEE Trans. Geosci. Remote Sens. 2022, 60, 5229213. [Google Scholar] [CrossRef]
  5. Lv, Z.; Liu, J.; Sun, W. Hierarchical attention feature fusion-based network for land cover change detection with homogeneous and heterogeneous remote sensing images. IEEE Trans. Geosci. Remote Sens. 2023, 61, 4411115. [Google Scholar] [CrossRef]
  6. Du, Z.; Li, X.; Miao, J. Concatenated deep-learning framework for multitask change detection of optical and SAR images. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2024, 17, 719–731. [Google Scholar] [CrossRef]
  7. Liu, Q.; Ren, K.; Meng, X. Domain adaptive cross reconstruction for change detection of heterogeneous remote sensing images via a feedback guidance mechanism. IEEE Trans. Geosci. Remote Sens. 2023, 61, 4507216. [Google Scholar] [CrossRef]
  8. Xiao, K.; Sun, Y.; Kuang, G.; Lei, L. Change alignment-based graph structure learning for unsupervised heterogeneous change detection. IEEE Geosci. Remote Sens. Lett. 2023, 20, 2504405. [Google Scholar] [CrossRef]
  9. Sun, Y.; Lei, L.; Li, Z.; Kuang, G. Similarity and dissimilarity relationships based graphs for multimodal change detection. ISPRS J. Photogramm. Remote Sens. 2024, 208, 70–88. [Google Scholar] [CrossRef]
  10. Sun, Y.; Lei, L.; Li, X.; Tan, X.; Kuang, G. Structure consistency-based graph for unsupervised change detection with homogeneous and heterogeneous remote sensing images. IEEE Trans. Geosci. Remote Sens. 2022, 60, 4700221. [Google Scholar] [CrossRef]
  11. Chen, M.; Jiang, W.; Zhou, Y. DTT-CGINet: A dual temporal transformer network with multi-scale contour-guided graph interaction for change detection. Remote Sens. 2024, 16, 844. [Google Scholar] [CrossRef]
  12. Seydi, S.T.; Boueshagh, M.; Namjoo, F.; Minouei, S.M.; Nikraftar, Z.; Amani, M. A hyperspectral change detection (HCD-Net) framework based on double stream convolutional neural networks and an attention module. Remote Sens. 2024, 16, 827. [Google Scholar] [CrossRef]
  13. Li, L.; Ma, H.; Jia, Z. Multiscale geometric analysis fusion-based unsupervised change detection in remote sensing images via FLICM model. Entropy 2022, 24, 291. [Google Scholar] [CrossRef] [PubMed]
  14. Wu, S.; Liu, Y.; Liu, S. Change detection enhanced by spatial-temporal association for bare soil land using remote sensing images. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2024, 17, 150–161. [Google Scholar] [CrossRef]
  15. Song, X.; Hua, Z.; Li, J. Context spatial awareness remote sensing image change detection network based on graph and convolution interaction. IEEE Trans. Geosci. Remote Sens. 2024, 62, 3000316. [Google Scholar] [CrossRef]
  16. Wang, G.; Zhang, N.; Wang, J. Knowledge distillation-based lightweight change detection in high-resolution remote sensing imagery for on-board processing. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2024, 17, 3860–3877. [Google Scholar] [CrossRef]
  17. Qu, J.; Dong, W.; Yang, Y. Cycle-refined multidecision joint alignment network for unsupervised domain adaptive hyperspectral change detection. IEEE Trans. Neural Netw. Learn. Syst. 2024. [Google Scholar] [CrossRef] [PubMed]
  18. Zhang, H.; Li, Z.; Gao, F. Convolution and attention mixer for synthetic aperture radar image change detection. IEEE Geosci. Remote Sens. Lett. 2023, 20, 4012105. [Google Scholar] [CrossRef]
  19. Su, H.; Zhang, X. Nonlocal feature learning based on a variational graph auto-encoder network for small area change detection using SAR imagery. ISPRS J. Photogramm. Remote Sens. 2022, 193, 137–149. [Google Scholar] [CrossRef]
  20. Zhang, X.; Su, H. Robust unsupervised small area change detection from SAR imagery using deep learning. ISPRS J. Photogramm. Remote Sens. 2021, 173, 79–94. [Google Scholar] [CrossRef]
  21. Li, L.; Ma, H.; Jia, Z. Gamma correction-based automatic unsupervised change detection in SAR images via FLICM model. J. Indian Soc. Remote Sens. 2023, 51, 1077–1088. [Google Scholar] [CrossRef]
  22. Kılıç, D.K.; Nielsen, P. Comparative analyses of unsupervised PCA K-means change detection algorithm from the viewpoint of follow-up plan. Sensors 2022, 22, 9172. [Google Scholar] [CrossRef]
  23. Kumar, J.; Yennapusa, M.; Rao, B. TRI-SU-L ADWT-FCM: TRI-SU-L-based change detection in SAR images with ADWT and fuzzy C-means clustering. J. Indian Soc. Remote Sens. 2022, 50, 1667–1687. [Google Scholar] [CrossRef]
  24. Zhuang, H.; Hao, M. Change detection in SAR images via ratio-based gaussian kernel and nonlocal theory. IEEE Trans. Geosci. Remote Sens. 2022, 60, 5210215. [Google Scholar] [CrossRef]
  25. Zhuang, H.; Tan, D. Adaptive generalized likelihood ratio test for change detection in SAR images. IEEE Geosci. Remote Sens. Lett. 2020, 17, 416–420. [Google Scholar] [CrossRef]
  26. Gao, T.; Li, H.; Gong, M. Superpixel-based multiobjective change detection based on self-adaptive neighborhood-based binary differential evolution. Expert Syst. Appl. 2023, 212, 118811. [Google Scholar] [CrossRef]
  27. Gong, M.; Gao, T.; Zhang, M. An M-nary SAR image change detection based on GAN architecture search. IEEE Trans. Geosci. Remote Sens. 2023, 61, 4503718. [Google Scholar] [CrossRef]
  28. Vijayarajan, R.; Muttan, S. Iterative block level principal component averaging medical image fusion. Optik 2014, 125, 4751–4757. [Google Scholar] [CrossRef]
  29. Li, H.; Celik, T. Gabor feature based unsupervised change detection of multitemporal SAR images based on two-level clustering. IEEE Geosci. Remote Sens. Lett. 2015, 12, 2458–2462. [Google Scholar]
  30. Wang, J.; Gao, F. Change detection from synthetic aperture radar images via graph-based knowledge supplement network. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2022, 15, 1823–1836. [Google Scholar] [CrossRef]
  31. Gao, Y.; Gao, F. Synthetic aperture radar image change detection via siamese adaptive fusion network. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2021, 14, 10748–10760. [Google Scholar] [CrossRef]
  32. Zhao, X.; Liu, K.; Gao, K.; Li, W. Hyperspectral time-series target detection based on spectral perception and spatial-temporal tensor decomposition. IEEE Trans. Geosci. Remote Sens. 2023, 61, 5520812. [Google Scholar] [CrossRef]
  33. Zhang, X.; Li, Q. FD-Net: Feature distillation network for oral squamous cell carcinoma lymph node segmentation in hyperspectral imagery. IEEE J. Biomed. Health Inform. 2024, 28, 1552–1563. [Google Scholar] [CrossRef] [PubMed]
  34. Zhang, X.; Li, W. Hyperspectral pathology image classification using dimension-driven multi-path attention residual network. Expert Syst. Appl. 2023, 230, 120615. [Google Scholar] [CrossRef]
  35. Zhang, J.; Liu, Y.; Wang, B.; Chen, C. A hierarchical fusion SAR image change-detection method based on HF-CRF model. Remote Sens. 2023, 15, 2741. [Google Scholar] [CrossRef]
  36. Qu, X.; Gao, F. Change detection in synthetic aperture radar images using a dual-domain network. IEEE Geosci. Remote Sens. Lett. 2022, 19, 4013405. [Google Scholar] [CrossRef]
  37. Wang, M.; Zhang, J. An adaptive and adjustable maximum-likelihood estimator for SAR change detection. IEEE Trans. Geosci. Remote Sens. 2022, 60, 5227513. [Google Scholar] [CrossRef]
  38. Gao, Y.; Zhang, M.; Wang, J. Cross-scale mixing attention for multisource remote sensing data fusion and classification. IEEE Trans. Geosci. Remote Sens. 2023, 61, 5507815. [Google Scholar] [CrossRef]
  39. Wang, J.; Zhao, T. A hierarchical heterogeneous graph for unsupervised SAR image change detection. IEEE Geosci. Remote Sens. Lett. 2022, 19, 4516605. [Google Scholar] [CrossRef]
  40. Wang, J.; Zeng, F.; Zhang, A.; You, T. A global patch similarity-based graph for unsupervised SAR image change detection. Remote Sens. Lett. 2024, 15, 353–362. [Google Scholar] [CrossRef]
  41. Murdaca, G.; Ricciuti, F.; Rucci, A.; Le Saux, B.; Fumagalli, A.; Prati, C. A semi-supervised deep learning framework for change detection in open-pit mines using SAR imagery. Remote Sens. 2023, 15, 5664. [Google Scholar] [CrossRef]
  42. Celik, T. Unsupervised change detection in satellite images using principal component analysis and k-means clustering. IEEE Geosci. Remote Sens. Lett. 2009, 6, 772–776. [Google Scholar] [CrossRef]
  43. Celik, T. A Bayesian approach to unsupervised multiscale change detection in synthetic aperture radar images. Signal Process. 2010, 90, 1471–1485. [Google Scholar] [CrossRef]
  44. Sumaiya, M.; Kumari, R. Logarithmic mean-based thresholding for SAR image change detection. IEEE Geosci. Remote Sens. Lett. 2016, 13, 1726–1728. [Google Scholar] [CrossRef]
  45. Gao, Y.; Gao, F.; Dong, J. Sea ice change detection in SAR images based on collaborative representation. In Proceedings of the IEEE International Symposium on Geoscience and Remote Sensing IGARSS, Valencia, Spain, 22–27 July 2018; pp. 7320–7323. [Google Scholar]
  46. Sun, Y.; Lei, L. Image regression with structure cycle consistency for heterogeneous change detection. IEEE Trans. Neural Netw. Learn. Syst. 2024, 35, 1613–1627. [Google Scholar] [CrossRef]
  47. Li, L.; Lv, M.; Jia, Z.; Jin, Q.; Liu, M.; Chen, L.; Ma, H. An effective infrared and visible image fusion approach via rolling guidance filtering and gradient saliency map. Remote Sens. 2023, 15, 2486. [Google Scholar] [CrossRef]
  48. Li, L.; Ma, H.; Jia, Z.; Si, Y. A novel multiscale transform decomposition based multi-focus image fusion framework. Multimed. Tools Appl. 2021, 80, 12389–12409. [Google Scholar] [CrossRef]
  49. Li, L.; Lv, M.; Jia, Z.; Ma, H. Sparse representation-based multi-focus image fusion method via local energy in shearlet domain. Sensors 2023, 23, 2888. [Google Scholar] [CrossRef] [PubMed]
  50. Li, L.; Si, Y.; Wang, L.; Jia, Z.; Ma, H. A novel approach for multi-focus image fusion based on SF-PAPCNN and ISML in NSST domain. Multimed. Tools Appl. 2020, 79, 24303–24328. [Google Scholar] [CrossRef]
  51. Li, L.; Ma, H. Saliency-guided nonsubsampled shearlet transform for multisource remote sensing image fusion. Sensors 2021, 21, 1756. [Google Scholar] [CrossRef]
Figure 1. The structure of the proposed change detection method.
Figure 2. Tree topology of the proposed two-level clustering for change detection.
Figure 3. Village of Feltwell dataset. (a) image t1; (b) image t2; (c) reference.
Figure 4. Ottawa dataset. (a) image acquired in May 1997; (b) image acquired in August 1997; (c) reference.
Figure 5. SAR images acquired by ERS-2 over the city of San Francisco.
Figure 6. San Francisco dataset. (a) image acquired in August 2003; (b) image acquired in May 2004; (c) reference.
Figure 7. SAR images acquired by Radarsat-2 over the Yellow River Estuary in China.
Figure 8. Yellow River dataset. (a) image acquired in 2008; (b) image acquired in 2009; (c) reference.
Figure 9. Sulzberger Ice Shelf dataset. (a) image acquired on 11 March 2011; (b) image acquired on 16 March 2011; (c) reference.
Figure 10. Difference image of Village of Feltwell dataset. (a) LR; (b) MR; (c) PCA.
Figure 11. The ROC curves of operator-generated DIs.
Figure 12. The change detection results of different DI images. (a) LRTLC; (b) MRTLC; (c) PCATLC.
Figure 13. Results for the Village of Feltwell dataset. (a) PCAKM; (b) Bayesian; (c) GaborTLC; (d) LMT; (e) NRCR; (f) AGLRTM; (g) INPCANet; (h) proposed; (i) reference. In the binary change map, the following apply: white, TPs; red, FPs; black, TNs; and green, FNs.
Figure 14. Results for the Ottawa dataset. (a) PCAKM; (b) Bayesian; (c) GaborTLC; (d) LMT; (e) NRCR; (f) AGLRTM; (g) INPCANet; (h) proposed; (i) reference.
Figure 15. Results for the San Francisco dataset. (a) PCAKM; (b) Bayesian; (c) GaborTLC; (d) LMT; (e) NRCR; (f) AGLRTM; (g) INPCANet; (h) proposed; (i) reference.
Figure 16. Results for the Yellow River dataset. (a) PCAKM; (b) Bayesian; (c) GaborTLC; (d) LMT; (e) NRCR; (f) AGLRTM; (g) INPCANet; (h) proposed; (i) reference.
Figure 17. Results for the Sulzberger Ice Shelf dataset. (a) PCAKM; (b) Bayesian; (c) GaborTLC; (d) LMT; (e) NRCR; (f) AGLRTM; (g) INPCANet; (h) proposed; (i) reference.
Table 1. Descriptions of used datasets.

Dataset | Event | Satellite | Sensor Type | Size
Village of Feltwell | Land cover variations | Daedalus 1268 ATM | SAR | 335 × 470
Ottawa | Flood | Radarsat-1 | SAR | 290 × 350
San Francisco | Flood | ERS-2 | SAR | 256 × 256
Yellow River | Flood | Radarsat-2 | SAR | 257 × 289
Sulzberger Ice Shelf | Ice breakup | European Space Agency's Envisat satellite | SAR | 256 × 256
Table 2. Evaluation criteria.

Evaluation Metric | Meaning | Calculation Formula
FN | Changed pixels that are incorrectly detected as unchanged | –
FP | Unchanged pixels that are incorrectly detected as changed | –
OE | Overall error | OE = FN + FP
PCC | Percentage of correct classifications | PCC = (TP + TN) / (TP + TN + FN + FP)
Kappa | Kappa coefficient | Kappa = (PCC − PRE) / (1 − PRE), where PRE = [(TP + FP)(TP + FN) + (FN + TN)(FP + TN)] / (TP + TN + FP + FN)²
F1 | F1-score | F1 = 2 · precision · recall / (precision + recall), where precision = TP / (TP + FP) and recall = TP / (TP + FN)

Notes: TN (true negative): unchanged pixels that are correctly classified. TP (true positive): changed pixels that are correctly classified.
Table 3. The AUC and Ddist values of different DI images on the Village of Feltwell dataset.

DI | AUC | Ddist
LR | 0.9986 | 1.3964
MR | 0.0017 | 0.0252
PCA | 0.9988 | 1.3977
Table 4. The quantitative evaluation of different DI images with the two-level clustering algorithm on the Village of Feltwell dataset.

Method | FN | FP | OE | PCC (%) | KC (%) | F1 (%)
LRTLC | 539 | 77 | 616 | 99.61 | 92.11 | 92.31
MRTLC | 124 | 931 | 1055 | 99.33 | 88.29 | 88.63
PCATLC | 269 | 91 | 360 | 99.77 | 95.54 | 95.66
Table 5. The quantitative evaluation of different methods for the Village of Feltwell dataset.

Method | FN | FP | OE | PCC (%) | KC (%) | F1 (%)
PCAKM | 299 | 340 | 639 | 99.59 | 92.29 | 92.49
Bayesian | 46 | 2731 | 2777 | 98.24 | 74.25 | 75.11
GaborTLC | 358 | 177 | 535 | 99.66 | 93.37 | 93.55
LMT | 127 | 1863 | 1990 | 98.74 | 79.87 | 80.51
NRCR | 1941 | 14 | 1955 | 98.76 | 69.55 | 70.13
AGLRTM | 2715 | 24 | 2739 | 98.26 | 51.93 | 52.62
INPCANet | 1337 | 93 | 1430 | 99.09 | 79.77 | 80.22
Proposed | 269 | 91 | 360 | 99.77 | 95.54 | 95.66
Table 6. The quantitative evaluation of different methods for the Ottawa dataset.

Method | FN | FP | OE | PCC (%) | KC (%) | F1 (%)
PCAKM | 1901 | 583 | 2484 | 97.55 | 90.49 | 91.93
Bayesian | 299 | 3924 | 4223 | 95.84 | 85.69 | 88.18
GaborTLC | 2531 | 253 | 2784 | 97.26 | 89.07 | 90.66
LMT | 5266 | 23 | 5289 | 94.79 | 77.43 | 80.31
NRCR | 4971 | 33 | 5004 | 95.07 | 78.84 | 81.58
AGLRTM | 2989 | 101 | 3090 | 96.96 | 87.66 | 89.42
INPCANet | 8709 | 0 | 8709 | 91.42 | 58.66 | 62.76
Proposed | 2258 | 58 | 2316 | 97.72 | 90.92 | 92.25
Table 7. The quantitative evaluation of different methods for the San Francisco dataset.

Method | FN | FP | OE | PCC (%) | KC (%) | F1 (%)
PCAKM | 383 | 1023 | 1406 | 97.85 | 84.80 | 85.95
Bayesian | 551 | 923 | 1474 | 97.75 | 83.66 | 84.87
GaborTLC | 324 | 803 | 1127 | 98.28 | 87.63 | 88.56
LMT | 1046 | 248 | 1294 | 98.03 | 83.86 | 84.90
NRCR | 598 | 287 | 885 | 98.65 | 89.51 | 90.23
AGLRTM | 182 | 1204 | 1386 | 97.89 | 85.53 | 86.66
INPCANet | 1120 | 5 | 1125 | 98.28 | 85.47 | 86.37
Proposed | 513 | 327 | 840 | 98.72 | 90.16 | 90.85
Table 8. The quantitative evaluation of different methods for the Yellow River dataset.

Method | FN | FP | OE | PCC (%) | KC (%) | F1 (%)
PCAKM | 3359 | 4224 | 7583 | 89.79 | 66.38 | 72.65
Bayesian | 4422 | 535 | 4957 | 93.33 | 74.61 | 78.43
GaborTLC | 2869 | 1656 | 4525 | 93.91 | 78.69 | 82.36
LMT | 13404 | 0 | 13404 | 81.95 | 0.34 | 0.42
NRCR | 2256 | 2501 | 4757 | 93.60 | 78.54 | 82.45
AGLRTM | 6574 | 192 | 6766 | 90.89 | 62.27 | 66.97
INPCANet | 2788 | 955 | 3743 | 94.96 | 82.04 | 85.05
Proposed | 3061 | 574 | 3635 | 95.11 | 82.20 | 85.09
Table 9. The quantitative evaluation of different methods for the Sulzberger Ice Shelf dataset.

Method | FN | FP | OE | PCC (%) | KC (%) | F1 (%)
PCAKM | 322 | 733 | 1055 | 98.39 | 94.88 | 95.88
Bayesian | 0 | 7734 | 7734 | 88.20 | 69.22 | 76.53
GaborTLC | 307 | 205 | 512 | 99.22 | 97.48 | 97.96
LMT | 5322 | 1 | 5323 | 91.88 | 68.86 | 73.25
NRCR | 466 | 527 | 993 | 98.48 | 95.13 | 96.07
AGLRTM | 1173 | 128 | 1301 | 98.01 | 93.40 | 94.62
INPCANet | 2330 | 1621 | 3951 | 93.97 | 80.18 | 83.88
Proposed | 338 | 409 | 747 | 98.86 | 96.34 | 97.05
