Article

An Improved Pansharpening Method for Misaligned Panchromatic and Multispectral Data

Key Laboratory of Digital Earth Sciences, Institute of Remote Sensing and Digital Earth, Chinese Academy of Sciences, Beijing 100094, China
* Authors to whom correspondence should be addressed.
Sensors 2018, 18(2), 557; https://doi.org/10.3390/s18020557
Submission received: 16 January 2018 / Revised: 7 February 2018 / Accepted: 9 February 2018 / Published: 11 February 2018
(This article belongs to the Section Remote Sensors)

Abstract

Numerous pansharpening methods have been proposed in recent decades for fusing low-spatial-resolution multispectral (MS) images with high-spatial-resolution (HSR) panchromatic (PAN) bands to produce fused HSR MS images, which are widely used in various remote sensing tasks. The effect of misregistration between MS and PAN bands on the quality of fused products has gained much attention in recent years. This paper proposes an improved method for misaligned MS and PAN imagery, built on two improvements to a previously published method named RMI (reduce misalignment impact). The performance of the proposed method was assessed by comparison with several outstanding fusion methods, such as adaptive Gram-Schmidt and the generalized Laplacian pyramid. Experimental results show that the improved version reduces the spectral distortions of fused dark pixels and sharpens the boundaries between different image objects, while obtaining quality indexes similar to those of the original RMI method. In addition, the sensitivity of the proposed method to misalignments between MS and PAN bands was evaluated; the results show that it is more robust to such misalignments than the other methods.

1. Introduction

Due to the physical constraints of remote sensing imaging and the limited bandwidth of satellite transmission, a large number of currently operating satellites, such as SPOT, IKONOS (IK), QuickBird (QB), and WorldView-2/3, provide both a single relatively high-spatial-resolution (HSR) panchromatic (PAN) band and several low-spatial-resolution (LSR) multispectral (MS) bands. Fusion of PAN and MS images is also referred to as pansharpening [1]. It is an important pre-processing step for generating high-quality HSR MS images in various remote sensing tasks [2,3], such as land cover classification, object recognition [4,5], water-body mapping [6,7,8], and shadow detection [9]. In recent years, the demand for fused HSR MS imagery has grown steadily, owing to abundant remote sensing data sources [10].
Research on pansharpening has been conducted for decades, and numerous algorithms have been developed to produce HSR MS imagery. The major challenge for current pansharpening methods is reducing spectral and spatial distortions. Researchers have proposed several categorizations to group existing pansharpening methods [11,12,13,14,15,16]. One categorization divides current pansharpening methods into two categories: component-substitution (CS) methods and methods based on multiresolution analysis (MRA) [17,18,19,20]. CS methods can provide fused products with good spatial quality in most cases, but they sometimes suffer from spectral distortions. MRA-based methods can provide fused products with good spectral quality; however, their fused products may suffer from spatial distortions [21]. Moreover, this disadvantage may be accentuated for misaligned MS and PAN data, especially for MRA-based methods using transformations that are not shift-invariant to achieve multiresolution analysis [21,22]. In contrast, CS methods are robust to possible misalignment between MS and PAN bands and have a relatively low computational burden. These favorable characteristics have led to their widespread use.
In the design of pansharpening algorithms, it is commonly assumed that the PAN and MS bands are perfectly aligned. Researchers have made tremendous progress in image registration, for both natural images [23,24,25] and remotely sensed images [26,27]. However, it is difficult to achieve perfect co-registration between PAN and MS bands. Even though PAN and MS images are often acquired simultaneously by the same platform, misalignment may arise from the small angular difference between the two sensors during data acquisition. Hence, some misregistration between PAN and MS images is unavoidable in practice. The impact of image misalignment on the quality of fused products has been reported by several researchers. Blanc et al. [28] showed that even a small misregistration of 0.3 pixel can have a noticeable effect on the quality of fused images. Baronti et al. [29] theoretically analyzed the effects of misregistration on pansharpened images and concluded that misregistration degrades the performance of all pansharpening methods, and that MRA-based methods are much more sensitive to registration errors than CS methods. In order to evaluate the quality of pansharpened images with reference to a true MS image, the original MS and PAN datasets are usually spatially degraded to a lower resolution. Jing et al. [30] reported that misalignments between MS and PAN bands can be introduced by some decimation approaches used to produce degraded datasets, leading to unreliable performance evaluation of different pansharpening methods. Xu et al. [13] discussed two key problems that affect the quality of fused images, i.e., misregistration and the size difference caused by the different resolutions of MS and PAN bands, and found that both should be considered in the design of a fusion algorithm. Hallabia et al. [31] proposed a modified pansharpening method based on filter banks that takes the case of misalignment into account. Jing and Cheng [18] proposed a ratio-based method that can reduce misalignment impact, denoted RMI. It was demonstrated that the RMI method performs well, in terms of both quality indexes and visual inspection, for aligned as well as misaligned PAN and MS images [18]. This method achieves good performance for two reasons. The key reason is that it uses a synthetic PAN band at the PAN scale, generated as a weighted sum of the up-sampled MS bands using optimal regression coefficients obtained at the MS scale. The second is that the method also reduces the effect of image haze caused by atmospheric path radiance.
In this paper, an improved version of the RMI method is proposed, based on two improvements. First, pixels corresponding to edge pixels identified in the PAN band are fused by injecting spatial details with relatively large coefficients, to sharpen the boundaries between different image objects. Second, dark pixels are fused using haze values lower than those used for the other pixels, in order to improve the quality of fused dark pixels. The experimental results show that the proposed method can reduce the spectral distortions of fused dark pixels and sharpen the boundaries between different objects, while remaining robust to misalignments between MS and PAN bands. The rest of this paper is organized as follows: the proposed improved method is introduced in Section 2, and a fusion experiment is reported in Section 3. Section 4 presents related discussions, and Section 5 summarizes the conclusions of this work.

2. Methodologies

2.1. The RMI Method

The RMI method is a ratio-based approach that accounts for haze caused by atmospheric path radiance. Let $H_P$ and $H_i$ denote the image haze values determined for the PAN band and the ith MS band, respectively; let $I_i$ denote the ith band of the up-sampled MS image; and let $P$ and $P_S$ be the original PAN band and a synthetic version of it, respectively. The fused ith MS band $F_i$ generated by the RMI method can be expressed using Equation (1):
$$F_i = (I_i - H_i)\,\frac{P - H_P}{P_S - H_P} + H_i, \quad i = 1, \ldots, N \tag{1}$$
Equation (1) can also be rewritten as (2):
$$F_i = I_i + \frac{I_i - H_i}{P_S - H_P}\,(P - P_S), \quad i = 1, \ldots, N \tag{2}$$
in which $P - P_S$ represents the spatial details derived from the original PAN band, and $\frac{I_i - H_i}{P_S - H_P}$ is the space-varying injection coefficient for the ith band.
$P_S$ is obtained as a weighted sum of the up-sampled MS bands, as in (3):
$$P_S = \sum_{i=1}^{N} a_i \cdot I_i + b \tag{3}$$
in which $a_i$ is the weight coefficient of the ith band, and $b$ is a constant. The optimal values of $a_i$ and $b$ can be estimated by the least-squares approach using Equation (4):
$$P^L = \sum_{i=1}^{N} a_i \cdot I_i^L + b \tag{4}$$
where $P^L$ is a degraded version of the original PAN band, obtained using an averaging approach, and $I_i^L$ is the ith original MS band.
For misaligned MS and PAN bands, the optimal regression coefficients, which capture the maximum multiple correlation between the misaligned PAN band and the MS bands, can discount the effect of misalignment on the quality of fused images. Another factor contributing to the good performance of the RMI method is that it accounts for image haze caused by atmospheric path radiance; the haze values of the PAN and MS bands have an important effect on the spectral vector directions of fused MS pixels and thus on the spectral quality of the whole fusion product [32].
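The core of RMI is compact enough to prototype directly. The following NumPy sketch illustrates Eqs. (2)-(4) under stated assumptions: band-first float arrays, a 4 × 4 box filter for the degrade step, and haze values supplied by the caller. It is an illustration, not the authors' MATLAB implementation.

```python
import numpy as np

def box_degrade(pan, ratio=4):
    """Degrade the PAN band to MS scale with a ratio x ratio box (average) filter."""
    h, w = pan.shape
    return pan.reshape(h // ratio, ratio, w // ratio, ratio).mean(axis=(1, 3))

def rmi_fuse(ms_up, ms_orig, pan, haze_ms, haze_pan):
    """ms_up: (N, H, W) MS bands up-sampled to the PAN grid (float);
    ms_orig: (N, h, w) original MS bands; pan: (H, W) PAN band;
    haze_ms: length-N haze values; haze_pan: scalar PAN haze value."""
    pan_low = box_degrade(pan)                          # P^L in Eq. (4)
    # Least-squares estimate of a_i and b at MS scale: P^L ~ sum_i a_i I_i^L + b
    A = np.column_stack([band.ravel() for band in ms_orig] + [np.ones(pan_low.size)])
    coef, *_ = np.linalg.lstsq(A, pan_low.ravel(), rcond=None)
    a, b = coef[:-1], coef[-1]
    pan_syn = np.tensordot(a, ms_up, axes=1) + b        # synthetic PAN, Eq. (3)
    details = pan - pan_syn                             # P - P_S
    fused = np.empty_like(ms_up)
    for i in range(ms_up.shape[0]):
        # Space-varying injection gain of Eq. (2); real data would warrant an
        # epsilon guard where pan_syn approaches haze_pan.
        gain = (ms_up[i] - haze_ms[i]) / (pan_syn - haze_pan)
        fused[i] = ms_up[i] + gain * details
    return fused
```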

2.2. The Improved Version of the RMI Method

In this work, two improvements are made to the original RMI method. For the first improvement, boundary pixels of the PAN band are identified using an edge detection operator and then fused using larger injection coefficients than the other pixels. Fused images generated by commonly used methods show blurred boundaries between different objects, for several reasons, such as misalignment, aliasing, and the effect of mixed pixels. As this improvement injects more spatial details into edge pixels, it helps to obtain fused products with sharpened boundaries between different image objects. For an edge pixel t, the fused version of t is calculated using (5):
$$F_i(t) = I_i(t) + \left(1 + \frac{k}{10}\right)\frac{I_i(t) - H_i}{P_S(t) - H_P}\,[P(t) - P_S(t)], \quad i = 1, \ldots, N, \; k = 0, 1, \ldots, 4 \tag{5}$$
in which k is a value determining the sharpness of boundaries between different objects in the pansharpened products. Its value can be set by users according to the application purpose.
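A minimal sketch of this first improvement is given below: the injected details are scaled by (1 + k/10) on an edge mask, as in Eq. (5), and left unscaled elsewhere, as in Eq. (2). skimage's canny (with its automatic thresholds) stands in for the edge detection step; array layouts and names are assumptions.

```python
import numpy as np
from skimage.feature import canny

def fuse_edge_boosted(ms_up, pan, pan_syn, haze_ms, haze_pan, k=2):
    """Apply Eq. (5) on PAN edge pixels and Eq. (2) elsewhere.
    ms_up: (N, H, W) up-sampled MS bands; pan, pan_syn: (H, W); k in 0..4."""
    edges = canny(pan)                            # boolean edge mask of the PAN band
    boost = np.where(edges, 1.0 + k / 10.0, 1.0)  # (1 + k/10) on edges only
    details = pan - pan_syn
    fused = np.empty_like(ms_up)
    for i in range(ms_up.shape[0]):
        gain = (ms_up[i] - haze_ms[i]) / (pan_syn - haze_pan)
        fused[i] = ms_up[i] + boost * gain * details
    return fused
```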
For the second improvement, relatively dark pixels in the image, which mainly correspond to dark objects such as water-bodies and shadows, are identified according to a certain threshold and then fused using relatively low haze values. A high haze value may produce abnormally high injection coefficients for dark pixels and thus over-enhance them. This improvement helps reduce the spectral distortions of water-bodies and shadow-covered regions in fused products. In this work, a threshold T used to identify dark pixels was automatically determined as the product of the standard deviation of the PAN band, $\delta_P$, and a scale factor S.
For a pixel with a gray value higher than T in the PAN band, the fused version $F_i$ is produced using (2). In contrast, for a pixel with a gray value lower than T, relatively low haze values for the MS and PAN bands, denoted $H_i^L$ and $H_P^L$, respectively, are employed to generate the fused pixels. In this case, the fused version $F_i$ is produced using (6):
$$F_i = I_i + \frac{I_i - H_i^L}{P_S - H_P^L}\,(P - P_S), \quad i = 1, \ldots, N \tag{6}$$
in which $H_i^L$ is determined using (7):
$$H_i^L = p \cdot H_i, \quad i = 1, \ldots, N \tag{7}$$
where p is a scale constant lower than 1; a value of 0.75 was assigned to p in this work. The values of $H_P^L$ are obtained from the coefficients estimated using (4), as in (8):
$$H_P^L = \sum_{i=1}^{N} a_i \cdot H_i^L + b \tag{8}$$
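The dark-pixel handling can be sketched in the same style. In the snippet below, pixels below the threshold $T = S \cdot \delta_P$ are fused with the reduced haze values of Eqs. (7) and (8), with p = 0.75 and S = 0.3 as in the text; applying the test to haze-corrected PAN values is an assumption about the exact form of the comparison.

```python
import numpy as np

def fuse_dark_aware(ms_up, pan, pan_syn, haze_ms, haze_pan, a, b, p=0.75, S=0.3):
    """Fuse with Eq. (2) for normal pixels and Eq. (6) for dark pixels.
    a, b: regression coefficients from Eq. (4)."""
    T = S * pan.std()                             # threshold T = S * std(PAN)
    haze_ms_low = p * np.asarray(haze_ms)         # H_i^L, Eq. (7)
    haze_pan_low = float(a @ haze_ms_low + b)     # H_P^L, Eq. (8)
    dark = (pan - haze_pan) < T                   # dark-pixel mask (assumed form)
    details = pan - pan_syn
    fused = np.empty_like(ms_up)
    for i in range(ms_up.shape[0]):
        gain_normal = (ms_up[i] - haze_ms[i]) / (pan_syn - haze_pan)        # Eq. (2)
        gain_dark = (ms_up[i] - haze_ms_low[i]) / (pan_syn - haze_pan_low)  # Eq. (6)
        fused[i] = ms_up[i] + np.where(dark, gain_dark, gain_normal) * details
    return fused
```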
The proposed method is implemented in MATLAB. Its pseudocode is reported as Algorithm 1, to help others implement and use the method in remote sensing tasks.
Algorithm 1: Improved_RMI
   input: up-sampled MS bands I, PAN band P, band number N
   output: fused MS bands F

   Let $P^L$ be a PAN band at MS scale, generated from P using an averaging approach
   Let $a_i$ and $b$ be the coefficients estimated using (4) by the least-squares approach
   $P_S \leftarrow \sum_{i=1}^{N} a_i \cdot I_i$ + b
   Let $H_P$ and $H_i$ be the haze values for the PAN band and the ith MS band, respectively
   Let $H_i^L$ and $H_P^L$ be the haze values used for dark pixels
   $H_i^L \leftarrow p \cdot H_i$
   $H_P^L \leftarrow \sum_{i=1}^{N} a_i \cdot H_i^L + b$
   Let E be the set of edge PAN pixels identified using the Canny detector, and let D be the set of pixels not belonging to E
   Let $F_i$ be the fused ith MS band
   Let T be the threshold used to identify dark pixels, determined as $T = \delta_P \times S$
   for each pixel t in E
      for each band i in [1, 2, …, N]
         $F_i(t)$ is calculated using (5)
      end for
   end for

   for each pixel t in D
      for each band i in [1, 2, …, N]
         if $P(t) - H_P \geq T$
            $F_i(t)$ is calculated using (2)
         else
            $F_i(t)$ is calculated using (6)
         end if
      end for
   end for
   return F

3. Experiments

3.1. Test Data

Three image scenes recorded by WorldView-2 (WV2), IK, and QB, respectively, were used in the experiment to evaluate the performance of the proposed method. A subset with a size of 512 × 512 pixels for the MS bands and 2048 × 2048 pixels for the PAN band was selected from each scene. The MS image from WV2 has eight bands, whereas those of IK and QB have four bands. The spatial resolution ratio R for all three datasets is 4. The MS images of the three datasets are presented in Figure 1.

3.2. Comparing with Other Methods

3.2.1. Fusion Methods for Comparison and Evaluation Criteria

The proposed method was compared with adaptive Gram-Schmidt (GSA) [1,33] and the Generalized Laplacian Pyramid (GLP) with a Gaussian-shaped filter matched to the modulation transfer function, using the spectral distortion minimizing injection model [34]. Image haze can also be taken into account by other methods using similar injection models. In order to achieve an impartial comparison, a version of the GLP method considering image haze was also implemented and included in the comparisons; it is denoted GLP-H henceforth. A haze-aware version of GSA was also considered, but it was not included in this work because it yielded the same performance as the original GSA method; it can be inferred that considering image haze neither improves nor degrades the performance of GSA.
Besides the original versions of the three datasets, degraded versions of the original datasets were also considered, as fused images generated at the degraded scale can be assessed using the original MS images as reference. The quality of fused products at the degraded scale was assessed using several widely used quality indexes, including the relative average spectral error (RASE) [35], the dimensionless global relative error of synthesis (ERGAS) [36], the spectral angle mapper (SAM) [37], Q2n [38,39,40], and the spatial correlation coefficient (SCC) [41]. The quality with no reference (QNR) index [42], which is increasingly used in recent studies, was employed to evaluate the quality of pansharpened images produced from the original datasets. QNR is a comprehensive index consisting of a spectral distortion indicator Dλ and a spatial distortion indicator DS [42]. In addition, visual inspection was performed to compare the quality of the fused products. In order to avoid misalignments between MS and PAN bands, an aligned version of each original PAN band was generated using an approach similar to that employed in [43] and used to replace the original PAN band in this experiment. The degraded MS and PAN images were generated with a 4 × 4 box filter, according to the resolution ratio of the three datasets, in order to avoid misalignment introduced during the decimation process [30].
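For readers implementing the evaluation, the sketch below computes two of the reference-based indexes from their standard definitions: SAM as the mean spectral angle (in degrees) between fused and reference pixel vectors, and ERGAS with the PAN/MS resolution ratio R = 4 used here. It follows the common formulas rather than any particular toolbox.

```python
import numpy as np

def sam_degrees(fused, ref, eps=1e-12):
    """Mean spectral angle in degrees; fused, ref: (N, H, W) arrays."""
    f = fused.reshape(fused.shape[0], -1)
    r = ref.reshape(ref.shape[0], -1)
    cos = (f * r).sum(axis=0) / (np.linalg.norm(f, axis=0) * np.linalg.norm(r, axis=0) + eps)
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0))).mean()

def ergas(fused, ref, ratio=4):
    """ERGAS = 100 * (1/ratio) * sqrt(mean over bands of (RMSE_b / mean_b)^2)."""
    rmse = np.sqrt(((fused - ref) ** 2).mean(axis=(1, 2)))
    means = ref.mean(axis=(1, 2))
    return 100.0 / ratio * np.sqrt(((rmse / means) ** 2).mean())
```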

3.2.2. Results and Analysis

Table 1 reports the quality indexes computed from the fused products obtained at the two scales. For each test image, the improved methods with different values of k used the same haze values as those employed by the original RMI method. In this table, RMI (k = n) denotes the fused images generated by the improved RMI method with k = n, where n is an integer ranging from 0 to 4. In order to highlight the performance of the compared methods on the fusion of dark pixels, a SAM value between the fused and reference versions of the dark pixels, which are identified during the implementation of the proposed method, was calculated from each fused image generated at the degraded scale. This index is also listed in Table 1, denoted SAMd.
It can be observed from Table 1 that RMI (k = 0) performs slightly better than or similarly to the original RMI method. The improved and original RMI methods provide higher Q2n and QNR values than the other methods, with one exception: GSA and GLP-H yield lower Dλ and Ds values, and higher QNR values, than the other methods for the original QB dataset. For the degraded WV2 dataset, RMI (k = 0) offers slightly lower RASE and ERGAS values and slightly higher Q2n and SCC values than the original RMI method. For the original WV2 dataset, RMI (k = 0) provides a slightly lower Ds value and a slightly higher QNR value than the original RMI method. For the degraded IK dataset, RMI (k = 0) provides slightly lower Q2n and RASE values than the original RMI method, and the same ERGAS and SAM values. The former also offers a slightly higher QNR value and a slightly lower Ds value than the latter for the original IK dataset. For the degraded QB dataset, RMI (k = 0) provides slightly lower RASE, ERGAS, and SAM values, and higher Q2n and SCC values, than the original RMI. For the original QB dataset, RMI (k = 0) also yields a lower Dλ and a higher QNR than the original RMI. The improved RMI methods offer lower SAMd values than the original RMI for all three datasets, indicating that the proposed method is effective in reducing the spectral distortions of fused dark pixels. The GSA method yields the highest SAMd values for the three degraded datasets, indicating that it performs poorest for dark pixels. As the three test images contain different numbers of dark pixels, it is reasonable that the proposed approach performs slightly differently on each.
The Q2n and QNR values of the fused products generated by the improved RMI methods decrease as k increases. However, the fused products generated by RMI (k = 4) still offer higher Q2n and QNR values than those generated by the GSA, GLP, and GLP-H methods for the WV-2 and IK datasets. The GLP-H method offers higher Q2n and QNR values than GSA and GLP for the three datasets, owing to its consideration of image haze. The GLP method offers lower Q2n and QNR values than GSA for the three datasets. This may be due to misalignments between MS and PAN bands introduced during the down-sampling process used to obtain a LSR PAN band at MS scale and the subsequent up-sampling process used to obtain an expanded version of the LSR PAN band.

The original and pansharpened images of a subset selected from each original dataset are shown in Figure 2, Figure 3 and Figure 4, for visual inspection. For the WV-2 dataset, the images are shown as compositions of bands 5, 7, and 2. For the IK and QB datasets, the images are shown as compositions of bands 3, 4, and 1.
Although the pansharpened images generated by these methods show similar tone in each figure, the fused images produced by the improved and original RMI methods and by GLP-H show more texture details in vegetation-covered regions; this is obvious in Figure 2 and is due to these methods' consideration of image haze. Some artefacts can be seen in the fused images generated by the original RMI method, as shown in Figure 2f and Figure 4f. The artefacts occur in some regions covered by shadows or water-bodies. In contrast, these artefacts are absent from the pansharpened images generated by RMI (k = 0), RMI (k = 2), and RMI (k = 4), as shown in Figure 2 and Figure 4. Although GSA yields the highest QNR value for the original QB dataset, the GSA-fused images show obvious spectral distortions; this can be observed in the yellow rectangle in Figure 2g and the red and yellow rectangles in Figure 4g. In addition, the GSA-fused WV2 image shows very few texture details in vegetation-covered regions, because very few spatial details are injected into the two near-infrared (NIR) bands of the WV2 dataset, owing to the relatively low correlations between PAN and the two NIR bands. The fused images generated by GLP and GLP-H also show noticeable spectral distortions, as shown in the yellow rectangles in Figure 2, Figure 3 and Figure 4, and aliasing effects can be observed in their fused images, as shown in the yellow rectangles in Figure 3 and Figure 4. Moreover, the fused products produced by RMI (k = 2) and RMI (k = 4) offer sharper boundaries between different objects than the original RMI, GSA, GLP, and GLP-H; this can be seen in the fused subsets in the yellow rectangles in Figure 2, Figure 3 and Figure 4. The fused images generated by RMI (k = 2) and RMI (k = 4) also provide sharper boundaries than those produced by RMI (k = 0), because more spatial details are injected into edge pixels. Consequently, the improved approach can reduce the spectral distortions of fused dark pixels and sharpen some boundaries in fused products, while obtaining quality indexes similar to those of the original RMI method.

3.3. Sensitivity to Misalignments between MS and PAN Bands

The proposed method was also evaluated with respect to its sensitivity to misalignments between MS and PAN bands in another experiment, using the degraded IK dataset. The up-sampled version of the degraded MS image was shifted by zero to four PAN pixels along rows and columns, respectively, to yield several shifted versions of the up-sampled MS image. A shift of r pixels along rows and c pixels along columns is denoted as a couple (r, c). Several couples, including (0, 1), (1, 1), (2, 1), (2, 2), (3, 2), (3, 3), (4, 3), and (4, 4), were employed to generate shifted MS images. These shifted up-sampled MS images were then fused with the degraded 4-m PAN band using the proposed method, GSA, GLP, and GLP-H, to produce 4-m fused products.
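The shifting protocol of this experiment can be reproduced along the following lines. Using np.roll (a circular shift) is an assumption, since the paper does not state how image borders are handled, and the commented loop assumes the helper functions from the earlier sketches.

```python
import numpy as np

# Shift couples (rows, columns), in PAN pixels, as listed in the text.
SHIFTS = [(0, 1), (1, 1), (2, 1), (2, 2), (3, 2), (3, 3), (4, 3), (4, 4)]

def shift_ms(ms_up, r, c):
    """Shift all bands of an (N, H, W) up-sampled MS image by r rows, c columns."""
    return np.roll(ms_up, shift=(r, c), axis=(1, 2))

# Sketch of the loop: fuse each shifted MS image with the degraded PAN band,
# then score the result against the reference MS image.
# for r, c in SHIFTS:
#     fused = rmi_fuse(shift_ms(ms_up, r, c), ms_orig, pan, haze_ms, haze_pan)
#     print((r, c), sam_degrees(fused, reference), ergas(fused, reference))
```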
The quality of these fused products was assessed using ERGAS, Q4, SAM, and SCC, which are shown in Figure 5a–d, respectively. As the improved RMI methods with different k values give very similar index values, only the cases k = 0, k = 2, and k = 4 are shown in the figure for clearer visual inspection. As illustrated in those figures, the original RMI, RMI (k = 0), and RMI (k = 2) offer very similar performances in terms of the four indexes. In fact, RMI (k = 0) offers slightly lower ERGAS and SAM values, and slightly higher Q4 and SCC values, than the original RMI method, and RMI (k = 2) also gives lower SAM values than the original RMI method.
The pansharpened images generated by the original RMI, RMI (k = 0), and RMI (k = 2) offer the lowest ERGAS and SAM values and the highest Q4 and SCC values in most cases. Exceptions occur at misalignments of (4, 3) and (4, 4), where the GSA method yields the highest Q4 values and higher SCC values than RMI (k = 2). In addition, RMI (k = 4) offers lower SCC values than the GSA and GLP-H methods in some cases, although it offers lower ERGAS and SAM values in all cases. This indicates that RMI (k = 0) and RMI (k = 2) are the best choices when misalignments between MS and PAN bands are no more than three PAN pixels along rows and columns.
The GSA method gives better performances than the two GLP methods when the misalignment is larger than two pixels. However, when the misalignment is no more than one pixel, the GSA method offers higher ERGAS values and lower Q4 values than the GLP-H method.
The GLP-H method yields lower ERGAS and SAM values, and higher SCC values, than the GLP method in all cases. It offers higher Q4 values than the GLP and GSA methods when the misalignment is not more than two pixels, and lower Q4 values than the latter two when the misalignment is larger. This indicates that GLP-H performs better than GLP and GSA when the misalignment is not more than two pixels.
It can also be observed that the four quality indexes decline at different rates as the misalignment increases. The four quality indexes of the original RMI, RMI (k = 0), RMI (k = 2), RMI (k = 4), and GSA decrease more slowly than those of GLP and GLP-H, because MRA methods are more sensitive to misalignments between MS and PAN bands than CS methods. Although the curves of GSA are even gentler than those of the original RMI, RMI (k = 0), RMI (k = 2), and RMI (k = 4), GSA offers poorer quality indexes than the latter in most cases. In conclusion, the proposed method is robust to misalignments between MS and PAN bands, while offering quality indexes comparable to those of the other methods.
The original and fused images of a 300 × 300 subset of the original IK dataset with a shift of (2, 1) are shown in Figure 6. All the fused MS images were stretched to the same histogram as the corresponding original MS bands. The three fused products generated by RMI (k = 0), RMI (k = 2), and RMI (k = 4), shown in Figure 6c–e, yield the best visual quality, followed by those produced by the original RMI and GSA. The yellow rectangle in Figure 6g shows that the GSA-fused image exhibits noticeable spectral distortion. In addition, aliasing effects can be observed in the fused images generated by GLP and GLP-H, as shown in the red and yellow rectangles in Figure 6h,i. Hence, visual inspection also indicates that the improved RMI is effective in reducing the impact of misalignments between PAN and MS bands on the quality of pansharpened products.

4. Discussion

The effects of misalignment on pansharpened remote sensing images have drawn increasing attention in recent years. Although the RMI method was proposed several years ago, its advantage for the fusion of misaligned MS and PAN bands had not been fully explored. In addition, image haze is a non-negligible factor for ratio-based fusion methods, especially when image fusion is performed on remote sensing images in digital numbers. The determination of haze values has an important impact on the spectral distortion of fused products [32,43,44], especially for fused pixels corresponding to vegetation and dark objects, such as shadows and water-bodies. This work proposed an improved version of the RMI method through two improvements. The overall quality indexes for the proposed method, which aims at reducing the spectral distortions of fused dark pixels, did not show significant improvement in the experiment, owing to the limited number of dark pixels in the test datasets. However, the proposed method shows consistent improvements in terms of SAMd, a SAM index calculated using only dark pixels. It was also shown that the improved RMI method is robust to misalignment between MS and PAN bands, while producing pansharpened images with sharpened boundaries. This is very useful for producing high-resolution MS images covering urban regions, and is also helpful for local region mapping, image interpretation, and applications related to water-bodies and shadows. As the proposed method yields rich texture details in vegetation-covered regions, it can provide comparable performances for remote sensing images covering agricultural and forest areas. The proposed method can also be applied to medium-resolution images, e.g., ASTER, Landsat 7/8, and Sentinel-2A. However, it shows more advantages for high-resolution images, as boundary information and texture details are richer in high-resolution images than in medium-resolution images.
A major difference between the proposed method and the other methods is that it allows producing fused products with sharper boundaries, through the choice of the value of k. In order to give some guidelines on how to choose an optimal value for this parameter, a fusion experiment was performed testing values of k ranging from 0 to 10, using the degraded QB dataset. The quality indexes calculated from the fused images generated by the proposed method are reported in Table 2. It can be seen that the Q2n index decreases as k increases, because injecting more spatial details into the up-sampled MS bands may result in larger spectral distortion of the fused products. As RMI (k = 4) provides indexes comparable to those of other methods such as GSA and GLP (as shown in Table 1), we suggest setting the maximum value of k to 4 for the proposed method. We consider that this value yields a good balance between the spectral and spatial quality of fused pixels corresponding to edge PAN pixels.
As introduced in Section 2.2, the threshold used to identify dark pixels was automatically determined as the product of the standard deviation of the PAN band and a scale factor S. We used a value of 0.3 for S in the experiments in this work. In order to evaluate the impact of this parameter on the performance of the proposed method, values ranging from 0.1 to 1, with a step of 0.1, were tested for S in another experiment. The degraded QB dataset was used, and the value of k was set to 0. The quality indexes obtained from the fused images generated by the proposed method are shown in Table 3. The proposed method with S = 0.2 yields the highest Q2n value. The performance of the proposed method deteriorates as S increases beyond 0.2, although the changes are not significant; the fused images generated using S values ranging from 0.1 to 0.4 offer very similar Q2n values. Based on the results of this experiment, we suggest setting S to 0.2 or 0.3.
In this work, edge pixels of the PAN band were automatically identified using the Canny detector with automatically chosen sensitivity thresholds. The proposed method may yield better performance if an optimal sensitivity threshold is selected for each dataset.

5. Conclusions

An improved pansharpening method considering image haze caused by atmospheric path radiance is proposed in this study, in order to further reduce the spectral distortion of fused dark pixels and sharpen the boundaries between different image objects. The improved method was compared with the original RMI, GSA, GLP, and GLP-H. The main conclusions derived from the experimental results are as follows:
(1)
The improved approach can reduce the spectral distortions of fused dark pixels; thus, the proposed method is a good choice for producing high-resolution MS images in applications related to water-bodies and shadows.
(2)
The improved approach can be used to obtain fused images with sharpened boundaries between different objects by choosing a reasonable value for k. This is very useful for fused products covering urban regions, and for fused products used for local region mapping or image interpretation.
(3)
The experiment used to evaluate the sensitivity of these methods to misalignments between MS and PAN bands showed that the proposed method is more robust to misalignments between the MS and PAN bands than the other methods. These conclusions indicate that the improved method is very promising for practical remote sensing applications.

Acknowledgments

This research was supported in part by the Youth Foundation of Director of the Institute of Remote Sensing and Digital Earth, Chinese Academy of Sciences (grant No. Y6SJ1100CX), the Key Research Program of the Chinese Academy of Sciences (grant No. ZDRW-ZS-2016-6-1-3), the Major Special Project-the China High-Resolution Earth Observation System (grant No. 30-Y20A37-9003-15/17), and the "Light of West China" Program of the Chinese Academy of Sciences (grant No. 2015-XBQN-A-07).

Author Contributions

Hui Li was responsible for the research design, experiments, and analyses, and wrote the manuscript. Linhai Jing provided technical guidance for the proposed method, the research design, and the analyses, and reviewed the manuscript. Yunwei Tang and Haifeng Ding took part in the processing of the IK and WV-2 datasets.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Aiazzi, B.; Baronti, S.; Selva, M. Improving component substitution pansharpening through multivariate regression of MS + PAN data. IEEE Trans. Geosci. Remote Sens. 2007, 45, 3230–3239. [Google Scholar] [CrossRef]
  2. Kim, Y.; Kim, Y. Improved classification accuracy based on the output-level fusion of high-resolution satellite images and airborne LIDAR data in urban area. IEEE Geosci. Remote Sens. Lett. 2014, 11, 636–640. [Google Scholar]
  3. Yang, J.; Li, P. Impervious surface extraction in urban areas from high spatial resolution imagery using linear spectral unmixing. Remote Sens. Appl. Soc. Environ. 2015, 1, 61–71. [Google Scholar] [CrossRef]
  4. Liu, W.; Yamazaki, F. Object-based shadow extraction and correction of high-resolution optical satellite images. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2012, 5, 1296–1302. [Google Scholar] [CrossRef]
  5. Yu, L.; Wang, J.; Clinton, N.; Xin, Q.; Zhong, L.; Chen, Y.; Gong, P. FROM-GC: 30 m global cropland extent derived through multisource data integration. Int. J. Digit. Earth 2013, 6, 521–533. [Google Scholar] [CrossRef]
  6. Ashraf, S.; Brabyn, L.; Hicks, B.J. Image data fusion for the remote sensing of freshwater environments. Appl. Geogr. 2012, 32, 619–628. [Google Scholar] [CrossRef]
  7. Jawak, S.D.; Luis, A.J. A spectral index ratio-based antarctic land-cover mapping using hyperspatial 8-band worldview-2 imagery. Polar Sci. 2013, 7, 18–38. [Google Scholar] [CrossRef]
  8. Rokni, K.; Ahmad, A.; Solaimani, K.; Hazini, S. A new approach for surface water change detection: Integration of pixel level image fusion and image classification techniques. Int. J. Appl. Earth Obs. Geoinf. 2015, 34, 226–234. [Google Scholar] [CrossRef]
  9. Adeline, K.R.M.; Chen, M.; Briottet, X.; Pang, S.K.; Paparoditis, N. Shadow detection in very high spatial resolution aerial images: A comparative study. ISPRS J. Photogramm. Remote Sens. 2013, 80, 21–38. [Google Scholar] [CrossRef]
  10. Pohl, C.; Van Genderen, J.L. Review article multisensor image fusion in remote sensing: Concepts, methods and applications. Int. J. Remote Sens. 1998, 19, 823–854. [Google Scholar] [CrossRef]
  11. Vivone, G.; Alparone, L.; Chanussot, J.; Dalla Mura, M.; Garzelli, A.; Licciardi, G.A.; Restaino, R.; Wald, L. A critical comparison among pansharpening algorithms. IEEE Trans. Geosci. Remote Sens. 2015, 53, 2565–2586. [Google Scholar] [CrossRef]
  12. Thomas, C.; Ranchin, T.; Wald, L.; Chanussot, J. Synthesis of multispectral images to high spatial resolution: A critical review of fusion methods based on remote sensing physics. IEEE Trans. Geosci. Remote Sens. 2008, 46, 1301–1312. [Google Scholar] [CrossRef] [Green Version]
  13. Xu, Q.; Zhang, Y.; Li, B. Recent advances in pansharpening and key problems in applications. Int. J. Image Data Fusion 2014, 5, 175–195. [Google Scholar] [CrossRef]
  14. Lang, J.; Hao, Z. Novel image fusion method based on adaptive pulse coupled neural network and discrete multi-parameter fractional random transform. Opt. Lasers Eng. 2014, 52, 91–98. [Google Scholar] [CrossRef]
  15. Pohl, C.; van Genderen, J. Structuring contemporary remote sensing image fusion. Int. J. Image Data Fusion 2015, 6, 3–21. [Google Scholar] [CrossRef]
  16. Li, S.; Kang, X.; Fang, L.; Hu, J.; Yin, H. Pixel-level image fusion: A survey of the state of the art. Inf. Fusion 2017, 33, 100–112. [Google Scholar] [CrossRef]
  17. Zhang, J. Multi-source remote sensing data fusion: Status and trends. Int. J. Image Data Fusion 2010, 1, 5–24. [Google Scholar] [CrossRef]
  18. Jing, L.; Cheng, Q. An image fusion method for misaligned panchromatic and multispectral data. Int. J. Remote Sens. 2011, 32, 1125–1137. [Google Scholar] [CrossRef]
  19. Choi, J.; Yu, K.; Kim, Y. A new adaptive component-substitution-based satellite image fusion by using partial replacement. IEEE Trans. Geosci. Remote Sens. 2011, 49, 295–309. [Google Scholar] [CrossRef]
  20. Dong, W.; Li, X.E.; Lin, X.; Li, Z. A bidimensional empirical mode decomposition method for fusion of multispectral and panchromatic remote sensing images. Remote Sens. 2014, 6, 8446–8467. [Google Scholar] [CrossRef]
  21. Alparone, L.; Wald, L.; Chanussot, J.; Thomas, C.; Gamba, P.; Bruce, L.M. Comparison of pansharpening algorithms: Outcome of the 2006 GRS-S data-fusion contest. IEEE Trans. Geosci. Remote Sens. 2007, 45, 3012–3021. [Google Scholar] [CrossRef] [Green Version]
  22. Aiazzi, B.; Alparone, L.; Baronti, S.; Garzelli, A. Context-driven fusion of high spatial and spectral resolution images based on oversampled multiresolution analysis. IEEE Trans. Geosci. Remote Sens. 2002, 40, 2300–2312. [Google Scholar] [CrossRef]
  23. Peleg, S.; Rousso, B.; Rav-Acha, A.; Zomet, A. Mosaicing on adaptive manifolds. IEEE Trans. Pattern Anal. Mach. Intell. 2000, 22, 1144–1154. [Google Scholar] [CrossRef]
  24. Brown, M.; Lowe, D.G. Automatic panoramic image stitching using invariant features. Int. J. Comput. Vis. 2007, 74, 59–73. [Google Scholar] [CrossRef]
  25. Feng, H.; Tuotuo, L.; Zheng, G. Constraints-based graph embedding optimal surveillance-video mosaicking. In Proceedings of the First Asian Conference on Pattern Recognition, Beijing, China, 28 November 2011; pp. 311–315. [Google Scholar]
  26. Kupfer, B.; Netanyahu, N.S.; Shimshoni, I. An efficient sift-based mode-seeking algorithm for sub-pixel registration of remotely sensed images. IEEE Geosci. Remote Sens. Lett. 2015, 12, 379–383. [Google Scholar] [CrossRef]
  27. Ye, Y.; Shan, J.; Bruzzone, L.; Shen, L. Robust registration of multimodal remote sensing images based on structural similarity. IEEE Trans. Geosci. Remote Sens. 2017, 55, 2941–2958. [Google Scholar] [CrossRef]
  28. Blanc, P.; Wald, L.; Ranchin, T. Importance and effect of co-registration quality in an example of “pixel to pixel” fusion process. In Proceedings of the 2nd International Conference on Fusion of Earth Data: Merging Point Measures, Raster Maps Remotely Sensed Images, Sophia Antipolis, France, 28–30 January 1998; SEE/URISCA: Nice, France, 1998; pp. 67–74. [Google Scholar]
  29. Baronti, S.; Aiazzi, B.; Selva, M.; Garzelli, A.; Alparone, L. A theoretical analysis of the effects of aliasing and misregistration on pansharpened imagery. IEEE J. Sel. Top. Signal Process. 2011, 5, 446–453. [Google Scholar] [CrossRef]
  30. Jing, L.; Cheng, Q.; Guo, H.; Lin, Q. Image misalignment caused by decimation in image fusion evaluation. Int. J. Remote Sens. 2012, 33, 4967–4981. [Google Scholar] [CrossRef]
  31. Hallabia, H.; Kallel, A.; Hamida, A.B.; Hégarat-Mascle, S.L. High spectral quality pansharpening approach based on MTF-matched filter banks. Multidimens. Syst. Signal Process. 2016, 27, 831–861. [Google Scholar] [CrossRef]
  32. Jing, L.; Cheng, Q. Two improvement schemes of pan modulation fusion methods for spectral distortion minimization. Int. J. Remote Sens. 2009, 30, 2119–2131. [Google Scholar] [CrossRef]
  33. Laben, C.A.; Brower, B.V. Process for Enhancing the Spatial Resolution of Multispectral Imagery Using Pan-Sharpening. U.S. Patent 6,011,875, 4 January 2000. [Google Scholar]
  34. Aiazzi, B.; Alparone, L.; Baronti, S.; Garzelli, A. MTF-tailored multiscale fusion of high-resolution MS and Pan imagery. Photogramm. Eng. Remote Sens. 2006, 72, 591–596. [Google Scholar] [CrossRef]
  35. Ranchin, T.; Wald, L. Fusion of high spatial and spectral resolution images: The ARSIS concept and its implementation. Photogramm. Eng. Remote Sens. 2000, 66, 49–61. [Google Scholar]
  36. Wald, L. Quality of high resolution synthesised images: Is there a simple criterion? In Proceedings of the 3rd Conference on Fusion of Earth Data: Merging Point Measurements, Raster Maps and Remotely Sensed Images, Sophia Antipolis, France, 26–28 January 2000; SEE/URISCA: Nice, France, 2000; pp. 99–103. [Google Scholar]
  37. Yuhas, R.; Goetz, A.; Boardman, J. Discrimination among semi-arid landscape endmembers using the spectral angle mapper (SAM) algorithm. In Summaries of the Third Annual JPL Airborne Geoscience Workshop; Jet Propulsion Laboratory: Pasadena, CA, USA, 1992; pp. 147–149. [Google Scholar]
  38. Wang, Z.; Bovik, A.C. A universal image quality index. IEEE Signal Process. Lett. 2002, 9, 81–84. [Google Scholar] [CrossRef]
  39. Alparone, L.; Baronti, S.; Garzelli, A.; Nencini, F. A global quality measurement of pan-sharpened multispectral imagery. IEEE Geosci. Remote Sens. Lett. 2004, 1, 313–317. [Google Scholar] [CrossRef]
  40. Garzelli, A.; Nencini, F. Hypercomplex quality assessment of multi/hyperspectral images. IEEE Geosci. Remote Sens. Lett. 2009, 6, 662–665. [Google Scholar] [CrossRef]
  41. Otazu, X.; Gonzalez-Audicana, M.; Fors, O.; Nunez, J. Introduction of sensor spectral response into image fusion methods: Application to wavelet-based methods. IEEE Trans. Geosci. Remote Sens. 2005, 43, 2376–2385. [Google Scholar] [CrossRef] [Green Version]
  42. Alparone, L.; Alazzi, B.; Baronti, S.; Garzelli, A.; Nencini, F.; Selva, M. Multispectral and panchromatic data fusion assessment without reference. Photogramm. Eng. Remote Sens. 2008, 74, 193–200. [Google Scholar] [CrossRef]
  43. Li, H.; Jing, L. Improvement of a pansharpening method taking into account haze. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2017, 10, 5039–5055. [Google Scholar] [CrossRef]
  44. Jing, L.; Cheng, Q. Spectral change directions of multispectral subpixels in image fusion. Int. J. Remote Sens. 2011, 32, 1695–1711. [Google Scholar] [CrossRef]
Figure 1. The MS images of the WV2 (a); IK (b); and QB (c) datasets.
Figure 2. The original and pansharpened images of a 400 × 400 subset from the original WV2 dataset. (a) 0.5-m PAN; and (b) the up-sampled version of 2-m MS; and fused images generated by the (c) RMI (k = 0); (d) RMI (k = 2); (e) RMI (k = 4); (f) RMI; (g) GSA; (h) GLP; and (i) GLP-H methods.
Figure 3. The original and pansharpened images of a 400 × 400 subset from the original IK dataset. (a) 1-m PAN; and (b) the up-sampled version of 4-m MS; and fused images generated by the (c) RMI (k = 0); (d) RMI (k = 2); (e) RMI (k = 4); (f) RMI; (g) GSA; (h) GLP; and (i) GLP-H methods.
Figure 4. The original and pansharpened images of a 480 × 480 subset from the original QB dataset. (a) 0.7-m PAN; and (b) the up-sampled version of 2.8-m MS; and fused images generated by the (c) RMI (k = 0); (d) RMI (k = 2); (e) RMI (k = 4); (f) RMI; (g) GSA; (h) GLP; and (i) GLP-H methods.
Figure 5. Variations of the ERGAS (a); Q4 (b); SAM (c); and SCC (d) indices of pansharpened images produced from the degraded IK dataset with different misalignments between PAN and MS bands.
Figure 6. The original and fused images for a 300 × 300 subset of the original IK dataset with a misalignment of (2, 1). (a) 1-m PAN; and (b) the up-sampled version of 4-m MS; and fused images generated by the (c) RMI (k = 0); (d) RMI (k = 2); (e) RMI (k = 4); (f) RMI; (g) GSA; (h) GLP; and (i) GLP-H methods.
Table 1. Quality indexes for fused products at the two scales. Numbers in bold indicate the best performance for each quality index and dataset. RASE, ERGAS, SAM, Q2n, SCC, and SAMd are computed at the degraded scale; Dλ, DS, and QNR at the original scale.

| Image | Method | RASE | ERGAS | SAM | Q2n | SCC | SAMd | Dλ | DS | QNR |
|---|---|---|---|---|---|---|---|---|---|---|
| WV2 | RMI (k = 0) | 6.67 | 1.740 | 2.28 | 0.9360 | 0.845 | 1.127 | 0.0600 | 0.067 | 0.877 |
| | RMI (k = 1) | 6.72 | 1.75 | 2.28 | 0.935 | 0.843 | 1.127 | 0.061 | 0.068 | 0.876 |
| | RMI (k = 2) | 6.80 | 1.77 | 2.29 | 0.934 | 0.840 | 1.127 | 0.062 | 0.068 | 0.875 |
| | RMI (k = 3) | 6.92 | 1.80 | 2.30 | 0.933 | 0.836 | 1.126 | 0.062 | 0.068 | 0.874 |
| | RMI (k = 4) | 7.05 | 1.84 | 2.31 | 0.931 | 0.830 | 1.126 | 0.063 | 0.068 | 0.873 |
| | RMI | 6.68 | 1.745 | 2.28 | 0.9357 | 0.844 | 1.148 | 0.0600 | 0.068 | 0.876 |
| | GSA | 7.60 | 1.98 | 2.69 | 0.913 | 0.824 | 2.204 | 0.074 | 0.089 | 0.844 |
| | GLP | 8.05 | 2.06 | 2.94 | 0.845 | 0.807 | 1.263 | 0.113 | 0.110 | 0.789 |
| | GLP-H | 7.48 | 1.95 | 2.36 | 0.932 | 0.843 | 1.094 | 0.066 | 0.066 | 0.872 |
| | EXP | 12.61 | 3.26 | 2.94 | 0.791 | 0.441 | 1.263 | 0.000 | 0.068 | 0.932 |
| IK | RMI (k = 0) | 4.63 | 1.214 | 1.677 | 0.9192 | 0.8734 | 0.577 | 0.0556 | 0.0919 | 0.8576 |
| | RMI (k = 1) | 4.65 | 1.22 | 1.67 | 0.919 | 0.872 | 0.577 | 0.057 | 0.093 | 0.855 |
| | RMI (k = 2) | 4.70 | 1.23 | 1.67 | 0.918 | 0.868 | 0.577 | 0.058 | 0.094 | 0.853 |
| | RMI (k = 3) | 4.79 | 1.25 | 1.68 | 0.916 | 0.863 | 0.577 | 0.060 | 0.095 | 0.850 |
| | RMI (k = 4) | 4.90 | 1.28 | 1.68 | 0.914 | 0.857 | 0.577 | 0.062 | 0.096 | 0.849 |
| | RMI | 4.64 | 1.214 | 1.677 | 0.9193 | 0.8734 | 0.712 | 0.0556 | 0.0919 | 0.8576 |
| | GSA | 6.38 | 1.66 | 1.93 | 0.880 | 0.846 | 2.239 | 0.097 | 0.143 | 0.774 |
| | GLP | 6.94 | 1.74 | 2.43 | 0.830 | 0.795 | 0.563 | 0.167 | 0.169 | 0.692 |
| | GLP-H | 5.29 | 1.38 | 1.72 | 0.912 | 0.868 | 0.574 | 0.068 | 0.091 | 0.847 |
| | EXP | 9.63 | 2.52 | 2.42 | 0.661 | 0.453 | 0.567 | 0.000 | 0.099 | 0.901 |
| QB | RMI (k = 0) | 5.65 | 1.398 | 1.877 | 0.892 | 0.837 | 0.717 | 0.086 | 0.115 | 0.809 |
| | RMI (k = 1) | 5.81 | 1.43 | 1.90 | 0.889 | 0.833 | 0.717 | 0.087 | 0.116 | 0.807 |
| | RMI (k = 2) | 6.02 | 1.48 | 1.93 | 0.886 | 0.828 | 0.717 | 0.088 | 0.117 | 0.805 |
| | RMI (k = 3) | 6.26 | 1.53 | 1.97 | 0.881 | 0.822 | 0.717 | 0.090 | 0.117 | 0.803 |
| | RMI (k = 4) | 6.54 | 1.60 | 2.00 | 0.877 | 0.814 | 0.716 | 0.091 | 0.118 | 0.802 |
| | RMI | 5.66 | 1.401 | 1.879 | 0.890 | 0.836 | 0.945 | 0.087 | 0.114 | 0.809 |
| | GSA | 7.22 | 1.73 | 2.53 | 0.877 | 0.797 | 2.871 | 0.045 | 0.086 | 0.872 |
| | GLP | 8.42 | 2.08 | 2.88 | 0.701 | 0.715 | 0.737 | 0.172 | 0.209 | 0.655 |
| | GLP-H | 6.34 | 1.54 | 1.99 | 0.888 | 0.832 | 0.725 | 0.079 | 0.102 | 0.827 |
| | EXP | 9.97 | 2.37 | 2.97 | 0.743 | 0.487 | 0.717 | 0.000 | 0.088 | 0.912 |
Table 2. Quality indexes for fused products of the improved RMI using different values of k (degraded QB dataset).

| Method | RASE | ERGAS | SAM | Q2n | SCC | SAMd |
|---|---|---|---|---|---|---|
| RMI (k = 0) | 5.651 | 1.398 | 1.877 | 0.891 | 0.837 | 0.7138 |
| RMI (k = 1) | 5.812 | 1.434 | 1.900 | 0.889 | 0.833 | 0.7138 |
| RMI (k = 2) | 6.017 | 1.480 | 1.930 | 0.885 | 0.828 | 0.7138 |
| RMI (k = 3) | 6.261 | 1.534 | 1.965 | 0.881 | 0.822 | 0.7138 |
| RMI (k = 4) | 6.541 | 1.597 | 2.005 | 0.877 | 0.814 | 0.7138 |
| RMI (k = 5) | 6.851 | 1.667 | 2.047 | 0.871 | 0.805 | 0.7137 |
| RMI (k = 6) | 7.187 | 1.743 | 2.093 | 0.865 | 0.795 | 0.7137 |
| RMI (k = 7) | 7.547 | 1.824 | 2.140 | 0.859 | 0.785 | 0.7137 |
| RMI (k = 8) | 7.926 | 1.910 | 2.190 | 0.852 | 0.774 | 0.7137 |
| RMI (k = 9) | 8.321 | 2.001 | 2.241 | 0.845 | 0.764 | 0.7136 |
| RMI (k = 10) | 8.731 | 2.094 | 2.294 | 0.837 | 0.753 | 0.7136 |
Table 3. Quality indexes for fused products of the proposed method using different values of S (degraded QB dataset).

| S | RASE | ERGAS | SAM | Q2n | SCC |
|---|---|---|---|---|---|
| 0.1 | 5.652 | 1.399 | 1.877 | 0.891 | 0.837 |
| 0.2 | 5.651 | 1.398 | 1.877 | 0.892 | 0.837 |
| 0.3 | 5.651 | 1.398 | 1.877 | 0.891 | 0.837 |
| 0.4 | 5.652 | 1.399 | 1.878 | 0.891 | 0.837 |
| 0.5 | 5.654 | 1.399 | 1.879 | 0.891 | 0.836 |
| 0.6 | 5.658 | 1.400 | 1.883 | 0.890 | 0.836 |
| 0.7 | 5.665 | 1.402 | 1.888 | 0.889 | 0.836 |
| 0.8 | 5.676 | 1.404 | 1.895 | 0.887 | 0.836 |
| 0.9 | 5.693 | 1.409 | 1.905 | 0.885 | 0.835 |
| 1 | 5.718 | 1.415 | 1.917 | 0.881 | 0.834 |
