Article

Fusion of High Resolution Multispectral Imagery in Vulnerable Coastal and Land Ecosystems

by Edurne Ibarrola-Ulzurrun 1,2,*, Consuelo Gonzalo-Martin 2, Javier Marcello-Ruiz 1, Angel Garcia-Pedrero 2 and Dionisio Rodriguez-Esparragon 1

1 Instituto de Oceanografía y Cambio Global, IOCAG, Universidad Las Palmas de Gran Canaria (ULPGC), Parque Científico Tecnológico Marino de Taliarte, 35214 Telde, Spain
2 Centro de Tecnología Biomédica, Universidad Politécnica de Madrid (UPM), Campus de Montegancedo, 28223 Pozuelo de Alarcón, Spain
* Author to whom correspondence should be addressed.
Sensors 2017, 17(2), 228; https://doi.org/10.3390/s17020228
Submission received: 4 October 2016 / Revised: 14 December 2016 / Accepted: 17 January 2017 / Published: 25 January 2017
(This article belongs to the Section Remote Sensors)

Abstract
Ecosystems provide a wide variety of useful resources that enhance human welfare, but these resources are declining due to climate change and anthropogenic pressure. In this work, three vulnerable ecosystems, including shrublands, coastal areas with dune systems and areas of shallow water, are studied. Given this decline, remote sensing and image processing techniques could contribute to the management of these natural resources in a practical and cost-effective way, although some improvements are needed to obtain higher quality information. An important quality improvement is fusion at the pixel level. Hence, the objective of this work is to assess which pansharpening technique provides the best fused image for the different types of ecosystems. After a preliminary evaluation of twelve classic and novel fusion algorithms, a total of four pansharpening algorithms were analyzed using six quality indices. The quality assessment was implemented not only for the whole set of multispectral bands, but also for the subsets of spectral bands covered by the wavelength range of the panchromatic image and outside of it. A better quality result is observed in the fused image using only the bands covered by the panchromatic band range. It is important to highlight the use of these techniques not only in land and urban areas, but also their novel application to shallow water ecosystems. Although the algorithms do not show large differences in land and coastal areas, coastal ecosystems require simpler algorithms, such as fast intensity hue saturation, whereas more heterogeneous ecosystems need advanced algorithms, such as the weighted wavelet ‘à trous’ through fractal dimension maps for shrublands and mixed ecosystems. Moreover, a quality map analysis was carried out in order to study the fusion result in each band at the local level.
Finally, to demonstrate the performance of these pansharpening techniques, an advanced Object-Based Image Analysis (OBIA) support vector machine classification was applied, and a thematic map for the shrubland ecosystem was obtained, corroborating the weighted wavelet ‘à trous’ through fractal dimension maps as the best fusion algorithm for this ecosystem.

1. Introduction

Natural ecosystems provide a wide variety of useful resources that enhance human welfare. However, in recent decades, there has been a decline in these ecosystem services, as well as in their biodiversity [1]. In particular, coastal ecosystems are the most complex, dynamic and productive systems in the world [2]. This creates a demand to preserve environmental resources; hence, it is of great importance to develop reliable methodologies applied to new high resolution satellite imagery. Thus, the analysis, conservation and management of these environments could be studied in a continuous, reliable and economic way, and at suitable spatial, spectral and temporal resolutions. However, some processing tasks need to be improved: for instance, the weaknesses in the classification and analysis of land and coastal ecosystems on the basis of remote sensing data, the lack of reliability of the resulting maps and, particularly, the extreme difficulty of monitoring coastal ecosystems from remote sensing imagery due to the low reflectivity of areas covered by water.
The framework in which the study has been developed is the analysis of both coastal and land ecosystems through very high resolution (VHR) remote sensing imagery in order to obtain high quality products that will allow the comprehensive analysis of natural resources. In this context, remote sensing imagery offers practical and cost-effective means for good environmental management, especially when large areas have to be monitored [3] or periodic information is needed. Most VHR optical sensors provide a multispectral image (MS) and a panchromatic image (PAN), which require a number of corrections and enhancements. Image fusion or pansharpening algorithms are important for improving the spatial quality of the information available. The pansharpening data fusion technique can be defined as the process of merging MS and PAN images to create new multispectral images with a high spatial resolution [4,5]. This fusion stage is important in the analysis of such vulnerable ecosystems, which are mainly characterized by heterogeneous and mixed vegetation, with small shrubs in the case of terrestrial ecosystems and the complexity of seagrass meadows or algae distribution in shallow water ecosystems.
Image fusion techniques combine sensor data from different sources with the aim of providing more detailed and reliable information. The extensive research into image fusion techniques in remote sensing started in the 1980s [6,7]. Generally, image fusion can be categorized into three levels: pixel level, feature level and knowledge or decision level [8,9], and pansharpening is performed at the pixel level.
Many pansharpening techniques have appeared in recent decades, as a consequence of the launch of very high resolution sensors [10,11,12,13,14]. An ideal pansharpening algorithm should have two main attributes: (i) enhancing high spatial resolution; and (ii) reducing spectral distortion [15]. The simplest pansharpening methods, at the conceptual and computational level, are intensity-hue-saturation (IHS), principal component analysis (PCA) and Brovey transforms (BT). However, these techniques have problems because the radiometry on the spectral channels is distorted after fusion. New approaches, such as wavelet transformations and high pass filtering (HPF) [4,8,16,17,18], have been proposed to address particular problems with the traditional techniques.
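To make the component-substitution idea behind these classic methods concrete, the following is a minimal sketch of the Brovey transform, assuming the MS cube has already been resampled to the PAN grid and is held as a NumPy array; the function name and array layout are illustrative, not taken from the paper:

```python
import numpy as np

def brovey_fuse(ms, pan):
    """Brovey transform pansharpening (sketch).

    ms  : (bands, H, W) multispectral cube resampled to PAN resolution
    pan : (H, W) panchromatic band

    Each band is modulated by the ratio of PAN to a simple intensity
    component (the band average), which injects PAN spatial detail but
    is known to distort radiometry, as noted in the text.
    """
    intensity = ms.mean(axis=0)            # simple intensity component
    ratio = pan / (intensity + 1e-12)      # guard against division by zero
    return ms * ratio[np.newaxis, :, :]    # modulate every band by the ratio
```

When PAN equals the intensity component the ratio is 1 everywhere and the MS image is returned unchanged, which shows that all spatial enhancement comes from the PAN/intensity mismatch.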
On the other hand, quality evaluation is a fundamental issue to benchmark and optimize different pansharpening algorithms [18,19], as there is the necessity to assess the spectral and spatial quality of the fused images. There are two types of evaluation approaches commonly used: (1) qualitative analysis (visual assessment); and (2) quantitative analysis (quality indices). Visual analysis is a powerful tool for capturing the geometrical aspect [20] and the main color disturbances. According to [10], some visual parameters are necessary for testing the properties of the image, such as the spectral preservation of features, multispectral synthesis in fused images and the synthesis of images close to actual images at high resolution. On the other hand, quality indices measure the spectral and the spatial distortion produced by the pansharpening process by comparing each fused image to the reference MS or PAN image. The work in [21] categorized them into three main groups: (i) spectral quality indices such as the spectral angle mapper (SAM) [22] and the spectral relative dimensionless global error, in French ‘erreur relative globale adimensionnelle de synthèse’ (ERGAS) [23]; (ii) spatial quality indices, i.e., the spatial ERGAS [24], the frequency comparison index (FC) [25] and the Zhou index [26]; and (iii) global quality indices, such as the 8-band image quality index (Q8) [27]. On the other hand, there are several evaluation techniques with no reference requirement, such as the quality with no reference (QNR) approach [28].
In this study, the main goal is to assess which pansharpening technique, using WorldView-2 VHR imagery with eight MS bands, provides the best fused image. Future research will be focused on the classification of the vulnerable ecosystems, in order to obtain specific products for the management of coastal and land resources, in contrast to several studies assessing and reviewing pansharpening techniques [11,16,20,29,30,31,32,33,34]. Further specific goals in this paper are: (i) the study of the spatial and spectral quality of pansharpened bands covered by and outside the PAN wavelength range; (ii) the analysis of the pansharpening algorithms’ behavior in vulnerable natural ecosystems, unlike the majority of previous studies, which apply pansharpening techniques in urban areas or on other land cover types; and (iii) a novel assessment of VHR image fusion in shallow coastal waters, while being aware of the difficulty of pansharpening these ecosystems. Although other authors apply fusion in water areas, such as [35,36], they use Landsat imagery, not VHR imagery. The aim of the last point is to identify which techniques are more suitable for shallow water areas and the improvement achieved. This information can lead to more accurate seafloor segmentation or mapping of coastal zones [37]; hence, studies on the state of conservation of natural resources could be conducted.
Finally, in order to strengthen the study, a final thematic map of the shrubland area was produced. Thus, the influence of the fusion on the classification results, which serve to obtain accurate information for the conservation of natural resources, is analyzed. This study can be found in more detail in [38].
The paper is structured as follows: Section 2 includes the description of the study area, datasets, the image fusion methods used in the analysis and the evaluation methodology. The visual and quantitative evaluation of the different algorithms, as well as the quality map analysis, are presented in Section 3. Section 4 includes the critical analysis of the results. Finally, Section 5 summarizes the main outcomes and contributions.

2. Materials and Methods

2.1. Study Area

This study focuses on three types of vulnerable ecosystems found in different islands of the Macaronesia region. Macaronesia is considered a geological and biodiversity hotspot due to its volcanic origin and the high degree of vulnerability to which insular ecosystems are subjected, mainly as a consequence of climate change and anthropogenic pressure. The ecosystems selected from the Canary Islands were: the shrubland ecosystem, the coastal ecosystem and, finally, a mixed ecosystem surrounded by a touristic area and a lagoon, as a transitional system between the coastal and land ecosystems. The Canary Islands are one of the most remarkable biodiversity areas on the planet [39], and they were chosen as a representative sample of these ecosystems because of the availability of VHR remote sensing imagery and simultaneous field data. Figure 1 displays the three protected areas considered in the analysis.
As regards shrubland ecosystems, it is important to highlight the large concentration of vascular plants, which are highly vulnerable to environmental changes. In coastal areas, intense natural fragility also appears due to the interaction of a great variety of environmental factors. Moreover, the coastal occupation of urban areas and the development of tourism increase this frailty and vulnerability. In particular, dune systems are affected by this urban-touristic expansion [40,41,42,43]. Furthermore, many coastal ecosystems contain seagrass meadows [44]. The importance of these particular meadows is related to ocean productivity, as they are one of the most valuable ecosystems in the world. In addition, these meadows are part of the solution to climate change, not only producing oxygen, but storing up to twice as much carbon per unit area as the world’s temperate and tropical forests [45].
As indicated, three sensitive and heterogeneous protected areas of the Canary Islands have been selected (Figure 1) as representative examples of more general ecosystems: the Teide National Park (Tenerife Island), as an example of shrubland ecosystems, the Corralejo Natural Park and Islote de Lobos (Fuerteventura), representing coastal ecosystems, and the Maspalomas Natural Reserve (Gran Canaria Island), an important coastal-dune ecosystem with significant tourism pressure, called the mixed ecosystem in this paper, not only because it is the transitional region between the coastal and land ecosystems, but also due to the inner water ecosystem, known as the Maspalomas lagoon.
Other similar shrubland ecosystems can be identified around the world, for example: Pico do Pico in the Azores, Mt. Halla in South Korea, Hawaii and the Galapagos. Corralejo and Maspalomas are important coastal sand dune systems in Europe, as they contain a large degree of biodiversity [46]. The importance of studying these ecosystems lies in their similarity to other Mediterranean, temperate or tropical regions: Tuscany (Italy) and Doñana National Park (Spain), as well as other parts of the world, such as Parangtritis (Java, Indonesia), Malaysia, the Philippines, Vietnam, NE Queensland, the tropical coast of Brazil, the West African coast, Cuba, the Galapagos Islands, the West Indies, Cox’s Bazaar (Bangladesh), Hawaii, Ghana, the coast of India and Christmas Island [47].

2.2. Datasets

The WorldView-2 (WV-2) satellite, launched by Digital Globe on 8 October 2009, was the first commercial satellite to provide a VHR sensor with one PAN and eight MS bands (Table 1). WV-2 ortho-ready imagery of the three representative ecosystems was used in the study. Images of the Canary Islands and the central locations of the corresponding three study areas are detailed in Table 2. In order to reduce the computation times in the multiple analyses, 512 × 512 MS pixel scenes were used. Figure 2 displays the PAN band and the corresponding resized MS image (RGB composite). The scenes were selected for their spectral and spatial richness and their content of land and coastal areas. In addition, the scenes include different coverages, with coastal areas, shallow waters, vegetation and urban regions predominating, and they contain features with different shapes and edges.

2.3. Image Fusion Methodology

The flowchart in Figure 3 presents every step of the methodology used to assess which algorithm provides the best fused image for each area.
First, the four pansharpening techniques were applied to the three different vulnerable ecosystems; afterwards, a visual and quantitative assessment was undertaken in order to evaluate the pansharpening results in the different fused images. The quality assessment was carried out for the whole set of MS bands, as well as for those MS bands covering the range of the PAN channel (Bands 2–6) and those outside this range (Bands 1, 7 and 8). Furthermore, an individual band quality map analysis was carried out on the best fused images according to each ecosystem type. Finally, a classification map was obtained from the different fused images, to analyze the influence of the pansharpening on the classification result.

2.3.1. Image Fusion Methods

After a review of the state of the art in pixel-level pansharpening techniques, a preliminary assessment was carried out, selecting classic and new algorithms that could achieve good performance with WV-2 imagery, including some algorithms specifically adapted to the WorldView-2 sensor. An initial visual and quantitative assessment was carried out using a total of 12 different pansharpening techniques on these data, but only the best algorithms were selected for this study. The final selection of these four algorithms was carried out with the same methodology explained in this paper: a visual and quantitative evaluation was performed to assess the spectral and the spatial quality of each algorithm, taking into account the compromise between both qualities in each fused image, depending on each ecosystem type. After obtaining an objective ranking of the 12 algorithms using the Borda count method, explained in Section 2.3.2, the best four pansharpening algorithms were selected, in order to obtain the most suitable fused image for each ecosystem. Next, a brief overview of each family of algorithms selected in the study is presented. Formulas and block diagrams are omitted (for detailed information, see the references).
  • Fast intensity hue saturation (FIHS) [48]: It uses the spectral bands to estimate the new component I. The spatial detail is extracted, computing the difference between the PAN band and the new component I. The spatial detail is injected into any number of bands.
  • Hyperspherical color sharpening (HCS): This pansharpening algorithm was designed specifically for WV-2 by [49] and is based on the transformation between any native color space and the hyperspherical color space. Once transformed into HCS, the intensity can be scaled without changing the color, which is essential for the HCS pansharpening algorithm [15,50].
  • Based on modulation transfer function: The modulation transfer function (MTF) is a function of the sensor spatial frequency response, describing the resolution of an imaging system [28]. Generalized Laplacian pyramid (GLP) is an extension of the Laplacian pyramid where a scale factor different from two is used [10]. Finally, in high pass modulation (HPM), the PAN image is multiplied by each band of the original MS image and normalized by a low pass filtered version of the PAN image in order to estimate the enhanced MS image bands.
  • Weighted wavelet ‘à trous’ method through fractal dimension maps (WAT⊗FRAC) [14]: This method is based on the wavelet ‘à trous’ algorithm. A mechanism that controls the trade-off between the spatial and spectral quality by introducing a weighting factor (αi) for the PAN wavelet coefficients is established. However, this factor only discriminates between different spectral bands, but not between different land covers; therefore, the authors proposed a new approach [51], defining a different weight factor αi (x, y) for each point of each band. αi (x, y) was defined as a fractal dimension map (FDM) with the same size as the original image. A preliminary analysis was carried out using three different window sizes for the windowing process: 7, 15 and 27.
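The detail-injection idea behind the first of these algorithms, FIHS, can be sketched in a few lines. This is a simplified illustration assuming co-registered NumPy arrays at the same resolution, not the authors' implementation:

```python
import numpy as np

def fihs_fuse(ms, pan):
    """Fast IHS (FIHS) pansharpening (sketch).

    The intensity component I is estimated as the average of the
    spectral bands; the spatial detail (PAN - I) is then injected
    additively into every band, as described in the text.

    ms  : (bands, H, W) multispectral cube resampled to PAN resolution
    pan : (H, W) panchromatic band
    """
    intensity = ms.mean(axis=0)          # estimate of the component I
    detail = pan - intensity             # spatial detail extracted from PAN
    return ms + detail[np.newaxis]       # inject the same detail into each band
```

Because the same detail image is added to every band, the method extends trivially to any number of bands, which is what makes it attractive for eight-band WV-2 data despite its simplicity.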

2.3.2. Quality Evaluation Methodology

(a) Visual quality:
A visual analysis was the first step in the quality assessment. Through this approach, the main errors were observed at an overall scale, and then local artefacts were analyzed closely. For the visual spectral assessment, fused true color images were compared to the original MS image, used as the reference. Firstly, several aspects of the image features were taken into account in the spectral assessment, such as the tone, contrast, saturation, sharpness and texture. Furthermore, we paid attention to color disturbances. False color composites were produced in order to evaluate the fused NIR bands, where we focused on the same aspects as mentioned above. Finally, we concentrated on linear features, specific objects, surfaces or edges of buildings in order to observe spatial disturbances, using the PAN as a reference.
(b) Quality indices
There is no current consensus in the literature on the best quality index for pansharpening images [52]. The quantitative quality evaluation of fused images is a debated issue since no reference image exists at the pansharpened resolution [4,20]. A number of statistical evaluation indices were used to measure the quality of the fused images. Each fused image is compared to the reference MS image.
The spectral quality assessment measures the spectral distortion brought by the pansharpening process. The metrics considered in the analysis are as follows:
  • The spectral angle mapper (SAM) was designed to determine the spectral similarity in a multidimensional space [22] (Equation (1) in Table 3).
  • The spectral ERGAS (relative dimensionless global error) is an overall quality index sensitive to mean shifting and dynamic range change (Table 3, Equation (2)). The RMSE_i (root mean square error) is calculated by its standard definition [23].
The correlation coefficient was not selected as a spectral index due to its low discrimination capability when techniques yield small quality differences.
On the other hand, the spatial detail information of each fused band is compared with the spatial information of the reference PAN image. The metrics considered in the analysis are as follows:
  • The spatial ERGAS was proposed by [24]. It is a new spatial index considering the PAN band as a reference (Table 3, Equation (3)).
  • The frequency comparison index (FC) was proposed by [25]. It is based on the discrete cosine transform (DCT) for the spatial assessment (Table 3, Equation (4)).
  • The Zhou index (Table 3, Equation (5)) measures the spatial quality by computing the correlation between the high pass filtered fused image (FUS_i^highpass) and the high pass filtered PAN image (PAN^highpass) for each band.
Finally, as the global quality assessment:
  • The Q8 index is a generalization to eight-band images of the Q index [27]. It is based on the different statistical properties of the fused and MS images (Table 3, Equation (6)).
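Two of the indices above, SAM and the spectral ERGAS, can be sketched directly from their definitions for cubes shaped (bands, H, W). This is a simplified illustration under assumptions of ours: SAM is returned as a mean angle in radians, and the resolution ratio h/l is passed explicitly (e.g. 0.25 for 0.5 m PAN over 2 m MS):

```python
import numpy as np

def sam(ref, fus):
    """Mean spectral angle between reference and fused cubes (radians).

    At each pixel, the angle between the two spectral vectors is
    computed from their normalized dot product; the angles are then
    averaged over the image.
    """
    dot = (ref * fus).sum(axis=0)
    norm = np.linalg.norm(ref, axis=0) * np.linalg.norm(fus, axis=0)
    angles = np.arccos(np.clip(dot / (norm + 1e-12), -1.0, 1.0))
    return angles.mean()

def ergas(ref, fus, ratio=0.25):
    """Spectral ERGAS: 100 * (h/l) * sqrt(mean_b (RMSE_b / mean_b)^2)."""
    bands = ref.shape[0]
    acc = 0.0
    for b in range(bands):
        rmse = np.sqrt(((ref[b] - fus[b]) ** 2).mean())  # standard RMSE
        acc += (rmse / ref[b].mean()) ** 2
    return 100.0 * ratio * np.sqrt(acc / bands)
```

Both indices are zero (ideal) when the fused cube equals the reference, and grow with spectral distortion, which is why lower values indicate better spectral preservation in Tables 4–6.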
In order to identify, in an objective way, the best fused image for each ecosystem, the best algorithms at the spectral, spatial and global levels for each scene have been established by the Borda count rank aggregation method (Equation (7)) [53]. Let U be a set of items i, called the universe, and R a set of rank lists, where τ denotes one rank list. For each item i ∈ U and each rank list τ ∈ R, with Borda normalized weight ω_τ(i), the fused rank list τ̂ is ordered with respect to the Borda score s_τ̂, where the Borda score of an item i ∈ U in τ̂ is defined as:

s_τ̂(i) = Σ_{τ ∈ R} ω_τ(i)
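A simplified, unnormalized variant of this aggregation (integer points per position rather than the normalized weights ω_τ(i)) can be sketched as follows; the function name is ours:

```python
def borda_fuse(rank_lists):
    """Aggregate several ranked lists with a Borda count (sketch).

    rank_lists : list of lists, each ordering the same items from best
    to worst. Each list awards n-1 points to its first item, n-2 to the
    second, and so on; items are returned sorted by total score, which
    is how a balanced spectral/spatial/global ranking can be obtained.
    """
    scores = {}
    for ranking in rank_lists:
        n = len(ranking)
        for pos, item in enumerate(ranking):
            scores[item] = scores.get(item, 0) + (n - 1 - pos)
    return sorted(scores, key=scores.get, reverse=True)
```

For example, fusing the rankings [A, B, C], [A, C, B] and [B, A, C] gives A a score of 5, B a score of 3 and C a score of 1, so A wins overall even though one list preferred B.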

2.4. Classification Maps

A supervised classification technique was applied only to the shrubland ecosystem scene in order to analyze the influence of the different pansharpening techniques on the generation of thematic maps [38]. The first step was to determine the classes appearing in the image and obtain a sufficient number of training samples. The classes chosen for this ecosystem were selected by experts of the Teide National Park, the vegetation classes being: Spartocytisus supranubius (Teide broom), Pterocephalus lasiospermus (rosalillo de cumbre), Descurainia bourgaeana (hierba pajonera) and Pinus canariensis (Canarian pine). Urban, road and bare soil classes were also included. In the second step, the OBIA process starts with a segmentation of the input images into local groups of pixels, i.e., objects, that become spatial units in the later analysis, classification and accuracy assessment. Object shape, size and spectral properties depend on both the segmentation approach and the research goals. The image was segmented using multiresolution segmentation followed by spectral difference segmentation, in order to preserve the small objects of interest to classify. Once the objects are obtained from the segmentation techniques, classification algorithms can be applied. The last step was to determine the classification algorithm; in our case, we applied the object-based (OBIA) classification approach [54], using a support vector machine (SVM) as the supervised classifier [55]. SVM is a supervised learning method that analyzes data and recognizes patterns, used for classification and regression analysis. The standard SVM takes a set of input data and predicts, for each given input, which of the possible classes the input is a member of. Given a set of training examples, each marked as belonging to one of the categories, an SVM training algorithm builds a model that assigns new examples to one category [56].
Thematic maps were obtained after implementing the SVM classifier in each fused image. Afterwards, the accuracy of the classification must be measured; in this case, testing samples were collected. The statistical accuracy assessment technique used was the overall accuracy and the kappa coefficient.
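Both accuracy statistics can be computed directly from the confusion matrix of reference versus predicted labels. A minimal sketch, with a function name of ours:

```python
import numpy as np

def accuracy_and_kappa(confusion):
    """Overall accuracy and Cohen's kappa from a confusion matrix.

    confusion : square matrix with rows = reference classes and
    columns = predicted classes. Overall accuracy is the fraction of
    samples on the diagonal; kappa corrects it for the agreement
    expected by chance from the row and column totals.
    """
    confusion = np.asarray(confusion, dtype=float)
    total = confusion.sum()
    observed = np.trace(confusion) / total                     # overall accuracy
    expected = (confusion.sum(0) * confusion.sum(1)).sum() / total ** 2
    kappa = (observed - expected) / (1.0 - expected)
    return observed, kappa
```

For instance, the confusion matrix [[4, 1], [1, 4]] gives an overall accuracy of 0.8 but a kappa of only 0.6, since half of the agreement would already be expected by chance for two balanced classes.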

3. Results

This section is divided into three main blocks: (i) the visual assessment; (ii) the quantitative evaluation based on the quality indexes; and (iii) the thematic maps resulting from the OBIA classification in the shrubland ecosystem.

3.1. Visual Evaluation

For the visual analysis, both color and edge preservation are the most important criteria to evaluate the performance of image fusion techniques in order to identify the fusion technique that provides the fused image with the highest spectral and spatial quality. To facilitate the visual inspection and for a more detailed spatial analysis, a zoom of the previous scenes is shown in Figure 4, Figure 5 and Figure 6. It is important to highlight that, after the preliminary assessment, robust pansharpening algorithms have been selected, so all fusion results are satisfactory, and the spectral differences are difficult to appreciate visually, except in some specific areas. We want to underline that, to the best of our knowledge, this is the first time pansharpening algorithms have been assessed in coastal ecosystems using VHR imagery. This improvement in the spatial quality due to the application of fusion techniques could be useful to improve seafloor or benthic classification of shallow waters (i.e., coral reefs or seagrass meadows).
The visual interpretation at the spectral level in the shrubland region (Figure 4) indicates that every algorithm, except for WAT⊗FRAC, produces a slight color distortion over the entire fused image. On the other hand, in the preliminary analysis, WAT⊗FRAC with a window size of seven pixels provides the best fused image from among the WAT⊗FRAC algorithms, although the differences are minimal.
Observing the coastal region (Figure 5), differences among the techniques are basically undetectable at the spectral level. The WAT⊗FRAC window size, which gives a slightly better result in this image, is 27.
Visual results similar to those of the shrubland region appear in the mixed ecosystem (Figure 6), where the FIHS, HCS and MTF algorithms show a more perceptible color distortion in the sea than in buildings. The WAT⊗FRAC window size chosen for this scene is 15.
Spatially, the differences are clearer than spectrally. In the case of the HCS and MTF_GLP_HPM techniques, although they maintain the spectral information well, the spatial details are not satisfactorily injected, thus not achieving a good spatial enhancement. Furthermore, the blurred aspect found in the WAT⊗FRAC algorithm arises because the algorithm homogenizes areas that are homogeneous in the multispectral image but appear heterogeneous in the panchromatic image. Thus, the ‘salt and pepper’ effect is avoided, obtaining a better classification and more accurate thematic maps, as demonstrated in Section 3.3.
Finally, Figure 7 shows an example of the false color composite using bands outside the PAN range (Bands (B) 8, 7 and 1).
In this case, a region with soil, vegetation and water is chosen in order to analyze the behavior of the different techniques. Spectrally, algorithms show a more significant color distortion in the water area. At the spatial level, pansharpening algorithms demonstrate the same behavior as in the true color composite, with WAT⊗FRAC achieving an optimum result.

3.2. Quantitative Evaluation of Pansharpened Images

As mentioned in Section 2.3.2, in order to perform an objective evaluation of the pansharpening techniques, six spectral and spatial quality indices have been computed on the complete scenes of Figure 2. First, quantitative indices were calculated for all of the bands in the MS image in the three ecosystems. Second, the pansharpening performance for bands covered by and outside of the PAN range was assessed. Finally, an individual band quality map analysis was carried out (Figure 3).

3.2.1. Shrubland Ecosystems

The quality analysis of all multispectral bands is presented in Table 4. SAM, spectral ERGAS and Q8 confirm that the MTF method and the HCS algorithm provide better spectral performance, while FIHS and WAT⊗FRAC obtain the lowest results, even though there is not a large difference between the highest and the lowest values. As regards the spatial performance, WAT⊗FRAC is confirmed as the best spatial quality method, while HCS shows the worst values. Furthermore, these results confirm the compromise between spectral and spatial quality: the best fused image considered in this study is the one that provides the best compromise between them. Hence, the Borda count method is used a posteriori to assess this compromise.
The Borda count method allows the balance between the spectral and spatial quality of the pansharpening algorithms to be analyzed, avoiding any bias in the global result (Table 4). The results obtained by the Borda count method including the respective spectral and spatial quality indices appear in the ‘spectral’ and ‘spatial’ columns, while the ‘global’ column shows the results of the Borda count method for every algorithm considering all of the quality indices.
Analyzing Table 4, WAT⊗FRAC generates the best fused image, not only in the overall evaluation, but also at the spatial level.
As regards the band analysis based on the PAN range, the behavior of the quality index is analyzed with respect to the MS fused bands. Thus, Figure 8 shows the quantitative values obtained for the pansharpened bands within the PAN spectral range (B 2–6) and outside of this range (B 1, 7 and 8). The results for the complete set of bands are included as a reference.
SAM and spectral ERGAS indices provide similar results, where fused bands outside of the PAN range have better quality than the bands within the PAN range, the results for the complete set of bands being an average of them. There is an exception with MTF_GLP_HPM, which provides similar results irrespective of the bands used for their estimation. The analysis of the spatial indices shows that, in general, bands inside the PAN range achieve better spatial quality.

3.2.2. Coastal Ecosystems

Table 5 includes the values of the quality indices for all bands obtained when applying the different fusion methods in the coastal ecosystem scenario. It is important to point out that most of the scene is covered by shallow water. The SAM index identifies the images obtained by MTF_GLP_HPM and FIHS as the best fused images; however, it does not show large variability in the results, and neither does Q8. In the case of spectral ERGAS, MTF_GLP_HPM yields the best fused image. Spatial indices also show some variability: spatial ERGAS and FC identify FIHS as the best method, while the WAT⊗FRAC algorithm obtains the best quality result in Zhou. All spatial indices agree that the lowest quality images are produced by the MTF_GLP_HPM and HCS methods.
According to the Borda count method, the best algorithm for getting a good fusion in coastal ecosystems is FIHS, which achieves a good balance between the spectral and the spatial quality.
With respect to the band analysis based on the PAN range (Figure 9), from the spectral point of view, SAM provides similar values between the fusion algorithms, and the spectral ERGAS shows better quality images for the MS bands covered by the PAN range, as expected, except in FIHS. Concerning the spatial indices, the Spatial ERGAS and FC metrics achieve a superior spatial quality for Bands 2–6, while Zhou shows very similar results. The Q8 index also gives higher values to the bands covered by the PAN.

3.2.3. Mixed Ecosystems

The quantitative evaluation in all bands for the mixed ecosystem is shown in Table 6. According to the spectral and spatial quality indices, the results are similar to those of shrubland ecosystems. The MTF_GLP_HPM technique provides the best result at the spectral level, while the WAT⊗FRAC algorithm does so at the spatial level.
Looking at the spectral quality indices for the mixed ecosystem scene (Figure 10), spectral ERGAS follows the same behavior, obtaining the worst fused image when the MS bands outside of the PAN range are used, whereas the SAM values are, in general, very similar across band subsets. With respect to spatial quality, the bands inside the PAN range obtain better values. Finally, Q8 gives worse quality results for the MS bands outside of the PAN range, as expected.
In this case, the WAT⊗FRAC algorithm also achieves the best score in the Borda count rank, making it the most reliable algorithm for this kind of ecosystem.

3.2.4. Individual Band Quality Map Analysis

Quality values for each individual band in each scene are included in Table 7. Since the overall Q8 index does not reveal variability at the local level, quality map analyses were carried out using the Q8 metric with a block size of 64 in order to examine the local quality of each band for the best algorithm in each area. Note that the maximum index value is achieved when no window is used, and that increasing the block size increases the index values; a block size of 64 was therefore chosen so that the values in the quality maps remain comparable to those obtained for the overall Q8. In each quality map, a blue scale is used, with white indicating a higher similarity (Q8 metric) between the original and the fused MS band.
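The block-wise quality mapping described above can be sketched as follows. This is a simplified, single-band stand-in: it evaluates the Wang–Bovik universal quality index on each 64 × 64 block, not the full multi-band Q8 of [27], and the function names are illustrative.

```python
import numpy as np

def q_index(x, y):
    """Wang-Bovik universal image quality index between two blocks
    (identical blocks score 1)."""
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    denom = (vx + vy) * (mx**2 + my**2)
    return 4.0 * cov * mx * my / denom if denom else 1.0

def quality_map(ms_band, fus_band, block=64):
    """Per-block quality map for one band of the original MS image versus
    the corresponding fused band (a single-band stand-in for the Q8 maps)."""
    rows, cols = ms_band.shape[0] // block, ms_band.shape[1] // block
    qmap = np.empty((rows, cols))
    for r in range(rows):
        for c in range(cols):
            win = np.s_[r*block:(r+1)*block, c*block:(c+1)*block]
            qmap[r, c] = q_index(ms_band[win].astype(float),
                                 fus_band[win].astype(float))
    return qmap
```

Rendering such a map on a blue scale, with values near one drawn white, reproduces the kind of local visualization shown in Figures 11–13.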
As indicated, the WAT⊗FRAC algorithm was selected as the best compromise for fusing images in the shrubland ecosystem scenario. The quality maps of WAT⊗FRAC (window size of seven) are presented in Figure 11. Better results are observed for Bands 3–8, with Band 4 achieving the best fusion. On the other hand, Bands 1 and 2 (coastal blue and blue) do not show good quality in some areas, with a greater concentration of dark blue pixels, in accordance with the lowest quality results of Table 7.
As regards the coastal ecosystem, in the individual band quality maps of FIHS (Table 7 and Figure 12), the highest quality is achieved in Bands 2 and 3, with values over 0.88. Band 1 (0.764) clearly shows a light blue appearance (medium quality) in the sea area, whereas these bands show dark blue pixels in the land area. From Bands 4–8, the quality increases in land areas, showing mostly dark blue pixels; however, the quality in sea areas decreases considerably: Band 8 gets a quality value of 0.239, while Bands 5–7 show values around 0.3–0.4. Although the FIHS algorithm provides the best fusion for this scenario, the quality maps are not very satisfactory at longer wavelengths; this portion of the spectrum is, however, of minimal interest in seafloor mapping applications due to the low capability of light to penetrate the water.
Concerning the quality maps of the WAT⊗FRAC algorithm in mixed ecosystems (Figure 13), they present a similar behavior to that of shrubland areas. Specifically, Bands 3 and 4 have the highest quality values (0.905 and 0.890, respectively, as presented in Table 7), while Bands 1 and 2 show lower quality (0.695 for Band 1 and 0.842 for Band 2). On the other hand, water areas have, in general, worse quality as the band number increases.

3.3. Thematic Maps of Shrubland Ecosystems

Table 8 shows the overall accuracy and the kappa coefficient for the SVM object-based classification applied to each fused image. The best result is obtained in the WAT⊗FRAC fused image, corroborating the results achieved in the Quantitative Evaluation section, where WAT⊗FRAC shows the best fusion result.
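The overall accuracy and kappa coefficient reported in Table 8 can be computed from the classified labels as follows. This is a generic sketch of the standard definitions from a confusion matrix, not the authors' evaluation code; the function name is illustrative.

```python
import numpy as np

def accuracy_and_kappa(y_true, y_pred, n_classes):
    """Overall accuracy and Cohen's kappa from two label sequences."""
    cm = np.zeros((n_classes, n_classes))
    for t, p in zip(y_true, y_pred):
        cm[t, p] += 1                           # confusion matrix: rows = truth
    n = cm.sum()
    po = np.trace(cm) / n                       # observed agreement (accuracy)
    pe = (cm.sum(0) * cm.sum(1)).sum() / n**2   # chance agreement
    return po, (po - pe) / (1 - pe)             # kappa corrects for chance
```

Kappa discounts the agreement expected by chance, which is why it is reported alongside the raw overall accuracy when comparing the classified fused images.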
Finally, Figure 14 presents the thematic maps of the shrubland ecosystem obtained from the original MS image and from the WAT⊗FRAC fused image by SVM classification. It can be observed that fusion is a fundamental preprocessing step in these complex ecosystems, because some classes of interest are not well delimited in the thematic map of the original multispectral image. In particular, buildings (red) are erroneously classified as bare soil (brown) in the original multispectral image, and road limits (grey) and vegetation contours appear stepped, due to the pixel size of the original image. The results were analyzed by the experts of the national park, who confirmed a good agreement with the real vegetation of the area.

4. Conclusions

As indicated, the main objective of this work was to select the pansharpening algorithm that provides the image with the best spatial and spectral quality for land and coastal ecosystems. For this reason, the most efficient pansharpening techniques developed in recent years have been applied and assessed, with the goal of achieving the highest spatial resolution of the MS bands while preserving the original spectral information. As no single pansharpening algorithm exhibited superior performance overall, the best techniques were evaluated in three different types of ecosystems, i.e., heterogeneous shrubland ecosystems, coastal systems and coastal-land transitional zones with inland water and an urban area. Fusion methods have frequently been applied to land and urban areas; here, however, a novel analysis has also been conducted covering their evaluation in areas of shallow water using VHR imagery, as they could be useful for mapping seabed species, such as seagrasses and coral reefs.
After a preliminary assessment of twelve pansharpening techniques, four algorithms were selected for the study. In the literature, four-band sensors (Ikonos, GeoEye, QuickBird, etc.) are mostly used for this kind of study; in contrast, we have sought the best fused image using an eight-band sensor (WorldView-2).
Both the visual evaluation and the quantitative analysis were performed using six quality indices at the overall, spectral and spatial levels. The best algorithms at the spectral and spatial levels were identified for each type of ecosystem. Finally, the fusion technique offering the best compromise between spectral and spatial quality was identified following the Borda count method. We thus provide guidance to users for choosing the algorithm most suitable for the type of ecosystem and the information to be preserved.
It is interesting to observe that, for land regions, the MTF algorithm performs better at preserving the spectral quality, while the weighted wavelet ‘à trous’ through fractal dimension maps algorithm demonstrates better results considering the spatial detail of the fused imagery. Balancing spectral and spatial quality, the most appropriate pansharpening algorithm for shrubland and mixed ecosystems is the WAT⊗FRAC technique, while FIHS is selected for the coastal ecosystems. Note that, to date, the WAT⊗FRAC algorithm had only been used in agricultural areas; we have applied this algorithm to natural vulnerable ecosystems, where a successful result has been obtained. Moreover, we can conclude that the more heterogeneous the area to be fused, the smaller the window size required by WAT⊗FRAC. FIHS provides the best overall fused image in the simplest scenario. Thus, even though there is no remarkable difference in the way the algorithms perform with respect to land and water areas, we have concluded that images with low variability, such as the coastal scenario, covered mostly by water, require simpler algorithms, whereas more complex and heterogeneous images (i.e., shrubland and mixed ecosystems) need more advanced algorithms in order to obtain good fused imagery.
Moreover, we have studied the behavior of each algorithm when applied to the complete set of MS bands, to the bands covered by the PAN range and to those outside of it. In general, Bands 2–6 have better spatial and spectral quality, but the quality of the remaining bands is acceptable. The analysis also reveals a difference in how the same algorithm works on land and coastal areas: the fusion tends to have higher quality on land and lower quality in bodies of water.
Additionally, a local study was carried out to identify the distortion introduced in each single band by the best fused algorithms chosen for each scenario. In general, Bands 3–8 attained higher quality for land areas, while in water areas, red and near-infrared bands (5, 7 and 8) experience high spectral distortion. However, these bands are not usually used in seabed mapping applications due to their low penetration capability in water.
Finally, it is important to recall the need to obtain the best possible fused image in the analyzed ecosystems, as they are heterogeneous regions: the shrubland ecosystems consist of sparse vegetation mainly made up of small, mixed shrubs with reduced leaf area, while the complex and dynamic coastal ecosystems exhibit low radiance absorption. In this context, thematic maps were obtained by applying the SVM classifier to the original MS image and to the WAT⊗FRAC fused image, corroborating the superior performance of the WAT⊗FRAC algorithm for generating accurate thematic maps in the shrubland ecosystem. The results of these studies are being applied to the generation of challenging thematic maps of protected coastal and land areas, and studies of the state of conservation of their natural resources will be performed.

Acknowledgments

This research has been supported by the ARTEMISAT (Análisis de recursos marinos y terrestres mediante el procesado de imágenes de alta resolución) (CGL2013-46674-R) project, funded by the Spanish Ministerio de Economía y Competitividad.

Author Contributions

All authors contributed extensively to the work. Edurne Ibarrola-Ulzurrun, Consuelo Gonzalo-Martin and Javier Marcello-Ruiz developed the methodology and analyzed the results. Ángel García-Pedrero contributed the implementation of the pansharpening algorithms, and Dionisio Rodriguez-Esparragon contributed the quality evaluation indices. Edurne Ibarrola-Ulzurrun processed the WorldView-2 data and carried out the image fusions and their quality evaluation, supervised by Consuelo Gonzalo-Martin and Javier Marcello-Ruiz. Finally, Edurne Ibarrola-Ulzurrun wrote the overall paper under the supervision of Consuelo Gonzalo-Martin and Javier Marcello-Ruiz, who were also involved in the manuscript’s discussion and revision.

Conflicts of Interest

The authors declare no conflicts of interest. The funding sponsors had no role in the design of the study; in the collection, analyses or interpretation of data; in the writing of the manuscript; nor in the decision to publish the results.

References

  1. Pagiola, S.; Von Ritter, K.; Bishop, J. Assessing the Economic Value of Ecosystem Conservation; World Bank: Washington, DC, USA, 2004. [Google Scholar]
  2. Barange, M.; Harris, R.P. Marine Ecosystems and Global Change; Oxford University Press: Oxford, UK, 2010. [Google Scholar]
  3. Reinke, K.; Jones, S. Integrating vegetation field surveys with remotely sensed data. Ecol. Manag. Restor. 2006, 7, S18–S23. [Google Scholar] [CrossRef]
  4. Kpalma, K.; El-Mezouar, M.C.; Taleb, N. Recent trends in satellite image pan-sharpening techniques. In Proceedings of the 1st International Conference on Electrical, Electronic and Computing Engineering, Vrniacka Banja, Serbia, 2–5 June 2014.
  5. Li, X.; He, M.; Zhang, L. Hyperspherical color transform based pansharpening method for worldview-2 satellite images. In Proceedings of the 2013 IEEE 8th Conference on Industrial Electronics and Applications, Melbourne, Australia, 19–21 June 2013.
  6. Hallanda, W.A.; Cox, S. Image sharpening for mixed spatial and spectral resolution satellite systems. In Proceedings of the 17th International Symposium of Remote Sensing of Environment, University of Michigan, Ann Arbor, MI, USA, 9–13 May 1983.
  7. Schowengerdt, R.A. Reconstruction of multispatial, multispectral image data using spatial frequency content. Photogramm. Eng. Remote Sens. 1980, 46, 1325–1334. [Google Scholar]
  8. Pohl, C. Challenges of remote sensing image fusion to optimize earth observation data exploitation. Eur. Sci. J. 2014, 9, 355–365. [Google Scholar]
  9. Zhang, J. Multi-source remote sensing data fusion: Status and trends. Int. J. Image Data Fusion 2010, 1, 5–24. [Google Scholar] [CrossRef]
  10. Amro, I.; Mateos, J.; Vega, M.; Molina, R.; Katsaggelos, A.K. A survey of classical methods and new trends in pansharpening of multispectral images. EURASIP J. Adv. Sig. Proc. 2011, 2011, 1–22. [Google Scholar] [CrossRef]
  11. Fonseca, L.; Namikawa, L.; Castejon, E.; Carvalho, L.; Pinho, C.; Pagamisse, A. Image fusion for remote sensing applications. In Image Fusion and Its Applications; InTech: Rijeka, Croatia, 2011. [Google Scholar]
  12. Laben, C.A.; Brower, B.V. Process for Enhancing the Spatial Resolution of multispectral Imagery Using Pan-Sharpening. U.S. Patent 6011875 A, 4 January 2000. [Google Scholar]
  13. Li, X.; Qi, W. An effective pansharpening method for worldview-2 satellite images. In Proceedings of the International Conference on Estimation, Detection and Information Fusion (ICEDIF), Harbin, China, 10–11 January 2015.
  14. Lillo-Saavedra, M.; Gonzalo, C. Spectral or spatial quality for fused satellite imagery? A trade-off solution using the wavelet à trous algorithm. Int. J. Remote Sens. 2006, 27, 1453–1464. [Google Scholar] [CrossRef]
  15. Li, X.; Li, L.; He, M. A novel pansharpening algorithm for worldview-2 satellite images. In Proceedings of the International Conference on Industrial and Intelligent Information (ICIII 2012), Singapore, 17–18 March 2012.
  16. Alimuddin, I.; Sumantyo, J.T.S.; Kuze, H. Assessment of pan-sharpening methods applied to image fusion of remotely sensed multi-band data. Int. J. Appl. Earth Observ. Geoinf. 2012, 18, 165–175. [Google Scholar]
  17. González-Audícana, M.; Saleta, J.L.; Catalán, R.G.; García, R. Fusion of multispectral and panchromatic images using improved ihs and pca mergers based on wavelet decomposition. IEEE Trans. Geosci. Remote Sens. 2004, 42, 1291–1299. [Google Scholar] [CrossRef]
  18. Marcello, J.; Medina, A.; Eugenio, F. Evaluation of spatial and spectral effectiveness of pixel-level fusion techniques. IEEE Geosci. Remote Sens. Lett. 2013, 10, 432–436. [Google Scholar] [CrossRef]
  19. Rodríguez-Esparragón, D. Evaluación y Desarrollo de Métricas de Calidad Espacial y Espectral Para Aplicaciones de Fusión de Imágenes Multiespectrales de Teledetección de Alta Resolución; Universidad Las Palmas de Gran Canaria: Las Palmas de Gran Canaria, Spain, 2015. [Google Scholar]
  20. Alparone, L.; Wald, L.; Chanussot, J.; Thomas, C.; Gamba, P.; Bruce, L.M. Comparison of pansharpening algorithms: Outcome of the 2006 grs-s data-fusion contest. IEEE Trans. Geosci. Remote Sens. 2007, 45, 3012–3021. [Google Scholar] [CrossRef] [Green Version]
  21. Alparone, L.; Aiazzi, B.; Baronti, S.; Garzelli, A.; Nencini, F.; Selva, M. Multispectral and panchromatic data fusion assessment without reference. Photogramm. Eng. Remote Sens. 2008, 74, 193–200. [Google Scholar] [CrossRef]
  22. Kruse, F.; Lefkoff, A.; Boardman, J.; Heidebrecht, K.; Shapiro, A.; Barloon, P.; Goetz, A. The spectral image processing system (sips)—Interactive visualization and analysis of imaging spectrometer data. Remote Sens. Environ. 1993, 44, 145–163. [Google Scholar] [CrossRef]
  23. Wald, L. Quality of high resolution synthesised images: Is there a simple criterion? In Proceedings of the Third Conference: Fusion of Earth Data: Merging Point Measurements, Raster Maps And Remotely Sensed Images, Sophia Antipolis, France, 26–28 January 2000.
  24. Lillo-Saavedra, M.; Gonzalo, C.; Arquero, A.; Martinez, E. Fusion of multispectral and panchromatic satellite sensor imagery based on tailored filtering in the fourier domain. Int. J. Remote Sens. 2005, 26, 1263–1268. [Google Scholar] [CrossRef]
  25. Rodríguez-Esparragón, D.; Marcello-Ruiz, J.; Medina-Machín, A.; Eugenio-González, F.; Gonzalo-Martín, C.; García-Pedrero, A. Evaluation of the performance of the spatial assessment of pansharpened images. In Proceedings of the 2014 IEEE International Geoscience and Remote Sensing Symposium, Quebec City, QC, Canada, 13–18 July 2014.
  26. Zhou, J.; Civco, D.; Silander, J. A wavelet transform method to merge landsat TM and SPOT panchromatic data. Int. J. Remote Sens. 1998, 19, 743–757. [Google Scholar] [CrossRef]
  27. Wang, Z.; Bovik, A.C. A universal image quality index. IEEE Signal Lett. 2002, 9, 81–84. [Google Scholar] [CrossRef]
  28. Vivone, G.; Alparone, L.; Chanussot, J.; Dalla Mura, M.; Garzelli, A.; Licciardi, G.; Restaino, R.; Wald, L. A critical comparison among pansharpening algorithms. IEEE Trans. Geosci. Remote Sens. 2015, 53, 2565–2586. [Google Scholar] [CrossRef]
  29. Amolins, K.; Zhang, Y.; Dare, P. Wavelet based image fusion techniques—An introduction, review and comparison. ISPRS J. Photogramm. Remote Sens. 2007, 62, 249–263. [Google Scholar] [CrossRef]
  30. Ehlers, M.; Klonus, S.; Johan Åstrand, P.; Rosso, P. Multi-sensor image fusion for pansharpening in remote sensing. Int. J. Image Data Fusion 2010, 1, 25–45. [Google Scholar] [CrossRef]
  31. Ozdemir, I.; Karnieli, A. Predicting forest structural parameters using the image texture derived from worldview-2 multispectral imagery in a dryland forest, israel. Int. J. Appl. Earth Observ. Geoinf. 2011, 13, 701–710. [Google Scholar] [CrossRef]
  32. Witharana, C.; Civco, D.L.; Meyer, T.H. Evaluation of pansharpening algorithms in support of earth observation based rapid-mapping workflows. Appl. Geogr. 2013, 37, 63–87. [Google Scholar] [CrossRef]
  33. Hasanlou, M.; Saradjian, M.R. Quality assessment of pan-sharpening methods in high-resolution satellite images using radiometric and geometric index. Arab. J. Geosci. 2016, 9, 1–10. [Google Scholar] [CrossRef]
  34. Jawak, S.D.; Luis, A.J. A semiautomatic extraction of antarctic lake features using worldview-2 imagery. Photogramm. Eng. Remote Sens. 2014, 80, 939–952. [Google Scholar] [CrossRef]
  35. Gungor, O.; Boz, Y.; Gokalp, E.; Comert, C.; Akar, A. Fusion of low and high resolution satellite images to monitor changes on costal zones. Sci. Res. Essays 2010, 5, 654–662. [Google Scholar]
  36. Embabi, N.S.; Moawad, M.B. A semi-automated approach for mapping geomorphology of el bardawil lake, northern sinai, egypt, using integrated remote sensing and gis techniques. Egypt. J. Remote Sens. Space Sci. 2014, 17, 41–60. [Google Scholar] [CrossRef]
  37. Eugenio, F.; Marcello, J.; Martin, J. High-resolution maps of bathymetry and benthic habitats in shallow-water environments using multispectral remote sensing imagery. IEEE Trans. Geosci. Remote Sens. 2015, 53, 3539–3549. [Google Scholar] [CrossRef]
  38. Ibarrola-Ulzurrun, E.; Gonzalo-Martín, C.; Marcello-Ruiz, J. Influence of pansharpening techniques in obtaining accurate vegetation thematic maps. In Proceedings of the Earth Resources and Environmental Remote Sensing/GIS Applications, Edinburgh, UK, 26–29 September 2016.
  39. Mayer-Suárez, P.; Romero-Martín, L.E. La Naturaleza Desértica de Fuerteventura y la Erosionabilidad de sus Precipitaciones. Available online: http://catalogo.museosdetenerife.org/cdm/singleitem/collection/Arca/id/4130/rec/14 (accessed on 14 December 2016).
  40. Fernández-Cabrera, E.; Pérez-Chacón, E.; Cruz Avero, N.; Hernández Cordero, A.; Hernández Calvento, L. Consecuencias ambientales del crecimiento urbano-turístico en el sistema de dunas de corralejo (fuerteventura-islas canarias). In Urbanismo Expansivo de la Utopía a la Realidad. Asociación de Geógrafos Españoles; Colegio de Geógrafos de España y Universidad de Alicante: Alicante, Spain, 2011; pp. 241–252. [Google Scholar]
  41. Fernández Cabrera, E.; Roca Bosch, E.; Cabrera, L.; Hernández-Calvento, L.; Pérez-Chacon, E. Estudio de la Percepción Social en el Entorno del Parque Natural de Las Dunas de Corralejo (Fuerteventura, Islas Canarias): Aplicaciones Para la Gestión Integrada de Zonas Costeras. Available online: http://upcommons.upc.edu/handle/2117/18108 (accessed on 14 December 2016).
  42. Garzón-Machado, V.; del Arco-Aguilar, M.J.; Pérez-de-Paz, P.L. A tool set for description and mapping vegetation on protected natural areas: An example from the canary islands. Biodivers. Conserv. 2011, 20, 3605–3625. [Google Scholar] [CrossRef]
  43. Hernández-Cordero, A.; Pérez-Chacón, E.; Hernández-Calvento, L. Aplicación de tecnologías de la información geográfica al estudio de la vegetación en sistemas de dunas litorales. Resultados preliminares en el campo de dunas de maspalomas (gran canaria, islas canarias). In Tecnologías de la Información Geográfica para el Desarrollo Territorial; Servicio de Publicaciones y Difusión Científica de la ULPGC: Las Palmas de Gran Canaria, Spain, 2008. [Google Scholar]
  44. Gobierno de Canarias, C. Plan director reserva natural especial de las dunas de maspalomas. Gobierno de Canarias. Consejeria de Medio Ambiente y Ordenación Territorial. In Videconsejería de Ordenación Territorial; Dirección General de Ordenación al Territorio: Las Palmas de Gran Canaria, Spain, 2004. [Google Scholar]
  45. Fourqurean, J.W.; Duarte, C.M.; Kennedy, H.; Marbà, N.; Holmer, M.; Mateo, M.A.; Apostolaki, E.T.; Kendrick, G.A.; Krause-Jensen, D.; McGlathery, K.J. Seagrass ecosystems as a globally significant carbon stock. Nat. Geosci. 2012, 5, 505–509. [Google Scholar] [CrossRef]
  46. Ruocco, M.; Bertoni, D.; Sarti, G.; Ciccarelli, D. Mediterranean coastal dune systems: Which abiotic factors have the most influence on plant communities? Estuar. Coast. Shelf Sci. 2014, 149, 213–222. [Google Scholar] [CrossRef]
  47. Martínez, M.L.; Psuty, N.P. Coastal Dunes; Springer Science & Business Media: Berlin, Germany, 2004. [Google Scholar]
  48. Tu, T.M.; Su, S.C.; Shyu, H.C.; Huang, P.S. A new look at ihs-like image fusion methods. Inf. Fusion 2001, 2, 177–186. [Google Scholar] [CrossRef]
  49. Padwick, C.; Deskevich, M.; Pacifici, F.; Smallwood, S. Worldview-2 Pan-Sharpening. In Proceedings of the ASPRS 2010 Annual Conference, San Diego, CA, USA, 26–30 April 2010.
  50. Wu, B.; Fu, Q.; Sun, L.; Wang, X. Enhanced hyperspherical color space fusion technique preserving spectral and spatial content. J. Appl. Remote Sens. 2015, 9, 097291. [Google Scholar] [CrossRef]
  51. Lillo-Saavedra, M.; Gonzalo, C.; Lagosa, O. Toward reduction of artifacts in fused images. Int. J. Appl. Earth Observ. Geoinf. 2011, 13, 368–375. [Google Scholar] [CrossRef] [Green Version]
  52. Shridhar, J.D.; Alvarinho, L.J. A spectral index ratio-based antarctic land-cover mapping using hyperspatial 8-band worldview-2 imagery. Polar Sci. 2013, 7, 18–38. [Google Scholar]
  53. Renda, M.E.; Straccia, U. Web metasearch: Rank vs. Score based rank aggregation methods. In Proceedings of the 2003 ACM symposium on Applied computing, Melbourne, FL, USA, 9–12 March 2003.
  54. Blaschke, T.; Lang, S.; Hay, G. Object-Based Image Analysis: Spatial Concepts for Knowledge-Driven Remote Sensing Applications; Springer Science & Business Media: Berlin, Germany, 2008. [Google Scholar]
  55. Mountrakis, G.; Im, J.; Ogole, C. Support vector machines in remote sensing: A review. ISPRS J. Photogramm. Remote Sens. 2011, 66, 247–259. [Google Scholar] [CrossRef]
  56. Baatz, M.; Benz, U.; Dehghani, S.; Heynen, M.; Höltje, A.; Hofmann, P.; Lingenfelder, I.; Mimler, M.; Sohlbach, M.; Weber, M. eCognition User Guide; Definiens Imaging GmbH: Munich, Germany, 2004. [Google Scholar]
Figure 1. Study areas from the Canary Islands: (a) Teide National Park; (b) Corralejo and Islote de Lobos Natural Park; and (c) Maspalomas Natural Reserve.
Figure 2. PAN and MS scenes of WorldView-2 images (512 × 512 pixels for the MS image): (a,d) shrubland ecosystem; (b,e) coastal ecosystem; (c,f) mixed ecosystem with urban area and inner water lagoon.
Figure 3. The flow diagram followed in the study for each scenario.
Figure 4. True color fused images of the shrubland region: (a) original MS; (b) FIHS; (c) HCS; (d) MTF_GLP_HPM; (e) WAT⊗FRAC with a window size of seven.
Figure 5. True color fused images of the coastal region: (a) original MS; (b) FIHS; (c) HCS; (d) MTF_GLP_HPM; (e) WAT⊗FRAC with a window size of 27.
Figure 6. True color fused images of the mixed ecosystem with an urban region: (a) original MS; (b) FIHS; (c) HCS; (d) MTF_GLP_HPM; (e) WAT⊗FRAC with a window size of 15.
Figure 7. False color fused images using Bands 8, 7 and 1 color composition (bands outside the PAN range): (a) original MS; (b) FIHS; (c) HCS; (d) MTF_GLP_HPM; (e) WAT⊗FRAC.
Figure 8. Results of the quality indices for the shrubland ecosystem fused image considering the three different MS band combinations (blue: all bands; red: Bands 2–6; green: Bands 1, 7 and 8). X-axis: pansharpening algorithms; Y-axis: value range of each quality index (SAM: better closer to 0; spectral and spatial ERGAS: better closer to 0; FC: values between 0 and 1, better closer to 1; Zhou: values between 0 and 1, better closer to 1; Q8: values between 0 and 1, better closer to 1).
Figure 9. Results of the quality indices for the coastal ecosystem fused image considering the three different MS band combinations (blue: all bands; red: Bands 2–6; green: Bands 1, 7 and 8). X-axis: pansharpening algorithms; Y-axis: value range of each quality index (SAM: better closer to 0; spectral and spatial ERGAS: better closer to 0; FC: values between 0 and 1, better closer to 1; Zhou: values between 0 and 1, better closer to 1; Q8: values between 0 and 1, better closer to 1).
Figure 10. Results of the quality indices for the mixed ecosystem fused image considering the three different MS band combinations (blue: all bands; red: Bands 2–6; green: Bands 1, 7 and 8). X-axis: pansharpening algorithms; Y-axis: value range of each quality index (SAM: better closer to 0; spectral and spatial ERGAS: better closer to 0; FC: values between 0 and 1, better closer to 1; Zhou: values between 0 and 1, better closer to 1; Q8: values between 0 and 1, better closer to 1).
Figure 11. Fused image with WAT⊗FRAC of the shrubland ecosystem and its quality maps for each band (scale from 0–1, zero being the lowest fusion quality and one the highest).
Figure 12. Fused image with FIHS of the coastal ecosystem and its quality maps for each band (scale from 0–1, zero being the lowest fusion quality and one the highest).
Figure 13. Fused image with WAT⊗FRAC of the mixed ecosystem and its quality maps for each band (scale from 0–1, zero being the lowest fusion quality and one the highest).
Figure 14. Zoom of the original MS (a) and WAT⊗FRAC fused image (b) of the thematic maps obtained by the OBIA-SVM classifier applied to the multispectral image (c) and WAT⊗FRAC fused image (d) in the shrubland ecosystem.
Table 1. WorldView-2 sensor technical specifications.
Imaging Mode       | Panchromatic | Multispectral
Spatial Resolution | 0.46 m       | 1.84 m
Spectral Range     | 450–800 nm   | 400–450 nm (coastal)
                   |              | 450–510 nm (blue)
                   |              | 510–580 nm (green)
                   |              | 585–625 nm (yellow)
                   |              | 630–690 nm (red)
                   |              | 705–745 nm (red edge)
                   |              | 770–895 nm (near IR 1)
                   |              | 860–1040 nm (near IR 2)
Table 2. Location and acquisition date of the three images selected from the Canary Islands.
WorldView-2 Image                          | Coordinates               | Acquisition Date
Teide National Park                        | 28°18′16″ N, 16°33′50″ W | 16 May 2011
Maspalomas Natural Reserve                 | 27°44′12″ N, 15°35′52″ W | 17 January 2013
Corralejo and Islote de Lobos Natural Park | 28°43′52″ N, 13°50′37″ W | 28 October 2010
Table 3. Indices for the quality assessment of the fused image.
Quality Index | Equation | Reference | Eq.
Spectral Angle Mapper | $\cos^{-1}\!\left(\dfrac{\sum_{i=1}^{nband} FUS_i\,MS_i}{\sqrt{\sum_{i=1}^{nband} FUS_i^2}\,\sqrt{\sum_{i=1}^{nband} MS_i^2}}\right)$ | [22] | (1)
Spectral ERGAS | $100\,\dfrac{h}{l}\sqrt{\dfrac{1}{nband}\sum_{i=1}^{nband}\left(\dfrac{rmse_i(MS)}{\overline{MS_i}}\right)^2}$ | [23] | (2)
Spatial ERGAS | $100\,\dfrac{h}{l}\sqrt{\dfrac{1}{nband}\sum_{i=1}^{nband}\left(\dfrac{rmse_i(PAN)}{\overline{PAN_i}}\right)^2}$ | [24] | (3)
Frequency Comparison | $\dfrac{1}{nband}\sum_{i=1}^{nband} corr_i\!\left(dct_{n\times n}^{AC}(PAN),\,dct_{n\times n}^{AC}(FUS_i)\right)$ | [25] | (4)
Zhou | $\dfrac{1}{nband}\sum_{i=1}^{nband} corr_i\!\left(PAN_{highpass},\,FUS_{i\,highpass}\right)$ | [26] | (5)
Q8 | $\sum_{i=1}^{nband}\dfrac{4\,\sigma_{MS,FUS}\,\overline{MS}\;\overline{FUS}}{\left(\sigma_{MS}^2+\sigma_{FUS}^2\right)\left[\overline{MS}^2+\overline{FUS}^2\right]}$ | [27] | (6)
Note: nband is the number of bands; FUSi represents the fused image; MSi is the i-th band of the MS image; PANi is the PAN image; h and l represent the spatial resolution of the PAN and MS images, respectively; d c t n x n A C is the discrete cosine transform computed in blocks of nxn pixels, and c o r r i defines the cross-correlation of the i-th band; F U S i h i g h _ p a s s is the high pass filtered fused image, and P A N h i g h _ p a s s is the high pass filtered PAN image; σ is the variance of the MS and FUS image.
Table 4. Quality results for the complete WV-2 bands and the shrubland ecosystem (best results in bold). SAM: spectral angle mapper; FC: frequency comparison; Spec.: spectral; Spat.: spatial.

| Algorithm | SAM | Spectral ERGAS | Spatial ERGAS | FC | Zhou | Q8 | Borda (Global) | Borda (Spec.) | Borda (Spat.) |
|---|---|---|---|---|---|---|---|---|---|
| FIHS | 3.78 | 1.68 | 0.89 | 0.84 | 0.72 | 0.90 | 13 | 4 | 6 |
| HCS | **3.52** | 0.39 | 0.91 | 0.77 | 0.67 | **0.93** | 14 | **7** | 2 |
| MTF_GLP_HPM | 3.87 | **0.33** | 0.89 | 0.81 | 0.71 | 0.92 | 16 | 6 | 4 |
| WAT⊗FRAC | 4.19 | 1.44 | **0.82** | **0.86** | **0.89** | 0.90 | **17** | 3 | **8** |
Table 5. Quality results for the complete WV-2 bands and the coastal ecosystem (best results in bold). SAM: spectral angle mapper; FC: frequency comparison; Spec.: spectral; Spat.: spatial.

| Algorithm | SAM | Spectral ERGAS | Spatial ERGAS | FC | Zhou | Q8 | Borda (Global) | Borda (Spec.) | Borda (Spat.) |
|---|---|---|---|---|---|---|---|---|---|
| FIHS | 1.77 | 2.91 | **2.36** | **0.85** | 0.83 | 0.98 | **19** | 8 | **7** |
| HCS | 1.81 | 1.73 | 2.64 | 0.64 | 0.71 | 0.98 | 10 | 15 | 2 |
| MTF_GLP_HPM | **1.64** | **1.22** | 2.61 | 0.72 | 0.73 | 0.98 | 17 | **18** | 4 |
| WAT⊗FRAC | 1.93 | 2.63 | 2.54 | 0.78 | **0.88** | 0.98 | 14 | 15 | **7** |
Table 6. Quality results for the complete WV-2 bands and the mixed ecosystem (best results in bold). SAM: spectral angle mapper; FC: frequency comparison; Spec.: spectral; Spat.: spatial.

| Algorithm | SAM | Spectral ERGAS | Spatial ERGAS | FC | Zhou | Q8 | Borda (Global) | Borda (Spec.) | Borda (Spat.) |
|---|---|---|---|---|---|---|---|---|---|
| FIHS | 7.11 | 2.98 | 2.08 | 0.89 | 0.73 | 0.93 | 12 | 2 | 6 |
| HCS | 5.66 | 1.73 | 2.23 | 0.80 | 0.61 | **0.96** | 13 | 6 | 2 |
| MTF_GLP_HPM | **5.62** | **1.72** | 2.23 | 0.81 | 0.61 | **0.96** | 17 | **8** | 4 |
| WAT⊗FRAC | 6.88 | 2.85 | **2.05** | **0.93** | **0.98** | 0.95 | **18** | 4 | **8** |
Table 7. Quality results for Bands 1–8 using the best algorithms for each scene. Best results are in bold.

| Band (Q8, block size 64) | Shrubland Ecosystem (WAT⊗FRAC_w7) | Coastal Ecosystem (FIHS) | Mixed Ecosystem (WAT⊗FRAC_15) |
|---|---|---|---|
| B1 (Coastal Blue) | 0.696 | **0.764** | 0.695 |
| B2 (Blue) | 0.736 | **0.889** | 0.842 |
| B3 (Green) | 0.878 | **0.936** | 0.905 |
| B4 (Yellow) | **0.904** | 0.647 | 0.890 |
| B5 (Red) | **0.872** | 0.410 | 0.851 |
| B6 (Red Edge) | **0.897** | 0.395 | 0.857 |
| B7 (NIR 1) | 0.841 | 0.318 | **0.845** |
| B8 (NIR 2) | **0.881** | 0.239 | 0.832 |
Table 8. Segmentation parameters used for the images and classification accuracy. Classification technique: Support Vector Machine.

| Pansharpening Algorithm | Overall Accuracy | Kappa |
|---|---|---|
| MS | 80.61% | 0.72 |
| FIHS | 83.72% | 0.76 |
| HCS | 82.72% | 0.75 |
| MTF_GLP_HPM | 83.18% | 0.75 |
| WAT⊗FRAC | 89.39% | 0.85 |
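The overall accuracy and kappa figures above are standard confusion-matrix statistics. A minimal sketch (our own helper, not the authors' code) shows how they are derived:

```python
import numpy as np

def accuracy_and_kappa(cm):
    """Overall accuracy and Cohen's kappa from a confusion matrix
    (rows: reference classes, columns: classifier output)."""
    cm = np.asarray(cm, dtype=float)
    total = cm.sum()
    observed = np.trace(cm) / total  # proportion of correctly labeled samples
    # Chance agreement estimated from the row and column marginals.
    expected = float(np.sum(cm.sum(axis=0) * cm.sum(axis=1))) / total ** 2
    kappa = (observed - expected) / (1.0 - expected)
    return observed, kappa
```

Kappa discounts the agreement expected by chance, which is why it is lower than the overall accuracy for every algorithm in Table 8.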

Ibarrola-Ulzurrun, E.; Gonzalo-Martin, C.; Marcello-Ruiz, J.; Garcia-Pedrero, A.; Rodriguez-Esparragon, D. Fusion of High Resolution Multispectral Imagery in Vulnerable Coastal and Land Ecosystems. Sensors 2017, 17, 228. https://doi.org/10.3390/s17020228
