Article

Band Subset Selection for Hyperspectral Image Classification

1 Center for Hyperspectral Imaging in Remote Sensing (CHIRS), Information and Technology College, Dalian Maritime University, Dalian 116026, China
2 State Key Laboratory of Integrated Services Networks, School of Telecommunications Engineering, Xidian University, Xi’an 710071, China
3 Department of Computer Science and Information Engineering, National Yunlin University of Science and Technology, Douliu 64002, Taiwan
4 Department of Computer Science and Information Management, Providence University, Taichung 02912, Taiwan
5 Department of Computer Science and Electrical Engineering, University of Maryland, Baltimore County, Baltimore, MD 21250, USA
* Author to whom correspondence should be addressed.
Remote Sens. 2018, 10(1), 113; https://doi.org/10.3390/rs10010113
Submission received: 24 November 2017 / Revised: 4 January 2018 / Accepted: 10 January 2018 / Published: 15 January 2018
(This article belongs to the Special Issue Hyperspectral Imaging and Applications)

Abstract

This paper develops a new approach to band subset selection (BSS) for hyperspectral image classification (HSIC) which selects multiple bands simultaneously as a band subset, referred to as simultaneous multiple band selection (SMMBS), rather than one band at a time sequentially, referred to as sequential multiple band selection (SQMBS), as most traditional band selection methods do. In doing so, a criterion is particularly developed for BSS that can be used for HSIC. It is a linearly constrained minimum variance (LCMV) derived from adaptive beamforming in array signal processing which can be used to model misclassification errors as the minimum variance. To avoid an exhaustive search for all possible band subsets, two numerical algorithms, referred to as sequential (SQ) and successive (SC) algorithms are also developed for LCMV-based SMMBS, called SQ LCMV-BSS and SC LCMV-BSS. Experimental results demonstrate that LCMV-based BSS has advantages over SQMBS.

1. Introduction

Hyperspectral image classification has received considerable interest in recent years [1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23]. Its band selection (BS) issue has also been studied extensively [24,25,26,27,28,29,30,31,32,33,34,35,36,37,38,39,40,41,42,43,44,45,46,47,48,49,50,51,52,53,54,55,56,57]. In general, there are two approaches to BS. One is to select bands one at a time, sequentially; this is referred to as sequential multiple band selection (SQMBS). In this case, a criterion that can be used to select bands, according to priorities ranked by the criterion, is usually required. Such a criterion is referred to as a band prioritization (BP) criterion, and it can be designed from two perspectives. One type of BP criterion is based on data characteristics or statistics such as variance, signal-to-noise ratio (SNR), entropy, and information divergence (ID) to calculate a priority score for each individual band in order to rank all bands [25]. As a result, such BP-based SQMBS is generally unsupervised and is not adaptive to any particular application. In other words, the same selected bands are applied to all applications. The other type of BP criterion is supervised and is adaptive to a particular application, such as classification [26,27,28,29,30,31,32,33,34,35,36,37,38,39,40,41,42,43,44,45,46,47,48,49,50,51,52,53,54,55,56,57], target detection [49,50], endmember extraction [51], spectral unmixing [52], etc. Unfortunately, one of the major problems with BP-derived BS methods is how to deal with band correlation. Since hyperspectral imagery has very high interband correlation, the fact that a band has a high priority to be selected implies that its adjacent bands also have high priorities to be selected. To avoid this dilemma, band decorrelation may be required to remove redundant bands from a group of selected bands. However, this comes with two further issues: how to select a band correlation criterion to measure the correlation between two bands, and how to determine the threshold at which two bands are considered sufficiently decorrelated.
As an alternative to BP-based SQMBS methods, another approach, referred to as simultaneous multiple band selection (SMMBS), is to select multiple bands simultaneously as a band subset. This approach does not have the issues of prioritizing or decorrelating bands that are encountered in SQMBS. However, the price paid for these advantages is the need for an effective search strategy to find an optimal band subset, since doing so generally requires an exhaustive search, which is practically infeasible. To address this issue, several works have recently been proposed, such as band clustering [58,59,60], particle swarm optimization (PSO) in [35], the firefly algorithm (FA) in [36], multitask sparsity pursuit (MTSP) [38], the multigraph determinantal point process (MDPP) [43], dominant set extraction BS (DSEBS) in [40], etc. Of particular interest is a new concept of band subset selection (BSS) to address this issue, which is quite different from the aforementioned SMMBS methods in the sense of the search strategy used to find an optimal set of multiple bands. It considers a selected band as a desired endmember. Accordingly, finding an optimal set of endmembers from all data sample vectors can be translated into selecting an optimal band subset simultaneously from all bands. With this interpretation, two sequential algorithms designed to realize the N-finder algorithm (N-FINDR) [61] numerically, called sequential N-FINDR (SQ N-FINDR) and successive N-FINDR (SC N-FINDR) [62,63,64,65], can be redesigned to find desired band subsets, called SQ BSS and SC BSS algorithms. These two SQ BSS and SC BSS algorithms were recently developed for SMMBS in applications of anomaly detection [66] and spectral unmixing and classification [67,68]. This paper further extends BSS to hyperspectral image classification and has several aspects not found in [66,67,68]. First and foremost is the criterion used for BSS, which is the minimum variance resulting from a linearly constrained finite impulse response (FIR) filter arising in adaptive beamforming in array signal processing [69,70,71,72]. This linearly constrained minimum variance (LCMV)-based BSS interprets signal sources as class signature vectors and linearly constrains the class signature vectors to find an optimal band subset for classification. It is very different from constrained energy minimization (CEM)-based BS [26], which constrains a single selected band, and also from constrained multiple band selection (CMBS) [68], which extends CEM-BS by constraining multiple bands as band subsets, not class signature vectors as LCMV-BSS does. Secondly, two new SQ BSS and SC BSS algorithms are developed for LCMV-BSS, specifically for classification, referred to as SQ LCMV-BSS and SC LCMV-BSS. Thirdly, the classifier used to evaluate BS performance is also an LCMV classifier, which is particularly designed to best utilize the bands selected by LCMV-BSS. Fourthly, although LCMV-BSS may not exhaust all possible band combinations, to the authors’ best knowledge it is probably the only BSS algorithm that numerically searches band subsets among all possible band combinations, whereas other SMMBS algorithms such as PSO, FA, MTSP, MDPP, and DSEBS are designed to examine only a very small, preselected set of band subsets. Finally, and most importantly, the proposed LCMV-BSS is very easy to implement because, unlike many BS methods, it has no parameters that need to be tuned. This is a tremendous advantage, since such parameters would otherwise have to be adapted to each application.

2. LCMV Criterion for BSS

Suppose that there are M classes of interest and each class is specified by a class signature vector, denoted by $\mathbf{d}_1, \mathbf{d}_2, \ldots, \mathbf{d}_M$. We can then form a class signature matrix, denoted by $\mathbf{D} = [\mathbf{d}_1\ \mathbf{d}_2\ \cdots\ \mathbf{d}_M]$. The goal is to design an FIR linear filter with L filter coefficients $\{w_1, w_2, \ldots, w_L\}$, denoted by an L-dimensional vector $\mathbf{w} = (w_1\ w_2\ \cdots\ w_L)^T$, that minimizes the filter output energy subject to the following constraint:
$$\mathbf{D}^T \mathbf{w} = \mathbf{c} \quad \text{where} \quad \mathbf{d}_j^T \mathbf{w} = \sum_{l=1}^{L} w_l d_{jl} \quad \text{for } 1 \le j \le M \qquad (1)$$
where $\mathbf{c} = (c_1\ c_2\ \cdots\ c_M)^T$ is a constraint vector. Using (1), we derive the following linearly constrained optimization problem:
$$\min_{\mathbf{w}} \{\mathbf{w}^T \mathbf{R} \mathbf{w}\} \quad \text{subject to} \quad \mathbf{D}^T \mathbf{w} = \mathbf{c} \qquad (2)$$
where $\mathbf{R} = (1/N)\sum_{i=1}^{N} \mathbf{r}_i \mathbf{r}_i^T$ is the sample autocorrelation matrix of the image. The solution to (2) is called the LCMV-based classifier and is given in [69,71,72] by
$$\delta^{\mathrm{LCMV}}(\mathbf{r}) = (\mathbf{w}^{\mathrm{LCMV}})^T \mathbf{r} \qquad (3)$$
with
$$\mathbf{w}^{\mathrm{LCMV}} = \mathbf{R}^{-1} \mathbf{D} (\mathbf{D}^T \mathbf{R}^{-1} \mathbf{D})^{-1} \mathbf{c}. \qquad (4)$$
Substituting (4) into the objective function of (2) yields
$$(\mathbf{w}^{\mathrm{LCMV}})^T \mathbf{R}\, \mathbf{w}^{\mathrm{LCMV}} = [\mathbf{R}^{-1} \mathbf{D} (\mathbf{D}^T \mathbf{R}^{-1} \mathbf{D})^{-1} \mathbf{c}]^T \mathbf{R}\, [\mathbf{R}^{-1} \mathbf{D} (\mathbf{D}^T \mathbf{R}^{-1} \mathbf{D})^{-1} \mathbf{c}] = \mathbf{c}^T (\mathbf{D}^T \mathbf{R}^{-1} \mathbf{D})^{-1} \mathbf{D}^T \mathbf{R}^{-1} \mathbf{D} (\mathbf{D}^T \mathbf{R}^{-1} \mathbf{D})^{-1} \mathbf{c} = \mathbf{c}^T (\mathbf{D}^T \mathbf{R}^{-1} \mathbf{D})^{-1} \mathbf{c}. \qquad (5)$$
According to [70], (5) is the minimum variance weighted by $\mathbf{R}^{-1}$. As a matter of fact, (5) can also be viewed as the minimal $\mathbf{R}^{-1}$-weighted least-squares error (LSE) caused by misclassification errors when $\delta^{\mathrm{LCMV}}$ operates on the entire image cube. For those who would like to learn more about LCMV, its details can be found in [69,70,71].
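As a concrete illustration, the following NumPy sketch computes the LCMV weight vector in (4) and the minimum variance in (5) from an image reshaped into pixel vectors. The function and variable names (lcmv_classifier, X, D, c) are illustrative only and are not taken from the paper.

```python
import numpy as np

def lcmv_classifier(X, D, c):
    """LCMV weight vector (4) and minimum variance (5).

    X: (N, L) matrix of N pixel vectors with L bands
    D: (L, M) class signature matrix, c: (M,) constraint vector
    """
    R = (X.T @ X) / X.shape[0]                  # sample autocorrelation matrix R
    RinvD = np.linalg.solve(R, D)               # R^-1 D
    A = D.T @ RinvD                             # D^T R^-1 D
    w = RinvD @ np.linalg.solve(A, c)           # Equation (4)
    min_var = float(c @ np.linalg.solve(A, c))  # Equation (5)
    return w, min_var

# X @ w then gives the LCMV classifier output (3) for every pixel.
```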

3. Band Subset Selection

A BS problem is generally described as follows. Assume that $J(\cdot)$ is a generic objective function of $\Omega_{\mathrm{BS}}$ to be optimized for BS, where $\Omega_{\mathrm{BS}}$ is a band subset selected from a full band set Ω. For a given number $n_{\mathrm{BS}}$ of selected bands, a BS method finds an optimal band subset $\Omega_{\mathrm{BS}}^*$ with $|\Omega_{\mathrm{BS}}| = n_{\mathrm{BS}}$ that satisfies the following optimization problem:
$$\Omega_{\mathrm{BS}}^* = \arg\Big\{ \max/\min_{\Omega_{\mathrm{BS}} \subset \Omega,\ |\Omega_{\mathrm{BS}}| = n_{\mathrm{BS}}} J(\Omega_{\mathrm{BS}}) \Big\}. \qquad (6)$$
Depending upon how the objective function $J(\Omega_{\mathrm{BS}})$ is designed, the optimization in (6) can be performed by either maximization or minimization over all possible band subsets $\Omega_{\mathrm{BS}}$ contained in Ω with $|\Omega_{\mathrm{BS}}| = n_{\mathrm{BS}}$.
Since solving (6) requires exhausting all possible nBS-band combinations to find an optimal band subset, Ω BS * , it is practically impossible to do so. Accordingly, many approaches have been investigated by designing various criteria or features to define J(ΩBS) and solve (6). One traditional approach is to design a BP criterion to rank all bands from which BS can be carried out by selecting bands according to their calculated priorities by a particular BP criterion. Such an approach generally results in an SQMBS method which selects multiple bands one at a time sequentially. As noted in the introduction, one major issue arising from this approach is how to deal with redundant bands caused by band correlation. As an alternative, another BP-derived SQMBS method is to specify a particular application such as minimum estimated abundance covariance (MEAC) for classification [34], which can generate feature vectors for BP and then takes advantage of the sequential forward floating search (SFFS) and sequential backward floating search (SBFS) developed in [73] to derive forward and backward BS methods. However, the band correlation issue still remains.
In contrast to SQMBS, many recent efforts have been directed to SMMBS, which selects multiple bands simultaneously. Two main issues also need to be addressed for SMMBS. One is determining the number $n_{\mathrm{BS}}$ of bands to be selected, which is also an issue in SQMBS. Generally, $n_{\mathrm{BS}}$ can be determined by either trial and error or the virtual dimensionality (VD) developed in [69,74]. The other, more critical, issue is how to find an appropriate set of $n_{\mathrm{BS}}$ bands. Suppose that $n_{\mathrm{BS}} = p$ is the number of bands to be selected, $\Omega_p = \{B_{l_1}, B_{l_2}, \ldots, B_{l_p}\}$ is a p-band subset selected from a full band set $\Omega = \{B_1, B_2, \ldots, B_L\}$, where L is the total number of bands, and $B_{l_j}$ is the jth selected band. In order to find an optimal band subset $\Omega_p^*$, we must run through all possible $\binom{L}{p} = \frac{L!}{p!(L-p)!}$ p-band combinations among the L bands. This is practically infeasible if L is large, as it is in hyperspectral imagery. In this case, developing an effective search strategy for finding an optimal set of multiple bands, an issue that does not exist in SQMBS, is a great challenge for SMMBS.
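To put the size of this search space in perspective, a quick calculation (assuming, for illustration, L = 220 AVIRIS bands and p = 18, the values used later for the Purdue scene) shows why exhaustive enumeration is hopeless:

```python
from math import comb

L, p = 220, 18     # full band count and subset size (values used later for the Purdue scene)
print(comb(L, p))  # roughly 1e26 candidate 18-band subsets, far too many to enumerate
```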
A simple SMMBS approach is to group or combine bands into clusters, each of which produces a representative band for BS using certain band measure criteria [58,59,60]. In particular, the concept in [58] is similar to Fisher’s ratio, using mutual information as a band prioritization criterion for clustering. Most interestingly, a band group-wise method was developed in [38], which forms band combinations by compressive sensing and selects them with a multitask sparsity pursuit (MTSP)-based criterion built on linear sparse representation, using an evolution-based search strategy. Another SMMBS approach is to narrow the search range by specifying particular parameters that limit the candidate optimal sets to a small number of band subsets, and then to apply an optimization algorithm such as PSO [35] or FA [36] to find an optimal band subset from the selected candidate set.
Most recently, two other promising approaches have been reported. One is to use graph-based representations in which each path specifies a particular band subset. For example, Yuan et al. [43] proposed a graph-based SMMBS method, called multigraph determinantal point process (MDPP), which makes use of multiple graphs to discover structured and diverse band subsets from a graph where each node represents a band and the edges are specified by the similarity between bands. Accordingly, a path represents a possible band subset. A search algorithm called mixture determinantal point process (Mix-DPP) was then developed to find a diverse subset that can be a potentially optimal band combination. The other is DSEBS, which exploits structure information via a set of local spatial–spectral filters and uses a graph-based clustering search strategy derived from dominant set extraction to find a potentially optimal band subset [40].
In addition to the above-mentioned approaches there is also a new approach, called BSS, which considers the problem of multiple band selection as an endmember finding problem. If a desired selected band is interpreted as an endmember and the full band set as the entire data set, then a band subset can be interpreted as a set of endmembers. Consequently, finding an optimal set of nBS bands can be carried out in a similar way to finding an optimal set of nBS endmembers. This BSS-based approach has recently proved to be very promising and has great potential in various applications such as anomaly detection in [65], spectral unmixing in [66], and target detection in [67]. This paper presents another new application of BSS to hyperspectral image classification with LCMV used as a criterion particularly designed for classification.

4. LCMV-BSS Algorithms

Now, if we replace the full band set Ω in $\mathbf{R}^{-1}$ of (5) with a selected band subset $\Omega_{\mathrm{BS}}$, then (5) becomes
$$\mathrm{MV}(\Omega_{\mathrm{BS}}) = \mathbf{c}^T (\mathbf{D}_{\Omega_{\mathrm{BS}}}^T \mathbf{R}_{\Omega_{\mathrm{BS}}}^{-1} \mathbf{D}_{\Omega_{\mathrm{BS}}})^{-1} \mathbf{c} \qquad (7)$$
which is the minimum variance weighted by $\mathbf{R}_{\Omega_{\mathrm{BS}}}^{-1}$ resulting from the LCMV filter operating on the partial band subset specified by $\Omega_{\mathrm{BS}}$. Alternatively, (7) can also be interpreted as the least $\mathbf{R}_{\Omega_{\mathrm{BS}}}^{-1}$-weighted squared error. It should be noted that the constraint vector $\mathbf{c}$ is specifically designed for the M class signatures $\mathbf{d}_1, \mathbf{d}_2, \ldots, \mathbf{d}_M$, not for bands. Accordingly, $\mathbf{c}$ has nothing to do with the selected band subset $\Omega_{\mathrm{BS}}$ and thus remains constant in (7) for any selected band subset $\Omega_{\mathrm{BS}}$.
Using $\mathrm{MV}(\Omega_{\mathrm{BS}})$ in (7), a criterion can be designed to find an optimal band subset $\Omega_{\mathrm{BS}}^*$ that solves
$$\Omega_{\mathrm{BS}}^* = \arg\Big\{ \min_{\Omega_{\mathrm{BS}} \subset \Omega} \mathrm{MV}(\Omega_{\mathrm{BS}}) \Big\}. \qquad (8)$$
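The evaluation of (7) for a candidate band subset can be sketched in a few lines of NumPy; the helper name mv_of_subset and its arguments are illustrative and not taken from the paper.

```python
import numpy as np

def mv_of_subset(X, D, c, bands):
    """Minimum variance MV(Omega_BS) of (7) for a candidate band subset.

    X: (N, L) pixel matrix, D: (L, M) class signatures, c: (M,) constraints,
    bands: indices (0-based) of the selected bands.
    """
    Xs = X[:, bands]                         # pixels restricted to the selected bands
    Ds = D[bands, :]                         # class signatures restricted accordingly
    R = (Xs.T @ Xs) / Xs.shape[0]            # R_{Omega_BS}
    A = Ds.T @ np.linalg.solve(R, Ds)        # D^T R^-1 D on the band subset
    return float(c @ np.linalg.solve(A, c))  # c^T (D^T R^-1 D)^-1 c
```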
By virtue of (8), two types of algorithms from SQ N-FINDR and SC N-FINDR, called the sequential LCMV-BSS (SQ LCMV-BSS) algorithm and the successive LCMV-BSS (SC LCMV-BSS) algorithm, can be further developed as follows.

4.1. SQ LCMV-BSS

The idea of SQ LCMV-BSS is to use two loops, iterating over band subsets $\Omega_{\mathrm{BS}}$ in an outer loop and computing $\mathrm{MV}(\Omega_{\mathrm{BS}})$ in (7) in an inner loop. Depending upon how $\mathrm{MV}(\Omega_{\mathrm{BS}})$ is computed in the inner loop, two versions can be developed. The first one, called SQ LCMV-BSS-1, compares the minimum variance $\mathrm{MV}(\Omega_{\mathrm{BS}}^{(j)})$ found in the inner loop over $1 \le j \le n_{\mathrm{BS}}$ with the minimum variance $\mathrm{MV}(\Omega_{\mathrm{BS}}^{(l)})$ obtained at the lth iteration of the outer loop, and replaces a band only when the former is smaller. A detailed step-by-step implementation is described below.
Algorithm 1 SQ LCMV-BSS-1
Step 1: Initial conditions
(i) $n_{\mathrm{BS}} = p$, the number of selected multiple bands, determined by VD.
(ii) Let $\Omega_p^{(0)} = \{B_1^{(0)}, B_2^{(0)}, \ldots, B_p^{(0)}\}$ with $B_1^{(0)} = B_1, B_2^{(0)} = B_2, \ldots, B_p^{(0)} = B_p$ uniformly selected from the band set Ω.
(iii) Calculate
    $\mathrm{MV}(\Omega_p^{(0)}) = \mathbf{c}^T (\mathbf{D}_{\Omega_p^{(0)}}^T \mathbf{R}_{\Omega_p^{(0)}}^{-1} \mathbf{D}_{\Omega_p^{(0)}})^{-1} \mathbf{c}$.
Step 2: Outer loop
 For $l = 1, \ldots, L$ do
  Step 3: Inner loop
  Compute $\mathrm{MV}(\Omega_p^{(l)})$.
   For $j = 1, \ldots, p$ do
    Find an index $j^*$ by
      $j^* = \arg\{(\min_{1 \le j \le p} \mathrm{MV}(\Omega_p^{(j)})) < \mathrm{MV}(\Omega_p^{(l)})\}$
    with
      $\mathrm{MV}(\Omega_p^{(j)}) = \mathbf{c}^T (\mathbf{D}_{\Omega_p^{(j)}}^T \mathbf{R}_{\Omega_p^{(j)}}^{-1} \mathbf{D}_{\Omega_p^{(j)}})^{-1} \mathbf{c}$,
    which specifies the band to be replaced by the lth band $B_l$. Such a band is now denoted by $B_{j^*}^{(l+1)}$. A new set of bands is then produced by letting $B_{j^*}^{(l+1)} = B_l$ and $B_j^{(l+1)} = B_j^{(l)}$ for $j \ne j^*$.
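The following sketch gives one possible reading of Algorithm 1 in Python, reusing the illustrative mv_of_subset helper sketched after (8); it is not the authors’ implementation.

```python
import numpy as np

def sq_lcmv_bss_1(X, D, c, p):
    """Sequential LCMV-BSS, version 1 (illustrative sketch, not the authors' code)."""
    L = X.shape[1]
    # Step 1: start from p bands spread uniformly over [0, L-1]
    subset = list(np.linspace(0, L - 1, p).astype(int))
    best_mv = mv_of_subset(X, D, c, subset)
    # Step 2: outer loop over every band B_l
    for l in range(L):
        # Step 3: inner loop - try B_l in place of each current member
        trial_mvs = [mv_of_subset(X, D, c, subset[:j] + [l] + subset[j + 1:])
                     for j in range(p)]
        j_star = int(np.argmin(trial_mvs))
        # replace a band only when the best trial lowers the minimum variance
        if trial_mvs[j_star] < best_mv:
            subset[j_star] = l
            best_mv = trial_mvs[j_star]
    return subset, best_mv
```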
A second version of SQ LCMV-BSS, referred to as SQ LCMV-BSS-2, always adopts the band subset with the minimum variance $\mathrm{MV}(\Omega_p^{(j)})$ over $1 \le j \le p$ at each iteration of the inner loop, without requiring it to improve on $\mathrm{MV}(\Omega_p^{(l)})$ as SQ LCMV-BSS-1 does; its detailed step-by-step implementation is summarized as follows.
Algorithm 2 SQ LCMV-BSS-2
Step 1: Initial conditions
(i) $n_{\mathrm{BS}} = p$, the number of selected multiple bands, determined by VD.
(ii) Let $\Omega_p^{(0)} = \{B_1^{(0)}, B_2^{(0)}, \ldots, B_p^{(0)}\}$ with $B_1^{(0)} = B_1, B_2^{(0)} = B_2, \ldots, B_p^{(0)} = B_p$ uniformly selected from the band set Ω.
(iii) Calculate
    $\mathrm{MV}(\Omega_p^{(0)}) = \mathbf{c}^T (\mathbf{D}_{\Omega_p^{(0)}}^T \mathbf{R}_{\Omega_p^{(0)}}^{-1} \mathbf{D}_{\Omega_p^{(0)}})^{-1} \mathbf{c}$.
Step 2: Outer loop
 For $l = 1, \ldots, L$ do
  Step 3: Inner loop
   For $j = 1, \ldots, p$ do
    Find an index $j^*$ by
      $j^* = \arg\{\min_{1 \le j \le p} \mathrm{MV}(\Omega_p^{(j)})\}$
    with
      $\mathrm{MV}(\Omega_p^{(j)}) = \mathbf{c}^T (\mathbf{D}_{\Omega_p^{(j)}}^T \mathbf{R}_{\Omega_p^{(j)}}^{-1} \mathbf{D}_{\Omega_p^{(j)}})^{-1} \mathbf{c}$,
    which specifies the band to be replaced by the lth band $B_l$. Such a band is now denoted by $B_{j^*}^{(l+1)}$. A new set of bands is then produced by letting $B_{j^*}^{(l+1)} = B_l$ and $B_j^{(l+1)} = B_j^{(l)}$ for $j \ne j^*$.
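Under a literal reading of Algorithm 2, the only change from the previous sketch is that the best trial subset for each band $B_l$ is adopted unconditionally; a minimal variant, again using the illustrative mv_of_subset helper, is shown below.

```python
import numpy as np

def sq_lcmv_bss_2(X, D, c, p):
    """Sequential LCMV-BSS, version 2 (illustrative sketch, not the authors' code)."""
    L = X.shape[1]
    subset = list(np.linspace(0, L - 1, p).astype(int))
    for l in range(L):
        trial_mvs = [mv_of_subset(X, D, c, subset[:j] + [l] + subset[j + 1:])
                     for j in range(p)]
        # unconditionally adopt the trial subset with the smallest minimum variance;
        # unlike version 1, no comparison against the previous MV value is made
        subset[int(np.argmin(trial_mvs))] = l
    return subset, mv_of_subset(X, D, c, subset)
```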

4.2. SC LCMV-BSS

A second type of LCMV-BSS algorithm is SC LCMV-BSS, which reverses the two loops implemented in SQ LCMV-BSS: the outer loop iterates over the p positions of the band subset, while the inner loop sweeps all L bands and computes $\mathrm{MV}(\Omega_{\mathrm{BS}})$ in (7). Its detailed step-by-step implementation is provided in the following.
Algorithm 3 SC LCMV-BSS
Step 1: Initial conditions
(i) $n_{\mathrm{BS}} = p$, the number of selected multiple bands, determined by VD.
(ii) Let $\Omega_p^{(0)} = \{B_1^{(0)}, B_2^{(0)}, \ldots, B_p^{(0)}\}$ with $B_1^{(0)} = B_1, B_2^{(0)} = B_2, \ldots, B_p^{(0)} = B_p$ uniformly selected from the band set Ω.
(iii) Calculate
    $\mathrm{MV}(\Omega_p^{(0)}) = \mathbf{c}^T (\mathbf{D}_{\Omega_p^{(0)}}^T \mathbf{R}_{\Omega_p^{(0)}}^{-1} \mathbf{D}_{\Omega_p^{(0)}})^{-1} \mathbf{c}$.
Step 2: Outer loop
 For $j = 1, \ldots, p$ do
  Step 3: Inner loop
   For $l = 1, \ldots, L$ do
    Find
      $B_j^{(*)} = \arg\{\min_{B_l \in \Omega} \mathrm{MV}(\tilde{\Omega}_p^l)\}$
    where $\Omega = \Omega - \{B_1^{(*)}, \ldots, B_{j-1}^{(*)}, B_{j+1}^{(p)}, \ldots, B_p^{(p)}\}$ and $\tilde{\Omega}_p^l = \{B_1^{(*)}, \ldots, B_{j-1}^{(*)}, B_l, B_{j+1}^{(p)}, \ldots, B_p^{(p)}\}$.
Step 4: Output the final band subset $\{B_1^{(*)}, B_2^{(*)}, \ldots, B_p^{(*)}\}$.
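A minimal sketch of SC LCMV-BSS is given below, once more based on the illustrative mv_of_subset helper; in each outer iteration one position of the subset is re-optimized over all bands not already held at another position.

```python
import numpy as np

def sc_lcmv_bss(X, D, c, p):
    """Successive LCMV-BSS (illustrative sketch, not the authors' code)."""
    L = X.shape[1]
    subset = list(np.linspace(0, L - 1, p).astype(int))   # Step 1: uniform initialization
    # Step 2: outer loop over the p positions of the subset
    for j in range(p):
        best_l, best_mv = subset[j], np.inf
        # Step 3: inner loop sweeps every band B_l into position j
        for l in range(L):
            if l in subset and l != subset[j]:
                continue                                   # keep bands held at other positions
            mv = mv_of_subset(X, D, c, subset[:j] + [l] + subset[j + 1:])
            if mv < best_mv:
                best_l, best_mv = l, mv
        subset[j] = best_l                                 # Step 4 accumulates B_j^(*)
    return subset
```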

5. Real Image Experiments

Three popular real hyperspectral images, Purdue University’s Indiana Indian Pines, Salinas, and University of Pavia, available at http://www.ehu.eus/ccwintco/index.php?title=Hyperspectral_Remote_Sensing_Scenes, were used in the experiments. The detailed data descriptions and Matlab data files can also be found on this website.

5.1. Purdue Indiana Indian Pines Scene

The first image scene used for experiments is an airborne visible/infrared imaging spectrometer (AVIRIS) hyperspectral data set from the Purdue Indiana Indian Pines test site, shown in Figure 1a, with its ground truth of 17 classes shown in Figure 1b. It has a size of 145 × 145 pixel vectors and was taken over an area of mixed agriculture and forestry in Northwestern Indiana, USA, with details of bands and wavelengths given in the caption. The data set is available at https://purr.purdue.edu/publications/1947/serve/1?el=1. It was recorded in June 1992 with 220 bands, which include water absorption bands (bands 104–108, 150–163, and 220).

5.2. Salinas

A second set of AVIRIS data used for experiments was the Salinas scene shown in Figure 2a, which was captured by the AVIRIS sensor over Salinas Valley, California, with a spatial resolution of 3.7 m per pixel and spectral resolution of 10 nm. It has a size of 512 × 217 × 224 . Figure 2b,c show the color composite of the Salinas image along with the corresponding ground truth class labels.

5.3. ROSIS Data

The last hyperspectral image used for experiments was the University of Pavia image shown in Figure 3, which covers an urban area surrounding the University of Pavia, Italy. It was recorded by the ROSIS-03 sensor. It is of size 610 × 340 × 115 with a spatial resolution of 1.3 m per pixel and spectral coverage ranging from 0.43 to 0.86 μm with a spectral resolution of 4 nm (the 12 noisiest channels were removed before the experiments). Nine classes of interest, plus a background (BKG) class (class 0), were considered for this image.
In the following experiments, four types of BS methods were tested for a comparative study and analysis.
  • Uniform band selection (UBS): According to our extensive experiments, UBS is a reasonably good BS method, as has also been reported in the literature. It does not require any prior knowledge or BS criterion and is the simplest BS method.
  • MEAC: This uses the minimum covariance derived from the estimated abundance matrix, which is similar to the minimum variance in (5). It also represents the category of SQMBS methods.
  • MDPP and DSEBS: Both represent the category of SMMBS methods. They make use of graph representations to specify band groups. Most importantly, these two methods were compared with CEM/LCMV-based methods in [26] and both are also based on the LCMV formulation specified by (2).
  • LCMV-BSS developed in this paper: This represents the category of BSS methods using the LCMV formulation in (2).
As noted in the introduction and in Section 3, although PSO, FA, and MTSP are also SMMBS methods, they are not compared in this paper for the following reasons. First, their design rationale is completely different from that of LCMV-BSS. Secondly, the initial candidate sets from which their search algorithms find an optimal band subset are random and also very small, so their results are neither representative nor reproducible. Thirdly, the details of the parameters they used were not specified in their papers. Therefore, it is very difficult to implement their algorithms for fair comparisons.
Table 1 tabulates the number $n_{\mathrm{BS}}$ of selected bands estimated for the three scenes using the Harsanyi–Farrand–Chang (HFC) and noise-whitened HFC (NWHFC) methods developed for VD in [69,74,75], where $n_{\mathrm{BS}}$ was determined to be 18 for Purdue’s data, 21 for Salinas, and 14 for University of Pavia with a false alarm probability of $10^{-4}$.
Table 2 lists the bands selected by seven BS methods—uniform BS (UBS), minimum estimated abundance covariance (MEAC), multigraph determinantal point process (MDPP), dominant set extraction BS (DSEBS), SQ LCMV-BSS-1, SQ LCMV-BSS-2, and SC LCMV-BSS—for the three scenes; nBS = 18 for Purdue’s Indian Pines, nBS = 21 for Salinas, and nBS = 14 for University of Pavia.
In order to perform HSIC, choosing an appropriate classifier is crucial. Recently, Yu et al. [76] developed a new classifier, called the iterative multiclass constrained background suppression classifier (IMCBSC), and further demonstrated that IMCBSC performed well in both overall accuracy rate (POA) and precision rate (PR). Since IMCBSC was also derived from LCMV and implemented by LCMV in an iterative manner, the name iterative linearly constrained minimum variance (ILCMV) is used in this paper instead of IMCBSC to reflect its origin in LCMV and its iterative algorithmic implementation. Most importantly, ILCMV was adopted for two main reasons. One is the work in [76], which showed that ILCMV could perform at least comparably in POA and significantly better than the work in [12]. The other is that ILCMV is indeed derived from the LCMV criterion specified by (2), so it is natural to use ILCMV to perform classification.
Two remarks on the implementation of ILCMV are noteworthy.
  • Unlike most supervised classifiers used for HSIC which require training samples, ILCMV only needs the knowledge of the class signatures D, which can be obtained by either prior knowledge or class sample means. Specifically, the class signatures in D are not necessarily real data samples.
  • Also, unlike most supervised classifiers used for HSIC, which require test and training data samples from the same class, the test samples for ILCMV can be selected from any arbitrary class, including the BKG class, and are not necessarily limited to the classes represented by the training samples. This is a crucial difference between ILCMV and existing hyperspectral image classification algorithms reported in the literature. For more details, we refer to [23,76].
Figure 4c–i, Figure 5c–i and Figure 6c–i show classification maps produced by ILCMV, using bands selected in Table 2 by seven BS methods—UBS, MEAC, MDPP, DSEBS, SQ LCMV-BSS-1, SQ LCMV-BSS-2, and SC LCMV-BSS, respectively—where the ground truth map and classification map produced by the full bands are also included in (a) and (b), respectively, for comparison.
Apparently, it is difficult to see any appreciable difference among the classification results in Figure 4, Figure 5 and Figure 6 by visual inspection. In this case, to better evaluate each BS method, a quantitative analysis is necessary. It has been shown in [23,76] that the overall accuracy, POA, alone may not be sufficient to evaluate the effectiveness of classification performance. To address this issue, two additional measures developed in [23,76], the precision rate PR and the detection rate PD (also known as recall rate), were introduced for HSIC; PR and PD have been widely used in pattern recognition fields such as medical imaging, handwritten character recognition, and biometric recognition. The definitions and details of POA, PR, and PD can be found in [23,76].
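For reference, the three measures can be computed from a confusion matrix along the following lines; this is a generic sketch of the standard per-class recall (PD), overall accuracy (POA), and per-class precision (PR), and the exact definitions used in the paper are those given in [23,76].

```python
import numpy as np

def classification_rates(conf):
    """Compute standard rates from a confusion matrix conf[i, j], the number of
    class-i samples labeled as class j (generic sketch; see [23,76] for the
    exact definitions used in the paper)."""
    tp = np.diag(conf).astype(float)
    pd_per_class = tp / conf.sum(axis=1)   # detection (recall) rate PD for each class
    poa = tp.sum() / conf.sum()            # overall accuracy POA
    pr_per_class = tp / conf.sum(axis=0)   # precision rate PR for each class
    return pd_per_class, poa, pr_per_class
```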
Table 3, Table 4 and Table 5 show PD, POA, and PR calculated by the ILCMV classification results in Figure 4, Figure 5 and Figure 6 using the bands selected in Table 2 for Purdue’s data, Salinas, and University of Pavia, respectively, where the best results with highest rates are shown in boldface. Here, we would like to point out a crucial fact used in the experiments, as noted in the second remark described above, where the PD, POA, and PR were calculated by including the background (BKG) for classification because LCMV is particularly designed to take care of the BKG issue in classification, as shown in [76]. This is quite different from many reports which calculate POA excluding BKG from classification, such as [12].
Since PD varies with each class, it is difficult to use it to evaluate overall classification performance, so our analysis is based on POA and PR. As we can see from the tables, SQ LCMV-BSS-2 and SC LCMV-BSS outperformed all the other five BS methods in terms of POA and PR for the Salinas and University of Pavia scenes, but were slightly worse than MDPP in POA and DSEBS in PR for the Purdue data. Interestingly, MDPP and DSEBS produced the best results in terms of POA and PR, respectively, for the Purdue data. As also noted in Table 3, Table 4 and Table 5, the POA and PR using full bands were generally not as good as those produced by most of the tested BS methods, and were even worse than those produced by UBS. These experiments showed that hyperspectral image classification can benefit greatly from the judicious selection of bands with an appropriately determined nBS.
Table 6 tabulates the computing times in seconds for each of the six BS methods in a computer environment with a 1.6 GHz Intel Core i5, OS X El Capitan, and 4 GB of 1600 MHz DDR3 memory; the software used to run the experiments was Matlab R2014b. The best time was achieved by DSEBS, followed by SC LCMV-BSS and SQ LCMV-BSS. The worst time was achieved by MDPP for the Purdue data and by MEAC for Salinas and University of Pavia.
As noted above, a classifier can also have a significant impact on BS, especially when BKG is included for consideration. A recent work [12] developed four edge-preserving filtering (EPF)-based techniques for HSIC (EPF-B-c, EPF-G-c, EPF-B-g, and EPF-G-g) and conducted a comprehensive comparative analysis to show that these methods indeed performed better than most recently developed spectral–spatial techniques. Therefore, in what follows, we conducted experiments to evaluate the performance of ILCMV in comparison with these four EPF-based techniques, with BKG specifically included for classification. To do so, we also implemented the four EPF-based techniques, where “B” and “G” specify the bilateral filter and the guided filter, respectively, and “g” and “c” indicate that the first principal component or the color composite of three principal components is used as the reference image [12].
Table 7, Table 8, Table 9, Table 10, Table 11, Table 12, Table 13, Table 14 and Table 15 tabulate the POA and PR rates produced by the four EPF-based methods and ILCMV, all of which included BKG for classification and used the bands selected in Table 2 for the three image scenes. Data for the Purdue image are shown in Table 7, Table 8 and Table 9 using bands selected by SQ LCMV-BSS-1, SQ LCMV-BSS-2, and SC LCMV-BSS; data for Salinas are shown in Table 10, Table 11 and Table 12 using bands selected by SQ LCMV-BSS-1, SQ LCMV-BSS-2, and SC LCMV-BSS; and data for University of Pavia are shown in Table 13, Table 14 and Table 15 using bands selected by SQ LCMV-BSS-1, SQ LCMV-BSS-2, and SC LCMV-BSS. In addition, their computing times in seconds are included in the tables for comparison.
Several interesting findings can be derived from the results in Table 7, Table 8, Table 9, Table 10, Table 11, Table 12, Table 13, Table 14 and Table 15.
  • It is very obvious that BSS did improve the ILCMV classification results. Such an improvement cannot be found in the four EPF-based methods, whose classification results using band subsets only became worse compared with the results using full bands. This may be because the four EPF-based methods use principal component analysis (PCA) to compress the original data as preprocessing, which already retains the crucial information provided by the full bands.
  • The precision rates produced by the four EPF-based methods were very low as also noted in [23,76]. However, ILCMV using bands selected by LCMV-BSS consistently performed very well in both POA and PR.
  • According to Table 7, Table 8 and Table 9, ILCMV performed slightly better than the four EPF-based methods in POA but significantly better in PR for Purdue’s data and Salinas. The University of Pavia scene is interesting, as shown in Table 13, Table 14 and Table 15. The four EPF-based methods performed very well in POA but very poorly in PR, at only about 20%. Furthermore, the POA produced by ILCMV may not be as good as that produced by the four EPF-based methods (about 10% lower), but the PR produced by ILCMV was around 96%, nearly 4.8 times the roughly 20% produced by the four EPF-based methods. These experiments demonstrated that the BKG issue is critical in the analysis of the University of Pavia data and cannot be ignored or discarded in data processing. Unfortunately, this BKG issue has never been investigated in the past.
  • Unlike the four EPF-based methods, which performed well in POA but very poorly in PR, ILCMV consistently performed well in both POA and PR, and even better when implemented in conjunction with BSS, a case in which the EPF-based methods actually failed, as shown in Table 7, Table 8, Table 9, Table 10, Table 11, Table 12, Table 13, Table 14 and Table 15.
  • Last but not least, BS is heavily determined by three factors: the data to be processed, the BS method selected, and the classifier used. Unfortunately, most works on BS for hyperspectral image classification have been focused on the design and development of BS methods but very little has been reported on performance evaluation of different classifiers which use the same set of bands selected by a BS method. For example, as shown in Table 7, Table 8, Table 9, Table 10, Table 11, Table 12, Table 13, Table 14 and Table 15, if the four EPF methods were implemented by BS, their classification results could not be improved, but those of ILCMV could.
  • It should be noted that PD results are not included in Table 7, Table 8, Table 9, Table 10, Table 11, Table 12, Table 13, Table 14 and Table 15 for two reasons. One is that the PD results using full bands are already available in [23,76]. The other is that the EPF-based methods using partial bands did not perform better than their counterparts using full bands, so there is little point in including their results; in addition, space is limited.

6. Conclusions

This paper developed an SMMBS method, called LCMV-BSS, which selects multiple bands as a band subset using LCMV to linearly constrain class signature vectors as a criterion for selecting an optimal band subset. It is completely different from existing BS methods and makes the following contributions: (i) it is a BSS method particularly developed for HSIC; (ii) it is quite different from the single-band constrained methods in [26] and the multiple-band constrained methods in [68], since it constrains multiple class signature vectors instead of multiple bands; (iii) it develops three numerical search algorithms to find optimal band subsets, which differ from the graph-based approaches [40,43] used by other SMMBS methods; (iv) it is very simple to implement via (7), with no parameters needing to be tuned; (v) most importantly, it shows that HSIC can be improved by BS provided that the number nBS of selected bands and the set of nBS bands are properly selected.

Acknowledgments

The work of C.Y. is supported by the Natural Science Foundation of Liaoning Province (20170540095). The work of M.S. is supported by the National Natural Science Foundation of China (61601077) and the State Key Laboratory of Integrated Services Networks. The work of C.-I.C. is supported by the Fundamental Research Funds for the Central Universities under Grant 3132016331. The authors would like to thank Xiaoqiang Lu for helping run the MDPP method with the same parameters used in [43] and the authors of ref. [40] for providing their software to run DSEBS.

Author Contributions

C.Y. and M.S. conceived and designed the experiments; C.Y. performed the experiments; C.Y. and C.-I.C. analyzed the data; C.Y. and M.S. contributed reagents/materials/analysis tools; C.-I.C. wrote the paper.

Conflicts of Interest

All authors have declared no conflict of interest.

References

  1. Melgani, F.; Bruzzone, L. Classification of hyperspectral remote sensing images with support vector machines. IEEE Trans. Geosci. Remote Sens. 2004, 42, 1778–1790. [Google Scholar] [CrossRef]
  2. Benediktsson, J.A.; Palmason, J.A.; Sveinsson, J.R. Classification of hyperspectral data from urban areas based on extended morphological profiles. IEEE Trans. Geosci. Remote Sens. 2005, 43, 480–491. [Google Scholar] [CrossRef]
  3. Camps-Valls, G.; Bruzzone, L. Kernel-based methods for hyperspectral image classification. IEEE Trans. Geosci. Remote Sens. 2005, 43, 1351–1362. [Google Scholar] [CrossRef]
  4. Tarabalka, Y.; Benediktsson, J.A.; Chanussot, J. Spectral-spatial classification of hyperspectral imagery based on partitional clustering techniques. IEEE Trans. Geosci. Remote Sens. 2009, 47, 2973–2987. [Google Scholar] [CrossRef]
  5. Tarabalka, Y.; Fauvel, M.; Chanussot, J.; Benediktsson, J.A. SVM- and MRF-based method for accurate classification of hyperspectral images. IEEE Geosci. Remote Sens. Lett. 2010, 9, 736–740. [Google Scholar] [CrossRef]
  6. Li, J.; Bioucas-Dias, J.M.; Plaza, A. Semisupervised hyperspectral image segmentation using multinomial logistic regression with active learning. IEEE Trans. Geosci. Remote Sens. 2010, 48, 4085–4098. [Google Scholar]
  7. Zhang, B.; Li, S.; Jia, X.; Gao, L.; Peng, M. Adaptive Markov random field approach for classification of hyperspectral imagery. IEEE Geosci. Remote Sens. Lett. 2011, 8, 4085–4098. [Google Scholar] [CrossRef]
  8. Li, J.; Bioucas-Dias, J.M.; Plaza, A. Hyperspectral image segmentation using a new Bayesian approach with active learning. IEEE Trans. Geosci. Remote Sens. 2011, 49, 3947–3960. [Google Scholar] [CrossRef]
  9. Chen, Y.; Nasrabadi, N.M.; Tran, T.D. Hyperspectral image classification using dictionary-based sparse representation. IEEE Trans. Geosci. Remote Sens. 2011, 49, 3973–3985. [Google Scholar] [CrossRef]
  10. Tarabalka, Y.; Fauvel, M.; Chanussot, J.; Benediktsson, J.A. Segmentation and classification of hyperspectral images using minimum spanning forest grown from automatically selected markers. IEEE Trans. Syst. Man Cybern. Part B Cybern. 2011, 40, 1267–1279. [Google Scholar] [CrossRef] [PubMed]
  11. Fauvel, M.; Tarabalka, Y.; Benediktsson, J.A.; Chanussot, J.; Tilton, J. Advances in spectral-spatial classification of hyperspectral images. Proc. IEEE 2013, 101, 652–675. [Google Scholar] [CrossRef]
  12. Kang, X.; Li, S.; Benediktsson, J.A. Spectral-spatial hyperspectral image classification with edge-preserving filtering. IEEE Trans. Geosci. Remote Sens. 2014, 52, 2666–2677. [Google Scholar] [CrossRef]
  13. Fu, W.; Li, S.; Fang, L.; Kang, X.; Benediktsson, J.A. Hyperspectral image classification via shape-adaptive joint sparse representation. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2016, 9, 556–567. [Google Scholar] [CrossRef]
  14. Kang, X.; Li, S.; Fang, L.; Li, M.; Benediktsson, J.A. Extended random walker-based classification of hyperspectral images. IEEE Trans. Geosci. Remote Sens. 2015, 53, 144–153. [Google Scholar] [CrossRef]
  15. Lu, T.; Li, S.; Fang, L.; Bruzzone, L.; Benediktsson, J.A. Set-to-set distance-based spectral spatial classification of hyperspectral images. IEEE Trans. Geosci. Remote Sens. 2016, 54, 7122–7134. [Google Scholar] [CrossRef]
  16. Guo, X.; Huang, X.; Zhang, L.; Plaza, A.; Benediktsson, J.A. Support tensor machines for classification of hyperspectral remote sensing imagery. IEEE Trans. Geosci. Remote Sens. 2016, 54, 3248–3264. [Google Scholar] [CrossRef]
  17. Ma, X.; Wang, H.; Wang, J. Semisupervised classification of hyperspectral image based on multi-decision labeling and deep feature learning. ISPRS J. Photogramm. Remote Sens. 2016, 120, 99–107. [Google Scholar] [CrossRef]
  18. Camps-Valls, G.; Gomez-Chova, L.; Munoz-Mari, J.; Vila-Frances, J.; Calpe-Maravilla, J. Composite kernels for hyperspectral image classification. IEEE Geosci. Remote Sens. Lett. 2006, 3, 93–97. [Google Scholar] [CrossRef]
  19. Gurram, P.; Kwon, H. Contextual SVM using Hilbert space embedding for hyperspectral classification. IEEE Geosci. Remote Sens. Lett. 2013, 10, 1031–1035. [Google Scholar] [CrossRef]
  20. Peng, J.; Zhou, Y.; Chen, C. Region-kernel-based support vector machines for hyperspectral image classification. IEEE Trans. Geosci. Remote Sens. 2015, 53, 4810–4824. [Google Scholar] [CrossRef]
  21. Sun, S.; Zhong, P.; Xiao, H.; Wang, R. Active learning with Gaussian process classifier for hyperspectral image classification. IEEE Trans. Geosci. Remote Sens. 2015, 53, 1746–1760. [Google Scholar] [CrossRef]
  22. Pullanagari, R.; Kereszturi, G.; Yule, I.J.; Ghamisi, P. Assessing the performance of multiple spectral-spatial features of a hyperspectral image for classification of urban land cover classes using support vector machines and artificial neural network. J. Appl. Remote Sens. 2017, 11, 026009-1–026009-21. [Google Scholar] [CrossRef]
  23. Xue, B.; Yu, C.; Wang, Y.; Song, M.; Li, S.; Wang, L.; Chen, H.M.; Chang, C.-I. A subpixel target detection approach to hyperspectral image classification. IEEE Trans. Geosci. Remote Sens. 2017, 55, 5093–5114. [Google Scholar] [CrossRef]
  24. Chang, C.-I. Hyperspectral Data Processing: Signal Processing Algorithm Design and Analysis; Wiley: Hoboken, NJ, USA, 2013. [Google Scholar]
  25. Chang, C.-I.; Du, Q.; Sun, T.S.; Althouse, M.L.G. A joint band prioritization and band decorrelation approach to band selection for hyperspectral image classification. IEEE Trans. Geosci. Remote Sens. 1999, 37, 2631–2641. [Google Scholar] [CrossRef]
  26. Chang, C.-I.; Wang, S. Constrained band selection for hyperspectral imagery. IEEE Trans. Geosci. Remote Sens. 2006, 44, 1575–1585. [Google Scholar] [CrossRef]
  27. Huang, R.; He, M. Band selection based on feature weighting for classification of hyperspectral data. IEEE Geosci. Remote Sens. Lett. 2005, 2, 156–159. [Google Scholar] [CrossRef]
  28. Keshava, N. Distance metrics and band selection in hyperspectral processing with applications to material identification. IEEE Trans. Geosci. Remote Sens. 2004, 42, 1552–1565. [Google Scholar] [CrossRef]
  29. Mausel, P.W.; Kramber, W.J.; Lee, J.K. Optimum band selection for supervised classification of multispectral data. Photogramm. Eng. Remote Sens. 1990, 56, 55–60. [Google Scholar]
  30. Jia, S.; Ji, Z.; Qian, Y.-Y.; Shen, L.-L. Unsupervised band selection for hyperspectral imagery classification without manual band removal. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2012, 5, 531–543. [Google Scholar] [CrossRef]
  31. Stearns, S.D.; Wilson, B.E.; Peterson, J.R. Dimensionality reduction by optimal band selection for pixel classification of hyperspectral imagery. Proc. SPIE 1993, 2028, 118–127. [Google Scholar]
  32. Backer, S.; Kempeneers, P.; Debruyn, W.; Scheunders, P. Band selection for hyperspectral remote sensing. IEEE Geosci. Remote Sens. Lett. 2005, 2, 319–323. [Google Scholar]
  33. Du, Q.; Yang, H. Similarity-based unsupervised band selection for hyperspectral image analysis. IEEE Geosci. Remote Sens. Lett. 2008, 5, 564–568. [Google Scholar] [CrossRef]
  34. Yang, H.; Du, Q. An efficient method for supervised hyperspectral band selection. IEEE Geosci. Remote Sens. Lett. 2011, 8, 138–142. [Google Scholar] [CrossRef]
  35. Su, H.; Du, Q.; Chen, G.; Du, P. Optimized hyperspectral band selection using particle swarm optimization. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2014, 7, 2659–2670. [Google Scholar] [CrossRef]
  36. Su, H.; Yang, B.; Du, Q. Hyperspectral band selection using improved firefly algorithm. IEEE Geosci. Remote Sens. Lett. 2016, 13, 68–72. [Google Scholar] [CrossRef]
  37. Wei, W.; Du, Q.; Younan, N.H. Fast supervised hyperspectral band selection using graphics processing unit. J. Appl. Remote Sens. 2012, 6, 061504-1–061504-12. [Google Scholar]
  38. Yuan, Y.; Zhu, G.; Wang, Q. Hyperspectral band selection by multitask sparsity pursuit. IEEE Trans. Geosci. Remote Sens. 2015, 53, 631–644. [Google Scholar] [CrossRef]
  39. Lu, X.; Li, X.; Mou, L. Semi-supervised multitask learning for scene recognition. IEEE Trans. Cybern. 2015, 45, 1967–1976. [Google Scholar] [PubMed]
  40. Zhu, G.; Huang, H.; Lei, J.; Bi, Z.; Xu, F. Unsupervised hyperspectral band selection by dominant set extraction. IEEE Trans. Geosci. Remote Sens. 2016, 54, 227–239. [Google Scholar] [CrossRef]
  41. Yuan, Y.; Lin, J.; Wang, Q. Dual-clustering-based hyperspectral band selection by contextual analysis. IEEE Trans. Geosci. Remote Sens. 2016, 54, 1431–1445. [Google Scholar] [CrossRef]
  42. Wang, Q.; Lin, J.; Yuan, Y. Salient band selection for hyperspectral image classification via manifold ranking. IEEE Trans. Neural Netw. Learn. Syst. 2016, 27, 1279–1289. [Google Scholar] [CrossRef] [PubMed]
  43. Yuan, Y.; Zheng, X.; Lu, X. Discovering diverse subset for unsupervised hyperspectral band selection. IEEE Trans. Image Process. 2017, 26, 51–64. [Google Scholar] [CrossRef] [PubMed]
  44. Lu, X.; Yuan, Y.; Zheng, X. Joint dictionary learning for multispectral change detection. IEEE Trans. Cybern. 2017, 47, 884–897. [Google Scholar] [CrossRef] [PubMed]
  45. Lu, X.; Zheng, X.; Yuan, Y. Remote sensing scene classification by unsupervised representation learning. IEEE Trans. Geosci. Remote Sens. 2017, 55, 5148–5157. [Google Scholar] [CrossRef]
  46. Feng, J.; Jiao, L.C.; Zhang, X.; Sun, T. Hyperspectral band selection based on trivariate mutual information and clonal selection. IEEE Trans. Geosci. Remote Sens. 2014, 52, 4092–4105. [Google Scholar] [CrossRef]
  47. Feng, J.; Jiao, L.C.; Liu, F.; Sun, T.; Zhang, X. Mutual-information-based semi-supervised hyperspectral band selection with high discrimination, high information and low redundancy. IEEE Trans. Geosci. Remote Sens. 2015, 53, 2956–2969. [Google Scholar] [CrossRef]
  48. Wang, C.; Gong, M.; Zhang, M.; Chan, Y. Unsupervised hyperspectral image band selection via column subset selection. IEEE Geosci. Remote Sens. Lett. 2015, 12, 1411–1415. [Google Scholar] [CrossRef]
  49. Geng, X.; Sun, K.; Ji, L. Band selection for target detection in hyperspectral imagery using sparse CEM. Remote Sens. Lett. 2014, 5, 1022–1031. [Google Scholar] [CrossRef]
  50. Sun, K.; Geng, X.; Ji, L. A new sparsity-based band selection method for target detection of hyperspectral image. IEEE Geosci. Remote Sens. Lett. 2015, 12, 329–333. [Google Scholar]
  51. Zare, A.; Gader, P. Hyperspectral band selection and endmember detection using sparsity promoting priors. IEEE Geosci. Remote Sens. Lett. 2008, 5, 256–260. [Google Scholar] [CrossRef]
  52. Ball, J.E.; Bruce, L.E.; Younan, N.H. Hyperspectral pixel unmixing via spectral band selection and DC-insensitive singular value decomposition. IEEE Geosci. Remote Sens. Lett. 2007, 4, 382–386. [Google Scholar] [CrossRef]
  53. Sun, K.; Geng, X.; Ji, L.; Lu, Y. A new band selection method for hyperspectral image based on data quality. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2014, 7, 2697–2703. [Google Scholar]
  54. Sun, K.; Geng, X.; Ji, L. Exemplar component analysis: A fast band selection method for hyperspectral imagery. IEEE Geosci. Remote Sens. Lett. 2015, 12, 998–1002. [Google Scholar]
  55. Koonsanit, K.; Jaruskulchai, C.; Eiumnoh, A. Band selection for dimension reduction in hyperspectral image using integrated information gain and principal components analysis technique. Int. J. Mach. Learn. Comput. 2012, 2, 248–251. [Google Scholar] [CrossRef]
  56. Xia, W.; Wang, B.; Zhang, L. Band selection for hyperspectral imagery: A new approach based on complex networks. IEEE Geosci. Remote Sens. Lett. 2013, 10, 1229–1233. [Google Scholar] [CrossRef]
  57. Chang, C.-I.; Liu, K.-H. Progressive band selection for hyperspectral imagery. IEEE Trans. Geosci. Remote Sens. 2014, 52, 2002–2017. [Google Scholar] [CrossRef]
  58. Martínez-Usó, A.; Pla, F.; Sotoca, J.M.; García-Sevilla, P. Clustering-based hyperspectral band selection using information measures. IEEE Trans. Geosci. Remote Sens. 2007, 45, 4158–4171. [Google Scholar] [CrossRef]
  59. Yang, H.; Du, Q.; Sheng, Y. Semisupervised band clustering for dimensionality reduction of hyperspectral imagery. IEEE Geosci. Remote Sens. Lett. 2011, 8, 1135–1139. [Google Scholar]
  60. Su, H.; Du, Q. Hyperspectral band clustering and band selection for urban land cover classification. Geocarto Int. 2012, 27, 395–411. [Google Scholar] [CrossRef]
  61. Winter, M.E. N-FINDR: An algorithm for fast autonomous spectral endmember determination in hyperspectral data. Proc. SPIE 1999, 3753, 266–277. [Google Scholar]
  62. Wu, C.-C.; Chu, S.; Chang, C.-I. Sequential N-FINDR algorithm. In Proceedings of the SPIE Conference on Imaging Spectrometry XIII, San Diego, CA, USA, 10–14 August 2008. [Google Scholar]
  63. Xiong, X.; Wu, C.-C.; Chang, C.-I.; Kalpakis, K.; Chen, H.M. Fast algorithms to implement N-FINDR for hyperspectral endmember extraction. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2011, 4, 545–564. [Google Scholar] [CrossRef]
  64. Chang, C.-I. Maximum Simplex Volume-Based Endmember Extraction Algorithms. U.S. Patent 8,417,748 B2, 9 April 2013. [Google Scholar]
  65. Chang, C.-I. Real Time Progressive Hyperspectral Image Processing: Endmember Finding and Anomaly Detection; Springer: New York, NY, USA, 2016. [Google Scholar]
  66. Wang, L.; Chang, C.-I.; Wang, Y.; Xue, B.; Song, M.; Yu, C.; Li, S. Band subset selection for anomaly detection in hyperspectral imagery. IEEE Trans. Geosci. Remote Sens. 2017, 55, 4887–4898. [Google Scholar] [CrossRef]
  67. Chang, C.-I.; Lee, L.C.; Xue, B.; Song, M.; Chen, J. Channel capacity approach to band subset selection for hyperspectral imagery. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2017, 10, 4630–4644. [Google Scholar] [CrossRef]
  68. Wang, L.; Li, H.C.; Xue, B.; Chang, C.-I. Constrained band subset selection for hyperspectral imagery. IEEE Geosci. Remote Sens. Lett. 2017, 14, 2032–2036. [Google Scholar] [CrossRef]
  69. Chang, C.-I. Hyperspectral Imaging: Techniques for Spectral Detection and Classification; Kluwer Academic/Plenum Publishers: New York, NY, USA, 2003. [Google Scholar]
  70. Frost, O.L., III. An algorithm for linearly constrained adaptive array processing. Proc. IEEE 1972, 60, 926–935. [Google Scholar] [CrossRef]
  71. Chang, C.-I. Target signature-constrained mixed pixel classification for hyperspectral imagery. IEEE Trans. Geosci. Remote Sens. 2002, 40, 1065–1081. [Google Scholar] [CrossRef]
  72. Harsanyi, J.C. Detection and Classification of Subpixel Spectral Signatures in Hyperspectral Image Sequences. Ph.D. Dissertation, Department of Electrical Engineering, University of Maryland, Baltimore County, Baltimore, MD, USA, 1993. [Google Scholar]
  73. Pudil, P.; Novovicova, J.; Kittler, J. Floating search methods in feature selection. Pattern Recognit. Lett. 1994, 15, 1119–1125. [Google Scholar] [CrossRef]
  74. Chang, C.-I.; Du, Q. Estimation of number of spectrally distinct signal sources in hyperspectral imagery. IEEE Trans. Geosci. Remote Sens. 2004, 42, 608–619. [Google Scholar] [CrossRef]
  75. Harsanyi, J.C.; Farrand, W.; Chang, C.-I. Detection of subpixel spectral signatures in hyperspectral image sequences. In Proceedings of the American Congress on Surveying & Mapping (ACSM)/American Society of Photogrammetry & Remote Sensing (ASPRS) Annual Convention and Exposition, Baltimore, MD, USA, 25–28 April 1994; Volume 1, pp. 236–247. [Google Scholar]
  76. Yu, C.; Xue, B.; Wang, Y.; Song, M.; Wang, L.; Li, S.; Chang, C.-I. Multi-class constrained background suppression approach to hyperspectral image classification. In Proceedings of the 2017 IEEE/GRSS International Geoscience and Remote Sensing Symposium (IGARSS), Fort Worth, TX, USA, 23–28 July 2017. [Google Scholar]
Figure 1. Purdue’s Indiana Indian Pines scene with 16 classes. (a) Band 186, (b) ground truth map, (c) ground truth class labels.
Figure 2. Ground truth of Salinas scene with 16 classes. (a) Band 126, (b) color ground-truth image, (c) class labels.
Figure 3. Ground truth of University of Pavia scene with nine classes. (a) Band 95, (b) color ground truth image, (c) class labels.
Figure 4. Classification maps produced by iterative LCMV (ILCMV) for Purdue’s data using bands selected in Table 2. (a) Ground truth, (b) Full bands, (c) UBS, (d) MEAC, (e) MDPP, (f) DSEBS, (g) SQ LCMV-BSS-1, (h) SQ LCMV-BSS-2, (i) SC LCMV-BSS.
Figure 5. Classification maps produced by ILCMV for Salinas using bands selected in Table 2. (a) Ground truth, (b) Full bands, (c) UBS, (d) MEAC, (e) MDPP, (f) DSEBS, (g) SQ LCMV-BSS-1, (h) SQ LCMV-BSS-2, (i) SC LCMV-BSS.
Figure 6. Classification maps produced by ILCMV for Pavia using bands selected in Table 2. (a) Ground truth, (b) Full bands, (c) UBS, (d) MEAC, (e) MDPP, (f) DSEBS, (g) SQ LCMV-BSS-1, (h) SQ LCMV-BSS-2, (i) SC LCMV-BSS.
Table 1. nBS estimated by HySime and HFC/NWHFC.

Scene (HFC/NWHFC) | PF = 10^−1 | PF = 10^−2 | PF = 10^−3 | PF = 10^−4 | PF = 10^−5
Purdue | 73/21 | 49/19 | 35/18 | 27/18 | 25/17
Salinas | 32/33 | 28/24 | 25/21 | 21/21 | 20/20
Univ. of Pavia | 25/34 | 21/27 | 16/17 | 14/14 | 13/12
Table 2. Bands selected by UBS, MEAC, MDPP, DSEBS, SQ LCMV-BSS-1, SQ LCMV-BSS-2, and SC LCMV-BSS.

Data | Method | Selected Bands
Purdue Indian Pines (18 bands) | UBS | 1, 14, 27, 40, 53, 66, 79, 92, 105, 118, 131, 144, 157, 170, 183, 196, 209, 220
 | MEAC | 159, 3, 92, 96, 82, 36, 39, 55, 41, 1, 2, 33, 206, 38, 163, 17, 204, 9
 | MDPP | 10, 39, 59, 75, 79, 85, 92, 130, 140, 146, 147, 149, 150, 152, 160, 164, 175, 193
 | DSEBS | 42, 129, 97, 131, 174, 16, 176, 177, 172, 43, 192, 193, 98, 171, 99, 132, 40, 33
 | SQ LCMV-BSS-1 | 39, 164, 29, 155, 108, 66, 79, 8, 105, 42, 44, 17, 156, 150, 3, 43, 213, 41
 | SQ LCMV-BSS-2 | 38, 109, 29, 52, 163, 66, 158, 8, 164, 219, 43, 78, 157, 220, 3, 49, 218, 2
 | SC LCMV-BSS | 54, 156, 42, 159, 53, 41, 79, 91, 105, 57, 51, 43, 157, 48, 107, 160, 115, 163
Salinas (21 bands) | UBS | 1, 12, 23, 34, 45, 56, 67, 78, 89, 100, 111, 122, 133, 144, 155, 166, 177, 188, 199, 210, 224
 | MEAC | 107, 148, 203, 149, 5, 8, 105, 3, 28, 12, 18, 10, 44, 36, 25, 17, 51, 32, 110, 68, 58
 | MDPP | 1, 8, 11, 22, 27, 28, 50, 57, 58, 65, 90, 99, 105, 119, 123, 134, 142, 157, 175, 191, 204
 | DSEBS | 99, 101, 16, 119, 177, 112, 44, 46, 120, 47, 131, 175, 196, 121, 17, 102, 174, 180, 187, 135, 42
 | SQ LCMV-BSS-1 | 7, 50, 23, 48, 45, 73, 65, 15, 40, 19, 80, 122, 38, 41, 42, 46, 78, 47, 200, 37, 2
 | SQ LCMV-BSS-2 | 7, 42, 56, 28, 45, 58, 67, 15, 41, 19, 50, 122, 38, 34, 36, 47, 224, 46, 183, 37, 172
 | SC LCMV-BSS | 18, 39, 41, 31, 45, 44, 67, 78, 90, 101, 40, 91, 42, 141, 46, 48, 102, 185, 47, 86, 50
Univ. of Pavia (14 bands) | UBS | 1, 9, 17, 25, 33, 41, 49, 57, 65, 73, 81, 89, 97, 103
 | MEAC | 1, 23, 24, 40, 42, 58, 56, 59, 48, 31, 47, 83, 25, 54
 | MDPP | 2, 23, 44, 46, 50, 62, 66, 73, 89, 91, 92, 93, 96, 102
 | DSEBS | 86, 102, 64, 20, 21, 63, 65, 6, 19, 22, 7, 66, 95, 67
 | SQ LCMV-BSS-1 | 1, 4, 55, 16, 95, 83, 84, 93, 39, 77, 91, 102, 92, 103
 | SQ LCMV-BSS-2 | 1, 4, 38, 76, 85, 55, 84, 102, 16, 83, 93, 89, 92, 103
 | SC LCMV-BSS | 1, 4, 84, 16, 38, 102, 85, 92, 83, 72, 95, 91, 96, 103
Table 3. PD, POA, and PR calculated from the classification results in Figure 4 for Purdue’s data. Each cell gives PD/PR (%).

Class | Full Bands | UBS | MEAC | MDPP | DSEBS | SQ LCMV-BSS-1 | SQ LCMV-BSS-2 | SC LCMV-BSS
1 | 95.65/100 | 95.65/100 | 93.48/100 | 95.65/100 | 95.65/100 | 95.65/100 | 97.83/100 | 100/100
2 | 96.01/100 | 97.13/99.57 | 93.07/99.63 | 96.08/100 | 96.99/100 | 95.59/99.71 | 93.78/99.85 | 94.89/99.85
3 | 96.99/99.88 | 96.51/100 | 96.27/100 | 97.35/100 | 97.23/99.88 | 96.39/100 | 95.67/100 | 94.10/100
4 | 98.73/100 | 98.73/100 | 98.31/100 | 99.58/100 | 98.31/100 | 97.89/100 | 98.31/100 | 98.31/100
5 | 89.44/100 | 90.68/100 | 91.51/100 | 92.34/100 | 93.58/100 | 91.93/100 | 92.34/100 | 92.96/100
6 | 97.12/100 | 97.67/100 | 97.40/99.58 | 96.71/100 | 97.12/100 | 96.44/100 | 97.95/100 | 95.75/100
7 | 100/100 | 100/100 | 100/100 | 100/100 | 100/100 | 100/100 | 100/100 | 100/100
8 | 98.78/100 | 98.54/100 | 99.16/100 | 97.49/100 | 97.91/100 | 97.91/100 | 99.16/100 | 98.95/100
9 | 100/100 | 100/100 | 90.00/100 | 100/100 | 100/100 | 90.91/100 | 95.24/100 | 100/100
10 | 93.93/99.78 | 91.98/100 | 93.31/100 | 94.65/99.78 | 93.00/100 | 94.24/100 | 91.98/99.58 | 91.98/100
11 | 94.70/99.87 | 96.13/99.96 | 94.55/98.22 | 95.48/99.87 | 95.85/100 | 95.48/99.96 | 96.17/100 | 95.93/99.49
12 | 95.45/100 | 94.94/100 | 96.29/100 | 96.80/100 | 97.30/100 | 95.95/100 | 95.11/100 | 96.63/100
13 | 98.54/100 | 98.54/100 | 99.02/100 | 97.56/100 | 96.59/100 | 97.56/100 | 98.54/100 | 98.54/100
14 | 93.52/100 | 94.15/100 | 94.78/100 | 94.70/100 | 94.55/100 | 95.89/100 | 96.05/100 | 96.13/100
15 | 90.67/100 | 95.60/100 | 92.49/100 | 96.89/100 | 93.52/100 | 94.82/100 | 94.56/100 | 96.11/100
16 | 98.92/98.92 | 98.92/98.92 | 98.92/100 | 98.92/98.92 | 98.92/100 | 98.92/100 | 97.85/100 | 95.70/97.80
POA | 95.09 | 95.69 | 94.91 | 95.89 | 95.88 | 95.67 | 95.48 | 95.46
PR | 97.61 | 97.90 | 97.52 | 98.00 | 97.99 | 97.89 | 97.80 | 97.79
Table 4. PD, POA, and PR calculated from the classification results in Figure 5 for Salinas.
Class | Full Bands | UBS | MEAC | MDPP | DSEBS | SQ LCMV-BSS-1 | SQ LCMV-BSS-2 | SC LCMV-BSS
(each method column reports PD and PR)
195.5210097.1610097.7110097.7610097.1610096.3710097.0110096.91100
298.4210098.8510098.4410097.9910099.1710098.7910098.3610098.71100
393.7899.7095.5010094.0310093.9810095.6510090.4410095.1410095.95100
495.6210094.6998.8094.3397.8497.4998.7694.7499.6296.5698.3995.9199.1192.0494.83
596.9010096.4510095.1999.8895.2210096.9099.8595.8710095.9410090.7899.79
698.7910098.5910098.5610098.7910098.5610097.9510098.9110097.75100
798.6310098.2110098.1810097.9910097.6510098.3510098.4410098.32100
896.6998.2695.8199.3997.4099.8495.2399.7496.1199.3895.8499.0697.4710096.6199.42
995.8710095.6010094.7410095.2910095.7310094.7910094.8910095.44100
1096.6710096.3710096.3410096.4610097.2510095.7310096.5810096.77100
1197.7510097.8510091.1010097.7510098.3110095.7910097.3810097.66100
1297.1510096.1610095.5410097.4610097.6610096.3210095.3910095.43100
1396.5110096.9499.4493.3599.8896.4010095.6310087.7710097.3899.7894.0098.97
1495.8910098.1410097.6699.9097.0110098.0410097.7699.0597.2010096.9399.81
1594.0098.6695.2798.0996.5210095.4296.7095.2598.8495.4297.7396.2799.8695.8498.60
1693.3010096.0710093.8610095.0710095.0210095.6810095.1310095.41100
POA96.3796.4996.4596.2596.6395.9396.8196.21
PR98.2398.2998.2798.1798.3698.0298.4598.15
Table 5. PD, POA, and PR calculated from the classification results in Figure 6 for University of Pavia.
Class | Full Bands | UBS | MEAC | MDPP | DSEBS | SQ LCMV-BSS-1 | SQ LCMV-BSS-2 | SC LCMV-BSS
(each method column reports PD and PR)
186.4299.9087.6799.4586.4499.7687.9799.6887.7199.7484.4499.6388.0599.4488.6799.77
273.3499.9984.3899.9583.3399.8982.1499.9684.6399.9284.1499.8985.2199.9886.7699.95
379.8596.3078.9010076.6610076.1799.0279.2210076.4910074.7110078.5699.95
498.8196.6597.8495.1698.8887.9196.9591.8597.9988.7198.1488.9697.7795.3097.7093.11
591.4910089.9310091.3310093.3210087.1110093.5010090.5710090.77100
689.1099.9891.3510082.7810087.5310087.4410086.1910090.0010091.13100
781.1010083.3210076.2610075.6410076.3410082.8410082.9210082.46100
878.4685.2079.0997.3779.5197.2079.8395.7179.3097.4477.0997.1677.0998.4579.3098.96
977.2499.8775.8699.4776.3210074.0198.4476.1710080.2199.4678.2299.8677.0899.87
POA84.3285.1983.8584.3384.2584.4585.4185.92
PR96.7696.9396.6496.7696.7596.7896.9696.95
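Tables 3–5 summarize each classification map by per-class PD and PR together with POA and an overall PR row. Assuming PD and PR denote the per-class detection (recall) and precision rates computed over the labeled ground-truth pixels, and POA the overall accuracy, scores of this kind can be obtained from a classification map as sketched below. The paper's exact definitions, in particular of the summary PR row, may differ, so this is only an illustrative sketch.

import numpy as np

def classification_scores(gt, pred):
    # gt, pred: integer label maps of equal shape; label 0 is treated as unlabeled background (assumption).
    gt = np.asarray(gt).ravel()
    pred = np.asarray(pred).ravel()
    labeled = gt > 0
    poa = 100.0 * np.mean(pred[labeled] == gt[labeled])        # overall accuracy over labeled pixels
    per_class = {}
    for cls in np.unique(gt[labeled]):
        truth = gt == cls
        claimed = (pred == cls) & labeled
        pd_rate = 100.0 * np.sum(claimed & truth) / np.sum(truth)          # detection rate (recall)
        pr_rate = 100.0 * np.sum(claimed & truth) / max(np.sum(claimed), 1)  # precision over labeled pixels
        per_class[int(cls)] = (pd_rate, pr_rate)
    return per_class, poa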
Table 6. Computing time in seconds required by six test BS methods: MEAC, MDPP, DSEBS, SQ LCMV-BSS-1, SQ LCMV-BSS-2, SC LCMV-BSS.

Data | MEAC | MDPP | DSEBS | SQ LCMV-BSS-1 | SQ LCMV-BSS-2 | SC LCMV-BSS
Purdue | 13.70 | 41.14 | 0.58 | 7.00 | 7.10 | 6.93
Salinas | 83.64 | 44.66 | 5.27 | 43.43 | 46.63 | 43.55
University of Pavia | 44.53 | 29.22 | 4.62 | 16.67 | 17.52 | 16.84
Table 7. POA and PR calculated from the classification results using full bands and the bands selected by SQ LCMV-BSS-1 for the Purdue data.
Class | EPF-B-g with Full Bands | EPF-B-c with Full Bands | EPF-G-g with Full Bands | EPF-G-c with Full Bands | ILCMV with Full Bands | EPF-B-g-BS | EPF-B-c-BS | EPF-G-g-BS | EPF-G-c-BS | ILCMV-BS
110010097.8310095.6510010010010095.65
276.4787.8280.6082.9896.0174.7990.7671.9979.9095.59
393.4983.9879.6465.4296.9963.3775.6684.1072.7796.39
499.1610010096.2098.7310010010097.8997.89
593.7994.0097.1094.8289.4489.8697.5296.0794.4191.93
610099.5999.5999.4597.1298.2299.8699.7399.3296.44
792.8692.8696.4396.4310092.8664.2992.8689.29100
810010010010098.7810010010010097.91
980.0065.0010010010065.0095.0065.0010.00100
1090.5391.4687.1493.0093.9375.3173.9784.8876.5494.24
1190.6792.6786.2788.8894.7078.9870.7964.7380.9495.48
1298.3196.4693.9392.0795.4547.5542.5079.7657.6795.95
1399.0299.5199.5199.5198.5410010010010097.56
1497.7197.0097.8798.2693.5294.8694.8687.6792.5795.89
1510010099.7482.9090.6795.8596.3795.8597.4194.82
1697.8510010010098.9298.9210098.9297.8598.92
POA92.2793.4590.3289.7995.0981.6282.9481.1784.1595.67
PR44.9844.5644.0343.7797.6139.7940.4339.8641.0297.89
Time(s)196.58200.84194.09200.8725.3731.2736.7731.1436.1637.25
Table 8. POA and PR calculated from the classification results using full bands and the bands selected by SQ LCMV-BSS-2 for the Purdue data.
Class | EPF-B-g with Full Bands | EPF-B-c with Full Bands | EPF-G-g with Full Bands | EPF-G-c with Full Bands | ILCMV with Full Bands | EPF-B-g-BS | EPF-B-c-BS | EPF-G-g-BS | EPF-G-c-BS | ILCMV-BS
110010010010095.6510010010010097.83
286.8382.4986.6284.5296.0185.7881.8664.2266.6093.78
390.8485.9082.5380.7296.9981.9374.5863.9864.8295.67
499.1699.1610099.5898.7399.1610099.5810098.31
595.0392.9697.7292.3489.4493.1795.0391.7292.1392.34
610099.8699.5999.7397.1299.7397.8110099.5997.95
789.2989.2996.4396.4310092.8678.5796.4392.86100
810010010010098.7810010010010099.16
960.0075.0070.0050.0010010095.0065.0035.00100
1092.5990.3390.7492.4993.9370.2766.2664.7154.4291.98
1189.1289.3388.5186.4494.7064.8970.2682.1282.2496.17
1296.4698.3198.1598.9995.4567.4552.7851.1034.7495.11
1399.0299.0299.5199.5198.5499.5199.5199.5199.5198.54
1498.8198.1896.3695.0293.5290.5997.7992.8189.4196.05
1599.7410094.0495.3490.6790.4190.4193.2683.4294.56
1610010010010098.9295.7098.9210010097.85
POA93.3792.1792.1090.9695.0981.4981.2880.0177.6695.48
PR45.5244.9344.8944.3497.6139.7239.6139.0037.8597.80
Time(s)194.14199.37194.13200.3625.3731.1637.9332.5636.5141.58
Table 9. POA and PR calculated from the classification results using full bands and the bands selected by SC LCMV-BSS for the Purdue data.
Class | EPF-B-g with Full Bands | EPF-B-c with Full Bands | EPF-G-g with Full Bands | EPF-G-c with Full Bands | ILCMV with Full Bands | EPF-B-g-BS | EPF-B-c-BS | EPF-G-g-BS | EPF-G-c-BS | ILCMV-BS
110010010097.8395.6510097.8397.83100100
283.2683.2683.0578.9296.0148.2542.0948.7439.9294.89
379.1669.6480.4866.8796.9962.0555.1860.3644.1094.10
410098.3110099.5898.7310010010099.5898.31
593.7993.1795.8693.7989.4491.9394.0093.5893.7992.96
699.7399.7398.0899.3297.1299.3299.4594.9399.4595.75
796.4392.8696.4389.2910085.7171.4392.8664.29100
810099.7910010098.7810010099.7910098.95
920.0075.0075.0065.0010020.0045.0035.000100
1075.3184.1684.4789.6193.9359.0548.9747.2249.9091.98
1191.6591.0087.9891.7394.7076.2578.0970.2272.8795.93
1296.2992.0796.1297.4795.4570.8360.2044.8658.8596.63
1399.5199.5199.5199.5198.5410010010010098.54
1492.8998.3496.3696.6893.5298.3493.0497.2397.3996.13
1510010098.1999.4890.6793.2689.1279.2767.3696.11
1610010010010098.9210010093.5595.7095.70
POA90.06 90.4290.5690.3795.0977.3774.1272.3171.2595.46
PR43.90 44.0844.15 44.0597.6137.7236.1335.2534.7397.79
Time(s)187.76203.60195.20201.2125.3732.0138.7831.9238.0742.73
Table 10. POA and PR calculated from the classification results using full bands and the bands selected by SQ LCMV-BSS-1 in Table 2 for Salinas.
Class | EPF-B-g with Full Bands | EPF-B-c with Full Bands | EPF-G-g with Full Bands | EPF-G-c with Full Bands | ILCMV with Full Bands | EPF-B-g-BS | EPF-B-c-BS | EPF-G-g-BS | EPF-G-c-BS | ILCMV-BS
110010010010095.5210099.7510010096.37
210010010010098.4299.9710099.9710098.79
310010010010093.7810010010010090.44
410010010099.9395.6299.9399.8699.9310096.56
599.3799.2598.6999.2596.9098.9298.9598.9298.9595.87
610010010010098.7999.9799.9799.9799.9797.95
710099.9210010098.6399.6699.7299.6699.8398.35
890.9290.4089.2690.6196.6987.1191.9487.1191.7095.84
999.9810099.9799.9795.8799.5810099.5899.9794.79
1096.9598.0898.6098.5196.6796.1998.5796.1998.6095.73
1199.9199.9199.9110097.7599.9199.9199.9199.8195.79
1210010010010097.1510010010010096.32
1399.8999.5699.0299.8996.5198.4799.1398.4799.5687.77
1499.9199.2599.6310095.8998.9799.3598.9798.0497.76
1589.0185.6585.6587.0894.0090.6682.2190.6685.9995.42
1699.8310099.4599.8393.3099.6710099.6710095.68
POA96.4095.8995.6496.1796.3795.6495.7395.6496.1995.93
PR46.9746.2646.9546.8598.2346.6046.6446.6046.8698.02
Time(s)1060.77741.841082.511134.06167.8075.1778.7375.17104.31134.96
Table 11. POA and PR calculated from the classification results using full bands and the bands selected by SQ LCMV-BSS-2 in Table 2 for Salinas.
Class | EPF-B-g with Full Bands | EPF-B-c with Full Bands | EPF-G-g with Full Bands | EPF-G-c with Full Bands | ILCMV with Full Bands | EPF-B-g-BS | EPF-B-c-BS | EPF-G-g-BS | EPF-G-c-BS | ILCMV-BS
110010010010095.5210010010010097.01
210010010010098.4210010010099.9798.36
310010010010093.7810010099.9510095.14
410010010099.7895.6210099.9310010095.91
599.4898.9598.5899.1496.9098.6698.7399.1498.8495.94
610010010010098.7999.9510010010098.91
710099.8910099.8998.6399.9299.9799.8999.8098.44
888.6391.9887.9090.8696.6987.8489.5090.6191.8697.47
999.9099.9899.9510095.8799.9799.9499.6010094.89
1097.5698.9097.0497.5096.6799.3399.5199.2198.2996.58
1110099.9199.9199.7297.7510099.9199.7210097.38
1210010010010097.1510010010010095.39
1399.2499.8998.8099.1396.5199.0210098.3699.1397.38
1499.9199.9199.9199.2595.8910010099.9199.3597.20
1588.6685.4394.2591.4394.0079.3593.0483.9486.1396.27
1699.9410099.6199.5093.3010010010010095.13
POA95.9196.2496.4296.6996.3794.5696.7795.7196.2496.81
PR46.7346.8946.9747.1198.2346.0747.1546.6346.8998.45
Time(s)1128.55755.521050.34722.76167.8073.49101.8071.5798.16159.26
Table 12. POA and PR calculated from the classification results using full bands and the bands selected by SC LCMV-BSS in Table 2 for Salinas.
Class | EPF-B-g with Full Bands | EPF-B-c with Full Bands | EPF-G-g with Full Bands | EPF-G-c with Full Bands | ILCMV with Full Bands | EPF-B-g-BS | EPF-B-c-BS | EPF-G-g-BS | EPF-G-c-BS | ILCMV-BS
110010010010095.5210010010010096.91
210010010010098.4299.9599.8499.9299.9798.71
310010010010093.7899.9010010099.9095.95
410010010010095.6210099.5710099.8692.04
599.0799.2299.1098.9296.9098.8899.5599.3398.9990.78
610010010010098.7910010010099.9097.75
710099.9710099.9798.6399.6199.8910099.9498.32
890.1889.5189.4091.3896.6990.8889.0290.6890.3096.61
910099.9899.8199.9495.8799.9599.9599.9710095.44
1097.4797.7197.4199.4896.6798.9099.2498.1198.2996.77
1110010010010097.7510099.7299.3499.7297.66
1210010010010097.1510010010010095.43
1310099.8998.9199.8996.5199.3499.8999.2499.1394.00
1410010010099.8195.8999.3599.6398.7998.3296.93
1588.9984.2688.0886.9094.0089.7493.3578.9288.9295.84
1699.2899.8910099.9493.3010010010099.9495.41
POA96.2595.5295.9596.3496.3796.5096.7095.0296.2696.21
PR46.8946.5346.7546.9498.2347.0247.1146.2946.9098.15
Time(s)1139.991106.951154.781089.61167.8073.0694.1769.9291.50147.30
Table 13. POA and PR calculated from the classification results using full bands and the bands selected by SQ LCMV-BSS-1 in Table 2 for University of Pavia.
Class | EPF-B-g with Full Bands | EPF-B-c with Full Bands | EPF-G-g with Full Bands | EPF-G-c with Full Bands | ILCMV with Full Bands | EPF-B-g-BS | EPF-B-c-BS | EPF-G-g-BS | EPF-G-c-BS | ILCMV-BS
198.0498.0498.0897.8177.2496.1592.4796.3497.2980.21
298.6699.3997.7998.2886.4294.3194.8497.3998.0684.44
391.0993.5295.0094.3373.3492.6294.8195.5795.3384.14
493.4795.2792.9298.0179.8597.5297.4597.0096.8776.49
510010010099.8598.8110010010010098.14
699.9810010010091.4999.6498.1599.6410093.50
710099.3299.9299.7789.1099.9299.3299.4010086.19
899.0299.0097.8099.7881.1095.6096.2296.4196.2882.84
910010010010078.4610099.8910010077.09
POA98.1298.6797.8098.4684.3295.9697.4997.4998.0184.45
PR20.2420.3520.1720.3196.7619.7919.7120.1120.2196.82
Time(s)225.93265.79232.05252.50401.0851.1683.4652.9078.321387.01
Table 14. POA and PR calculated from the classification results using full bands and the bands selected by SQ LCMV-BSS-2 in Table 2 for University of Pavia.
Class | EPF-B-g with Full Bands | EPF-B-c with Full Bands | EPF-G-g with Full Bands | EPF-G-c with Full Bands | ILCMV with Full Bands | EPF-B-g-BS | EPF-B-c-BS | EPF-G-g-BS | EPF-G-c-BS | ILCMV-BS
194.8997.8097.8498.2177.2495.4395.9688.7692.0178.22
298.5897.6399.2598.0186.4296.4394.8892.5993.1388.05
394.9093.5792.5795.1973.3492.1992.4795.5293.6285.21
495.6394.9793.0298.8379.8598.6996.0298.3798.4774.71
510010010099.8598.8110010010010097.77
610010010010091.4910099.5299.7299.9490.57
710010099.4099.7789.1010099.9210010090.00
896.9398.8998.9498.4881.1093.9794.7095.2293.5682.92
910010010010078.4610010010010077.09
POA97.7697.8598.3698.3984.3296.7495.9794.2594.7885.41
PR20.1620.1820.2920.2996.7619.9519.7919.4419.5596.94
Time(s)219.10249.76226.85238.81401.0848.5282.9553.5379.05971.56
Table 15. POA and PR calculated from the classification results using full bands and the bands selected by SC LCMV-BSS in Table 2 for University of Pavia.
Class | EPF-B-g with Full Bands | EPF-B-c with Full Bands | EPF-G-g with Full Bands | EPF-G-c with Full Bands | ILCMV with Full Bands | EPF-B-g-BS | EPF-B-c-BS | EPF-G-g-BS | EPF-G-c-BS | ILCMV-BS
199.4397.1697.9597.9877.2495.9392.7398.2797.9977.08
298.1999.1598.8098.7986.4295.4494.3197.0096.2388.67
399.2495.0994.8593.5773.3493.5291.0492.3894.1486.76
493.9394.6594.6198.2779.8597.5296.7498.2796.9678.56
510099.8510099.9398.8110010010010097.70
699.9699.3499.6810091.4999.7098.1199.0510090.77
710010099.7099.7789.1099.6299.4799.9299.5591.13
896.8598.1397.7799.5181.1092.8892.6192.9992.9782.46
910010010010078.4610010010010079.30
POA98.3798.3298.2898.6784.3296.2294.8597.2196.9285.92
PR20.2920.2820.2720.3596.7619.8519.5620.0519.9997.09
Time(s)238.01270.24234.01259.33401.0851.1383.4149.3776.48998.17
