Article

Crop Classification Based on Red Edge Features Analysis of GF-6 WFV Data

1 School of Surveying and Land Information Engineering, Henan Polytechnic University, Jiaozuo 454003, China
2 Aerospace Information Research Institute, Chinese Academy of Sciences, Beijing 100101, China
3 Sanya Institute of Remote Sensing, Sanya 572029, China
* Author to whom correspondence should be addressed.
Sensors 2021, 21(13), 4328; https://doi.org/10.3390/s21134328
Submission received: 16 April 2021 / Revised: 14 June 2021 / Accepted: 22 June 2021 / Published: 24 June 2021
(This article belongs to the Section Optical Sensors)

Abstract

The red edge is a spectral region to which crops are particularly sensitive, and its use helps to improve the accuracy of crop classification. In view of the multiple red edge bands of GF-6 WFV data, this paper took Hengshui City, Hebei Province, China, as the study area to carry out red edge feature analysis and crop classification, and analyzed the influence of different red edge features on crop classification. On the basis of spectral analysis of the GF-6 WFV red edge bands, extraction of different red edge features and importance evaluation of the red edge indices, 12 classification schemes were designed starting from the four conventional GF-6 WFV bands (red, green, blue and near-infrared); stepwise discriminant analysis (SDA) and the random forest (RF) method were used for feature selection and importance evaluation, and the RF algorithm was used for crop classification. The results show the following: (1) The red edge 750 band of GF-6 WFV data contains more information than the red edge 710 band, but the red edge 710 band is more conducive to improving the separability between different crops and thus to improving the classification accuracy; (2) According to the classification results of the different red edge indices, the RF method evaluates feature importance more accurately than the SDA method; (3) Red edge spectral features, red edge texture features and red edge indices can improve the accuracy of crop classification to different degrees, and the red edge features based on the red edge 710 band improve the accuracy of crop classification more effectively. This study improves the accuracy of remote sensing classification of crops and can provide a reference for the application of GF-6 WFV data and their red edge bands in agricultural remote sensing.

1. Introduction

Crop classification is an important part of agricultural remote sensing monitoring, as well as the basis and key link of the application of remote sensing technology in the field of agriculture [1]. Timely and accurate classification of crops by remote sensing can determine planting area, spatial distribution, landscape pattern and planting structure of crops accurately, and provide important input parameters for crop yield estimation and growth monitoring [1,2,3]. The study on crop classification by remote sensing is of great significance for guiding agricultural production, formulating agricultural policy, ensuring food security and realizing sustainable agricultural development [4,5].
With the continuous development of remote sensing science and technology, moderate-to-high spatial resolution remote sensing data have been widely used in land use and land cover classification, crop classification, and so on [6,7,8,9]. The traditional moderate-to-high spatial resolution multispectral optical satellite payload is mainly composed of four bands: blue (450–520 nm), green (520–590 nm), red (630–690 nm) and near-infrared (770–890 nm). The limited bands make it difficult to meet the requirements of fine classification of crops by remote sensing. The red edge band is between the red band and the near-infrared band, and the band range is roughly 670–780 nm. The red edge is the region where the spectral reflectance of green vegetation rises rapidly within a certain band range, and it is the sensitive spectral band of vegetation, which is closely related to the pigment state and physical and chemical properties of crops and other vegetation [10]. Red edge information was first used in hyperspectral remote sensing, which is often used in the inversion of vegetation physiological and biochemical parameters such as chlorophyll content, leaf area index (LAI), biomass, nitrogen content, crop growth and pest monitoring, etc. [11,12,13,14]. In addition, some studies have shown that the red edge band can enhance the separability between different ground features, which plays an important role in improving the accuracy of crop remote sensing classification [15,16].
Based on the recognition of the importance of the red edge band, more and more multispectral satellite payloads have improved the application capability of remote sensing satellites by adding red edge bands, such as the RapidEye satellite of Germany, the WorldView-2/3 satellites of the United States, the Sentinel-2 satellites of ESA and the Gaofen-6 (GF-6) satellite of China. Many studies have confirmed that the addition of red edge bands can significantly improve the classification accuracy of crops [17,18,19,20]. Meanwhile, the red edge indices, red edge textures and other features derived from red edge bands enrich the feature space of crop classification, which is conducive to making better use of different red edge features. For example, Kim et al. [21] used two seasons of RapidEye images to classify rice crops in Jeonju, Korea, and applied the spectral and texture features of the broadband red edge to rice classification, confirming that broadband red edge information has the potential to improve the accuracy of rice classification. Ustuner et al. [22], based on RapidEye images, discussed the impact of three different vegetation indices on the classification accuracy of crops in the Aegean Sea region of Turkey, namely the normalized difference vegetation index (NDVI), the green normalized difference vegetation index (GNDVI) and the normalized difference red edge index (NDRE), and found that NDRE contributed the most to classification accuracy. Huang et al. [23] adopted time-series Sentinel-2A data to extract crop classification information, introducing parcel elements and the modified chlorophyll absorption ratio index (MCARI) combined with machine learning methods to explore the impact of different feature combinations on classification accuracy, which showed that the introduction of red edge spectra and red edge indices could significantly improve the identification of crops in arid areas. Wu et al. [24] selected multitemporal Sentinel-2A remote sensing data, calculated the NDVI and the red edge normalized difference vegetation index (RENDVI), designed five classification schemes combining different vegetation index time series, and used the random forest algorithm to realize the fine classification of crops. The classification results prove that the red edge indices can assist NDVI in improving the classification accuracy.
At present, machine learning is widely used in research on crop classification by remote sensing. There are many kinds of machine learning classification methods, such as support vector machine (SVM), random forest (RF), decision tree (DT), maximum likelihood (ML), k-nearest neighbor (k-NN) and artificial neural network (ANN), which can be applied to land cover and land use classification, crop classification and other remote sensing applications; among them, random forest is one of the most commonly used machine learning classification methods [25]. Zeraatpisheh et al. [26] used a variety of linear and nonlinear machine learning algorithms, including Cubist (Cu), random forest (RF), regression tree (RT) and multiple linear regression (MLR), to conduct digital mapping of soil properties in the semi-arid Borujen region of central Iran. Sonobe et al. [27] compared kernel-based extreme learning machine (KELM), multilayer feedforward neural network (FNN), random forest (RF) and support vector machine (SVM) using Sentinel-1A and Sentinel-2A data, and evaluated the sensitivity of the different supervised learning models in a study area in Hokkaido, Japan. Maponya et al. [28] evaluated the classification performance of SVM, DT, k-NN, RF and ML for different time series of Sentinel-2 data at two sites in the Western Cape, South Africa, and concluded that SVM and RF obtain better classification accuracy and have greater application potential. In summary, compared with other machine learning classification methods, the RF method requires few parameter settings, is stable and mature, and achieves high classification accuracy, so it is suitable for the classification and comparison of different red edge features in this study.
Although current research generally confirms the important role of the red edge band in crop classification, most studies only include different red edge spectra or red edge indices in the classification to test their effect on crop classification, which is not sufficient for the analysis and evaluation of different red edge features, especially for GF-6 WFV, Sentinel-2 and other remote sensing satellites with multiple red edge bands. To address these problems, this study is mainly aimed at: (1) performing spectral feature analysis and red edge index feature importance evaluation of GF-6 WFV data, and extracting different red edge spectra, red edge textures and red edge indices; (2) designing red edge feature classification schemes for crop classification in Hengshui City, analyzing the impact of different red edge features on crop classification, improving the accuracy of crop classification and promoting the application of GF-6 WFV data in the field of agricultural remote sensing.

2. Study Area and Data Sources

2.1. Overview of the Study Area

The study area is Hengshui City, located in the southeast of Hebei Province, China, with a total area of 8815 km2. Hengshui City belongs to the semi-humid and semi-arid monsoon climate zone in the warm temperate zone, with four distinct seasons, sufficient sunshine and annual average precipitation of 500–600 mm. The study area has rich land resources and diverse soil types, which are suitable for planting and growing a variety of crops, and the coverage rate of crops is more than 70%. The crop types in Hengshui City include winter wheat, summer maize, spring maize, cotton, fruit trees, greenhouse vegetables, etc. The main planting mode is winter wheat–summer maize rotation, two crops a year, and spring maize, cotton and other crops once a year [29,30]. The Heilonggang River Basin in Hebei Province, where the study area is located, is a typical groundwater funnel area in China and one of the pilot areas of the national fallow system. In recent years, affected by the national fallow policy and the transfer of rural labor, the planting area of winter wheat with large water consumption has decreased, while the planting proportion of spring maize has increased year by year [31]. Therefore, the study of crop classification by remote sensing is helpful to timely and accurately monitor the crop planting structure in Hengshui City, and is of great significance to implementing the national fallow policy and promoting the sustainable development of agriculture [32].

2.2. Data Source

2.2.1. Remote Sensing Data

The GF-6 satellite is equipped with a 2 m panchromatic/8 m multispectral (PMS) camera and a 16 m multispectral wide-field-viewing (WFV) camera, which combines high resolution with wide coverage; the swath width of the WFV camera can reach 800 km [33]. The GF-6 satellite was successfully launched from the Jiuquan Satellite Launch Center on 2 June 2018. For the first time in China, the wide-field camera on the GF-6 satellite added two "red edge" bands that can effectively reflect the specific spectral characteristics of crops. At the same time, it operates in a constellation with the GF-1 satellite in orbit, which can greatly improve China's ability to monitor agriculture, forestry and other resources [34]. The main parameters of GF-6 WFV data are shown in Table 1. In this paper, a GF-6 WFV image acquired on 28 August 2019 and covering Hengshui City was selected. At that time, the crops in the study area were in a period of vigorous growth, which was suitable for remote sensing identification and classification of crops. In order to meet the requirements of crop classification by remote sensing, the GF-6 WFV data were preprocessed by orthorectification, atmospheric correction and geometric correction to obtain surface reflectance data.

2.2.2. Sample Data

The field sampling of different crops in Hengshui City was conducted in July 2019. At that time, the winter wheat had been harvested, the summer maize of the winter wheat–summer maize rotation had emerged in the stubble fields, the spring maize was at the jointing and heading stage, and the cotton was at the budding stage, which was conducive to the visual identification of crop types. During the field survey, sample points of different crop types in the 11 counties and urban areas of Hengshui City were collected with a handheld GPS receiver, and photos were taken. The survey route covered most of the farmland in Hengshui City and the sample distribution was relatively uniform. A total of 614 valid sample plots were obtained and divided into training samples and validation samples at a ratio of 1:1 for crop classification and accuracy evaluation. The geographical location and sample distribution of the study area are shown in Figure 1, and the number of samples for each class is shown in Table 2.

3. Methods

The spectral features of remote sensing images and their derived features such as texture and vegetation index are the main basis of remote sensing image classification. GF-6 WFV data have several new spectral segments, and the feature analysis and optimization of these spectral segments is an important means of tapping into the potential of data application and ensuring classification accuracy. In this study, adaptive band selection (ABS) method and the extension of Jeffries–Matusita distance (JBh) were used to analyze the spectral features of red edge bands of GF-6 WFV data based on information content and inter-class separability. Then, different red edge bands were used to construct the corresponding texture features and 10 different red edge index features. Stepwise discriminant analysis (SDA) and random forest (RF) feature importance evaluation algorithms were used to optimize the red edge indices. Finally, based on the spectrum, texture and optimal red edge index features of different red edge bands of GF-6 WFV data, different red edge feature classification schemes were designed, and the RF algorithm was used to classify crops according to different schemes. The technical route of the study is shown in Figure 2.

3.1. Spectral Feature Analysis

Since remote sensing classification mainly uses the spectra, texture and vegetation index features of the image, combined with the red edges and other new bands of GF-6 WFV data, it is very important to analyze the spectral features of GF-6 WFV data. In this study, the spectral feature analysis of GF-6 WFV data mainly adopts the ABS method based on information content and JBh distance method based on inter-class separability, so as to analyze the spectral feature of GF-6 WFV data from the two aspects of information content and inter-class separability.

3.1.1. ABS Method

ABS is a ranking-based band selection method built on the optimum index factor (OIF) mathematical model, proposed by Liu Chunhong et al. [35] after a thorough study of the OIF model. The method selects bands with rich information content and little correlation with other bands. The ABS index is calculated as shown in Equation (1) and follows two principles: (1) the selected band has more information content; (2) the selected band has less correlation with other bands [36]. The index obtained by the ABS method fully considers both the information content of each band and its correlation with the adjacent bands. The algorithm is simple, easy to apply and markedly faster than the full OIF model. In addition, the ABS method yields an evaluation index for each single band, which is useful for spectral analysis and for studying specific bands of remote sensing images [37].
I_i = σ_i / [(r_{i−1,i} + r_{i,i+1}) / 2]    (1)
In the above equation, σ_i is the standard deviation of band i; r_{i−1,i} and r_{i,i+1} are the correlation coefficients between band i and its two adjacent bands; I_i is the ABS index, and the larger its value, the greater the information content of the corresponding band.
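As an illustration of Equation (1), the ABS index can be computed for every interior band of a band-stacked image with a few lines of NumPy. The sketch below is only illustrative: the function name, the band ordering of the test cube and the random data are assumptions, not part of the original processing chain.

```python
import numpy as np

def abs_index(cube):
    """ABS index of Equation (1) for each interior band of an image cube
    with shape (bands, rows, cols)."""
    n = cube.shape[0]
    flat = cube.reshape(n, -1).astype(np.float64)
    sigma = flat.std(axis=1)               # per-band standard deviation
    r = np.corrcoef(flat)                  # band-to-band correlation matrix
    scores = {}
    for i in range(1, n - 1):              # edge bands have only one neighbour
        mean_r = (abs(r[i, i - 1]) + abs(r[i, i + 1])) / 2.0
        scores[i] = sigma[i] / mean_r
    return scores

# Illustrative call on a random 8-band cube; placing the purple band first and
# the yellow band last (as described in Section 4.1.1) makes the six bands of
# interest interior bands with two neighbours each.
print(abs_index(np.random.rand(8, 50, 50)))
```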

3.1.2. JBh Distance

The performance of a classifier largely depends on whether the features can accurately describe the nature of the object. Therefore, a separability criterion such as the Jeffries–Matusita (JM) distance is needed to measure the separability between different classes. The JM distance reflects the actual relationship with classification accuracy well [38], but it can only measure the separability between two classes at a time and cannot reflect the separability among multiple categories. Therefore, the JBh distance was introduced to measure the separability among multiple classes [39]. Based on the Bhattacharyya principle, the JBh distance gives greater weight to categories with higher a priori probability, which is calculated according to the number of samples in each category. The calculation method is shown in Equation (2):
J_Bh = Σ_{i=1}^{N} Σ_{j>i}^{N} P(w_i) × P(w_j) × JM²(i, j)    (2)
In the above equation, N is the number of categories; P(w_i) and P(w_j) are the a priori probabilities of Class i and Class j, which are calculated from the number of samples (pixels) [40].
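To make Equation (2) concrete, the sketch below shows one way it could be evaluated from the training samples, assuming Gaussian class statistics so that the pairwise JM distance can be obtained from the Bhattacharyya distance; the function names and the use of per-class sample means and covariances are illustrative assumptions, not the study's exact implementation.

```python
import numpy as np
from itertools import combinations

def bhattacharyya(mu1, cov1, mu2, cov2):
    """Bhattacharyya distance between two Gaussian class models."""
    cov = (cov1 + cov2) / 2.0
    diff = mu1 - mu2
    term1 = 0.125 * diff @ np.linalg.inv(cov) @ diff
    term2 = 0.5 * np.log(np.linalg.det(cov) /
                         np.sqrt(np.linalg.det(cov1) * np.linalg.det(cov2)))
    return term1 + term2

def jm_distance(mu1, cov1, mu2, cov2):
    """Pairwise Jeffries-Matusita distance derived from the Bhattacharyya distance."""
    return np.sqrt(2.0 * (1.0 - np.exp(-bhattacharyya(mu1, cov1, mu2, cov2))))

def jbh(samples):
    """Equation (2): prior-weighted sum of squared pairwise JM distances.
    `samples` maps class name -> (n_pixels, n_bands) training spectra; the
    priors P(w_i) are taken from the pixel counts, as stated in the text."""
    stats = {c: (x.mean(axis=0), np.cov(x, rowvar=False), x.shape[0])
             for c, x in samples.items()}
    total = sum(n for _, _, n in stats.values())
    score = 0.0
    for ci, cj in combinations(stats, 2):
        mi, si, ni = stats[ci]
        mj, sj, nj = stats[cj]
        score += (ni / total) * (nj / total) * jm_distance(mi, si, mj, sj) ** 2
    return score
```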

3.2. Texture Feature Extraction

Texture is the pattern formed by the repeated appearance of many small structures in an image. It is a comprehensive reflection of the size, shape, shadow and color of a large number of individual objects, and describes the spatial variation of pixel brightness. The gray-level co-occurrence matrix (GLCM) is a widely used method for extracting texture features. The commonly used texture statistical measures are Mean, Variance, Homogeneity, Contrast, Dissimilarity, Angular Second Moment (ASM), Entropy and Correlation, a total of 8 measures [41].
In this paper, the 8 texture statistical measures were extracted with a 3 × 3 pixel window from the two red edge bands and the near-infrared band of GF-6 WFV data using the GLCM method. Because of the redundancy between the different texture statistical measures, principal component analysis (PCA) [42] was applied to extract the first principal component (PC1) of the 8 measures to represent the texture feature of each of the three bands, so as to analyze the influence of the texture features of different red edge bands on crop classification.
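A minimal sketch of this PCA step is given below; it assumes that the eight GLCM texture measures of one band have already been computed (for example with a 3 × 3 window in ENVI or a comparable tool) and stacked into a single array, and the function name is illustrative.

```python
import numpy as np
from sklearn.decomposition import PCA

def texture_pc1(texture_measures):
    """First principal component of the 8 GLCM texture measures of one band.
    `texture_measures` has shape (8, rows, cols), one layer per measure
    (Mean, Variance, Homogeneity, Contrast, Dissimilarity, ASM, Entropy,
    Correlation), assumed to be precomputed with a 3 x 3 window."""
    n, rows, cols = texture_measures.shape
    x = texture_measures.reshape(n, -1).T        # pixels x 8 measures
    pc1 = PCA(n_components=1).fit_transform(x)   # pixels x 1
    return pc1.reshape(rows, cols)
```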

3.3. Red Edge Index Analysis

A vegetation index is a linear or non-linear combination of the spectral reflectance of a remote sensing image, which can reflect certain characteristics of vegetation [43]. The red edge index, also known as the red edge vegetation index, is a vegetation index calculated from the surface reflectance of the red edge bands. Since the red edge band is a sensitive characteristic spectral band of vegetation, the red edge index has an important influence on the classification of crops and other vegetation [44]. Making use of the two red edge bands of GF-6 WFV data, 10 red edge indices were constructed for the crop classification study with reference to the relevant literature [45,46]. The 10 red edge indices based on the GF-6 WFV data are shown in Table 3.
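As an example, three of the indices in Table 3 (NDRE, NDVIre1 and NDVIre2) can be computed directly from the GF-6 WFV surface reflectance bands; the sketch below is illustrative only, and the small epsilon guard against division by zero is an addition, not part of the published formulas.

```python
import numpy as np

def red_edge_indices(nir, re1, re2):
    """Three red edge indices from Table 3, computed band-wise from GF-6 WFV
    surface reflectance arrays (re1 = red edge 710 nm, re2 = red edge 750 nm)."""
    eps = 1e-6                                   # numerical guard only
    ndre    = (re2 - re1) / (re2 + re1 + eps)    # Normalized Difference Red Edge
    ndvire1 = (nir - re1) / (nir + re1 + eps)    # NDVI red edge 1
    ndvire2 = (nir - re2) / (nir + re2 + eps)    # NDVI red edge 2
    return {"NDRE": ndre, "NDVIre1": ndvire1, "NDVIre2": ndvire2}

# Illustrative call with made-up reflectance values
print(red_edge_indices(np.array([0.42]), np.array([0.18]), np.array([0.30])))
```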
Feature selection and feature importance evaluation are of great significance for remote sensing image classification, which plays an important role in promoting model performance and improving classification model and algorithm [54]. Feature selection methods are divided into three categories: Filter, Wrapper and Embedded [55]. In this paper, the filter method of SDA and the embedded method of RF were used to evaluate 10 red edge vegetation indices of GF-6 WFV data [56,57]. In order to provide reference for the application of different red edge index features in crop classification, the importance of different red edge index features was analyzed, and the comparison and cross validation of SDA and RF were carried out.

3.4. Classification Scheme and Accuracy Evaluation

Based on the GF-6 WFV data of only red, green, blue and near-infrared bands, this paper designed some crop classification schemes by adding the spectra, texture and vegetation index features of one or two red edge bands to ensure that the feature dimensions involved in the classification are effectively consistent, so as to analyze the influence of the red edge 710, red edge 750 and near-infrared bands of GF-6 WFV data on crop classification [26,58]. The classification scheme design was first divided into three groups: A, B and C, according to the spectra, texture and vegetation index of different red edge bands, and then was subdivided into 12 specific classification schemes according to different combinations of red edge bands. The designs of different red edge feature classification schemes are shown in Table 4.
According to the texture features of different red edge bands, 8 texture statistical measures of GLCM were calculated for two red edge bands and a near-infrared band, respectively, and PCA was performed. The PC1 was extracted to form three texture features, and then four different texture feature classification schemes of group B were designed.
Based on two feature evaluation methods, namely SDA and RF, the first 4 red edge indices with high feature importance for classification were selected from the 10 red edge indices of GF-6 WFV data. Through different red edge indices participating in crop classification, the importance of different red edge indices and their impact on crop classification were analyzed, and the accuracy of different feature evaluation methods was verified, so as to provide reference for the application of different red edge indices in crop classification.
According to the abovementioned 12 red edge feature classification schemes, the RF classification module of EnMAP-Box Toolkit [59] was used to classify the main crop types in the study area. RF is a supervised classification method composed of multiple CART decision trees. It uses random resampling technology and node random splitting technology to construct multiple decision trees, and the final classification result is obtained through voting. Compared with traditional classification algorithms such as Maximum Likelihood Classification (MLC) and Support Vector Machine (SVM), the RF algorithm has a faster training speed and a higher degree of intelligence, is not easy to overfit and has high classification accuracy, and is widely used in crop classification and area statistics [60,61].
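The classification itself was run with the RF module of the EnMAP-Box toolkit; the scikit-learn sketch below is an equivalent, simplified illustration of the per-scheme workflow, with the array shapes, function name and tree count assumed for the example rather than taken from the study.

```python
from sklearn.ensemble import RandomForestClassifier

def classify_scheme(feature_stack, train_x, train_y, n_trees=100):
    """Classify one red edge feature scheme with a random forest.
    `feature_stack` is (n_features, rows, cols); `train_x` / `train_y` hold the
    training-pixel feature vectors and crop labels drawn from the sample polygons."""
    rf = RandomForestClassifier(n_estimators=n_trees, random_state=0, n_jobs=-1)
    rf.fit(train_x, train_y)
    n, rows, cols = feature_stack.shape
    labels = rf.predict(feature_stack.reshape(n, -1).T)   # one label per pixel
    return labels.reshape(rows, cols)
```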
The evaluation of classification accuracy was based on the confusion matrix. The overall accuracy (OA), kappa coefficient, producer accuracy (PA) and user accuracy (UA) were selected to evaluate the accuracy of the different classification schemes [62]. The F1 measure was used to evaluate the identification accuracy of a specific crop category; it is the harmonic mean of UA and PA [63], calculated as shown in Equation (3):
F1 = [2 × UA × PA / (UA + PA)] × 100%    (3)
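For reference, OA, the kappa coefficient, PA, UA and the F1 measure of Equation (3) can all be derived from a single confusion matrix. The sketch below uses scikit-learn and is not the exact evaluation code of the study; F1 is returned as a fraction rather than a percentage.

```python
import numpy as np
from sklearn.metrics import confusion_matrix, cohen_kappa_score, accuracy_score

def accuracy_report(y_true, y_pred, class_labels):
    """OA, kappa and per-class PA, UA and F1 from validation pixels.
    Rows of the confusion matrix are reference classes, columns are predictions."""
    cm = confusion_matrix(y_true, y_pred, labels=class_labels).astype(float)
    oa = accuracy_score(y_true, y_pred)
    kappa = cohen_kappa_score(y_true, y_pred, labels=class_labels)
    pa = np.diag(cm) / cm.sum(axis=1)     # producer accuracy per reference class
    ua = np.diag(cm) / cm.sum(axis=0)     # user accuracy per classified class
    f1 = 2 * ua * pa / (ua + pa)          # Equation (3), as a fraction
    return oa, kappa, dict(zip(class_labels, zip(pa, ua, f1)))
```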
McNemar’s test was employed in this study to evaluate the statistical significance of the accuracy of the different classifiers or groups [64]. The test was based on the error matrix of two classifications, and the chi-squared statistic (χ²) with one degree of freedom was calculated as follows:
χ² = (f_{12} − f_{21})² / (f_{12} + f_{21})    (4)
where f_{ij} is the number of samples that are correctly classified by classification scheme i and incorrectly classified by classification scheme j (i = 1, 2; j = 1, 2). The difference between two classification schemes is statistically significant at the 95% confidence level (p = 0.05) when the χ² value is greater than or equal to 3.84 [65,66]. In this study, McNemar's test was used to evaluate whether there were significant differences in classification accuracy between pairs of classification schemes obtained by the RF classifier with different red edge features.
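The statistic itself is a one-line calculation; the example below uses made-up discordant counts purely to illustrate the 3.84 threshold.

```python
def mcnemar_chi2(f12, f21):
    """McNemar chi-squared statistic with one degree of freedom (Equation (4)).
    f12: pixels correct under scheme 1 but wrong under scheme 2; f21: the reverse."""
    return (f12 - f21) ** 2 / (f12 + f21)

# Hypothetical counts of 60 and 35 discordant validation pixels
print(mcnemar_chi2(60, 35))   # about 6.58 > 3.84, i.e. a significant difference
```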

4. Results

4.1. Red Edge Spectral Analysis Results

4.1.1. Spectral Analysis Based on Information Content

According to the characteristics of the ABS method, the GF-6 WFV data were first reconstructed with the purple band as the first band and the yellow band as the last band, so as to calculate and sort the ABS index of other bands of GF-6 WFV data. The ABS indices of GF-6 WFV data except the purple band and yellow band are shown in Table 5.
Combined with the results of the ABS index analysis, it can be concluded that the near-infrared band (NIR) and the two red edge bands (RE1, RE2) have more information content than the visible light bands (R, G, B), and the order of information content is as follows: NIR > RE2 > RE1 > R > G > B. The results show that the near-infrared and red edge bands of GF-6 WFV data can provide more information, which is beneficial to the classification of crops and other ground features.

4.1.2. Spectral Analysis Based on Separability Distance

According to the characteristics of GF-6 WFV data with two red edge bands, four different spectral classification schemes of “four bands”, “four bands + red edge 710”, “four bands + red edge 750” and “four bands + red edge 710 + red edge 750” were designed, corresponding to the classification schemes mentioned in Section 3.4, respectively. Combined with the training samples of different crops, the influence of different red edge bands on crop separability was analyzed by calculating JBh distance. JBh distance reflects the separability of crops in different classification schemes, and can be used to analyze the separability measure of different red edge bands of GF-6 WFV data to participate in crop classification. The JBh distance of four different classification schemes is shown in Figure 3.
The distances of schemes A-1, A-2, A-3 and A-4 were 10.561, 11.583, 11.239 and 11.777, respectively. The results show that the JBh distance of the two red edge bands (RE1, RE2) participating in the classification is the largest, the JBh distance of the red edge 710 band (RE1) is greater than the red edge 750 band (RE2) participating in the classification, and the JBh distance of bands without red edge participating in the classification is the smallest. By calculating the JBh distance of different classification schemes, the study confirms that the addition of red edge band of GF-6 WFV data can improve the separability of different crop types, and the red edge 710 band is more beneficial to improving the separability of different crops than the red edge 750 band, which plays an important role in crop classification.

4.2. Feature Importance Evaluation Results of Red Edge Indices

The SDA method evaluates the importance of different features mainly on the basis of their F values: the larger the F value, the greater the importance of the corresponding feature, and the smaller the F value, the lower its importance [67,68]. In this study, SPSS statistical analysis software [69] was used to carry out the stepwise discriminant analysis and evaluation of the different red edge index features. For the RF algorithm, the random forest classifier in the Python scikit-learn machine learning library was used, with feature importance measured by the mean decrease in Gini (MDG) [70,71]. The feature importance scores of the different red edge indices based on the two methods of SDA and RF are shown in Table 6, and the evaluation results of the feature importance of the different red edge indices are shown in Figure 4.
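A sketch of the scikit-learn step described above is given here; the tree count and variable names are assumptions made for illustration, not the settings used in the study.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def rf_index_importance(index_values, labels, index_names, n_trees=500):
    """Rank red edge indices by mean decrease in Gini (MDG) impurity.
    `index_values` is (n_samples, 10), one column per red edge index sampled
    at the training pixels; `labels` holds the corresponding crop classes."""
    rf = RandomForestClassifier(n_estimators=n_trees, random_state=0, n_jobs=-1)
    rf.fit(index_values, labels)
    order = np.argsort(rf.feature_importances_)[::-1]   # impurity-based (MDG) scores
    return [(index_names[i], float(rf.feature_importances_[i])) for i in order]
```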
According to Table 6 and Figure 4, although the importance order of the red edge indices obtained by the two methods was different, the top four red edge indices in the order of importance were CIre1, MTCI, NDRE and NDVIre1. Therefore, CIre1, MTCI, NDRE and NDVIre1 were selected to match the optimal red edge index 1, optimal red edge index 2, optimal red edge index 3 and optimal red edge index 4 as shown in Table 4, respectively. Combined with the four red edge indices, different classification schemes were designed to analyze the impact of different red edge indices on crop classification based on GF-6 WFV data.

4.3. Classification Results of Red Edge Features

Based on the RF algorithm, the classification and accuracy evaluation of 12 schemes were carried out. The crop classification results in Hengshui city are shown in Figure 5, and the OAs and kappa coefficient scores of different schemes are shown in Figure 6. The classification accuracy of different crops based on the three classification schemes of A, B and C, corresponding to different red edge spectral features, red edge texture features and red edge indices are shown in Table 7, Table 8 and Table 9. McNemar’s test results for analyses 1–23 between the two different classification schemes of red edge features are shown in Table 10.
According to the crop classification results shown in Figure 5 (taking scheme B-3, which had the highest overall classification accuracy, as an example), the planting situation and spatial distribution of different crops in Hengshui City on 28 August 2019 can be obtained. Summer maize was the most important planting type and was distributed in most areas of Hengshui City. Spring maize was mainly distributed in the north and northeast of Hengshui City; planting spring maize saves water resources and labor, which is in line with the country's seasonal fallow policy. Cotton was mainly distributed in the southern part of Hengshui City, which is close to the cotton-producing area of southern Hebei Province and is a traditional cotton planting area. Greenhouses were mainly concentrated in the east of Raoyang County in Hengshui City; Raoyang County is known as the hometown of Chinese vegetables and of Chinese facility grapes, so the distribution of greenhouses is most concentrated there. Orchards were mainly concentrated in the north of Shenzhou City in Hengshui City; Shenzhou City is known as the hometown of the Chinese peach and is a fruit production base in China, so orchards are most concentrated there. The minor crops in Hengshui were mainly peanuts and soybeans, as well as some peppers and potatoes, which were distributed evenly but dispersedly across the whole of Hengshui City, without any large area of concentrated planting.

5. Discussion

The results of the red edge spectral analysis show that the two red edge bands and the near-infrared band of GF-6 WFV data contain more information; among them, the red edge 750 band holds more information than the red edge 710 band, and the near-infrared band contains the most. The separability analysis of the different samples shows that the red edge 710 band is more beneficial than the red edge 750 band for improving the separability of different crops, which helps to improve the classification accuracy of crops. The results of the spectral analysis based on the separability distance are consistent with the classification results obtained when the different red edge bands participate in classification, and inter-class separability has a greater impact on crop classification than band information content.
The feature importance evaluation results of red edge indices show that CIre1, MTCI, NDRE and NDVIre1 are the top four red edge indices in SDA and RF algorithm. Combined with the crop classification results of the RF algorithm with four different red edge index features, the RF feature importance evaluation method is more consistent with the actual classification results and is more accurate than the SDA method.
The classification results of the different red edge spectral features are consistent with the spectral analysis results based on the separability distance: compared with scheme A-1 (only the R, G, B and NIR bands), the addition of red edge spectral information improves the overall classification accuracy. The red edge 710 band is more beneficial than the red edge 750 band for improving the overall classification accuracy, and the overall accuracy is highest when both the red edge 710 and red edge 750 bands participate in the classification. For specific crop types, the red edge 710 band and red edge 750 band improve the identification accuracy of crops to varying degrees. Compared with the red edge 750 band, the red edge 710 band is more beneficial for the identification of summer maize, spring maize and orchards, while for minor crops and greenhouses the red edge 750 band is more effective.
The classification results of the four different texture feature schemes show that the OA and kappa coefficient obtained with the red edge 710 band texture feature are better than those obtained with the red edge 750 band texture feature, the OA and kappa coefficient obtained with the texture features of both red edge bands are similar to those obtained with only the red edge 710 band texture feature, and the classification accuracy obtained with only the near-infrared band texture feature is lower. For specific crop types, the texture features of the red edge 710 band and red edge 750 band improve the identification accuracy to varying degrees. Compared with the texture feature of the red edge 750 band, the texture feature of the red edge 710 band improves the identification accuracy of summer maize, cotton and orchards, while the red edge 750 band texture feature is more effective for the identification of spring maize, minor crops and greenhouses.
For the classification results of the different red edge index features, the effect of the four optimal red edge indices on the overall classification accuracy of crops in the study area follows the order MTCI > NDVIre1 > CIre1 > NDRE, although the overall accuracies of the four indices differ little from one another; the red edge indices based on the red edge 710 band are more beneficial than those based on the red edge 750 band for improving the accuracy of crop classification. For specific crop types, compared with the other three red edge indices, the CIre1 index is more conducive to the identification of summer maize and orchards, the MTCI index to the identification of minor crops and woods, the NDRE index to the identification of cotton and greenhouses, and the NDVIre1 index to the identification of spring maize. In addition, the feature importance evaluation results obtained by the RF algorithm are more consistent with the actual classification accuracies, which also shows that, of the two feature importance evaluation methods, the RF algorithm performs better than the SDA method.
Additionally, this study examined whether there is a significant difference in classification accuracy between the different red edge feature classification schemes. According to McNemar's test, all the scheme pairs listed in Table 10 (Analyses 1–23) show significant differences. It can therefore be concluded that the design of the different red edge feature classification schemes is reasonable and that the differences in classification accuracy are statistically significant at the chosen significance level (p), which also reflects the important role of the red edge in crop classification.
Schuster et al. [17] tested the potential of the red edge band of RapidEye imagery for improving land use classification, and considered that the addition of the red edge channel can improve the accuracy of land use and land cover classification. Immitzer et al. [19] used Sentinel-2 data to classify crops and tree species in two different study areas in Central Europe, analyzed the influence of the different Sentinel-2 bands on crop classification, and found that the red edge bands make a certain contribution to the improvement of classification accuracy. Kim et al. [21] used RapidEye images to identify and classify paddy rice and other crops, extracted the red edge spectral and texture features of paddy rice, and analyzed the impact of the different red edge features on paddy rice classification; compared with the red edge texture feature, the red edge spectral feature was more conducive to improving the classification accuracy. Ustuner et al. [22] selected RapidEye images to extract three different vegetation indices, namely the Normalized Difference Vegetation Index (NDVI), the Green Normalized Difference Vegetation Index (GNDVI) and the Normalized Difference Red Edge Index (NDRE), for crop classification in the Aegean Sea region of Turkey; by comparing the classification results, it was found that NDRE contributed the most to the classification accuracy, confirming that a red edge index constructed from the red edge band is beneficial to crop classification. The abovementioned studies have confirmed that the red edge spectra, red edge textures and red edge indices related to the red edge bands of remote sensing data are important for the classification of crops and other ground objects, but they do not provide a sufficient analysis and evaluation of different red edge features and lack comparative classification research between different red edge features. The purpose of this study is to make full use of the multiple red edge bands of GF-6 WFV data, construct a variety of red edge spectra, red edge textures and red edge indices, and design different red edge feature classification schemes, so as to analyze the influence of different red edge features on crop classification and promote the application of different red edge features in crop classification.
The classification results of different red edge feature schemes designed in this study confirm that the spectra, textures and red edge indices of different red edge bands can improve the classification accuracy of crops to varying degrees, and the red edge texture feature has the best effect on improving the classification accuracy. At the same time, by comparing the classification accuracy of each scheme, it can be found that the spectral features, texture features and red edge indices constructed by the red edge 710 band are more conducive to improving the classification accuracy of crops than the red edge 750 band, which is of great significance for the application of different red edge band features in crop classification. In summary, this study shows the important role of the red edge bands of GF-6 WFV data for crop identification and classification.

6. Conclusions

This study extracted a variety of red edge spectra, red edge textures and red edge indices from GF-6 WFV data. Using the ABS and JBh distance methods, the spectral characteristics of the red edge bands were analyzed, and it was found that the two red edge bands carry a large amount of information and are beneficial to improving the separability of different crops. Through the two feature evaluation methods of SDA and RF, this study selected the four red edge indices of greatest importance, namely CIre1, MTCI, NDRE and NDVIre1. Different red edge spectral features, red edge texture features and red edge indices improve the classification accuracy of crops to different degrees, and McNemar's test shows that the differences in classification accuracy between the schemes are statistically significant. The red edge features extracted from the red edge 710 band improve the classification accuracy of crops more effectively, which is helpful for the application of red edge features in crop classification. The red edge features of GF-6 WFV data are fully exploited and analyzed in this study, and the application potential of different red edge features for improving the accuracy of crop classification is discussed, which promotes the popularization and application of GF-6 WFV and other satellite data with red edge bands in the field of agricultural remote sensing monitoring.

Author Contributions

Conceptualization, Q.M., Y.Z. and Y.K.; methodology, Y.K., Y.Z.; investigation and data acquisition Y.K. and X.W.; data analysis and validation, Y.K., Q.M. and M.L.; writing—original draft preparation, Y.K.; writing—review and editing, Q.M., M.L. and X.W. All authors contributed to the discussion, provided suggestions to improve the manuscript and checked the writing. All authors have read and agreed to the published version of the manuscript.

Funding

This study was funded by the Major Project of High-Resolution Earth Observation System of China (30-Y20A07-9003-17/18) and the Major Projects of High-Resolution Earth Observation Systems of National Science and Technology (05-Y30B01-9001-19/20-1).

Acknowledgments

The authors are grateful to the colleagues who participated in the field surveys and data collection. In addition, thanks very much for the GF-6 WFV data provided by China Center for Resources Satellite Data and Application (http://www.cresda.com/EN/, (accessed on 24 June 2021)).

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Zhao, C.J. Advances of research and application in remote sensing for agriculture. Trans. Chin. Soc. Agric. Mach. 2014, 45, 277–293. [Google Scholar]
  2. Wardlow, B.D.; Egbert, S.L.; Kastens, J.H. Analysis of time-series MODIS 250 m vegetation index data for crop classification in the US Central Great Plains. Remote Sens. Environ. 2007, 108, 290–310. [Google Scholar] [CrossRef] [Green Version]
  3. Chen, Z.X.; Ren, J.Q.; Tang, H.J.; Shi, Y.; Liu, J. Progress and perspectives on agricultural remote sensing research and applications in China. J. Remote Sens. 2016, 20, 748–767. [Google Scholar]
  4. Thenkabail, P.S. Global Croplands and their Importance for Water and Food Security in the Twenty-first Century: Towards an Ever Green Revolution that Combines a Second Green Revolution with a Blue Revolution. Remote Sens. 2010, 2, 2305–2312. [Google Scholar] [CrossRef] [Green Version]
  5. Atzberger, C. Advances in remote sensing of agriculture: Context description, existing operational monitoring systems and major information needs. Remote Sens. 2013, 5, 949–981. [Google Scholar] [CrossRef] [Green Version]
  6. Hao, P.; Wang, L.; Niu, Z.; Aablikim, A.; Huang, N.; Xu, S.; Chen, F. The potential of time series merged from Landsat-5 TM and HJ-1 CCD for crop classification: A case study for Bole and Manas Counties in Xinjiang, China. Remote Sens. 2014, 6, 7610–7631. [Google Scholar] [CrossRef] [Green Version]
  7. Song, J.W.; Zhang, Y.J.; Li, X.C.; Yang, W.Z. Comparison between GF-1 and Landsat-8 images in land cover classification. Prog. Geogr. 2016, 35, 255–263. [Google Scholar]
  8. Cai, Y.; Guan, K.; Peng, J.; Wang, S.; Seifert, C.; Wardlow, B.; Li, Z. A high-performance and in-season classification system of field-level crop types using time-series Landsat data and a machine learning approach. Remote Sens. Environ. 2018, 210, 35–47. [Google Scholar] [CrossRef]
  9. Li, H.K.; Wu, J.; Wang, X.L. Object oriented land use classification of Dongjiang River Basin based on GF-1 image. Trans. Chin. Soc. Agric. Eng. 2018, 34, 245–252. [Google Scholar]
  10. Liu, J.; Wang, L.; Teng, F.; Yang, L.; Gao, J.; Yao, B.; Yang, F. Impact of red edge waveband of RapidEye satellite on estimation accuracy of crop planting area. Trans. Chin. Soc. Agric. Eng. 2016, 32, 140–148. [Google Scholar]
  11. Delegido, J.; Verrelst, J.; Meza, C.M.; Rivera, J.P.; Alonso, L.; Moreno, J. A red-edge spectral index for remote sensing estimation of green LAI over agroecosystems. Eur. J. Agron. 2013, 46, 42–52. [Google Scholar] [CrossRef]
  12. She, B.; Huang, J.; Shi, J.; Wei, C. Extracting oilseed rape growing regions based on variation characteristics of red edge position. Trans. Chin. Soc. Agric. Eng. 2013, 29, 145–152. [Google Scholar]
  13. Kanke, Y.; Tubana, B.; Dalen, M.; Harrell, D. Evaluation of red and red-edge reflectance-based vegetation indices for rice biomass and grain yield prediction models in paddy fields. Precis. Agric. 2016, 17, 507–530. [Google Scholar] [CrossRef]
  14. Qin, Z.F.; Chang, Q.; Shen, J.; Yu, Y.; Liu, J.Q. Red Edge Characteristics and SPAD Estimation Model Using Hyperspectral Data for Rice in Ningxia Irrigation Zone. Geomat. Inf. Sci. Wuhan Univ. 2016, 41, 1168–1175. [Google Scholar]
  15. Qiu, S.; He, B.; Yin, C.; Liao, Z. Assessments of Sentinel 2 vegetation red-edge spectral bands for improving land cover classification. ISPRS Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2017, 42, 1055–1059. [Google Scholar] [CrossRef] [Green Version]
  16. Forkuor, G.; Dimobe, K.; Serme, I.; Tondoh, J.E. Landsat-8 vs. Sentinel-2: Examining the added value of sentinel-2’s red-edge bands to land-use and land-cover mapping in Burkina Faso. GIScience Remote Sens. 2018, 55, 331–354. [Google Scholar] [CrossRef]
  17. Schuster, C.; Förster, M.; Kleinschmit, B. Testing the red edge channel for improving land-use classifications based on high-resolution multi-spectral satellite data. Int. J. Remote Sens. 2012, 33, 5583–5599. [Google Scholar] [CrossRef]
  18. Liu, H.P.; An, H.J. Greening tree species spectrum characteristics analysis in Huhhot based on worldview-Ⅱ. J. Inn. Mong. Agric. Univ. 2014, 35, 41–45. [Google Scholar]
  19. Immitzer, M.; Vuolo, F.; Atzberger, C. First experience with Sentinel-2 data for crop and tree species classifications in central Europe. Remote Sens. 2016, 8, 166. [Google Scholar] [CrossRef]
  20. Liu, J.Y.; Xin, C.L.; Wu, H.G.; Zeng, Q.W.; Shi, J.J. Potential Application of GF-6 WFV Data in Forest Types Monitoring. Spacecr. Recovery Remote Sens. 2019, 40, 107–116. [Google Scholar]
  21. Kim, H.O.; Yeom, J.M. Effect of red-edge and texture features for object-based paddy rice crop classification using RapidEye multi-spectral satellite image data. Int. J. Remote Sens. 2014, 35, 7046–7068. [Google Scholar] [CrossRef]
  22. Ustuner, M.; Sanli, F.B.; Abdikan, S.; Esetlili, M.T.; Kurucu, Y. Crop Type Classification Using Vegetation Indices of RapidEye Imagery. ISPRS Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2014, 40, 195–198. [Google Scholar] [CrossRef] [Green Version]
  23. Hang, S.Y.; Yang, L.; Chen, X.; Yao, Y. Study of typical arid crops classification based on machine learning. Spectrosc. Spectr. Anal. 2018, 38, 3169–3176. [Google Scholar]
  24. Wu, J.; Lu, Y.N.; Li, C.B.; Li, Q.H. Fine Classification of County Crops Based on Multi-temporal Images of Sentinel-2A. Trans. Chin. Soc. Agric. Mach. 2019, 50, 194–200. [Google Scholar]
  25. Talukdar, S.; Singha, P.; Mahato, S.; Shahfahad; Pal, S.; Liou, Y.-A.; Rahman, A. Land-Use Land-Cover Classification by Machine Learning Classifiers for Satellite Observations—A Review. Remote Sens. 2020, 12, 1135. [Google Scholar] [CrossRef] [Green Version]
  26. Zeraatpisheh, M.; Ayoubi, S.; Jafari, A.; Tajik, S.; Finke, P. Digital mapping of soil properties using multiple machine learning in a semi-arid region, central Iran. Geoderma 2019, 338, 445–452. [Google Scholar] [CrossRef]
  27. Sonobe, R.; Yamaya, Y.; Tani, H.; Wang, X.; Kobayashi, N.; Mochizuki, K.-I. Assessing the suitability of data from Sentinel-1A and 2A for crop classification. GIScience Remote. Sens. 2017, 54, 918–938. [Google Scholar] [CrossRef]
  28. Maponya, M.G.; van Niekerk, A.; Mashimbye, Z.E. Pre-harvest classification of crop types using a Sentinel-2 time-series and machine learning. Comput. Electron. Agric. 2020, 169, 105164. [Google Scholar] [CrossRef]
  29. Liu, J.; Wang, L.M.; Yang, F.G.; Yang, L.B.; Wang, X.L. Remote sensing estimation of crop planting area based on HJ time-series images. Trans. Chin. Soc. Agric. Eng. 2015, 31, 199–206. [Google Scholar]
  30. Hao, P.; Tang, H.; Chen, Z.; Liu, Z. Early-season crop mapping using improved artificial immune network (IAIN) and Sentinel data. PeerJ 2018, 6, e5431. [Google Scholar] [CrossRef]
  31. Huang, G.Q.; Zhao, Q.G. Mode of rotation/fallow management in typical areas of China and its development strategy. Acta Pedol. Sin. 2018, 55, 283–292. [Google Scholar]
  32. Xie, H.L.; Cheng, L.J. Influence factors and ecological compensation standard of winter wheat-fallow in the ground water funnel area. J. Nat. Resour. 2017, 32, 2012–2022. [Google Scholar]
  33. Wang, M.; Guo, B.B.; Long, X.X.; Xue, L.; Cheng, Y.F.; Jin, S.Y.; Zhou, X. On-orbit geometric calibration and accuracy verification of GF-6 WFV camera. Acta Geod. Cartogr. Sin. 2020, 49, 171–180. [Google Scholar]
  34. Zhang, Q.Y.; Li, Z.; Xia, C.Z.; Chen, J.; Peng, D.L. Tree species classification based on the new bands of GF-6 remote sensing satellite. J. Geo-Inf. Sci. 2019, 21, 1619–1628. [Google Scholar]
  35. Liu, C.; Zhao, C.; Zhang, L.Y. A new method of hyperspectral remote sensing image dimensional reduction. J. Image Graph. 2005, 10, 218–222. [Google Scholar]
  36. Zhang, A.W.; Du, N.; Kang, X.Y.; Guo, F.C. Hyperspectral adaptive band selection method through nonlinear transform and information adjacency correlation. Infrared Laser Eng. 2017, 46, 221–229. [Google Scholar]
  37. Zhang, Y.; Guan, Y.L. Hyperspectral band reduction by combining clustering with adaptive band selection. Remote Sens. Inf. 2018, 33, 66–70. [Google Scholar]
  38. Ma, N.; Hu, Y.F.; Zhuang, D.F.; Wang, X.S. Determination on the optimum band combination of HJ-1A hyperspectral data in the case region of Dongguan based on optimum index factor and J–M distance. Remote Sens. Technol. Appl. 2010, 25, 358–365. [Google Scholar]
  39. Bruzzone, L.; Roli, F.; Serpico, S.B. An extension of the Jeffreys-Matusita distance to multiclass cases for feature selection. IEEE Trans. Geosci. Remote Sens. 1995, 33, 1318–1321. [Google Scholar] [CrossRef] [Green Version]
  40. Hao, P.; Wu, M.; Niu, Z.; Wang, L.; Zhan, Y. Estimation of different data compositions for early-season crop type classification. PeerJ 2018, 6, e4834. [Google Scholar] [CrossRef]
  41. Haralick, R.M.; Shanmugam, K.; Dinstein, I.H. Textural features for image classification. IEEE Trans. Syst. Man Cybern. 1973, 3, 610–621. [Google Scholar] [CrossRef] [Green Version]
  42. Palsson, F.; Sveinsson, J.R.; Ulfarsson, M.O.; Benediktsson, J.A. Model-based fusion of multi-and hyperspectral images using PCA and wavelets. IEEE Trans. Geosci. Remote Sens. 2014, 53, 2652–2663. [Google Scholar] [CrossRef]
  43. Zhao, Y.S. Principles and Methods of Remote Sensing Application Analysis, 2nd ed.; Science Press: Beijing, China, 2013; pp. 174–175. [Google Scholar]
  44. Zhang, L.; Gong, Z.N.; Wang, Q.W.; Jin, D.; Wang, X. Wetland mapping of Yellow River Delta wetlands based on multi-feature optimization of Sentinel-2 images. J. Remote Sens. 2019, 23, 313–326. [Google Scholar]
  45. Fang, C.Y.; Wang, L.; Xu, H.Q. A comparative study of different red edge indices for remote sensing detection of urban grassland health status. J. Geo-Inf. Sci. 2017, 19, 1382–1392. [Google Scholar]
  46. Xie, Q.Y. Research on Leaf Area Index Retrieve Methods Based on The Red Edge Bands from Multi-Platform Remote Sensing Data. Ph.D. Thesis, University of Chinese Academy of Sciences, Institute of Remote Sensing and Digital Earth, Chinese Academy of Sciences, Beijing, China, 2017. [Google Scholar]
  47. Gitelson, A.A.; Merzlyak, M.N. Spectral reflectance changes associated with autumn senescence of Aesculus hippocastanum L. and Acer platanoides L. leaves. Spectral features and relation to chlorophyll estimation. J. Plant Physiol. 1994, 143, 286–292. [Google Scholar] [CrossRef]
  48. Barnes, E.M.; Clarke, T.R.; Richards, S.E.; Colaizzi, P.D.; Haberland, J.; Kostrzewski, M.; Moran, M.S. Coincident detection of crop water stress, nitrogen status and canopy density using ground based multispectral data. In Proceedings of the Fifth International Conference on Precision Agriculture, Bloomington, MN, USA, 27–30 July 2000; p. 1619. [Google Scholar]
49. Gitelson, A.A.; Gritz, Y.; Merzlyak, M.N. Relationships between leaf chlorophyll content and spectral reflectance and algorithms for non-destructive chlorophyll assessment in higher plant leaves. J. Plant Physiol. 2003, 160, 271–282.
50. Gitelson, A.A.; Keydan, G.P.; Merzlyak, M.N. Three-band model for noninvasive estimation of chlorophyll, carotenoids, and anthocyanin contents in higher plant leaves. Geophys. Res. Lett. 2006, 33, L11402.
51. Daughtry, C.S.; Walthall, C.L.; Kim, M.S.; De Colstoun, E.B.; McMurtrey, J.E. Estimating corn leaf chlorophyll concentration from leaf and canopy reflectance. Remote Sens. Environ. 2000, 74, 229–239.
52. Haboudane, D.; Miller, J.R.; Pattey, E.; Zarco-Tejada, P.J.; Strachan, I.B. Hyperspectral vegetation indices and novel algorithms for predicting green LAI of crop canopies: Modeling and validation in the context of precision agriculture. Remote Sens. Environ. 2004, 90, 337–352.
53. Dash, J.; Curran, P.J. MTCI: The MERIS terrestrial chlorophyll index. Int. J. Remote Sens. 2004, 25, 5403–5413.
54. Blum, A.; Langley, P. Selection of relevant features and examples in machine learning. Artif. Intell. 1997, 97, 245–271.
55. Guyon, I.; Elisseeff, A. An introduction to variable and feature selection. J. Mach. Learn. Res. 2003, 3, 1157–1182.
56. Costanza, M.C.; Afifi, A.A. Comparison of Stopping Rules in Forward Stepwise Discriminant Analysis. J. Am. Stat. Assoc. 1979, 74, 777–785.
57. Zhang, H.; Li, Q.; Liu, J.; Du, X.; Dong, T.; McNairn, H.; Shang, J. Object-based crop classification using multi-temporal SPOT-5 imagery and textural features with a Random Forest classifier. Geocarto Int. 2018, 33, 1017–1035.
58. Wang, N.; Li, Q.Z.; Du, X.; Zhang, Y.; Zhao, L.C.; Wang, H.Y. Identification of main crops based on the univariate feature selection in Subei. J. Remote Sens. 2017, 21, 519–530.
59. Van der Linden, S.; Rabe, A.; Held, M.; Jakimow, B.; Leitão, P.J.; Okujeni, A.; Hostert, P. The EnMAP-Box--A Toolbox and Application Programming Interface for EnMAP Data Processing. Remote Sens. 2015, 7, 11249–11266.
60. Htitiou, A.; Boudhar, A.; Lebrini, Y.; Hadria, R.; Lionboui, H.; Elmansouri, L.; Benabdelouahab, T. The performance of random forest classification based on phenological metrics derived from Sentinel-2 and Landsat 8 to map crop cover in an irrigated semi-arid region. Remote Sens. Earth Syst. Sci. 2019, 2, 208–224.
61. He, Y.; Huang, C.; Li, H.; Liu, Q.S.; Liu, G.H.; Zhou, Z.C.; Zhang, C.C. Land-cover Classification of Random Forest based on Sentinel-2A Image Feature Optimization. Resour. Sci. 2019, 41, 992–1001.
62. Cohen, J. A Coefficient of Agreement for Nominal Scales. Educ. Psychol. Meas. 1960, 20, 37–46.
63. Congalton, R.G. A review of assessing the accuracy of classifications of remotely sensed data. Remote Sens. Environ. 1991, 37, 35–46.
64. Foody, G.M. Thematic map comparison: Evaluating the statistical significance of differences in classification accuracy. Photogramm. Eng. Remote Sens. 2004, 70, 627–633.
65. Vasilakos, C.; Kavroudakis, D.; Georganta, A. Machine learning classification ensemble of multitemporal Sentinel-2 images: The case of a mixed Mediterranean ecosystem. Remote Sens. 2020, 12, 2005.
66. Li, X.; Chen, G.; Liu, J.; Chen, W.; Cheng, X.; Liao, Y. Effects of RapidEye imagery’s red-edge band and vegetation indices on land cover classification in an arid region. Chin. Geogr. Sci. 2017, 27, 827–835.
67. Li, Q.; Wang, C.; Zhang, B.; Lu, L. Object-based crop classification with Landsat-MODIS enhanced time-series data. Remote Sens. 2015, 7, 16091–16107.
68. Zhang, H.; Li, Q.; Liu, J.; Shang, J.; Du, X.; McNairn, H.; Liu, M. Image Classification Using RapidEye Data: Integration of Spectral and Textual Features in a Random Forest Classifier. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2017, 10, 5334–5349.
69. Homer, M.S. An introduction to secondary data analysis with IBM SPSS statistics. Educ. Rev. 2018, 70, 251–252.
70. Pedregosa, F.; Varoquaux, G.; Gramfort, A.; Michel, V.; Thirion, B.; Grisel, O.; Duchesnay, E. Scikit-learn: Machine learning in Python. J. Mach. Learn. Res. 2011, 12, 2825–2830.
71. Raschka, S. Python Machine Learning; Packt Publishing: Birmingham, UK, 2015; pp. 124–126.
Figure 1. Geographical location and sample distribution of the study area.
Figure 2. Technical route of the study.
Figure 3. JBh distance of different spectral classification schemes for GF-6 WFV data.
Figure 4. Importance evaluation results of red edge indices based on SDA and RF.
Figure 5. Crop classification results in Hengshui city.
Figure 6. Overall classification accuracy and kappa coefficient of all different classification schemes.
Table 1. Main information parameters of GF-6 WFV data.
Band Number | Band Name | Central Wavelength (nm) | Wavelength Range (nm) | Calibration Coefficient in 2019 | Spatial Resolution (m)
B1 | Blue (B) | 485 | 450–520 | 0.0705 | 16
B2 | Green (G) | 555 | 520–590 | 0.0567 | 16
B3 | Red (R) | 660 | 630–690 | 0.0516 | 16
B4 | Near-infrared (NIR) | 830 | 770–890 | 0.0322 | 16
B5 | Red edge 1 (RE1) | 710 | 690–730 | 0.0532 | 16
B6 | Red edge 2 (RE2) | 750 | 730–770 | 0.0453 | 16
B7 | Purple (P) | 425 | 400–450 | 0.0786 | 16
B8 | Yellow (Y) | 610 | 590–630 | 0.0585 | 16
Table 2. Number of samples for different classes.
Type | Training Samples (Polygons) | Training Samples (Pixels) | Validation Samples (Polygons) | Validation Samples (Pixels) | Total (Polygons)
Summer maize | 92 | 5855 | 92 | 4690 | 184
Spring maize | 50 | 3434 | 50 | 3055 | 100
Cotton | 35 | 1788 | 35 | 1721 | 70
Minor crops | 30 | 415 | 30 | 388 | 60
Greenhouses | 15 | 396 | 15 | 391 | 30
Orchards | 25 | 871 | 25 | 855 | 50
Woods | 20 | 1234 | 20 | 881 | 40
Cities and towns | 24 | 5164 | 24 | 4609 | 48
Water bodies | 16 | 3143 | 16 | 3465 | 32
Table 3. Red edge vegetation index based on GF-6 WFV data (10 kinds).
Red Edge Indices | Calculation Formula (GF-6 WFV)
Normalized Difference Red Edge (NDRE) [47] | (ρ_RE2 − ρ_RE1) / (ρ_RE2 + ρ_RE1)
Normalized Difference Vegetation Index red edge 1 (NDVIre1) [48] | (ρ_NIR − ρ_RE1) / (ρ_NIR + ρ_RE1)
Normalized Difference Vegetation Index red edge 2 (NDVIre2) [48] | (ρ_NIR − ρ_RE2) / (ρ_NIR + ρ_RE2)
Chlorophyll Index red edge 1 (CIre1) [49] | ρ_NIR / ρ_RE1 − 1
Chlorophyll Index red edge 2 (CIre2) [50] | ρ_NIR / ρ_RE2 − 1
Modified Chlorophyll Absorption Ratio Index 1 (MCARI1) [51] | [(ρ_RE1 − ρ_R) − 0.2 (ρ_RE1 − ρ_G)] × (ρ_RE1 / ρ_R)
Modified Chlorophyll Absorption Ratio Index 2 (MCARI2) [52] | [(ρ_RE2 − ρ_R) − 0.2 (ρ_RE2 − ρ_G)] × (ρ_RE2 / ρ_R)
Transformed Chlorophyll Absorption Reflectance Index 1 (TCARI1) [52] | 3 [(ρ_RE1 − ρ_R) − 0.2 (ρ_RE1 − ρ_G)] × (ρ_RE1 / ρ_R)
Transformed Chlorophyll Absorption Reflectance Index 2 (TCARI2) [51] | 3 [(ρ_RE2 − ρ_R) − 0.2 (ρ_RE2 − ρ_G)] × (ρ_RE2 / ρ_R)
MERIS Terrestrial Chlorophyll Index (MTCI) [53] | (ρ_RE2 − ρ_RE1) / (ρ_RE1 − ρ_R)
ρ indicates the surface reflectance of a band of GF-6 WFV data. The subscript NIR refers to the near-infrared band, RE1 to the red edge 710 band, RE2 to the red edge 750 band, R to the red band and G to the green band.
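The indices in Table 3 are simple band-difference and band-ratio combinations, so they can be computed directly from the GF-6 WFV surface reflectance bands. The sketch below illustrates the calculations with NumPy; it is not the authors' processing chain, and the argument names are placeholders for per-pixel reflectance arrays.

```python
import numpy as np

def red_edge_indices(nir, re1, re2, red, green):
    """Compute the ten red edge indices of Table 3.

    All arguments are NumPy arrays of surface reflectance with the same shape:
    nir = near-infrared band, re1 = red edge 710 band, re2 = red edge 750 band,
    red and green = the corresponding visible bands.
    """
    return {
        "NDRE":    (re2 - re1) / (re2 + re1),
        "NDVIre1": (nir - re1) / (nir + re1),
        "NDVIre2": (nir - re2) / (nir + re2),
        "CIre1":   nir / re1 - 1,
        "CIre2":   nir / re2 - 1,
        "MCARI1":  ((re1 - red) - 0.2 * (re1 - green)) * (re1 / red),
        "MCARI2":  ((re2 - red) - 0.2 * (re2 - green)) * (re2 / red),
        "TCARI1":  3 * ((re1 - red) - 0.2 * (re1 - green)) * (re1 / red),
        "TCARI2":  3 * ((re2 - red) - 0.2 * (re2 - green)) * (re2 / red),
        "MTCI":    (re2 - re1) / (re1 - red),
    }
```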
Table 4. Different red edge feature classification schemes.
Classification features: traditional four bands (R, G, B, NIR); red edge spectral features (Red edge 710, Red edge 750); red edge texture features (Red edge texture 710, Red edge texture 750, Near-infrared texture); red edge index features (Optimal red edge index 1, 2, 3 and 4).
Classification schemes: A-1 to A-4, B-1 to B-4 and C-1 to C-4, each built from the traditional four bands combined with a different subset of the red edge features above. Schemes A correspond to the red edge spectral features, schemes B to the red edge texture features and schemes C to the red edge index features.
Table 5. ABS index and ranking of each band of GF-6 WFV data.
Band Name | Band Order | ABS Index | Ranking
Purple (P) | 1 | 0 | 7
Blue (B) | 2 | 383.4 | 6
Green (G) | 3 | 474.8 | 5
Red (R) | 4 | 556.3 | 4
Near-infrared (NIR) | 5 | 2418.7 | 1
Red edge 710 (RE1) | 6 | 728.3 | 3
Red edge 750 (RE2) | 7 | 1732.5 | 2
Yellow (Y) | 8 | 0 | 7
Table 6. Red edge indices importance score based on SDA and RF.
Red Edge Indices | F Value (SDA) | MDG (RF)
CIre1 | 237.268 | 0.111
CIre2 | 46.414 | 0.091
MCARI1 | 64.475 | 0.090
MCARI2 | 54.886 | 0.098
MTCI | 227.010 | 0.137
NDRE | 387.008 | 0.108
NDVIre1 | 337.605 | 0.110
NDVIre2 | 52.812 | 0.092
TCARI1 | 115.514 | 0.104
TCARI2 | 9.566 | 0.059
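The MDG column in Table 6 is the Gini-based importance reported by a random forest, which scikit-learn [70] exposes as feature_importances_ (normalised so that the scores sum to 1, as the Table 6 values do). A minimal sketch of this ranking step, assuming the per-pixel index values and crop labels are available; the random arrays below are placeholders, not data from this study:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

index_names = ["NDRE", "NDVIre1", "NDVIre2", "CIre1", "CIre2",
               "MCARI1", "MCARI2", "TCARI1", "TCARI2", "MTCI"]

# Placeholder data: in the study, X would hold the ten red edge index values
# of the training pixels and y their crop class labels.
rng = np.random.default_rng(0)
X = rng.random((1000, len(index_names)))
y = rng.integers(0, 9, size=1000)

rf = RandomForestClassifier(n_estimators=500, random_state=0)
rf.fit(X, y)

# feature_importances_ is the mean decrease in Gini impurity (MDG) of each
# feature, averaged over all trees and normalised to sum to 1.
for name, mdg in sorted(zip(index_names, rf.feature_importances_),
                        key=lambda item: item[1], reverse=True):
    print(f"{name}: {mdg:.3f}")
```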
Table 7. Classification accuracy statistics of red edge spectral features.
Class | Scheme A-1 (PA% / UA% / F1%) | Scheme A-2 (PA% / UA% / F1%) | Scheme A-3 (PA% / UA% / F1%) | Scheme A-4 (PA% / UA% / F1%)
Summer maize | 87.31 / 74.84 / 80.59 | 91.15 / 77.98 / 84.05 | 88.61 / 77.29 / 82.56 | 91.22 / 78.34 / 84.29
Spring maize | 48.35 / 58.38 / 52.89 | 55.61 / 68.92 / 61.55 | 53.39 / 64.16 / 58.28 | 56.20 / 71.66 / 62.99
Cotton | 84.83 / 83.33 / 84.07 | 87.68 / 85.21 / 86.42 | 87.91 / 85.05 / 86.45 | 91.69 / 86.99 / 89.27
Minor crops | 30.93 / 60.30 / 40.88 | 29.64 / 47.92 / 36.62 | 29.38 / 57.29 / 38.84 | 27.32 / 49.53 / 35.21
Greenhouses | 75.70 / 86.80 / 80.87 | 75.70 / 88.89 / 81.76 | 75.45 / 90.77 / 82.40 | 75.96 / 91.67 / 83.08
Orchards | 47.72 / 54.91 / 51.06 | 61.05 / 73.31 / 66.62 | 57.54 / 65.51 / 61.26 | 61.17 / 72.14 / 66.20
Woods | 75.60 / 75.94 / 75.76 | 82.41 / 80.94 / 81.66 | 79.80 / 76.08 / 77.89 | 83.20 / 77.24 / 80.11
Cities and towns | 98.42 / 74.08 / 84.53 | 98.72 / 75.86 / 85.79 | 98.85 / 74.65 / 85.06 | 98.78 / 79.14 / 87.88
Water bodies | 56.94 / 97.77 / 71.96 | 61.15 / 98.24 / 75.37 | 58.07 / 97.91 / 72.90 | 68.40 / 97.97 / 80.56
OA (%) | 74.95 | 78.84 | 77.15 | 80.55
Kappa coefficient | 0.6937 | 0.7414 | 0.7208 | 0.7627
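The per-class producer's accuracy (PA), user's accuracy (UA) and F1 score in Tables 7–9, together with the overall accuracy (OA) and kappa coefficient [62,63], all follow from the confusion matrix of each classification scheme. A minimal sketch of these calculations, using a small illustrative matrix rather than data from this study:

```python
import numpy as np

# Illustrative 3-class confusion matrix: rows are reference classes,
# columns are mapped (classified) classes.
cm = np.array([[50,  5,  2],
               [ 4, 40,  6],
               [ 1,  3, 45]], dtype=float)

n = cm.sum()
pa = np.diag(cm) / cm.sum(axis=1)   # producer's accuracy per class
ua = np.diag(cm) / cm.sum(axis=0)   # user's accuracy per class
f1 = 2 * pa * ua / (pa + ua)        # per-class F1 score
oa = np.trace(cm) / n               # overall accuracy

# Cohen's kappa: agreement corrected for the chance agreement p_e
pe = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n ** 2
kappa = (oa - pe) / (1 - pe)

print("PA (%):", np.round(100 * pa, 2))
print("UA (%):", np.round(100 * ua, 2))
print("F1 (%):", np.round(100 * f1, 2))
print(f"OA = {100 * oa:.2f}%, kappa = {kappa:.4f}")
```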
Table 8. Classification accuracy statistics of red edge texture features.
Class | Scheme B-1 (PA% / UA% / F1%) | Scheme B-2 (PA% / UA% / F1%) | Scheme B-3 (PA% / UA% / F1%) | Scheme B-4 (PA% / UA% / F1%)
Summer maize | 90.64 / 77.87 / 83.77 | 88.17 / 77.17 / 82.30 | 90.15 / 78.57 / 83.96 | 88.06 / 75.15 / 81.09
Spring maize | 53.85 / 67.01 / 59.71 | 55.35 / 65.42 / 59.96 | 56.01 / 69.05 / 61.85 | 47.89 / 60.06 / 53.29
Cotton | 89.25 / 85.52 / 87.34 | 87.45 / 86.44 / 86.94 | 91.34 / 86.52 / 88.86 | 86.23 / 83.94 / 85.07
Minor crops | 27.32 / 50.72 / 35.51 | 28.87 / 51.85 / 37.09 | 27.32 / 53.81 / 36.24 | 30.67 / 57.49 / 40.00
Greenhouses | 75.19 / 80.33 / 77.67 | 77.24 / 83.66 / 80.32 | 78.52 / 80.79 / 79.64 | 72.63 / 81.61 / 76.86
Orchards | 58.95 / 64.95 / 61.80 | 50.29 / 59.23 / 54.39 | 53.33 / 62.72 / 57.65 | 44.33 / 53.91 / 48.65
Woods | 82.52 / 82.33 / 82.42 | 78.09 / 73.66 / 75.81 | 81.27 / 74.12 / 77.53 | 74.91 / 68.75 / 71.69
Cities and towns | 96.92 / 97.77 / 97.34 | 97.94 / 77.71 / 86.66 | 97.03 / 98.07 / 97.55 | 98.24 / 81.09 / 88.84
Water bodies | 99.83 / 97.66 / 98.73 | 65.11 / 97.03 / 77.93 | 99.83 / 97.46 / 98.63 | 72.38 / 98.24 / 83.35
OA (%) | 84.71 | 77.95 | 84.90 | 77.56
Kappa coefficient | 0.8142 | 0.731 | 0.8167 | 0.7263
Table 9. Classification accuracy statistics of red edge index features.
Class | Scheme C-1 (PA% / UA% / F1%) | Scheme C-2 (PA% / UA% / F1%) | Scheme C-3 (PA% / UA% / F1%) | Scheme C-4 (PA% / UA% / F1%)
Summer maize | 90.94 / 77.64 / 83.76 | 91.07 / 76.95 / 83.42 | 89.49 / 76.73 / 82.62 | 90.75 / 77.64 / 83.68
Spring maize | 55.29 / 69.39 / 61.54 | 52.83 / 69.27 / 59.94 | 53.62 / 67.60 / 59.80 | 55.58 / 69.19 / 61.64
Cotton | 87.57 / 86.71 / 87.14 | 90.35 / 84.69 / 87.43 | 90.94 / 88.17 / 89.53 | 88.32 / 87.16 / 87.74
Minor crops | 30.15 / 49.79 / 37.56 | 33.51 / 48.51 / 39.64 | 28.61 / 46.64 / 35.46 | 30.67 / 51.07 / 38.32
Greenhouses | 75.70 / 86.55 / 80.76 | 74.17 / 90.06 / 81.35 | 78.26 / 90.00 / 83.72 | 73.40 / 87.50 / 79.83
Orchards | 64.09 / 74.25 / 68.79 | 57.66 / 73.58 / 64.65 | 61.52 / 67.44 / 64.34 | 63.27 / 73.51 / 68.01
Woods | 83.54 / 80.00 / 81.73 | 84.22 / 79.96 / 82.03 | 79.00 / 79.82 / 79.41 | 83.43 / 80.24 / 81.80
Cities and towns | 98.68 / 73.21 / 84.06 | 98.31 / 76.61 / 86.11 | 98.74 / 73.81 / 84.47 | 98.76 / 74.44 / 84.89
Water bodies | 54.86 / 97.84 / 70.30 | 62.97 / 97.54 / 76.53 | 56.25 / 97.89 / 71.45 | 57.89 / 98.00 / 72.78
OA (%) | 77.82 | 78.82 | 77.48 | 78.35
Kappa coefficient | 0.7288 | 0.7413 | 0.7248 | 0.7354
Table 10. Result of the McNemar's test for different combination schemes of red edge features.
Analysis | Scheme 1 | Scheme 2 | f12 | f21 | χ² | p
1 | A-1 | A-2 | 6 | 786 | 768.18 | <0.0001%
2 | A-1 | A-3 | 8 | 449 | 425.56 | <0.0001%
3 | A-1 | A-4 | 15 | 1139 | 1094.78 | <0.0001%
4 | A-2 | A-3 | 350 | 11 | 318.34 | <0.0001%
5 | A-2 | A-4 | 10 | 354 | 325.09 | <0.0001%
6 | A-3 | A-4 | 12 | 695 | 659.81 | <0.0001%
7 | B-1 | B-2 | 1464 | 108 | 1169.68 | <0.0001%
8 | B-1 | B-3 | 83 | 121 | 7.08 | 0.8%
9 | B-1 | B-4 | 1509 | 75 | 1583.99 | <0.0001%
10 | B-2 | B-3 | 49 | 1143 | 1302.44 | <0.0001%
11 | B-2 | B-4 | 352 | 274 | 9.72 | 0.2%
12 | B-3 | B-4 | 1542 | 70 | 1344.16 | <0.0001%
13 | C-1 | C-2 | 154 | 355 | 79.37 | <0.0001%
14 | C-1 | C-3 | 188 | 120 | 15.01 | 0.01%
15 | C-1 | C-4 | 27 | 134 | 71.11 | <0.0001%
16 | C-2 | C-3 | 373 | 104 | 151.70 | <0.0001%
17 | C-2 | C-4 | 248 | 154 | 21.98 | 0.0003%
18 | C-3 | C-4 | 65 | 240 | 100.41 | <0.0001%
19 | A-1 | B-3 | 79 | 2075 | 1849.59 | <0.0001%
20 | A-1 | C-2 | 12 | 789 | 753.72 | <0.0001%
21 | A-4 | B-3 | 228 | 1100 | 572.58 | <0.0001%
22 | A-4 | C-2 | 381 | 34 | 293.06 | <0.0001%
23 | B-3 | C-2 | 1409 | 190 | 929.31 | <0.0001%
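In Table 10, f12 and f21 are the two off-diagonal counts of the 2 × 2 agreement table for each pair of schemes, i.e., the validation pixels classified correctly by one scheme but misclassified by the other; the McNemar statistic [64] compares them against a χ² distribution with one degree of freedom. A minimal sketch of this test, using the counts of analysis 1 as an example:

```python
from scipy.stats import chi2

def mcnemar_chi2(f12, f21):
    """McNemar's chi-square without continuity correction (1 degree of freedom)."""
    stat = (f12 - f21) ** 2 / (f12 + f21)
    p_value = chi2.sf(stat, df=1)
    return stat, p_value

# Analysis 1 of Table 10: scheme A-1 vs. scheme A-2
stat, p_value = mcnemar_chi2(6, 786)
print(f"chi2 = {stat:.2f}, p = {p_value:.3g}")   # chi2 = 768.18, p << 0.0001
```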