Article

Mapping Winter Wheat with Multi-Temporal SAR and Optical Images in an Urban Agricultural Region

1 College of Resources and Environmental Sciences, Nanjing Agricultural University, Nanjing 210095, China
2 College of Public Administration, Nanjing Agricultural University, Nanjing 210095, China
* Author to whom correspondence should be addressed.
Sensors 2017, 17(6), 1210; https://doi.org/10.3390/s17061210
Submission received: 7 April 2017 / Revised: 21 May 2017 / Accepted: 21 May 2017 / Published: 25 May 2017

Abstract
Winter wheat is the second largest food crop in China. Obtaining reliable winter wheat acreage is important for guaranteeing food security in the most populous country in the world. This paper assesses the feasibility of in-season winter wheat mapping and investigates the potential classification improvement from using SAR (synthetic aperture radar) images, optical images, and the integration of both types of data in urban agricultural regions with complex planting structures in Southern China. Both SAR (Sentinel-1A) and optical (Landsat-8) data were acquired, and classification using different combinations of Sentinel-1A-derived information and optical images was performed with a support vector machine (SVM) and a random forest (RF) classifier. Interferometric coherence and texture images were derived and used to assess the effect of adding them to the backscatter intensity images on the classification accuracy. The results showed that four Sentinel-1A images acquired before the jointing period of winter wheat can provide satisfactory winter wheat classification accuracy, with an F1 measure of 87.89%. The combination of SAR and optical images achieved the best F1 measure, up to 98.06%. The SVM was superior to RF in terms of overall accuracy and kappa coefficient and was faster than RF, while the RF classifier was slightly better than SVM in terms of the F1 measure. In addition, the classification accuracy was effectively improved by adding the texture and coherence images to the backscatter intensity data.

1. Introduction

Wheat is one of the world’s major food crops [1]. With the acceleration of urbanization in China, the amount of cultivated land has been decreasing, and food security has become an important issue. Winter wheat classification is the basis for acreage and yield estimates, which are important for public policy makers in developing food policies and economic plans [2,3]. The development of remote sensing technology provides a rich data source for crop classification. Remote sensing enables frequent observation over large areas and thus provides a viable method for wheat classification [1,4,5]. Optical remote sensing satellites have been widely used for crop type classification [6]. However, optical remote sensing is susceptible to cloudy and rainy weather, and it is difficult to obtain ideal optical images during the critical period of winter wheat growth in Southern China. Compared with optical remote sensing, synthetic aperture radar (SAR) offers all-weather, day-and-night imaging, canopy penetration, and high resolution [7,8,9]. Because of these advantages, SAR has become an effective data source for crop classification.
Although the classification of SAR images is more difficult than that of optical images [10], several studies have used SAR data for crop monitoring and mapping [11,12,13]. Shao et al. [14] used multi-temporal RADARSAT-1 data to identify rice in the Zhaoqing area of Guangdong Province, China, with a classification accuracy of 91%. Silva et al. [15] compared the use of VV, HV, and HH polarization for crop classification; the results showed that the classification accuracy of HH polarization was better than that of VV and HV polarization. McNairn et al. [16] used C-band SAR data to classify wheat, maize, soybean, and other crops, and found that multi-polarization SAR data yielded higher classification accuracy than single-polarization SAR data. Ferrazzoli et al. [17] also suggested that adding polarization modes can increase the classification accuracy. Several researchers have also found that, compared with single-frequency SAR data, multi-frequency SAR data can effectively improve the classification accuracy [18,19,20].
It has also been demonstrated that adding texture and coherence features improves classification accuracy compared to using backscatter intensity images alone in crop classification [20,21,22,23]. Jia et al. [20] reported that the use of texture features can improve the classification accuracy of SAR data. Similar results were reported by Yayusman and Nagasawa [23], who used a gray-level co-occurrence matrix (GLCM) to extract SAR texture features for crop classification and found the texture features useful. In the work of Parihar et al. [21], classification accuracy was improved by adding coherence information to the backscatter intensity data compared to using backscatter intensity data alone. Liesenberg and Gloaguen’s [24] research showed that the overall accuracy improved when texture and coherence information were added. Therefore, this study evaluated the classification accuracy of winter wheat using texture and coherence features. Because speckle in SAR images causes inter-class confusion [25], many researchers have studied the combination of optical and SAR images to improve classification accuracy [26,27]. The classification accuracy of maize, soybean, and sunflower increased by 6.2%, 16.2%, and 25.9%, respectively, when multi-temporal SAR images were added to optical images [26]. Kussul et al. [28] used RADARSAT-2 and EO-1 data to classify Ukrainian crops and found that the combination of SAR and optical images gave better accuracy. In this paper, the combination of SAR and optical images for improving crop classification accuracy was also evaluated. The selection of the classification algorithm directly affects the classification results. Support vector machine (SVM) [29], random forest (RF) [30], and neural network (NN) [31] classifiers are the most commonly used for SAR data classification. Wang et al. [32] used SAR and optical images to classify land cover and found that the RF algorithm was better than SVM when classifying various land covers using pixel-based image analysis. Other studies have also demonstrated the effectiveness of SVM for SAR data classification [3,33].
However, few reported studies have addressed crop classification at early growth stages; in particular, there is a lack of studies using multi-temporal SAR data for this purpose. Remote sensing data acquired at the end of the season, or later, can yield crop acreages but cannot support in-season crop management [34], such as yield forecasting and irrigation management [11]. Some studies have used the combination of SAR and optical images for crop classification. Forkuor et al. [9] studied the contribution of SAR images to crop classification when used as a supplement to high-resolution optical images. Villa et al. [34] combined multi-temporal SAR and optical data for in-season crop mapping. However, these studies classified with the combination of SAR and optical images and did not investigate the ability of SAR data alone for in-season crop mapping.
The above studies demonstrated that SAR data can produce good classification accuracy. However, SAR data have rarely been applied to winter wheat classification in urban agricultural regions with complex planting structures and fragmented planting parcels, which are more difficult to classify than regions with simple planting structures or less-fragmented parcels. In particular, there is a lack of such studies using multi-temporal Sentinel-1A images for in-season winter wheat mapping. The purpose of this paper was to explore the feasibility of in-season winter wheat mapping and to improve classification accuracy by using SAR data, optical images, and the combination of both to classify winter wheat in an urban agricultural region with a complex planting structure in Southern China. To this end, interferometric coherence, texture, and backscatter intensity images were first extracted from SAR data acquired during winter wheat growth. Satisfactory classification results were then obtained from these Sentinel-1A-derived features and the optical images using the RF and SVM classifiers. The results of this study are relevant not only to urban agricultural regions, but also to areas with simpler planting structures. This study is particularly important for in-season winter wheat mapping with SAR data, and it compares the effects of adding coherence and texture images on winter wheat classification.

2. Study Area and Datasets

2.1. Study Area

The study site is located in a typical urban agricultural region in Gaochun District of Nanjing, the capital of Jiangsu Province, China, with central coordinates of 118°52′ E and 31°19′ N (see Figure 1). The district, covering a total area of 802 km², has a subtropical monsoon climate with an annual average temperature of 16.0 °C and abundant rainfall (an annual average of 1157 mm). The topography is complex, being low in the west and high in the east; the west is a plain, and the east is a low hilly area. The main winter crops are winter wheat and winter rapeseed. Winter wheat grows from late October to June of the following year. Winter rapeseed is sown in early October and harvested in mid-May of the following year. Although the study area is not a major crop production area in China, it is a representative urban agricultural region in Southern China because of its abundant rainfall and complex crop planting structure.

2.2. SAR Satellite Data

Sentinel-1 is an important component of the Global Monitoring for Environment and Security (GMES) programme [35], a joint initiative of the European Commission and the European Space Agency (ESA), and consists of a two-satellite constellation. Sentinel-1A was launched on 3 April 2014, and Sentinel-1B on 25 April 2016 [3]. Sentinel-1A provides C-band images in single and dual polarization with a 12-day repeat cycle [36], and has four imaging modes: strip map (SM), interferometric wide swath (IW), extra wide swath (EW), and wave (WV), each with a different resolution. In this study, six Sentinel-1A images in IW mode with a dual polarization scheme (single-look complex (SLC) products) were acquired from the ESA, as shown in Table 1.

2.3. Optical Satellite Data

Landsat-8 was launched in February 2013 and carries two sensors, the Operational Land Imager (OLI) and the Thermal Infrared Sensor (TIRS). The OLI is a nine-band push-broom scanner with a swath width of 185 km, eight channels at 30 m spatial resolution, and one panchromatic channel at 15 m spatial resolution [37]. Four Landsat-8 OLI images were obtained for the following dates in 2016: 25 February, 12 March, 28 March, and 29 April. These images are Level 1T products downloaded from the USGS (US Geological Survey). In addition, a GaoFen-1 (GF-1) P/MS image with a spatial resolution of 2 m, acquired on 29 April 2016, was obtained. As the first satellite of the Chinese High-Resolution Earth Observation System [38], the GF-1 satellite was launched from the Jiuquan Satellite Launch Centre (Gansu Province, China) in April 2013, and carries two panchromatic/multi-spectral (P/MS) and four wide-field-view (WFV) cameras [39]. The acquired GF-1 P/MS image (Level 1A product), with one panchromatic band and four multispectral bands, served as reference data.

2.4. Field Survey

In order to collect land cover information, a field survey was conducted on 17 May 2016. According to the survey, the land cover types in the study area are winter wheat, winter rapeseed, forest, water body, and urban. The classes that are easily identified by visual interpretation (urban, forest, and water body) were identified mainly from the GF-1 image. GPS coordinates and land cover information were recorded for each survey site, and their spatial extents were transferred to a geographic information system (GIS). Fifty percent of the reference data were selected as training samples by stratified random sampling, and the remaining 50% were used for the accuracy assessment (Table 2). All classifications were conducted on a per-pixel basis.
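The 50/50 stratified random sampling described above can be sketched as follows (a minimal illustration only; the paper’s sampling was performed over GIS reference data, and the class labels below are placeholders):

```python
import random
from collections import defaultdict

def stratified_half_split(labels, seed=0):
    """Split sample indices 50/50 within each land cover class,
    mirroring stratified random sampling of the reference data."""
    rng = random.Random(seed)
    by_class = defaultdict(list)
    for idx, lab in enumerate(labels):
        by_class[lab].append(idx)
    train, test = [], []
    for idxs in by_class.values():
        rng.shuffle(idxs)
        half = len(idxs) // 2
        train.extend(idxs[:half])
        test.extend(idxs[half:])
    return sorted(train), sorted(test)

# Toy reference set: class proportions are preserved in both halves
labels = ["wheat"] * 10 + ["rapeseed"] * 6 + ["water"] * 4
train, test = stratified_half_split(labels)
```

Splitting within each class keeps the class proportions of the training and validation halves identical, which is the point of stratification.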
In this paper, a confusion matrix was used to calculate the overall accuracy, and the kappa coefficient and the F1 measure (Equation (1)) were used to evaluate the classification accuracy. The overall accuracy is computed by dividing all correctly-classified pixels by the entire validation dataset [40,41]. The kappa coefficient is computed to determine whether the values in an error matrix are significantly better than the values in a random assignment [3]. Additionally, the F1 measure, as the harmonic mean of the producer’s and user’s accuracy [42,43], is considered to be more meaningful than the kappa coefficient and the overall accuracy [40]. The F1 measure ranges from 0 to 1; a large F1 measure indicates good results, while a small F1 measure indicates poor results.
F1 measure = (2 × producer’s accuracy × user’s accuracy) / (producer’s accuracy + user’s accuracy)
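Equation (1) translates directly into code (the accuracy values in the example are illustrative, not taken from the paper’s confusion matrices):

```python
def f1_measure(producers_accuracy, users_accuracy):
    """Harmonic mean of producer's and user's accuracy (Equation (1))."""
    return (2 * producers_accuracy * users_accuracy
            / (producers_accuracy + users_accuracy))

# As a harmonic mean, equal inputs return that same value,
# and the result is pulled toward the smaller of the two accuracies.
balanced = f1_measure(0.90, 0.90)
skewed = f1_measure(0.60, 0.95)
```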

3. Methods

3.1. Satellite Data Pre-Processing

In order to reduce errors resulting from instrumental variations in data acquisition, image noise, and misregistration, the optical images were corrected for radiometric and atmospheric effects [44]. The atmospheric correction was performed with the ENVI FLAASH model. In this paper, the normalized difference vegetation index (NDVI, Equation (2)) [45] and the simple ratio index (SR, Equation (3)) [46] were selected and calculated; the latter reduces the effect of variable illumination due to topography [47].
NDVI = (NIR − RED) / (NIR + RED)
SR = NIR / RED
where NIR and RED correspond to the surface reflectance of bands 5 and 4 of Landsat-8, respectively.
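Both indices are simple per-pixel band operations; a sketch with illustrative reflectance values:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized difference vegetation index (Equation (2))."""
    return (nir - red) / (nir + red)

def sr(nir, red):
    """Simple ratio index (Equation (3))."""
    return nir / red

# Illustrative surface reflectance for two pixels:
# dense green vegetation (high NIR, low RED) and bare soil
nir_band = np.array([0.45, 0.25])
red_band = np.array([0.09, 0.20])
ndvi_vals = ndvi(nir_band, red_band)
sr_vals = sr(nir_band, red_band)
```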
The SAR data were pre-processed with SARscape 5.2 software (Sarmap, Purasca, Switzerland); the steps included multi-looking, co-registration, speckle filtering, geocoding, and radiometric calibration. A Lee filter with a 3 × 3 window was applied to reduce speckle noise [48]. The SAR data were resampled to a pixel size of 20 m, and the digital number (DN) values were converted to backscatter coefficients (σ°) on a decibel (dB) scale. The images were then geocoded using the Shuttle Radar Topography Mission (SRTM) DEM. All data were also geometrically rectified using 30 ground control points, with a root mean square error (RMSE) of less than 0.5 pixels. Ground coordinates of these points were obtained from the 1:10,000 LULC map developed by the Nanjing Institute of Surveying and Geotechnical Investigation.
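A minimal local-statistics Lee filter sketch is shown below; the actual processing was done in SARscape, and the noise variance here is an assumed constant rather than a calibrated speckle model:

```python
import numpy as np

def lee_filter(img, size=3, noise_var=0.05):
    """Local-statistics Lee filter sketch with a size x size window.
    noise_var is an assumed speckle noise variance (hypothetical value)."""
    pad = size // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.empty(img.shape)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            win = padded[i:i + size, j:j + size]
            mean, var = win.mean(), win.var()
            w = var / (var + noise_var)  # ~0 in flat areas, near 1 on edges
            out[i, j] = mean + w * (img[i, j] - mean)
    return out
```

On a homogeneous area the local variance is zero, so the filter returns the window mean (full smoothing); near strong edges the weight approaches one and the original pixel value is largely preserved.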

3.2. Feature Sets

3.2.1. Texture Features

Texture is an important feature for improving the classification accuracy of SAR data; it can be quantified by a series of statistical measures and is widely used for crop classification [7,20,23,49]. In this paper, the gray-level co-occurrence matrix (GLCM) proposed by Haralick, which is widely used in spatial and texture feature extraction [50], was used to extract texture features. The following eight texture measures were extracted: mean, variance, entropy, angular second moment, contrast, correlation, dissimilarity, and homogeneity.
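A minimal GLCM sketch for a single pixel offset, with four of the eight measures listed above, is shown below (production code would normally use a library implementation such as scikit-image; moving-window extraction and the remaining measures are omitted):

```python
import numpy as np

def glcm(img, levels, dx=1, dy=0):
    """Normalized gray-level co-occurrence matrix for one offset (dx, dy)."""
    P = np.zeros((levels, levels))
    rows, cols = img.shape
    for r in range(rows - dy):
        for c in range(cols - dx):
            P[img[r, c], img[r + dy, c + dx]] += 1
    return P / P.sum()  # joint probabilities of gray-level pairs

def glcm_measures(P):
    """Four of the eight Haralick measures used in the paper."""
    i, j = np.indices(P.shape)
    nz = P[P > 0]
    return {
        "contrast": ((i - j) ** 2 * P).sum(),
        "homogeneity": (P / (1.0 + np.abs(i - j))).sum(),
        "entropy": -(nz * np.log(nz)).sum(),
        "asm": (P ** 2).sum(),  # angular second moment
    }

# Tiny quantized image (3 gray levels) as a stand-in for a SAR window
quantized = np.array([[0, 0, 1], [0, 1, 1], [2, 2, 2]])
measures = glcm_measures(glcm(quantized, levels=3))
```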

3.2.2. Coherence Features

Coherence is a correlation coefficient that quantifies small changes in the surface (vegetation, non-vegetation, rock, etc.) occurring during the time interval between two SAR acquisitions [21]. The coherence coefficient ranges from 0 to 1 and indicates the degree of change at the corresponding pixel: a large coherence coefficient indicates that the change is small, whereas a small coherence coefficient indicates that the change is large. In this paper, the coherence of the SAR images was calculated using SARscape 5.2 software, and 10 coherence images were obtained from the following InSAR data pairs: (1) 22 November 2015 and 9 January 2016; (2) 9 January 2016 and 26 February 2016; (3) 26 February 2016 and 21 March 2016; (4) 21 March 2016 and 14 April 2016; and (5) 14 April 2016 and 8 May 2016.
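Conceptually, coherence is estimated from two co-registered complex (SLC) images over a moving window; a boxcar sketch (not the SARscape implementation) is:

```python
import numpy as np

def coherence(s1, s2, size=3):
    """Sample coherence magnitude between two co-registered complex
    (SLC) images, estimated over a size x size boxcar window."""
    def boxcar(a):
        pad = size // 2
        p = np.pad(a, pad, mode="edge")
        out = np.empty(a.shape, dtype=a.dtype)
        for i in range(a.shape[0]):
            for j in range(a.shape[1]):
                out[i, j] = p[i:i + size, j:j + size].mean()
        return out
    num = boxcar(s1 * np.conj(s2))          # windowed cross product
    den = np.sqrt(boxcar(np.abs(s1) ** 2) * boxcar(np.abs(s2) ** 2))
    return np.abs(num) / den
```

An image paired with itself yields coherence 1 everywhere (no change), while uncorrelated acquisitions drive the estimate toward 0, matching the interpretation given above.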

3.2.3. Feature Combination

In order to explore the ability of SAR variables and optical images to distinguish winter wheat, and to obtain the best combination of features to improve the classification accuracy of winter wheat, the following combinations were considered (Table 3).

3.3. Classifiers

In this study, two classifiers were used for classification: support vector machine (SVM) and random forest (RF).
The SVM classifier is a machine learning method based on statistical learning theory [51]; it has been widely used in remote sensing data classification and has achieved good classification accuracy [52,53]. It searches for the optimal hyperplane by mapping the input vectors into a high-dimensional space to separate the training vectors of two classes into two subspaces [33]. In this paper, the radial basis function (RBF) kernel was used, with the kernel width and penalty parameter set to 0.125 and 100, respectively.
The random forest (RF) algorithm is an ensemble classifier built on multiple decision trees, with each tree fitted to a different bootstrapped training sample and a randomly-selected set of predictive variables [7,9,32,54]. The final classification or prediction is obtained by voting [7,54]. Many studies have shown that the random forest algorithm has high prediction accuracy [55,56], good tolerance of outliers and noise, and little tendency to over-fit. In this study, 500 trees were generated, using the square root of the total number of predictors at each node.
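With the parameter settings reported above, the two classifiers can be configured as follows in scikit-learn (a sketch on synthetic per-pixel features, not the paper’s actual feature stack):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC

# Synthetic stand-in for the per-pixel feature stack (backscatter,
# coherence, texture, vegetation indices); two toy classes.
rng = np.random.default_rng(42)
X = rng.normal(size=(200, 6))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

# Parameter values as reported in Section 3.3:
# RBF kernel with width (gamma) 0.125 and penalty (C) 100;
# RF with 500 trees and sqrt(n_features) candidates per split.
svm = SVC(kernel="rbf", gamma=0.125, C=100).fit(X, y)
rf = RandomForestClassifier(n_estimators=500, max_features="sqrt",
                            random_state=0).fit(X, y)
```

`max_features="sqrt"` implements the “square root of the total number of predictors at each node” rule described above.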

4. Results and Discussion

4.1. Analysis of Temporal Variables Used for Classification

In order to use temporal variables to distinguish winter wheat, the temporal changes of each land cover class were analyzed from the temporal variables derived from optical and SAR images. Then, the separability of the land cover classes was also explored.

4.1.1. Temporal Variables Extracted from SAR Data

Figure 2 shows the time series of mean backscatter (σ°) values for each land cover class. The temporal variation of the backscatter clearly differs between classes. The VH backscatter values are lower than the VV values. The average VH backscatter of wheat and rapeseed decreased from −13.2 dB and −11.5 dB on 22 November 2015 to −16.2 dB and −12.8 dB on 21 March 2016, respectively. The average VV backscatter of wheat and rapeseed attenuated from −6.7 dB and −4.2 dB on 22 November 2015 to −11.9 dB on 14 April 2016 and −7.4 dB on 21 March 2016, respectively. As wheat and rapeseed continued to grow after these dates, the backscatter values for both VV and VH polarization increased. The VV and VH backscatter of forest first decreased and then increased to a maximum on 8 May 2016. The results showed that: (1) the VV polarization attenuation of winter wheat was higher than that of VH, because wheat has a pronounced vertical structure that affects VV polarization during propagation; (2) in the early stage of growth, the soil plays the leading role in radar backscatter [57,58]. As the wheat grew, the leaf and stem densities gradually increased and the wheat covered the ground surface more uniformly; the dominance of surface scattering decreased gradually, and the backscatter value decreased as the leaf density increased; (3) as the wheat matured, the water content of the canopy gradually decreased, so the scattering contribution of the soil gradually increased, which eventually caused the VH and VV backscatter to increase [59].
The backscatter values of the water body were very low due to the specular reflection of water, which causes less reflection towards the radar antenna [21,60]. The backscatter values of the urban area were relatively high and remained constant.
Figure 3 shows the time series of the mean coherence values for each land cover class. The coherence of the urban land cover was the highest (in the range of 0.45 to 0.57) and that of the water body was the lowest (in the range of 0.18 to 0.19); similar results have been observed in other studies [21]. This is because the water surface is easily disturbed by natural factors, such as wind, and is thus extremely unstable, whereas urban areas are rarely affected by environmental factors [61]. Winter wheat had a maximum value of 0.44, generated from the 22 November 2015 and 9 January 2016 InSAR pair, and a minimum value of 0.20, generated from the 14 April 2016 and 8 May 2016 InSAR pair. Blaes and Defourny [62] found a strong correlation between winter wheat height and the coherence value. Thus, the surfaces changed to different degrees and the land covers had different coherence values, which provided effective information for land cover classification.

4.1.2. Temporal Variables Extracted from Optical Data

Figure 4 shows the time series of the mean SR and NDVI for each land cover class. The SR values of wheat first increased and then decreased (in the range of 3.0 to 4.1), with the highest value of 7.4 occurring on 28 March 2016. The trends of the SR and NDVI curves were basically the same: at the end of March, winter wheat showed a peak while rapeseed showed a valley. During this period, rapeseed entered the flowering stage, which caused the two vegetation index values to decrease, while after the flowering period the values increased, indicating that winter wheat had the highest separability at the end of March. The vegetation index values over the water body and urban areas were very low, and the forest values were lower than those of winter wheat. Our results indicate that the time series of the SR and NDVI values clearly differ between land cover classes, a property that has been widely used in crop classification with remote sensing [47].

4.2. Winter Wheat Mapping

4.2.1. SAR Image Classification and Accuracy Assessment

The classification accuracies of winter wheat using the RF and SVM classifiers are shown in Table 4. Compared with RF, the SVM results for the different combinations of SAR data were generally slightly higher in terms of overall accuracy and kappa coefficient; moreover, RF required more computation time than SVM for all combinations (see Figure 5). However, RF obtained better winter wheat classification results than SVM in terms of the F1 measure. As shown in Table 4, all F1 measures for winter wheat using a single polarization were lower than 75%, and the performance of VV polarization was better than that of VH polarization, which indicates that neither single polarization could meet the winter wheat classification accuracy requirements. The winter wheat classification accuracy using the SAR data combination VV + VH was much better than that of VH or VV alone, with an overall accuracy of 91.45% and a kappa coefficient of 0.8729 (F1 measure = 85.72%). This was mainly because the VV and VH polarizations of winter wheat have different scattering mechanisms, so the combined polarizations (VV and VH), rather than VV or VH alone, provided richer radar scattering information about the wheat and effectively improved the classification accuracy.
In order to fully exploit the Sentinel-1A data to improve the classification accuracy, texture and coherence information were extracted and added to the classification. As shown in Table 4, the classification accuracy using backscatter intensity data was higher than that of the texture or coherence images. Although the overall accuracy and kappa coefficient of the texture images were higher than those of the coherence images, the F1 measure of winter wheat was lower than that of the coherence images. This suggests that backscatter intensity was more important for classification than coherence or texture information. When the coherence images were added to the backscatter intensity images, the overall accuracy increased from 91.45% to 94.94% for RF (F1 measure = 92.35%) and from 89.82% to 96.19% for SVM (F1 measure = 93.10%). This result is consistent with earlier studies using the combination of backscatter intensity and coherence images [21,22]. Furthermore, when the texture images were added to the backscatter intensity images, the classification accuracy also increased. This indicates that coherence and texture information are useful parameters for the classification of SAR data. Jia et al. [20] also showed that the use of texture images can improve the classification accuracy of SAR data. However, the classification accuracy using the combination of coherence and texture images alone was lower, with an overall accuracy of 73.82% for RF and 85.19% for SVM.
The best SAR-only results for winter wheat were obtained using the combination VV + VH + C + T, with an overall accuracy and kappa coefficient of up to 95.95% and 0.9399 for SVM (F1 measure = 93.61%), although this F1 measure was lower than that for RF. In general, when the target accuracy is greater than 85%, the classification of a crop is considered reliable [26,63,64]. Therefore, the use of SAR data alone can meet the winter wheat classification accuracy requirements (with an F1 measure greater than 85%). That is to say, SAR data can replace optical imagery for distinguishing winter wheat in regions with complex planting structures. For the two classifiers, although the F1 measure using VV + VH + C + T was higher than that using VV + VH + C, the overall accuracy using VV + VH + C was better than that using VV + VH + C + T. This indicates that the combination of coherence, texture, and backscatter intensity images can improve the classification accuracy. For forest, urban, and water areas, the F1 measures using VV + VH + C + T were better than 90%, with an overall accuracy of more than 94% and a kappa coefficient of more than 0.9. For rapeseed fields, VV + VH + C + T performed best of all the combinations, but it could not distinguish rapeseed areas well (F1 measure of no more than 85%). Compared with previous research results [26], the rapeseed classification accuracy is low because the rapeseed planting structure is complex and the planting area is small.

4.2.2. Optical Image Classification and Accuracy Assessment

In order to evaluate the effect of combining SAR and optical images on the classification accuracy, the optical images were classified with RF and SVM. The accuracy of winter wheat classification is shown in Figure 6. For both classifiers, the classification accuracy was highest at the end of March 2016 (with an overall accuracy of more than 95% and a kappa coefficient of more than 0.9500). Compared with this highest accuracy, the classification accuracy at the end of April was clearly reduced. This might be due to the separability of the two crops, which was high at the end of March, when rapeseed had begun to flower while the winter wheat was in the jointing stage; the separability was reduced after the florescence of rapeseed at the end of April. At the end of February 2016 (early winter wheat growth), winter wheat had the lowest classification accuracy, because the wheat had much lower height and coverage at this early stage [20].

4.2.3. Classification Results Using SAR and Optical Images

As mentioned above, winter wheat has the highest separability when the optical image corresponds to the jointing period. Thus, in the present study, we combined the optical image of 28 March with the SAR data. As shown in Table 4, the classification accuracy was improved when multi-temporal SAR and optical images were combined. For RF, the overall accuracy increased from 98.92% to 99.35% (F1 measure = 98.06%). For winter wheat, the addition of SAR images to the optical images increased the F1 measure by 3.23% for RF and 1.48% for SVM; SVM was better than RF in terms of overall accuracy and kappa coefficient, but its F1 measure was lower than that of RF. The classification accuracies of the other classes were also improved. Earlier studies have likewise shown that classification accuracy can be improved when SAR and optical images are combined [26,33,57,65]. The likely reason is that optical images contain rich spectral information while SAR data contain more spatial texture information, so combining the two can compensate for the shortcomings of each data type in crop identification. The results indicate that SAR data are not only a suitable substitute for optical images in crop classification in urban agricultural regions, but also a useful supplemental data source, especially in cloud-prone areas.
In this paper, satisfactory classification accuracy was achieved in an urban agricultural region with fragmented planting parcels, and higher classification accuracy could be expected in areas with less-fragmented parcels. Therefore, the results of this study have good application potential for winter wheat classification in other areas.

4.2.4. Incremental Classification Results Using SAR Data

In order to assess the ability of SAR data to identify winter wheat early in the season, an incremental classification was conducted with the backscatter intensity, coherence, and texture images. Each new image acquisition was added to all previously available images, and classification was performed with the RF classifier (Figure 7). As shown in Figure 7, the classification accuracy of winter wheat improved as the season progressed. The combination of all six Sentinel-1A datasets achieved the highest classification accuracy, while using only one Sentinel-1A image gave the lowest. This shows that multi-temporal SAR data provide more useful information to improve classification accuracy [20,66]. In addition, the classification accuracy was satisfactory using the four Sentinel-1A images (22 November 2015, 9 January 2016, 26 February 2016, and 21 March 2016) acquired before the jointing period of winter wheat, with an overall accuracy of up to 93.91%, a kappa coefficient of up to 0.9093, and an F1 measure of 87.89%. The classification accuracy using these four Sentinel-1A datasets is shown in Table 5. For rapeseed fields, the F1 measure was no higher than 75% with this data combination. For the other classes, this combination produced satisfactory accuracy, with both the F1 measure and the overall accuracy higher than 90%. Therefore, satisfactory classification accuracy can be obtained using SAR data from early in the winter wheat season. Samples of the final classification maps are displayed in Figure 8. Inglada et al. [11] evaluated the usefulness of Sentinel-1A for early crop type classification as a supplement to optical imagery and found that a satisfactory land cover map could be obtained early in the season, with a significant improvement in accuracy compared to the use of optical images alone.
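The incremental scheme can be sketched as a loop that refits the classifier each time a new acquisition date is appended to the feature stack (the features below are synthetic stand-ins; the real inputs would be the per-date Sentinel-1A backscatter, coherence, and texture bands):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

dates = ["2015-11-22", "2016-01-09", "2016-02-26",
         "2016-03-21", "2016-04-14", "2016-05-08"]

# Synthetic stand-in: one feature per acquisition, later dates less noisy
rng = np.random.default_rng(1)
y = rng.integers(0, 2, size=400)
X = np.column_stack([y + rng.normal(scale=2.0 - 0.25 * k, size=400)
                     for k in range(len(dates))])
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.5,
                                      random_state=0, stratify=y)

scores = []
for k in range(1, len(dates) + 1):  # add one new acquisition each step
    rf = RandomForestClassifier(n_estimators=100, random_state=0)
    rf.fit(Xtr[:, :k], ytr)
    scores.append(rf.score(Xte[:, :k], yte))
```

Each iteration mimics one point on the curve of Figure 7: the classifier only ever sees acquisitions available up to that date, so the accuracy trajectory shows how early in the season a usable map becomes possible.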

5. Conclusions

In this paper, multi-temporal Sentinel-1A data and/or Landsat-8 data were classified with the RF and SVM classifiers in an urban agricultural region of Southern China. The main conclusions are: (1) In an urban agricultural region with a complex planting structure, satisfactory classification of winter wheat can be obtained using multi-temporal Sentinel-1A data spanning the early and late growth stages of winter wheat. This indicates that Sentinel-1A images can replace optical data when effective optical images cannot be obtained due to weather; (2) The classification accuracy of backscatter intensity alone was higher than that of the coherence or texture images; (3) The classification accuracy can be effectively improved by adding the coherence and texture information of the SAR data, although the overall accuracy of the combination of these two features alone was less than 86%; (4) The combination of SAR and optical data performed best, with the highest overall accuracy of 99.81%, better than using optical data alone; for winter wheat, adding SAR images to the optical images increased the F1 measure by up to 3.23%; (5) The SVM classifier slightly outperformed RF in terms of overall accuracy and kappa coefficient, and SVM was much faster than RF; however, RF performed better than SVM in terms of the F1 measure.
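The accuracy measures used throughout this paper (per-class F1, overall accuracy, and the kappa coefficient) all derive from the confusion matrix; the standard definitions can be sketched as follows (a generic illustration with a toy two-class matrix, not the study's data):

```python
import numpy as np

def accuracy_measures(cm):
    """Per-class F1, overall accuracy, and Cohen's kappa from a square
    confusion matrix (rows = reference classes, columns = predictions)."""
    cm = np.asarray(cm, dtype=float)
    n = cm.sum()
    diag = np.diag(cm)
    precision = diag / cm.sum(axis=0)          # user's accuracy
    recall = diag / cm.sum(axis=1)             # producer's accuracy
    f1 = 2 * precision * recall / (precision + recall)
    overall = diag.sum() / n
    # Chance agreement expected from the row/column marginals
    expected = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n**2
    kappa = (overall - expected) / (1 - expected)
    return f1, overall, kappa

# Toy example: 90 + 80 correctly classified pixels out of 200
f1, oa, kappa = accuracy_measures([[90, 10], [20, 80]])
print(np.round(f1, 4), round(oa, 4), round(kappa, 4))
```

The same formulas, applied to the five-class validation confusion matrices, yield the values reported in Tables 4 and 5.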
As an important component of the Chinese High-Resolution Earth Observation System, the GaoFen-3 (GF-3) was successfully launched in August 2016. GF-3 is China’s first 1-m resolution C-band synthetic aperture radar (SAR) satellite, and has 12 imaging modes. The reported results of this study are also important for the application of GF-3 images in agricultural applications.

Acknowledgments

This research was supported by a project funded by the Priority Academic Program Development of Jiangsu Higher Education Institutions.

Author Contributions

Tao Zhou envisioned and designed this research and wrote the paper. Jianjun Pan provided suggestions and modified the paper. Tao Zhou, Shanbao Wei, and Tao Han conducted the analysis. Tao Zhou and Peiyu Zhang revised the manuscript draft.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Xu, J.; Li, Z.; Tian, B.; Huang, L.; Chen, Q.; Fu, S. Polarimetric analysis of multi-temporal RADARSAT-2 SAR images for wheat monitoring and mapping. Int. J. Remote Sens. 2014, 35, 3840–3858. [Google Scholar] [CrossRef]
  2. Hao, P.; Zhan, Y.; Wang, L.; Niu, Z.; Shakir, M. Feature Selection of Time Series MODIS Data for Early Crop Classification Using Random Forest: A Case Study in Kansas, USA. Remote Sens. 2015, 7, 5347–5369. [Google Scholar] [CrossRef]
  3. Navarro, A.; Rolim, J.; Miguel, I.; Catalao, J.; Silva, J.; Painho, M.; Vekerdy, Z. Crop Monitoring Based on SPOT-5 Take-5 and Sentinel-1A Data for the Estimation of Crop Water Requirements. Remote Sens. 2016, 8, 525. [Google Scholar] [CrossRef]
  4. Zhang, Y.; Wang, C.; Wu, J.; Qi, J.; Salas, W.A. Mapping paddy rice with multitemporal ALOS/PALSAR imagery in southeast China. Int. J. Remote Sens. 2009, 30, 6301–6315. [Google Scholar] [CrossRef]
  5. Xie, L.; Zhang, H.; Li, H.; Wang, C. A unified framework for crop classification in southern China using fully polarimetric, dual polarimetric, and compact polarimetric SAR data. Int. J. Remote Sens. 2015, 36, 3798–3818. [Google Scholar] [CrossRef]
  6. Mishra, N.B.; Crews, K.A. Mapping vegetation morphology types in a dry savanna ecosystem: Integrating hierarchical object-based image analysis with Random Forest. Int. J. Remote Sens. 2014, 35, 1175–1198. [Google Scholar] [CrossRef]
  7. Du, P.J.; Samat, A.; Waske, B.; Liu, S.C.; Li, Z.H. Random Forest and Rotation Forest for fully polarized SAR image classification using polarimetric and spatial features. ISPRS J. Photogramm. Remote Sens. 2015, 105, 38–53. [Google Scholar] [CrossRef]
  8. Oyoshi, K.; Tomiyama, N.; Okumura, T.; Sobue, S.; Sato, J. Mapping rice-planted areas using time-series synthetic aperture radar data for the Asia-RiCE activity. Paddy Water Environ. 2016, 14, 463–472. [Google Scholar] [CrossRef]
  9. Forkuor, G.; Conrad, C.; Thiel, M.; Ullmann, T.; Zoungrana, E. Integration of Optical and Synthetic Aperture Radar Imagery for Improving Crop Mapping in Northwestern Benin, West Africa. Remote Sens. 2014, 6, 6472–6499. [Google Scholar] [CrossRef]
  10. Waske, B.; Braun, M. Classifier ensembles for land cover mapping using multitemporal SAR imagery. ISPRS J. Photogramm. Remote Sens. 2009, 64, 450–457. [Google Scholar] [CrossRef]
  11. Inglada, J.; Vincent, A.; Arias, M.; Marais-Sicre, C. Improved Early Crop Type Identification By Joint Use of High Temporal Resolution SAR And Optical Image Time Series. Remote Sens. 2016, 8, 362. [Google Scholar] [CrossRef]
  12. Kussul, N.; Lemoine, G.; Gallego, F.J.; Skakun, S.V.; Lavreniuk, M.; Shelestov, A.Y. Parcel-Based Crop Classification in Ukraine Using Landsat-8 Data and Sentinel-1A Data. IEEE J. Stars 2016, 9, 2500–2508. [Google Scholar] [CrossRef]
  13. Hoshikawa, K.; Nagano, T.; Kotera, A.; Watanabe, K.; Fujihara, Y.; Kozan, O. Classification of crop fields in northeast Thailand based on hydrological characteristics detected by L-band SAR backscatter data. Remote Sens. Lett. 2014, 5, 323–331. [Google Scholar] [CrossRef]
  14. Shao, Y.; Fan, X.; Liu, H.; Xiao, J.; Ross, S.; Brisco, B.; Brown, R.; Staples, G. Rice monitoring and production estimation using multitemporal RADARSAT. Remote Sens. Environ. 2001, 76, 310–325. [Google Scholar] [CrossRef]
  15. Silva, W.F.; Rudorff, B.F.T.; Formaggio, A.R.; Paradella, W.R.; Mura, J.C. Discrimination of agricultural crops in a tropical semi-arid region of Brazil based on L-band polarimetric airborne SAR data. ISPRS J. Photogramm. Remote Sens. 2009, 64, 458–463. [Google Scholar] [CrossRef]
  16. McNairn, H.; van der Sanden, J.J.; Brown, R.J.; Ellis, J. The potential of RADARSAT-2 for crop mapping and assessing crop condition. In Proceedings of the Second International Conference on Geospatial Information in Agriculture and Forestry, Lake Buena Vista, FL, USA, 10–12 January 2000; vol. II, pp. 81–88. [Google Scholar]
  17. Ferrazzoli, P.; Guerriero, L.; Schiavon, G. Experimental and model investigation on radar classification capability. IEEE Trans. Geosci. Remote Sens. 1999, 37, 960–968. [Google Scholar] [CrossRef]
  18. Shang, J.; Mcnairn, H.; Champagne, C.; Jiao, X. Application of Multi-Frequency Synthetic Aperture Radar (SAR) in Crop Classification; InTech: Rijeka, Croatia, 2009. [Google Scholar]
  19. McNairn, H.; Kross, A.; Lapen, D.; Caves, R.; Shang, J. Early season monitoring of corn and soybeans with TerraSAR-X and RADARSAT-2. Int. J. Appl. Earth Obs. Geoinf. 2014, 28, 252–259. [Google Scholar] [CrossRef]
  20. Jia, K.; Li, Q.; Tian, Y.; Wu, B.; Zhang, F.; Meng, J. Crop classification using multi-configuration SAR data in the North China Plain. Int. J. Remote Sens. 2012, 33, 170–183. [Google Scholar] [CrossRef]
  21. Parihar, N.; Das, A.; Rathore, V.S.; Nathawat, M.S.; Mohan, S. Analysis of L-band SAR backscatter and coherence for delineation of land-use/land-cover. Int. J. Remote Sens. 2014, 35, 6781–6798. [Google Scholar] [CrossRef]
  22. Sonobe, R.; Tani, H.; Wang, X.; Kobayashi, N.; Shimamura, H. Discrimination of crop types with TerraSAR-X-derived information. Phys. Chem. Earth Parts A/B/C 2015, 83–84, 2–13. [Google Scholar] [CrossRef]
  23. Yayusman, L.F.; Nagasawa, R. ALOS-Sensor data integration for the detection of smallholders oil palm plantation in Southern Sumatra, Indonesia. J. Jpn. Agric. Syst. Soc. 2015, 31, 27–40. [Google Scholar]
  24. Liesenberg, V.; Gloaguen, R. Evaluating SAR polarization modes at L-band for forest classification purposes in Eastern Amazon, Brazil. Int. J. Appl. Earth Obs. Geoinf. 2013, 21, 122–135. [Google Scholar] [CrossRef]
  25. Li, W. Classification of SAR images using morphological texture features. Int. J. Remote Sens. 1998, 19, 3399–3410. [Google Scholar] [CrossRef]
  26. Skakun, S.; Kussul, N.; Shelestov, A.Y.; Lavreniuk, M.; Kussul, O. Efficiency Assessment of Multitemporal C-Band Radarsat-2 Intensity and Landsat-8 Surface Reflectance Satellite Imagery for Crop Classification in Ukraine. IEEE J. Stars 2015, 9, 1–8. [Google Scholar] [CrossRef]
  27. Dong, J.; Xiao, X.; Chen, B.; Torbick, N.; Jin, C.; Zhang, G.; Biradar, C. Mapping deciduous rubber plantations through integration of PALSAR and multi-temporal Landsat imagery. Remote Sens. Environ. 2013, 134, 392–402. [Google Scholar] [CrossRef]
  28. Kussul, N.; Skakun, S.; Shelestov, A.; Kravchenko, O.; Kussul, O. Crop Classification in Ukraine Using Satellite Optical and Sar Images. Int. J. Inf. Models Anal. 2013, 2, 118–122. [Google Scholar]
  29. Vapnik, V.N.; Vapnik, V. Statistical Learning Theory; Wiley: New York, NY, USA, 1998; Volume 1. [Google Scholar]
  30. Breiman, L. Random forests. Mach. Learn. 2001, 45, 5–32. [Google Scholar] [CrossRef]
  31. Ban, Y. Synergy of multitemporal ERS-1 SAR and Landsat TM data for classification of agricultural crops. Can. J. Remote Sens. 2003, 29, 518–526. [Google Scholar] [CrossRef]
  32. Wang, X.Y.; Guo, Y.G.; He, J.; Du, L.T. Fusion of HJ1B and ALOS PALSAR data for land cover classification using machine learning methods. Int. J. Appl. Earth Obs. Geoinf. 2016, 52, 192–203. [Google Scholar] [CrossRef]
  33. Ban, Y.; Jacob, A. Object-Based Fusion of Multitemporal Multiangle ENVISAT ASAR and HJ-1B Multispectral Data for Urban Land-Cover Mapping. IEEE Trans. Geosci. Remote Sens. 2013, 51, 1998–2006. [Google Scholar] [CrossRef]
  34. Villa, P.; Stroppiana, D.; Fontanelli, G.; Azar, R.; Brivio, P.A. In-Season Mapping of Crop Type with Optical and X-Band SAR Data: A Classification Tree Approach Using Synoptic Seasonal Features. Remote Sens. 2015, 7, 12859–12889. [Google Scholar] [CrossRef]
  35. Schoenfeldt, U.; Braubach, H. Electrical Architecture of the SENTINEL-1 SAR Antenna Subsystem. In Proceedings of the European Conference on Synthetic Aperture Radar, Friedrichshafen, Germany, 2–5 June 2008; pp. 1–4. [Google Scholar]
  36. Abdikan, S.; Sanli, F.B.; Ustuner, M.; Calò, F. Land Cover Mapping Using SENTINEL-1 SAR Data. ISPRS Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2016, XLI-B7, 757–761. [Google Scholar] [CrossRef]
  37. Vanhellemont, Q.; Ruddick, K. Turbid wakes associated with offshore wind turbines observed with Landsat 8. Remote Sens. Environ. 2014, 145, 105–115. [Google Scholar] [CrossRef]
  38. Huang, Z.; Liu, X.; Jin, M.; Ding, C.; Jiang, J.; Wu, L. Deriving the Characteristic Scale for Effectively Monitoring Heavy Metal Stress in Rice by Assimilation of GF-1 Data with the WOFOST Model. Sensors 2016, 16, 340. [Google Scholar] [CrossRef] [PubMed]
  39. Jia, K.; Liang, S.; Gu, X.; Baret, F.; Wei, X.; Wang, X.; Yao, Y.; Yang, L.; Li, Y. Fractional vegetation cover estimation algorithm for Chinese GF-1 wide field view data. Remote Sens. Environ. 2016, 177, 184–191. [Google Scholar] [CrossRef]
  40. Schuster, C.; Schmidt, T.; Conrad, C.; Kleinschmit, B.; Forster, M. Grassland habitat mapping by intra-annual time series analysis—Comparison of RapidEye and TerraSAR-X satellite data. Int. J. Appl. Earth Obs. Geoinf. 2015, 34, 25–34. [Google Scholar] [CrossRef]
  41. Congalton, R.G.; Green, K. Assessing the Accuracy of Remotely Sensed Data—Principles and Practices, 2nd ed.; CRC Press: Boca Raton, FL, USA, 2009. [Google Scholar]
  42. Baumann, M.; Ozdogan, M.; Kuemmerle, T.; Wendland, K.J.; Esipova, E.; Radeloff, V.C. Using the Landsat record to detect forest-cover changes during and after the collapse of the Soviet Union in the temperate zone of European Russia. Remote Sens. Environ. 2012, 124, 174–184. [Google Scholar] [CrossRef]
  43. Schuster, C.; Förster, M.; Kleinschmit, B. Testing the red edge channel for improving land-use classifications based on high-resolution multi-spectral satellite data. Int. J. Remote Sens. 2012, 33, 5583–5599. [Google Scholar] [CrossRef]
  44. Dusseux, P.; Corpetti, T.; Hubert-Moy, L.; Corgne, S. Combined Use of Multi-Temporal Optical and Radar Satellite Images for Grassland Monitoring. Remote Sens. 2014, 6, 6163–6182. [Google Scholar] [CrossRef]
  45. Rouse, J.W., Jr.; Haas, R.; Schell, J.; Deering, D. Monitoring Vegetation Systems in the Great Plains with ERTS; NASA: Washington, DC, USA, 1974.
  46. Birth, G.S.; McVey, G.R. Measuring the color of growing turf with a reflectance spectrophotometer. Agron. J. 1968, 60, 640–643. [Google Scholar] [CrossRef]
  47. Karale, Y.; Mohite, J.; Jagyasi, B. Crop Classification Based on Multi-Temporal Satellite Remote Sensing Data for Agro-Advisory Services. SPIE Asia-Pac. Remote Sens. 2014, 9260, 926004. [Google Scholar] [CrossRef]
  48. Arsenault, H.H. Speckle Suppression and Analysis for Synthetic Aperture Radar Images. Opt. Eng. 1986, 25, 636–643. [Google Scholar]
  49. Balzter, H.; Cole, B.; Thiel, C.; Schmullius, C. Mapping CORINE Land Cover from Sentinel-1A SAR and SRTM Digital Elevation Model Data using Random Forests. Remote Sens. 2015, 7, 14876–14898. [Google Scholar] [CrossRef]
  50. Haralick, R.M.; Shanmugam, K.; Dinstein, I.H. Textural Features for Image Classification. Syst. Man Cybern. IEEE Trans. 1973, smc-3, 610–621. [Google Scholar] [CrossRef]
  51. Cherkassky, V. The Nature of Statistical Learning Theory; Springer: Berlin, Germany, 1997; p. 1564. [Google Scholar]
  52. Pal, M.; Foody, G.M. Evaluation of SVM, RVM and SMLR for Accurate Image Classification with Limited Ground Data. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2012, 5, 1344–1355. [Google Scholar] [CrossRef]
  53. Jia, K.; Wei, X.; Gu, X.; Yao, Y.; Xie, X.; Li, B. Land cover classification using Landsat 8 Operational Land Imager data in Beijing, China. Geocarto Int. 2014, 29, 941–951. [Google Scholar] [CrossRef]
  54. Hutt, C.; Koppe, W.; Miao, Y.X.; Bareth, G. Best Accuracy Land Use/Land Cover (LULC) Classification to Derive Crop Types Using Multitemporal, Multisensor, and Multi-Polarization SAR Satellite Images. Remote Sens. 2016, 8, 684. [Google Scholar] [CrossRef]
  55. Rodriguez-Galiano, V.F.; Ghimire, B.; Rogan, J.; Chica-Olmo, M.; Rigol-Sanchez, J.P. An assessment of the effectiveness of a random forest classifier for land-cover classification. ISPRS J. Photogramm. Remote Sens. 2012, 67, 93–104. [Google Scholar] [CrossRef]
  56. Long, J.A.; Lawrence, R.L.; Greenwood, M.C.; Marshall, L.; Miller, P.R. Object-oriented crop classification using multitemporal ETM + SLC-off imagery and random forest. GiSci. Remote Sens. 2013, 50, 418–436. [Google Scholar]
  57. McNairn, H.; Champagne, C.; Shang, J.; Holmstrom, D.; Reichert, G. Integration of optical and Synthetic Aperture Radar (SAR) imagery for delivering operational annual crop inventories. ISPRS J. Photogramm. Remote Sens. 2009, 64, 434–449. [Google Scholar] [CrossRef]
  58. Wang, C.Z.; Wu, J.P.; Zhang, Y.; Pan, G.D.; Qi, J.G.; Salas, W.A. Characterizing L-band scattering of paddy rice in Southeast China with radiative transfer model and multitemporal ALOS/PALSAR imagery. IEEE Trans. Geosci. Remote Sens. 2009, 47, 988–998. [Google Scholar] [CrossRef]
  59. Jia, M.; Tong, L.; Zhang, Y.; Chen, Y. Multitemporal radar backscattering measurement of wheat fields using multifrequency (L, S, C, and X) and full-polarization. Radio Sci. 2013, 48, 471–481. [Google Scholar] [CrossRef]
  60. O’Grady, D.; Leblanc, M. Radar mapping of broad-scale inundation: challenges and opportunities in Australia. Stoch. Environ. Res. Risk Assess. 2014, 28, 29–38. [Google Scholar] [CrossRef]
  61. Jung, H.C.; Alsdorf, D. Repeat-pass multi-temporal interferometric SAR coherence variations with Amazon floodplain and lake habitats. Int. J. Remote Sens. 2010, 31, 881–901. [Google Scholar] [CrossRef]
  62. Blaes, X.; Defourny, P. Retrieving crop parameters based on tandem ERS 1/2 interferometric coherence images. Remote Sens. Environ. 2003, 88, 374–385. [Google Scholar] [CrossRef]
  63. De Wit, A.J.W.; Clevers, J.G.P.W. Efficiency and accuracy of per-field classification for operational crop mapping. Int. J. Remote Sens. 2004, 25, 4091–4112. [Google Scholar] [CrossRef]
  64. Foody, G.M. Status of land cover classification accuracy assessment. Remote Sens. Environ. 2002, 80, 185–201. [Google Scholar] [CrossRef]
  65. Blaes, X.; Vanhalle, L.; Defourny, P. Efficiency of crop identification based on optical and SAR image time series. Remote Sens. Environ. 2005, 96, 352–365. [Google Scholar] [CrossRef]
  66. Chen, J.; Lin, H.; Pei, Z. Application of ENVISAT ASAR Data in Mapping Rice Crop Growth in Southern China. IEEE Geosci. Remote Sens. Lett. 2007, 4, 431–435. [Google Scholar] [CrossRef]
Figure 1. The study area, located in southwestern Jiangsu Province, China, with an overview of the SAR data (R: 2015-11-12 VH polarization, G: 2016-03-21 VV polarization, B: 2016-05-08 VH polarization).
Figure 2. Average backscatter values for each land cover class on six image acquisition dates.
Figure 3. Average coherence values for each land cover class. Note: the meaning of the notations 1, 2, 3, 4, and 5 can be found in Section 3.2.2.
Figure 4. Average SR and NDVI values for each land cover class on four image acquisition dates.
Figure 5. Comparison of the processing times of the RF and SVM algorithms for classifying the different data combinations.
Figure 6. Accuracy of winter wheat using each optical image alone for RF (a) and SVM (b).
Figure 7. Incremental classification accuracy of winter wheat with backscatter intensity, coherence, and texture images by adding every new image acquisition to all previously available images.
Figure 8. Classification results using the RF classifier. (Top) Using the four SAR datasets (22 November 2015, 9 January 2016, 26 February 2016 and 21 March 2016). (Bottom) Using the combination of S + O.
Table 1. Main characteristics of Sentinel-1A images used in this study.
Acquisition Date | Product | Imaging Mode | Polarization | Incidence Angle (°)
22 November 2015 | SLC | IW | VV/VH | 33.8
9 January 2016 | SLC | IW | VV/VH | 33.9
26 February 2016 | SLC | IW | VV/VH | 39.0
21 March 2016 | SLC | IW | VV/VH | 33.8
14 April 2016 | SLC | IW | VV/VH | 33.8
8 May 2016 | SLC | IW | VV/VH | 33.9
Table 2. Numbers of pixels per class for the training and validation data.
Class | Number of Training Pixels | Number of Validation Pixels
Winter wheat | 816 | 928
Rapeseed | 807 | 930
Forest | 1186 | 1095
Water body | 1074 | 1192
Urban | 1253 | 1325
Table 3. Different combinations of SAR variables and optical images for winter wheat classification.
ID | Simple Code in This Study | Description of Inputs
A | VH | All six Sentinel-1A images (VH)
B | VV | All six Sentinel-1A images (VV)
C | VV + VH | Dual polarization (VV + VH) of all six Sentinel-1A images
D | T | Textures of all six Sentinel-1A images
E | C | Coherence values of all six Sentinel-1A images
F | VV + VH + T | Textures and dual polarization (VV + VH) of all six Sentinel-1A images
G | VV + VH + C | Coherence values and dual polarization (VV + VH) of all six Sentinel-1A images
H | C + T | Textures and coherence values of all six Sentinel-1A images
I | VV + VH + C + T | Combination of dual polarization (VV + VH), textures, and coherence values of all six Sentinel-1A images
J | S + O | Combination of all six Sentinel-1A images (dual polarization (VV + VH), textures, and coherence values) and the Landsat-8 image of 28 March 2016
Notes: VH = VH polarization; VV = VV polarization; C = coherence values; T = textures; S = SAR data; O = optical data.
Table 4. Comparison of overall accuracy, kappa coefficient, and F1 measure for each land cover class.
Classifier | ID | Wheat F1 (%) | Rapeseed F1 (%) | Forest F1 (%) | Urban F1 (%) | Water F1 (%) | Overall Accuracy (%) | Kappa
RF | a | 50.00 | 61.80 | 69.80 | 80.42 | 95.58 | 82.26 | 0.7351
RF | b | 74.48 | 64.47 | 87.25 | 88.70 | 96.99 | 90.10 | 0.8525
RF | c | 85.72 | 78.02 | 87.09 | 90.24 | 97.21 | 91.45 | 0.8729
RF | d | 15.62 | 20.26 | 54.59 | 82.62 | 72.62 | 72.27 | 0.5741
RF | e | 44.36 | 21.49 | 30.76 | 83.41 | 67.86 | 67.27 | 0.5086
RF | f | 90.34 | 66.34 | 87.75 | 92.42 | 97.21 | 92.38 | 0.8865
RF | g | 92.35 | 83.20 | 93.34 | 95.20 | 97.26 | 94.94 | 0.9263
RF | h | 20.95 | 26.87 | 52.67 | 86.37 | 74.58 | 73.82 | 0.6046
RF | i | 94.83 | 72.26 | 91.29 | 95.27 | 97.94 | 94.78 | 0.9224
RF | j | 98.06 | 98.85 | 99.44 | 99.53 | 99.25 | 99.35 | 0.9905
SVM | a | 46.33 | 54.87 | 57.71 | 71.77 | 95.41 | 75.47 | 0.6350
SVM | b | 66.36 | 56.60 | 80.60 | 83.42 | 96.82 | 86.12 | 0.7957
SVM | c | 83.51 | 74.20 | 82.10 | 88.16 | 98.09 | 89.82 | 0.8490
SVM | d | 24.16 | 39.07 | 68.49 | 86.19 | 85.40 | 80.59 | 0.7057
SVM | e | 44.14 | 20.46 | 41.40 | 85.10 | 71.24 | 69.55 | 0.5449
SVM | f | 91.26 | 77.92 | 90.45 | 94.61 | 99.20 | 94.83 | 0.9231
SVM | g | 93.10 | 84.78 | 95.51 | 96.41 | 97.93 | 96.19 | 0.9447
SVM | h | 65.65 | 42.00 | 74.17 | 90.62 | 91.12 | 85.19 | 0.7833
SVM | i | 93.61 | 82.08 | 94.48 | 96.05 | 98.06 | 95.95 | 0.9399
SVM | j | 95.09 | 96.37 | 99.31 | 99.61 | 99.94 | 99.81 | 0.9939
Table 5. Accuracy of winter wheat using the four SAR datasets (22 November 2015, 9 January 2016, 26 February 2016 and 21 March 2016).
Accuracy Measure | Winter Wheat | Rapeseed | Forest | Urban | Water
F1 measure (%) | 87.89 | 70.75 | 90.51 | 94.23 | 97.83
Overall accuracy (%) / Kappa (all classes): 93.91 / 0.9093
