Article

Early-Stage Mapping of Winter Canola by Combining Sentinel-1 and Sentinel-2 Data in Jianghan Plain China

Tingting Liu, Peipei Li, Feng Zhao, Jie Liu and Ran Meng

1 Artificial Intelligence Research Institute, School of Computer Science and Technology, Harbin Institute of Technology, Harbin 150008, China
2 College of Resources and Environment, Huazhong Agricultural University, Wuhan 430070, China
3 College of Forestry, Northeast Forestry University, Harbin 150040, China
4 Key Laboratory of Sustainable Forest Ecosystem Management, Ministry of Education, Harbin 150040, China
5 National Key Laboratory of Smart Farming Technologies and Systems, Harbin 150008, China
* Author to whom correspondence should be addressed.
Remote Sens. 2024, 16(17), 3197; https://doi.org/10.3390/rs16173197
Submission received: 5 June 2024 / Revised: 6 August 2024 / Accepted: 24 August 2024 / Published: 29 August 2024
(This article belongs to the Special Issue Within-Season Agricultural Monitoring from Remotely Sensed Data)

Abstract

The early and accurate mapping of winter canola is essential for predicting crop yield, assessing agricultural disasters, and responding to food price fluctuations. Although several methods have been proposed to map winter canola at the flowering or later stages, mapping winter canola planting areas at the early stage remains challenging, owing to an insufficient understanding of which multi-source remote sensing features are sensitive for winter canola mapping. The objective of this study was to evaluate the potential of combining optical and synthetic aperture radar (SAR) data for mapping winter canola at the early stage. We assessed the contributions of spectral features, backscatter coefficients, and textural features, derived from Sentinel-2 and Sentinel-1 SAR images, for mapping winter canola at early stages. Random forest (RF) and support vector machine (SVM) classification models were built to map winter canola based on early-stage images and field samples in 2017, and the best model was then applied to the corresponding satellite data in 2018–2022. The following results were obtained: (1) the red-edge and near-infrared-related spectral features were most important for mapping early-stage winter canola, followed by VV (vertical transmission, vertical reception), DVI (Difference Vegetation Index), and GOSAVI (Green Optimized Soil Adjusted Vegetation Index); (2) based on Sentinel-1 and Sentinel-2 data, winter canola could be mapped as early as 130 days prior to ripening (i.e., the early overwinter stage), with an F-score over 0.85 and an OA (Overall Accuracy) over 81%; (3) adding Sentinel-1 improved the OA by about 2–4% and the F-score by about 1–2%; and (4) based on the classifier transfer approach, the F-scores of winter canola mapping in 2018–2022 varied between 0.75 and 0.97, and the OAs ranged from 79% to 86%. This study demonstrates the potential of early-stage winter canola mapping using the combination of Sentinel-2 and Sentinel-1 images, which could enable large-scale early mapping of canola and provide valuable information for stakeholders and decision makers.

1. Introduction

Canola is one of the four major oil crops in the world [1] and occupies a critical position in the national economy. As a cool-season crop, winter canola provides an effective way to make use of winter fallow fields and to support rational multiple cropping [2,3]. Early and accurate mapping of winter canola can inform decisions on food security and other policies, as early-season crop maps are the basis for rationally planning and evaluating the current status of arable land use and for forecasting oilseed yield [4,5]. The most common method for estimating crop cultivated areas relies on agricultural statistics from field visits and farmer reports [6,7]; however, this approach is costly, labor-intensive, and time-consuming. Recently, satellite remote sensing has become an effective way to map the spatio-temporal distribution of winter canola planting areas, as it can efficiently collect multi-source data over vast geographical areas in a short time [8,9,10].
Many studies have used medium-resolution optical satellite images (e.g., Sentinel-2) for crop mapping [11,12,13,14], and several new spectral indices have recently been proposed for mapping winter canola. Spectral index-based methods commonly exploit the spectral features of canola at the flowering stage (i.e., its distinctive bright yellow color), such as the Normalized Difference Yellowness Index (NDYI) [15]. For example, Ashourloo et al. developed an index that enhances the red and green reflectance features of canola for mapping during the flowering stage [16]. Han et al. developed another index that maps canola by combining high normalized difference yellowness index values during the flowering stage with high VH values during the podding stage [17]. However, a method that does not rely on flowering-stage images is urgently needed to map winter canola earlier. Accordingly, we define early-stage mapping of winter canola as mapping completed before the flowering stage [18].
Cloud-free optical satellite data during the flowering stage are often lacking in China's main planting areas, as winter canola usually flowers during the local rainy season. Because satellite SAR data are collected by active sensors, they are largely unaffected by rain and clouds and can provide regular observations day and night, unlike optical data [19,20]. Optical and SAR data therefore provide complementary information for winter canola mapping: optical images capture the spectral features of winter canola, while SAR data provide information about its surface roughness, texture, and dielectric properties [21,22]. Although many studies have combined optical and SAR images for crop mapping (e.g., rice, corn, and wheat) [21,23], few have addressed the early-stage mapping of winter canola with combined optical and SAR data. In other words, there has been insufficient understanding of the combined spectral and SAR features for the early-stage mapping of winter canola.
Therefore, this study explores the contribution of combined optical and SAR features to early-stage winter canola mapping by applying the random forest (RF) and support vector machine (SVM) algorithms. Early-stage Sentinel-1/2 imagery and field samples collected in 2017 were used to train RF and SVM classifiers, which were then applied to the corresponding satellite data in 2018–2022. Specifically, we address two questions: (1) What were the most important features for mapping winter canola at the early stage? (2) Can the integrated use of Sentinel-2 and Sentinel-1 data effectively improve the timeliness and accuracy of winter canola mapping, compared with the use of Sentinel-2 or Sentinel-1 alone?

2. Materials and Methods

The flowchart of the study is shown in Figure 1. First, we preprocessed the Sentinel-1 and Sentinel-2 data and extracted various features from the images. Second, the VSURF algorithm was used to select features at different phenological stages of winter canola. We then built SAR-only models and combined optical-SAR models to determine the earliest mapping timing achievable with each data source. Finally, to verify the temporal transferability of the models, the combined optical-SAR model trained in 2017 was applied to 2018 and 2020–2022, and the SAR-only model trained in 2017 was applied to 2019, to extract multi-year, early-stage winter canola planting areas.

2.1. Study Area

Jianghan Plain in Hubei Province is one of the major canola planting areas in China, accounting for 9% of total canola production in China. It lies between 29°26′ and 31°37′ north latitude and 111°14′ and 114°36′ east longitude. Winter canola accounts for approximately 90% of the total canola in China [18]. The study area includes Zhijiang, Shishou, Xiantao, Zhongxiang, Tianmen, Qianjiang, and 20 other cities and counties, with a total area of 37,660.79 square kilometers (Figure 2). The main crops grown in Jianghan Plain are rice, cotton, and canola. Canola is classified as a spring or winter type according to when it is planted. Winter canola is planted in the fall (September–October), overwinters, and is harvested in early summer (June).

2.2. Datasets and Processing

2.2.1. Sentinel-1 Data

Sentinel-1 is a C-band Synthetic Aperture Radar (SAR) satellite with dual polarization (VV and VH) [24]. The Sentinel-1 Ground Range Detected (GRD) data used in this study had been preprocessed, including orbit correction, thermal noise removal, terrain correction, and radiometric calibration [23,25,26]. In addition, we used the DnCNN algorithm to reduce speckle in the Sentinel-1 images. This method was selected because it can filter white Gaussian noise at unknown noise levels and better preserves detailed information during speckle removal [27,28,29]. Since early-stage crop mapping cannot rely on long time series, we used the mean of the single-date images within each phenological stage to construct the predictors. This process helped reduce the bias in the phenological characteristics of the remote sensing data caused by differences in sowing time. The phenological information of winter canola and winter wheat, the main winter crops of Jianghan Plain, is shown in Table 1. According to Table 1, we selected the highest-quality Sentinel-1 data for each phenological stage of winter canola; the Sentinel-1 data selected for each stage from 2017 to 2022 are listed in Table S1.
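As a rough illustration of this compositing step, the sketch below builds a mean VV/VH composite for each phenological window in Google Earth Engine. It is a minimal sketch, not the authors' exact pipeline: the date windows, the study-area rectangle, and the variable names are illustrative assumptions, and the DnCNN despeckling step is not reproduced here.

```python
# Minimal sketch (assumed workflow, not the authors' exact pipeline): per-stage mean
# VV/VH composites of Sentinel-1 GRD scenes in Google Earth Engine.
import ee

ee.Initialize()

# Approximate bounding box of the Jianghan Plain study area (illustrative only)
region = ee.Geometry.Rectangle([111.23, 29.43, 114.60, 31.62])

# Illustrative phenological windows for one season (see Table 1 and Table S1)
stages = {
    'seeding':       ('2016-12-01', '2016-12-10'),
    'overwinter':    ('2016-12-20', '2016-12-31'),
    'early_bolting': ('2017-01-20', '2017-01-31'),
}

def stage_composite(start, end):
    """Mean VV/VH composite of Sentinel-1 IW GRD scenes within one stage window."""
    s1 = (ee.ImageCollection('COPERNICUS/S1_GRD')
          .filterBounds(region)
          .filterDate(start, end)
          .filter(ee.Filter.eq('instrumentMode', 'IW'))
          .filter(ee.Filter.listContains('transmitterReceiverPolarisation', 'VV'))
          .filter(ee.Filter.listContains('transmitterReceiverPolarisation', 'VH'))
          .select(['VV', 'VH']))
    return s1.mean().clip(region)

composites = {name: stage_composite(*window) for name, window in stages.items()}
```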

2.2.2. Sentinel-2 Data

Sentinel-2 optical data provide 13 spectral bands (from visible and near-infrared to short-wave infrared) at spatial resolutions of 10, 20, and 60 m [24]. We accessed the Sentinel-2 Level-1C Top-of-Atmosphere (TOA) reflectance images from 2017 to 2022, which had been ortho-rectified and geometrically corrected. We performed atmospheric correction with the Sensor Invariant Atmospheric Correction (SIAC) algorithm implemented on the Google Earth Engine (GEE) platform [30]. We selected Sentinel-2 scenes with cloud cover below 20% to reduce cloud contamination. After cloud masking with the QA60 band, all images used in the study were resampled to 10 m using nearest-neighbor interpolation [31]. Based on the phenological stages of winter canola in Table 1, Table S2 lists the Sentinel-2 data with the best quality and the least cloud cover at the early stages of winter canola from 2017 to 2022. Figure 3 shows that, in 2019, more clouds and increased rainfall limited the availability of optical images for early-stage winter canola, making early-stage mapping with optical data difficult. This underscores the importance of SAR-based early-stage crop mapping.
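For illustration, the snippet below shows a typical GEE cloud-masking and compositing step of the kind described here. It is a sketch under assumptions: the QA60 bit masking (bits 10 and 11 for opaque clouds and cirrus) and the <20% scene filter follow the text, but the date window and region are placeholders, and the SIAC atmospheric correction is not reproduced.

```python
# Minimal sketch (GEE Python API): QA60 cloud/cirrus masking and a per-stage mean
# composite of Sentinel-2 Level-1C TOA scenes with <20% cloud cover.
import ee

ee.Initialize()

region = ee.Geometry.Rectangle([111.23, 29.43, 114.60, 31.62])  # approx. study area

def mask_s2_clouds(image):
    """Mask opaque clouds (bit 10) and cirrus (bit 11) using the QA60 band."""
    qa = image.select('QA60')
    no_cloud = qa.bitwiseAnd(1 << 10).eq(0)
    no_cirrus = qa.bitwiseAnd(1 << 11).eq(0)
    return image.updateMask(no_cloud.And(no_cirrus))

s2_overwinter = (ee.ImageCollection('COPERNICUS/S2')             # Level-1C TOA reflectance
                 .filterBounds(region)
                 .filterDate('2016-12-20', '2016-12-31')          # illustrative stage window
                 .filter(ee.Filter.lt('CLOUDY_PIXEL_PERCENTAGE', 20))
                 .map(mask_s2_clouds)
                 .mean())
# GEE resamples on the fly; nearest-neighbor at 10 m is the default when exporting.
```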

2.2.3. Sample Data

In this study, we classified the satellite data into five land types: (1) winter canola; (2) other winter crops (mainly winter wheat); (3) trees (including orchards, evergreen broad-leaved forest, deciduous broad-leaved forest, coniferous forest, and shrub); (4) water (including rivers, lakes, and ponds); and (5) buildings (including roads and houses). Within the other-winter-crops class, we focused primarily on winter wheat and did not separately classify other winter crops, such as vegetables, because winter wheat is the primary source of misclassification for winter canola. The samples were obtained by visual interpretation of high-resolution Google Earth and Planet images from 2017 to 2022 (Figure S1). To ensure representativeness, we sampled uniformly across different areas of the Jianghan Plain, and the coverage area of each sample was no less than 10 m × 10 m. The number and spatial distribution of the samples for each year are shown in Table 2 and Figure 3, respectively. As shown in Table 2, a total of 6227 samples were selected in 2017; in the subsequent years (2018–2022), the number decreased to between about 2000 and 2800 per year. The 2017 data were used for training the model and validating its accuracy at various phenological stages, while data from the other years were used only for validation. The training samples were selected from four experimental areas in 2017, namely Zhijiang, Shishou, Hanchuan, and Yingcheng. The selected training samples and early-stage images were used to train classifiers, which were then applied to the corresponding images in 2017–2022 across the study area.
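To make the link between these interpreted points and the classifiers concrete, the sketch below extracts per-stage feature values at each labeled point to form a training table in GEE. The asset IDs and the 'landcover' property name are hypothetical placeholders, not values from the paper.

```python
# Minimal sketch (GEE Python API assumed): building a training table by sampling a
# multi-band feature image at the visually interpreted points. Asset IDs and the
# 'landcover' label property are hypothetical placeholders.
import ee

ee.Initialize()

feature_image = ee.Image('users/example/overwinter_features_2017')     # hypothetical asset
samples = ee.FeatureCollection('users/example/jianghan_samples_2017')  # hypothetical asset

training_table = feature_image.sampleRegions(
    collection=samples,
    properties=['landcover'],  # class label (1-5) attached to each interpreted point
    scale=10                   # match the 10 m analysis resolution
)
```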

2.2.4. Field Survey Data

We collected field survey data at Huazhong Agricultural University during the 2021–2022 growing season to further illustrate the phenological characteristics of winter canola and the reliability of the multispectral data. From November 2021 to April 2022, winter canola and winter wheat were observed continuously across their phenological stages (Figure 4).

2.3. Feature Extraction

In this study, six types of predictors were used: spectral bands, spectral indexes, backscatter coefficients, polarization indexes, textural features, and topographical features.
We used the 2017 sample data collected in Zhijiang, Shishou, Hanchuan, and Yingcheng to extract these features, for a total of 1809 samples: 346 winter canola, 368 winter wheat, 365 trees, 365 water, and 365 buildings. We calculated 20 spectral indices from 10 spectral bands of Sentinel-2 (Table S3) and seven polarization indexes (Table 3) from the two polarization channels of Sentinel-1. We extracted 32 textural features based on the gray-level co-occurrence matrix (GLCM) from the Sentinel-1 VV and VH bands (Table 4). Moreover, three topographic features (elevation, slope, and aspect) were derived from SRTM data.
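The sketch below illustrates how some of these predictors could be derived in GEE from a per-stage Sentinel-1 composite: a few of the polarization indexes in Table 3, GLCM textures of the kind listed in Table 4, and the SRTM topographic bands. It is a minimal sketch under assumptions; in particular, the integer quantization of the backscatter before glcmTexture() and the 3-pixel window size are illustrative choices, not settings reported in the paper.

```python
# Minimal sketch (GEE Python API): polarization indexes (Table 3), GLCM textures
# (Table 4), and SRTM topography added to a per-stage Sentinel-1 VV/VH composite.
import ee

def add_polarization_indexes(img):
    vv, vh = img.select('VV'), img.select('VH')
    sri = vh.divide(vv).rename('SRI')                         # SRI  = VH / VV
    sndi = vh.subtract(vv).divide(vh.add(vv)).rename('SNDI')  # SNDI = (VH - VV) / (VH + VV)
    smi = vh.add(vv).divide(2).rename('SMI')                  # SMI  = (VH + VV) / 2
    return ee.Image.cat([img, sri, sndi, smi])

def add_glcm_textures(img):
    # glcmTexture() needs an integer image; quantize the dB backscatter to grey levels
    # (the offset/scale used here are illustrative, not taken from the paper).
    grey = img.select(['VV', 'VH']).add(30).multiply(5).toInt32()
    glcm = grey.glcmTexture(size=3)  # yields bands such as VV_savg, VH_savg, VV_diss, ...
    return img.addBands(glcm.select(['VV_savg', 'VH_savg', 'VV_diss', 'VH_diss']))

def add_topography(img):
    dem = ee.Image('USGS/SRTMGL1_003')
    return (img.addBands(dem.rename('Elevation'))
               .addBands(ee.Terrain.slope(dem))
               .addBands(ee.Terrain.aspect(dem)))
```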

2.4. Feature Selection

Feature selection is crucial, as excessive features increase computational cost and easily cause data redundancy and overfitting, degrading the performance of machine learning models [35]. To study the contribution of different features to early-stage winter canola mapping, the features of the training samples at different phenological stages were calculated and their importance was evaluated. An RF-based variable selection method (VSURF) proposed by Genuer et al. was adopted, as it is computationally efficient and well suited to feature selection for crop type mapping [37].
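VSURF itself is an R package; as a rough illustration of the idea, the stand-in sketch below ranks candidate features with random-forest importance scores and keeps the top-ranked ones, which mimics only the thresholding spirit of VSURF rather than its full interpretation and prediction steps. The array names and the number of retained features are assumptions.

```python
# Simplified stand-in for RF-based feature ranking (not the VSURF R package itself).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def rank_features(X, y, feature_names, n_keep=8, seed=0):
    """Rank features by RF mean decrease in impurity and keep the top n_keep names."""
    rf = RandomForestClassifier(n_estimators=500, random_state=seed)
    rf.fit(X, y)
    order = np.argsort(rf.feature_importances_)[::-1]
    return [feature_names[i] for i in order[:n_keep]]

# Example: selected = rank_features(X_train, y_train, names)  # X_train: samples x features
```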

2.5. Classification Models

In this study, mapping models were constructed using training data through two machine learning approaches, RF and SVM. Additionally, we analyzed the spectral bands, spectral indexes, backscatter coefficients, polarization indexes, textural features, and topographical features at the early stage as inputs to the models based on different data sources.
RF is an ensemble algorithm suitable for supervised classification [36,37]. It is based on classification and decision trees and has been successfully applied to crop mapping in remote sensing [38]. The decision trees are built independently of one another; each tree in the forest makes its own prediction, and majority voting determines the final class. Three parameters need to be optimized in random forest: ntree, the number of trees grown from bootstrap samples (drawn with replacement) of the observations; mtry, the number of different independent variables tested at each node; and node size, the minimal size of the terminal nodes of the trees [39]. In this study, we set ntree to 200, which ensures accuracy while limiting overfitting, and mtry to its default value, the square root of the number of input features.
SVM is a machine learning classification method initially proposed by Cortes and Vapnik in 1995 [40]. It is a supervised learning method widely used in statistical classification and regression analysis. SVM performs classification by determining hyperplanes that optimally separate classes; a single hyperplane separates only two classes, so multiple hyperplanes are used for multi-class classification [21]. Kernel functions are typically adopted to transform the data into a higher-dimensional space in which optimal hyperplanes are easier to identify. Various kernel functions are available, including linear, polynomial, radial basis function (RBF), and sigmoid. In this study, we set the parameters as follows: kernel_type = RBF, gamma = 0.6, and cost = 40.
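To make the two classifier configurations concrete, the sketch below instantiates them with the parameter values stated above, using scikit-learn as a stand-in implementation (the paper does not name a specific software library, so the library choice and variable names are assumptions).

```python
# Sketch of the RF and SVM classifiers with the parameter values stated in the text.
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC

# RF: ntree = 200; mtry = sqrt(number of input features) (the 'sqrt' rule)
rf_model = RandomForestClassifier(n_estimators=200, max_features='sqrt', random_state=42)

# SVM: RBF kernel, gamma = 0.6, cost (C) = 40
svm_model = SVC(kernel='rbf', gamma=0.6, C=40)

# X_train: selected early-stage features per sample; y_train: the five land-type labels
# rf_model.fit(X_train, y_train)
# svm_model.fit(X_train, y_train)
```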

2.6. Accuracy Assessment

Based on the sample data selected in the Jianghan Plain, we used the confusion matrix, overall accuracy (OA), producer's accuracy (PA) of winter canola, user's accuracy (UA) of winter canola, and the F-score to evaluate the accuracy of winter canola mapping at different phenological stages.
$$UA = \frac{TP}{TP + FP}$$
$$PA = \frac{TP}{TP + FN}$$
$$OA = \frac{TN + TP}{TN + TP + FN + FP}$$
$$F\text{-}score = \frac{2 \times PA \times UA}{PA + UA}$$
Here, TP denotes positive samples correctly classified by the model, TN denotes negative samples correctly classified, FN denotes positive samples misclassified as negative, and FP denotes negative samples misclassified as positive.
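As a worked illustration of these definitions, the short sketch below computes UA, PA, OA, and the F-score from predicted and reference labels, treating winter canola as the positive class and pooling all other land types as negatives (a binary view; the label string is an assumed placeholder).

```python
# Worked sketch of the four accuracy metrics defined above (binary view: winter canola
# versus all other land types). The class label string is an assumed placeholder.
import numpy as np

def canola_accuracy(y_true, y_pred, positive='winter_canola'):
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    tp = np.sum((y_pred == positive) & (y_true == positive))
    fp = np.sum((y_pred == positive) & (y_true != positive))
    fn = np.sum((y_pred != positive) & (y_true == positive))
    tn = np.sum((y_pred != positive) & (y_true != positive))
    ua = tp / (tp + fp)                    # User's Accuracy
    pa = tp / (tp + fn)                    # Producer's Accuracy
    oa = (tn + tp) / (tn + tp + fn + fp)   # Overall Accuracy
    f = 2 * pa * ua / (pa + ua)            # F-score
    return ua, pa, oa, f
```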

3. Results

3.1. Feature Analysis

3.1.1. Backscattering Features of Winter Canola at Different Phenological Stages

Regarding backscattering mechanisms, VH is sensitive to volume scattering (e.g., in legumes, canola, and corn) [41]. A complex vegetation canopy structure, such as multiple layers of leaves and branches, increases volume scattering. VV polarization is sensitive to surface scattering and canopy structure in the stem elongation phase and can reflect the density and height of the canopy [42,43]. Canola is a broad-leaved plant with distinctive differences in canopy structure throughout the growing season; after emergence, the plant develops a dense rosette of leaves close to the soil [44]. Meanwhile, winter wheat has a relatively sparse canopy with smaller and more closely spaced leaves. The canopy characteristics of winter canola therefore result in a stronger backscatter response. As shown by the feature distributions in Figure 5, winter canola had much higher VH and VV values than winter wheat at the early bolting, late bolting, and flowering stages. In addition, Figure 5 shows that, at the seedling, overwinter, and early bolting stages, the VH and VV polarizations of winter canola were significantly different from those of other land types (p < 0.05). These findings show that Sentinel-1 data could distinguish early-stage winter canola from other land types (especially winter wheat) to varying degrees at different phenological stages and could therefore support the subsequent mapping analyses.

3.1.2. Textural Features of Winter Canola at Different Phenological Stages

Figure 6 shows that, at the seedling, overwinter, and early bolting stages, the VH_Savg and VV_Savg features of winter canola were significantly different from those of other land types (p < 0.05). At the late bolting stage, VH_Savg and VV_Savg also showed significant differences between winter canola and the other land types (except trees). The Sum Average (SAVG) measures the average of the sum of gray levels and reflects the central tendency of the gray-level distribution in an image [33]. A high Sum Average value for a crop usually indicates a dense canopy with little gray-level variation between leaves and a high overall gray value; for example, healthy, dense crop plots may yield higher Sum Average values because of leaf overlap and complex structure. At the early stages, the leaves of the winter canola canopy are larger and denser than those of winter wheat, resulting in higher SAVG values. As shown by the feature distributions in Figure 6, VH_Savg and VV_Savg clearly separated winter canola from the other land types at the early bolting, late bolting, and flowering stages. According to this analysis, some textural features can effectively distinguish winter canola from other land types and can be used for early-stage winter canola mapping.

3.1.3. Spectral Indices of Winter Canola at Different Phenological Stages

Vegetation indices are generally high in vegetated areas, which helps distinguish winter canola from non-vegetated classes (water and buildings). During the seedling and overwinter stages, the leaves of winter canola are generally high in water content and rich in chlorophyll, whereas those of winter wheat are relatively low [45]. These differences in physicochemical parameters affect the vegetation indices and can be distinguished in remotely sensed data [45]. As shown in Figure 7, at the seedling stage, the GOSAVI of winter canola was significantly different from that of other land types (p < 0.05), and its LSWI was significantly different from that of other land types except water. At the overwinter stage, the GOSAVI and DVI of winter canola were significantly different from those of other land types (p < 0.05). At the bolting stage, the DVI of winter canola was significantly different from that of other land types (p < 0.05), and its GARI was significantly different from that of other land types except winter wheat. Overall, according to the feature distributions in Figure 7, the GOSAVI, LSWI, and DVI of winter canola were significantly higher than those of other land types at the seedling, overwinter, and bolting stages. These findings show that these spectral indices are useful for early-stage mapping of winter canola.

3.2. Earliest Mapping Timing Based on Sentinel-1

Table 5 shows the results of our Sentinel-1-based feature selection for winter canola at different phenological stages. Figure 8 shows the mapping accuracy of winter canola at different phenological stages using the RF and SVM models built on Sentinel-1 images. The accuracy first increased and then decreased as the season progressed. At the seedling stage, the F-score was lowest (0.64 under both the RF and SVM models), with overall accuracies (OAs) of 72.25% and 70.79%, respectively. At the early bolting stage, the mapping accuracy peaked, with F-scores of 0.76 and 0.75 and OAs of 79.23% and 76.62% under the RF and SVM models, respectively. The accuracy then began to decline toward the flowering stage.
Based on this analysis, the Sentinel-1 SAR images indicate that, while maintaining acceptable accuracy, winter canola could be mapped as early as the early bolting stage, approximately 50 days ahead of the traditional flowering-stage identification of winter canola. In addition, SAR-based mapping is unaffected by weather and can therefore support early-stage winter canola mapping at any time and in any year.

3.3. Earliest Mapping Timing Based on Sentinel-1 and Sentinel-2

Table 6 shows the results of our feature selection based on Sentinel-1 and Sentinel-2 for winter canola at different phenological stages. Combining Sentinel-1 and Sentinel-2 data achieved higher mapping accuracy than using Sentinel-1 SAR data alone. Table 7 shows that, at the seeding stage (early December 2017), the F-score of winter canola under the RF model with combined Sentinel-1 and Sentinel-2 data was 0.8010, with an OA of 81.48%. At the overwinter stage (late December 2017), winter canola could be classified with F-scores of 0.8615 and 0.8583 and OAs of 81.63% and 82.34% under the RF and SVM models, respectively; the F-score increased by 6% compared with the seeding stage. At the bolting stage (late February 2017), the F-score improved by 18% and 12% compared with the seeding and overwinter stages, respectively. In addition, winter canola could be classified 20 days prior to flowering at the bolting stage, 80 days prior to flowering at the overwinter stage, and 100 days prior to flowering at the seeding stage.
In summary, considering the trade-off between timeliness and accuracy, early-stage mapping of winter canola at the overwinter stage offered the best balance: it ensured high mapping accuracy while mapping winter canola 80 days ahead of flowering, which is of great value for early-stage agricultural monitoring and planning. The spatial details of the winter canola map at the overwinter stage are shown in Figure S4.

3.4. Mapping Multiyear and Early-Stage Winter Canola

Figure 9 shows the mapping accuracy of winter canola based on optical and SAR images for different cities and years. In 2018, the average F-score of winter canola reached 0.8366 and the average OA was 79.22%. The F-score was above 0.8 in most areas and exceeded 0.9 in some (such as Tianmen and Zhongxiang). The lowest accuracy occurred in Jianli, where the F-score was only 0.59 and the OA was 76.79%, indicating that a larger share of winter canola in that region was misclassified. In general, the model transferred with the combination of Sentinel-1 and Sentinel-2 data at the early overwinter stage performed well in 2018.
According to Figure 9, in 2019, winter canola had an average F-score of 0.7855 and an average OA of 80.52% when only SAR data were used. The F-score of winter canola mapping was above 0.7 in most areas and above 0.8 in some (Gongan, Jianli, Qianjiang, Shayang, Zhongxiang, and Shishou). Overall, the SAR-only model at the early bolting stage transferred well to 2019.
In 2020 and 2022, the average F-scores of winter canola were 0.9750 and 0.9482, and the average OAs were 85.42% and 86.87%, respectively. Winter canola in all eight areas of the Jianghan Plain achieved an F-score exceeding 0.9 and an OA above 80%, demonstrating good transferability of the model based on the combination of Sentinel-1 and Sentinel-2 data. In 2021, the F-score of winter canola was up to 0.9 in some areas of the Jianghan Plain (Gongan and Zhongxiang) and above 0.7 in most areas; the average F-score was 0.7569 and the average OA was 84.34%. The model transferability in 2021 was slightly lower than that in 2020.
Figure 10 shows the early-stage mapping result of winter canola in the Jianghan Plain from 2017 to 2022. The figure indicates that the planting area of winter canola in 2020 and 2021 was relatively small, while the planting area in 2017, 2019, and 2022 was relatively large.

4. Discussion

4.1. Important Features for Winter Canola Early-Season Mapping

The importance of each feature was evaluated and ranked from highest to lowest in Figure S2. By analyzing feature importance at the different stages, we determined the features selected for winter canola mapping with different data combinations (Table 5 and Table 6). The SAR signal is sensitive to the geometry (e.g., roughness, texture, internal structure) and wetness of the observed targets [46]. Han et al. found that VH reached its maximum at the podding stage of winter canola and used the high VH values in Sentinel-1 images to map it [17]. Although VH peaks at the podding stage, early-stage mapping only requires polarization features that distinguish winter canola from other crops earlier in the season. In this study, we found that the VV value of winter canola was higher than that of other crops at the early bolting stage (late January) (Figure 5). According to Cookmartin et al. [47], VV is particularly sensitive to vegetation wetness, and Fieuzal et al. [48] observed maximum water content at the stem elongation stage. At the early bolting stage of winter canola, the rapid increase in the number and length of stems per plant increases the VV polarization, which is dominated by the influence of soil and canopy, producing differences in polarization features between winter canola and other crops. In addition, textural features can distinguish ground objects with similar spectral features. We calculated the textural features provided by Sentinel-1 images and found that the VV_Savg and VH_Savg values of winter canola were higher than those of other crops at the early bolting stage (Figure 6), which is also a critical factor in distinguishing winter canola from other ground objects at the early stage based on Sentinel-1 images. Moreover, we added the SRI polarization index for early-stage mapping (Table 5). Based on this, early-stage winter canola was mapped from SAR images at the early bolting stage, with an F-score of 0.76 and an OA of 79.23% in 2017 (Figure 8).
Some studies have found that the red, green, blue, and NIR bands can be effectively used for winter canola mapping during the flowering stage [16,49,50]. Previous studies have mainly explored the features of winter canola at the flowering stage, rarely its early-stage features, and they have not yet examined the role of the red-edge bands in winter canola mapping. Because temperatures are low during the early stages of winter canola, most other ground vegetation stops growing or even gradually dies from frost [51]. In contrast, winter canola grows rapidly and develops leaves after the seeding stage, so its spectral response differs from that of other ground objects at the early stage. In this study, based on Sentinel-2 data, we found that the red-edge and near-infrared reflectance values of winter canola were higher than those of other land types at the seeding, overwinter, and bolting stages (Figure S3), which is a critical factor in distinguishing winter canola from other ground objects at the early stage (Table 6). During the overwinter stage, the differences in the red-edge and near-infrared bands were the largest, followed by VV, DVI, and GOSAVI, which also showed greater differences than the other features (Table 6). Based on this, we achieved the earliest mapping of winter canola at the overwinter stage, with an F-score of 0.86 and an OA of 81.63% using Sentinel-1 and Sentinel-2 data (Table 7). The proposed approach potentially provides a reference for early-stage mapping of other crop types in agricultural regions worldwide [51].

4.2. Advantages of Combining Sentinel-1 and Sentinel-2 Data for Winter Canola Mapping

We proposed an early-stage mapping method based on the distinctive spectral and polarization features of Sentinel-2 and Sentinel-1 images, which differs from previously developed methods [52,53] in two respects. (1) Most previous studies used a single type of optical satellite imagery for winter canola mapping. Combining optical and SAR images can mitigate the cloud problem to some extent; moreover, optical images reflect the spectral features of winter canola but not its geometric and textural features [17]. The integrated spectral and polarization indices used in this study are effective for distinguishing winter canola from other land cover types. (2) Existing winter canola mapping studies typically achieve the highest classification accuracy during the flowering and podding stages [17,54]. For example, Ashourloo et al. achieved a winter canola mapping accuracy of 88% based on Sentinel-2 time-series data from March to June [16]. In contrast, we mapped winter canola at the early stage by exploiting its early-stage features.
By combining Sentinel-1 and Sentinel-2 images, winter canola can be mapped both more accurately and earlier than with SAR data alone, mainly because optical images provide rich crop canopy spectral information [55]. Meanwhile, SAR data provide information complementary to optical sensors, reflecting the geometric and dielectric properties of vegetation and soil, which have been proven relevant for crop classification [43]; thus, combining the two can improve classification results [45,56]. Additionally, when cloud conditions limit the viability of optical data, SAR observations still provide useful information [57]. According to Table 7, adding Sentinel-1 improved the overall accuracy by about 2–4% and the F-score by about 1–2% in winter canola mapping. The improvement was modest because high accuracy could already be achieved with Sentinel-2 data alone, leaving little room for further gains. Therefore, although the early-stage mapping accuracy of winter canola using only SAR images is not as good as that of the Sentinel-1 and Sentinel-2 combination, it is still worth studying given the effects of rainy and cloudy conditions on optical imagery, especially in our study area.

4.3. Future Studies

The early-stage winter canola mapping method proposed in this study achieved satisfactory results and can map winter canola planting areas 4.5 months before harvest. However, some limitations remain. The machine learning algorithms used here, particularly the RF model, depend strongly on training samples and require manual feature selection and pre-definition of crop growth stages, whereas deep learning can overcome these limitations and learn high-dimensional features automatically [51,55,58]. For example, Wang et al. developed a deep adaptation crop classification network (DACCN) [29,59] and found that it outperformed other models (RF and SVM), with overall classification accuracies ranging from 0.835 to 0.92. Using deep neural networks for early-stage crop mapping is therefore a promising direction for future work.
The early-stage mapping method developed in this study can be applied to other regions through classifier transfer. However, classifier transfer often produces errors owing to differences in remote sensing observations and climate conditions. To obtain the best transfer results, remote sensing images from the same phenological stage should be used. Significant interannual variation in weather conditions or different crop growing dates across regions can shift crop phenology and thereby diminish the accuracy of classifier transfer. For example, in 2021, because of cloud cover, the phenological stage of the images we used differed from that of other years (Table S2), and the transfer accuracy of the classifier was only 0.7569 (F-score, Figure 9). Some countries and regions have accurate crop phenology observations, such as data from agrometeorological stations in China. In the future, combining remote sensing with topographic and climate variables (e.g., temperature and precipitation) to simulate winter canola phenology may prove an effective way to improve the accuracy of early-stage winter canola mapping.

5. Conclusions

Early-stage crop mapping would greatly benefit policymakers, civil society, and private industry in forecasting food production and ensuring food security. Here, we studied the earliest phenological stage at which winter canola can be mapped in the Jianghan Plain using a combination of Sentinel-1 and Sentinel-2 imagery. The mapping accuracy of winter canola at different phenological stages was compared, the impact of combining Sentinel-2 optical and Sentinel-1 SAR data on early-stage mapping accuracy was evaluated, and the features relevant to winter canola mapping at different phenological stages were analyzed. RF and SVM classifiers were trained on early-stage images and field samples from 2017 and then transferred to the corresponding images in 2018–2022 to obtain early-stage canola maps independent of within-year ground samples. The following conclusions can be drawn:
  • At the overwinter stage, the differences in the red-edge and near-infrared bands were the largest, followed by VV, DVI, and GOSAVI, which also showed greater differences than the other features.
  • When using only Sentinel-1 data, winter canola could be identified as early as the early bolting stage (late January, 3.5 months prior to harvest), that is, 50 days earlier than the traditional flowering period (mid-March). The F-score of winter canola under the RF and SVM models reached more than 0.75, and the OA reached close to 80%; the accuracy of the RF model was slightly higher.
  • Combining Sentinel-1 and Sentinel-2 data, the winter canola could be mapped earliest during the early overwinter stage (late December, 4.5 months prior to harvest), with the F-score of winter canola over 0.85 and the OA over 81% under the RF and SVM models. Adding Sentinel-1 could improve the overall accuracy by about 2–4% and the F-score by about 1–2% in winter canola mapping.
  • The F-score of winter canola mapping based on the classifier transfer approach in 2018–2022 varied between 0.75 and 0.97, and the OA ranged from 79% to 86%.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/rs16173197/s1, Figure S1: Five sample images from Planet and Google Earth; Figure S2: Importance of features at different winter canola phenological stages based on Sentinel-2 and Sentinel-1; Figure S3: Spectral curves of different land types at different phenological stages; Figure S4: The spatial details of the winter canola map at the overwinter stage in 2017; Figure S5: Winter canola area in each city of Jianghan Plain in 2017–2022; Table S1: Description of the number of Sentinel-1 satellite images used in the study; Table S2: Description of the number of Sentinel-2 satellite images used in the study; Table S3: Spectral indexes.

Author Contributions

Conceptualization, T.L. and R.M.; methodology, T.L. and R.M.; validation, T.L. and P.L.; formal analysis, T.L.; data curation, T.L. and P.L.; writing—original draft preparation, T.L.; writing—review and editing, T.L., R.M., J.L. and F.Z.; visualization, J.L. and R.M.; supervision, R.M., F.Z. and J.L. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by Key Research and Development Program of Heilongjiang, China (grant No. 2022ZX01A25; JD2023GJ01) and National Natural Science Foundation of China (grant No. 42171056; 42471362).

Data Availability Statement

Data will be made available on request.

Acknowledgments

We thank Chaoyang Zhang, Yigui Liao, Longfei Zhou, Ping Zhao, Binyuan Xu, and Rui Sun for their help.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Sulik, J.J.; Long, D.S. Spectral indices for yellow canola flowers. Int. J. Remote Sens. 2015, 36, 2751–2765. [Google Scholar] [CrossRef]
  2. Kontgis, C.; Schneider, A.; Ozdogan, M. Mapping rice paddy extent and intensification in the Vietnamese Mekong River Delta with dense time stacks of Landsat data. Remote Sens. Environ. 2015, 169, 255–269. [Google Scholar] [CrossRef]
  3. Zhang, M.; Lin, H. Object-based rice mapping using time-series and phenological data. Adv. Space Res. 2019, 63, 190–202. [Google Scholar] [CrossRef]
  4. Chipanshi, A.; Zhang, Y.; Kouadio, L.; Newlands, N.; Davidson, A.; Hill, H.; Warren, R.; Qian, B.; Daneshfar, B.; Bedard, F. Evaluation of the Integrated Canadian Crop Yield Forecaster (ICCYF) model for in-season prediction of crop yield across the Canadian agricultural landscape. Agric. For. Meteorol. 2015, 206, 137–150. [Google Scholar] [CrossRef]
  5. Song, Q.; Hu, Q.; Zhou, Q.; Hovis, C.; Xiang, M.; Tang, H.; Wu, W. In-season crop mapping with GF-1/WFV data by combining object-based image analysis and random forest. Remote Sens. 2017, 9, 1184. [Google Scholar] [CrossRef]
  6. Mosleh, M.; Hassan, Q.; Chowdhury, E. Application of Remote Sensors in Mapping Rice Area and Forecasting Its Production: A Review. Comput. Fluids 2015, 117, 114–124. [Google Scholar] [CrossRef]
  7. Jing, Y.; Li, G.; Chen, J.; Shi, Y. Determination of Paddy Rice Growth Indicators with MODIS Data and Ground-Based Measurements of LAI; Atlantis Press: Dordrecht, The Netherlands, 2013. [Google Scholar]
  8. Jin, Z.; Azzari, G.; You, C.; Di Tommaso, S.; Aston, S.; Burke, M.; Lobell, D.B. Smallholder maize area and yield mapping at national scales with Google Earth Engine. Remote Sens. Environ. 2019, 228, 115–128. [Google Scholar] [CrossRef]
  9. Belgiu, M.; Csillik, O. Sentinel-2 cropland mapping using pixel-based and object-based time-weighted dynamic time warping analysis. Remote Sens. Environ. 2018, 204, 509–523. [Google Scholar] [CrossRef]
  10. Griffiths, P.; Nendel, C.; Hostert, P. Intra-annual reflectance composites from Sentinel-2 and Landsat for national-scale crop and land cover mapping. Remote Sens. Environ. 2019, 220, 135–151. [Google Scholar] [CrossRef]
  11. Wang, M.; Zheng, Y.; Huang, C.; Meng, R.; Pang, Y.; Jia, W.; Zhou, J.; Huang, Z.; Fang, L.; Zhao, F. Assessing Landsat-8 and Sentinel-2 spectral-temporal features for mapping tree species of northern plantation forests in Heilongjiang Province, China. For. Ecosyst. 2022, 9, 100032. [Google Scholar] [CrossRef]
  12. Huang, Z.; Zhong, L.; Zhao, F.; Wu, J.; Tang, H.; Lv, Z.; Xu, B.; Zhou, L.; Sun, R.; Meng, R. A spectral-temporal constrained deep learning method for tree species mapping of plantation forests using time series Sentinel-2 imagery. ISPRS J. Photogramm. Remote Sens. 2023, 204, 397–420. [Google Scholar] [CrossRef]
  13. Mercier, A.; Betbeder, J.; Baudry, J.; Le Roux, V.; Spicher, F.; Lacoux, J.; Roger, D.; Hubert-Moy, L. Evaluation of Sentinel-1 & 2 time series for predicting wheat and rapeseed phenological stages. ISPRS J. Photogramm. Remote Sens. 2020, 163, 231–256. [Google Scholar]
  14. Liu, S.S.; Chen, Y.R.; Ma, Y.T.; Kong, X.X.; Zhang, X.Y.; Zhang, D.Y. Mapping Ratoon Rice Planting Area in Central China Using Sentinel-2 Time Stacks and the Phenology-Based Algorithm. Remote Sens. 2020, 12, 3400. [Google Scholar] [CrossRef]
  15. Sulik, J.J.; Long, D.S. Spectral considerations for modeling yield of canola. Remote Sens. Environ. 2016, 184, 161–174. [Google Scholar] [CrossRef]
  16. Ashourloo, D.; Shahrabi, H.S.; Azadbakht, M.; Aghighi, H.; Nematollahi, H.; Alimohammadi, A.; Matkan, A.A. Automatic canola mapping using time series of sentinel 2 images. ISPRS J. Photogramm. Remote Sens. 2019, 156, 63–76. [Google Scholar] [CrossRef]
  17. Han, J.; Zhang, Z.; Cao, J.; Luo, Y. Mapping rapeseed planting areas using an automatic phenology-and pixel-based algorithm (APPA) in Google Earth Engine. Crop. J. 2022, 10, 1483–1495. [Google Scholar] [CrossRef]
  18. Tao, J.-B.; Liu, W.-B.; Tan, W.-X.; Kong, X.-B.; Meng, X.U. Fusing multi-source data to map spatio-temporal dynamics of winter rape on the Jianghan Plain and Dongting Lake Plain, China. J. Integr. Agric. 2019, 18, 2393–2407. [Google Scholar] [CrossRef]
  19. De Vroey, M.; de Vendictis, L.; Zavagli, M.; Bontemps, S.; Heymans, D.; Radoux, J.; Koetz, B.; Defourny, P. Mowing detection using Sentinel-1 and Sentinel-2 time series for large scale grassland monitoring. Remote Sens. Environ. 2022, 280, 113145. [Google Scholar] [CrossRef]
  20. Sun, R.; Zhao, F.; Huang, C.; Huang, H.; Lu, Z.; Zhao, P.; Ni, X.; Meng, R. Integration of deep learning algorithms with a Bayesian method for improved characterization of tropical deforestation frontiers using Sentinel-1 SAR imagery. Remote Sens. Environ. 2023, 298, 113821. [Google Scholar] [CrossRef]
  21. Park, S.; Im, J.; Park, S.; Yoo, C.; Han, H.; Rhee, J. Classification and mapping of paddy rice by combining Landsat and SAR time series data. Remote Sens. 2018, 10, 447. [Google Scholar] [CrossRef]
  22. Liao, C.H.; Wang, J.F.; Xie, Q.H.; Baz, A.A.; Huang, X.D.; Shang, J.L.; He, Y.J. Synergistic Use of multi-temporal RADARSAT-2 and VENµS data for crop classification based on 1D convolutional neural network. Remote Sens. 2020, 12, 832. [Google Scholar] [CrossRef]
  23. Luo, C.; Qi, B.; Liu, H.; Guo, D.; Lu, L.; Fu, Q.; Shao, Y. Using Time Series Sentinel-1 Images for Object-Oriented Crop Classification in Google Earth Engine. Remote Sens. 2021, 13, 561. [Google Scholar] [CrossRef]
  24. Hu, L.; Xu, N.; Liang, J.; Li, Z.; Chen, L.; Zhao, F. Advancing the Mapping of Mangrove Forests at National-Scale Using Sentinel-1 and Sentinel-2 Time-Series Data with Google Earth Engine: A Case Study in China. Remote Sens. 2020, 12, 3120. [Google Scholar] [CrossRef]
  25. Gašparović, M.; Dobrinić, D. Comparative assessment of machine learning methods for urban vegetation mapping using multitemporal sentinel-1 imagery. Remote Sens. 2020, 12, 1952. [Google Scholar] [CrossRef]
  26. Sun, Z.; Luo, J.H.; Yang, J.Z.C.; Yu, Q.Y.; Zhang, L.; Xue, K.; Lu, L.R. Nation-Scale Mapping of Coastal Aquaculture Ponds with Sentinel-1 SAR Data Using Google Earth Engine. Remote Sens. 2020, 12, 3086. [Google Scholar] [CrossRef]
  27. Wang, P.; Zhang, H.; Patel, V.M. SAR Image Despeckling Using a Convolutional Neural Network. IEEE Signal Process. Lett. 2017, 24, 1763–1767. [Google Scholar] [CrossRef]
  28. Zhang, K.; Zuo, W.; Chen, Y.; Meng, D.; Zhang, L. Beyond a Gaussian Denoiser: Residual Learning of Deep CNN for Image Denoising. IEEE Trans. Image Process. 2017, 26, 3142–3155. [Google Scholar] [CrossRef] [PubMed]
  29. Wang, Y.; Feng, L.; Zhang, Z.; Tian, F. An unsupervised domain adaptation deep learning method for spatial and temporal transferable crop type mapping using Sentinel-2 imagery. ISPRS J. Photogramm. Remote Sens. 2023, 199, 102–117. [Google Scholar] [CrossRef]
  30. Yin, F.; Lewis, P.E.; Gomez-Dans, J. Bayesian atmospheric correction over land: Sentinel-2/MSI and Landsat 8/OLI. Geosci. Model Dev. 2022, 15, 7933–7976. [Google Scholar] [CrossRef]
  31. Porwal, S.; Katiyar, S.K. Performance evaluation of various resampling techniques on IRS imagery. In Proceedings of the International Conference on Contemporary Computing, Noida, India, 7–9 August 2014. [Google Scholar]
  32. Periasamy, S. Significance of dual polarimetric synthetic aperture radar in biomass retrieval: An attempt on Sentinel-1. Remote Sens. Environ. 2018, 217, 537–549. [Google Scholar] [CrossRef]
  33. Haralick, R.M.; Shanmugam, K.; Dinstein, I.H. Textural features for image classification. IEEE Trans. Syst. Man Cybern. 1973, SMC-3, 610–621. [Google Scholar] [CrossRef]
  34. Conners, R.W.; Trivedi, M.M.; Harlow, C.A. Segmentation of a high-resolution urban scene using texture operators. Comput. Vis. Graph Image Process 1984, 25, 273–310. [Google Scholar] [CrossRef]
  35. Hastie, T.; Tibshirani, R.; Friedman, J.; Franklin, J. The Elements of Statistical Learning: Data Mining, Inference, and Prediction. Math. Intell. 2004, 27, 83–85. [Google Scholar] [CrossRef]
  36. Breiman, L. Random Forests. Mach. Learn. 2001, 45, 5–32. [Google Scholar] [CrossRef]
  37. Genuer, R.; Poggi, J.M.; Tuleau-Malot, C. VSURF: Variable Selection Using Random Forests. Pattern Recognit. Lett. 2016, 31, 2225–2236. [Google Scholar] [CrossRef]
  38. Chen, N.; Yu, L.; Zhang, X.; Shen, Y.; Zeng, L.; Hu, Q.; Niyogi, D. Mapping Paddy Rice Fields by Combining Multi-Temporal Vegetation Index and Synthetic Aperture Radar Remote Sensing Data Using Google Earth Engine Machine Learning Platform. Remote Sens. 2020, 12, 2992. [Google Scholar] [CrossRef]
  39. Wei, C.; Huang, J.; Mansaray, L.; Li, Z.; Liu, W.; Han, J. Estimation and Mapping of Winter Oilseed Rape LAI from High Spatial Resolution Satellite Data Based on a Hybrid Method. Remote Sens. 2017, 9, 488. [Google Scholar] [CrossRef]
  40. Cortes, C.; Vapnik, V. Support-vector networks. Mach. Learn. 1995, 20, 273–297. [Google Scholar] [CrossRef]
  41. Whelen, T.; Siqueira, P. Use of time-series L-band UAVSAR data for the classification of agricultural fields in the San Joaquin Valley. Remote Sens. Environ. 2017, 193, 216–224. [Google Scholar] [CrossRef]
  42. Liao, C.; Wang, J.; Shang, J.; Huang, X.; Liu, J.; Huffman, T. Sensitivity study of Radarsat-2 polarimetric SAR to crop height and fractional vegetation cover of corn and wheat. Int. J. Remote Sens. 2018, 39, 1475–1490. [Google Scholar] [CrossRef]
  43. Mcnairn, H.; Shang, J.; Jiao, X.; Champagne, C. The Contribution of ALOS PALSAR Multipolarization and Polarimetric Data to Crop Classification. IEEE Trans. Geosci. Remote Sens. 2009, 47, 3981–3992. [Google Scholar] [CrossRef]
  44. Mandal, D.; Kumar, V.; Ratha, D.; Dey, S.; Bhattacharya, A.; Lopez-Sanchez, J.M.; McNairn, H.; Rao, Y.S. Dual polarimetric radar vegetation index for crop growth monitoring using sentinel-1 SAR data. Remote Sens. Environ. 2020, 247, 111954. [Google Scholar] [CrossRef]
  45. Zhang, H.; Liu, W.; Zhang, L. Seamless and automated rapeseed mapping for large cloudy regions using time-series optical satellite imagery. ISPRS J. Photogramm. Remote Sens. 2022, 184, 45–62. [Google Scholar] [CrossRef]
  46. Betbeder, J.; Fieuzal, R.; Philippets, Y.; Ferro-Famil, L.; Baup, F. Contribution of multitemporal polarimetric synthetic aperture radar data for monitoring winter wheat and rapeseed crops. J. Appl. Remote Sens. 2016, 10, 026020. [Google Scholar] [CrossRef]
  47. Cookmartin, G.; Saich, P.; Quegan, S.; Cordey, R.; Burgess-Allen, P.; Sowter, A. Modeling microwave interactions with crops and comparison with ERS-2 SAR observations. IEEE Trans. Geosci. Remote Sens. 2002, 38, 658–670. [Google Scholar] [CrossRef]
  48. Fieuzal, R.; Baup, F.; Maraissicre, C. Monitoring Wheat and Rapeseed by Using Synchronous Optical and Radar Satellite Data—From Temporal Signatures to Crop Parameters Estimation. Adv. Remote Sens. 2013, 2, 162–180. [Google Scholar] [CrossRef]
  49. Sulik, J.J.; Long, D.S. Automated detection of phenological transitions for yellow flowering plants such as Brassica oilseeds. Agrosystems Geosci. Environ. 2020, 3, e20125. [Google Scholar] [CrossRef]
  50. Zang, Y.; Chen, X.; Chen, J.; Tian, Y.; Cui, X. Remote Sensing Index for Mapping Canola Flowers Using MODIS Data. Remote Sens. 2020, 12, 3912. [Google Scholar] [CrossRef]
  51. Huang, X.; Huang, J.; Li, X.; Shen, Q.; Chen, Z. Early mapping of winter wheat in Henan province of China using time series of Sentinel-2 data. GIsci Remote Sens. 2022, 59, 1534–1549. [Google Scholar] [CrossRef]
  52. D’Andrimont, R.; Taymans, M.; Lemoine, G.; Ceglar, A.; Velde, M.V.D. Detecting flowering phenology in oil seed rape parcels with Sentinel-1 and -2 time series. Remote Sens. Environ. 2020, 239, 111660. [Google Scholar] [CrossRef]
  53. Dong, W.; Shenghui, F.; Zhenzhong, Y.; Lin, W.; Wenchao, T.; Yucui, L.; Chunyan, T. A Regional Mapping Method for Oilseed Rape Based on HSV Transformation and Spectral Features. Int. J. Geo-Inf. 2018, 7, 224. [Google Scholar]
  54. Tao, J.-B.; Zhang, X.-Y.; Wu, Q.-F.; Wang, Y. Mapping winter rapeseed in South China using Sentinel-2 data based on a novel separability index. J. Integr. Agric. 2023, 22, 1645–1657. [Google Scholar] [CrossRef]
  55. Adrian, J.; Sagan, V.; Maimaitijiang, M. Sentinel SAR-optical fusion for crop type mapping using deep learning and Google Earth Engine. ISPRS J. Photogramm. Remote Sens. 2021, 175, 215–235. [Google Scholar] [CrossRef]
  56. Orynbaikyzy, A.; Gessner, U.; Conrad, C. Crop type classification using a combination of optical and radar remote sensing data: A review. Int. J. Remote Sens. 2019, 40, 6553–6595. [Google Scholar] [CrossRef]
  57. Arias, M.; Campo-Bescós, M.Á.; Álvarez-Mozos, J. Crop Classification Based on Temporal Signatures of Sentinel-1 Observations over Navarre Province, Spain. Remote Sens. 2020, 12, 278. [Google Scholar] [CrossRef]
  58. Zhong, L.; Hu, L.; Zhou, H.; Tao, X. Deep learning based winter wheat mapping using statistical data as ground references in Kansas and northern Texas, US. Remote Sens. Environ. 2019, 233, 111411. [Google Scholar] [CrossRef]
  59. Zhao, F.; Sun, R.; Zhong, L.; Meng, R.; Huang, C.; Zeng, X.; Wang, M.; Li, Y.; Wang, Z. Monthly mapping of forest harvesting using dense time series Sentinel-1 SAR imagery and deep learning. Remote Sens. Environ. 2022, 269, 112822. [Google Scholar] [CrossRef]
Figure 1. Flowchart of the study.
Figure 2. Study area (from Globe Land30 2010 dataset).
Figure 3. Optical image coverage and spatial distribution of samples in Jianghan Plain from 2017 to 2022.
Figure 4. RGB images showing phenological stages of winter canola in the experiment.
Figure 5. Backscatter coefficient distributions of winter canola during different phenological stages across different land types (Different letters represent significant differences between crops).
Figure 6. Textural features of winter canola and other land types at different phenological stages (Different letters represent significant differences between crops).
Figure 7. Spectral features and significant difference tests of land types at different phenological stages (Different letters represent significant differences between crops).
Figure 8. Mapping accuracy of winter canola at different phenological stages based on Sentinel-1 data.
Figure 9. Mapping accuracy of winter canola in 2018–2022 based on Sentinel-1 and Sentinel-2 data.
Figure 10. Maps of winter canola in 2017–2022.
Table 1. Phenological information of main crops in winter.
Table 2. Number of ground samples in 2017–2022.
Class | 2017 | 2018 | 2019 | 2020 | 2021 | 2022
Winter canola | 1302 | 521 | 549 | 321 | 320 | 339
Winter wheat | 1216 | 527 | 561 | 340 | 340 | 325
Trees | 1225 | 508 | 559 | 447 | 447 | 447
Water | 1242 | 520 | 591 | 470 | 470 | 470
Building | 1242 | 517 | 585 | 465 | 465 | 465
Total | 6227 | 2593 | 2845 | 2043 | 2042 | 2046
Table 3. Polarization indexes.
Polarization Indexes | Formula | Citation
SRI | VH/VV | This paper
SDI | VH − VV | This paper
SNDI | (VH − VV)/(VH + VV) | This paper
SDRI | 2 × VV/(VH + VV) | This paper
SRD | VH/VV − VV/VH | This paper
SND | (VH/VV − VV/VH)/(VH/VV + VV/VH) | This paper
SMI | (VH + VV)/2 | [32]
Table 4. Textural features.
Textural Features | Citation
ASM, Contrast, CORR, VAR, IDM, SAVG, SVAR, SENT, ENT, DVAR, DENT, IMCORR1, IMCORR2, MaxCORR | [33]
DISS, PROM | [34]
Table 5. Feature selection of winter canola at different phenological stages based on Sentinel-1.
Phenological Stages | Selected Variables
Seeding stage (Early December) | VH_Savg, VV_Savg, SMI, SRI, Elevation, SDRI, VV_Diss
Overwinter stage (Late December) | VV_Savg, SRI, VV, SND, VH_Savg, Elevation, VH_Svar, VH_Prom
Early bolting stage (Late January) | VV_Savg, VH_Savg, VV, SRI, Elevation, VV_Dvar, VH_Diss
Late bolting stage (Mid-February) | SMI, VH_Savg, VH, VV, Elevation, VV_Savg, VV_Diss
Flowering stage (Late March) | VH_Savg, VH, SMI, VV_Savg, Elevation, SDI, VV_Diss
Table 6. Feature selection of winter canola at different phenological stages based on Sentinel-2 and Sentinel-1.
Phenological Stages | Selected Variables
Seeding stage (Early December) | B6, B5, B8, RPVI, GOSAVI, LSWI, VV
Overwinter stage (Late December) | B6, B7, B8, DVI, GOSAVI, VV, B5
Late bolting stage (Mid-February) | B6, B8, B7, VV, DVI, VH, GARI
Table 7. Accuracy evaluation of different data sources at the different phenological stage.
Model | Data Source | Seeding F-Score | Seeding OA | Overwinter F-Score | Overwinter OA | Bolting F-Score | Bolting OA
RF | S2 | 0.7928 | 80.06% | 0.8445 | 77.53% | 0.9791 | 91.65%
RF | S1 + S2 | 0.8010 | 81.48% | 0.8615 | 81.63% | 0.9831 | 94.35%
SVM | S2 | 0.7658 | 80.93% | 0.8478 | 79.94% | 0.9780 | 92.11%
SVM | S1 + S2 | 0.7894 | 82.58% | 0.8583 | 82.34% | 0.9800 | 94.48%
