Article

High-Resolution Flowering Index for Canola Yield Modelling

by Hansanee Fernando, Thuan Ha, Anjika Attanayake, Dilshan Benaragama, Kwabena Abrefa Nketia, Olakorede Kanmi-Obembe and Steven J. Shirtliffe *
Department of Plant Science, College of Agriculture and Bioresources, University of Saskatchewan, Saskatoon, SK S7N 5A8, Canada
* Author to whom correspondence should be addressed.
Remote Sens. 2022, 14(18), 4464; https://doi.org/10.3390/rs14184464
Submission received: 13 June 2022 / Revised: 26 August 2022 / Accepted: 6 September 2022 / Published: 7 September 2022
(This article belongs to the Section Remote Sensing in Agriculture and Vegetation)

Abstract

Canola (Brassica napus), with its prominent yellow flowers, has unique spectral characteristics that require special spectral indices to quantify the flowers. This study investigated four spectral indices for segmenting yellow flower pixels in high-resolution RGB images and compared them for digitally quantifying canola flower area to develop a seed yield prediction model. A small-plot (2.75 m × 6 m) experiment was conducted at Kernen Research Farm, Saskatoon, where canola was grown at six row spacings and eight seeding rates with four replicates (192 plots). The flower canopy reflectance was imaged using a high-resolution (0.15 cm ground sampling distance) 100 MP iXU 1000 RGB sensor mounted on an unpiloted aerial vehicle (UAV). The spectral indices were evaluated for their efficiency in identifying canola flower pixels using linear discriminant analysis (LDA). Digitized flower pixel area was used as a predictor of seed yield to develop four models. Seventy percent of the data were used for model training and 30% for testing. Models were compared using two performance metrics: the coefficient of determination (R2) and the root mean squared error (RMSE). The High-resolution Flowering Index (HrFI), a new flower index proposed in this study, was identified as the most accurate in detecting flower pixels, especially in high-resolution imagery containing within-canopy shadow pixels. There were strong, positive associations between digitized flower area and canola seed yield, with peak flowering giving a higher R2 (0.82) than early flowering (0.72). Cumulative flower pixel area explained 75% of yield variability. Our results indicate that the HrFI and Modified Yellowness Index (MYI) were better predictors of canola yield than the NDYI and Red Blue Normalizing Index (RBNI), as they were able to discriminate between canola petals and within-canopy shadows. We suggest further studies to evaluate the performance of the HrFI and MYI vegetation indices using medium-resolution UAV and satellite imagery.

1. Introduction

Canola (Brassica napus L.) is an oilseed crop produced for its edible oil, high-protein meal, and biofuel [1]. Canada is the leading producer of canola, with 18.7 million tons produced in 2020 [2]. The crop is primarily produced in western Canada, with Saskatchewan accounting for 55 percent of total production. Increased consumer demand necessitates increased canola production, and enhanced genetic and agronomic approaches are indispensable for achieving high crop yields.
Crop yield forecasts assist farmers and stakeholders in making timely decisions concerning imports and exports, management, and finances [3]. Genotype, environmental factors, and their interactions make yield prediction extremely difficult due to the intricacy of the yield components. Traditionally, experts forecasted yield from within-season crop measurements using statistical approaches such as regression [4]. With advancements in technology, the acquisition of non-destructive, accurate phenotypic data through remote sensing approaches has substantially improved the data quality necessary for reliable yield estimations.
Canola has an indeterminate growth habit with overlapping growth stages. However, each stage can be easily distinguished due to prominent phenological changes. Canola flowers are bright yellow, and hence the reproductive stage can be easily distinguished from the vegetative stage. Flowering intensity is considered a direct contributor to seed yield, as each flower determines the potential number of pods [5]. Therefore, quantifying flowers through high-throughput methods as an indirect approach to determine the yield is pivotal for plant breeding and crop production [6]. Furthermore, quantifying the conspicuous canola flower number using unpiloted aerial vehicle (UAV)-based high-resolution RGB imagery is less labor intensive and time consuming than manually counting flowers, especially in breeding trials with hundreds of plots.
Remote sensing techniques extract canopy spectral information, which can be used to study biophysical parameters such as biomass, chlorophyll content, leaf area, and flowering intensity [7]. Building empirical relationships between the remotely sensed spectral information and the biophysical variables via vegetation indices is quick and straightforward [8]. Although such relationships cannot be applied reliably to environments outside the range represented by the calibration dataset, the use of vegetation indices in remote sensing studies keeps increasing.
Canola flowers reflect more green and red light and absorb more blue light when compared to green vegetation [9]. These prominent spectral characteristics have led to the development of new vegetation indices [10,11,12,13,14]. To our knowledge, except for the Normalized Difference Yellowness Index (NDYI) developed by Long and Sulik [12], other yellowness indices use spectral information from bands outside the visible region of the electromagnetic spectrum. The contrast between the reduced blue reflectance and increased green and red reflectance [9] in canola flowers was enhanced through multiplying the band differences to develop the HrFI (Table 1).
Within-canopy shadow pixels, where canopy objects (leaves and flowers) completely block direct light, are problematic, especially when quantifying canola flower pixels using high-resolution imagery (<1 cm) [15]. Therefore, it is imperative to develop new vegetation indices that can filter the pixels disturbed by the effects of shadows in RGB imagery. Furthermore, the use of visible bands for these vegetation indices will enhance their applicability where hyperspectral or multispectral sensors are scarce and expensive. Moreover, RGB sensors usually have ultra-high spatial resolution compared to hyper- and multi-spectral imagery, allowing users to gather spatially explicit data. Accurate flower pixel segmentation is critical for image-based assessments with high spatial resolution imagery [16]. However, intense natural illumination and the influence of shadows from the crop canopy make segmenting flowers from the canopy challenging because of the high contrast ratio. The segmentation process often confounds background noise with the spectral information of flowers [17].
The accuracy of image processing steps such as thresholding and segmentation therefore strongly influences the resulting estimates. The confusion matrix, which is at the core of remote sensing accuracy assessment methods, is used to refine the threshold/classification and improve the accuracy of the process [18].
We propose a straightforward method using standard index-based thresholding to detect and extract flower area for yield estimation. Furthermore, the exclusion of noise from the target class is extensively evaluated to improve the classification accuracy of canola flowers.
The study hypothesized that digitized flower information with minimum noise from high-resolution RGB imagery will be a strong indicator of canola reproductive potential and could be used to estimate the canola seed yield. The study aimed to (1) compare vegetation indices to quantify canola flowers using very-high-resolution RGB imagery, and (2) develop canola seed yield prediction models using digitized flower area.

2. Materials and Methods

2.1. Study Area

The study area is located at the Kernen Research Farm, Saskatoon, Canada (52°16′N, 106°51′W). The Dekalb Roundup Ready variety was seeded on 15 May 2019 with an R-tech row spacing disc drill, and the treatments were applied under a randomized complete block design (RCBD) with four replicates. Two guard rows were placed at both ends of the trial to reduce environmental effects on the treatments. The plants were seeded in 2.75 m × 6 m plots at six row spacings (15, 30, 45, 60, 75, 90 cm) and eight seeding rates (5, 10, 20, 40, 60, 80, 100, 140 plants/m2) at a seeding depth of 2 cm to create plots with different flowering densities. Due to the large number of treatments (48), the treatments within each replicate were stacked into two rows (Figure 1). The plots were manually weeded once every two weeks to ensure that the images only captured the canola canopy. Fertilizer application was based on the soil test recommendations.

2.2. Ground-Truthed Yield Data

Canola plants were hand-harvested at physiological maturity; depending on the row spacing treatment allocated to each plot, plants were uprooted systematically. All plots were harvested for seed yield on 4 September 2019. For the data analysis, the initial grain yield (g/plot) of all plots was converted to kg/ha. The trial was regularly monitored for significant changes in phenology to identify the onset of the reproductive stage. The flowering stage started on 27 June (BBCH: 61) and ended on 9 August (BBCH: 67), 2019 [19].

2.3. Image Acquisition and Pre-Processing

A 100 MP iXU 1000 RGB camera (Phase One, Frederiksberg, Denmark) mounted on a DJI M600 multirotor hexacopter UAV (SZ DJI Technology Co., Shenzhen, China) was used for aerial image acquisition. The RGB camera provides three-band images: red, green, and blue. Three flights were scheduled during the reproductive stage: the 1st flight at early flowering (BBCH: 63, 4 July); the 2nd flight at peak flowering (BBCH: 65, 12 July); and the 3rd flight at the end of flowering (BBCH: 67, 23 July).
The UAV followed a pre-set flight path autonomously at a ground speed of 3 m/s, with 70% frontal and 75% side image overlap. The imagery was captured from an altitude of 20 m AGL (above ground level), achieving a ground sample distance of ~0.17 cm. The RGB sensor captured aerial images from a nadir view. The workflow of the image processing is presented in Figure 2.
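As a rough check on the reported ground sample distance, the sketch below recomputes it from the flight altitude; the pixel pitch and focal length are not given in the text, so the ~4.6 µm pitch and 50 mm lens assumed here are placeholders only.

```r
# Illustrative ground-sample-distance (GSD) check for a nadir flight.
# Sensor pixel pitch and lens focal length are ASSUMED values, not from the paper.
altitude_m    <- 20        # flight altitude above ground level (from the text)
pixel_pitch_m <- 4.6e-6    # assumed pixel pitch of the sensor
focal_len_m   <- 0.050     # assumed lens focal length

gsd_cm <- (pixel_pitch_m * altitude_m / focal_len_m) * 100
gsd_cm  # ~0.18 cm, broadly consistent with the ~0.17 cm reported above
```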
Acquired imagery was preprocessed using the Agisoft Metashape software (Agisoft LLC, St. Petersburg, Russia) to develop three 8-bit orthomosaics of 79,175 × 78,977 pixels. The pixel brightness values (digital numbers) were used in processing the orthomosaics and are henceforth collectively referred to as reflectance. The geo-referenced orthomosaics with three bands were further used for flower segmentation. The images were processed using ESRI ArcGIS Pro Version 10.6 software [20].

2.4. Image Processing

2.4.1. Vegetation Indices

The use of vegetation indices to monitor and assess changes in crop phenology is a simple yet effective method commonly used in remote sensing studies [21]. This study used four spectral indices to segment flower pixels in high-resolution RGB imagery. Vegetation indices that require only the red, green, and blue bands were selected for this study. The NDYI is a well-known yellowness index used to study yellow flowers; the normalized difference between the green and blue bands is used to enhance canola flower reflectance [12]. The NDYI was used as a reference index to compare the performance of three new reproductive indices proposed in this study. The Modified Yellowness Index (MYI) and High-resolution Flowering Index (HrFI) use all three visible bands, while the Red Blue Normalizing Index (RBNI) uses the normalized red and blue bands. The new indices suggested in this study (MYI and HrFI) are proposed for very-high-resolution RGB imagery. Red, green, and blue in the following vegetation indices refer to the reflectance values of the respective bands of the RGB sensor.
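As a minimal sketch of how these indices can be computed band-wise from an RGB orthomosaic, the R snippet below applies the equations in Table 1 with the terra package; the file name and band order are assumptions, and the pixel values are the 8-bit digital numbers described above.

```r
# Sketch only: compute the four indices from an RGB orthomosaic.
# "ortho_2019-07-12.tif" and the red-green-blue band order are placeholders.
library(terra)

ortho <- rast("ortho_2019-07-12.tif")
red   <- ortho[[1]]
green <- ortho[[2]]
blue  <- ortho[[3]]   # 8-bit DN; zero-valued blue pixels would need handling for MYI

ndyi <- (green - blue) / (green + blue)   # Normalized Difference Yellowness Index
rbni <- (red - blue) / (red + blue)       # Red Blue Normalizing Index
myi  <- (red * green) / blue              # Modified Yellowness Index
hrfi <- (red - blue) * (green - blue)     # High-resolution Flowering Index (Table 1)
```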

2.4.2. Comparison of Vegetation Indices

Vegetation indices enhance certain plant biophysical features and help distinguish them from background features. For example, the NDVI was developed to quantify green vegetation, but the presence of yellow flowers reduces the accuracy of its estimates [12]. Four ground cover classes (flowers, soil, shadows, and leaves) were used to investigate the value distribution of each vegetation index across the four features. A vector point layer was developed, and a total of 120 points were assigned (30 points per ground cover type). The NDYI, MYI, HrFI, and RBNI values were extracted at these sampling points, and the value distributions in each ground cover category were visualized via a boxplot. Furthermore, linear discriminant analysis (LDA) was used to identify the index that best separates the above ground cover classes using information from all three image dates. Based on the boxplot and LDA results, the vegetation index that best differentiated flower pixels from non-flower pixels was chosen to calculate the digitized flower area in the three orthomosaics.
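A compact sketch of this class-separation step follows, assuming the sampled index values sit in a data frame `samples` with one row per reference point and hypothetical column names (class, ndyi, rbni, myi, hrfi).

```r
# Sketch of the ground-cover separation test with linear discriminant analysis.
library(MASS)

lda_fit <- lda(class ~ ndyi + rbni + myi + hrfi, data = samples)
lda_fit$svd^2 / sum(lda_fit$svd^2)   # share of separation per discriminant function
scores <- predict(lda_fit)$x         # LD1/LD2 scores of the kind plotted in Figure 6
```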

2.4.3. Threshold Selection through Accuracy Assessment

Threshold-based segmentation was used to filter the non-flower pixels from the orthomosaic [22]. The boxplot results were used to identify an approximate range of threshold values for flower segmentation for the most accurate index. Within this range, several candidate thresholds were identified to create corresponding binary classifications of flower and non-flower pixels. The accuracy of each classification was assessed via a confusion matrix [23]. Initially, a reference layer with two classes (flowers, non-flowers) was created, and 150 reference sampling points were assigned to each class. A confusion matrix was created for each threshold-based classification, and the accuracy was assessed using the kappa coefficient, which indicates the reliability of a classification compared to random chance [24]. Kappa statistics range from −1 to +1, with values between 0.81 and 1.00 indicating near-perfect agreement. The threshold with the highest kappa coefficient was therefore selected for the final segmentation to extract the flower area for each image date.
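The threshold search could be scripted as in the sketch below, assuming `hrfi` is the index raster and `ref_pts` is a terra SpatVector of the 300 reference points with a column `truth` coded 1 for flower and 0 for non-flower; all object names and the candidate threshold grid are placeholders.

```r
# Sketch: Cohen's kappa for a set of candidate HrFI thresholds.
library(terra)

kappa_for <- function(threshold) {
  pred <- as.integer(terra::extract(hrfi > threshold, ref_pts)[, 2])  # 1 = flower, 0 = non-flower
  obs  <- ref_pts$truth
  tab  <- table(factor(pred, levels = c(0, 1)), factor(obs, levels = c(0, 1)))
  po <- sum(diag(tab)) / sum(tab)                      # observed agreement
  pe <- sum(rowSums(tab) * colSums(tab)) / sum(tab)^2  # agreement expected by chance
  (po - pe) / (1 - pe)                                 # Cohen's kappa
}

thresholds <- seq(1000, 6000, by = 500)                # placeholder search grid
kappas     <- sapply(thresholds, kappa_for)
thresholds[which.max(kappas)]                          # threshold retained for segmentation
```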

2.4.4. Data Extraction

A vector polygon layer was developed to calculate the flower pixel area from the imagery, matching the ground area harvested for yield. Therefore, only the flower information from the rows used in the seed yield calculation was extracted for data analysis (Figure 3). The treatment structure used in this study allowed for different flowering densities between plots. For the three orthomosaics, the digitized flower area in each plot was calculated and extracted through the polygon layer. The total number of flower pixels in each plot is defined as the digitized flower area.
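A sketch of this per-plot extraction with terra follows, assuming `flower_mask` is the thresholded HrFI raster (1 = flower, NA = non-flower) and a hypothetical shapefile of harvest-area polygons.

```r
# Sketch: digitized flower area = count of flower pixels inside each harvest polygon.
library(terra)

plots <- vect("harvest_polygons.shp")                  # placeholder file name
plots$flower_area <- terra::extract(flower_mask, plots,
                                    fun = sum, na.rm = TRUE)[, 2]
```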

2.5. Model Development

Extracted data were imported and analyzed using RStudio Version 3.6.1 (RStudio, Boston, MA, USA, 2018). Two approaches were used in developing and comparing the yield models (Table 2). The first approach used the digitized flower area (FA) of an individual image date as a predictor (Models 1, 2, and 3). In the second approach (Model 4), the time-series curves of the digitized flower area were used to calculate the cumulative flower area (CFA) [25]. The cumulative flower area over time per plot was calculated using the trapezoid method of the ‘bayestestR’ package [26].
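A minimal sketch of the cumulative flower area calculation, assuming a long-format data frame `fa` with hypothetical columns plot, doy (day of year of the image date), and flower_area:

```r
# Sketch: cumulative flower area (Model 4 predictor) via the trapezoid rule.
library(bayestestR)
library(dplyr)

cfa <- fa %>%
  group_by(plot) %>%
  summarise(cfa = area_under_curve(doy, flower_area, method = "trapezoid"))
```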
Models 1 and 2 were developed using a non-linear three-parameter asymptotic regression model fitted with the ‘drc’ package [27]. In the three-parameter model, Y represents the canola seed yield, d and c represent the upper and lower horizontal asymptotes, and the e parameter determines the steepness of the curve. Models 3 and 4 were simple linear regression models, where Y, x, m, and c represent the seed yield, predictor variable (flower area), slope, and intercept, respectively.
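The four models of Table 2 could be fitted as in the sketch below, assuming a training data frame `train` with hypothetical columns yield_kg_ha, fa_early, fa_peak, fa_late, and cfa; AR.3() in drc parameterizes the asymptotic model as Y = c + (d − c)(1 − exp(−x/e)).

```r
# Sketch of the four yield models in Table 2.
library(drc)

# Models 1 and 2: three-parameter asymptotic regression
m1 <- drm(yield_kg_ha ~ fa_early, data = train, fct = AR.3(names = c("c", "d", "e")))
m2 <- drm(yield_kg_ha ~ fa_peak,  data = train, fct = AR.3(names = c("c", "d", "e")))

# Models 3 and 4: simple linear regression, Y = m * x + c
m3 <- lm(yield_kg_ha ~ fa_late, data = train)
m4 <- lm(yield_kg_ha ~ cfa,     data = train)
```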
All models were trained and validated using the hold-out method, where 70% of the data were used to train the model and 30% to validate it [28]. Two accuracy metrics, the coefficient of determination (R2) and the root mean squared error (RMSE), were used for the performance evaluation on the validation dataset.
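The 70/30 hold-out split and the two validation metrics could be computed as follows; this is a sketch in which `dat`, the random seed, and `m2` (the peak-flowering model from the previous sketch, refit on the training subset) are placeholders.

```r
# Sketch: hold-out validation and accuracy metrics.
set.seed(42)                                    # arbitrary seed, for reproducibility only
idx   <- sample(nrow(dat), size = round(0.7 * nrow(dat)))
train <- dat[idx, ]
test  <- dat[-idx, ]

# refit any of the four models on `train` (see previous sketch), then evaluate:
pred <- predict(m2, newdata = test)             # e.g., the peak-flowering model
rmse <- sqrt(mean((test$yield_kg_ha - pred)^2)) # validation RMSE
r2   <- cor(test$yield_kg_ha, pred)^2           # validation R2
```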

3. Results

3.1. Comparison of Yellowness Indices

The vegetation indices differed visually in their ability to correctly identify canola flowers. The indices were stretched from the minimum to the maximum raster values using the multipart color scheme in ArcGIS Pro. Visual observation of the spectral indices indicated poor contrast between the flower and non-flower classes in the NDYI and RBNI compared to the HrFI and MYI. Furthermore, in the RBNI and NDYI, the within-canopy shadow pixels appear as flower pixels, indicating the poor performance of these indices.
The descriptive analysis showed the differences between the vegetation indices in their ability to discriminate flowers from other classes, and the visual observations (Figure 4) were confirmed through the ground cover class analysis (Figure 5). The HrFI showed the most distinct difference between the flower class and the other three ground cover classes, followed by the MYI (Figure 5). However, the difference between the flower class and the non-flower classes in the MYI was not as prominent as in the HrFI; in the MYI, there was a slight overlap between the flower and shadow classes. The NDYI and RBNI did not distinguish between the flower and shadow classes. The complete overlap of index values between flower pixels and shadow pixels illustrates the poor performance of the NDYI and RBNI in identifying canola flowers against the background, especially when the image contains pure within-canopy shadow pixels. However, mixed shadow pixels that are not completely blocked from direct sunlight do not hinder the NDYI and RBNI in segmenting flower pixels. The high-resolution RGB imagery used in this study contained a considerable number of within-canopy shadow pixels. Therefore, normalized yellowness indices such as the NDYI and RBNI perform poorly when used to quantify canola flower pixels in such imagery.
The bi-plot (Figure 6) of the LDA complemented the patterns observed in the boxplot. The HrFI was identified as the best index for distinguishing flower pixels from the other ground cover classes. In contrast, the NDYI and RBNI, whose vectors lie nearly perpendicular to the flower class, could not separate the ground cover classes. The first discriminant function accounted for 68% of the separation and the second for 19%. Based on the boxplot and LDA results, the HrFI was selected to quantify the digitized flower cover area.
After selecting the HrFI out of the four yellowness indices, a confusion matrix was used to select the optimum threshold for flower segmentation (Figure 7). The kappa coefficient of the confusion matrix evaluates the prediction performance of each classification, i.e., each threshold; the higher the kappa value, the better the classifier performance. The results identified the optimum threshold value for all three orthomosaics as 3000. At this threshold, the kappa values for the rasters of 4, 12, and 23 July were 1.00, 0.96, and 0.99, respectively. Therefore, for each raster, an HrFI threshold of 3000 was applied to mask out non-flower pixels.
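Applied to the HrFI raster from the earlier sketch, the final mask is a one-liner with terra (again assuming the index is built from 8-bit digital numbers, for which HrFI values of several thousand are plausible).

```r
# Sketch: keep flower pixels (HrFI > 3000), drop everything else.
library(terra)

flower_mask <- ifel(hrfi > 3000, 1, NA)
```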

3.2. Yield Modelling

The four canola seed yield prediction models developed in this study used the digitized flower area as a predictor to estimate seed yield. Models 1, 2, and 3 used the flower area from individual image dates, while Model 4 used the cumulative flower area. The flower area had a non-linear asymptotic relationship with seed yield during early (Figure 8a, Table 3) and peak flowering (Figure 9a, Table 4). However, during the late flowering stage, the relationship between flower area and yield was linear (Figure 10a, Table 5). Similarly, the cumulative flower area over the reproductive stage had a linear relationship with canola seed yield (Figure 11a, Table 6).
The model developed for the peak flowering date outperformed the other three models, with an R2 of 0.82 and an RMSE of 310 kg/ha (Table 7). It was followed by Model 4 and Model 1, with R2 values of 0.75 and 0.72 and RMSE values of 440 and 438 kg/ha, respectively. Model 3, developed using the late flowering date, had a poor R2 and RMSE for model testing (0.01 and 859 kg/ha).

4. Discussion

This study aimed to develop a canola seed yield prediction model using high-resolution RGB imagery. The use of vegetation indices to study phenology changes is mathematically less complex, computationally efficient, and cost-effective. The biggest issue with high-resolution images is within-canopy pure shadow pixels, which often blend with flower pixels. Canola flowering canopies have different spectral reflectance characteristics than the vegetative canopy: flowers reflect more radiation between 500 and 700 nm and absorb slightly more between 400 and 500 nm [29]. Therefore, the potential of using the floral canopy signature for predicting canola seed yield from UAV-based high-resolution RGB imagery was evaluated with a robust dataset from the canola field trial [13].
In this study, brightness values (DN) were used for processing the imagery. The experimental trial was conducted on fairly uniform land with vegetation ground cover as the material of interest. Furthermore, the three images were acquired under similar environmental conditions at the same time of day. Hence, atmospheric and illumination effects on this uniform land would be smaller than on undulating land. However, when mapping a larger landscape with significant changes in topography, the use of surface reflectance would be advised, especially when multi-temporal data are used. Property-based methods, in which shadows are identified using their brightness, are commonly used in the literature [30]. Vegetation indices are affected by shadow pixels, and studies have shown that statistical differences exist between shadowed and sunlit pixels in soil and vegetation indices [31]. Pixels completely blocked from direct illumination have low reflectance in the red, green, and blue bands. Therefore, due to the inherent nature of their reflectance, when normalized vegetation indices are computed for such pixels, they tend to have higher values (NDYI, RBNI). These pixels can even have greater values than flower pixels (Figure 5), which significantly reduces the accuracy of the extracted features. We also found that cast shadow pixels partially occluded from direct light have lower index values than flower pixels and can be automatically removed with a simple background mask. Additionally, a simple multiplication takes advantage of the low reflectance of dark shadow pixels to mask them out. Therefore, the HrFI and MYI introduced in this study have low values for within-canopy shadow pixels. In most remote sensing studies, shadow pixel restoration is essential [32]. However, in canola, as the flower canopy is the topmost layer, restored shadow pixels may not contribute significantly to the inferences. The within-canopy shadow pixels most likely correspond to the middle or lower layers of the canola canopy. Hence, the removal of shadow pixels should not have a significant effect on the analysis of the study.
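To make this concrete, the toy digital numbers below (hypothetical values, not measurements from this study) show how a nearly black within-canopy shadow pixel can score higher than a petal under a normalized ratio such as the NDYI, while the multiplicative HrFI keeps it near zero.

```r
# Hypothetical 8-bit digital numbers, for illustration only.
shadow <- c(red = 15, green = 12, blue = 2)     # very dark pixel, blue near zero
flower <- c(red = 220, green = 210, blue = 60)  # bright yellow petal

ndyi <- function(p) (p["green"] - p["blue"]) / (p["green"] + p["blue"])
hrfi <- function(p) (p["red"] - p["blue"]) * (p["green"] - p["blue"])

ndyi(shadow); ndyi(flower)   # ~0.71 vs ~0.56: the shadow outranks the flower
hrfi(shadow); hrfi(flower)   # 130 vs 24,000: the flower is clearly separated
```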
In this study, the within-canopy shadow pixels were treated as image noise. The proposed indices (HrFI and MYI) aim to enhance the contrast between the area of interest (flower pixels) and within-canopy shadow (noise) (Figure 5). Spectral ratio transformations enhance noise, especially in pixels with low reflectance (here, within-canopy shadow pixels). Since the NDYI and RBNI are band ratios, the noise in the images is enhanced [33]. Therefore, if these ratio indices are to be applied to high-resolution RGB imagery with within-canopy shadows, a bias correction must be undertaken first. The HrFI and MYI overcome this generic enhancement of noise pixels, providing a straightforward method for identifying canola flower pixels.
Plots with high-density planting and narrower row spacing had a substantial amount of pure within-canopy shadow (pure shadow pixels) compared with plots with wider row spacing. Conversely, wider-row plots had a higher number of mixed shadow pixels that were only partially blocked from direct sunlight and hence had a higher reflectance than pure shadow pixels. Therefore, removing the effect of within-canopy shadows in variable planting trials is essential. Our analysis of the vegetation indices demonstrated that the HrFI is efficient in capturing the flower canopy reflectance and could distinguish flowers from soil, shadow, and leaf pixels (Figure 4, Figure 5 and Figure 6). In contrast, the other VIs confounded the spectral information of floral features, especially when within-canopy shadow pixels were dominant in the imagery. Moreover, we noted that the performance of the MYI closely follows that of the HrFI in thresholding flower pixels in high-resolution RGB imagery. This suggests that vegetation indices that enhance the contrast between the reduced blue band reflectance and the elevated red and green band reflectance can efficiently capture the flowering canopy signals of canola.
Remote sensing techniques are increasingly used to investigate the relationship between yield and plant growth characteristics. This study found that 82% of the yield variability can be explained by the digitized flower area from a single image date acquired during the peak flowering period. Peak flowering indicates the maximum reproductive potential of the plant and is well suited for estimating seed yield. Although canola produces more floral primordia than its photosynthetic capacity can support, peak flowering has been found to have a strong relationship with yield [12]. The early flowering stage can also be used to predict canola seed yield, as the flower area at this stage was strongly correlated with the final seed yield (0.85). The authors of [5] found that 75% of the pods retained until maturity develop from flowers that open within 11 days of the start of flowering. This could explain the high predictive power of the early flowering model.
As a predictor, the cumulative flower area explained 75% of the yield variability. In contrast, the late flowering model had poor performance, which can be attributed to the late flowering of treatments with low planting density. Plots with high planting density flowered earlier than plots with low seeding density [34]. This effect can be seen in Figure 10a, where the training data points trend in two opposite directions. The negative correlation between yield and flower area can be explained by the majority of data points belonging to the high-density plots, in which, by that time point (23 July), most flowers had become pods. The minority of data points showing a positive trend belong to plots that flowered later due to low planting density.
Using a single image date for yield prediction can be less expensive and more straightforward than using the cumulative flower area. However, the applicability of single-date data could be sensitive to environmental conditions such as drought and heat stress. Furthermore, it is challenging to identify the exact peak flowering time. High temperatures during the reproductive stage negatively affect seed yield [35], and when canola is exposed to high heat stress during the late reproductive stage, recovery is low [36]. Therefore, flowering information extracted during the peak flowering period, before exposure to high environmental stress, would not accurately represent the yield potential of the plant. Alternatively, using the time-series change of flower area can capture these environmental effects and could better represent the final seed yield. The authors of [25] suggested using integrated flowering intensity as a stronger indicator of yield potential than a single-date regression approach for medium-resolution UAV multispectral data. The differences between our findings and those of Zhang et al. (2021) [25] could be attributed to the higher genetic diversity among plots and the lower image resolution in the latter study. The trade-off between the accuracy of the yield prediction and the time expense of using single-date vs. time-series data is to be decided by the user.
This study only used three time points during the reproductive stage. A higher temporal resolution might provide more detailed phenology changes to identify the ideal time point for developing an accurate yield model. A major drawback of processing ultra-high-resolution imagery is that it requires high computational power. In addition, high-resolution imagery requires a lower flying altitude, necessitating a longer flight time, and RGB sensors with ultra-high spatial resolution can be expensive. Despite the high spatial detail, the spectral information of RGB imagery is limited, making it inadequate for certain studies.
The results of this study support its hypothesis. The HrFI developed in this study could successfully segment flower pixels from high-resolution RGB imagery to quantify the flower area. The digitized flower area proved to be a strong predictor of seed yield, both when a single image date and when cumulative images over the reproductive period were used. Furthermore, the results of this study accentuate the importance of using high-resolution images in yield prediction.

5. Conclusions

This study introduced a new vegetation index, the HrFI, for quantifying canola flower pixels in high-resolution RGB UAV imagery. The index accurately identified canola flower pixels even when the orthomosaics had a significant amount of within-canopy shadow pixels. The digitized flower area was used as an indicator of reproductive potential to develop a within-season seed yield estimation model. The findings indicate that the flower area at peak flowering has strong yield predictive power (R2 = 0.82). The digitized flower area calculated at early flowering could also be used as an indicator of yield potential. Alternatively, the cumulative flower area, which reflects the total yield potential, can be used as a predictor of canola seed yield. The HrFI should be further evaluated with lower-resolution imagery to quantify flowering intensity as opposed to individual flower pixels.

Author Contributions

Conceptualization, H.F., T.H. and S.J.S.; methodology, H.F., T.H., A.A., D.B. and K.A.N.; software, H.F.; formal analysis, H.F.; data curation, H.F. and O.K.-O.; writing—original draft preparation, H.F.; writing—review, and editing, H.F., T.H., A.A., D.B. and S.J.S.; supervision, S.J.S.; project administration, S.J.S.; funding acquisition, S.J.S. All authors have read and agreed to the published version of the manuscript.

Funding

The authors would like to acknowledge the P2IRC funding 422082, the Canada First Research Excellence Fund (CFREF), the Canola Council of Canada, and the Natural Sciences and Engineering Research Council (NSERC).

Data Availability Statement

Data are available for research purposes upon request to the authors’ institutions.

Acknowledgments

The authors would like to acknowledge the contributions from Eric Johnson, Seungbum Ryu, and staff at Kernen Research Farm, University of Saskatchewan.

Conflicts of Interest

The authors declare that there is no conflict of interest regarding the publication of this article.

References

  1. Canola Council of Canada. About Canola. 2017. Available online: https://www.canolacouncil.org/about-canola/ (accessed on 8 March 2022).
  2. Statistics Canada. Table 32-10-0359-01; Estimated Areas, Production, Average Farm Price and Total Farm Value of Principal Field Crops, in Metric and Imperial Units. Statistics Canada: Ottawa, ON, Canada, 2020.
  3. Khaki, S.; Wang, L. Crop Yield Prediction Using Deep Neural Networks. Front. Plant Sci. 2019, 10, 621. [Google Scholar] [CrossRef] [PubMed]
  4. Basso, B.; Cammarano, D.; Carfagna, E. Review of crop yield forecasting methods and early warning systems. In Report Presented to the First Meeting of the Scientific Advisory Committee of the Global Strategy to Improve Agricultural and Rural Statistics; FAO Headquarters: Rome, Italy, 2013. [Google Scholar]
  5. Edwards, J.; Jensen, B. Canola Growth and Development; Department of Primary Industries: Orange, NSW, Australia, 2011.
  6. Dreccer, M.F.; Molero, G.; Rivera-Amado, C.; John-Bejai, C.; Wilson, Z. Yielding to the image: How phenotyping reproductive growth can assist crop improvement and production. Plant Sci. 2019, 282, 73–82. [Google Scholar] [CrossRef]
  7. Gong, Y.; Duan, B.; Fang, S.; Zhu, R.; Wu, X.; Ma, Y.; Peng, Y. Remote estimation of rapeseed yield with unmanned aerial vehicle (UAV) imaging and spectral mixture analysis. Plant Methods 2018, 14, 70. [Google Scholar] [CrossRef] [PubMed]
  8. Weiss, M.; Jacob, F.; Duveiller, G. Remote sensing for agricultural applications: A meta-review. Remote Sens. Environ. 2020, 236, 111402. [Google Scholar] [CrossRef]
  9. Sulik, J.J.; Long, D.S. Spectral indices for yellow canola flowers. Int. J. Remote Sens. 2015, 36, 2751–2765. [Google Scholar] [CrossRef]
  10. Chen, B.; Jin, Y.; Brown, P. An enhanced bloom index for quantifying floral phenology using multi-scale remote sensing observations. ISPRS J. Photogramm. Remote Sens. 2019, 156, 108–120. [Google Scholar] [CrossRef]
  11. Chen, J.; Shen, M.; Zhu, X.; Tang, Y. Indicator of flower status derived from in situ hyperspectral measurement in an alpine meadow on the Tibetan Plateau. Ecol. Indic. 2009, 9, 818–823. [Google Scholar] [CrossRef]
  12. Sulik, J.J.; Long, D.S. Spectral considerations for modeling yield of canola. Remote Sens. Environ. 2016, 184, 161–174. [Google Scholar] [CrossRef]
  13. Ashourloo, D.; Shahrabi, H.S.; Azadbakht, M.; Aghighi, H.; Nematollahi, H.; Alimohammadi, A.; Matkan, A.A. Automatic canola mapping using time series of sentinel 2 images. ISPRS J. Photogramm. Remote Sens. 2019, 156, 63–76. [Google Scholar] [CrossRef]
  14. Tian, H.; Chen, T.; Li, Q.; Mei, Q.; Wang, S.; Yang, M.; Wang, Y.; Qin, Y. A Novel Spectral Index for Automatic Canola Mapping by Using Sentinel-2 Imagery. Remote Sens. 2022, 14, 1113. [Google Scholar] [CrossRef]
  15. Pons, X.; Padró, J.-C. An Empirical Approach on Shadow Reduction of UAV Imagery in Forests. In Proceedings of the IEEE International Geoscience and Remote Sensing Symposium (IGARSS), Yokohama, Japan, 28 July–2 August 2019; pp. 2463–2466. [Google Scholar]
  16. Suh, H.K.; Hofstee, J.W.; van Henten, E.J. Improved vegetation segmentation with ground shadow removal using an HDR camera. Precis. Agric. 2018, 19, 218–237. [Google Scholar] [CrossRef]
  17. Dare, P. Shadow Analysis in High-Resolution Satellite Imagery of Urban Areas. Photogramm. Eng. Remote Sens. 2005, 71, 169–177. [Google Scholar] [CrossRef]
  18. Foody, G.M. Status of land cover classification accuracy assessment. Remote Sens. Environ. 2002, 80, 185–201. [Google Scholar] [CrossRef]
  19. Meier, U. Growth Stages of Mono-and Dicotyledonous Plants; Federal Biological Research Centre for Agriculture and Forestry: Bonn, Germany, 2001. [Google Scholar]
  20. ESRI. ArcGIS Desktop; ESRI: Redlands, CA, USA, 2011. [Google Scholar]
  21. Xue, J.; Su, B. Significant Remote Sensing Vegetation Indices: A Review of Developments and Applications. J. Sens. 2017, 2017, 1353691. [Google Scholar] [CrossRef]
  22. Bhargavi, K.; Jyothi, S. A survey on threshold based segmentation technique in image processing. Int. J. Innov. Res. Dev. 2014, 3, 234–239. [Google Scholar]
  23. Story, M.; Congalton, R.G. Accuracy assessment: A user’s perspective. Photogramm. Eng. Remote Sens. 1986, 52, 397–399. [Google Scholar]
  24. McHugh, M.L. Interrater reliability: The kappa statistic. Biochem. Med. 2012, 22, 276–282. [Google Scholar] [CrossRef]
  25. Zhang, T.; Vail, S.; Duddu, H.S.; Parkin, I.A.; Guo, X.; Johnson, E.N.; Shirtliffe, S.J. Phenotyping Flowering in Canola (Brassica napus L.) and Estimating Seed Yield Using an Unmanned Aerial Vehicle-Based Imagery. Front. Plant Sci. 2021, 12, 686332. [Google Scholar] [CrossRef] [PubMed]
  26. Makowski, D.; Ben-Shachar, M.; Lüdecke, D. bayestestR: Describing Effects and their Uncertainty, Existence and Significance within the Bayesian Framework. J. Open Source Softw. 2019, 4, 1541. [Google Scholar] [CrossRef]
  27. Ritz, C.; Baty, F.; Streibig, J.C.; Gerhard, D. Dose-Response Analysis Using R. PLoS ONE 2015, 10, e0146021. [Google Scholar] [CrossRef] [PubMed]
  28. Refaeilzadeh, P.; Tang, L.; Liu, H. Cross-Validation. In Encyclopedia of Database Systems; Liu, L., Özsu, M.T., Eds.; Springer: Boston, MA, USA, 2009; pp. 532–538. [Google Scholar]
  29. Yates, D.J. Reflexion and absorption of solar radiation by flowering canopies of oil-seed rape (Brassica napus L.). J. Agric. Sci. 1987, 109, 495–502. [Google Scholar] [CrossRef]
  30. Ma, H.; Qin, Q.; Shen, X. Shadow Segmentation and Compensation in High Resolution Satellite Images. In Proceedings of the IGARSS 2008—2008 IEEE International Geoscience and Remote Sensing Symposium, Boston, MA, USA, 7–11 July 2008. [Google Scholar]
  31. Aboutalebi, M.; Torres-Rua, A.F.; McKee, M.; Kustas, W.; Nieto, H.; Coopmans, C. Behavior of Vegetation/Soil Indices in Shaded and Sunlit Pixels and Evaluation of Different Shadow Compensation Methods Using UAV High-Resolution Imagery over Vineyards; The Society of Photo-Optical Instrumentation Engineers (SPIE) Commercial + Scientific Sensing and Imaging: Bellingham, WA, USA, 2018; Volume 10664, p. 6. [Google Scholar]
  32. Mostafa, Y. A review on various shadow detection and compensation techniques in remote sensing images. Can. J. Remote Sens. 2017, 43, 545–562. [Google Scholar] [CrossRef]
  33. Schowengerdt, R.A. Remote Sensing: Models and Methods for Image Processing; Elsevier: Amsterdam, The Netherlands, 2006. [Google Scholar]
  34. Attanayake, A.; Shirtliffe, S.J. Optimization of Ground Cover Phenology Models to Simulate Canola Yield for Precision Farming. In Soils and Crops; University of Saskatchewan: Saskatoon, SK, Canada, 2022. [Google Scholar]
  35. Pokharel, M.; Chiluwal, A.; Stamm, M.; Min, D.; Rhodes, D.; Jagadish, S.K. High night-time temperature during flowering and pod filling affects flower opening, yield and seed fatty acid composition in canola. J. Agron. Crop Sci. 2020, 206, 579–596. [Google Scholar] [CrossRef]
  36. Gan, Y.; Angadi, S.V.; Cutforth, H.; Potts, D.; McDonald, C.L. Canola and mustard response to short periods of temperature and water stress at different developmental stages. Can. J. Plant Sci. 2004, 84, 697–704. [Google Scholar] [CrossRef]
Figure 1. Experimental layout of the field trial, 2019. Forty-eight treatments were randomly distributed within each block. Yellow polygons denote the four stacked replicates.
Figure 2. Image processing workflow of the study. NDYI—Normalized Difference Yellowness Index, RBNI—Red Blue Normalizing Index, HrFI—High-resolution Flowering Index, MYI—Modified Yellowness Index.
Figure 3. Vector layer used to extract the flower information from the image acquired on 12 July. The red polygon represents the area considered for quantifying canola flower area from the orthomosaics.
Figure 4. Visual comparison of a clip of the orthomosaic acquired on 4 July, and the four corresponding vegetation indices, RBNI, NDYI, MYI, and HrFI. RBNI: Red Blue Normalizing Index, NDYI: Normalized Difference Yellowness Index, MYI: Modified Yellowness Index, HrFI: High-resolution Flowering Index.
Figure 5. Boxplot comparing the distribution of pixel values in shadow, flowers, leaves, and soil classes for RBNI, NDYI, MYI, and HrFI. The four vegetation indices are normalized from 0 to 1 for ease of comparison. HrFI: High-resolution Flowering Index, RBNI: Red Blue Normalizing Index, NDYI: Normalized Difference Yellowness Index, MYI: Modified Yellowness Index.
Figure 6. The bi-plot of the linear discriminant analysis (LDA) illustrating the separation between ground cover classes using four vegetation indices. HrFI: High-resolution Flowering Index, RBNI: Red Blue Normalizing Index, NDYI: Normalized Difference Yellowness Index, MYI: Modified Yellowness Index.
Figure 7. The variation of kappa coefficient across different HrFI thresholds used in developing flower, non-flower classifications for the three image dates.
Figure 8. (a) The trained yield model and (b) the scatterplot of the measured vs. predicted canola seed yield for the testing set developed for the image acquired during early flowering (Model 1). The red line indicates the 1:1 line.
Figure 9. (a) The trained model and (b) the scatterplot of the measured vs. predicted canola seed yield for the testing set developed for the image acquired during peak flowering (Model 2). Red line indicates the 1:1 line.
Figure 10. (a) The trained model and (b) the scatterplot of the measured vs. predicted canola seed yield for the testing set developed for the image acquired during late flowering (Model 3). Red line indicates the 1:1 line.
Figure 11. (a) The trained model and (b) the scatterplot of the measured vs. predicted canola seed yield for the testing set developed for the cumulative flower area (Model 4). Red line indicates the 1:1 line.
Table 1. Equations of the vegetation indices used in the study.
Vegetation Index | Equation | References
Normalized Difference Yellowness Index (NDYI) | (Green − Blue) / (Green + Blue) | [12]
Red Blue Normalizing Index (RBNI) | (Red − Blue) / (Red + Blue) | Proposed in this study
Modified Yellowness Index (MYI) | (Red × Green) / Blue | Proposed in this study
High-resolution Flowering Index (HrFI) | (Red − Blue) × (Green − Blue) | Proposed in this study
Table 2. Four yield models developed in the study and their details.
Model | Predictor | Equation
Model 1 | Flower area at BBCH 63 | Y = c + (d − c)(1 − exp(−x/e))
Model 2 | Flower area at BBCH 65 | Y = c + (d − c)(1 − exp(−x/e))
Model 3 | Flower area at BBCH 67 | Y = mx + c
Model 4 | Cumulative flower area | Y = mx + c
Table 3. Parameter estimates of the non-linear training model for the image acquired during early flowering (4 July).
Parameter | Estimate | Standard Error | t-Value | p-Value
c | 827.78 | 110.58 | 7.4855 | 1.227 × 10^−11 ***
d | 3336.09 | 135.57 | 24.6070 | <2.2 × 10^−16 ***
e | 31,171.57 | 4995.34 | 6.2401 | 6.6 × 10^−9 ***
Pseudo-R2 | 0.100
Significance codes: 0 ‘***’ 0.001. c: lower horizontal asymptote; d: upper horizontal asymptote; e: steepness of the curve.
Table 4. Parameter estimates of trained non-linear three-parameter asymptotic regression model developed for the image acquired during peak flowering (12 July).
Parameter | Estimate | Standard Error | t-Value | p-Value
c | 567.89 | 119.74 | 4.746 | 5.3 × 10^−6 ***
d | 4298.35 | 375.82 | 11.4373 | <2.2 × 10^−16 ***
e | 651,047.21 | 132,302.60 | 4.9209 | 2.485 × 10^−6 ***
Pseudo-R2 | 0.1028071
Significance codes: 0 ‘***’ 0.001. c: lower horizontal asymptote; d: upper horizontal asymptote; e: steepness of the curve.
Table 5. Parameter estimates of the trained simple linear regression model for the image acquired during late flowering (23 July).
Parameter | Estimate | Standard Error | t-Value | p-Value
Intercept | 2593.66 | 1.473 × 10^2 | 16.993 | <2 × 10^−16 ***
Slope | −1.8324 × 10^−3 | 6.872 × 10^−4 | −2.345 | 0.0206 *
Multiple R2 | 0.02904
Significance codes: 0 ‘***’ 0.001, ‘*’ 0.05.
Table 6. Parameter estimates of the trained simple linear regression model for the cumulative flower area.
Parameter | Estimate | Standard Error | t-Value | p-Value
Intercept | 6.865 × 10^2 | 8.190 × 10^1 | 8.382 | 7.08 × 10^−14 ***
Slope | 2.963 × 10^−4 | 1.380 × 10^−5 | 21.468 | <2 × 10^−16 ***
Multiple R2 | 0.7984
Significance codes: 0 ‘***’ 0.001.
Table 7. Coefficient of determination (R2) and root mean squared error (RMSE) of the validation plots from the four canola seed yield models.
Model | Validation R2 | Validation RMSE (kg ha−1)
1: 4 July (Early Flowering) | 0.72 | 438
2: 12 July (Peak Flowering) | 0.82 | 310
3: 23 July (Late Flowering) | 0.01 | 859
4: Cumulative flower area (AUC) | 0.75 | 440
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
