Technical Note

Differentiate Soybean Response to Off-Target Dicamba Damage Based on UAV Imagery and Machine Learning

1 Fisher Delta Research, Extension, and Education Center, Division of Plant Science & Technology, University of Missouri, Portageville, MO 63873, USA
2 Division of Plant Science & Technology, University of Missouri, Columbia, MO 65211, USA
3 Biological Systems Engineering, University of Wisconsin-Madison, Madison, WI 53706, USA
4 Agronomy Department, University of Florida, Gainesville, FL 32611, USA
* Author to whom correspondence should be addressed.
† These authors contributed equally to this work.
Remote Sens. 2022, 14(7), 1618; https://doi.org/10.3390/rs14071618
Submission received: 14 February 2022 / Revised: 20 March 2022 / Accepted: 25 March 2022 / Published: 28 March 2022
(This article belongs to the Special Issue Advances in Remote Sensing for Crop Monitoring and Yield Estimation)

Abstract

The wide adoption of dicamba-tolerant (DT) soybean has led to numerous cases of off-target dicamba damage to non-DT soybean and dicot crops. This study aimed to develop a method to differentiate soybean response to dicamba using unmanned-aerial-vehicle-based imagery and machine learning models. Soybean lines were visually classified into three classes of injury, i.e., tolerant, moderate, and susceptible to off-target dicamba. A quadcopter with a built-in RGB camera was used to collect images of field plots at a height of 20 m above ground level. Seven image features were extracted for each plot, including canopy coverage, contrast, entropy, green leaf index, hue, saturation, and triangular greenness index. Classification models based on artificial neural network (ANN) and random forest (RF) algorithms were developed to differentiate the three classes of response to dicamba. Significant differences for each feature were observed among classes and no significant differences across fields were observed. The ANN and RF models were able to precisely distinguish tolerant and susceptible lines with an overall accuracy of 0.74 and 0.75, respectively. The imagery-based classification model can be implemented in a breeding program to effectively differentiate phenotypic dicamba response and identify soybean lines with tolerance to off-target dicamba damage.

1. Introduction

Soybean (Glycine max (L.) Merr.) represents the largest and most concentrated segment of global agricultural trade [1]. The growing demand for soybean is primarily attributed to its unique seed composition and versatile applications in the food, feed, and biodiesel industries as the crop delivers the highest amount of protein per hectare and accounts for over 60% of total global oilseed production [2,3]. Worldwide, soybean production has reached over 384 million metric tons, of which Brazil (144 million metric tons, 37.5%), the United States (120 million metric tons, 31.3%), and Argentina (49.5 million metric tons, 14.4%) account for roughly 85% of the global production [4].
Proper weed management is essential to sustain soybean production, with yield reductions as high as 53% observed when fields are left untreated [5,6,7]. Integrated weed management systems rely on a combination of mechanical, cultural, chemical, and biological practices to minimize environmental impact and the potential development of herbicide-resistant weed populations [8]. With the development and commercialization of genetically engineered dicamba-tolerant (DT) soybean cultivars resistant to over-the-top applications of dicamba (3,6-dichloro-2-methoxybenzoic acid) [9,10], the technology was quickly adopted on approximately 55% of the soybean acreage in the United States [11]. This rapid and widespread adoption has resulted in numerous reports of off-target dicamba damage to non-DT soybean fields as well as multiple dicot plant species [12,13,14,15,16]. Soybean is naturally susceptible to dicamba, and the consequent symptoms include crinkling and cupping of the immature leaves, epinasty, plant height reduction, chlorosis, death of the apical meristem, malformed pods, and, ultimately, yield reduction [17,18,19]. The severity of the observed symptoms and yield penalty varies with growth stage, dosage, frequency and duration of exposure, and potentially genetic background; soybean is two to six times more susceptible to dicamba when exposed at the early reproductive stage [19,20,21,22,23,24,25].
The assessment of injuries caused by off-target dicamba exposure is generally reported as categorical variables (tolerant, moderate, susceptible) or percentage of injury (0–100%) based on visual observations. Such assessment is time-consuming, labor-intensive, and often subjective to the evaluator, which can result in biased and inconsistent ratings [26,27]. Plant breeders often investigate tens of thousands of breeding lines for specific or multiple phenotypes in a growing season [3]. The development of an accurate and high-throughput platform to characterize breeding materials is therefore highly desirable. Remote sensing technology is a cost-effective approach to identify and quantify changes in plant biophysical and biochemical properties and has been widely applied in agricultural research [28,29]. These biophysical and biochemical changes in plants can be identified from UAV-image-derived features such as vegetation indices (VIs) and plant geometric features [30]. Vegetation indices can be generated from RGB [31], multispectral [29,32], and hyperspectral [33] images to identify plant vigor and vegetation coverage under different stress conditions [34]. In soybean, multiple studies targeting the assessment and quantification of herbicide injuries using image-derived features have been reported, including injuries caused by dicamba [33,35,36,37], 2,4-D (2,4-dichlorophenoxyacetic acid) [36], glyphosate [38,39,40], and metribuzin (4-amino-6-tert-butyl-3-(methylthio)-1,2,4-triazin-5(4H)-one) [41]. However, all previous studies have been based on a limited number of soybean cultivars with narrow genetic diversity.
Therefore, the goals of this research were to (i) develop an RGB image-based classification system using machine learning algorithms and seven image features, including canopy coverage, contrast, entropy, green leaf index (GLI) [42], hue, saturation, and triangular greenness index (TGI) [43], with a total of 230 diverse soybean breeding lines and 10 commercial cultivars (seven DT and three glyphosate-tolerant (GT)) and (ii) assess and compare classification accuracies between artificial neural network (ANN) and random forest (RF) algorithms. The development of a simple and cost-effective RGB-based classification system may allow plant breeders to precisely and rapidly screen and effectively select genotypes tolerant to off-target dicamba damage.

2. Materials and Methods

2.1. Plant Materials and Data Acquisition

A total of 230 diverse soybean breeding lines developed by the University of Missouri Fisher Delta Research, Extension, and Education Center (MU-FDRC) soybean breeding program and 10 commercial cultivars (seven DT and three GT) were used in this study. Breeding lines were derived from 115 unique biparental populations and ranged from relative maturity 4.0 to 5.3. These lines comprised the 2020 advanced yield trials at the MU-FDRC, and a subgroup of lines selected from the 2019 advanced yield trials based on extreme response to off-target dicamba damage (tolerant and susceptible) and yield performance under prolonged off-target dicamba exposure.
Field trials were conducted at the Lee Farm in Portageville, MO (36°23′44.2″N; 89°36′52.3″W) using a randomized complete block design with three replications per environment. The Lee Farm has been exposed to season-long off-target dicamba damage since 2017, where non-DT breeding lines often experience significant yield losses compared to the DT commercial checks [44,45,46]. The 2020 advanced yield trials were grown in three environments (Fld-61, Fld-63, Fld-81) and the subgroup of extreme lines was grown in two locations (Fld-86, Fld-1210) (Table 1). Each plot consisted of four rows 3.66 m long, spaced 0.76 m apart.

2.2. Visual Dicamba Damage Assessment

Field plots were visually assessed for dicamba damage at early reproductive stages between R3 and R5, depending on the line's maturity group (approximately 100 to 130 days after planting, DAP) [47]. Lines were rated as tolerant, moderate, or susceptible based on the severity of dicamba symptoms (Figure 1). The tolerant group showed plant development similar to the DT commercial checks, with no to minimal visual dicamba damage, including the typical crinkling and cupping of the immature leaves, reduced canopy area, and plant stunting. The moderate group showed intermediate dicamba damage symptoms, including mild crinkling and cupping of the immature leaves with minimal reduction in canopy area and plant height. The susceptible group exhibited extreme dicamba damage symptoms, including severe crinkling and cupping of the immature leaves and severe reduction in canopy area and plant height.

2.3. UAV Imagery Data Acquisition

Field plot images were collected using a DJI Phantom 4 Pro (Version 1.0, DJI, Shenzhen, China) quadcopter. The quadcopter has a built-in RGB camera mounted on a gimbal underneath the copter. The camera has an image resolution (number of total pixels) of 5472 × 3648 pixels and was configured to take time-lapse images at 2 frames per second (fps). The embedded global navigation satellite system (GNSS) receiver provides geo-referencing information as part of the metadata of each image frame. The UAV platform was controlled using the flight control mobile app Autopilot (Hangar Technology, Austin, TX, USA) to complete flight missions autonomously by following predefined flight plans. Images were taken at 20 m above ground level (AGL) at a speed of 7 km/h, following a zigzag path to cover each field with a forward overlap of ≥70% and a side overlap of ≥65%. The ground sampling distance (GSD) of the camera in this setting was 5.5 mm/pixel. UAV images were collected at noon in each field under a clear sky.
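The stated 5.5 mm/pixel GSD can be reproduced from the flight altitude and nominal camera geometry. This is a minimal Python sketch; the 13.2 mm sensor width and 8.8 mm focal length are the published 1-inch-sensor specifications of the Phantom 4 Pro, assumed here rather than taken from the text:

```python
def gsd_mm_per_px(sensor_width_mm, focal_length_mm, altitude_m, image_width_px):
    """GSD = (sensor width x flight altitude) / (focal length x image width).

    Assumed nominal Phantom 4 Pro values: 13.2 mm sensor width, 8.8 mm focal
    length; altitude and image width come from the flight setup in the text.
    """
    return (sensor_width_mm * altitude_m * 1000.0) / (focal_length_mm * image_width_px)

# 20 m AGL with a 5472-px-wide image gives roughly the 5.5 mm/pixel in the text.
gsd = gsd_mm_per_px(13.2, 8.8, 20.0, 5472)
```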

2.4. Image Processing and Features Extraction

An orthomosaic image for each field was generated using Agisoft MetaShape Pro (Agisoft LLC, St. Petersburg, Russia) following the methodology described by Zhou et al. (2019) [48]. Three parameters were set as “high” with generic and reference preselection for image alignment, “high” for reconstruction parameter, and “moderate” for filtering mode [48]. The orthomosaic for each field was generated and exported as .tif images and processed using the Image Processing Toolbox and Computer Vision System Toolbox of MATLAB (ver. 2020a, The MathWorks, Natick, MA, USA).
Individual field plots were separated from the orthomosaic image by manually cropping a rectangular region of interest (ROI) around each plot. The ROI size varied to cover each soybean plot according to its width and length. Overlap between adjacent plots was avoided based on visual quality control. Background (soil, shadow, and plant residues) was removed from the separated images by detecting projected canopy contours using the “activecontour” function [49] with the “Chan–Vese” method [50]. Pixels within a closed contour were considered foreground (soybean plants), while those outside contours were background (soil and residues). Contours with extremely small regions were identified as noise using the “regionprops” function and removed from the foreground.
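The Chan–Vese segmentation above is implemented in MATLAB. As a rough illustration of the foreground/background split, here is a much simpler excess-green threshold in Python; this is a stand-in for the active-contour step, not the authors' method, and the zero threshold is an assumption:

```python
import numpy as np

def segment_canopy(rgb, thresh=0.0):
    """Crude vegetation mask via the excess-green index, ExG = 2G - R - B.

    A simplified numpy stand-in for the paper's Chan-Vese active-contour
    step: pixels with ExG above the (assumed) threshold count as canopy,
    the rest as soil/residue background.
    """
    r = rgb[..., 0].astype(float)
    g = rgb[..., 1].astype(float)
    b = rgb[..., 2].astype(float)
    return (2.0 * g - r - b) > thresh
```

In practice a small-region cleanup (the MATLAB “regionprops” step) would follow, removing tiny connected components from the mask.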
Seven image features were calculated from the processed RGB images, including canopy coverage, color (hue, saturation (Sa)) in HSV color space, image texture (entropy and contrast), and two vegetation indices including TGI [43] and GLI [42] as defined in Equations (1) and (2).
TGI = [(λRed − λBlue)(ρRed − ρGreen) − (λRed − λGreen)(ρRed − ρBlue)] / 2  (1)
GLI = [(Green − Red) + (Green − Blue)] / [(2 × Green) + Red + Blue]  (2)
where lambda (λ) = center wavelengths for the respective bands including red (670 nm), blue (480 nm), and green (550 nm); rho (ρ) = pixel value for the respective bands including red (670 nm), blue (480 nm), and green (550 nm).
Canopy coverage was defined as the total number of pixels in the green channel of each RGB image. Hue and saturation were calculated from the HSV color space converted from the RGB images using the “rgb2hsv” function in MATLAB. Following the protocol described by Zhou et al. (2020) [32], the image texture features entropy and contrast were calculated using the “graycomatrix” function in MATLAB after converting each RGB image to grayscale with the “rgb2gray” function. Entropy quantifies the level of randomness and complexity of an image and can be used to characterize image texture, where larger entropy indicates higher complexity [51]. Since field plot image collection and visual assessment of off-target dicamba damage were conducted across different fields under variable environmental lighting conditions, UAV-image-derived features were standardized before inclusion in the ANN and RF predictive models. A z-score for each image feature was used to standardize the model’s predictors by dividing the difference between the observed value and the mean by the standard deviation (Equation (3)).
Z-score = (x − μ) / S  (3)
where x = observed value for an image feature in an individual plot; µ = mean of all the plots in an individual field for a specific image feature; S = standard deviation of all the plots in an individual field for a specific image feature.
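Equations (1)–(3) can be sketched in Python as follows. The histogram-based entropy is a simple proxy for the GLCM entropy from MATLAB's “graycomatrix”, not the authors' exact computation:

```python
import numpy as np

# Band center wavelengths (nm) stated in the text: red 670, green 550, blue 480.
L_RED, L_GREEN, L_BLUE = 670.0, 550.0, 480.0

def gli(r, g, b):
    """Green leaf index, Equation (2), from mean channel values of a plot."""
    return ((g - r) + (g - b)) / ((2.0 * g) + r + b)

def tgi(r, g, b):
    """Triangular greenness index, Equation (1); r, g, b play the role of rho."""
    return ((L_RED - L_BLUE) * (r - g) - (L_RED - L_GREEN) * (r - b)) / 2.0

def shannon_entropy(gray, bins=256):
    """Histogram entropy of a grayscale image; a simple stand-in for the
    GLCM-based texture entropy computed via MATLAB's graycomatrix."""
    hist, _ = np.histogram(gray, bins=bins, range=(0, 256))
    p = hist[hist > 0] / hist.sum()
    return float(-(p * np.log2(p)).sum())

def zscore(x):
    """Per-field feature standardization, Equation (3)."""
    x = np.asarray(x, dtype=float)
    return (x - x.mean()) / x.std()
```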

2.5. Feature Significance

A two-way analysis of variance (ANOVA) with Tukey’s honest significant difference (HSD) test was conducted to investigate the significance of differences in the image features and visual dicamba damage assessment among the five fields (Fld-61, Fld-63, Fld-81, Fld-86, Fld-1210). The two-way ANOVA was performed at a 5% significance level using the base R “aov” function [52]. Tukey’s range test was performed using the “TukeyHSD” function from the agricolae package [53].
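To illustrate the variance partitioning behind the ANOVA, here is a minimal one-way F statistic in Python; the study fit a two-way model in R, so the second factor and interaction terms are omitted in this sketch:

```python
import numpy as np

def anova_f(groups):
    """One-way ANOVA F statistic: between-group mean square over
    within-group mean square.

    A minimal illustration of the variance partitioning behind the paper's
    two-way R `aov` analysis (field and class factors not modeled here).
    """
    groups = [np.asarray(g, dtype=float) for g in groups]
    n = sum(len(g) for g in groups)      # total observations
    k = len(groups)                      # number of groups
    grand_mean = np.concatenate(groups).mean()
    ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))
```

A large F (relative to the F distribution with k−1 and n−k degrees of freedom) indicates that group means differ more than within-group noise would explain, which is what the 5% significance test checks.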

2.6. Classification Algorithms and Accuracy

In this study, an ANN model was used to classify soybean responses to off-target dicamba based on image features and ground visual damage assessment. The model was built using the “neuralnet” function of the R package neuralnet [54] with five hidden layers, 1,000,000 iterations, an error tolerance of 0.02, and all other settings at default. Additionally, an RF model was used to classify soybean response to off-target dicamba based on image features. The model was built using the “randomForest” function of the R package randomForest [55] with the parameters “ntree” = 400 and “mtry” = 2, and all other settings at default. The RF model was configured to output variable importance during the training process.
Performance of the ANN and RF models was assessed using five-fold cross-validation, a conventional model evaluation approach commonly used in cases with a limited number of observations [56]. The three classes of visual dicamba damage scores (tolerant, moderate, and susceptible) were evaluated based on the number of samples correctly classified as true positive (TP), falsely classified as false positive (FP), correctly not classified as true negative (TN), and falsely not classified as false negative (FN) for both the ANN and RF models. The overall accuracy was calculated using Equation (4). Class accuracy, the ratio of correctly predicted instances to all instances, was calculated using Equation (5). Precision, the proportion of predicted positives that are correct, was calculated using Equation (6), and sensitivity and specificity, the proportions of positive and negative instances correctly predicted, respectively, were calculated using Equations (7) and (8).
Overall Accuracy = (No. of samples classified correctly in a test set / Total no. of samples in a test set) × 100%  (4)
Class Accuracy = (TP + TN) / (TP + TN + FP + FN)  (5)
Precision = TP / (TP + FP)  (6)
Sensitivity = TP / (TP + FN)  (7)
Specificity = TN / (TN + FP)  (8)
where TP = true positive; TN = true negative; FP = false positive; FN = false negative.
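The five-fold split and Equations (5)–(8) can be sketched in Python as follows; the fold-assignment scheme and seed are illustrative assumptions, not details from the text:

```python
import random

def kfold_indices(n_samples, k=5, seed=42):
    """Shuffle sample indices and deal them into k folds, as in the
    five-fold cross-validation used to evaluate both models.
    The shuffling scheme and seed are arbitrary choices for illustration."""
    idx = list(range(n_samples))
    random.Random(seed).shuffle(idx)
    return [idx[i::k] for i in range(k)]

def class_metrics(tp, tn, fp, fn):
    """Per-class metrics from confusion counts, Equations (5)-(8)."""
    return {
        "class_accuracy": (tp + tn) / (tp + tn + fp + fn),
        "precision": tp / (tp + fp),
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
    }
```

Each fold serves once as the test set while the remaining four train the model; the reported overall accuracy is then averaged across the five test folds.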

3. Results

3.1. Distribution of Visual Dicamba Damage Scores

Across 2300 field plots, approximately 26.7% (614) were visually classified as tolerant, 36.4% (837) as moderate, and 36.9% (849) as susceptible (Figure 2). The overall distribution of scores was balanced. Unbalanced distributions were observed in locations Fld-86 and Fld-1210, where tolerant was the most frequently assigned class. This could be attributed to reduced off-target dicamba exposure, late planting dates resulting in favorable environmental conditions that supported plant recovery, and/or experimental error associated with the subjective visual assessment of the damage.

3.2. Image Features across Classes of Visual Dicamba Damage Scores

A significant distinction among classes of dicamba response was observed across the seven image features (Table 2). Canopy coverage, entropy, GLI, Sa, and TGI significantly differentiated all three classes, whereas contrast and hue significantly differentiated the tolerant and moderate classes from the susceptible class but did not differ significantly between the tolerant and moderate classes. Although the tested fields represented diverse and unique environments (soil types and physical locations), no significant differences among fields were detected for any image feature, which reinforces the consistency and importance of these features in differentiating tolerant, moderate, and susceptible soybean lines.
Overall, the observed values of each image feature across dicamba response classes aligned well with the field observations and expected distributions (Figure 3). Higher values of canopy coverage, entropy, GLI, hue, Sa, and TGI indicate tolerance to dicamba, whereas higher values of contrast indicate susceptibility (Figure 3). Higher values of canopy coverage are expected for the tolerant class, primarily due to healthy vegetative growth and minimal to no cupping of the immature leaves. Entropy, as a measurement of image complexity, represents the texture of each plot. Tolerant soybean plants, primarily due to the lack of cupping of the immature leaves, appear homogeneous and smoother and therefore have higher entropy values. In contrast, susceptible soybean, as a consequence of intense cupping of the immature leaves due to dicamba damage, has an uneven and rough appearance and therefore shows lower entropy values. The GLI and TGI are vegetation indices that represent overall plant health and are therefore expected to be higher in the tolerant group and lower in the susceptible group. Saturation and hue represent the color structure of the image and indicate the intensity of the observed color. The cupping of the leaves affects the overall color reflection of the plant under sunlight, so that lighter tones of low-intensity green become predominant. Therefore, lower values of hue and Sa are observed in susceptible soybean compared to the tolerant and moderate classes.

3.3. Model Performance and Overall Classification Accuracy

3.3.1. Artificial Neural Network Model Classification

An artificial neural network model was used to classify the different classes of dicamba scores. Using all seven image features as predictors, the model showed an overall classification accuracy of 0.74 under five-fold cross-validation, with the highest accuracy observed in Fld-63 (0.78) and the lowest in Fld-61 (0.69) (Table 3). The locations Fld-86 and Fld-1210 were not included in the individual analysis due to the reduced sample size but were included in the overall analysis. The purpose of the classification in each field was to investigate whether a limited number of samples in an individual field can precisely classify different responses to dicamba. The model showed high accuracy for the tolerant (0.89) and susceptible (0.84) classes and slightly lower accuracy for the moderate class (0.75). Similarly, high specificity was observed for both the tolerant and susceptible classes (0.97 and 0.91, respectively) and low specificity for the moderate class (0.54). Although these metrics showed promising performance of the ANN model, the precision for the tolerant class (0.41) was substantially lower than for both the moderate (0.77) and susceptible (0.71) classes. Interestingly, the ANN model often misclassified true-moderate soybean as tolerant but rarely misclassified true-tolerant as susceptible (and vice versa). This indicates that, although the ANN model struggles to differentiate between moderate and tolerant, hence the low precision values, it can clearly distinguish the two extreme classes (tolerant and susceptible). Practically, this distinction is the most important for soybean breeding and is adequate to help breeders make selections.

3.3.2. Random Forest Model Classification

The overall classification accuracy using the RF model and five-fold cross-validation was 0.75, with the highest accuracy observed in Fld-63 (0.80) and the lowest in Fld-61 (0.71) (Table 4). As with the ANN model, locations Fld-86 and Fld-1210 were not included in the individual analysis due to reduced sample size but were included in the overall analysis. The tolerant and susceptible classes showed high accuracy (0.89 and 0.84, respectively), and, in contrast to the ANN model, the moderate class had slightly higher accuracy (0.77 vs. 0.75). Interestingly, the RF model was better able to distinguish the response classes, particularly tolerant and susceptible soybean. Compared to the ANN model, precision for the tolerant class increased by 51% (0.62 vs. 0.41) and by 5% for the susceptible class (0.74 vs. 0.71). Overall, the model classified only three true-susceptible lines as tolerant (1.8%), and none of the true-tolerant lines were classified as susceptible. Like the ANN model, the RF model can precisely distinguish between tolerant and susceptible soybean, but with a slight advantage in precisely classifying the tolerant class.

3.3.3. Random Forest Feature Importance

The mean decrease in Gini coefficient was calculated for the seven image features included in the RF model (Figure 4). The coefficient indicates how much accuracy the model loses by excluding each variable; higher values of mean decrease in Gini indicate higher importance of the variable in the model. The results show that hue, entropy, GLI, and TGI were the most important features in the classification model. Considering that the visual assessment of dicamba damage was primarily based on the overall appearance of field plots, including cupping of the immature leaves (texture, represented as entropy), color intensity (represented as hue), and overall crop development and health (represented as GLI and TGI), the results obtained from the mean decrease in Gini coefficient align with field observations and expected feature importance.

4. Discussion and Conclusions

Unmanned aerial vehicles (UAVs) have been used in agricultural research to collect high-resolution spectral images of field plots, from which multiple image features and vegetation indices can be extracted and later used to explain observed phenotypic variation. Using a handheld hyperspectral plant-sensing device, Huang et al. (2016) reported differentiation between treated (dicamba application) and non-treated soybean with accuracies ranging from 76 to 86%, demonstrating the potential of remote sensing for detecting early dicamba injury in soybean [33]. Of the seven extracted vegetation indices, the anthocyanin reflectance index (ARI) [57] and photochemical reflectance index (PRI) [58] were the most significant in differentiating between treated and non-treated soybean [33]. Zhang et al. (2019) combined 21 sensitive spectral features with three machine learning algorithms (naive Bayes (NB), RF, and support vector machine (SVM)) to assess soybean damage from dicamba. Reported overall accuracies ranged from 0.69 to 0.75, with the RF algorithm having the highest overall accuracy (0.75) [35]. These reports are consistent with our findings, particularly the highest accuracy being obtained with the RF algorithm. The superiority of RF may be associated with its ability to handle high data dimensionality and multicollinearity [35,59]. In addition, RF often performs better than ANN with limited training sets [60]. Abrantes et al. (2021) explored six RGB-derived image features to correlate with visual injury ratings and found the modified green–red vegetation index (MGRVI, R2 = 0.93) [61], modified photochemical reflectance index (MPRI, R2 = 0.93) [58], and excess green (ExG, R2 = 0.89) [62] to be the indices most highly correlated with dicamba damage [36]. Marques et al. (2021) found the triangular greenness index to be highly correlated with dicamba dosage, with R2 ranging from 0.71 to 0.94 depending on days after application [37].
In our study, seven image features were used to classify the visual assessment of dicamba injuries reported as categorical variables (tolerant, moderate, and susceptible). ANOVA results showed significant differences among dicamba classes, while no significant effect was found across locations, indicating a rather homogeneous and uniform off-target dicamba distribution in the testing area. Considering that each field represented a unique and diverse environment with variable soil types, planting dates, and soybean genotypes, these results support both the relevance of these image features for representing injuries caused by off-target dicamba and the prospect of applying the developed classification approach to non-tested soybean genotypes and environments. Interestingly, the observed variation for each image feature was consistent with the physiological consequences of dicamba damage, as well as with expected variation based on overall injury symptoms. For instance, based on our field observations, severe cupping of the immature leaves drastically reduced vegetative growth and canopy coverage. The image feature representing standardized canopy coverage clearly distinguished the three dicamba response classes, with the tolerant group having the highest mean value (0.616, a), followed by the moderate (0.191, b) and susceptible (−0.709, c) classes. The vegetation index GLI, which represents the photosynthetically functional component of the leaf area index [63], had the highest mean value for the tolerant class (0.455, a) and the lowest for the susceptible class (−0.798, c). In addition to canopy coverage and overall vegetation development, color- and texture-based image features also significantly differentiated the dicamba response classes. The cupping of the immature leaves produced a low-intensity, lighter green color as well as a rough texture on the plants.
Entropy, a feature used in this study to represent leaf texture, describes how much information is provided by an image [64,65,66]. From a physical measurement point of view, higher entropy indicates more information being transmitted and therefore a smoother, higher-quality image [67]. The tolerant class showed the highest mean value for entropy (0.512, a), followed by the moderate (0.267, b) and susceptible (−0.856, c) classes. Hue and saturation, which indicate the color and intensity of an image [68], significantly distinguished between the tolerant and susceptible classes, with the tolerant class having higher mean values than the susceptible class (0.211 and 0.295 vs. −0.654 and −0.222, respectively). This means that, as expected, tolerant lines presented a more intense, darker green color as opposed to the low-intensity, lighter green color observed in susceptible plants. It is also important to point out that our studies were conducted in five 6-acre fields on a 700+ acre farm with prolonged off-target dicamba exposure, in contrast to previous studies where direct spray treatments were compared with non-treated controls. In addition, we used 230 diverse soybean lines with different genetic backgrounds in a naturally volatile environment; the phenotypic differences observed should reflect underlying genetic differences among the testing lines in response to dicamba, whereas previous studies focused on phenotypic differences caused by herbicide application treatments rather than genotypes. Moreover, multiple image features were used in our study and showed consistent results in differentiating the phenotypic responses and classifying lines into distinct classes. Overall, our results show that the technology offers plant breeders a rapid, simple, precise, and efficient tool for breeding and selection for natural tolerance to dicamba.
The development and commercialization of genetically modified DT soybean were followed by widespread reports of off-target damage to non-DT soybean and multiple dicot plant species [12,13,14,15,16]. As a growth-regulator herbicide, dicamba is a synthetic auxin that triggers abnormal growth and/or plant death in sensitive dicots [69]. Besides growth stage, dosage, frequency, and duration of exposure, the genetic background of soybean genotypes may also affect the observed symptoms. This research represents the first large-scale screening of genetically diverse soybean for response to off-target dicamba damage, including 230 elite breeding lines derived from 115 unique biparental populations. The developed classification approach precisely distinguished tolerant and susceptible soybean genotypes with diverse genetic backgrounds. In addition, image features and overall classification accuracies were consistent across several unique environments, which reinforces the ability of this classification approach to accurately classify non-tested genotypes in non-tested environments. Further studies using multispectral and hyperspectral images, as well as controlled dicamba dosages and exposure durations, are encouraged to investigate potential improvements in classification accuracy. As a cost-effective option, this UAV RGB-based imagery system can be implemented in plant breeding programs targeting the identification and selection of genotypes with enhanced tolerance to off-target dicamba damage.

Author Contributions

Conceptualization, C.C.V., S.S., F.T., J.Z. (Jing Zhou), J.Z. (Jianfeng Zhou) and P.C.; Data curation, C.C.V., S.S. and F.T.; Formal analysis, C.C.V., S.S., F.T. and J.Z. (Jing Zhou); Funding acquisition, P.C.; Investigation, C.C.V., S.S., F.T., J.Z. (Jing Zhou), D.J., H.T.N., J.Z. (Jianfeng Zhou) and P.C.; Methodology, C.C.V., S.S., F.T., J.Z. (Jing Zhou), J.Z. (Jianfeng Zhou) and P.C.; Project administration, P.C.; Resources, J.Z. (Jianfeng Zhou) and P.C.; Software, S.S. and F.T.; Supervision, J.Z. (Jianfeng Zhou) and P.C.; Validation, D.J., H.T.N., J.Z. (Jianfeng Zhou) and P.C.; Visualization, C.C.V., S.S., F.T., J.Z. (Jing Zhou), J.Z. (Jianfeng Zhou) and P.C.; Writing—original draft, C.C.V., S.S., F.T. and J.Z. (Jing Zhou); Writing—review & editing, C.C.V., S.S., F.T., J.Z. (Jing Zhou), D.J., H.T.N., J.Z. (Jianfeng Zhou) and P.C. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by Mid-south Soybean Board (20-455-21), United Soybean Board (2120-172-0147), and Missouri Soybean Merchandising Council (20-455-21).

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Acknowledgments

The authors would like to thank the University of Missouri—Fisher Delta Research, Extension, and Education Center soybean breeding team for their technical support in preparing and conducting the field trials. The authors would also like to extend their gratitude to the funding agencies that supported this work including the Mid-south Soybean Board, United Soybean Board, and Missouri Soybean Merchandising Council.

Conflicts of Interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

References

  1. Gale, F.; Valdes, C.; Ash, M. Interdependence of China, United States, and Brazil in Soybean Trade; U.S. Department of Agriculture, Economic Research Service (USDA): New York, NY, USA, 2019; pp. 1–48.
  2. United States Department of Agriculture. Oilseeds: World Markets and Trade. 2021. Available online: https://downloads.usda.library.cornell.edu/usda-esmis/files/tx31qh68h/gh93j0912/9s162713p/oilseeds.pdf (accessed on 15 December 2021).
  3. Vieira, C.C.; Chen, P. The numbers game of soybean breeding in the United States. Crop Breed. Appl. Biotechnol. 2021, 21, 387521–387531.
  4. United States Department of Agriculture. World Agricultural Production. Circular Series. 2021; pp. 1–41. Available online: https://apps.fas.usda.gov/psdonline/circulars/production.pdf (accessed on 10 December 2021).
  5. Oerke, E.-C. Crop losses to pests. J. Agric. Sci. 2005, 144, 31–43.
  6. Fickett, N.D.; Boerboom, C.M.; Stoltenberg, D.E. Soybean Yield Loss Potential Associated with Early-Season Weed Competition across 64 Site-Years. Weed Sci. 2013, 61, 500–507.
  7. Soltani, N.; Dille, J.A.; Burke, I.C.; Everman, W.J.; VanGessel, M.J.; Davis, V.M.; Sikkema, P.H. Perspectives on Potential Soybean Yield Losses from Weeds in North America. Weed Technol. 2017, 31, 148–154.
  8. Buhler, D.D.; Gunsolus, J.L.; Ralston, D.F. Integrated Weed Management Techniques to Reduce Herbicide Inputs in Soybean. Agron. J. 1992, 84, 973–978.
  9. Herman, P.L.; Behrens, M.; Chakraborty, S.; Chrastil, B.M.; Barycki, J.; Weeks, D.P. A Three-component Dicamba O-Demethylase from Pseudomonas maltophilia, Strain DI-6. J. Biol. Chem. 2005, 280, 24759–24767.
  10. Wang, C.; Glenn, K.C.; Kessenich, C.; Bell, E.; Burzio, L.A.; Koch, M.S.; Li, B.; Silvanovich, A. Safety assessment of dicamba mono-oxygenases that confer dicamba tolerance to various crops. Regul. Toxicol. Pharmacol. 2016, 81, 171–182.
  11. Bayer CropScience LLC. Bayer Fuels Leading Market Positions in Crop Science through Delivery of Unmatched Innovation. 2021. Available online: https://media.bayer.com/baynews/baynews.nsf/id/9F6D3923DFF05C43C125877200331872?open&ref=irrefndcd (accessed on 17 December 2021).
  12. Bradley, K. A Final Report on Dicamba-Injured Soybean Acres. 2017. Available online: https://ipm.missouri.edu/IPCM/2017/10/final_report_dicamba_injured_soybean/ (accessed on 15 December 2021).
  13. Bradley, K. July 15 Dicamba Injury Update. Different Year, Same Questions. 2018. Available online: https://ipm.missouri.edu/IPCM/2018/7/July-15-Dicamba-injury-update-different-year-same-questions/ (accessed on 10 December 2021).
  14. Wechsler, S.; Smith, D.; McFadden, J.; Dodson, L.; Williamson, S. The Use of Genetically Engineered Dicamba-Tolerant Soybean Seeds Has Increased Quickly, Benefiting Adopters but Damaging Crops in Some Fields. 2019. Available online: https://www.ers.usda.gov/amber-waves/2019/october/the-use-of-genetically-engineered-dicamba-tolerant-soybean-seeds-has-increased-quickly-benefiting-adopters-but-damaging-crops-in-some-fields/ (accessed on 10 December 2021).
  15. Chism, B.; Tindall, K.; Orlowski, J. Dicamba Use on Genetically Modified Dicamba-Tolerant (DT) Cotton and Soybean: Incidents and Impacts to Users and Non-Users from Proposed Registrations; Environmental Protection Agency: Washington, DC, USA, 2020.
  16. Wagman, M.; Farruggia, F.; Odenkirchen, E.; Connolly, J. Dicamba DGA and BAPMA Salts—2020 Ecological Assessment of Dicamba Use on Dicamba-Tolerant (DT) Cotton and Soybean Including Effects Determinations for Federally Listed Threatened and Endangered Species; Environmental Protection Agency: Washington, DC, USA, 2020.
  17. Weidenhamer, J.D.; Triplett, G.B.; Sobotka, F.E. Dicamba Injury to Soybean. Agron. J. 1989, 81, 637–643.
  18. Andersen, S.M.; Clay, S.A.; Wrage, L.J.; Matthees, D. Soybean Foliage Residues of Dicamba and 2,4-D and Correlation to Application Rates and Yield. Agron. J. 2004, 96, 750–760.
  19. Kniss, A.R. Soybean Response to Dicamba: A Meta-Analysis. Weed Technol. 2018, 32, 507–512.
  20. Kelley, K.B.; Wax, L.M.; Hager, A.G.; Riechers, D.E. Soybean response to plant growth regulator herbicides is affected by other postemergence herbicides. Weed Sci. 2005, 53, 101–112.
  21. Griffin, J.L.; Bauerle, M.J.; Stephenson, D.O.; Miller, D.K.; Boudreaux, J.M. Soybean Response to Dicamba Applied at Vegetative and Reproductive Growth Stages. Weed Technol. 2013, 27, 696–703.
  22. Robinson, A.P.; Simpson, D.M.; Johnson, W. Response of Glyphosate-Tolerant Soybean Yield Components to Dicamba Exposure. Weed Sci. 2013, 61, 526–536.
  23. Egan, J.F.; Barlow, K.M.; Mortensen, D.A. A Meta-Analysis on the Effects of 2,4-D and Dicamba Drift on Soybean and Cotton. Weed Sci. 2014, 62, 193–206.
  24. Solomon, C.B.; Bradley, K.W. Influence of Application Timings and Sublethal Rates of Synthetic Auxin Herbicides on Soybean. Weed Technol. 2014, 28, 454–464.
  25. Soltani, N.; Nurse, R.E.; Sikkema, P.H. Response of glyphosate-resistant soybean to dicamba spray tank contamination during vegetative and reproductive growth stages. Can. J. Plant Sci. 2016, 96, 160–164.
  26. Naik, H.S.; Zhang, J.; Lofquist, A.; Assefa, T.; Sarkar, S.; Ackerman, D.; Singh, A.; Singh, A.K.; Ganapathysubramanian, B. A real-time phenotyping framework using machine learning for plant stress severity rating in soybean. Plant Methods 2017, 13, 23.
  27. Chawade, A.; Van Ham, J.; Blomquist, H.; Bagge, O.; Alexandersson, E.; Ortiz, R. High-Throughput Field-Phenotyping Tools for Plant Breeding and Precision Agriculture. Agronomy 2019, 9, 258.
  28. Pinter, P.J., Jr.; Hatfield, J.L.; Schepers, J.S.; Barnes, E.M.; Moran, M.S.; Daughtry, C.S.T.; Upchurch, D.R. Remote Sensing for Crop Management. Photogramm. Eng. Remote Sens. 2003, 69, 647–664.
  29. Huang, Y.; Thomson, S.; Lan, Y.; Maas, S. Multispectral imaging systems for airborne remote sensing to support agricultural production management. Int. J. Agric. Biol. Eng. 2010, 3, 50–62.
  30. Foster, M.R.; Griffin, J.L. Injury Criteria Associated with Soybean Exposure to Dicamba. Weed Technol. 2018, 32, 656–657.
  31. Gracia-Romero, A.; Kefauver, S.C.; Vergara-Díaz, O.; Zaman-Allah, M.; Prasanna, B.M.; Cairns, J.; Araus, J.L. Comparative Performance of Ground vs. Aerially Assessed RGB and Multispectral Indices for Early-Growth Evaluation of Maize Performance under Phosphorus Fertilization. Front. Plant Sci. 2017, 8, 2004.
  32. Zhou, J.; Zhou, J.; Ye, H.; Ali, L.; Nguyen, H.T.; Chen, P. Classification of soybean leaf wilting due to drought stress using UAV-based imagery. Comput. Electron. Agric. 2020, 175, 105576.
  33. Huang, Y.; Yuan, L.; Reddy, K.; Zhang, J. In-situ plant hyperspectral sensing for early detection of soybean injury from dicamba. Biosyst. Eng. 2016, 149, 51–59.
  34. Xue, J.; Su, B. Significant remote sensing vegetation indices: A review of developments and applications. J. Sens. 2017, 2017, 1353691.
  35. Zhang, J.; Huang, Y.; Reddy, K.N.; Wang, B. Assessing crop damage from dicamba on non-dicamba-tolerant soybean by hyperspectral imaging through machine learning. Pest Manag. Sci. 2019, 75, 3260–3272.
  36. Abrantes, T.C.; Queiroz, A.R.S.; Lucio, F.R.; Júnior, C.W.M.; Kuplich, T.M.; Bredemeier, C.; Júnior, A.M. Assessing the effects of dicamba and 2,4-Dichlorophenoxyacetic acid (2,4-D) on soybean through vegetation indices derived from Unmanned Aerial Vehicle (UAV)-based RGB imagery. Int. J. Remote Sens. 2021, 42, 2740–2758.
  37. Marques, M.; da Cunha, J.; Lemes, E. Dicamba Injury on Soybean Assessed Visually and with Spectral Vegetation Index. AgriEngineering 2021, 3, 240–250.
  38. Ortiz, B.; Thomson, S.; Huang, Y.; Reddy, K.; Ding, W. Determination of differences in crop injury from aerial application of glyphosate using vegetation indices. Comput. Electron. Agric. 2011, 77, 204–213.
  39. Yao, H.; Huang, Y.; Hruska, Z.; Thomson, S.J.; Reddy, K. Using vegetation index and modified derivative for early detection of soybean plant injury from glyphosate. Comput. Electron. Agric. 2012, 89, 145–157.
  40. Huang, Y.; Reddy, K.N.; Thomson, S.J.; Yao, H. Assessment of soybean injury from glyphosate using airborne multispectral remote sensing. Pest Manag. Sci. 2014, 71, 545–552.
  41. Weber, J.F.; Kunz, C.; Peteinatos, G.G.; Santel, H.-J.; Gerhards, R. Utilization of Chlorophyll Fluorescence Imaging Technology to Detect Plant Injury by Herbicides in Sugar Beet and Soybean. Weed Technol. 2017, 31, 523–535.
  42. Louhaichi, M.; Borman, M.M.; Johnson, D.E. Spatially Located Platform and Aerial Photography for Documentation of Grazing Impacts on Wheat. Geocarto Int. 2001, 16, 65–70.
  43. Hunt, E.R., Jr.; Daughtry, C.S.T.; Eitel, J.U.H.; Long, D.S. Remote Sensing Leaf Chlorophyll Content Using a Visible Band Index. Agron. J. 2011, 103, 1090–1099.
  44. Chen, P.; Shannon, G.; Scaboo, A.; Crisel, M.; Smothers, S.; Clubb, M.; Selves, S.; Vieira, C.C.; Ali, L.; Mitchum, M.G.; et al. Registration of ‘S14-15146GT’ soybean, a high-yielding RR1 cultivar with high oil content and broad disease resistance and adaptation. J. Plant Regist. 2020, 14, 35–42.
  45. Chen, P.; Shannon, G.; Scaboo, A.; Crisel, M.; Smothers, S.; Clubb, M.; Selves, S.; Vieira, C.C.; Ali, M.L.; Mitchum, M.G.; et al. Registration of ‘S13-2743C’ as a conventional soybean cultivar with high oil content, broad disease resistance, and high yield potential. J. Plant Regist. 2021, 15, 306–312.
  46. Chen, P.; Shannon, G.; Scaboo, A.; Crisel, M.; Smothers, S.; Clubb, M.; Selves, S.; Vieira, C.C.; Ali, M.L.; Lee, D.; et al. Registration of ‘S13-3851C’ soybean as a high-yielding conventional cultivar with high oil content and broad disease resistance and adaptation. J. Plant Regist. 2021, 16, 21–28.
  47. Fehr, W.R.; Caviness, C.E.; Burmood, D.T.; Pennington, J.S. Stage of Development Descriptions for Soybeans, Glycine max (L.) Merrill. Crop Sci. 1971, 11, 929–931.
  48. Zhou, J.; Yungbluth, D.; Vong, C.N.; Scaboo, A.; Zhou, J. Estimation of the Maturity Date of Soybean Breeding Lines Using UAV-Based Multispectral Imagery. Remote Sens. 2019, 11, 2075.
  49. Whitaker, R.T. A Level-Set Approach to 3D Reconstruction from Range Data. Int. J. Comput. Vis. 1998, 29, 203–231.
  50. Chan, T.F.; Vese, L.A. Active contours without edges. IEEE Trans. Image Process. 2001, 10, 266–277.
  51. Hall-Beyer, M. Practical guidelines for choosing GLCM textures to use in landscape classification tasks over a range of moderate spatial scales. Int. J. Remote Sens. 2017, 38, 1312–1338.
  52. R Development Core Team. R: A Language and Environment for Statistical Computing; R Foundation for Statistical Computing: Vienna, Austria, 2013. Available online: https://www.R-project.org/ (accessed on 1 December 2021).
  53. Mendiburu, F. agricolae: Statistical Procedures for Agricultural Research. Version 1.2-4. 2016. Available online: https://cran.r-project.org/web/packages/agricolae/index.html (accessed on 1 December 2021).
  54. Fritsch, S.; Guenther, F.; Wright, M.; Mueller, S. neuralnet: Training of Neural Networks. R J. 2019, 2, 30.
  55. Liaw, A.; Wiener, M. Classification and Regression by randomForest. R News 2002, 2, 18–22.
  56. James, G.; Witten, D.; Hastie, T.; Tibshirani, R. An Introduction to Statistical Learning; Springer: New York, NY, USA, 2013; p. 426. ISBN 978-1-4614-7137-0.
  57. Gitelson, A.A.; Merzlyak, M.N.; Zur, Y.; Stark, R.; Gritz, U. Non-destructive and remote sensing techniques for estimation of vegetation status. In Proceedings of the 3rd European Conference on Precision Agriculture, Montpellier, France, 18–20 June 2001; pp. 205–210.
  58. Gamon, J.A.; Serrano, L.; Surfus, J.S. The photochemical reflectance index: An optical indicator of photosynthetic radiation use efficiency across species, functional types, and nutrient levels. Oecologia 1997, 112, 492–501.
  59. Belgiu, M.; Drăguţ, L. Random forest in remote sensing: A review of applications and future directions. ISPRS J. Photogramm. Remote Sens. 2016, 114, 24–31.
  60. Han, T.; Jiang, D.; Zhao, Q.; Wang, L.; Yin, K. Comparison of random forest, artificial neural networks and support vector machine for intelligent diagnosis of rotating machinery. Trans. Inst. Meas. Control 2017, 40, 2681–2693.
  61. Bendig, J.; Yu, K.; Aasen, H.; Bolten, A.; Bennertz, S.; Broscheit, J.; Gnyp, M.L.; Bareth, G. Combining UAV-based plant height from crop surface models, visible, and near infrared vegetation indices for biomass monitoring in barley. Int. J. Appl. Earth Obs. Geoinf. 2015, 39, 79–87.
  62. Woebbecke, D.M.; Meyer, G.E.; Von Bargen, K.; Mortensen, D.A. Color Indices for Weed Identification Under Various Soil, Residue, and Lighting Conditions. Trans. ASAE 1995, 38, 259–269.
  63. Viña, A.; Gitelson, A.A.; Nguy-Robertson, A.L.; Peng, Y. Comparison of different vegetation indices for the remote assessment of green leaf area index of crops. Remote Sens. Environ. 2011, 115, 3468–3478.
  64. Wernecke, S.J.; D’Addario, L.R. Maximum Entropy Image Reconstruction. IEEE Trans. Comput. 1977, C-26, 351–364.
  65. Gull, S.; Skilling, J. Maximum entropy method in image processing. IEE Proc. F Commun. Radar Signal Process. 1984, 131, 646–659.
  66. Pal, N.; Pal, S. Entropy: A new definition and its applications. IEEE Trans. Syst. Man Cybern. 1991, 21, 1260–1270.
  67. Tsai, D.-Y.; Lee, Y.; Matsuyama, E. Information Entropy Measure for Evaluation of Image Quality. J. Digit. Imaging 2007, 21, 338–347.
  68. Berns, R.S. Billmeyer and Saltzman’s Principles of Color Technology, 4th ed.; John Wiley & Sons: Hoboken, NJ, USA, 2019.
  69. Grossmann, K. Auxin herbicides: Current status of mechanism and mode of action. Pest Manag. Sci. 2009, 66, 113–120.
Figure 1. Ground-based classification of field plots for off-target dicamba damage into three phenotypic classes: tolerant, moderate, and susceptible. In this figure, the tolerant plot is the conventional breeding line S16-12774C, the moderate plot is the conventional breeding line PR17-482, and the susceptible plot is the GT commercial check AG 4135 (Monsanto Co., Creve Coeur, MO, USA).
Figure 2. Distribution of dicamba response classification in each location and combined across all locations. Overall, approximately 26.7% of the plots were classified as tolerant, 36.4% as moderate, and 36.9% as susceptible.
Figure 3. Distribution of standardized features across all fields and dicamba response classes. Significant differences among classes were identified for all features; higher values of canopy coverage, entropy, GLI, hue, Sa, and TGI indicate tolerance to dicamba, whereas higher values of contrast indicate susceptibility.
Figure 4. Feature importance represented as the mean decrease Gini coefficient for the seven image features included in the RF model.
Table 1. Field trials conducted to develop a UAV-based classification model for off-target dicamba response.
| Location | Trial 1 | #Entries 2 | #Plots 3 | Planting | Imaging | DAP 4 | Visual Scoring | DAP |
|---|---|---|---|---|---|---|---|---|
| Fld-61 | AYT | 213 | 670 | 04/17/2020 | 08/20/2020 | 125 | 8/20/2020 | 125 |
| Fld-63 | AYT | 213 | 670 | 04/28/2020 | 09/08/2020 | 133 | 9/9/2020 | 134 |
| Fld-81 | AYT | 213 | 672 | 04/18/2020 | 08/21/2020 | 125 | 8/21/2020 | 125 |
| Fld-86 | Subset | 48 | 144 | 06/01/2020 | 09/15/2020 | 106 | 9/14/2020 | 105 |
| Fld-1210 | Subset | 48 | 144 | 05/27/2020 | 09/14/2020 | 110 | 9/14/2020 | 110 |
1 Trial: “AYT” corresponds to the 2020 advanced yield trials; “Subset” corresponds to a selected group of soybean lines from the 2019 advanced yield trials showing extreme (tolerant or susceptible) responses to off-target dicamba. 2 #Entries: number of unique soybean lines visually and digitally phenotyped for off-target dicamba damage. The total number of entries across trials (261) exceeds 230 because some soybean lines overlap between trials. 3 #Plots: total number of plots visually and digitally phenotyped for off-target dicamba damage. Plot counts do not necessarily equal unique entries × replications, due to replicated genotypes and/or deactivated plots. 4 DAP: days after planting, i.e., the number of days between planting and data collection.
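The DAP columns follow directly from the planting and data-collection dates. A trivial check using Python's standard library, applied to the Fld-61 row of Table 1:

```python
from datetime import date

def days_after_planting(planting: date, collection: date) -> int:
    """Number of days elapsed between planting and data collection."""
    return (collection - planting).days

# Fld-61: planted 04/17/2020, imaged 08/20/2020 -> 125 DAP
dap = days_after_planting(date(2020, 4, 17), date(2020, 8, 20))
```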
Table 2. Summary and significance of seven image features across dicamba response classes.
| Image Feature | Tolerant 1 | Moderate | Susceptible | Field 2 |
|---|---|---|---|---|
| Canopy Coverage | 0.616 a | 0.191 b | −0.709 c | N.S. |
| Contrast | −0.151 b | −0.191 b | 0.531 a | N.S. |
| Entropy | 0.512 a | 0.267 b | −0.856 c | N.S. |
| GLI | 0.455 a | 0.252 b | −0.798 c | N.S. |
| Hue | 0.211 a | 0.232 a | −0.654 b | N.S. |
| Sa | 0.295 a | 0.044 b | −0.222 c | N.S. |
| TGI | 0.472 a | 0.204 b | −0.686 c | N.S. |
1 Grouped letters represent significant differences among dicamba response classes according to Tukey’s HSD test (α = 0.05). 2 N.S. (non-significant) indicates that the effect of “Field” was not significant, i.e., the value of each feature within each response class did not differ significantly across locations.
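The class-wise comparisons in Table 2 were obtained with Tukey’s HSD in R (agricolae [53]); the underlying one-way ANOVA can be sketched in a few lines of NumPy. The data below are synthetic, with group means loosely mimicking the GLI row of Table 2, so only the computation (not the values) is meaningful:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic standardized GLI values for the three response classes
groups = {"tolerant": rng.normal(0.46, 0.3, 50),
          "moderate": rng.normal(0.25, 0.3, 50),
          "susceptible": rng.normal(-0.80, 0.3, 50)}

def one_way_anova_F(groups):
    """F = mean square between groups / mean square within groups."""
    data = list(groups.values())
    all_vals = np.concatenate(data)
    grand = all_vals.mean()
    ssb = sum(len(g) * (g.mean() - grand) ** 2 for g in data)
    ssw = sum(((g - g.mean()) ** 2).sum() for g in data)
    df_b, df_w = len(data) - 1, len(all_vals) - len(data)
    return (ssb / df_b) / (ssw / df_w)

F = one_way_anova_F(groups)  # large F -> classes differ on this feature
```

A significant F is what licenses the post-hoc pairwise grouping letters (a, b, c) reported in the table.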
Table 3. Confusion matrix and performance metrics for dicamba response classification using RGB image features and the ANN classifier.
| Dicamba Class | Overall 1 | | | Fld-61 | | | Fld-63 | | | Fld-81 | | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| | Tol | Mod | Sus | Tol | Mod | Sus | Tol | Mod | Sus | Tol | Mod | Sus |
| Tolerant | 12 | 15 | 2 | 0 | 5 | 0 | 5 | 7 | 2 | 9 | 9 | 0 |
| Moderate | 53 | 410 | 68 | 17 | 122 | 26 | 20 | 140 | 10 | 19 | 133 | 13 |
| Susceptible | 7 | 46 | 120 | 2 | 16 | 25 | 0 | 7 | 22 | 0 | 9 | 21 |
| Class Accuracy | 0.89 | 0.75 | 0.84 | 0.89 | 0.70 | 0.79 | 0.86 | 0.79 | 0.91 | 0.87 | 0.77 | 0.90 |
| Precision | 0.41 | 0.77 | 0.71 | 0.00 | 0.74 | 0.58 | 0.36 | 0.82 | 0.76 | 0.50 | 0.81 | 0.70 |
| Sensitivity | 0.17 | 0.87 | 0.63 | 0.00 | 0.85 | 0.49 | 0.20 | 0.91 | 0.65 | 0.32 | 0.88 | 0.62 |
| Specificity | 0.97 | 0.54 | 0.91 | 0.97 | 0.39 | 0.89 | 0.95 | 0.49 | 0.96 | 0.95 | 0.48 | 0.95 |
| Overall Accuracy 2 | 0.74 | | | 0.69 | | | 0.78 | | | 0.77 | | |
1 Overall is the combined analysis including Fld-61, Fld-63, Fld-81, Fld-86, and Fld-1210. 2 Overall accuracy is the average of five-fold cross-validation results.
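The per-class metrics in Tables 3 and 4 follow the standard confusion-matrix definitions. As an illustrative sketch (reading rows as predicted classes and columns as observed classes, an orientation consistent with most of the published values), the metrics can be recovered from the Overall ANN matrix:

```python
import numpy as np

# Overall ANN confusion matrix from Table 3;
# classes ordered tolerant, moderate, susceptible.
cm = np.array([[12, 15, 2],
               [53, 410, 68],
               [7, 46, 120]])

def per_class_metrics(cm):
    """Precision, sensitivity (recall), and specificity per class."""
    total = cm.sum()
    tp = np.diag(cm).astype(float)
    fp = cm.sum(axis=1) - tp        # predicted as class, observed otherwise
    fn = cm.sum(axis=0) - tp        # observed class, predicted otherwise
    tn = total - tp - fp - fn
    return {"precision": tp / (tp + fp),
            "sensitivity": tp / (tp + fn),
            "specificity": tn / (tn + fp)}

m = per_class_metrics(cm)
accuracy = np.trace(cm) / cm.sum()  # close to the published 0.74
```

For the tolerant class this yields precision ≈ 0.41, sensitivity ≈ 0.17, and specificity ≈ 0.97, matching the table; small differences for other entries may reflect averaging over cross-validation folds in the published values.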
Table 4. Confusion matrix and performance metrics for dicamba response classification using RGB image features and the RF classifier.
| Dicamba Class | Overall 1 | | | Fld-61 | | | Fld-63 | | | Fld-81 | | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| | Tol | Mod | Sus | Tol | Mod | Sus | Tol | Mod | Sus | Tol | Mod | Sus |
| Tolerant | 18 | 11 | 0 | 0 | 1 | 0 | 4 | 0 | 0 | 0 | 1 | 0 |
| Moderate | 51 | 420 | 69 | 9 | 114 | 34 | 8 | 137 | 21 | 14 | 129 | 17 |
| Susceptible | 3 | 40 | 121 | 0 | 18 | 37 | 2 | 11 | 30 | 0 | 16 | 36 |
| Class Accuracy | 0.89 | 0.77 | 0.85 | 0.95 | 0.71 | 0.76 | 0.95 | 0.81 | 0.84 | 0.93 | 0.78 | 0.85 |
| Precision | 0.62 | 0.78 | 0.74 | 0.00 | 0.86 | 0.52 | 0.29 | 0.93 | 0.59 | 0.00 | 0.88 | 0.68 |
| Sensitivity | 0.25 | 0.89 | 0.64 | 0.00 | 0.73 | 0.67 | 1.00 | 0.83 | 0.70 | 0.00 | 0.81 | 0.69 |
| Specificity | 0.98 | 0.54 | 0.92 | 0.96 | 0.66 | 0.78 | 0.95 | 0.77 | 0.88 | 0.93 | 0.69 | 0.89 |
| Overall Accuracy 2 | 0.75 | | | 0.71 | | | 0.80 | | | 0.77 | | |
1 Overall is the combined analysis including Fld-61, Fld-63, Fld-81, Fld-86, and Fld-1210. 2 Overall accuracy is the average of five-fold cross-validation results.
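Footnote 2 states that overall accuracy is the average of five-fold cross-validation results. The fold logic can be sketched as below; the data and the nearest-centroid classifier are synthetic stand-ins (the study trained ANN and RF models in R [54,55]), so only the cross-validation protocol mirrors the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in: 300 plots x 7 image features, 3 damage classes
n, k = 300, 3
y = rng.integers(0, k, size=n)
X = rng.normal(size=(n, 7)) + y[:, None]   # class-shifted features

def nearest_centroid_fit_predict(X_tr, y_tr, X_te):
    """Assign each test plot to the class with the nearest feature centroid."""
    centroids = np.stack([X_tr[y_tr == c].mean(axis=0) for c in range(k)])
    d = ((X_te[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    return d.argmin(axis=1)

# Five-fold cross-validation; overall accuracy = mean of fold accuracies
idx = rng.permutation(n)
folds = np.array_split(idx, 5)
accs = []
for i in range(5):
    te = folds[i]
    tr = np.concatenate([folds[j] for j in range(5) if j != i])
    pred = nearest_centroid_fit_predict(X[tr], y[tr], X[te])
    accs.append((pred == y[te]).mean())
overall_accuracy = float(np.mean(accs))
```

Averaging fold accuracies, rather than pooling predictions, is one common convention; with class-imbalanced data the two can differ slightly, which may explain small discrepancies between the pooled matrices and the reported overall accuracies.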
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
