Figure 1.
Classification accuracy scores for the three models (multilayer perceptron (MLP), random forest (RF), and support vector machine (SVM)) trained on the selected features (SF) and all features (AF) datasets.
Figure 2.
Examples of glasshouse and field plants segmented using the proposed method and the ExG segmentation method. (a) Original wheat image, (b) ExG segmented wheat image, (c) proposed method (MLP) segmented wheat image, (d) original cowpea image, (e) ExG segmented cowpea image, and (f) MLP segmented cowpea image.
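The ExG baseline compared in Figure 2 can be sketched as follows. This is a minimal illustration, assuming the standard chromaticity-normalised formulation ExG = 2g − r − b; the threshold value is an illustrative choice, not the one used in the paper.

```python
import numpy as np

def exg_segment(rgb, threshold=0.05):
    """Segment plant pixels with the excess-green (ExG) index.

    ExG = 2g - r - b on chromaticity-normalised channels; pixels whose
    index exceeds `threshold` are labelled foreground.
    """
    rgb = rgb.astype(np.float64)
    total = rgb.sum(axis=-1, keepdims=True)
    total[total == 0] = 1.0          # avoid division by zero on black pixels
    r, g, b = np.moveaxis(rgb / total, -1, 0)
    exg = 2.0 * g - r - b
    return exg > threshold           # boolean foreground mask

# tiny example: one green (plant-like) pixel, one grey (soil-like) pixel
img = np.array([[[40, 180, 30], [120, 120, 120]]], dtype=np.uint8)
mask = exg_segment(img)
```

A greyscale pixel has r = g = b = 1/3, so its ExG is exactly zero and it falls below any positive threshold, while strongly green pixels score well above it.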
Figure 3.
Mean segmentation accuracy rate comparison for quality assessment of five different segmentation methods for glasshouse-based images; Qseg measures the segmentation consistency on a pixel-by-pixel basis, Sr measures the consistency of plant pixels between image regions, and Es measures the rate of pixel misclassification. These were applied to the five segmentation methods: multilayer perceptron (MLP), support vector machines (SVMs), random forest (RF), excess green (ExG), and excess green–red (ExGR).
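The three quality measures can be computed from a predicted mask and a hand-labelled reference mask. The definitions below follow common usage in the plant-segmentation literature and are an assumption, not taken from the paper itself.

```python
import numpy as np

def segmentation_scores(pred, ref):
    """Qseg, Sr and Es for boolean masks `pred` (method) and `ref` (reference).

    Assumed definitions:
      Qseg : pixel-wise agreement, |pred AND ref| / |pred OR ref|
      Sr   : fraction of reference plant pixels recovered, |pred AND ref| / |ref|
      Es   : misclassification rate, wrongly labelled pixels / total pixels
    """
    pred = pred.astype(bool)
    ref = ref.astype(bool)
    inter = np.logical_and(pred, ref).sum()
    union = np.logical_or(pred, ref).sum()
    qseg = inter / union if union else 1.0
    sr = inter / ref.sum() if ref.sum() else 1.0
    es = np.logical_xor(pred, ref).mean()
    return qseg, sr, es

# toy example: two reference plant pixels, one recovered
ref = np.array([[1, 1, 0, 0]], dtype=bool)
pred = np.array([[1, 0, 0, 0]], dtype=bool)
qseg, sr, es = segmentation_scores(pred, ref)
```

Higher Qseg and Sr indicate better agreement with the reference, while a lower Es indicates fewer misclassified pixels, which matches how the figures rank the methods.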
Figure 4.
Comparison of segmentation accuracy rates (Qseg, Sr, Es) for quality assessment of five different segmentation methods for field-based images: multilayer perceptron (MLP), support vector machines (SVMs), random forest (RF), excess green (ExG), and excess green–red (ExGR). Qseg measures the segmentation consistency on a pixel-by-pixel basis, Sr measures the consistency of plant pixels between image regions, and Es measures the rate of pixel misclassification.
Figure 5.
Scatterplots of predicted and observed SPAD values for the glasshouse plants. MLP-GR, RF-GR, SVM-GR, ExGR-GR, and ExG-GR represent the glasshouse-based regression models for the multilayer perceptron, random forest, support vector machine, excess green–red, and excess green segmentation methods, respectively.
Figure 6.
Scatterplots of predicted and observed SPAD values for the field plants. MLP-FR, RF-FR, SVM-FR, ExGR-FR, and ExG-FR are the field-based regression models for the multilayer perceptron, random forest, support vector machine, excess green–red, and excess green segmentation methods, respectively.
Figure 7.
Diagrammatic representation of the proposed method for glasshouse and field images. MLP, multilayer perceptron.
Figure 8.
Annotation of images into foreground and background patches for feature extraction. (a) Wheat image annotation and (b) cowpea image annotation. FG represents the foreground annotation and BG the background annotation.
Figure 9.
Feature-selection process involving correlation analysis and feature ranking based on importance score. (a) Heatmap of all extracted features and (b) plot of feature importance for feature selection. Abbreviations: RGB (R = red, G = green, and B = blue channels); HSV (H = hue, S = saturation, and V = value); ybr (y = luma, b = blue component, and r = red component); Lab (L = lightness, and a and b = chromaticity); YUV (Y = luma or brightness, U = blue projection, and V = red projection); Luv (L = luminance, u = blue axis, and v = red axis); hls (h = hue, l = lightness, and s = saturation); and XYZ (X and Z = spectral weighting curves, and Y = luminance).
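The two-stage selection in Figure 9 can be sketched with scikit-learn and pandas. The data here are synthetic stand-ins for the colour-channel features, and the 0.95 correlation cut-off is an illustrative choice, not the paper's value.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

# Synthetic stand-in for the colour-channel features; the real study
# extracted channels from colour spaces such as RGB, HSV, Lab and YUV.
rng = np.random.default_rng(0)
X = pd.DataFrame({"G": rng.normal(size=200)})
X["ExG_like"] = X["G"] * 0.98 + rng.normal(scale=0.05, size=200)  # near-duplicate feature
X["H"] = rng.normal(size=200)
y = (X["G"] + 0.3 * X["H"] > 0).astype(int)  # synthetic FG/BG labels

# 1) correlation analysis: drop one feature from each highly
#    correlated pair (|r| > 0.95 here)
corr = X.corr().abs()
upper = corr.where(np.triu(np.ones(corr.shape, dtype=bool), k=1))
to_drop = [c for c in upper.columns if (upper[c] > 0.95).any()]
X_sel = X.drop(columns=to_drop)

# 2) rank the surviving features by random-forest importance score
rf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_sel, y)
ranking = sorted(zip(X_sel.columns, rf.feature_importances_),
                 key=lambda t: -t[1])
```

Dropping one member of each correlated pair before ranking keeps the importance scores from being diluted across redundant channels.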
Figure 10.
Examples of images obtained from the glasshouse and field, segmented as reference images. (a) Original image, (b) manually segmented image, and (c) binary image. Reference images were randomly selected from the dataset with different nutrient content and illumination and at variable growth stages.
Table 1.
Classification model performance assessment. MLP, RF, and SVM are the multilayer perceptron, random forest, and support vector machine segmentation models, respectively.
| Learning Model | Accuracy (AF) | F-Score (AF) | Recall (AF) | Computational Time, s (AF) | Accuracy (SF) | F-Score (SF) | Recall (SF) | Computational Time, s (SF) |
|---|---|---|---|---|---|---|---|---|
| MLP | 98.464 a | 96.142 a | 98.144 a | 175.098 a | 99.382 a | 97.423 a | 98.321 a | 113.521 a |
| RF | 98.011 a | 95.331 b | 97.421 a | 191.450 b | 98.354 ab | 96.632 a | 97.931 a | 128.455 b |
| SVM | 96.134 b | 92.402 c | 95.651 b | 560.560 c | 96.624 b | 92.822 b | 96.514 b | 321.502 c |
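A sketch of how the three classifiers in Table 1 can be trained and scored on accuracy, F-score, recall and training time. The data are synthetic, the hyperparameters are illustrative, and the use of scikit-learn is an assumption; the paper does not name its implementation library.

```python
import time
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, f1_score, recall_score
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

# Synthetic stand-in for the foreground/background pixel features.
X, y = make_classification(n_samples=1000, n_features=8, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "MLP": MLPClassifier(hidden_layer_sizes=(20,), max_iter=500, random_state=0),
    "RF": RandomForestClassifier(n_estimators=200, random_state=0),
    "SVM": SVC(kernel="rbf"),
}

results = {}
for name, model in models.items():
    t0 = time.perf_counter()
    model.fit(X_tr, y_tr)           # training time is timed per model
    elapsed = time.perf_counter() - t0
    pred = model.predict(X_te)
    results[name] = {
        "accuracy": accuracy_score(y_te, pred),
        "f_score": f1_score(y_te, pred),
        "recall": recall_score(y_te, pred),
        "time_s": elapsed,
    }
```

Timing `fit` per model is what makes the computational-time column comparable across learners on the same machine.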
Table 2.
Correlation analysis of color vegetation indices (CVIs) and SPAD readings for cowpea. R, G, and B are the average red, green, and blue values. ExG, ExGR, MLP, SVM, and RF are the excess green, excess green–red, multilayer perceptron, support vector machines, and random forest segmentation methods, respectively.
| CVI | Mathematical Expression | r (ExG) | r (ExGR) | r (MLP) | r (SVM) | r (RF) |
|---|---|---|---|---|---|---|
| Green | - | 0.420 | 0.291 | 0.554 | 0.418 | 0.459 |
| Blue | - | −0.228 | −0.424 | −0.682 | −0.562 | −0.605 |
| ExG [21] | | 0.742 | 0.763 | 0.893 | 0.824 | 0.838 |
| ExR [7] | | 0.692 | 0.822 | 0.885 | 0.745 | 0.820 |
| CIVE [22] | | 0.670 | 0.745 | 0.924 | 0.787 | 0.822 |
| ERI [23] | | −0.551 | −0.568 | −0.661 | −0.517 | −0.642 |
| DGCI [24] | | 0.672 | 0.759 | 0.948 | 0.821 | 0.850 |
| GR | | −0.585 | −0.721 | −0.842 | −0.766 | −0.671 |
| COM1 [17] | | 0.623 | 0.748 | 0.901 | 0.792 | 0.782 |
| GBRG [21] | | 0.689 | 0.824 | 0.870 | 0.844 | 0.802 |
| EGI [23] | | 0.231 | 0.425 | 0.464 | 0.382 | 0.295 |
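Computing a CVI from the segmented plant pixels and correlating it with SPAD readings can be sketched as below. The data are synthetic stand-ins; the ExG = 2g − r − b (Woebbecke et al., [21]) and ExR = 1.4r − g (Meyer et al., [7]) formulations are common in the literature, but whether the paper used exactly these expressions is an assumption, since the expression column was lost in extraction.

```python
import numpy as np

# Mean chromaticity-normalised channel values per plant (synthetic
# stand-ins) and matching SPAD readings; in the study these come from
# the segmented plant pixels of each image.
rng = np.random.default_rng(1)
n = 30
g = rng.uniform(0.3, 0.6, n)
r = rng.uniform(0.2, 0.4, n)
b = 1.0 - r - g                      # chromaticities sum to one
spad = 80.0 * g + rng.normal(scale=1.0, size=n)

# Two widely used indices (assumed formulations, see lead-in).
exg = 2.0 * g - r - b
exr = 1.4 * r - g

# Pearson correlation against SPAD, as reported per index in the table.
r_exg = np.corrcoef(exg, spad)[0, 1]
r_exr = np.corrcoef(exr, spad)[0, 1]
```

With chromaticities summing to one, ExG reduces to 3g − 1, so its correlation with a greenness-driven SPAD signal is strongly positive, as in the table.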
Table 3.
Performance of the regression models during training and testing for glasshouse data. MLP-GR, RF-GR, SVM-GR, ExGR-GR, and ExG-GR are the glasshouse-based regression models for the multilayer perceptron, random forest, support vector machine, excess green–red, and excess green segmentation methods, respectively.
| Model | R² (Train) | RMSE (Train) | MAE (Train) | R² (Test) | RMSE (Test) | MAE (Test) |
|---|---|---|---|---|---|---|
| MLP-GR | 0.942 | 2.152 | 2.120 | 0.904 | 2.682 | 2.931 |
| RF-GR | 0.881 | 3.885 | 2.581 | 0.837 | 3.958 | 3.780 |
| SVM-GR | 0.864 | 4.255 | 3.601 | 0.826 | 3.620 | 3.881 |
| ExGR-GR | 0.790 | 6.102 | 4.512 | 0.747 | 5.790 | 4.910 |
| ExG-GR | 0.772 | 6.881 | 5.155 | 0.701 | 7.151 | 6.851 |
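The three regression scores reported here (R², RMSE, MAE) have standard definitions, sketched below; the toy predicted/observed SPAD values are illustrative, not from the study.

```python
import numpy as np

def regression_scores(y_true, y_pred):
    """R^2, RMSE and MAE, the metrics used to score the SPAD regressions."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    resid = y_true - y_pred
    ss_res = np.sum(resid ** 2)                      # residual sum of squares
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)   # total sum of squares
    r2 = 1.0 - ss_res / ss_tot
    rmse = np.sqrt(np.mean(resid ** 2))
    mae = np.mean(np.abs(resid))
    return r2, rmse, mae

# toy observed vs predicted SPAD values
r2, rmse, mae = regression_scores([30, 40, 50, 60], [32, 39, 48, 61])
```

RMSE penalises large errors more heavily than MAE, which is why the two can diverge for the weaker models in Tables 3 and 5.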
Table 4.
Correlation analysis of color vegetation indices (CVIs) and measured SPAD readings for wheat. R, G, and B are the average red, green, and blue values; H, S, and B are the average hue, saturation, and brightness values, respectively. MLP, RF, SVM, ExGR, and ExG are the multilayer perceptron, random forest, support vector machine, excess green–red, and excess green segmentation methods, respectively.
| CVI | Mathematical Expression | r (ExG) | r (ExGR) | r (MLP) | r (SVM) | r (RF) |
|---|---|---|---|---|---|---|
| Green | - | 0.242 | 0.212 | 0.443 | 0.38 | 0.291 |
| Blue | - | −0.377 | −0.344 | −0.618 | −0.522 | −0.552 |
| ExG [21] | | 0.704 | 0.726 | 0.927 | 0.841 | 0.877 |
| ExR [7] | | 0.675 | 0.722 | 0.845 | 0.751 | 0.803 |
| CIVE [22] | | 0.601 | −0.652 | −0.842 | −0.773 | −0.782 |
| ERI [23] | | −0.407 | −0.476 | −0.681 | −0.572 | −0.644 |
| DGCI [24] | | 0.623 | 0.690 | 0.881 | 0.791 | 0.747 |
| GR | | −0.551 | −0.606 | −0.740 | −0.657 | −0.671 |
| COM1 [17] | | 0.532 | 0.682 | 0.813 | 0.722 | 0.78 |
| GBRG [21] | | 0.690 | 0.743 | 0.904 | 0.781 | 0.811 |
| EGI [23] | | 0.111 | 0.251 | 0.412 | 0.322 | 0.352 |
Table 5.
Performance evaluation of the regression models. MLP-FR, RF-FR, SVM-FR, ExGR-FR, and ExG-FR represent the field-based regression models for the multilayer perceptron, random forest, support vector machine, excess green, and excess green–red segmentation methods, respectively.
| Model | R² (Train) | RMSE (Train) | MAE (Train) | R² (Test) | RMSE (Test) | MAE (Test) |
|---|---|---|---|---|---|---|
| MLP-FR | 0.893 | 3.25 | 2.52 | 0.821 | 3.680 | 3.233 |
| RF-FR | 0.815 | 4.05 | 2.78 | 0.775 | 3.785 | 3.781 |
| SVM-FR | 0.791 | 5.10 | 3.90 | 0.747 | 7.903 | 4.252 |
| ExGR-FR | 0.670 | 7.88 | 5.12 | 0.621 | 8.202 | 6.102 |
| ExG-FR | 0.642 | 9.85 | 7.55 | 0.521 | 13.155 | 11.851 |
Table 6.
Optimal hyperparameters for the different classification models.
| Hyperparameter | MLP | SVM | RF |
|---|---|---|---|
| Hidden layers | 1 | - | - |
| Neurons | 20 | - | - |
| Activation function | ReLU | - | - |
| Kernel | - | RBF | - |
| C | - | 0.001 | - |
| γ | - | 1 | - |
| n_estimators | - | - | 200 |
| max_depth | - | - | 20 |
| min_samples_leaf | - | - | 4 |
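The tuned settings in Table 6 map directly onto scikit-learn estimator constructors, as sketched below. Treating these as scikit-learn models is an assumption (the parameter names n_estimators, max_depth and min_samples_leaf strongly suggest it, but the paper does not name its library).

```python
from sklearn.ensemble import RandomForestClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

# MLP: 1 hidden layer of 20 neurons with ReLU activation
mlp = MLPClassifier(hidden_layer_sizes=(20,), activation="relu")

# SVM: RBF kernel with C = 0.001 and gamma = 1
svm = SVC(kernel="rbf", C=0.001, gamma=1)

# RF: 200 trees, depth capped at 20, at least 4 samples per leaf
rf = RandomForestClassifier(n_estimators=200, max_depth=20,
                            min_samples_leaf=4)
```

In scikit-learn, a single hidden layer is expressed as a one-element `hidden_layer_sizes` tuple, which is why the MLP row's two entries collapse into `(20,)`.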